In its first quarterly Community Guidelines Enforcement Report, YouTube now allows users to view the status of videos they've flagged for review. The report follows through on promises YouTube made in December last year to be more transparent about how it handles abuse reports and reviews flagged videos.

The controversy was stirred up by a report in The Times. People had complained that YouTube's policies were being violated, with paid advertisers' commercials appearing alongside extremist content. Some content creators are also facing the brunt of the resulting crackdown, complaining of lost monetization even when they fully comply with YouTube's policies.

In a post on its official blog, the company said, "We are taking an important first step by releasing a quarterly report on how we're enforcing our Community Guidelines. This regular update will help show the progress we're making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, the speed of removal, and policy removal reasons."

"FINALLY. @YouTube's new transparency report breaks out content flags by category. @ACLU_NorCal has long called for this necessary information. Your move, @Facebook. https://t.co/O0lsjHXwj7" — Jake Snow (@snowjake), April 23, 2018

The company further added, "We're also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they've flagged to us for review against our Community Guidelines."

The inaugural report covers data from October to December 2017. During this period, YouTube removed 8.2 million flagged videos. Most of these were spam or attempts to upload adult content, and they represent a fraction of a percent of YouTube's total views over the period.
YouTube's anti-abuse algorithms removed 6.7 million of those videos before they received a single view. As one example given by YouTube, at the start of last year, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.

Beyond that, 1.1 million videos were flagged by members of YouTube's Trusted Flagger program, which lets vetted reporters flag content directly. The program's participants include individuals, government agencies, and NGOs that have received training from the platform's Trust & Safety and Public Policy teams.