Most of us have come across social media platforms where, even after reporting a fake profile, video, ad or similar content, there is no way to tell whether any action was taken or whether the report even reached a reviewer. The content may no longer be visible to you, but it may still be visible to others while it is under review.
To streamline this process, YouTube has introduced a Reporting History dashboard where users can track the status of videos they have reported, or 'flagged'. This keeps both YouTube and the user accountable. Flagged content is normally removed if it violates YouTube's Community Guidelines.
The update comes nearly a year after social media platforms, YouTube among them, were accused of hosting extremist content, fake news and hate speech. YouTube tried to remedy the situation by bringing on more reviewers and employing machine learning to strengthen the review process, but it wasn't enough. There have been instances where YouTube acted rather slowly on extremely controversial content. One such instance was Logan Paul's video from Japan's 'suicide forest', in which he made insensitive comments and remarks about a victim he stumbled across.
Setting aside how insensitive the video was, YouTube's late response was met with backlash. Paul was subsequently removed from the Google Preferred ads program and the YouTube Red partner program as well.
Despite setbacks like these, YouTube said in a blog post that it removed 8 million videos in 2017. Most of these were either spam or attempts to upload adult content. Of these, 6.7 million videos were first flagged by machines, and 76 percent of those were removed before they received a single view.
In the blog post, YouTube also elaborated on how its content review process has evolved over the years.