Facebook held a press event at Menlo Park to provide an update on maintaining the integrity of content that circulates through the Facebook family of apps.
Facebook is launching a new section of the Facebook Community Standards site where you can track the updates it makes, along with a Group Quality feature.
The company is collaborating with experts and third-party fact-checkers, reducing the reach of groups that spread misinformation, and incorporating a new "Click-Gap" signal so that people see less low-quality content.
The company is also expanding the News Feed Context Button to images and adding Trust Indicators and more information to the Page Quality tab. Messenger has been updated with a Verified Badge, an updated Block feature, a Forward Indicator, and a Context Button.
Users will still have the right to appeal decisions on individual posts. Facebook is using a combination of technology and human review to remove harmful content.
The company is also reducing the visibility of content that doesn't violate Instagram's Community Guidelines but is inappropriate for the platform. Such posts will not be recommended on the Explore tab or hashtag pages.
For example, a post that is sexually suggestive but does not depict a sexual act or nudity would still be demoted.
The fight against inappropriate and harmful content has been going on for a while and will probably continue for as long as the platform exists.
Recently, after the Christchurch attacks, Instagram identified 900 different videos derived from those 17 minutes of footage. Afterwards, the company made changes to its review process and used AI tools to identify and remove such content.