Credit: Andrea Piacquadio via Pexels

Meta Will Allow Greater Moderation Control on Facebook

The social-media giant Meta is set to roll out new policies in the coming weeks that will grant users greater control over what they see on their Facebook timelines. The new settings will let users decide whether certain content is “demoted” and will give them some control over fact-checked content — although fact checks still cannot be turned off completely.

“People will have even more power to control the algorithm that ranks posts in their feed,” said a spokesperson for Meta. “We’re doing this in response to users telling us that they want a greater ability to decide what they see on our apps. This builds on work that we’ve been doing for a long time in this area and will help to make user controls on Facebook more consistent with the ones that already exist on Instagram.”

How It’s Done Now

Currently, Facebook ranks posts, including ads, based on users’ profiles, to keep them engaged on the platform. Meta “demotes,” or down-ranks, certain content based on moderation, and blocks other content entirely.

The moderated content that is not blocked falls into a number of categories. Going forward, four of those categories will be subject to user control: low-quality content, such as scams; unoriginal content reposted by websites to boost ad revenue; content that is violent or graphic in nature; and fact-checked content that has been determined to be partially false.

How It Will Be Done

Users will be able to stop the low-quality and unoriginal content from being demoted. They will also be able to keep the violent or graphic content at the default level, or increase its moderation.

Credit: Anna Shvets via Pexels

For fact-checked content, users in the US will be able to adjust how prominently the fact-check notice appears on a post. They can make the notice smaller, or make it so large that it covers the post entirely.

Meta employs 102 fact-checking organizations to screen content, and many users on both the left and the right have complained that political content is unfairly labeled or censored. Meta requires its fact-checkers to be accredited, based on a code of principles, by the International Fact-Checking Network, a subdivision of the Poynter Institute.

Will It Solve This Issue?

One result of this system is that a single piece of content is sometimes rated by multiple fact-checking organizations, and those ratings sometimes conflict, with one organization labeling the content false and another labeling it partially false.

Going forward, Facebook’s default will apply the stricter moderation to such content, unless a user actively reduces the moderation setting.

By Jim Daws

Jim Daws is Managing Editor for Innovation & Tech Today.
