Facebook No Longer Relying On Humans For News
Facebook will now deliver news to users by means of an algorithm that pulls relevant articles from news sources without any human intervention, raising concerns that the change could contribute to the spread of misinformation and disinformation online.
Social media is changing. We’ve seen the evolution over the last couple of years, and now we can’t deny it. Facebook is losing money, and its parent company is now called Meta. More recently, Twitter’s blue bird of happiness seems to be bringing nothing but doom and gloom. So it’s not surprising to learn that Meta is no longer using actual humans to curate the news it pushes out.
According to TechCrunch, Facebook will now rely solely on its algorithm to pull national news and push it out on the platform. That is a change from three years ago, when Facebook introduced the “News” tab and human editors curated and vetted stories before they were published.
Now a computer program will give billions of people their news every day. Let that sit for a moment. Remember the last election? Eight in ten people get their news from social media on their smartphones, including from Facebook News.
During the last election, theories and rumors of election fraud overtook the news. And that news was coming from social media, curated by people who made judgment calls. Well, now the software will determine what gets published. Can software make judgment calls or think critically? Um, no. Not yet anyway.
Facebook is putting money into what consumers use most, and that, the company says, is not news. The news in question is tucked away under a tab labeled “News.” Most people never venture past the main feed, and what scrolls by there is what they consider news.
On one hand, Facebook is right to shift resources to the features used most often. Since most people get their information from the main Facebook feed rather than Facebook News, you might assume the company is adding oversight there instead, to make sure what circulates is responsible. You’d be wrong.
Meta recently axed the entire team that helped it make ethical decisions. The Responsible Innovation team was shown the door a few weeks ago, a move that goes well beyond the Facebook News changes.
Other Meta team members were let go in a massive overhaul of priorities. So what are the priorities, then? If it’s not news, and it’s not making ethical decisions, what is it? It’s the Metaverse: a place where anyone can be anything and do pretty much anything.
Meta is making serious changes to the company and to Facebook, and Facebook News is suffering for it. It’s almost as if Facebook is getting a makeover, or a makeunder if you will. As the Metaverse takes center stage, the online quizzes that tell you which Game of Thrones character you are will go by the wayside.
As Meta focuses on a place where people can meet virtually in the same space while looking like cartoons, the world will keep turning. And people will keep looking for news in a place that no longer exists.