Julia Sta Romana

Top Contributor
Top Poster Of Month
Joined
May 2, 2017
Messages
370
I read this article from Search Engine Journal yesterday:


Facebook will now be targeting individuals who spread misinformation and fake news, not just pages and groups.

Personally, I think it's a good thing. We all have that crazy relative who just won't stop spreading conspiracy theories. Facebook is a private company. They do have a right to control and censor the kind of content that goes on the platform.

But some people worry that Facebook is going a bit too far with this.

What do you guys think? Is this good for us in general, or not?
 

djbaxter

Administrator
Joined
Nov 10, 2016
Messages
2,468
I also like the idea, although I question whether Facebook has the infrastructure to actually do this effectively.

There is far too much misinformation, disinformation, mythology, denialism, anti-science, and conspiracy theorizing on the net — and unfortunately too many gullible or wilfully ignorant people reading and eager to embrace it.
 

Julia Sta Romana

Top Contributor
Top Poster Of Month
Joined
May 2, 2017
Messages
370
djbaxter said:
I also like the idea, although I question whether Facebook has the infrastructure to actually do this effectively.

There is far too much misinformation, disinformation, mythology, denialism, anti-science, and conspiracy theorizing on the net — and unfortunately too many gullible or wilfully ignorant people reading and eager to embrace it.
I agree. And how would Facebook deal with people and groups who use ads to promote misinformation? I wonder how much of their ad revenue comes from these groups. What are they going to do? Deny the ads? Add fees to allow these ads through?
 

djbaxter

Administrator
Joined
Nov 10, 2016
Messages
2,468
Google does refuse ads from such groups via their quality control department.

I don't know about Facebook. Do they even have a quality control department?

I don't always like what Google does but I have much less faith in Facebook.
 

VirtualGlobalPhone

Moderator
Joined
Apr 22, 2016
Messages
1,172
I have a slightly different take: instead of going after other people's articles, they should go after their own revenue-generating ads. Today, more than 70% of the ads on Facebook come from spammers, cheaters, and junk sellers. They have a lot to clean up; whether they will is the question.

Every now and then they get more problems than they can handle - finally government agencies have to intervene.
 

djbaxter

Administrator
Joined
Nov 10, 2016
Messages
2,468
Their "sponsored" articles and ads are more often than not scams or clickbait. I definitely agree with that part.

But is government intervention the best answer? Perhaps it is... I don't know. But if you let government in through one door, you've let them into the whole building, and where does that path lead?
 

VirtualGlobalPhone

Moderator
Joined
Apr 22, 2016
Messages
1,172
It's absolutely doable, but governments want easy value and don't want to do their part.

When governments depend on these products for services, employment, and corporate taxes, why can't they take the front seat and set guidelines and processes that don't allow scammers to exploit new or large platforms like Facebook, Twitter, YouTube, etc.?
 

Dora Wi

MVP
Joined
Aug 19, 2020
Messages
55
djbaxter said:
I also like the idea, although I question whether Facebook has the infrastructure to actually do this effectively.

There is far too much misinformation, disinformation, mythology, denialism, anti-science, and conspiracy theorizing on the net — and unfortunately too many gullible or wilfully ignorant people reading and eager to embrace it.

Google does refuse ads from such groups via their quality control department.

I don't know about Facebook. Do they even have a quality control department?

I don't always like what Google does but I have much less faith in Facebook.

I wonder too if they have the infrastructure for this. I'm also curious about where they would draw the line; there are many gray areas between misinformation and proven fact.

As for quality control, they do have staff who review posts to stop the spread of content promoting or depicting violence and gore. I heard about this from an article detailing the horrible work environment these reviewers face: they have to look at all this content without any support for the mental health effects. That was quite a while ago, though, so I don't know if the situation has changed since then.
 