Facebook is tackling the fake-news phenomenon. Great, and good luck to them. The real question is: “Will people, having already been emotionally swayed by an ideological or political assertion, actually be amenable to rational argument and the presentation of verifiable facts?”
People who have made up their minds on an issue very rarely recant, even when presented with compelling evidence. Human psychology may be, in some essential way, discontinuous with logic. Fact does not always trump belief or opinion in the social and cultural cyber-spaces where emotionally hyper-sensitive debates and arguments rage like wildfire.
An advisory about the doubtful provenance or truth of an assertion is an admirable feature, but in the end people may simply click past it, as they click through so many other things – like the desensitised big-city dweller who “turns off” and “tunes out” all the extraneous noise and distracting hyper-stimulation. Facebook is arguably culpable in creating this kind of virtual, social-media environment of rational desensitisation through the overwhelmingly cluttered information-space it presents. So much of the user interface is taken up by advertising and identity-farming trivia that people may have developed largely autonomous sanity-preservation and psychological-survival mechanisms for switching off the constant drone of technologically mediated white noise.
Something clearly needs to be done, and this is a step in the right direction, but I think it marks the starting point of a Sisyphean task – perhaps the opening salvo of the Online Truth Wars.
Context: Facebook just made it harder for you to share fake news
Related: On the Internet, Nobody Knows That You’re A Russian Bot