Sentiment Manipulation Through False Information
Jessie Strongitharm
Student, SFU CMNS 253
There’s no ‘I’ in misinformation.
You got it, didn’t you? The joke. Clever, right?
I thought so too.
In deliberating over how to introduce my area of research within the realm of misinformation/disinformation/fake news/alternative facts--whatever your term du jour is for the siege on modern politics--what I settled on was this quote: There’s no ‘I’ in misinformation. For this quip goes beyond a witty aphorism. It epitomizes the kind of brash, extrospective thinking that lays a foundation so conducive to misinformation and disinformation, and it speaks to the dismissive and divisive principles that thrive within participatory online spaces. There’s no ‘I’ in the word because, the idea goes, you are not responsible for the obfuscation going on; it’s them.
Well, not so fast. What I ask of you, reader, is to openly embrace the fact that there is undeniably an ‘I’ in misinformation. Its designation as 2018’s Word of the Year--alongside 2017’s ‘Fake News’ and 2016’s ‘Post-Truth’--spells out just how pervasive, invasive, and affective falsehoods have become in the digitized era. Our dependence on new media ensures this. Though false information has a long history stretching back to the beginnings of political propaganda and yellow journalism, what differs now is the ease with which it disseminates through online discourse. It is the communicative practices and structural politics of our social networking platforms that form the milieu for dis/misinformation to flourish--and, in doing so, to exploit our own psychological mechanisms for self-preservation, engendering censure rather than constructive discussion and critical reflection. Here’s my main point: It’s not just them. It’s not just you either. It’s us.
At SFU, I am investigating how online communication practices instigate the proliferation of fake news as part of the School of Communication’s course on new media. In doing so, one thing that has become exceedingly clear is the role of psychological processes, emotion, and affect. While it’s comforting to imagine that we rely on our capacity for reasoning when tasked with finding truth or reaching understanding, psychologists have pinpointed countless cognitive shortcuts--confirmation bias, implicit bias, and the bias blind spot among them--that colour our perceptions in order to help us reach decisions faster. Respectively, these biases prompt us to accept only information we have already decided is true, to primarily trust those belonging to our own ingroups (dare I say, political party), and to neglect viewing ourselves as subject to bias in the same way we view others. On top of that comes a wealth of empirical evidence demonstrating how misinformation takes hold. One notable example is Hugo Mercier and Dan Sperber’s seminal work showing that the primary purpose of reasoning is argumentative: in debating, we typically reason to support a predetermined verdict, not to discern truth. This maps onto Dr. Dan Kahan’s research on motivated reasoning, another psychological construct describing individuals’ unconscious tendency to shape their cognitive processing--sensory perception, assessment of sources and credibility--with some end goal in mind. Our willingness to perform the mental gymnastics required to protect our worldviews and minimize cognitive dissonance has been observed across multiple studies spanning many political affiliations and subject matters. Too often, debates and arguments increase polarization between individuals rather than nurture mutual understanding. What that means for us in our current political climate is this: when confronted with spurious news designed by partisan and political operatives to push our buttons, our critical faculties are already fighting an uphill battle.
Now take this logic and log on with it.
Thanks to their commercial mandate to aggregate eyeballs, platforms are motivated to provide every opportunity for users to interact with those in their digital circles and to react to global phenomena and “news” content. They galvanize our desire to contribute to the online imaginary of the public sphere, and in doing so provide the structural imperative to like/dislike, reply, repost, and generate constant digital commentary. This would be enough on its own to cause antagonism--since what makes us uniquely sentient is our consciousness, convictions, and constructs of identity, which come preloaded with the aforementioned affective biases--but add in digital disinformation and the discordance only increases. A study by MIT researchers found that fake news travels farther and faster than true news, in part because of its novelty, but also because of its anger- and disgust-inducing affective qualities, which encourage us to respond and retort. And whether you’re supporting or dissenting online, your reaction only further reinforces the circulation.
Unwinding the different causes and conditions of our disinformation age is a heavy undertaking. Its many diverse yet delicately interconnected aspects warrant more than a technological patch or a presidential change to address. Alongside the structure of social networks and their intensification of false narratives come the psychological sentiments and social factors, the inflammatory appeals to affect, and the us-vs-them mentality arising from our own biases. And while it’s easy to pin the blame on someone else (after all, we’re cognitively designed to do so), I encourage you to take a moment to reflect on how you engage in online discourse. Are you quick to react? Have you chosen a side? Do you always seek out the truth? Other considerations for mitigating this post-factual era include fostering ambient and ongoing curiosity, and making a concerted effort to expose yourself to multiple viewpoints. We can only be in charge of the change within ourselves, and that means owning the ‘I’ within misinformation. Because on the bright side... there’s also an ‘I’ in informed.