The rise of the Information Pandemic

Patrick Shannon

August 5, 2020

The world is very sick right now, dealing with a viral outbreak covering the globe unlike anything seen before. A silent and invisible enemy spreading panic, confusion and unrest amongst the population.

Oh, yeah…COVID-19 is a pain in the ass too. But as if we needed anything else in 2020…there is another pandemic.

Only this one didn’t just suddenly hit the world out of nowhere. It doesn’t spread through sneezes or physical contact. Masks won’t filter it. Standing six feet away isn’t far enough. While most people would name the coronavirus as the biggest crisis of 2020, I will remember this year as the peak of the “Information Pandemic.”

In the before times…

Information flows today at a pace I believe humans were never meant to process. Not that the world ever lacked ways to spread propaganda, but it’s fair to say that most common chatter barely traveled beyond the dinner table or water cooler when traditional media was still our main source for world events. Unless you had a famous (or infamous) name, the only chance to gain a national soapbox was writing a strongly worded letter to the editor (if it was even published).

Of course the web would blow all this wide open, but it still had limited reach early on. Until flat monthly rates arrived, hourly access fees kept time in cyberspace limited. Discovering new content absolutely required one to be proactive with a search engine. Using the web for anything like commerce, dating or research still carried stigma or skepticism (as professors loved to remind you). With many households still not owning a home computer as the millennium turned over, even the most viral content had trouble reaching the masses who remained unplugged.

Later, as chat rooms waned and message boards remained confined to subcultures and niche interests, emerging Web 2.0 platforms like blogging and online video made it easy for anyone to publish their views and passions to the world. But these endeavors still required a time investment to both produce (and consume) content, and getting exposure worth a damn involved signal boosting and SEO efforts.

The internet has always given everyone the ability to say whatever they want to the world. What was never promised was an audience to spread the word.

New world (dis)order

Fast forward to today. Between phone apps, social networks, email inboxes, streaming movie queues, video game backlogs, photo sharing and much more, the digital world has never been so demanding of our time. But a loaded gun was left on the table when comments and feedback became attached to nearly everything we read online, tempting us to pull the trigger.

The ease of comments sections following articles is like handing out megaphones to the entire audience: someone completely clueless on the topic is as empowered to shout as the speaker who holds the podium. Writers with expertise are placed on the same level as an ultracrepidarian, degrading news and topics into fights and squabbles. Offering little substance to readers, the only ones who benefit are content providers and ad networks that depend on such turmoil to artificially boost page interaction.

Compounding this tenfold is social media: essentially a never-ending comments section designed to keep people engaged by encouraging them to weigh in on anything and everything, well-informed or not. Held to no journalistic ethics or standards for citing sources, social denizens contribute more toward proliferating misinformation, disinformation, anger and confusion than any “fake news.” While these platforms have highlighted noble causes, welfare and movements, I often wonder if social media has brought a whole lot more trouble in return.

Mass discourse begins in 280 characters or less. Innocuous things are blown out of proportion. Heated discussion threads exceed the size of novels. Feeds stuffed with politics and crappy memes. Families at odds with one another over opposing views. News media reporting tweets and slacktivism over real-world consensus. Suicide becomes an escape from harassment. Conspiracy theories. Toxic fandom. Cancel Culture. Clickbait. Hashtags. All whispering one seductive word…SHARE.

Between screen annoyances like floating emoticons, a spastic and nonsensical chat thread impossible to keep up with and…uh, Mark Zuckerberg, Facebook live feeds are a distracting mess that encourages people to speak AT – not about – serious issues.

All that is only a PORTION of the things we deal with on a daily basis, amounting to undue stress over bits and bytes. The respite once afforded by stepping away from the computer is defeated by smartphones – virtually ensuring that we remain plugged into the chaos 24/7. The combination of being hyper-connected to numerous distractions, the lack of accountability from those passing around data and our angst over processing all the noise shapes the Information Pandemic we face today.

The loss of context

Rather than simply wag the finger at technology, we need to look at our own habits. With shorter attention spans and more algorithmic feeds, it’s not surprising that content has devolved into emotionally charged click-bait and share-bait. We frequently skim for things that catch our interest, and are more accustomed to being spoon-fed content by third-party sources than proactively seeking it ourselves.

Complexity is boring. But sensationalism and emotion-ridden language get attention: often a mere headline is enough to goad people into passing the word around or hopping into the comments to fume. As readers become more reactionary, others are given the opportunity to step in and replace the actual story with their own version of events. Sometimes it’s unintentional: honest mistakes, or satire confused with truth. Or perhaps we are so inundated with it all that we go with whatever feels right. In all cases, we are in danger of losing the context behind what we’re seeing entirely. The modern web is full of examples.

[Image caption: A great example of a meme depicting only a hint of truth. While Trump’s income does match projections, the meme overlooks the context that it pays to be a former president in the private sector – particularly with book deals and the public speaking circuit.]

Memes are the very antithesis of context. While fun for jabs at life’s quirks and idiocies, they’ve increasingly become vehicles to spread and enforce controversial points of view. The problem with seemingly “witty” memes is that they often oversimplify a nuanced issue by ignoring context. I’ve rarely seen a meme that wasn’t flat-out incorrect or didn’t rely on cherry-picked or dated info, faulty analogies or apples-to-oranges comparisons to bend an argument. From persuasive language (“I bet this doesn’t even get one share”) to misinterpreted quotes, the goal is to provoke guilt or outrage and manipulate you into sharing it around.

Video sharing is absolutely susceptible to losing context. But wait…if it’s caught on camera it must be true, right? Think again. What happens outside the frame is as much a part of the story as what is captured inside of it. Clips edited down for time (or short attention spans) can misrepresent one’s words, and even raw video from misguided bystanders can spread a false narrative. (Worse yet, deepfake technology may soon have us questioning whether what we’re seeing and hearing is even happening.)

I’ve said my piece about Twitter before, and the spontaneous nature of short-form platforms does no favors for involved subjects. Anyone from an angsty teenager to an entire mob can spin up a controversy out of something completely inconsequential to the public. (Others have even made a social experiment out of it…which worked a little too well.)

These examples are just a few of the ways context is disregarded, leading people to believe just about anything they hear, whether about science, pandemics, Pizzagates or Bill Gates. Among those hungry for easy answers or self-validation, this line of thinking is highly contagious, essentially turning everyone (including influential leaders) into carriers who spread it around indiscriminately, well beyond their own networks.

I don’t know about you…but that sounds an awful lot like how a virus behaves to me.

The quest for truth

These are dangerous times…when the world is so information-dense and people find, share and dispute anything they come across, what do we choose to trust and believe in? When I’m overwhelmed, I take a deep breath and keep perspective by remembering one thing that never changed with the digital age…and that is the objective truth.

The greatest creations and revelations of our time have largely followed the same process: hypotheses are formed and tested through iterative experimentation. Findings are published and replicated by others. When firm evidence is widely accepted and the results are proven to be repeatable, we are on the path to establishing an objective truth. Time and evolution may challenge those conclusions again, but what matters is that we do the most with the knowledge we have at any given time.

If this sounds like the scientific method, you’re right. I don’t wear a lab coat, but research is very much a part of my job as a UX practitioner. I design and validate proposed solutions with targeted end-users, distinguish evidence from bias and assumption, and try other directions when a solution proves ineffective. Not to say I never go with gut instinct, but my reputation is based not only on my record but on being trusted to make the right call for the greater good when the data disagrees.

Unlike the social web, science is NOT a town hall discussion where anyone can walk in and “argue” away a problem. Domain knowledge is crucial to your capability and credibility in reaching a conclusion in any given field. While some would consider this attitude elitist, it’s how we approach most professions in general: you wouldn’t necessarily trust your mechanic to wire the electricity in your house or tailor your clothing.

I think the unprecedented COVID-19 situation is very much an exercise in this process right now. Facing something we don’t quite understand yet, the public is quick to point the finger at scientists who seemingly “contradict” themselves based on new information and rapidly changing circumstances. We try comparing it to things we already know (the flu, the common cold, etc.), seek explanations in anecdotal and widely unproven data, or simply find comfort in believing wild theories that others pass around to explain away the madness.

That said, healthy skepticism is a crucial part of learning, and it’s equally irresponsible to blindly accept something just because an “expert” said so. But that comes with accountability…do you have enough information to challenge it with substantial evidence? Are you humble enough to acknowledge when other findings sufficiently prove you wrong? Or do you simply disregard what you don’t like and continue to peddle the narrative you wish to be true?

THAT is the difference between the pursuit of objective truth and subjectively believing whatever we like. In the end, reality is always the judge…even if the distorted information age conceals exactly what that is sometimes.

Fighting the pandemic

So what is the answer to eradicating the Information Pandemic? I honestly wish I knew…I think whoever figures that one out is absolutely deserving of the Nobel Peace Prize.

I’ve noticed that as responsibilities have increased with age, time has become a more precious commodity that needs to be carefully managed across both my professional and personal life. This involves anything from voting with my feet (or is that now Zoom?) on unsolicited meeting invites easily covered by an email, to abandoning passion projects collecting dust, to deleting games from my Steam collection that I know I’ll never finish.

Point being…I think our daily digital flow NEEDS to be handled in the same way: ridding ourselves of potential distortion and taking more responsibility for how we approach and share information. But I stress that this shouldn’t be mistaken for trapping ourselves in bubbles or echo chambers, outright rejecting whatever we don’t want to be true. Instead, it’s about staying open to new ideas while not wasting attention on unreliable sources that have absolutely nothing to offer.

Like town hall meetings, social platforms attract rambling weirdos whose contributions resemble performance art more than poignant debate. But unlike the city council, YOU are not obligated to grant them time, attention or a platform.

Clean house of unnecessary noise within your social feeds. Not just the overly political, toxic or angry friends, but those who seem to indiscriminately share every little thing they find on an hourly basis. They might mean well, but it shows a lack of discernment for content that could waste your time. No need to be rude and unfriend anyone; just quietly mute them out of your feed.

Do your own research before sharing…ESPECIALLY for memes and videos. That’s what the web is for, isn’t it? Use a search engine to check sources: most of the time, a particularly viral hoax or fabrication has already been covered by fact-checkers. (If you don’t know what to search for, try typing in some of the text verbatim from the content in question.) And yes, it’s fair to consider the track record and reputation of the source.

Ask questions about what you’re seeing. WHY are you sharing this…to enrich others or just to chest-thump your point of view? (One is easily confused for the other.) Does this news rest on a single or anecdotal data point? Is the information even current (or has it been retracted)?

Seek out alternate, supplemental or extended recordings of videos. I’m dubious of anything involving numerous jump cuts or very short clips, where context easily suffers. (C-SPAN is a pretty good source for full recordings of political stuff.) Whether recorded by a news crew or a civilian bystander, once again use a search engine and check the story out.

With a few exceptions (such as tight-knit communities or warnings against foolish electrical advice on YouTube), don’t spend time on comments sections, live chats and social pages that attract the greater public (like news affiliates). Your interactions may be broadcast across your friends’ feeds, making you an unwitting carrier of the very content spread described above. If you want meaningful discussion, join something in-person like a Meetup instead.

Twitter or Facebook shouldn’t be your window to the world. RSS feeds might be old-fashioned, but they’re not dead, and YOU control what appears in your feed – not social algorithms. If you’re distrustful of American media, try other sources of news around the world, like public television and international news networks streaming online. Talk to people (like doctors or nurses) directly involved in current events, but maintain perspective: someone dealing with few cases will give a different account from those within an epicenter. (Neither is wrong…but neither tells the entire story by itself.)
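
For the technically inclined, assembling a self-curated headline reader takes only a few lines of code. Below is a minimal sketch in Python using the third-party feedparser library (a common RSS/Atom parser); the feed URLs are placeholders for whichever outlets you actually trust.

import feedparser

# Placeholder feeds: substitute the outlets YOU trust,
# not whatever an algorithm decides to surface.
FEEDS = [
    "https://example.com/world/rss.xml",
    "https://example.org/science/feed",
]

for url in FEEDS:
    feed = feedparser.parse(url)  # fetch and parse the RSS/Atom feed
    print(f"\n== {feed.feed.get('title', url)} ==")
    for entry in feed.entries[:5]:  # only the five most recent items
        print(f"- {entry.get('title', '(untitled)')}")
        print(f"  {entry.get('link', '')}")

Run it whenever you want headlines, and the only ranking at work is the order you put the feeds in.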

Finally, accept that not everyone is satisfied with facts and inconvenient truths. Some are simply not critical thinkers, and the worst are just too fanatical, loony or impressionable to be worth the time debating. Serious claims require serious evidence: the burden of proof is on THEIR shoulders. Don’t engage: just laugh and take comfort in the fact that they’re (hopefully) not authorities on the subject.

Future coda

The social web isn’t going away anytime soon. And it need not always be so political or serious…we should keep sharing (in moderation) the things we find entertaining, silly or fascinating. And sharing our own creations and talents absolutely represents the best of us.

Few people will say on their deathbeds that they wished they’d spent more time debating online…the things you would imagine are the very things you can be doing today. I’d also wager that someone who spends just a little time passing knowledge or passion on to a single child will do more to change the future than gigabytes of tweets and hashtags that will be buried by an algorithm in five hours.

We don’t have to always get it right…or even be right. The important thing is that we try our best to be proactive and stay focused, because succumbing to the Information Pandemic risks losing sight of the most important context of all.

Progress.

About Patrick Shannon

As a user experience (UX) designer / researcher based in St. Louis, I've worked with technology partners across the country to study end-users and create frictionless products and solutions that today's audiences connect with. In my spare time, I enjoy photography and building ideas out of anything from electronics to wood...still determined to build a life-sized replica of Optimus Prime someday.