Clusterfackt

Mike Dwyer – Anchor Staff

Nowadays, facts come at you fast. News cycles and timelines are on overdrive. Fake news is commonplace. Clusterfackt is an ongoing series that asks readers to question everything. Think of it as an exercise in critical thinking. Each week, readers are given a giant clusterfackt of scientific findings meant to replicate the dizzying news loops that dominate our lives. However, there’s a catch: one statement within the clusterfackt is entirely false. Identify the falsehood and win a prize by emailing editorinchief@anchorweb.org. And don’t repeat anything you read here without doing your research!

Since this is the last issue of Clusterfackt, there will be no lie lurking between the following lines of light-hearted research — just cold, unforgiving science.

Last week, we discussed the newest revelations and fallout from Facebook, that Google no longer tells its employees “Don’t Be Evil,” that a long-term study conducted by MIT determined that on Twitter false news diffuses “significantly farther, faster, deeper and more broadly than the truth in all categories” and that, “contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that humans, not robots, are more likely responsible for the dramatic spread of fake news.” We ended last week’s clusterfackt with a promise to spend this final issue analyzing the specific ways that Facebook, Twitter, Google, YouTube and others have hijacked the human brain and compromised our free will.

Many of the highest-paid techies working for Google and Facebook have studied in undergraduate and graduate programs that combine the science of technology with the science of persuasion. Leading the pack is Professor BJ Fogg, founder of Stanford’s Behavior Design Lab, who has coined a term for this hybrid field of study: captology. He has created a behavior model which asserts that a behavior occurs when three factors converge at once – ability, motivation and triggers – leading to his motto, “place hot triggers in front of motivated people.” According to his website, Fogg “teaches innovators how to use his models and methods in Behavior Design. The purpose of his research and teaching is to help millions of people improve their lives.”
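For readers who think in code, the convergence Fogg describes can be sketched in a few lines of Python. This is purely an illustration of the idea, not Fogg’s actual model: the scoring scale and the activation threshold below are made-up stand-ins for his behavior curve.

```python
# Illustrative sketch of Fogg's behavior model: a behavior occurs only
# when motivation, ability and a trigger converge at the same moment.
# The 0.0-1.0 scores and the threshold are hypothetical, for illustration.

def behavior_occurs(motivation: float, ability: float, trigger_present: bool) -> bool:
    """Return True when all three factors converge.

    motivation and ability are scored 0.0-1.0; the activation
    threshold is a made-up stand-in for Fogg's curve, under which
    easier tasks (higher ability) need less motivation.
    """
    ACTIVATION_THRESHOLD = 1.0  # hypothetical value
    return trigger_present and (motivation + ability) >= ACTIVATION_THRESHOLD

# "Place hot triggers in front of motivated people": a highly motivated
# user acts even on a harder task, but only if the trigger is there.
print(behavior_occurs(0.9, 0.3, True))   # motivated + triggered -> True
print(behavior_occurs(0.9, 0.3, False))  # no trigger -> False
print(behavior_occurs(0.2, 0.3, True))   # triggered but unmotivated -> False
```

The point of the sketch is the “and”: remove any one factor – drop the trigger, or drain the motivation – and the behavior simply does not happen, which is why triggers are the lever the platforms pull hardest.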

And what is the result of this education? Using behavior change, students learn how to make social media users addicted to the platform — in a very literal sense of the word “addicted.” Without question, addiction is the behavior that they are provoking. Apps, social media platforms and devices are designed to hold our attention, which is how the company makes money. The more time and attention a user devotes to their product, the more money the company makes. The way tech companies do this is not always ethical. One such example is their peppering of intermittent variable rewards into the code of their products – the same design that can be found in slot machines and other forms of video gambling – which can ultimately lead to legitimate addiction.
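The slot-machine pattern described above – the intermittent variable reward – is easy to sketch. The code below is a toy simulation of the general technique, not anything from a real platform; the function name and the probabilities are invented for illustration.

```python
# Toy sketch of an intermittent variable-reward schedule, the same
# pattern used in slot machines. Probabilities and payout sizes are
# illustrative only.
import random


def check_notifications(reward_probability=0.3, rng=None):
    """Simulate one 'pull': sometimes new notifications are waiting,
    sometimes none. The unpredictability is the point - a variable
    schedule keeps users coming back to check."""
    rng = rng or random.Random()
    if rng.random() < reward_probability:
        return rng.randint(1, 5)  # a variable-sized reward
    return 0                      # no reward this time


# Ten app opens on a seeded generator: the payout pattern is irregular,
# which is exactly what makes the next check feel worth taking.
rng = random.Random(42)
print([check_notifications(rng=rng) for _ in range(10)])
```

Compare this with a fixed schedule (say, a reward on every fifth check): behavioral research on reinforcement consistently finds that the unpredictable version produces far more persistent checking, which is why it is the one worth borrowing from the casino floor.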

For example, when you first log on to Facebook or Twitter, there is a brief pause before your notifications appear. This pause causes anticipation in the user, resulting in a surge of dopamine. According to Nora Volkow, the current director of the U.S. National Institute on Drug Abuse, it is not the reward itself that gets users addicted, but the anticipation of the reward. According to her research, users often report feeling decreased pleasure in the reward over time and yet cannot stop themselves from seeking out that reward.

Through positron emission tomography, Volkow discovered that the dopamine surge upon receiving the reward decreases over time, whereas the surge that coincides with the anticipation phase increases. It is that anticipatory surge of dopamine that hooks users, and it explains why addicts report less pleasure and satisfaction from the reward yet cannot stop themselves from seeking it out.

So, are social media and technology users similarly chasing the dragon? Every time your phone dings or beeps or buzzes, you get a little surge of dopamine as you anticipate the reward: a message from a potential love interest, an event invite, a like or thumbs up, a catchy news story that excites your emotions. You may stay logged on because of a notification, but then something in your news feed catches your attention, and long after receiving the reward you’re still engaged with the product. Why were you brought to the news feed first? That’s the choice they gave you, and once they have you – the motivated user – they will continue to place hot triggers in front of you to keep you logged on.

Additionally, the tech giants exploit universal social anxieties. Each time someone interacts with you, there is a feeling that the gesture must be reciprocated. Facebook and others exploit this need for social decorum with features such as informing others when you’ve read their message and telling them when you’re online. Your profile pic? According to BJ Fogg, it’s your brand and the most important thing on your profile page. Comments, hyperlinks, autoplay, suggestions, tags – all of these hot triggers are a product of captological study.

These are just a few of the techniques being used right now, but there are hundreds, if not thousands, of books, labs and programs that teach techniques like these to a very small and homogenous group of individuals who will go on to affect the behavior of more than two billion people globally. That’s a greater reach than any single government, religion or ideology, and it should concern us enough to put down our phones, even if just for a moment, to look up, notice our surroundings and see that we are not alone.

For more information on this subject and possible opportunities to volunteer and advocate for more humane design in technology, please refer to the non-profit organization Time Well Spent, founded by Tristan Harris, a former tech insider turned cyber crusader.