What Are Algorithms?
Algorithms are the invisible track guiding our interactions with technology and social media. These algorithms are often really helpful:
Google Assistant relies on algorithms that can track payment calendars to remind me when my credit card balance is due.
Other times we question the algorithm, as when Facebook’s newsfeed gets clogged with advertisements rather than updates from friends and family.
Ethical Algorithms
Until recently, we haven’t tended to think about the ethical significance of the algorithms we rely on as a society. The Russian bots that spread fake news on our Facebook newsfeeds have brought the issue into sharp relief.
Yet there are some less obvious reasons why we need to think deeply about our algorithms.
In a TED conversation, TED curator Chris Anderson talked about algorithms and social behavior with Netflix CEO Reed Hastings. Hastings explained why Netflix changed the underlying algorithm that personalizes the user experience and provides recommendations:
Everyone would rate Schindler’s List five stars, and then they’d rate Adam Sandler, The Do-Over three stars. But, in fact, when you looked at what they watched, it was almost always Adam Sandler. And so what happens is, when we rate and we’re metacognitive about quality, that’s sort of our aspirational self. And it works out much better to please people to look at the actual choices that they make, their revealed preferences by how much they enjoy simple pleasures.
Anderson responded:
But isn’t it the case that algorithms tend to point you away from the broccoli and towards the candy, if you’re not careful? We just had a talk about how, on YouTube, somehow algorithms tend to, just by actually being smarter, tend to drive people towards more radical or specific content…
This short exchange raises all sorts of ethical questions.
Is it the responsibility of a for-profit company to try to direct us toward “the broccoli”? And who decides what is broccoli and what is candy, and on what basis is that judgment made?
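To make Hastings’ point about “revealed preferences” concrete, here is a minimal sketch of how ranking by what people say they value can diverge from ranking by what they actually do. The data and function names are hypothetical, chosen only to illustrate the idea; this is not Netflix’s actual algorithm.

```python
# Hypothetical viewing data: what people rate vs. how long they actually watch.
# Illustrative only -- not real Netflix data or code.
catalog = {
    "Schindler's List": {"avg_rating": 5.0, "avg_hours_watched": 0.4},
    "The Do-Over":      {"avg_rating": 3.0, "avg_hours_watched": 3.2},
}

def rank_by_stated_preference(titles):
    """Rank titles by what viewers say they value (star ratings)."""
    return sorted(titles, key=lambda t: titles[t]["avg_rating"], reverse=True)

def rank_by_revealed_preference(titles):
    """Rank titles by what viewers actually do (time spent watching)."""
    return sorted(titles, key=lambda t: titles[t]["avg_hours_watched"], reverse=True)

print(rank_by_stated_preference(catalog))    # ["Schindler's List", "The Do-Over"]
print(rank_by_revealed_preference(catalog))  # ["The Do-Over", "Schindler's List"]
```

The two rankings disagree, and a recommender tuned to the second one will keep serving “the candy,” which is exactly the tension Anderson raises.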
The TED talk on YouTube algorithms that Anderson referenced reveals a more pernicious issue: algorithms designed to maximize viewership and generate advertising revenue have made the platform remarkably unsafe for young children.
James Bridle begins his talk by discussing the seemingly innocuous “surprise egg” videos that have garnered millions of views on YouTube and are a hit with young kids. These videos are simple: people open up chocolate eggs to reveal hidden prizes.
Bridle points out that “if you search for ‘surprise eggs’ on YouTube, it’ll tell you there’s 10 million of these videos, and I think that’s an undercount.” It sounds innocent enough, but Bridle explains why it’s concerning: “These videos are like crack for little kids. There’s something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours.”
Unfortunately, super addictive but still kid-friendly content is not the only kind of video that YouTube’s algorithm unwittingly puts in front of young children. Bridle shows how just a few clicks can take a kid from an innocent cartoon to bot-created videos that feature those same cartoons in grotesque situations with weird violent or sexual undertones. Bridle warns that “this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters.”
As disturbing as the content itself can be, the deeper concern is that we often don’t even know the source of these videos. Bridle asks:
Is this a bot? Is this a person? Is this a troll? What does it mean that we can’t tell the difference between these things anymore? And again, doesn’t that uncertainty feel kind of familiar right now?
Bridle has one key piece of advice for dealing with these algorithms: “If you have small children, keep them the hell away from YouTube.”
That’s good advice where children are concerned. But it doesn’t solve the bigger problem:
We built a system that seems to be entirely optimized for the absolute worst aspects of human behavior [. . .] we seem to have done it by accident, without even realizing that we were doing it, because we didn’t really understand the systems that we were building, and we didn’t really understand how to do anything differently with it.
These two TED Talks make it abundantly clear that both corporations and consumers need to think deeply about the ethical questions embedded in our technology. Thankfully, there have been some notable contributions to that work.
Tristan Harris created the Center for Humane Technology in the hopes of “reversing the digital attention crisis and realigning technology with humanity’s best interests.” The Center provides practical steps for using technology responsibly.
Further Reading
We have put together a helpful guide about digital citizenship for parents that includes more research and helpful tips for guiding our children as they learn how to navigate the digital world.