
10.02.2022

Different Routes: Algorithms cause outrage, irritation and delight – and tie us to Facebook and Google

Illustration: Hans Eiskonen

Algorithms are dumber than we think, yet they still manage to mess with our emotions. In journalist Johanna Vehkoo’s article, researcher Minna Ruckenstein explains how data giants’ algorithms have quietly entered our daily lives and started competing for our free time and even our sleep.

Text Johanna Vehkoo
Illustrations Hans Eiskonen

In May 2021, a rumour began to spread on social media and Internet forums that the Finnish tabloid newspaper Iltalehti was censoring its comment section. The claim was that Iltalehti was removing comments supporting Prime Minister Sanna Marin or other left-wing and Green Party politicians, while publishing comments favourable to the right-wing populist Finns Party. The theory was fuelled by the fact that Iltalehti has outsourced the moderation of its comment section to Utopia Analytics, whose founder and Chairman of the Board is the Finns Party MP Tom Packalén. The moderation is mainly performed by an algorithmic system.

The tone of the comments about the supposed censorship was outraged. The allegations spread widely enough that Editor Perttu Kauppinen began answering questions and explaining how Iltalehti’s moderation actually works.

“I received dozens of questions about it sent directly to me,” Kauppinen says.

He explained to those who asked that Utopia Analytics’ model does not handle all of Iltalehti’s moderation; human moderators at the Finnish News Agency (STT) take care of some of it. Both the algorithmic system and the humans make moderation errors. The machine learns from the work humans do, which means that the possibility of human bias is also transferred from the people to the machine.
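
Neither Iltalehti nor Utopia Analytics has published the details of the model, but the general pattern described here – a classifier that learns to imitate past human moderation decisions – can be sketched in a few lines. The example below is a toy illustration, not the actual system: the comments, labels and model choice are invented.

```python
# Toy sketch of learning moderation from human decisions.
# NOT Utopia Analytics' system: data, labels and model are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Past comments and the decisions human moderators made about them.
# Whatever biases the humans had are baked into these labels.
comments = [
    "Great article, thanks for covering this.",
    "You are all idiots and should be silenced.",
    "I disagree with the minister's policy.",
    "Go back to where you came from.",
]
human_decisions = ["publish", "remove", "publish", "remove"]

# The model learns to reproduce the human decisions from the text alone,
# so it inherits both their judgement and their bias.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, human_decisions)

# New comments are then moderated by imitating past human choices.
print(model.predict(["Thanks, an interesting read.", "You are all idiots."]))
```

If the human decisions systematically disadvantaged one group, a model trained this way reproduces that pattern at scale – which is exactly the worry the rumour tapped into.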

The newspaper’s editors also responded to questioners via email. Many people who had sent questions even got to see the moderation decisions concerning their comments. The records showed which decisions had been made by humans and which by the machine. Comments were sometimes removed in error, but typically the errors were made by humans. Iltalehti also published an article explaining that its moderation system does not favour the Finns Party.

“Our website receives between 300,000 and 400,000 comments per month. We will never achieve a perfect situation in which machine or human moderators make no false decisions and display no bias at all,” Kauppinen says.

The newspaper conducted an investigation of its own and found no evidence that the Finns Party MP’s company had adjusted the artificial intelligence to favour the Finns Party.

The magic of algorithms 

“This is an excellent example of the emotions algorithms evoke in people,” says Minna Ruckenstein who works as an associate professor at the University of Helsinki’s Centre for Consumer Society Research. 

Ruckenstein has been studying algorithms and people’s perceptions of them from different perspectives for years, and is currently writing a book about algorithms and emotions. Another work on the subject, a book on everyday automation edited in collaboration with Ruckenstein’s Swedish and Australian colleagues, was published in May 2022.

In the project Algorithmic Culture, which will continue until 2023, Ruckenstein’s international team has studied, among other things, the moderation of discussions, the development of the ethics of AI and a Swedish algorithm called Algot that makes decisions on social benefits. The themes of the project are daily life and emotions, power and the economy, ethics and technology.

“In everyday discussions, algorithms are associated with all kinds of conspiracies and magic. By studying everyday experiences and emotional reactions, we can actually get closer to the political debate and what drives current critique of algorithms.”

Many researchers talk about algorithmic systems, which work in a specific technical environment. This article, however, takes a slightly different approach. 

“When I talk about emotions and algorithms, I’m not referring to actual algorithms but to something that people react to. They exist in the cultural imagination,” Ruckenstein explains. “In this sense, the algorithm has broken away from what it actually does in the field of technological development.”

Ruckenstein has noticed that when people talk about algorithms, they start sharing the feelings related to digital environments. Algorithms leave no one cold.

Illustration: Hans Eiskonen

Then why doesn’t the algorithm know me?

Another example of the emotions algorithms bring up in people is the persistent belief that Facebook is eavesdropping on its users’ conversations through their mobile phones. You know the story: I was out walking with a friend, we were talking about a particular brand of sneakers, and as soon as I opened Facebook, there was an ad for those exact sneakers. Ruckenstein’s interview material includes a large number of stories like this. They involve different theories on how the eavesdropping occurs and what kinds of incidents it relates to, but what the stories have in common are strange coincidences treated as evidence.

Facebook has always denied eavesdropping on phone calls or other conversations. There is no evidence that Facebook is spying on people’s conversations through their phones’ microphones. It doesn’t have to: it gets quite enough information about us as it is.

“This idea of phone-tapping has become an affective fact,” says Ruckenstein. 

An affective fact is something that feels true even though there is no actual evidence to support it. An affective fact also gets stronger when denied.  

“If Facebook says they are not eavesdropping on phones, people think that’s exactly the kind of thing Facebook would do. People consider their own experiences of getting ads that seem very apt conclusive evidence of their phones being tapped.”

In the endless stream of online advertisements, the ads that stand out particularly are those that relate to a person’s own vulnerabilities. A woman who has not been able to get pregnant or does not even want to have children notices the adverts for fertility treatments or pregnancy and ovulation tests that are peddled to her. A balding man is troubled by ads for hair implants. Again, they both get the feeling that they are being spied on.

“Our research has revealed a lot of irritation about pregnancy test ads,” Ruckenstein says. 

“In reality, Finnish datasets are so small that, in practice, it’s not even possible to target advertisements to the exact woman who wants to get pregnant. Targeting is much simpler than that: age, gender, place of residence.”

When browsing a newspaper or walking down the street, you hardly notice an advertisement for fertility treatments, unless that is what you’re looking for. On social media, the ad penetrates your stream of content curated by algorithms. That is why it feels much more personal.

“Advertising pushes its way into an intimate space, and this intensifies emotional reactions.”

According to Ruckenstein, people learn about the logic of algorithms through targeted advertising when they discover that the information they have provided to a service comes back in some form.

“This feedback mechanism is important for our everyday understanding of algorithms.”

Ruckenstein finds it interesting that Facebook’s like button hasn’t attracted similar conspiracy theories, though maybe it should. 

“The like button constantly collects information about users. Although it has considerable spying power, it is not treated with suspicion.”

The fact is that Facebook’s like buttons exist on countless websites, not just on Facebook itself. If you have recently signed in to Facebook, the service is able to track your visits to other websites that include the like button. You don’t even need to click on it. In fact, Facebook stores data about users who don’t have a Facebook account at all or who aren’t signed in to their account. The company talks about this on its website, but in a very general way: “The data we receive includes your user ID, the website you’re visiting, the date and time and other browser-related info.”
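
Facebook does not publish the code behind the button, but the underlying pattern – a third-party widget that reports the visit back to its owner the moment it loads – is simple enough to sketch. The toy server below is schematic and hypothetical, not Facebook’s implementation; the endpoint and cookie name are invented.

```python
# Schematic, hypothetical sketch of a third-party "like button" server.
# Merely loading the embedded widget sends the request; no click is needed.
from datetime import datetime, timezone
from flask import Flask, request

app = Flask(__name__)

@app.route("/like-button")
def like_button():
    visit = {
        # Cookie set when the user last signed in to the platform,
        # i.e. a persistent user ID (or just a browser ID for non-users).
        "user_id": request.cookies.get("uid", "anonymous-browser"),
        # The page the visitor was reading, sent by the browser itself.
        "visited_page": request.headers.get("Referer"),
        "time": datetime.now(timezone.utc).isoformat(),
        "browser": request.headers.get("User-Agent"),
    }
    print(visit)                    # in reality: stored and used for ad targeting
    return "<button>Like</button>"  # the visible widget

if __name__ == "__main__":
    app.run()
```

Every field in the sketch corresponds to something in Facebook’s own description: the user ID, the website being visited, the date and time, and browser-related information.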

Google stores users’ browsing data for two weeks, but as far as anyone can tell, Facebook stores it for as long as three months. The company claims it does not use the data to track users. According to Facebook, the data is anonymised and cannot be connected to specific users. However, it is used for targeting ads on a larger scale. 

Similar irritation is evident, for example, when people complain about Spotify’s clumsy and inappropriate music recommendations. I only listen to sophisticated music, why is it shoving Ed Sheeran down my throat! If the service is tailor-made for me, then why can’t it be personalised properly!

“People have excessive and unrealistic expectations for what algorithms are capable of doing,” Ruckenstein explains. 

“Techno-optimists expect artificial intelligence to fix all the problems we humans can’t solve. AI will find a solution to poverty because it knows how to allocate resources better. Climate change will be solved by AI models that forecast how things should be done. Of course, this is not how it’s going to go.”

Digital geography of fear 

Ruckenstein has built her work on Raymond Williams’s notion of structures of feeling and identified three structures into which the emotional responses to algorithms can be placed. One of them is the irritation described above, and it is the one Ruckenstein finds most interesting.

“The feelings of irritation bring us closer to the logic of the algorithm, because they urge us to express what it is that annoys us about it. By doing so, irritation reveals the incompatibilities between humans and their algorithmic companions.”

Our lives are already closely intertwined with algorithms. We are in a daily symbiosis with them. By using social media, we have agreed to a trade-off in which we pay for services with our personal data, and now it seems difficult to withdraw that consent.

“Then we realise that algorithms stuff us into stereotypical, oversimplified pigeonholes, such as a world of strict gender dichotomy,” Ruckenstein explains.

That is why we get annoyed when the recommendations algorithms make are mainly based on gender, location or age. After all, we are so much more than that!

The most common emotional responses associated with algorithms, however, are linked to convenience, “just feeling good or right”. Algorithms have made communication and being sociable much easier. You can take care of your bank transactions online and have food delivered to your doorstep. As algorithms streamline our daily lives, they have become an important part of digital infrastructures — so much so that it is hard to imagine living without them. 

The third structure of feeling relates to the fears and dystopias associated with algorithms. This is where Ruckenstein applies the concept of the geography of fear. In Finland, geographer Hille Koskela studied the geography of fear in the 1990s. According to her, women’s urban space is more restricted than men’s because they are afraid to go to certain places, especially at night. Although, statistically, violence against women occurs more often in homes than in parks, dark alleys and thickets feel scary.

“The digital geography of fear operates in the same way. Fear has become an experience that defines technology relations. People fear privacy violations and hacking and think their phones may be listened to. At the level of the bigger picture, there are dystopias. Is the Internet going in a completely wrong direction? Will Russia and China soon be spying on everyone? These are fears we may not be able to verify,” Ruckenstein says.

“We cannot tell whether the dystopia is going to come true, but there is a structural experience many people recognise: a sense that all is not well in this world, that something is wrong. People often describe this feeling with the verb kuumottaa – heating up – a word that has no direct translation in English, as it is a local expression that refers to an emotional state or affect. What it indicates is an awareness of the negative effects of digital developments and the frustration that little can be done to improve the current situation.”

The digital geography of fear also includes situations where people decide not to say something because they are afraid of online hate speech and harassment, even if they have never experienced any themselves.

“People also fear for others, for example, that their aging parents click links and become victims of scams.”

Illustration: Hans Eiskonen

I wish the algorithm was more human!

In their research, Ruckenstein’s team has noticed that users often wish algorithms had more human qualities. 

“They want the machine to be so intuitive that it understands a person at exactly the right moment in exactly the right way. Active longing for more human traits is a new feature in our relationship with machines,” Ruckenstein says. 

“The algorithm works as a mirror that demonstrates how a human is different from a machine. Yet we wish the machine was more human-like and that our coexistence was effortless.” 

The marketing claims of AI developers always promise a little too much, and the media often exaggerate these promises.

“If Facebook, for example, says they have developed a method for emotion detection that allows advertising to target, say, young people in a particular type of vulnerable situation, we don’t have the means to find out if that is really the case.”

Data giants consider their algorithms trade secrets and are reluctant to reveal publicly how they operate. It is difficult for researchers to figure out what the algorithms of companies like Facebook and Google actually can and can’t do.

For example, in connection with the Cambridge Analytica scandal in 2018, there was a discussion about how the company used the data of millions of Facebook users to target political advertising. The data had been obtained by questionable means and used in the presidential campaigns of Ted Cruz and Donald Trump. Following the investigation of the case, large fines were imposed on Facebook in both the United States and the United Kingdom for violating users’ privacy and for the harmful use of data.

At times, news coverage on the topic promoted the idea that manipulating Facebook users is very simple. In reality, we still don’t know enough about who was affected and how much.

“It is possible that Cambridge Analytica helped mobilise some passive voters, and if the voters were in the swing states, that may have been enough to make a difference. Electoral influencing may be more effective in some places than in others,” Ruckenstein says.

“Of course, the Cambridge Analytica case is a scandal in any case. It made Facebook’s excessive data collection practices visible. We must not allow ourselves to get used to it and start to think that this is normal, but we should also not assume that data and technologies have a power over people that works in a straightforward manner.”

In 2012, Facebook conducted an experiment involving 689,003 users without their knowledge. Known as the emotion experiment, it used an algorithm that identified updates as negative or positive based on specific keywords. Facebook was testing what happens when negative or positive content is reduced in people’s news feed. 

The researchers published the results in 2014. According to them, manipulating the emotional content of the news feed changed users’ emotional state. The effect discovered in the study was small, but the media exaggerated it.
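
The published description of the experiment – posts classified as positive or negative by keywords, and a share of one kind withheld from the feed – can be reconstructed as a toy example. The word lists and the dropout probability below are invented for illustration; this is not Facebook’s code.

```python
# Toy reconstruction of keyword-based feed filtering; not Facebook's code.
import random

NEGATIVE_WORDS = {"sad", "awful", "hate", "lonely"}
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def classify(post: str) -> str:
    """Label a post by the emotional keywords it contains."""
    words = set(post.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def filtered_feed(posts, reduce="negative", drop_probability=0.5):
    """Return a news feed in which some posts of one emotional tone are withheld."""
    return [post for post in posts
            if classify(post) != reduce or random.random() > drop_probability]

feed = ["I love this wonderful weather",
        "Feeling sad and lonely today",
        "Meeting at 3 pm"]
print(filtered_feed(feed, reduce="negative"))
```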

This kind of experimentation happens all the time. Researchers refer to it as persuasive technology: people are nudged in different directions in various services using subtle changes in the user interface. The results are then examined to see which design has had the most impact.
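
In its simplest form this is an A/B test: users are silently split between interface variants, and the variant that keeps people in the service longest is rolled out. The sketch below shows the pattern only; the variant names and session figures are invented.

```python
# Minimal, invented sketch of the A/B-testing pattern behind persuasive design.
import hashlib
from statistics import mean

def assign_variant(user_id: str, variants=("infinite_scroll", "paged_feed")) -> str:
    """Deterministically bucket each user into one interface variant."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return variants[bucket % len(variants)]

print("user-1234 sees:", assign_variant("user-1234"))

# Invented session lengths (minutes) logged per variant.
sessions = {"infinite_scroll": [34, 41, 52, 48], "paged_feed": [22, 19, 27, 25]}

for variant, minutes in sessions.items():
    print(variant, "average session:", mean(minutes), "min")
# The design that captures the most time "wins" and is shown to everyone.
```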

Persuasive algorithms aim to get a person hooked on the service so that they spend as much time as possible in it. Netflix’s official Twitter account, for example, reported in 2017 that the company is competing against sleep. 

“Smart people who have studied at the best universities in the world push and shove people, and intentionally intensify the addictive logic of the digital. I don’t understand how they justify it to themselves,” Ruckenstein says. 

We are all guinea pigs for data giants

Ruckenstein’s work on algorithms and emotions shows that users of social media and the Internet observe and feel algorithm-related emotions privately, but this has yet to turn into a shared culture. A collective approach could lead us to move beyond complaining about our own use and to start demanding something better from the companies involved.

As data giants automate our everyday lives, we become more and more dependent on their services. We use systems that are in a constant state of experimentation – permanently beta, as technology companies say. Yet we lack a public debate about what the new algorithmic world order means and about the terms of our involvement. 

“We live in a kind of global living lab, where our reactions to new technologies are constantly being monitored. Every time we learn something, something new is placed in front of our noses,” Ruckenstein says.

The dating culture, for example, has gradually changed. So has our consumption of music, films and television series. And our communication with friends and acquaintances. The way we work. The way we buy things.

“We give ourselves over to various algorithmic relationships and don’t understand until afterwards what we have got into,” Ruckenstein says.

Longer-term social changes are difficult to detect. Soon we will have forgotten how things used to be done.

“There we are, wondering whether it was a good idea to buy a genetic test from some American company or go on Tinder and so on.”

People talk a great deal about their own social media engagement. A very common topic of discussion is limiting one’s use of social media. Ironically, it is also a subject often discussed on social media. People agonise over how much time they spend using the services. Excessive use of social media causes a hangover-like feeling. 

“People also talk about how showing only the shiny parts of their lives on social media is starting to shrink their existence. Then they give up social media altogether for a while, rebalance their lives and go back.”

Ruckenstein believes that the ongoing discussion about self-regulation among social media users is inevitable because the environment created by the data giants is based on the premise that nothing should be regulated. They constantly compete for people’s attention, which is why they penetrate our lives with as much energy as they can.

“This also creates a new type of social inequality. People who are capable of self-regulation have other things to focus on in their lives too, which is why they are able to self-regulate. But if there isn’t much else to focus on, why would you leave social media?”

Ruckenstein teaches a course at the University of Helsinki called Emotions and Technologies in Consumer Society. Her students think of ways they themselves could better operate in the environments of persuasive technology. That’s when Ruckenstein asks them, what if it’s not you? 

“It’s an environment that does everything it can to keep you there. It’s too much to ask, especially of children and young people, to regulate it themselves,” Ruckenstein says.

“We need much more debate about what is socially acceptable. Are we okay with companies competing against our sleep?”

Minna Ruckenstein is an associate professor at the University of Helsinki’s Centre for Consumer Society Research and the head of the Algorithmic Culture project (2019–2023), funded by Kone Foundation. The book Everyday Automation: Experiencing and Anticipating Emerging Technologies, which Ruckenstein has co-edited, was published by Routledge in May 2022 and is also available as an open access publication: https://doi.org/10.4324/9781003170884

Writer:
Johanna Vehkoo is a journalist and non-fiction author who heads Long Play’s algorithm newsroom project.
