You only like (me) when you're happy
This weekend’s news story about Facebook’s mass psychological experiment on ‘emotional contagion’ left me in something of an emotional flux.
There has been a lot of shock and disgust at the revelation: outrage reported in the press, the hashtag #Facebookexperiment taking off, and, closer to home (and perhaps to you), some of my Facebook friends vowing to stop using the site until the T&Cs are changed. There has also been criticism of the research findings themselves – not of the ethics of the approach, but of the research tools used: the data-analytics tools were not appropriate for the analysis required.
I’m interested for two reasons: persuasion and shock.
The first is how technologies can be used for persuasion. From well before the ‘Nudge Unit’ was set up, at a time when BJ Fogg was defining the term ‘Captology’ (the study of computers as persuasive technologies), I have been interested in how we can design digital tools to persuade. Much has been written about the ethics of persuasion and how it differs from manipulation, but in brief, this is about designing tools that help people act in ways they say they want to but, for whatever reason, struggle to enact. When someone wants to give up smoking but needs prompting and support, that’s where persuasive technologies can be useful. The same potentially applies to moderating alcohol consumption, saving money, building new habits of healthy physical activity, or recovering from illness. We can design tools that support people’s ambitions even when their short-term behaviours work against them.
But beyond the ethics of the approach and whether the T&Cs cover the activity sufficiently, it is the surprise that has really caught my attention.
Our Digital Makers work is founded on the recognition that we need to understand how digital tools are made and how they influence and mediate the way we interact. Whilst Google provides a gateway to incredible amounts of information, it does so in a way that determines what information is relevant to your needs. Not all information is treated equally: some, as determined by Google’s algorithms, is deemed more relevant than the rest.
Similarly, Facebook doesn’t just provide a way of connecting with your friends and seeing what they’re up to; it does so in a particular way, determined by Facebook’s design: what counts as important and relevant; which of your friends’ status updates prompt a notification while others’ don’t; that you can like, but not dislike... The design of the technology and the algorithms at play each influence the way the tool can be used. When the interaction is really personal, such as communicating with a friend, it feels intrusive once it becomes clear that the communication is being mediated. This research experiment makes that mediation explicit, but it is happening across all digital tools, whether for advertising, for research purposes or, more commonly, for usability – trying to give you the best possible experience.
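To make that idea of algorithmic mediation concrete, here is a purely illustrative toy sketch of a ranked feed – the scoring weights, field names and signals are all invented for illustration, and bear no relation to Facebook’s actual (proprietary) ranking:

```python
# Toy feed-ranking sketch. All weights and fields here are hypothetical,
# chosen only to illustrate that a design decides what counts as "relevant".

def score(post, viewer):
    """Assign a made-up relevance score to a post for a given viewer."""
    s = 2.0 * post["likes"]                      # popularity signal
    if post["author"] in viewer["close_friends"]:
        s += 5.0                                 # relationship signal
    if post["topic"] in viewer["topics"]:
        s += 3.0                                 # topical-interest signal
    return s

def rank_feed(posts, viewer, limit=10):
    """Return only the top-scoring posts; everything else is silently hidden."""
    ranked = sorted(posts, key=lambda p: score(p, viewer), reverse=True)
    return ranked[:limit]

posts = [
    {"author": "alice", "likes": 3, "topic": "music"},
    {"author": "bob",   "likes": 0, "topic": "politics"},
    {"author": "carol", "likes": 1, "topic": "music"},
]
viewer = {"close_friends": {"alice"}, "topics": {"music"}}

feed = rank_feed(posts, viewer, limit=2)
print([p["author"] for p in feed])  # → ['alice', 'carol']
# Note that bob's update never reaches the viewer at all:
# the design, not the viewer, decided it wasn't relevant.
```

Even in this deliberately crude sketch, a handful of arbitrary weights determine whose updates you see – which is exactly the kind of invisible mediation the experiment has made visible.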
Jim Sheridan, a member of the media select committee, was quoted in this Guardian article as describing the experiment as intrusive:
“They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in political or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it.”
The article continues: ‘But other commentators voiced fears that the process could be used for political purposes in the run-up to elections.’
The mystery behind the mediation
Digital architecture influences us as much as physical architecture pushes us to cross the road in particular places or to queue in certain ways. The physical architecture is obvious to see; the digital far less so. The 1992 headline ‘It’s The Sun Wot Won It’ is now infamous for showing how the press influences voters and politicians. As Aleks Krotoski points out in The Personal (computer) is Political, we know the explicit political affiliations of the mainstream press and perhaps even subscribe to particular papers because of them. Yet the political affiliations behind Facebook’s or Google’s algorithms aren’t well understood. This mystery behind the mediation is why there has been such genuine surprise at the experiment; the more we can support people to understand how technologies are made and how they work, the sooner we can use them knowledgeably and effectively.
The level of surprise at the #facebookexperiment demonstrates how important #digitalmaking is, and how urgently we need to understand how technologies are made in order to use them openly and effectively.
I wonder if this story will persuade people to get involved in digital making...