The recent revelations surrounding the manipulation of Facebook news feeds to affect the emotional state of users should not be understood as an isolated social experiment, but rather as one point within a far broader setting in which algorithms are powerful forces in everyday life.
The recent reports detailing Facebook's use of news feeds to control information exposure should perhaps come as little surprise. Yet, understandably, the idea that these news feeds have been used in an attempt to manipulate emotions has caused concern. The reports suggest an experiment in which personalised news feeds were adjusted to prioritise positive or negative news about Facebook friends, with the apparent intention of affecting the emotional state of those receiving them. It is entirely fitting that some commentators have pointed to the worrying possibilities of political misuse opened up by such systems.
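The mechanism described in the reports – reweighting a feed so that posts of one emotional tone surface more often – can be illustrated with a deliberately simplified sketch. The tone labels, scores and boost factor below are invented for illustration; this is not Facebook's actual ranking logic:

```python
def reweight_by_tone(posts, favoured_tone, boost=2.0):
    """Reorder posts so that items of the favoured tone rank higher.

    Each post is a dict with a base 'score' and a 'tone' label.
    The boost factor is a hypothetical stand-in for however a real
    system might privilege one kind of content over another.
    """
    def adjusted(post):
        factor = boost if post["tone"] == favoured_tone else 1.0
        return post["score"] * factor
    return sorted(posts, key=adjusted, reverse=True)

posts = [
    {"text": "Great day!",   "tone": "positive", "score": 0.4},
    {"text": "Feeling low.", "tone": "negative", "score": 0.6},
    {"text": "New job!",     "tone": "positive", "score": 0.5},
]

# Favouring positive content pushes it above a higher-scoring negative post.
ordered = reweight_by_tone(posts, "positive")
print([p["tone"] for p in ordered])  # → ['positive', 'positive', 'negative']
```

The point of the toy example is simply that a single weighting parameter, invisible to the user, is enough to change the emotional composition of what they see.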
But this is not a point of concern solely because of the type of social experimentation it deploys. What it represents is an instance in which the broader power of algorithms becomes momentarily visible. In this glimpse, we can see that software algorithms are powerful forces actively shaping our lives. New forms of data, in combination with algorithms – the decision-making parts of code – already have significant consequences for how we live and how we relate to the social world. There is already a vast and complex politics of data circulations making up the fabric of the social world. Algorithms manipulate and shape these circulations of data, deciding what becomes visible and what does not.
Rather than being an exception, the Facebook news feed revelations might actually be understood as just one revealing example of the embedded part that algorithms now play in our lives. The power of algorithms is to be found in the way that they sort, filter and manipulate the things that we encounter. This is not new. It is part of an established set of media infrastructures in which we now live – the power of which has been escalating over recent years with the incorporation of mobile devices into our bodily practices, with new types of mediated consumption and streaming, and with the general rise of data accumulation and extraction. If we pause to reflect, we can begin to imagine the scale of influence that algorithms now have upon our lives. Algorithms define what is visible to us. The result is that they have the power to shape our tastes, to reconfigure our interests and potentially to define how we understand and engage with the world around us.
In a social world that is saturated with data, the filtering of data is by no means inconsequential. Algorithms, in short, can shape what we discover, what we encounter, and what we see and hear. This is visibility through prioritisation. The media we use bring certain elements to our attention whilst drawing our gaze away from others. The reports about Facebook’s news feed algorithms may have caused consternation, but we should also consider these developments as an embedded part of our lives. What we discover about the people in our social networks, what we read, what we listen to, what we watch, who we follow – these all have the potential to be shaped by algorithmic processes.
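Visibility through prioritisation can itself be sketched as a toy scoring function: items are scored, sorted, and only the top few become visible, while the rest simply never appear. The signals and weights here are hypothetical, chosen only to make the mechanism concrete:

```python
def rank_feed(items, top_n=3):
    """Sort candidate items by a simple relevance score and keep top_n.

    The signals (closeness, engagement, recency) and their weights are
    invented for illustration; real feed-ranking systems combine many
    more signals in ways that are not publicly documented.
    """
    def score(item):
        return (2.0 * item["closeness"]      # how close the friend is
                + 1.0 * item["engagement"]   # likes, comments, shares
                - 0.5 * item["age_hours"])   # older posts fade
    return sorted(items, key=score, reverse=True)[:top_n]

candidates = [
    {"id": "a", "closeness": 0.9, "engagement": 0.2, "age_hours": 1},
    {"id": "b", "closeness": 0.1, "engagement": 0.9, "age_hours": 2},
    {"id": "c", "closeness": 0.5, "engagement": 0.5, "age_hours": 20},
    {"id": "d", "closeness": 0.3, "engagement": 0.1, "age_hours": 5},
]

visible = rank_feed(candidates)
print([item["id"] for item in visible])  # → ['a', 'b', 'd']
```

Item "c" is never shown to the user: not deleted, not flagged, simply ranked out of view. That quiet exclusion, multiplied across millions of feeds, is what the prioritisation of visibility amounts to.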
The embedded power of algorithms is an underlying presence with vast consequences. Given its splintered and complex nature, we know very little about this lively environment, nor do we know much about how these algorithms are designed or with what consequences. But what emerges from the Facebook news feed story is the critical point that these algorithmic media forms are not neutral. It is not only when they are explicitly used to manipulate emotions that they have consequences. These algorithms are always filtering and sorting, and as such they are making decisions about what is visible. These are active systems that shape our encounters and our everyday experiences. In many instances they are largely invisible within the technical structures of which they are a part. Each algorithmic process might look inconsequential: a recommendation of a TV show here, a suggestion of who to follow there. But taken collectively, we can see their potential power.
Facebook’s use of filtering algorithms to shape news feeds offers a moment in which the presence of algorithms is illuminated. We are already living within media infrastructures that are defined by such algorithmic processes. The central thrust of the rhetoric around these systems will be that they are necessary for us to negotiate all the information that is out there. The suggestion will be that, for our convenience, we need some algorithmic assistance: predictions, recommendations and filtering. We need to think about the power of algorithms, and their consequences, in this context. And thinking through such questions requires an understanding of the established part that algorithmic processes already play in our lives.