
Could populism be a side effect of the Personalized Algorithm?

And can we work together so that one day we can use filter bubbles against themselves?

Claudio Agosti: "During the French elections, our users (a, b, c, d) were following the same news media sources and accessed Facebook at the same times each day, once per hour. Despite all having 0 friends and differing only in their 'likes', Facebook displayed different posts to each of these users in that hour." All rights reserved.

In a democracy, the way information circulates is a power in itself. The landscape of the so-called ‘fourth estate’ has metamorphosed out of all recognition in the last 15 years. Today, social networks are the technology through which we share our perceptions of the public space.

Given the addict-like compulsion with which we have adopted these technologies, we have granted the advertising firms that profit from them (advertising is the business they are in) a de facto role they do not deserve.

Over recent decades, it has become clear that the information generated by your thousands of friends amounts to noise if you don't want to listen to all of them. The platforms have developed algorithms to replace this unorganized information flow, keeping us satisfied by prioritizing the content most likely to make us interact, engage, or find what we were looking for.

These communication platforms intermediate both public and private exchanges. But whatever their superiority to other communication networks such as the telephone or the television, they are far less transparent about the rules that govern our conversations.

Transparency is essential when it comes to the way that the public discourse gets shaped. This is what motivates us at Tracking Exposed: we want to make the Personalization Algorithm – the algorithm that learns from the user and provides personalized content – transparent and accountable.

Imagine yourself as a Facebook user: among your sources you follow a certain Nobel prizewinner, and they publish a post. The complex, multi-paragraph text full of articulate concepts goes up on Facebook, and a few minutes later you access Facebook.

Something happens behind the curtain: a “post competition”. Facebook has assigned a numeric value to every post you are eligible to see, (probably) proportional to the likelihood that you will interact with it. The posts then show up in that order: in first position the one you are most likely to interact with, and so on and so forth. Whatever wins that competition appears on your timeline.
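
As a minimal sketch of this kind of "post competition", assuming a single per-post engagement score; the field names and numbers below are illustrative, not Facebook's actual model:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    summary: str
    predicted_engagement: float  # assumed score: how likely you are to interact

def rank_timeline(eligible: list[Post]) -> list[Post]:
    """Order the posts you are eligible to see by their score, highest first."""
    return sorted(eligible, key=lambda p: p.predicted_engagement, reverse=True)

eligible = [
    Post("Nobel prizewinner", "Long, articulate multi-paragraph essay", 0.12),
    Post("A friend", "Photo of a puppy", 0.87),
]
for position, post in enumerate(rank_timeline(eligible), start=1):
    print(f"#{position}: {post.author} (score {post.predicted_engagement})")
```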

Coming back to our Nobel prizewinner's text: it has entered a competition with a more uncomplicated, swifter gratification: a lovely landscape perhaps, or a puppy. But you have never told the algorithm that you prefer puppies to the Nobel prize – the algorithm has simply assumed this is the case from your previous interactions.

Is this in our best interest? Is it censorship? Do we have any control over it? These are questions even the mainstream media are pondering, because most of their visits (and consequently their income) depend on what Facebook promotes or penalizes.

The most credible and official way out of this conundrum would be proper policy. But that is not happening so far, and so we are still in the field of civic activism.

Our goal

We know that the real goods in social networks lie in the users’ data: a veritable goldmine. Our goal is to make that goldmine accessible to everyone (not in a totally unregulated way, but in a privacy-preserving one). If, in this millennium, power lies in the data, then we must aim to re-possess the data, own our gold, keep it under our control, and reserve our thanks for those who display openly how social media works, why it manipulates our perceptions, and how it does so.

That's how our techno-activist venture begins. It starts with that theory and becomes, over time, a browser extension that records the posts showing up in your Facebook Newsfeed. In the beginning, it is just a personal copy of the public posts, structured and re-usable under certain conditions.
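
To make the idea concrete, here is a hypothetical example of what one such structured record could look like; the field names are my illustration, not the extension's actual schema:

```python
import json
from datetime import datetime, timezone

# A hypothetical record of one public post observed in the Newsfeed.
observed_post = {
    "observed_at": datetime.now(timezone.utc).isoformat(),  # when the user saw it
    "position_in_feed": 3,            # where Facebook placed it
    "source": "Example News Page",    # the page or profile that published it
    "post_permalink": "https://www.facebook.com/example/posts/123",
    "post_published_at": "2017-10-30T09:00:00+00:00",
    "nature": "photo",                # text, photo, video, shared link...
}

print(json.dumps(observed_post, indent=2))
```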

Our mission is to display and highlight how this business logic distorts and plays with our time, our expectations and, even more fundamentally, with our perception of reality.

So far, this might look a little like an academic project: collecting what the user saw, detecting which content has been promoted out of chronological order. But that's not the goal!
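
One way such promotion could be detected, sketched under the assumption that we only know each post's publication time and the position it was displayed at (values below are invented):

```python
from datetime import datetime

# One observed session: displayed order vs. publication time (invented data).
observed = [
    {"permalink": "post-a", "displayed": 1, "published": datetime(2017, 10, 30, 8, 0)},
    {"permalink": "post-b", "displayed": 2, "published": datetime(2017, 10, 30, 11, 0)},
    {"permalink": "post-c", "displayed": 3, "published": datetime(2017, 10, 30, 10, 0)},
]

# Rank each post as it would appear in a purely reverse-chronological feed.
by_recency = sorted(observed, key=lambda p: p["published"], reverse=True)
chrono_rank = {p["permalink"]: i + 1 for i, p in enumerate(by_recency)}

for p in observed:
    boost = chrono_rank[p["permalink"]] - p["displayed"]
    # boost > 0 means the post was shown higher than its age alone would justify
    print(f'{p["permalink"]}: displayed #{p["displayed"]}, '
          f'chronological #{chrono_rank[p["permalink"]]}, boost {boost:+d}')
```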

Researchers, journalists, analysts, and policymakers are among our target users, but we aim for a far broader audience.

Algorithms have a collective impact, and can only be addressed collectively. If we can make the problem accessible to users with average skills, our activism will have achieved its goal.

We need lots of users: not a ‘critical mass’ like Facebook itself, but enough to observe the features of the so-called ‘walled garden’. To get this kind of user base, we have to offer them some kind of functionality, and we are currently researching the socially helpful kinds of functionality Facebook doesn't (and won’t?) provide.

The first and most basic exploration this puts in the user's hands is "look at your information diet." No tool in this world can enable you to pop the filter bubble you are in, and anyway, you really don't need a technical solution to fix what another technology has mistakenly forced on you.

But seeing your information diet allows you to see what is informing you, which topics you are exposed to, and where you stand in the world.
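
A minimal sketch of such an "information diet" view, assuming the extension has already stored which source published each post you saw (data invented):

```python
from collections import Counter

# Sources of the posts one user actually saw over a week (invented data).
seen_sources = [
    "Example News Page", "Example News Page", "Local Sports Club",
    "Example News Page", "Puppy Pictures Daily", "Puppy Pictures Daily",
]

diet = Counter(seen_sources)
total = sum(diet.values())
for source, count in diet.most_common():
    print(f"{source}: {count / total:.0%} of your feed")
```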

If we can't be on a network without a filter, at least we should know the nature of that filter and, some day in the future, have enough power to determine our own algorithm.

There is a secondary functionality we can offer, too. If you have ever used the legacy mass media, you know that they have an agenda. The editor has one, and it subtly influences the observations contained in the stories as reported.

With Facebook, the algorithm is our personalized editor. And again, the best way to understand such technocratic subterfuges is to compare your information diet with that of another person you know. It is like watching something together and exchanging comments. It is a way to get some human feedback from a place, the social network, where gamification usually keeps us on a dopamine high.
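
For instance, comparing two people's diets could be as simple as seeing which sources appear in one feed but not the other; the names and counts below are invented for illustration:

```python
from collections import Counter

# What two users saw on the same topic, grouped by source (invented data).
diet_alice = Counter({"Example News Page": 12, "Puppy Pictures Daily": 3})
diet_bob = Counter({"Example News Page": 2, "Migration Stories Blog": 9})

shared = set(diet_alice) & set(diet_bob)
only_alice = set(diet_alice) - set(diet_bob)
only_bob = set(diet_bob) - set(diet_alice)

print("sources you both see:", sorted(shared))
print("only in Alice's bubble:", sorted(only_alice))
print("only in Bob's bubble:", sorted(only_bob))
```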

The last functionality, which will not be ready for some time, is to use the filter bubbles against themselves.

We can assume that populist waves feed on audience fragmentation and that, perhaps, filtered interactions which confirm people's existing positions have been complicit in this fragmentation.

This fragmented condition has made it even more difficult to relate to problems far removed from our reality. How do you understand the problem of a migrant, for example, if you have never been away from home and you don't spend time directly in their company?

Can we use their filter bubble to understand their world? Can we navigate a cluster of aggregated information, perhaps, to register the different points of view, the comments, the feelings about any specific topic?

Developing this functionality is a complex exercise because it immediately imposes two barriers on us before we start. The first is that "a navigator cannot be allowed access to the original post". As individuals, we know that information shared in one context is often inappropriate to share in another. That’s why we have different disclosure expectations for the same bit of information with respect to different situations or people. Our ability to keep track of the flow of information we send and receive across different contexts is what makes privacy valuable for our individual autonomy.

Even if we are working only with publicly posted content, and the author has willingly consented to going public with it, we must not display such content outside the expected channels and the original purpose of its consumption. The content must at least be anonymized and minimized before it can be served up to others. This “privacy-by-design” commitment and technology reduces content misuse and privacy loss. To be more precise, I want to protect the users involved from what is called “social media intelligence”.
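
A sketch of what such minimization could mean in practice; which fields are kept, dropped, or coarsened is my assumption, not the project's actual pipeline:

```python
import hashlib

def minimize(observed_post: dict, salt: str) -> dict:
    """Keep only what a navigator needs; drop authorship and direct links."""
    return {
        # a salted hash lets analysts count and deduplicate posts
        # without being able to reach the original content
        "post_ref": hashlib.sha256(
            (salt + observed_post["post_permalink"]).encode()
        ).hexdigest(),
        "nature": observed_post["nature"],
        "topic": observed_post.get("topic", "unknown"),
        "observed_month": observed_post["observed_at"][:7],  # coarsened to year-month
    }

raw = {
    "post_permalink": "https://www.facebook.com/example/posts/123",
    "source": "Example News Page",          # dropped by minimize()
    "nature": "photo",
    "topic": "migration",
    "observed_at": "2017-10-30T10:00:00+00:00",
}
print(minimize(raw, salt="rotate-me-regularly"))
```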

Secondly, we are proud to be developing "stupid tools for smart users", which empower critical judgment rather than extending the influence of the algorithm or its capacity for surveillance. These are robust, simple tools which do not try to give you answers, just a record of your own usage.

This whole system, which empowers researchers by giving them data to work on and users the highlights that inform them, is a collaborative project. More supporters can only mean bigger benefits for all those who join us later on. At the moment, a user can only support us by using the browser extension for Firefox or Chrome; to know more, look at facebook.tracking.exposed.

About the author

Claudio Agosti is a self-taught hacker who, since the late 1990s, has gradually become a techno-political activist as technology has begun to interfere with society and human rights. Over the last decades he has worked on whistleblower protection with GlobaLeaks, advocated against corporate surveillance, and founded "facebook tracking exposed." He is a research associate at UvA in the DATACTIVE team, vice-president of the Hermes Center, and a 2017 OTF fellow with CodingRights.

Read On

Claudio Agosti will make a presentation at WFD 2017's Lab 7 on 'Bursting social media echo chambers' on 9 November 2017 - 14.30-16.30 - Room 5.

More On

openDemocracy will be at this year's World Forum for Democracy, exploring the impact of populism on our media, political parties and democracy (see the programme for more details).


