Robert Gehl received a PhD in Cultural Studies from George Mason University in 2010. He is currently an associate professor in the Department of Communication at the University of Utah. His research draws on science and technology studies, software studies, and critical/cultural studies and focuses on the intersections between technology, subjectivity, and practice. He has published critical research exploring corporate and alternative social media, knowledge management, crowdsourcing, media theory, and the Dark Web. This work appears in journals such as New Media and Society, Communication Theory, Social Text, Fibreculture, Television and New Media, European Journal of Cultural Studies, and the Canadian Journal of Communication. His first book, Reverse Engineering Social Media (Temple UP, 2014), explores the architecture and political economy of social media and won the Association of Internet Researchers Nancy Baym Book Award. His second book, Weaving the Dark Web, was released by MIT Press in the fall of 2018.
Facebook Algorithms and Alternative Social Media
There's a saying people use to discuss online services: if it's free, then you’re the product. As Karl Hodge notes in The Conversation, an exchange between Mark Zuckerberg and Utah Senator Orrin Hatch illuminates this saying quite well. Hatch asked, "How do you sustain a business model in which users don’t pay for your service?" Zuckerberg replied, "Senator, we run ads."
Like so many of his public statements, Zuckerberg’s response is accurate, if not particularly illuminating. Magazines run ads. Radio stations run ads. TV runs ads. Facebook runs ads, too, but it's different: its ads are far more targeted, far more invasive. Its ads are based on our own expressions, desires, and ideas, all of which are sold back to us.
The relationship between advertising and our sociality – our connections to our friends, family, and colleagues – is precisely what I became interested in while writing my first book, Reverse Engineering Social Media. As I suggest in that book, Facebook and other social media are almost direct outgrowths of the late 1990s online advertising industry, which created the concept of surveillance capitalism: monitor what people do online, then sell them things based on their activities.
At the heart of this relationship between advertising and sociality is the sorting, funneling, channeling, and above all modulation of what we see in Facebook. I’m talking, of course, about Facebook’s algorithms. As The New York Times reports,
"Facebook’s ad system provides ways to target geographic locations, personal interests, characteristics and behavior, including activity on other internet services and even in physical stores. Advertisers can target people based on their political affiliation; how likely they are to engage with political content; whether they like to jog, hike or hunt; what kind of beer they like; and so on.
If advertisers provide a list of email addresses, Facebook can try to target the people those addresses belong to. It can also do what is called 'look-alike matching.' In this case, Facebook’s algorithms serve ads to people believed to be similar to the people those addresses belong to."
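To make the idea of "look-alike matching" concrete, here is a minimal sketch of how such a system could work in principle. This is not Facebook's actual implementation; the feature names, user names, and similarity measure (cosine similarity over interest scores) are all illustrative assumptions.

```python
import math

# Hypothetical interest scores per user (NOT Facebook's real features):
# each vector is [jogging, hiking, craft beer, political content].
users = {
    "ana":   [0.9, 0.8, 0.1, 0.2],
    "ben":   [0.8, 0.9, 0.2, 0.1],
    "carla": [0.1, 0.0, 0.9, 0.8],
}

# Profiles matched from an advertiser's uploaded email list (assumed data).
seed_audience = [[1.0, 0.9, 0.0, 0.1]]

def cosine(a, b):
    """Cosine similarity: 1.0 means identical interest direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Average the seed profiles into a single "typical customer" vector,
# then rank every other user by how closely they resemble it.
centroid = [sum(col) / len(seed_audience) for col in zip(*seed_audience)]
lookalikes = sorted(users, key=lambda u: cosine(users[u], centroid), reverse=True)

print(lookalikes)  # users ranked most-similar-first: ['ana', 'ben', 'carla']
```

The point of the sketch is the asymmetry it illustrates: the advertiser supplies only a customer list, and it is the platform's behavioral data on everyone else that does the matching.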
The goal of such targeted ads is to fit in with the non-advertising content in Facebook. That is, an ad should look like it belongs alongside your grandma's latest pictures and your colleague's note about the upcoming Christmas party.
It is not fair to say that Facebook’s algorithms are totally subservient to the needs of advertisers and marketers. More precisely, the two sides are engaged in constant negotiation. As Taina Bucher writes in a New Media and Society article,
"There is now a whole industry being built around so-called 'News Feed Optimizatio’ akin to the more established variant, search engine optimization. Marketers, media strategists, PR firms all have advice on how to boost a brand's visibility on Facebook."
That is, while Facebook wants to serve advertisers by selling our attention to them, it also must maintain our perception that it is giving us access to our friends, family, and interests.
Given that Facebook is driven almost entirely by the needs of marketers, what is to be done? Much of my scholarship has explored this question. My answer is: support the alternatives. If you’re worried about Facebook's desire to know everything about you, consider leaving Facebook for non-profit, open source systems, such as Mastodon, diaspora*, or Twister (check out the Omeka Archive on the S-Map site for more). These systems often do two things that differ greatly from Facebook: they don’t sell your data to marketers, and they don’t shape the content you see with algorithms. As part of this conversation, I am happy to talk more about the alternatives, as well as their relationship to Facebook and its internal algorithms.
Transcript of Rob Gehl's Twitter conversation
Robert Gehl's statement is also available on The Social Media Alternatives Project (S-MAP) website at https://www.socialmediaalternatives.org/?p=165
When I was researching and writing the dissertation that would eventually become Reverse Engineering Social Media (this was around 2008, 2009), a friend of mine asked me, "What’s the harm in using Facebook? Who cares if they are gathering data on us?" At that time, I suggested that the danger was that Facebook could modulate a lot of how we see the world and how we understand ourselves. I don't think the answer satisfied my friend. And, indeed, I spent the next decade despairing as I heard people repeat my friend’s questions. What’s the harm? Who cares?
Well, then we had 2018 and the "techlash." After participating in the Social Media Narratives discussion with some smart students, and in my discussions with people these days, I feel a bit more vindicated. A lot of people are questioning why Facebook should do what it does with seemingly no consequence. And they are starting to look at alternatives. As the discussion we had (on Twitter, another corporate social media system) illustrates, the students are not naive about moving from Facebook to alternatives. There are serious questions for any alternative to Facebook: questions about hate speech, questions about identity and trust. Even as I argue that we need to leave corporate social media behind for the alternatives, I also celebrate these sorts of hard, critical questions people are asking of the alternatives.
2018 SAIC ATS Panel Participants
Gary O. Larson
Kathi Inman Berens
SAIC ATS Class in Social Media Narratives:
Produced by SAIC ATS Part-time Faculty Judy Malloy: Introduction to the Panel