July 8, 2014

Facebook and Manipulation

I thought I would add my small voice to the large chorus of voices surrounding the recent Facebook study on mood manipulation.

Let me begin by saying that it's sort of fun to see social science getting so much attention! :) This is broadly my area of research (computational social science and online communities), and it's similar to the kind of research I have done and would like to do in the future.

So, naturally, I think that this sort of research is OK to do. I'll set aside the question of informed consent for now (although it's certainly important), and give a few reasons why I think this research is important and should be done.

  1. Websites provide a great way to study the understudied. For years, the main source of our knowledge about human psychology has been college undergraduates. Facebook users provide a much, much broader group of participants (although certainly still skewed toward Westerners).
    Using these sites as a context for research should give us a more representative understanding of the world (and therefore a better basis for the policies we build on that research).
  2. Critics of this research underestimate our autonomy. The press coverage has acted as though Facebook's experiment was causing suicides and creating chaos around the world. This echoes the "magic bullet" theory of communication: some early Communication scholars thought that messages from the media could directly and uniformly influence people.

    In reality, messages from the media (or our friends) combine in very complicated ways, and we choose how to respond and react. Indeed, the effects in the Facebook study were minuscule - people were barely, barely more likely to post positive things after seeing more positive things, and barely, barely more likely to post negative things after seeing more negative things. (A short numerical sketch at the end of this list shows how an effect can be that small and still register as statistically significant.)
  3. This leads to my next point. There are opportunities for real insight. My wife has been incredibly supportive of my career, but she studied genetics and doesn't appreciate the social sciences as much as I do. She often says that the results of social science are obvious. I argue that the results seem obvious in retrospect, but that the opposite result would have seemed just as obvious.

    For example, one of the arguments against this study is that Facebook was intentionally making people sad by showing them more negative posts. However, previous research had actually suggested that seeing lots of positive posts made people sadder - they would see all the great things their friends were doing, and become depressed by comparison. The Facebook study showed that this isn't true - an important finding.
  4. This is related to my main point. We are spending more and more of our time on these types of sites, and we need to understand them. Facebook and other sites have every right (IMO) to run these sorts of experiments in order to improve the experience of users. We can either encourage them to publish their findings and make their insights available to the public, or we can criticize their research and encourage them to keep it private - manipulating our behavior without our knowledge.

    I prefer the former.
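
A quick aside on point 2: with hundreds of thousands of participants, even a minuscule effect can register as statistically significant. Here's a minimal sketch in Python - the sample size and effect size below are hypothetical numbers chosen only to illustrate the point, not the study's actual data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical numbers: two groups of 350,000 users each, where the
    # treatment shifts mean post "positivity" by a tiny amount (d = 0.02).
    n = 350_000
    control = rng.normal(loc=0.00, scale=1.0, size=n)
    treated = rng.normal(loc=0.02, scale=1.0, size=n)

    t, p = stats.ttest_ind(treated, control)
    print(f"t = {t:.1f}, p = {p:.1g}")  # comfortably "significant"...
    # ...yet the shift is 2% of a standard deviation - imperceptible
    # for any individual user.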

January 18, 2014

Timebombing and Algorithms: Our co-created culture

I've been thinking a lot lately about how the creation of culture is no longer solely influenced by norms, but is co-created by algorithms.

"Algorithm" has a technical definition - roughly, a series of instructions that produces a given set of outputs from a given set of inputs - but I'm focusing on a narrower sense: code that affects the way users interact with information. For example, my working definition would include the algorithm that ranks Google's search results, but not an algorithm for determining whether a triangle is obtuse.
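
For concreteness, here is the triangle example as code - an algorithm in the textbook sense, but outside my working definition, since it doesn't mediate anyone's view of information:

    # An algorithm in the textbook sense: inputs (side lengths) in,
    # output (a boolean) out. It shapes no one's view of information,
    # so it falls outside my working definition.
    def is_obtuse(a: float, b: float, c: float) -> bool:
        a, b, c = sorted((a, b, c))   # make c the longest side
        return a * a + b * b < c * c  # obtuse iff a^2 + b^2 < c^2

    print(is_obtuse(3, 4, 6))  # True
    print(is_obtuse(3, 4, 5))  # False (right triangle)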

In his wonderful book Code 2.0, Lawrence Lessig describes how norms, law, architecture (which includes code), and the market all regulate/influence behavior.

My argument is related to Lessig's, and partially addressed by him: namely, that in online spaces, norms and algorithms co-evolve in really interesting ways.

Algorithms are becoming more and more complex. To deal with the huge amount of information being created online, we need filters, and those filters come from algorithms. For the most part, these algorithms are completely invisible to the user, and in many cases they are incredibly complex. For example, Google uses more than 200 signals to rank search results, and Facebook's algorithm uses a number of features to determine how close you are to each of your "friends", and which of their updates should appear.
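
As a toy illustration of what a "signal" means here (the signals and weights below are invented for the example - they are not Facebook's or Google's actual features):

    # A toy relevance score: a weighted combination of signals.
    # Real systems combine hundreds of signals; these three are invented.
    WEIGHTS = {"affinity": 0.5, "popularity": 0.3, "recency": 0.2}

    def score(post):
        signals = {
            "affinity": min(post["interactions_with_author"] / 100, 1.0),
            "popularity": post["likes"] / (post["likes"] + 50),
            "recency": 1.0 / (1.0 + post["hours_old"]),
        }
        return sum(WEIGHTS[name] * value for name, value in signals.items())

    print(score({"interactions_with_author": 40, "likes": 10, "hours_old": 2}))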

The invisibility of these algorithms has its own, potentially serious, problems. But there is another problem: algorithms are rigid and well-defined. This leaves room for people to "game the system" - that is, to interact in ways that developers either didn't anticipate or find difficult to identify and stop.

Cultural norms, on the other hand, are "fuzzy". They are heuristics, which we must decide how to apply in ambiguous situations. For example, there is a norm of reciprocity: if someone does something nice for you, you are expected to do something nice in return. Exactly how to apply this varies greatly with the situation and the relationship between the people involved.

My working theory is that norms sort of fill in the gaps where algorithms are either too rigid, or not well-defined enough, but that norms that can be well defined will eventually be codified as algorithms.

As an example, Facebook's news feed algorithm surfaces posts that are popular and/or recent - and its measure of recency includes not just the original post, but any activity on the post.
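
In other words - and this is my outside guess at the mechanism, not Facebook's actual code - the feed appears to rank on the time of the last activity on a post, rather than the time the post was created:

    from datetime import datetime, timedelta

    def feed_recency(post):
        # My guess at the mechanism, not Facebook's actual code:
        # recency is the latest activity, not the creation time.
        events = [post["created_at"]] + [c["at"] for c in post["comments"]]
        return max(events)

    # A four-year-old post with one fresh comment ranks as if it were new:
    old_post = {
        "created_at": datetime.now() - timedelta(days=4 * 365),
        "comments": [{"at": datetime.now()}],  # the "timebomb"
    }
    print(feed_recency(old_post))  # effectively right now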

I know this because sometimes I participate in an activity that I call "Timebombing": commenting on a friend's embarrassing and/or important post from years ago. For example, after I told a friend how fun timebombing was, he commented on a 4-year-old post in which we had announced that we were expecting our first child.

Soon, the post was filled with congratulatory comments from my friends and family, and people started calling to congratulate us on a non-existent pregnancy.

In this case, it's difficult for the news feed algorithm to differentiate between an old post that has become important again and an old post that someone is commenting on as a joke. Because this is so fuzzy, a norm has arisen: don't comment on old posts unless your comment makes it clear that the post is old.

If the behavior became widespread enough, developers could change the algorithm to reflect the importance of the norm, either by eliminating the problem (e.g., "posts older than X years won't resurface no matter what") or by making it less confusing (e.g., "make the date of the original post larger, or a different color, if it's older than X days").
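
Either fix is easy to express once the norm is crisp enough to state. A sketch, with invented thresholds rather than anything a real site uses:

    from datetime import timedelta

    MAX_RESURFACE_AGE = timedelta(days=2 * 365)  # invented threshold
    HIGHLIGHT_AGE = timedelta(days=30)           # invented threshold

    def should_resurface(post, now):
        # Fix 1: old posts never reappear, no matter the activity.
        return now - post["created_at"] <= MAX_RESURFACE_AGE

    def date_style(post, now):
        # Fix 2: make the original date prominent when the post is old.
        return "large" if now - post["created_at"] > HIGHLIGHT_AGE else "normal"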

However, the rigidity and explicitness of code will always leave algorithms open to manipulation and abuse. In systems designed for social interaction, code will never be sufficient to stop undesirable behavior; norms will always play the most important role in guiding it.

September 14, 2013

On Small Changes

Democracy via Newspaper

Before newspapers, people could (and did) distribute hand-written bulletins. In fact, the only major advantage of newspapers was the volume that could be produced; the sorts of things that could be written about didn't change, and the labor-intensive means of distribution were the same. And yet, this small change may have been enough to make the rise of democratic nation-states possible. Elihu Katz summarizes this argument (as made by Gabriel Tarde):
But Tarde takes a further step in this role, in asserting that the newspaper overthrew the monarchy. His argument is based on the idea that only the king — the representative public sphere — had had knowledge of what was going on in the various villages and regions of his realm; he had spies and bureaucrats to tell him, and he was in no special hurry to let Village A find out what Village B was thinking. The newspaper did exactly this and thereby undermined the king, says Tarde: it made him redundant. (Katz, 2000, p. 126)
The claim is that the king was in a position to bridge structural holes, and to decide how and when to distribute what information between villages. This information asymmetry provided power and control, and when newspapers removed some of that asymmetry, the monarchy was too weak to survive.

Similar claims have been made about the power of technologies to alter societies - for example, Eisenstein's account of the printing press, or Putnam's argument that TV killed socializing.

My point is not to discuss the merits of any of these arguments (although I think all three are fascinating and compelling). Rather, the underlying theme behind them is important and interesting: changes in communication technologies have subtle effects on the way we see each other and the way we see the world, and these subtle changes can make a huge difference in how we choose to work, play, and govern ourselves.


? Via the Internet


My next question, naturally: what aspects of the Internet (if any) have the potential to have these sorts of long-term impacts? (With the caveat that, by their nature, these changes are hard to see when you are in the midst of them.)

Some ideas:


  • Global information and communication → Weakening of Nationalism → Fewer wars
  • Filter Bubbles → Strengthening of Zealots → More wars 
  • eBay → Temporary view of possession → Collaborative Consumption → Weakening of capitalism/consumerism → ?
  • Digitization of activity → Surveillance (by states and others) → Distrust of government 
  • Digitization of activity → Surveillance (by states and others) → Less crime 
  • Digitization of activity → Surveillance by algorithms → Less crime? 

What else? Are some big effects already occurring?

Reference
Katz, E. (2000). Media multiplication and social segmentation. Departmental Papers (ASC), 161.