January 18, 2014

Timebombing and Algorithms: Our co-created culture

I've been thinking a lot lately about how culture is no longer shaped by norms alone, but is co-created by algorithms.

"Algorithm" has a technical definition: roughly, a series of instructions that produces a given set of outputs from a given set of inputs. But I'm focusing on a more specialized sense, namely code that affects the way users interact with information. For example, my working definition would include the algorithm that ranks Google's search results, but not an algorithm for determining whether a triangle is obtuse.
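To make the contrast concrete, here's the kind of algorithm my working definition excludes: a pure input-to-output computation that never mediates anyone's access to information. A minimal Python sketch (the function is my own illustration, not from any particular textbook):

```python
def is_obtuse(a, b, c):
    """Return True if the triangle with side lengths a, b, c is obtuse.

    By the law of cosines, a triangle is obtuse exactly when the square
    of its longest side exceeds the sum of the squares of the other two.
    """
    a, b, c = sorted([a, b, c])
    if a + b <= c:
        raise ValueError("side lengths do not form a triangle")
    return a**2 + b**2 < c**2

print(is_obtuse(3, 4, 6))  # True: 9 + 16 < 36
print(is_obtuse(3, 4, 5))  # False: a right triangle, not obtuse
```

Well-defined inputs, well-defined outputs, and no user whose behavior it shapes: that's the sense of "algorithm" I'm setting aside.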

In his wonderful book Code 2.0, Lawrence Lessig describes how norms, law, architecture (which includes code), and the market all regulate/influence behavior.

My argument is related, and Lessig partially addresses it: in online spaces, norms and algorithms co-evolve in really interesting ways.

Algorithms are becoming more and more complex. To deal with the huge amount of information being created online, we need filters, and those filters come from algorithms. For the most part, these algorithms are completely invisible to the user. For example, Google uses more than 200 signals to rank search results, and Facebook's news feed uses a number of features to determine how close you are to each of your "friends" and which of their updates should appear.
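As a toy illustration of what "combining signals" means, here's a weighted-sum ranker. The signal names, weights, and scores below are all invented for illustration; real systems like Google's use hundreds of proprietary signals combined in far more sophisticated ways.

```python
def rank_score(signals, weights):
    """Weighted sum of named ranking signals (a deliberate simplification)."""
    return sum(weights[name] * value for name, value in signals.items())

# Hypothetical signals and weights, purely for illustration.
weights = {"text_match": 0.5, "link_authority": 0.3, "freshness": 0.2}

pages = {
    "page_a": {"text_match": 0.9, "link_authority": 0.2, "freshness": 0.5},
    "page_b": {"text_match": 0.6, "link_authority": 0.9, "freshness": 0.8},
}

# page_b wins despite a weaker text match, because its other signals are strong.
ranking = sorted(pages, key=lambda p: rank_score(pages[p], weights), reverse=True)
print(ranking)  # ['page_b', 'page_a']
```

The point of the sketch is the invisibility: a user sees only the final ordering, never the signals or weights that produced it.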

The invisibility of these algorithms poses its own, potentially serious, problems. But there is another problem: algorithms are rigid and well-defined. This leaves room for people to "game the system", that is, to interact in ways that the developers either didn't anticipate or find difficult to identify and stop.

Cultural norms, on the other hand, are "fuzzy". They are heuristics, which we must decide how to apply in ambiguous situations. For example, there is a norm of reciprocity: if someone does something nice for you, you are expected to do something nice in return. Exactly how to apply this varies greatly depending on the situation and the relationship between the people involved.

My working theory is that norms fill in the gaps where algorithms are either too rigid or not well-defined enough, but that any norm that can be well defined will eventually be codified as an algorithm.

As an example, Facebook's news feed algorithm surfaces posts that are popular and/or recent. This measure of recency includes not only the post itself, but any activity on the post.
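A minimal sketch of that behavior (the scoring formula and decay rate are my invention, not Facebook's actual algorithm): because recency is measured from the latest activity rather than the original post time, a single new comment can resurface a years-old post.

```python
import math

DAY = 24 * 3600

def feed_score(post, now):
    # Recency counts from the most recent activity, not the original post
    # time, so a fresh comment makes an old post "recent" again.
    last_activity = max([post["created"]] + post["comment_times"])
    age_days = (now - last_activity) / DAY
    recency = math.exp(-age_days)            # invented decay rate
    popularity = post["likes"] + len(post["comment_times"])
    return popularity * recency

now = 1390000000                             # arbitrary "current" timestamp
old_post = {
    "created": now - 4 * 365 * DAY,          # posted four years ago
    "likes": 20,
    "comment_times": [now - 60],             # one comment, a minute ago
}
new_post = {"created": now - 2 * DAY, "likes": 5, "comment_times": []}

# The four-year-old post outscores the two-day-old one.
print(feed_score(old_post, now) > feed_score(new_post, now))  # True
```

Any formula that keys recency to "latest activity" will behave this way, which is exactly what makes the prank below possible.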

I know this because I sometimes participate in an activity I call "timebombing": commenting on a friend's embarrassing and/or important post from years ago. For example, after I told a friend how fun timebombing was, he commented on a 4-year-old post announcing that we were expecting our first child.

Soon, the post was filled with congratulatory comments from my friends and family, and people started calling to congratulate us on a non-existent pregnancy.

In this case, it's difficult for the news feed algorithm to differentiate between an old post that has become important again and an old post that someone is commenting on as a joke. Because this is so fuzzy, a norm has arisen: you shouldn't comment on old posts unless your comment makes it clear that the post is old.

If the behavior became widespread enough, then developers could potentially change the algorithm to reflect the importance of the norm, either by eliminating the problem (e.g., "posts older than X years won't show up again, no matter what") or by making it less confusing (e.g., "make the date of the original post larger, or a different color, if it's older than X days").
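The first fix could be as simple as an eligibility check applied before any scoring happens. The two-year cutoff below is an invented placeholder for the "X years" above:

```python
YEAR = 365 * 24 * 3600
MAX_RESURFACE_AGE = 2 * YEAR   # stand-in for "X years"; the value is invented

def eligible_for_feed(post_created, now):
    """Rule: posts past the cutoff never resurface, no matter the activity."""
    return (now - post_created) < MAX_RESURFACE_AGE

now = 1390000000
print(eligible_for_feed(now - 1 * YEAR, now))  # True: recent enough
print(eligible_for_feed(now - 4 * YEAR, now))  # False: too old to resurface
```

This is the "norm codified as algorithm" move: the fuzzy social rule ("don't dredge up old posts") becomes a hard, enforceable threshold.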

However, the rigidity and explicitness of code will always leave algorithms open to manipulation and abuse; in systems designed for social interaction, code will never be sufficient to stop undesirable behaviors, and norms will always play the most important role in guiding behavior.