October 8, 2014

Technology and Fitness Landscapes

One of the ideas that has become a thinking tool for me is the fitness landscape. The concept is most common in evolutionary biology and in machine learning (at least, those are the domains where I've come across it).
(Image: a visualization of a population evolving in a dynamic fitness landscape, via Wikipedia.)
The main idea, as this Wikipedia GIF illustrates, is that agents look around and then "hill climb" to the highest point near them. They repeat this process until nothing near them is higher than where they stand (that is, they are at a peak). In biology, peaks represent attributes with high reproductive success. In machine learning, they represent parameter values that yield better solutions to a problem.

The result you end up with (e.g., the attributes of an animal or the solution produced by an ML algorithm) depends on the fitness landscape and on where you start. In general, you will end up at local, rather than global, peaks.
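To make the hill-climbing loop concrete, here's a minimal sketch in Python. The one-dimensional landscape, step size, and starting points are all invented for illustration:

```python
# A toy 1-D fitness landscape with a local peak near x = -1 (height 1)
# and a global peak near x = 2 (height 3).
def fitness(x):
    return -(x + 1) ** 2 + 1 if x < 0.5 else -(x - 2) ** 2 + 3

def hill_climb(x, step=0.1):
    # Look at the nearby points; move to the best neighbor until
    # no neighbor is higher (i.e., we are at a peak).
    while True:
        best = max((x - step, x + step), key=fitness)
        if fitness(best) <= fitness(x):
            return x
        x = best

print(hill_climb(-2.0))  # ends near -1: a local, not global, peak
print(hill_climb(1.0))   # ends near 2: the global peak
```

Starting at -2 strands you on the lower peak even though a much higher one exists, which is exactly the local-versus-global point above.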

I think a lot about the interaction between technology and society, and I think that fitness landscapes provide a useful way of thinking about it. Technology is sometimes talked about as though it causes certain political, cultural, or social outcomes all on its own. Alternatively, some people seem determined to treat technology as merely a tool, as though people act nearly independently of it.

I think that we can see the introduction of a new technology as a reshaping of the socio-cultural-political fitness landscape, so that some configurations become comparatively more costly or difficult to maintain (i.e., move lower) while others become more attractive or possible (i.e., move higher). Whether a given society changes its configuration depends on how much the landscape changes (e.g., if the landscape changes dramatically, then most configurations will change) and on where it sits on the fitness landscape (e.g., if the peak it occupies remains a local optimum, it will stay put even if the overall landscape has changed a lot).

These ideas map imperfectly to real situations. For example, we could see a country with strong institutions and a stable culture as sitting at a rather high local peak; technological changes would therefore be less likely to result in reconfigurations. Conversely, minor changes in technology may alter the landscape only slightly, yet those small changes may be enough to cause societies to change drastically (the arguments about Twitter causing the Arab Spring can be framed in these terms).

For me, at least, I think this is a nice framework for thinking about what's happening without resorting to simple cause and effect explanations.

September 12, 2014

Every Generation Makes the World Anew

This is probably obvious to most people, but it struck me the other day that so much of how we perceive and interact with the world - language, culture, how-to knowledge, etc. - is in practice quite stable and long-lived, but in principle very malleable and uncertain. As the title of this post says, each generation has to choose what to pass on - which books to preserve, which buildings to (not?) tear down, which songs to teach their children, etc. The things they choose not to pass on disappear. Forever. I found that sobering.

September 2, 2014

The Web We Had

I recently finished the very good, classic work Small Pieces Loosely Joined, by David Weinberger. I enjoyed it very much (my brief review is here).

The book is 12 years old, however, and the Web has changed a lot in that time. While I found many of the ideas, especially the philosophical ideas about the Web, both insightful and applicable, as I was reading I couldn't help but think about the ways in which today's Web has drifted from the one Weinberger knew in 2002.

In fact, coincidentally, Weinberger himself recently wrote about one of the ways the web has changed since then. Weinberger knew that the web was social, but in the early days that sociability was built almost exclusively around interests - mailing lists, IRC channels, forums, and blogs were all organized around people interested in the content being produced. Now, much more of the experience of the web is built around real-world relationships - Facebook, Snapchat, Twitter, etc. For most people, these are populated primarily with people they already know. In ways both good and bad, the web is more intimate and more insular than it was.

Second, the web was a purpose-built place. Many of Weinberger's arguments revolve around the idea that the web is a place apart from the "real world" - a place where what is created is created intentionally, and built around the interests of its creators. While the core of this claim is still true today, much of the web is now co-created with algorithms.

Very few web pages today appear exactly as a person designed them. Most pages at least have ads that are chosen and served up by an algorithm. Many are shaped by algorithms in even more explicit ways. For example, Amazon suggests other items you might be interested in, based on what similar consumers purchased in the past. Your Facebook feed is filtered based on the relationships you have with others - how many times you've liked someone's posts, how many friends you have in common, etc.
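Here's a toy sketch of the "customers who bought X also bought Y" flavor of recommendation described above. The purchase data and scoring are invented, and Amazon's real system is far more sophisticated:

```python
# Count how often pairs of items appear in the same purchase history,
# then recommend the items that co-occur most often with a given item.
from collections import Counter
from itertools import combinations

purchases = [            # hypothetical purchase histories
    {"book", "lamp", "desk"},
    {"book", "lamp"},
    {"book", "mug"},
]

co_counts = Counter()
for basket in purchases:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    # Rank other items by how often they co-occur with `item`.
    scores = {b: n for (a, b), n in co_counts.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("book"))  # ['lamp', 'desk']: lamp co-occurs with book most often
```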

This is really interesting - we are living in a space that is altered based on what browser we are using, what sites we have visited, what purchases we have made, etc. It's as though when you went to the mall, all the stores you like were put closest to the entrance, and the salad places magically disappeared from the food court. Then, when the next person came in, their favorite stores and food magically moved to prime locations.

There are some dangers. The filter bubble is one - the worry that personalization keeps us from being exposed to things we don't like and ideas we disagree with, when that exposure is good for us. While I agree that this is something to worry about, let's not forget how amazing what we have is - the algorithm-shaped web is one with better ads, more interesting content, and ways to find and connect with others who have things to say that are particularly interesting to us.

August 26, 2014

Review: Small Pieces Loosely Joined: A Unified Theory Of The Web


Small Pieces Loosely Joined: A Unified Theory Of The Web by David Weinberger

My rating: 4 of 5 stars

Only on the web does a 12-year-old book feel like ancient history. In many respects, Weinberger was prescient, identifying trends that have become more and more powerful (e.g., one passage could be seen as predicting the rise of Wikipedia, and another the advent of the now-omnipresent "Like" button). Even more often, he provides insights that are still deep and thought-provoking.

Weinberger is a philosopher by training, and this book is strongest when he focuses on philosophy. For example, he argues that the web is a pushback against the turn toward realism in society. The web is a completely constructed space, without the constraints of the real world, and the fact that we can find such meaning and purpose in that sort of "unreal" environment says something about what our real needs and desires are.

At times, the book is overly technologically deterministic, and at times Weinberger makes claims about the nature of the web that may have been true 12 years ago but feel less true today (e.g., today's web is organized much more around real-world friends and less around interests). However, I find his argument that using the Internet subtly changes the way we see the world to be both persuasive and important. Just because the Internet isn't "changing everything" rapidly before our eyes doesn't mean that it isn't influencing culture in really important ways.

Overall, an important and interesting book. Should be required reading for anyone interested in the cultural impact of the internet.


Review: It's Complicated: The Social Lives of Networked Teens


It's Complicated: The Social Lives of Networked Teens by danah boyd

My rating: 4 of 5 stars

boyd is an ethnographer of teens and technology. She has spent the past decade observing and talking with teens about how they use technology.

Like much of the recent literature on the web, I would characterize the overall takeaway as, "Plus ça change, plus c'est la même chose" - the more things change, the more they stay the same. Phones and the internet are not changing teenagers into new creatures who hide themselves away in their rooms, lit only by the screens of their phones and laptops, furtively typing and texting - alone together.

Rather, boyd argues that much of the tech-centricity of teenagers' lives is the result of a shift in societal norms. Parents no longer allow teens to hang out together, and malls and other public places have started to prevent groups of teens from congregating. boyd claims that teens would prefer to spend time face-to-face with each other - technology is simply the next best alternative in a society that is making that more and more difficult.

I find myself to be somewhat more of a technological determinist than boyd, and I think that teens (and the rest of us) are being molded by our technology, although I agree that this molding is certainly more subtle than some earlier commentators on the web might have us believe.


July 8, 2014

Facebook and Manipulation

I thought I would add my small voice to the large chorus of voices surrounding the recent Facebook study on mood manipulation.

Let me begin by saying that it's sort of fun to see social science getting so much attention! :) This is broadly my area of research (computational social science and online communities), and this is somewhat similar to the kinds of research I have done, and would like to do in the future.

So, naturally, I think that this sort of research is OK to do. I'll set aside the question of informed consent for now (although it's certainly important), and give a few reasons why I think this research is important and should be done.

  1. Websites provide a great way to study the understudied. For years, the main source of our knowledge about human psychology has been college undergraduates. Facebook users provide a much, much broader group of participants (although certainly still skewed toward Westerners).
    Using these sites as a context for research should allow us to be more representative in the way we understand the world (and therefore in how we create policy based on that research).
  2. Critics of this research underestimate our autonomy. The press around this has acted as though Facebook's experiment was causing suicides and creating chaos around the world. This is reminiscent of the "magic bullet" theory of communication - the belief among some early Communication scholars that messages from the media could directly influence people.

    In reality, messages from the media (or from our friends) combine in very complicated ways, and we choose how to respond and react. Indeed, the effect in the Facebook study was just barely statistically significant - people were barely, barely more likely to post positive things when reading positive things, and barely, barely more likely to post negative things when reading negative things. (With samples that large, even a tiny effect registers as significant - see the sketch after this list.)
  3. This leads to my next point. There are opportunities for real insight. My wife has been incredibly supportive of my career, but she studied genetics, and doesn't appreciate the social sciences as much as I do. She often says that the results of social science are obvious. I argue that the results seem obvious in retrospect, but that the opposite result would also seem obvious.

    For example, one of the arguments against this study is that Facebook was intentionally making people sad by showing them more negative posts. However, previous research actually suggested that when people saw lots of positive posts they became more sad - they would see all the great things their friends were doing, and become depressed by comparison. The Facebook study showed that this isn't true - an important finding.
  4. This is related to my main point. We are spending more and more of our time on these types of sites, and we need to understand them. Facebook and other sites have every right (IMO) to run these sorts of experiments in order to improve the experience of their users. We can either encourage them to publish their findings and make their insights available to the public, or we can criticize their research, encouraging them to keep it private and to manipulate our behavior without our knowledge.

    I prefer the former.
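On that point about significance: with samples as large as Facebook's, even a minuscule difference can clear the p < 0.05 bar. Here is a minimal sketch of a two-proportion z-test, using invented numbers (not the study's actual figures):

```python
import math

# Hypothetical numbers: a tiny difference in the rate of posting
# something positive, measured across an enormous sample.
n1, n2 = 3_000_000, 3_000_000   # users per condition
p1, p2 = 0.510, 0.509           # observed rates in each condition

# Standard two-proportion z-test, computed by hand.
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(f"difference: {(p1 - p2) * 100:.2f} percentage points")
print(f"z = {z:.2f}")  # ~2.4; |z| > 1.96 means p < 0.05
```

A z above 1.96 counts as "significant at the 5% level," even though a difference of a tenth of a percentage point is unlikely to matter to anyone's day.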

January 18, 2014

Timebombing and Algorithms: Our co-created culture

I've been thinking a lot lately about how the creation of culture is no longer solely influenced by norms, but is co-created by algorithms.

"Algorithm" has a technical definition - basically, a series of instructions for producing a given set of outputs from a given set of inputs - but I'm focusing on a more specialized sense: I'm thinking of code that affects the way users interact with information. For example, my working definition would include the algorithm that determines the ranking of Google's search results, but not an algorithm for determining whether a triangle is obtuse.
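For contrast, here's the kind of algorithm my working definition excludes - a fixed procedure from inputs to an output that doesn't mediate how anyone sees information (a minimal sketch):

```python
def is_obtuse(a, b, c):
    # A triangle is obtuse if the square of its longest side exceeds
    # the sum of the squares of the other two (law of cosines).
    a, b, c = sorted((a, b, c))
    return c * c > a * a + b * b

print(is_obtuse(3, 4, 6))  # True
print(is_obtuse(3, 4, 5))  # False (right triangle)
```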

In his wonderful book Code 2.0, Lawrence Lessig describes how norms, law, architecture (which includes code), and the market all regulate/influence behavior.

My argument is related, and partially anticipated by Lessig: in online spaces, norms and algorithms co-evolve in really interesting ways.

Algorithms are becoming more and more complex. In order to deal with the huge amount of information being created online, we need filters, and those filters come from algorithms. For the most part, these algorithms are completely invisible to the user, and in many cases they are incredibly complex. For example, Google reportedly uses more than 200 signals to rank search results, and Facebook's algorithm uses a number of features to determine how close you are to each of your "friends" and which of their updates should appear.
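To illustrate the general shape of such a feature-weighted "closeness" score - the signals and weights below are entirely invented, since Facebook's actual algorithm is not public:

```python
# A made-up weighted sum over interaction signals; the real system
# uses far more features and learned (not hand-picked) weights.
WEIGHTS = {"likes": 1.0, "comments": 3.0, "mutual_friends": 0.5}

def closeness(signals):
    # signals: e.g. {"likes": 12, "comments": 5, "mutual_friends": 40}
    return sum(WEIGHTS[name] * value for name, value in signals.items())

close_friend = {"likes": 12, "comments": 5, "mutual_friends": 40}
acquaintance = {"likes": 1, "comments": 0, "mutual_friends": 3}

print(closeness(close_friend))  # 47.0: their updates surface more often
print(closeness(acquaintance))  # 2.5
```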

The invisibility of these algorithms has its own, potentially serious, problems. There is another problem: algorithms are rigid and precisely defined. This leaves room for people to "game the system" - that is, to interact in ways that the developers didn't anticipate or that are difficult to identify and stop.

Cultural norms, on the other hand, are "fuzzy". They are heuristics, which we must decide how to apply in ambiguous situations. For example, there is a norm of reciprocity: if someone does something nice for you, you are expected to do something nice in return. Exactly how to apply this varies greatly depending on the situation and the relationship between the people involved.

My working theory is that norms fill in the gaps where algorithms are either too rigid or not well enough defined, but that norms which can be well defined will eventually be codified as algorithms.

As an example, Facebook's news feed algorithm surfaces posts that are popular and/or recent. This measure of recency includes not only the post itself, but any activity on the post.

I know this because I sometimes participate in an activity I call "timebombing": commenting on a friend's embarrassing and/or important post from years ago. For example, after I told a friend how fun timebombing was, he commented on a 4-year-old post in which we had announced that we were expecting our first child.

Soon, the post was filled with congratulatory comments from my friends and family, and people started calling to congratulate us on a non-existent pregnancy.

In this case, it's difficult for the news feed algorithm to differentiate between an old post that is important again, and an old post that someone is commenting on as a joke. Because this is so fuzzy, a norm has arisen which says that you shouldn't comment on old posts, unless your comment makes it clear that it's an old post.

If the behavior became widespread enough, then developers could potentially change the algorithm to reflect the importance of the norm, either by eliminating the problem (e.g., "posts older than X years won't show up again no matter what") or by making it less confusing (e.g., making the date of the original post larger or a different color if it's older than X days).
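Here is a sketch of what the first of those fixes might look like; the field names, scoring function, and one-year threshold are all hypothetical:

```python
from datetime import datetime, timedelta

MAX_RESURFACE_AGE = timedelta(days=365)  # "posts older than X won't resurface"

def feed_score(post, now):
    # Recency is driven by the latest activity, so a new comment "bumps"
    # an old post, unless the original post is too old to resurface.
    if now - post["created"] > MAX_RESURFACE_AGE:
        return 0.0  # the hard cutoff on the original post's age
    hours_stale = (now - post["last_activity"]).total_seconds() / 3600
    return 1.0 / (1.0 + hours_stale)

now = datetime(2014, 1, 18)
old_post = {"created": datetime(2010, 3, 1), "last_activity": now}
print(feed_score(old_post, now))  # 0.0: a joke comment can't resurface it
```

The second fix (making an old post's date larger or a different color) would be a display change rather than a scoring change, so it wouldn't touch a function like this at all.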

However, the rigidity and explicitness of code will always leave algorithms open to manipulation and abuse; in systems designed for social interaction, code will never be sufficient to stop undesirable behaviors, and norms will always play the most important role in guiding behavior.