I recently read Hallucinations by Oliver Sacks. It's a fascinating book; one of my main takeaways is that the structure of our brains makes a lot of different types of malfunctions possible (in this case, hallucinations).
Brains can be seen as communication networks. Neurons have axons and synapses that allow them to pass signals about their local environment to other neurons. In ways that aren't fully understood, this network-level communication gives rise to perception, consciousness, thought, and memory.
Some hallucinations, such as those that accompany migraines, appear to come from networks of neurons that are firing without the right input. This made me think of information cascades. In networks, like the brain, the output of a node depends on the inputs it receives, and those inputs often come from other nodes. This structure means that changes in one part of the network can quickly spread to, and be amplified in, other parts of the network. In the brain, this can look like a migraine or a hallucination. Different networks have different properties, but most real-world networks are what we call small-world networks. These networks allow for complex, deep interactions between neighboring nodes and high-level interactions between groups of nodes, and they are robust to interference. However, they are also susceptible to certain problems, and runaway cascades are one of them.
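To make the cascade idea concrete, here is a minimal sketch in Python (assuming the networkx library is available; the threshold values, network size, and seed node are arbitrary illustrations, not anything from the book). Each node "fires" once enough of its neighbors have fired, and a single perturbed node can either stay local or sweep the whole small-world network, depending on how easily its neighbors are triggered.

```python
# Toy threshold cascade on a small-world (Watts-Strogatz) network.
# A node fires once the fraction of its neighbors that have fired
# reaches a threshold; one seed node may then cascade -- or not.
import networkx as nx

def run_cascade(n=100, k=6, p=0.1, threshold=0.15, seed_node=0):
    g = nx.watts_strogatz_graph(n, k, p)   # small-world graph
    active = {seed_node}                    # one perturbed node
    changed = True
    while changed:
        changed = False
        for node in g.nodes():
            if node in active:
                continue
            neighbors = list(g.neighbors(node))
            frac = sum(nb in active for nb in neighbors) / len(neighbors)
            if frac >= threshold:           # enough input: node fires
                active.add(node)
                changed = True
    return len(active)

if __name__ == "__main__":
    # A low threshold typically lets one node's activity sweep the network;
    # a higher threshold (a crude stand-in for negative feedback) keeps it local.
    print("nodes reached, threshold 0.15:", run_cascade(threshold=0.15))
    print("nodes reached, threshold 0.50:", run_cascade(threshold=0.50))
```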
Brains (and the people that own them) are part of social networks and organizations, which are in turn part of communities and organizational networks. Both of these are parts of nations, which are part of international networks (other networks could also be identified, such as cell networks, protein networks, genome networks, etc.). At each level, the actions of one "node" or actor can be an input into the actions of others, and can cascade through the network. In social networks, an information cascade could be a chain email, a viral Facebook post, or a new fashion. In general, cascades are kept in check by negative feedback loops, but when the conditions are right, they can spread quickly and widely.
I've been thinking about ways that cascades can move from one network to another. In the book, Sacks talks about Joan of Arc, and how her visions may have been hallucinations. I thought about how neural cascades in the brain could have led to Joan of Arc changing her behavior, which led to an informational/behavioral cascade in her social network, which led to a cascade in her community, which led to a cascade in her nation, etc.
Of Sterner Stuff
The world is a beautiful and strange place.
April 24, 2015
October 8, 2014
Technology and Fitness Landscapes
One of the ideas that has become a thinking tool for me is the fitness landscape. The concept is most common in evolutionary biology and in machine learning (at least, those are the domains where I've come across it).
The main idea, as this Wikipedia GIF illustrates, is that agents look around and then "hill climb" to the highest point near them. They repeat this process until there is nowhere near them that is higher than they are (that is, they are at a peak). In biology, peaks represent attributes with high reproductive success. In machine learning, they represent parameters that give a better solution to a problem.
The result that you end up with (e.g., the attributes of an animal or the solution produced by a ML algorithm) depends on the fitness landscape, and on the location where you start. In general, you will end up at local, rather than global, peaks.
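As a toy illustration, here is a minimal Python sketch (the two-peak fitness function, step size, and starting points are made up for this post, not drawn from any particular source): hill climbing from two different starting points ends on two different peaks, and only one of them is the global one.

```python
# Minimal hill climber on a 1-D "fitness landscape": from a starting
# point, repeatedly step toward the better neighbor until no neighbor
# is higher. Where you end up depends on where you start.
def fitness(x):
    # Two peaks: a lower local peak near x=2 and a higher one near x=8.
    return max(0.0, 3 - abs(x - 2)) + max(0.0, 5 - abs(x - 8))

def hill_climb(x, step=0.1):
    while True:
        left, right = fitness(x - step), fitness(x + step)
        if max(left, right) <= fitness(x):   # no higher neighbor: at a peak
            return x, fitness(x)
        x = x - step if left > right else x + step

if __name__ == "__main__":
    print(hill_climb(0.0))   # climbs the nearby local peak (around x=2)
    print(hill_climb(6.0))   # starts in the other basin, reaches the higher peak (around x=8)
```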
I think a lot about the interaction between technology and society, and I think that fitness landscapes provide a useful way of thinking about it. Technology is sometimes talked about as though it causes certain political, cultural, or social outcomes. Alternatively, some people seem determined to treat people as acting nearly independently of technology, with technology being merely a tool.
I think we can see the introduction of a new technology as a reshaping of the socio-cultural-political fitness landscape, so that some configurations become comparatively more costly or difficult to maintain (i.e., move lower) while others become more attractive or possible (i.e., move higher). Whether a given society will change its configuration depends on how much the landscape changes (e.g., if the landscape changes dramatically, then most configurations will change) and on where it sits on the fitness landscape (e.g., if the peak it is on remains a local optimum, it will stay there even if the overall landscape has changed a lot).
These ideas can be imperfectly mapped to real situations. For example, we could see a country with strong institutions and a stable culture as being at a rather high local peak; technological changes would therefore be less likely to result in reconfigurations. Conversely, minor changes in technology may only alter the landscape in small ways, but those small changes may be enough to cause societies to change drastically (the arguments about Twitter causing the Arab Spring can be framed in these terms).
For me, at least, this is a nice framework for thinking about what's happening without resorting to simple cause-and-effect explanations.
September 12, 2014
Every Generation Makes the World Anew
This is probably something obvious to most people, but it struck me the other day that so much of how we perceive and interact with the world - language, culture, how-to knowledge, etc. - is in practice quite stable and long-lived, but in theory is very malleable and uncertain. As the title of this post says, each generation has to choose what to pass on - what books to preserve, what buildings to (not?) tear down, what songs to teach to their children, etc. The things they choose not to pass on disappear. Forever.
I found that to be sobering.