In thermodynamics, where it all started, entropy is a measure of the uniformity of energy distribution within a system - higher entropy means more uniform distribution. John von Neumann is reckoned to have told Claude Shannon to name his measure of uncertainty in information theory 'entropy' because (among other things) 'nobody really knows what entropy is, so in a debate you will always have the advantage'.
Wikipedia offers:
'An everyday example of entropy can be seen in mixing salt and pepper in a bag. Separate clusters of salt and pepper will tend to progress to a mixture if the bag is shaken. Furthermore, this process is thermodynamically irreversible. The separation of the mixture into separate salt and pepper clusters via the random process of shaking is statistically improbable and practically impossible because the mixture has higher entropy.' - Wikipedia
This highlights another key part of our understanding of entropy - a closed system will increase in entropy both inevitably and irreversibly. We can see this in cosmological entropy, the argument that our universe is a closed system and will thus reach a state of maximum entropy where all energy is evenly distributed and (consequently) all parts of the universe are the same temperature. It's a theory that doesn't bode well for us in the (very) long run.
For me, when I hear the word entropy, I don't perceive a unit of measure - I perceive the irrevocable march toward homogeneity. Keeping heterogeneous things that are in contact from becoming homogeneous takes a lot of effort. We see global systems become more alike as they come into contact: loss of biodiversity, loss of cultural diversity, loss of political diversity, loss of economic diversity, and loss of the protections that come with diversity. There's not much to be done about it either - as we go global, as our culture becomes a single closed system, rising entropy is inevitable. We see our attempts to keep our heterogeneity alive taking a lot of energy, and generally failing.
In light of this, we might observe three options:
- Get some negative entropy - find some new cultures
- Start embracing entropy - hooray for homogenisation!
- Create closed systems - don't put salt and pepper in the same bag
Point 1 only delays the inevitable. It is highly interesting that elements of both point 2 and point 3 are generally championed as solutions to the problems we face today. Is retaining some heterogeneity while allowing some homogenisation the right approach? Is it possible to maintain both heterogeneous and homogeneous elements in a closed system? What is the right combination, and how do we control it? According to the laws of entropy, it would appear that we can neither stop nor reverse homogenisation. Of course, seeing our world as a closed system is short-sighted: it is part of our solar system, which is itself part of our galaxy, and our universe. We get energy exogenously from the sun, and all life ultimately uses this source of energy to endogenously maintain diversity - to swim against the relentless tide of rising entropy. From this perspective we apparently have great potential to choose between homogeneity and heterogeneity. The trap, however, is that whenever our attention wavers, the tide sweeps us a little further toward homogeneity, and the way back may never appear. We must fight perpetually for heterogeneity if we want it. Once we perceive diversity, it is at permanent risk of fading away.
The term 'political entropy' is interesting:
“The entropy measurement gives the average social uncertainty about what will happen for event sets in the social system. An entropy value for a unitary social system is analogous to a temperature reading for a thermodynamic system, such as a volume of gas. In a state of temperature equilibrium one temperature measurement describes the whole volume or any part of it. If a social system is in an entropy equilibrium, a single entropy measurement describes the state of the system or any subsystem. For a system in partial equilibrium, the entropy values of its subsystems must be known.” - Stephen Coleman
Coleman is saying that when we reach maximum political entropy, we will have maximum uncertainty over what is happening - in a democratic system this might mean many candidates with similar popular support, making the result very difficult to call. Further research supports this interpretation: Coleman felt that the lowest-entropy system was one where the certainty of the political outcome approached 100% - e.g. a one-party democracy. He also understood voting patterns as a means to measure political entropy - at minimum entropy any vote sample will identify the outcome, while at maximum entropy we must sample the entire vote to reach a conclusion.
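To make that a little more concrete, here is a minimal sketch in Python using Shannon's familiar entropy formula over candidate vote shares - an illustration of the general idea rather than Coleman's exact calculation, and the share figures are invented. A near one-party result gives an entropy close to zero, while an evenly split field gives the maximum for that number of candidates.

```python
import math

def entropy(shares):
    """Shannon entropy (in bits) of a distribution of vote shares."""
    return -sum(p * math.log2(p) for p in shares if p > 0)

# A near one-party result: the outcome is almost certain, so entropy is low.
one_party = [0.97, 0.01, 0.01, 0.01]

# Four candidates with identical support: maximum uncertainty over four outcomes.
evenly_split = [0.25, 0.25, 0.25, 0.25]

print(entropy(one_party))     # ~0.24 bits
print(entropy(evenly_split))  # 2.0 bits, i.e. log2(4)
```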
One key aspect of a thermodynamic system is the inevitable tendency toward homogeneity, and Coleman identifies this in his discussion of political entropy - we will head towards political systems with less certain outcomes. Also highlighted is the role of heterogeneity - the presence of subsystems, each of which must also be undergoing changes in entropy, and which influence each other to reach an eventual state of entropy equilibrium. This subsystem relationship must also be recursive, with subsystems containing subsystems to an undefined degree of complexity. The conclusion here, then, is that at maximum entropy a democratic political system is homogeneous - every citizen is a candidate with the explicit support of themselves alone.
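Purely as an illustration of that bookend (not a claim about any real electorate), the maximum-entropy case reduces to a uniform distribution over the whole population, so its entropy simply grows with the logarithm of the number of citizens:

```python
import math

def max_entropy(citizens):
    """Entropy (bits) when each citizen is a candidate backed only by themselves."""
    # A uniform distribution over n equally likely outcomes has entropy log2(n).
    return math.log2(citizens)

for n in (2, 1_000, 60_000_000):
    print(n, round(max_entropy(n), 2))   # 1.0, 9.97 and 25.84 bits respectively
```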
Of course, we don't have the mechanics to support such a homogeneous system - it is not possible for political entropy to reach that equilibrium. It doesn't make sense at many levels - what are the means of election? What are the means of governing? In fact, a maximum-entropy democracy sounds a lot like anarchy. That's OK though - it's a theoretical maximum, an ideal - it serves as a bookend in the entropy discussion. We can observe, however, that public participation in policy making provides a pressure to increase political entropy - more people, more involved, more often. And therein lies a small paradox - our quest for transparency, for involvement, to have a say in our own government will actually deliver less certainty.
Less certainty? We don't want that, do we? One might assume so at first glance, but if we look at some recent history of certainty (the Iraq War, Copenhagen, business deals, the Credit Crunch, the Iran election) we may see that it is in fact our ignorance and impotence that drives calls for a more participatory and open government.
So now, with a little imagination, we can begin to see our political and cultural landscape through the lens of thermodynamics - as bubbles of gas inside each other, determined to coalesce into a single bubble of uniform temperature. On this landscape, humanity helps, hinders, increases, reduces and divides these bubbles - often unintentionally, and often without understanding the outcomes and implications.
When we look at the future of government, something becomes clear in the context of this discussion - it is inevitable that citizen involvement will increase and, barring monumental upheaval, we can't stop it, and we can't go back. We're going to need better tools to manage our cultural and political entropy - because government as a platform will deliver mechanisms that allow us to move ever closer to the theoretical maximum.