Friday, January 27, 2006

What is Information?

If evolution is really the process of information change in nature, what then is information? In our everyday use of the term we think of information as intangibles: dates, facts about the world, notions, concepts, and so on. Such pieces of information take on a tangible form only when we utter them to someone else, write them down, or email them to someone. For most of us, this understanding of the concept of information is all that we need to navigate our everyday lives. Unbeknownst to most of us, however, this everyday definition of information is part of a much larger and more fundamental concept of information as it is defined in physics.

In the realm of physics, the concept of information is more than a mere intangible; it refers to the physical arrangement of the elements that make up any given system. Information in this sense is a measure related to the concepts of order and organization. It is a universal concept that can be applied to almost any system from galaxy clusters, black holes, life forms to computers. “Information, in its connotation in physics,” writes Werner Loewenstein in his book The Touchstone of Life, “is a measure of order – a universal measure applicable to any structure, any system. It quantified the instructions that are needed to produce certain organization. This sense of the word,” he says, “is not too far from the one it once had in old Latin. Informare meant to ‘form,’ to ‘shape,’ to ‘organize.’”

To understand this better, consider how information is stored in our brains. An example of information, in our daily use of the term, is the set of details that describe the car you drive: its make, model, color, and so on. This information about your car is stored in your brain in the specific firing relationships, or network arrangements, formed among its neurons. When these relationships change, so does the memory, that is, the information. If these arrangements are completely lost, so is the information. Such information loss occurs when a brain injury destroys neurons and their established pathways. Much the same occurs with degenerative brain diseases such as Alzheimer’s: amyloid plaques between neurons and abnormal neurofibrillary tangles inside them are believed to disrupt these neuronal relationships. With the progressive loss of these connections comes the irreversible loss of the “information” contained within them. Thus we as human beings can store information because the neurons in our brains are capable of forming very specific interacting network arrangements.

We see the same phenomenon in the information storage devices of our computer age. Take the Compact Disc or CD. If you could zoom in on the landscape of a CD with information on it, you would see that its surface is pockmarked with microscopic craters called pits. The CD player is able to “read” the information on the CD by deciphering the specific arrangements of these pits and lands (regions without pits) on the surface of the CD.
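Real CDs encode data in a more involved way than this (bits are marked by the transitions between pits and lands, run through a modulation scheme), but a toy sketch can illustrate the essential point, that the information lives in the arrangement of the marks. The surface strings and the `read_surface` function below are invented for illustration, not part of any actual CD format:

```python
# Toy model: information as the physical arrangement of marks on a surface.
# "P" stands for a pit, "L" for a land; this is NOT real CD encoding,
# which uses pit/land transitions and EFM modulation.

def read_surface(surface):
    """Map each microscopic mark to a bit: pit -> 1, land -> 0."""
    return [1 if mark == "P" else 0 for mark in surface]

disc_a = "PPLLPLPL"   # one arrangement of eight marks
disc_b = "LPLPPLPL"   # the same marks, differently arranged

print(read_surface(disc_a))  # [1, 1, 0, 0, 1, 0, 1, 0]
print(read_surface(disc_b))  # [0, 1, 0, 1, 1, 0, 1, 0]
```

Both discs contain the same elements in the same quantities; only their arrangement differs, and with it the information they carry.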

When we look at the DNA molecule we see the same principle at work. Here the elements capable of specific arrangements are the four bases of the DNA molecule: adenine, guanine, thymine and cytosine. A set of three bases, a triplet code, is needed to specify an amino acid, one of the building blocks of proteins. Different arrangements of the bases, that is, different triplet codes, code for different amino acids. Once again the information for the assembly of amino acids is contained in the arrangement of the bases of the DNA molecule. Changing the bases therefore changes the information in the DNA molecule.
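A few real entries from the standard genetic code make this concrete. The sketch below uses only a handful of the 64 codons (the `CODON_TABLE` subset and the `translate` helper are illustrative simplifications; actual translation operates on messenger RNA, not directly on DNA):

```python
# A small subset of the standard genetic code, written as
# DNA coding-strand codons (the full table has 64 entries).
CODON_TABLE = {
    "ATG": "Met",   # methionine (also the start signal)
    "TTT": "Phe",   # phenylalanine
    "TTA": "Leu",   # leucine
    "AAA": "Lys",   # lysine
    "GGT": "Gly",   # glycine
    "TAA": "STOP",  # one of the three stop codons
}

def translate(dna):
    """Read a sequence three bases at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTAAATAA"))  # ['Met', 'Phe', 'Lys']
# Changing a single base changes the information the molecule carries:
print(translate("ATGTTAAAATAA"))  # ['Met', 'Leu', 'Lys']
```

Note that the two input sequences differ by only one base, yet they specify different proteins: the information is in the arrangement.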

If you have never heard of this interpretation of information, there is a good reason. This concept of information in physics, as the arrangement, or more specifically the possible arrangements, of the components of a system, is a statistical concept linked to the statistical concept of entropy in thermodynamics. Together they form a conservation law applicable to all systems, living and nonliving, from massive galaxies to tiny cells. In spite of the universality of this relationship, it remains somewhat obscure because it is not widely used outside certain fields of study.

Modern information theory was born in the first half of the 20th century, out of the theoretical advances that accompanied the invention and development of modern communication and computation systems. In fact, the two central figures of information theory are the mathematician Alan Turing, known as the father of modern computer science, and the engineer Claude Shannon, whose theoretical work laid the foundation for modern digital circuits and communication networks. Not surprisingly, the bulk of information theory’s influence continues to be in the many fields of modern information technology.

Outside the information technology fields, information theory has found its widest application in molecular biology. This was the result of a series of happy historical coincidences. While mathematicians and engineers during the early part of the 20th century were building the theoretical framework for better information systems, molecular biologists were trying to understand the molecular machinery within the cell. Much of the cellular activity they were observing seemed best described using the classic functions of information systems: coding, decoding, assembly, and so on. By mid-century, the information/entropy connection and its application to both living and non-living systems made it clear that the parallel between information technology and molecular biology was more than a superficial analogy. It is no accident that today the many functions of DNA are described using everyday information-related words such as transcription, translation, and error correction. Life is a very sophisticated information system, and information theory is in a sense essential to understanding how the cell and its molecular subsystems work.


Werner R. Loewenstein, The Touchstone of Life: Molecular Information, Cell Communication and the Foundation of Life (New York: Oxford University Press, 2000), p. 128

Erwin Schrödinger, What is Life? The Physical Aspect of the Living Cell (New York: The Macmillan Company, 1947), p. 74

Claude Shannon & Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949)

Hubert Yockey, Information Theory, Evolution and the Origin of Life (New York: Cambridge University Press, 2005), pp. 171-181

Lionel Johnson, “The Thermodynamic Origins of Ecosystems: A Tale of Broken Symmetry” in Entropy, Information and Evolution, ed. Bruce Weber, David Depew & James Smith (Cambridge, Mass.: MIT Press, 1998), pp. 75-105
