Tuesday, January 31, 2006

Randomness & Intelligence

Given that the undisputed example of intelligence is the human being, in the next couple of posts I will take a closer look at how we human beings make decisions, and at what role probability and randomness play in the many everyday decisions we make.

Friday, January 27, 2006

What is Information?

If evolution is really the process of information change in nature, what then is information? In our everyday use of the term we think of information as intangibles: dates, facts about the world, notions, concepts and so on. Such pieces of information take on a tangible form only when we utter them to someone else, write them down, or email them to someone. For most of us, this understanding of the concept of information is all that we need to navigate our everyday lives. Unbeknownst to most of us, however, this everyday definition of information is part of a much larger and more fundamental concept of information as it is defined in physics.

In the realm of physics, the concept of information is more than a mere intangible; it refers to the physical arrangement of the elements that make up any given system. Information in this sense is a measure related to the concepts of order and organization. It is a universal concept that can be applied to almost any system, from galaxy clusters and black holes to life forms and computers. “Information, in its connotation in physics,” writes Werner Loewenstein in his book The Touchstone of Life, “is a measure of order – a universal measure applicable to any structure, any system. It quantifies the instructions that are needed to produce a certain organization. This sense of the word,” he says, “is not too far from the one it once had in old Latin. Informare meant to ‘form,’ to ‘shape,’ to ‘organize.’”

To understand this better, consider how information is stored in our brain. An example of information, in our daily use of the term, is the set of details that describe the car you drive, such as its make, model and color. This information about your car is stored in your brain by the specific network firing relationships, or arrangements, formed between the neurons in your brain. When these relationships change, so does the memory, or the information. If these arrangements are completely lost, so is the information. Such information loss occurs when a brain injury destroys neurons and their established pathways. Much the same occurs with degenerative brain diseases such as Alzheimer’s. Amyloid plaques between neurons and abnormal neurofibrillary tangles inside the neurons of affected individuals are believed to disrupt these neuronal relationships. With the progressive loss of these connections comes the irreversible loss of the “information” contained within those specific connections. Thus we as human beings can store information because the neurons in our brains are capable of forming very specific interacting network arrangements.

We see the same phenomenon in the information storage devices of our computer age. Take the Compact Disc or CD. If you could zoom in on the landscape of a CD with information on it, you would see that its surface is pockmarked with microscopic craters called pits. The CD player is able to “read” the information on the CD by deciphering the specific arrangements of these pits and lands (regions without pits) on the surface of the CD.
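The idea that the information lies entirely in the arrangement of the pits and lands can be made concrete with a toy sketch. The Python below is purely illustrative and greatly simplified: real CDs encode bits with EFM modulation and layers of error correction, whereas here we simply treat each transition between pit and land as a 1 and each unchanged stretch as a 0.

```python
# Toy sketch of reading a CD-like surface. The information is carried by
# the arrangement of pits and lands, not by the material itself.
# Simplifying assumption: a pit/land transition reads as 1, no change as 0.
# (Real CDs use EFM modulation plus Reed-Solomon error correction.)

def read_surface(surface):
    """Decode a list of 'pit'/'land' states into bits."""
    bits = []
    for prev, curr in zip(surface, surface[1:]):
        bits.append(1 if prev != curr else 0)
    return bits

surface = ["land", "pit", "pit", "land", "land", "land", "pit"]
print(read_surface(surface))  # [1, 0, 1, 0, 0, 1]
```

Note that the bits come from the relationships between neighboring regions, not from the regions themselves; rearrange the same pits and lands and you get different information.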

When we look at the DNA molecule we see the same principle at work. In this case the elements capable of specific arrangements are the four bases of the DNA molecule: adenine, guanine, thymine and cytosine. A set of three bases, a triplet code, is needed to specify an amino acid, one of the building blocks of proteins. Different arrangements of the bases, that is, different triplet codes, code for different amino acids. Once again the information for the assembly of amino acids is contained in the arrangement of the bases of the DNA molecule. Changing the bases therefore changes the information in the DNA molecule.
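This triplet reading can be sketched in a few lines of Python. The five codon assignments below are real entries from the standard genetic code, but the table is deliberately tiny (the full code has 64 codons) and the helper function is my own illustration, not anything from a bioinformatics library.

```python
# The genetic "information" is just an arrangement of bases: reading the
# same molecule three bases at a time yields instructions for amino acids.
# Only 5 of the 64 real codons are included in this sketch.

CODON_TABLE = {
    "ATG": "Met",   # methionine, also the start codon
    "TGG": "Trp",   # tryptophan
    "GGC": "Gly",   # glycine
    "AAA": "Lys",   # lysine
    "TAA": "STOP",  # a stop codon
}

def translate(dna):
    """Translate a DNA string into amino acids, three bases at a time."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "?")
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

print(translate("ATGTGGAAATAA"))  # ['Met', 'Trp', 'Lys']
# Change a few bases and the information changes with them:
print(translate("ATGTGGGGCTAA"))  # ['Met', 'Trp', 'Gly']
```

The second call shows the essay's point directly: same molecule, same reading rule, different arrangement of bases, different information.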

If you have never heard of this interpretation of information, there is an understandable explanation. This concept of information in physics, as the arrangement, or more specifically the possible arrangements, of the components of a system, is a statistical concept linked to the statistical concept of entropy in thermodynamics. Together they form a conservation law applicable to all systems, living and nonliving, from massive galaxies to tiny cells. In spite of the universality of this relationship, it remains somewhat obscure because it is not widely used outside certain fields of study.

Modern information theory was born in the first half of the 20th century, out of the theoretical advances that accompanied the invention and development of modern communication and computation systems. In fact the two central figures of information theory are the mathematician Alan Turing, also known as the father of modern computer science, and the engineer Claude Shannon, whose theoretical work laid the foundation for modern digital circuits and communication networks. Not surprisingly, the bulk of information theory’s influence continues to be in the many fields of modern information technology.

Outside the information technology fields, information theory has had its widest application in molecular biology. This was the result of a series of happy historical coincidences. While mathematicians and engineers during the early part of the 20th century were building the theoretical framework to construct better information systems, molecular biologists were trying to understand the molecular machinery within the cell. Much of the cellular activity they were observing seemed best described using the classic functions of information systems: coding, decoding, assembly and so on. By mid-century, the information/entropy connection and its application to living and non-living systems made it clear that the correspondence between information technology and molecular biology was more than a superficial analogy. It is no accident that today the many functions of DNA are described using everyday information-related words such as transcription, translation and error correction. Life is a very sophisticated information system, and information theory is in a sense essential to understanding how the cell and its molecular subsystems work.


Werner R. Loewenstein, The Touchstone of Life: Molecular Information, Cell Communication and the Foundation of Life (New York: Oxford University Press, 2000), p. 128

Erwin Schrödinger, What is Life? The Physical Aspect of the Living Cell (New York: The Macmillan Company, 1947), p. 74

Claude Shannon & Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949)

Hubert Yockey, Information Theory, Evolution and the Origin of Life (New York: Cambridge University Press, 2005), pp. 171-181

Lionel Johnson, “The Thermodynamic Origins of Ecosystems: A Tale of Broken Symmetry” in Entropy, Information and Evolution, ed. Bruce Weber, David Depew & James Smith (Cambridge, Mass.: MIT Press, 1998), pp. 75-105

Thursday, January 26, 2006

The Intelligence Illusion

Listen to this Essay Instead

Audio Part I http://omnidictum.blogspot.com/2006/01/intelligence-illusion-part-i.html

Audio Part II http://omnidictum.blogspot.com/2006/01/intelligence-illusion-part-ii.html

Audio Part III http://omnidictum.blogspot.com/2006/01/intelligence-illusion-part-iii.html

A few weeks ago, I listened to George Gilder, co-founder of the Discovery Institute, and Richard Dawkins duke it out on a Boston public radio program about Intelligent Design. As with the other, now ever more frequent, articles and shows on the Intelligent Design/evolution debate, I noticed that for all the points and counterpoints made, there is one concept that both sides implicitly accept. While both “design” and “evolution” are questioned, intelligence never is. In fact, if not for the implicit agreement about what this concept means, it would be impossible even to debate whether life is the outcome of an “intelligent” process or a “non-intelligent” one. Which means that this debate actually pivots on our unexamined confidence in the word intelligence.

So what is “intelligence”? If you look up a typical dictionary definition, you will find entries like “mental capacity” or “capacity to acquire and apply knowledge.” Though it is never stated explicitly, what these definitions really mean is the “mental capacity” of a human being, or the human “capacity to acquire and apply knowledge.” Our baseline understanding of the word intelligence is the human being. We have long used the word intelligence to differentiate ourselves from the rest of nature’s creatures. Even scientists usually speak of intelligence as a human adaptation. The collective consensus seems to be that there exists some singular ‘thingy’, the presence of which in humans makes us intelligent and the absence of which in other creatures or entities makes them unintelligent.

This idea has certainly driven hordes of psychologists, cognitive scientists and neuroscientists to try to quantify this ‘thingy’ in the human brain. They have studied stroke patients and individuals with selective brain damage to see what ‘intelligent’ activities they can and cannot perform. They have studied individuals with Down syndrome, autism, dementia and savant syndrome. They have also studied how healthy individuals perform ‘intelligent tasks’ under varying conditions, such as sleep deprivation or high stress. With the advent of a host of neuroimaging techniques, many of these experiments have been repeated with ‘live’ brain scans of subjects as they perform these different tasks.

And the cumulative result of these studies?

First, the human brain is a composite of many parts, each performing different functions. Second, any given observable ‘intelligent’ behavior involves multiple different parts. Taking a geometry test, say, will involve attention functions, retrieval functions, spatial functions and language functions. Selective damage to some parts may affect only those functions while leaving others intact. Or, as is the case with savants, some parts, such as those involved in music, may work extraordinarily well, allowing them to play and compose, while the parts of the brain needed for language are impaired, leaving them unable to read music. It is, in other words, impossible to study human intelligence without parsing it into a series of functions, all of which contribute in varying degrees to the behaviors we consider overtly intelligent.

This fragmented reality of intelligence is also borne out by studies in animal cognition and comparative brain anatomy. Comparative brain anatomy reveals that the architecture of the human brain shares much in common with that of other species. This is of course to be expected from common descent. Behavioral studies of animals show that, corresponding to these shared structures, animals demonstrate varying degrees of shared cognitive abilities with humans. Behavioral observations of our close cousins the chimpanzees, for example, document that they too are capable of basic tool use, a behavior once thought to mark the advent of human intelligence.

Yet another field that poses a challenge to the notion of intelligence as a ‘singular thingy’ is artificial intelligence. While studies of the human brain and animal behavior have taken a tear-down approach to dissecting intelligence, the field of artificial intelligence has taken a build-up approach. We try to build inorganic systems that can behave like a human being and perform ‘intelligent’ tasks. For example, we can create an inorganic computer system like Deep Blue to perform a very complex human task like playing the game of chess. Other artificial intelligence systems can be built to mimic other human-like cognitive tasks.

This means that the concept of intelligence we take for granted, when placed in the MRI machine, does not prove to be the solid word we think it is. In fact, it is more like a sponge that can absorb a lot of different functions but does not hold up very well under pressured scrutiny. It is certainly a useful word for denoting comparative relationships, as in “that man is more intelligent” or “that animal is less intelligent.” But it is not a clear-cut singular concept that can cleanly sort objects into intelligent and unintelligent bins.

If the word intelligence itself cannot be used to clearly separate the intelligent from the unintelligent, then we must reexamine the implicit agreement between the scientists and the ID folks that is central to the debate on evolution: the assumption that the creative process of an “intelligent being,” or human being, is completely different from an unintelligent creative process.

How then does the creative process of the “Watchmaker” differ from the creative process of the “Blind Watchmaker”?

To find out, let us take another look at what the “Watchmaker” does when he creates, and at how evolution creates, this time from an information perspective. The watchmaker generates the first design for a watch using the information he has in his memory about the components of a watch. Once the information in his memory has been translated into an actual watch, he then tests it, evaluating whether his design works. If the watch keeps time and keeps ticking, he will retain that design in his memory. If the watch does not work, he will discard that design. The figure below illustrates this process.

Now let us take a look at what evolution does. Evolution too involves information about the organism. This time the information is stored in the organism’s DNA, or genetic memory. During reproduction, changes to the DNA’s base pairs generate variation, or new information. Once the new organism with the new information comes to life, the environmental conditions will dictate whether it survives. In other words, environmental conditions will evaluate whether the organism’s genetic information will be retained in the genetic memory of its descendants. If the organism is not able to successfully reproduce and pass on its genes, its genetic information will be lost, or discarded, from the genetic memory of future life forms. The figure below illustrates this process.

As you can tell by looking at the figures, while we have been caught up with the word intelligence, the creative process of a human being and the evolutionary process are a lot more alike than different. When we look at the two processes in terms of information, we find that what really matters is a Memory where information can be stored. The memory is needed to Generate new information. Information in memory is then brought under some sort of selection pressure and is Evaluated. Information that ‘passes’ the evaluation is retained in memory, and information that does not is discarded.

In effect, as far as information is concerned, it does not matter whether the memory involved is the human brain, DNA or even a computer. The process of information change, save for contextual specifics, remains the same for all systems. To put it another way, the precise mechanisms by which new information is generated in the human brain and in DNA may differ, but both are engaged in the generation of new information. Similarly, there may be differences in how information in each of these memory systems is evaluated, but both are nonetheless subject to an evaluation process at the end of which some information is retained and the rest is discarded.
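This shared memory-generate-evaluate loop is simple enough to sketch as a toy program. Everything below is my own illustration under stated assumptions: the "memory" is a string of letters, random mutation plays the role of generation, and a fixed target string stands in for the environment's selection pressure; the target, alphabet and iteration count are arbitrary choices, not anything from the essay or its sources.

```python
import random

random.seed(0)  # make the toy run repeatable

TARGET = "INFORMATION"  # an arbitrary stand-in for the 'environment'
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(genome):
    """Evaluate: how many positions satisfy the environment's demands."""
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(genome):
    """Generate: copy the memory with one random change."""
    i = random.randrange(len(genome))
    return genome[:i] + random.choice(ALPHABET) + genome[i + 1:]

# Memory: information stored as an arrangement of letters.
memory = "".join(random.choice(ALPHABET) for _ in TARGET)
for _ in range(5000):
    variant = mutate(memory)                 # generate new information
    if fitness(variant) >= fitness(memory):
        memory = variant                     # retain what passes evaluation
    # ...otherwise the variant is discarded

print(memory)
```

Whether the memory is neurons, DNA or a variable in a computer, the loop is the same: generate a variant, evaluate it, retain what passes, discard what does not.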

This means that what Darwin articulated nearly 150 years ago was a lot more than just the process by which new organisms come into being. What he discovered was much greater and far more profound. What he really discovered was the process of information change in nature.


Stuff to Read/Listen:

On Point Program on Intelligent Design

PBS sites about brain geography:

Sleep Deprivation & IQ

Chimpanzee tool use:

Animal Behavior Society

About the AI Deep Blue

BBC In Our Time Show on Artificial Intelligence

More Serious Stuff:

Felicia Gershberg & Arthur Shimamura, “Neuropsychology of Human Learning and Memory” in Neurobiology of Learning and Memory, ed. Jose Martinez & Raymond Kesner (San Diego: Academic Press, 1998), pp. 333-359

Christine Sahley & Terry Crow, “Invertebrate Learning: Current Perspectives” in Neurobiology of Learning and Memory, ed. Jose Martinez & Raymond Kesner (San Diego: Academic Press, 1998), pp. 177-209.

Jane Goodall, The Chimpanzees of Gombe: Patterns of Behavior (Cambridge: Harvard University Press, 1986)

Hirata S, Celli ML “Role of mothers in the acquisition of tool-use behaviours by captive infant chimpanzees” Animal Cognition 6(4) (2003), p.235-44

Sanz C, Morgan D, Gulick S, “New insights into chimpanzees, tools, and termites from the Congo Basin” American Naturalist 164(5) (2004), pp. 567-581

Darold Treffert, Gregory Wallace “Islands of Genius” Scientific American 14(2004), p.14-23

Claus Hilgetag “Learning from Switched-off Brains” Scientific American 14(2004), p.8-9

Norman A. Krasnegor, G. Reid Lyon, and Patricia S. Goldman-Rakic, Development of the Prefrontal Cortex: Evolution, Neurobiology, and Behavior (Baltimore: P.H. Brookes Pub. Co., 1997)

John Pearce Animal Learning & Cognition: An Introduction (Hove, East Sussex: Psychology Press, 1997)

Richard Dawkins, The Blind Watchmaker (New York: W.W Norton & Co., 1986)
