Friday, September 29, 2006

If There Ever Was a Mother-of-All Theories…

A review of Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes by Charles Seife



In reviews thus far of Decoding the Universe, both formal and informal, there is a pattern of confusion and disorientation about the book’s real topic.

Take Laura Miller’s review on Salon.com, for example. Though it is largely a positive review, she introduces the book as one on cosmology and compares it, as a few other reviewers have, to Seth Lloyd’s book on quantum computing, Programming the Universe.

Yes, it is true, Charles Seife does write about the universe, and he does have a chapter on quantum computing, but there is more to the book than multiverses and quantum computing.

In fact, the very reason for this general sense of disorientation may be the real central concept of the book – Information. For most of us, information is dates, faces, or names of places. It is an abstract concept. Contrast that with the concept of “Information” Seife introduces: a concept that is physical, that is probabilistic, and that governs the behavior of atoms, black holes, and all living beings.

The word “Universe” in the title may be a bit misleading, conjuring an image of some realm ‘out there,’ far, far away. The universe in Seife’s title is not just the universe out there in the dark sky; it really alludes to a ‘Universal Law’ that applies to all things in our universe. Seife’s book is about an emerging law that may well become, once all the debates come to an end, the most fundamental law of the universe.

“Information can neither be created nor destroyed.”

The book begins with three important figures: Alan Turing, the English mathematician considered the father of computer science; Ludwig Boltzmann, who formalized the statistical concept of thermodynamic entropy; and engineer Claude Shannon, whose information theory is the reason we have such mainstays today as the internet and cell phones.

In the first three chapters Seife introduces the work of these three men and the happenstance way in which the exorcism of a demon (Maxwell’s theoretical demon) establishes a fundamental connection between Boltzmann’s entropy and Shannon’s information. As a result, thermodynamics, the field in physics that describes the behavior of mass and energy, became “a special case of information theory,” and along with it, information itself became, according to Seife, a quantifiable and concrete property of mass and energy.

As Seife goes on to tell us, the laws of thermodynamics are not the only ones to be subsumed by the concept of information. Even Einstein’s theory of relativity, Seife insists, is really “a theory about information.” Specifically, the theory dictates the maximum speed at which information can be transferred in the universe: the speed of light. He doesn’t stop there; he takes us to the subatomic realm, where he introduces the information character of quantum behavior. “In fact,” he says, “all the absurdity of quantum theory – all the seemingly impossible behaviors of atoms, electrons, and light – has to do with information: how it is stored, how it moves from one place to another, and how it dissipates.”

While the book does tend heavily toward physics and cosmology, there is one chapter, often overlooked by readers and reviewers, that makes it more than just a book about the ‘universe as a giant computer.’ This is the chapter about the information character of life. More than half a century ago, even before the discovery of the structure of DNA, quantum theorist Erwin Schrödinger realized that life is a “delicate dance of energy, entropy, and information,” and said as much in his book What is Life?

The discovery of DNA’s structure and our subsequent understanding of its informational role in living systems have only reaffirmed Schrödinger’s intuition about the information character of living systems. Today, as Seife explains, all living beings, ourselves included, are understood as “incredibly complex information-processing machines, ones capable of tasks that no other such machine is capable of, but information-processing machines nonetheless.”

For at least a quarter century now, the information concept has been cropping up in different disciplines, and specialists have been writing about information theory’s influence within their fields, whether molecular biology or black hole theory. So it is true that many popular science books written by specialists have addressed much of what Seife relates. However, most specialists, though aware of information theory’s influence in their own fields of expertise, are often oblivious to its influence in disciplines other than their own. This gives Seife, as a non-specialist, a unique vantage point of sorts, and sets his work apart from other books that may have addressed many of these concepts before. Unhindered by the usual blinders of specialization, Seife is able to weave together what have so far been considered disconnected stories with one thread – the concept of information. Perhaps the most important thread of all.

It is nothing short of extraordinary that one set of rules (the rules of information), which dictates the behavior of gigantic, exotic, unseen objects like black holes and the behavior of our modern computing machines, could also dictate how our own minds and bodies function.

Seife’s book is the first truly comprehensive treatise on the information concept in all its dimensions. It is an ambitious, necessary, and timely book that is a harbinger of things to come. It is the leading edge of a wave. As the realization of information theory’s importance begins to take hold, there will be a deluge of books on this topic. If you want to be ahead of the curve, read this book.

By Andrew Jennings, author of The Invisible Matrix: The Evolution of Altruism, Culture, Human Behavior and the Memory Network


Wednesday, August 23, 2006

Where the book fits in the Big Picture


Monday, August 21, 2006

The New Book on Altruism


After 3 long years of solid writing, it is finally done!

The Title: The Invisible Matrix
Subtitle: The Evolution of Altruism, Culture, Human Behavior, & the Memory Network

Buy A Copy
$15.11

(This price is for a limited time only. The price goes up in October)

See table of contents and extended excerpt.



From the back cover:

Why did we evolve to be altruistic?

Why did we evolve to value a society of equals?

Why are we the only species to care for our aging members?

How did we become capable of culture?

These are the longstanding puzzles of human behavioral evolution that have defied explanation within the classic Darwinian framework. For the first time, promising clues to these intractable puzzles are emerging from an unexpected field – computer science. According to a growing body of research, delicate living systems and bulky computers are both information systems engaged in the storage, transmission, and processing of information.

This shared characteristic of living systems and our information technology devices gives us an opportunity to study human evolution using concepts from computer science. Such analysis, as detailed in The Invisible Matrix, points to the existence of an important ‘invisible’ adaptation in human beings. This ‘invisible’ adaptation is a memory network mechanism, and it is the reason we evolved to be cultural beings who are altruistic, who value equality, and who care for our aging elders.


Wednesday, February 01, 2006

Finding the Love of Your Life (A Lesson in Probability)

It is that time of the year again, when American businesses struggle to find a way to wrap their products in red & pink and pitch them to the American consumer looking for a way to celebrate that day dedicated to all things romantic, Valentine’s Day. For those who have found ‘the love of their lives,’ it is a day to reaffirm their love for each other and to secretly relish the fact that they no longer have to play the arduous dating game. For the rest of us, who have yet to find that ‘love of our lives,’ it is hand-wringing time. Will we ever find Mr. or Ms. Right?

Well, maybe there is nothing to fret about. After all, this is the 21st century, the era of internet dating and, as the online dating companies boast, millions of choices. Somewhere in that million, there has got to be that one perfect one for you, right? But how do you find that one in a million? You may not like the answer, but finding the romance of your life has a lot more to do with the mathematics of probability than with roses and champagne.

Imagine that you are a blue sock with tiny white polka dots and you must find your perfect match, the other blue sock with white polka dots. Now imagine that to find your perfect match you are only allowed to close your eyes and pick one sock from a drawer holding a million different socks. This is essentially a random choice. What are the chances that you will find the right one? 1/1,000,000. You could get lucky and pull the right sock out of the drawer, but probabilistically speaking, one in a million means it is just not gonna happen.

You need a way to go from impossible to more possible. So what can you do to increase your odds? In mathematics, the probability of a given event (here, finding that one perfect match) depends on the total number of choices. You must find 1 blue sock with white polka dots out of a million different ones, which translates to 1/1,000,000.

If you can somehow decrease the total number of choices, you increase your chances. So let’s say you could eliminate all the white socks from the drawer and reduce the total number of socks to 100,000. Now your chances of finding the right one, if you closed your eyes and picked one at random, would be 1/100,000. Better than before, but still pretty much impossible. So we reduce again. This time let’s get rid of every color except blue; assume that brings the total down to 10,000. Your chance of finding the right one at random is now 1/10,000. Better, but still not good enough. So we reduce again and keep only the socks with patterns, which, let us assume, leaves a pool of 100. Now your chances are 1/100. Much, much better, but still a lot of tries to find the right one. So we do one more elimination and keep only the socks with white patterns; let us assume this brings the pool down to 10. Now your chances are 1/10. This is fantastic! It means that somewhere between the first sock you pull out and the tenth, you will definitely find your perfect match, that other blue sock with white polka dots.
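To make the arithmetic concrete, here is a minimal Python sketch using the made-up pool sizes from the sock example above. It relies on just two facts already stated: the first-try odds are 1 over the pool size, and if you keep drawing without putting socks back, you need on average about half the pool’s worth of draws.

    # Pool sizes after each hypothetical filter in the sock example above.
    pool_sizes = [1_000_000, 100_000, 10_000, 100, 10]

    for n in pool_sizes:
        first_try_odds = 1 / n        # chance of pulling the matching sock on the first blind draw
        average_draws = (n + 1) / 2   # expected draws to find it, drawing without replacement
        print(f"pool of {n:>9,} socks: first-try odds = 1/{n:,}, "
              f"average draws needed = {average_draws:,.0f}")

Running it shows the same progression as the paragraph above: each filter that shrinks the pool turns a hopeless search into a manageable one.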

If only finding the love of your life was like finding a matching pair of socks.

In some ways, though, it sort of is. The sock analogy applied to the dating world would be the chances of meeting Mr. or Ms. Right if all you were allowed to do was pick someone out of the million at random. You could find the love of your life anywhere between the first and the millionth date. Those are bad odds. So you increase your chances of success by elimination, let’s say by age. That will bring down your choices and increase your odds. By the sock analogy, the more filters you use, the greater your odds will be.

Ah! But here is where the similarities end. Using the filters will only increase your odds if you use the right filters or parameters. The filters we used to eliminate all the wrong socks were useful only because we knew exactly what we were looking for. To increase your odds of finding your perfect match, you need to know what the right questions or parameters are. And to figure out the right parameters or questions, you must first know exactly what you want. Otherwise the math is useless.

So what if you don’t know exactly what you are looking for? What if you really need that dependable geek, but your dating checklist keeps you dating all those irresponsible bad boys? What if you think you know the checklist that will get you to your soul mate, but don’t really? In fact, that is exactly what Dr. Phil and all of those other relationship gurus on the self-help aisle are counting on. We all want to find that right one, but don’t really know the right checklist. Our yearning and ignorance become their cha-ching.

Those of us who do end up finding the love of our lives often realize things only after the fact. You may have thought all along that your perfect match was the blue sock with the white polka dots, but somewhere along the way, it may suddenly dawn on you that the white sock with the blue polka dots you are holding is the real yin to your yang.

Tuesday, January 31, 2006

Randomness & Intelligence

Given that the undisputed example of intelligence is the human being, in the next couple of posts I will take a closer look at how we human beings make decisions, and at what role probability and randomness play in the many everyday decisions we make.

Friday, January 27, 2006

What is Information?

If evolution is really the process of information change in nature, what then is information? In our everyday use of the term, we think of information as intangibles: dates, facts about the world, notions, concepts, and so on. Such pieces of information take on a tangible form only when we utter them to someone else, write them down, or email them to someone. For most of us, this understanding of the concept of information is all that we need to navigate our everyday lives. Unbeknownst to most of us, however, this everyday definition of information is part of a much larger and more fundamental concept of information as it is defined in physics.

In the realm of physics, the concept of information is more than a mere intangible; it refers to the physical arrangement of the elements that make up any given system. Information in this sense is a measure related to the concepts of order and organization. It is a universal concept that can be applied to almost any system, from galaxy clusters and black holes to life forms and computers. “Information, in its connotation in physics,” writes Werner Loewenstein in his book The Touchstone of Life, “is a measure of order – a universal measure applicable to any structure, any system. It quantified the instructions that are needed to produce certain organization. This sense of the word,” he says, “is not too far from the one it once had in old Latin. Informare meant to ‘form,’ to ‘shape,’ to ‘organize.’”

To understand this better, consider how information is stored in the brain. An example of information, in our daily use of the term, is the set of details that describe the car you drive: its make, model, color, and so on. This information is stored in your brain by the specific firing relationships, or arrangements, formed between networks of neurons. When these relationships change, so does the memory, or the information. If these arrangements are completely lost, so is the information. Such information loss occurs when a brain injury destroys neurons and their established pathways. Much the same occurs with degenerative brain diseases such as Alzheimer’s: amyloid plaques between neurons and abnormal neurofibrillary tangles inside them are believed to disrupt these neuronal relationships. With the progressive loss of these connections comes the irreversible loss of the “information” contained within them. Thus we as human beings can store information because the neurons in our brains are capable of forming very specific interacting network arrangements.

We see the same phenomenon in the information storage devices of our computer age. Take the compact disc, or CD. If you could zoom in on the landscape of a CD with information on it, you would see that its surface is pockmarked with microscopic craters called pits. The CD player is able to “read” the information on the CD by deciphering the specific arrangement of these pits and lands (regions without pits) on its surface.
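As a toy illustration of the same idea – information residing in nothing but an arrangement – here is a short Python sketch. It is only a cartoon: a real CD encodes bits through transitions between pits and lands plus layers of error correction, not a simple pit = 1, land = 0 mapping, and the stretch of “surface” below is made up.

    # An imaginary stretch of disc surface: P = pit, L = land.
    surface = "PLLPPLPLLLPPPLPL"

    # Read the arrangement as bits under the toy pit=1 / land=0 convention.
    bits = "".join("1" if mark == "P" else "0" for mark in surface)
    print(bits)   # 1001101000111010 – the information is the arrangement itself

    # Disturb the arrangement (one pit filled in) and the stored information changes too.
    damaged = surface.replace("PPP", "PLP", 1)
    print("".join("1" if mark == "P" else "0" for mark in damaged))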

When we look at the DNA molecule we see the same principle at work. In this case the elements capable of specific arrangements are the four bases of the DNA molecule: adenine, guanine, thymine, and cytosine. A set of three bases, a triplet code, is needed to specify an amino acid, one of the building blocks of proteins. Different arrangements of the bases, that is, different triplet codes, code for different amino acids. Once again, the information for the assembly of amino acids is contained in the arrangement of the bases of the DNA molecule. Changing the bases therefore changes the information in the DNA molecule.
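A minimal Python sketch makes the point. The dictionary below is only a small, illustrative fragment of the standard genetic code (the real table covers all 64 codons), and the DNA strings are invented for the example.

    # A small fragment of the standard genetic code, for illustration only.
    CODON_TABLE = {
        "ATG": "Methionine", "TGG": "Tryptophan", "TTT": "Phenylalanine",
        "AAA": "Lysine", "GGC": "Glycine", "GAC": "Aspartate", "TAA": "STOP",
    }

    def translate(dna):
        """Read a DNA string three bases (one codon) at a time."""
        return [CODON_TABLE.get(dna[i:i + 3], "?") for i in range(0, len(dna) - 2, 3)]

    print(translate("ATGTTTGGCTAA"))   # ['Methionine', 'Phenylalanine', 'Glycine', 'STOP']
    # Change a single base (GGC -> GAC) and the encoded information changes with it:
    print(translate("ATGTTTGACTAA"))   # ['Methionine', 'Phenylalanine', 'Aspartate', 'STOP']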

If you have never heard of this interpretation of information, there is an understandable explanation. This concept of information in physics, as the arrangement, or more precisely the possible arrangements, of the components of a system, is a statistical concept linked to the statistical concept of entropy in thermodynamics. Together they form a conservation law applicable to all systems, living and nonliving, from massive galaxies to tiny cells. In spite of the universality of this relationship, it remains somewhat obscure because it is not extensively used outside certain fields of study.
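For the quantitatively inclined, a minimal Python sketch of Shannon’s formula, H = -Σ p·log2(p), hints at what “statistical” means here: the amount of information depends on the probabilities of the system’s possible arrangements. The probability lists below are made-up examples, not data from any particular system.

    from math import log2

    def shannon_entropy(probabilities):
        """Shannon's measure of information, in bits: H = -sum(p * log2(p))."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # Four equally likely arrangements carry 2 bits; two carry 1 bit.
    # A completely ordered system, with only one possible arrangement, would carry zero.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
    print(shannon_entropy([0.5, 0.5]))                 # 1.0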

Modern information theory was born in the first half of the 20th century, out of the theoretical advances that accompanied the invention and development of modern communication and computation systems. In fact, the two central figures of information theory are the mathematician Alan Turing, also known as the father of modern computer science, and the engineer Claude Shannon, whose theoretical work laid the foundation for modern digital circuits and communication networks. Not surprisingly, the bulk of information theory’s influence continues to be in the many fields of modern information technology.

Outside the information technology fields, information theory has had its widest application in molecular biology. This was the result of a series of happy historical coincidences. While mathematicians and engineers during the early part of the 20th century were building the theoretical framework for better information systems, molecular biologists were trying to understand the molecular machinery within the cell. Many of the cellular activities they were observing seemed best described using classic functions of information systems, such as coding, decoding, and assembly. By mid-century, the information/entropy connection and its application to living and non-living systems made it clear that the parallel between information technology and molecular biology was more than a superficial analogy. It is no accident that today the many functions of DNA are described using everyday information-related words such as transcription, translation, and error correction. Life is a very sophisticated information system, and information theory is in a sense essential to understanding how the cell and its molecular subsystems work.

-------------------------







Werner R. Loewenstein, The Touchstone of Life: Molecular Information, Cell Communication and the Foundation of Life (New York: Oxford University Press, 2000), p. 128.

Erwin Schrödinger, What is Life? The Physical Aspect of the Living Cell (New York: The Macmillan Company, 1947), p. 74.

Claude Shannon & Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949).

Hubert Yockey, Information Theory, Evolution and the Origin of Life (New York: Cambridge University Press, 2005), pp. 171-181.

Lionel Johnson, “The Thermodynamic Origins of Ecosystems: A Tale of Broken Symmetry,” in Entropy, Information and Evolution, ed. Bruce Weber, David Depew & James Smith (Cambridge, Mass.: MIT Press, 1998), pp. 75-105.
