The Cell as an Information-Sharing Domain
Biologists are finding more uses for information theory, a branch of knowledge dealing with communication, signaling and knowledge transfer -- a design-theoretic approach if ever there was one.
In the October 19 issue of Science, three biologists from Johns Hopkins wrote a Perspective piece about systems biology ("How Information Theory Handles Cell Signaling and Uncertainty"). The article showcases the ability of information theory to characterize and understand cell signaling, then encourages other biologists to join the new trend of viewing cells as information-sharing domains.
Population-level studies, they begin by asserting, can obscure the real action: "Detailed studies of cellular biochemistry at the single-cell level now show that cells responding en masse may have quite varied behaviors when examined individually, raising the question of how precisely signaling pathways can control a cell's actions." A "fundamentally different view" is required to understand what cells are doing. "Hence, rather than relying on seemingly robust and sensitive signaling input-output dependencies to analyze networks and cell behavior, we should instead seek to learn the limits to how well cell signaling can enable decision-making, given a cell's uncertain response to changes in the environment."
This is where information theory comes in. With its lingo about inputs, outputs, bandwidth, signal quality and noise, information theory is "just right" for tackling biological questions.
Mathematics turns out to have just the right theory. This theory has already been adopted to understand the workings of another type of "noisy" signaling network, the nervous system. Created to analyze uncertainty in human communication, information theory enables the limits of decision-making fidelity to be rigorously defined and measured. Conveniently, its general formulation permits analysis of many complex systems, including those found in biological signaling. [Emphasis added.]

What, exactly, is information? In their answer, the writers hint that information may be a more fundamental aspect of a cell than chemistry:
Within this theory and in the context of signaling, information is quantified as the uncertainty about the environment that is removed by signaling activity (which is equivalent to the knowledge gained by the signaling system). The amount of information depends on both the amount of variability in the environment (the initial level of uncertainty) and noise in the signaling process itself (affecting the amount of uncertainty remaining). Extending this definition, we can also determine the information capacity of a system, which is the maximum information that a signaling system can obtain about some aspect of the environment under ideal conditions. This capacity is an intrinsic property of the signaling system, as much as the underlying chemistry, in that it is the key determinant of achievable decision-making fidelity.

In their discussion, the authors refer to various information-theory principles, which can be summarized as follows. Consider how these same principles work in cell signaling and human communication, like voice or text messages sent through wires or the air:
- Noise limits the information carrying capacity of a system.
- It is possible to use a noisy signaling output to accurately discriminate different input doses.
- The number of resolvable concentrations is limited and is a simple function of the pathway capacity.
- If mistakes do occur, the capacity determines the minimum amount of error that a system must tolerate, with higher capacity unambiguously allowing for lower error.
- Information lost at each step of processing should prevent information sources and destinations from being separated by more than a few intermediates.
- Information-processing optimality suggests that the aspect of the input associated with a higher capacity is the more pertinent one.
- The conditions that maximally utilize the information capacity of a sensory system should reflect the natural fluctuations in the environment.
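Several of these principles can be illustrated with a toy calculation. The sketch below uses a binary symmetric channel as a stand-in for a noisy signaling pathway; the channel model and parameter values are illustrative assumptions, not taken from the paper. It shows how noise limits carrying capacity, and how capacity bounds the number of input "doses" a pathway can reliably resolve:

```python
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a binary variable with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(noise):
    """Capacity (bits per use) of a binary symmetric channel that
    flips the input with probability `noise`: C = 1 - H(noise)."""
    return 1.0 - binary_entropy(noise)

# Principle 1: noise limits carrying capacity. As the flip
# probability rises toward 0.5, capacity falls toward zero.
for noise in (0.0, 0.1, 0.3, 0.5):
    print(f"noise={noise:.1f}  capacity={bsc_capacity(noise):.3f} bits")

# Principle 3: the number of reliably resolvable input levels is
# roughly 2**C, so capacity limits how many doses are distinguishable.
levels = 2 ** bsc_capacity(0.1)
print(f"resolvable levels at 10% noise ~ {levels:.2f}")
```

Even at a modest 10 percent noise level, the channel's capacity drops to about half a bit per use, so fewer than two input levels can be told apart per signaling event without error correction.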
It's not necessary, they say, to know all the details to get a picture of what the cell is doing. "Information theory allows such categorical statements without necessarily requiring detailed specifics of the signaling network organization and operation, and thus can be used to analyze the capabilities of complex and incompletely characterized biological systems." This provides a key to the big picture, from which perspective the details will make more sense. What's necessary at the outset is to "determine the capacities of these signaling pathways and networks and the relationships between system structure and capacity."
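In practice, "determining the capacity" of a pathway starts with estimating the mutual information between input doses and single-cell responses. A minimal sketch of the standard plug-in estimate follows; the data are invented toy values, not measurements from the paper:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from observed (input, output) pairs,
    using empirical (plug-in) probabilities."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # input marginal counts
    py = Counter(y for _, y in pairs)     # output marginal counts
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) )
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

# Toy single-cell data: (dose, discretized response), with some
# cells responding "incorrectly" due to noise.
data = [(0, 0)] * 45 + [(0, 1)] * 5 + [(1, 1)] * 40 + [(1, 0)] * 10
print(f"I(dose; response) = {mutual_information(data):.3f} bits")
```

With 15 percent of cells misreporting their dose, the pathway conveys only about 0.4 of the 1 bit a perfectly faithful binary readout would carry, which is exactly the kind of "categorical statement" the authors say information theory licenses without a full mechanistic model.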
Are these authors advocates of intelligent design? Clearly not. They believe that natural selection somehow hit on the optimal solution to signaling systems:
Acquiring information typically costs the cell energy, time, or opportunity, so a signaling system that collects more information than is necessary or ignores information that is easily obtained wastes valuable resources. Therefore, under evolutionary pressure, it's expected that signaling systems are optimally matched to the sources of information they have evolved to process. Indeed, examples from neuroscience (such as sensory perception) and developmental biology (such as embryonic patterning) show that biological systems usually have a capacity that is minimally sufficient for the information they process.

This is nothing more than a just-so story that assumes what it needs to prove. They have turned "evolutionary pressure" into an optimizer! Human engineers seek to avoid wasting resources, too; they also try to match carrying capacity to sources. But they accomplish this with intelligence, not "evolutionary pressure." Darwinian evolutionists play a "Heads I win, tails you lose" game, so that no matter how elegant a living system is, or how optimal a living solution, they can say unguided evolution did it.
Yet notice that the only cases of systems where we observe optimization of signal to noise are (1) life and (2) human-engineered communication systems. Is lightning a communication channel from cloud to ground? No. Does it care about the carrying capacity of signal to noise? The idea is laughable. So is any other nonliving example of transfer of material: a star leaking its contents into a black hole, a river flowing from the mountains to the ocean, chunks of Mars arriving on Earth as meteorites. None of these are amenable to information theory. Environmental factors and laws of physics may act as constraints, but they do not optimize.
The uniqueness of optimization in signaling strengthens the argument for intelligent design in cellular communication, just as design is self-evident in human-designed communication systems (e.g., software and computer networks). This is not to claim that cells are intelligent, but rather to say that they bear the stamp of purposeful design, just as a radio or smartphone does.
In their conclusion, the authors focus on the bright future of biology as information theory contributes its insights. Notice how the word "evolving" functions here as a figure of speech for intelligent design:
As biologists are finding more uses for information theory, it is evolving to handle more complex biological networks. By rigorously quantifying the properties and limits of cellular information transfer, new questions will be formulated and answered within the single-cell paradigm shift that is already underway. As the ability to quantify the functionality of signaling networks improves, we will hopefully gain insight into the details of their underlying chemistry while gaining a deeper understanding of their higher-level organization and functionality.

Higher-level organization; information transfer; that's ID talk. We can rightly sweep away the evolution talk as little more than "noise." Purged of it, the "signal" point is that intelligent design has a great deal to offer the biological sciences. It has done so, it is doing so, and it will continue to do so. That, by the way, is why our sister podcast is called "ID the Future."