The human brain, and any animal brain for that matter, is an engineering marvel that evokes comments from researchers like “beyond anything they’d imagined, almost to the point of being beyond belief”1 and “a world we had never imagined.”2 Why do discoveries about the brain evoke such startling statements from secular scientists? The main reason is that random, purposeless evolution and its imaginary processes are entirely unable to account for the brain’s seemingly infinite complexity. This article will highlight some of what researchers have discovered about this amazing organ and hopefully inspire the same awe in you and direct the glory to our infinitely powerful Creator who engineered it all.
Perfect Optimization
A paper published in 1994 was one of the first to reveal the complexity of the human brain through analyses of various animals’ nervous systems.3 The researchers applied principles of combinatorial network optimization theory used in electrical engineering to see whether the “save wire” principle fit actual brain structure.4 They found that at multiple hierarchical levels, from the whole brain to nerve cell clusters (ganglia) down to individual cells, the positioning of neural units achieved a goal that human engineers strive for but find difficult to achieve: the perfect minimization of connection costs among all the system’s components.
When the researchers studied a seemingly simple, microscopic soil worm called a nematode, they found the same level of efficiency. Out of roughly 40 million possible layouts of the ganglia in the nematode’s nervous system, the actual placement turned out to be the one with the least possible total connection length. Even the positioning of individual nematode neurons was thoroughly optimized. Similar results were obtained for datasets on the positioning of entire brain regions in humans and other mammals.
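To make the “save wire” principle concrete, here is a minimal sketch, in Python, of how a layout search of this kind can be brute-forced. The five components, their connection list, and the row of slots are invented for illustration; they are not data from the study.

```python
from itertools import permutations

# Hypothetical example: five components that must occupy five fixed slots on a line.
# The connection table records how many wires run between each pair of components.
# All names and numbers here are illustrative only.
slots = [0, 1, 2, 3, 4]
components = ["A", "B", "C", "D", "E"]
connections = {("A", "B"): 3, ("B", "C"): 2, ("A", "C"): 1,
               ("C", "D"): 2, ("D", "E"): 3}

def wiring_length(assignment):
    """Total wire length for a given component-to-slot assignment."""
    pos = dict(zip(components, assignment))
    return sum(n_wires * abs(pos[a] - pos[b])
               for (a, b), n_wires in connections.items())

# Exhaustively score every possible layout (5! = 120 here; the nematode's ganglia
# allow roughly 40 million layouts, which the researchers likewise enumerated).
best = min(permutations(slots), key=wiring_length)
print("Best layout:", dict(zip(components, best)),
      "Total length:", wiring_length(best))
```

Even this toy version shows why the result is striking: the number of candidate layouts grows factorially with the number of components, yet the worm’s actual wiring sits at the global minimum.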
Vast Computational Power
In 2010, a group of Stanford University scientists published a new visualization technique based on targeting individual proteins involved in transmitting electrical brain signals.5 This technology allowed a previously impossible, multidimensional assessment of synapse complexity and diversity. Researchers discovered that a single synapse is like a computer microprocessor, containing both memory-storage and information-processing features. The earlier, oversimplified belief was that synapses acted like basic on/off switches, but nothing could be further from the truth since the brain acts more like a quantum computer than a digital computer. A single synapse can contain about 1,000 molecular-scale microprocessor units acting in a quantum computing environment. An average healthy human brain contains some 200 billion nerve cells connected to one another through hundreds of trillions of synapses. To put this in perspective, one of the researchers noted that the study’s results showed a single human brain has more information-processing units than all the computers, routers, and Internet connections on Earth.1
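A quick back-of-the-envelope tally, using only the round figures quoted above, shows how these numbers compound. The synapse count used below is an illustrative midpoint of “hundreds of trillions,” not a measured value.

```python
# Back-of-the-envelope tally using the round figures quoted in the text.
neurons = 200e9            # ~200 billion nerve cells
synapses = 300e12          # "hundreds of trillions" of synapses (illustrative midpoint)
units_per_synapse = 1_000  # ~1,000 molecular-scale processing units per synapse

processing_units = synapses * units_per_synapse
print(f"Synapses per neuron (rough average): {synapses / neurons:,.0f}")
print(f"Estimated molecular processing units: {processing_units:.1e}")  # ~3e17
```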
Phenomenal Processing Speed
As is typical when researchers are bogged down in an evolutionary mindset in which the complexity of living systems is routinely underestimated, the processing speed of the brain had been greatly underrated. In a 2017 study, scientists found the brain is 10 times more active than previously believed.6,7 They discovered that the branched projections of neurons (dendrites) are not merely passive conduits but are electrically active in animal brains, generating nearly 10 times more electrical spikes than the main body of the neuron cell (the soma).
These new results overturned the long-held belief that electrical spikes in the soma are the primary way the mental processes of perception, learning, and memory formation occur. While the somas produce all-or-nothing spikes of electricity like a digital signal, the dendrites are hybrid systems performing both analog and digital transactions. Once again, this is more evidence of quantum computer-like brain engineering operating at warp-speed levels. The large number of dendritic spikes also means the brain has more than 100 times the computational capability previously believed. While humans are only beginning to develop quantum computing devices, the Creator engineered our brains at a far more complicated, compact, and efficient level at the beginning of creation.
Petabyte-Level Memory Capacity
Yet another recent discovery revealed incredible levels of memory storage in the human brain.8 Terry Sejnowski, one of the lead scientists on the research paper, stated, “This is a real bombshell in the field of neuroscience.”9 Dr. Sejnowski, perhaps unwittingly, framed the results using design-based thinking when he explained:
We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.9
What a mighty Creator we have who can engineer that much memory between our ears. Nothing engineered by humans even comes close.
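For a sense of scale, here is a rough conversion sketch showing how per-synapse storage adds up toward the petabyte range. The bits-per-synapse figure comes from the Salk team’s estimate of about 26 distinguishable synapse sizes; the synapse count is only an order-of-magnitude assumption.

```python
import math

# Rough scale check of the petabyte claim. Inputs are approximate assumptions:
# ~26 distinguishable synapse sizes (Salk estimate) -> log2(26) ~ 4.7 bits each,
# plus an order-of-magnitude guess at the total synapse count.
synapse_sizes = 26
bits_per_synapse = math.log2(synapse_sizes)    # ~4.7 bits
synapses = 1e15                                # order-of-magnitude assumption

total_bytes = synapses * bits_per_synapse / 8
print(f"~{total_bytes / 1e15:.2f} petabytes")  # lands in the petabyte ballpark
```

Depending on the synapse count assumed, the total lands within the same order of magnitude as a petabyte, which is the “ballpark” Dr. Sejnowski describes.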
Optimal Energy Efficiency
The brain is one of the most energy-hungry organs in the human body, consuming about 20% of the body’s energy budget even though it represents only about 2% of its mass.10 Despite this high energy demand, the brain operates with a startling efficiency that no modern man-made device comes close to matching. One Stanford scientist who is helping develop computer brains for robots calculated that a computer processor with the computational capacity of the human brain would require at least 10 megawatts to operate properly, comparable to the output of a small hydroelectric power plant. As amazing as it may seem, the human brain requires only about 10 watts to function.11 A phenomenal level of energy optimization engineering like this can only be attributed to an infinitely wise Creator.
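Using only the round numbers cited above, the efficiency gap works out as follows; this is a simple illustrative calculation, not a rigorous benchmark.

```python
# Illustrative arithmetic using the figures quoted in the text.
brain_watts = 10           # approximate power the human brain needs to function
computer_watts = 10e6      # ~10 megawatts for a brain-equivalent processor (the Stanford estimate)
body_mass_fraction = 0.02  # the brain is ~2% of body mass...
energy_fraction = 0.20     # ...yet uses ~20% of the body's energy budget

print(f"Brain's efficiency advantage: {computer_watts / brain_watts:,.0f}x")  # 1,000,000x
print(f"Energy use per unit mass vs. body average: {energy_fraction / body_mass_fraction:.0f}x")
```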
Multidimensional Processing
One of the most daunting challenges in neuroscience has been detecting the link between brain structure and information processing. Brain cells communicate and process information in a completely different way than computer networks do, and the neurological data appeared chaotic, primarily because scientists could not interpret them properly. This lack of a clear link between neural network structure and information processing was a major barrier to understanding how the brain functions.
Recently, scientists made progress by constructing three-dimensional graphs of neural cell networks that more accurately reflect the direction of information flow.12 They then analyzed these graphs using algebraic topology, a branch of mathematics that describes the shape and connectivity of networks. Applying this approach to a local network of neurons in the neocortex revealed a remarkably intricate and previously undetected topology of synaptic connectivity spanning multiple dimensions.
When the brain processes information such as a thought or some other task, temporary structures appear that consist of multi-dimensional groups of interconnected cells called cliques (a simple sketch of what a clique looks like in network terms follows the quotations below). These structures communicate with each other while that action is processed, just as a group of people might briefly come together in a clique to chat, and when the neural process concludes, the association disintegrates. But most amazing are the specific geometric structures called cavities that develop when these cliques form. One of the researchers, Ran Levi, stated, “The appearance of high-dimensional cavities when the brain is processing information means that the neurons in the network react to stimuli in an extremely organized manner.” He continued:
It is as if the brain reacts to a stimulus by building then razing a tower of multi-dimensional blocks, starting with rods (1D), then planks (2D), then cubes (3D), and then more complex geometries with 4D, 5D, etc. The progression of activity through the brain resembles a multi-dimensional sandcastle that materializes out of the sand and then disintegrates.13
He also said:
We found a world that we had never imagined. There are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to eleven dimensions.13
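To give a feel for what a “clique” means in network terms, here is a minimal sketch using the open-source networkx library on a tiny, made-up graph. The neuron labels and connections are invented; the actual study worked with directed connectivity and far more sophisticated topological tools for counting cavities.

```python
import networkx as nx

# A tiny, invented network of "neurons." In the study's terminology, a clique of
# n all-to-all connected neurons forms an (n-1)-dimensional simplex, and groups of
# such cliques can enclose higher-dimensional "cavities."
G = nx.Graph()
G.add_edges_from([
    (1, 2), (1, 3), (2, 3),          # neurons 1, 2, 3 are fully connected (a 3-clique)
    (2, 4), (3, 4), (2, 5), (3, 5),  # 2-3-4 and 2-3-5 form additional 3-cliques
])

for clique in nx.find_cliques(G):    # enumerate the maximal cliques
    dimension = len(clique) - 1      # simplex dimension = number of members minus one
    print(f"clique {sorted(clique)} -> {dimension}-dimensional simplex")
```

In the actual research the cliques are directed, so information flows through them in one direction, and algebraic topology is then used to count the cavities they enclose; but the basic idea of all-to-all connected subgroups is the same.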
Biophoton Brain Communication
Not only does the brain work by chemically generated electrical pulses, but a growing body of research indicates that photons (packets of light) also play an important role in the daily function of brain cells.14,15 Much of this evidence comes from observing cells in the dark and measuring the photons they produce as they work. In fact, data now strongly indicate that many cells use light to communicate. Even before this was observed in neurons, there were ample indications that bacteria, plants, and even kidney cells communicate with light. It has been shown that rat brains literally light up due to the photons produced by neurons and that spinal neurons can conduct light.
Neurons contain many light-sensitive molecules such as porphyrin rings, flavinic and pyridinic rings, lipid chromophores, and aromatic amino acids. Even mitochondria, the machines that produce energy inside cells, contain several different light-responsive molecules called chromophores. This research suggests that light channeled by filamentous cellular structures called microtubules plays an important role in helping to coordinate activities in different regions of the brain. Because it is difficult to explain how electrical activity across the whole brain synchronizes over long distances, researchers now believe light-based communication boosts the processing speed of these communication channels, since photons travel much faster than electrical signals.
It’s overwhelmingly clear that evolution as a theory has failed to explain the complexity of the brain. It’s entirely unsupportable that this organ, with its spectacular processing capacity, efficiency, memory storage, up to 11 dimensions of structure for a single information process, and dual electrical-photonic communication, could have evolved by sheer chance. The brightest human engineers cannot come up with anything close to this level of ingenuity.
The great stumbling block for Darwinian evolution is explaining how a multitude of features could simultaneously coalesce to form a unified, functional biological system. Obviously, the human brain did not evolve by the progressive addition of one factor at a time as postulated by evolution. The brain’s efficiency, power, and complexity not only defy any explanation that relies on chance but also point directly to an omnipotent and all-wise Creator.
References
- Moore, E. A. Human brain has more switches than all computers on Earth. CNET. Posted on cnet.com November 17, 2010, accessed July 7, 2017.
- Osborne, H. Brain Architecture: Scientists Discover 11 Dimensional Structures That Could Help Us Understand How the Brain Works. Newsweek. Posted on newsweek.com June 12, 2017.
- Cherniak, C. 1994. Component Placement Optimization in the Brain. The Journal of Neuroscience. 14 (4): 2418-2427.
- The “save wire” principle involves the minimization of connections between a system’s interconnected parts in order to reduce costs.
- Micheva, K. D. et al. 2010. Single-Synapse Analysis of a Diverse Synapse Population: Proteomic Imaging Methods and Markers. Neuron. 68 (4): 639-653.
- Moore, J. J. et al. 2017. Dynamics of cortical dendritic membrane potential and spikes in freely behaving rats. Science. 355 (6331): eaaj1497.
- Gordon, D. Brain is 10 times more active than previously measured, UCLA researchers find. UCLA news release. Posted on newsroom.ucla.edu March 9, 2017.
- Bartol, T. M. et al. 2015. Nanoconnectomic upper bound on the variability of synaptic plasticity. eLife. 4: e10778.
- Memory Capacity of Brain Is 10 Times More Than Previously Thought. Salk News. Posted on salk.edu January 20, 2016.
- Herculano-Houzel, S. 2011. Scaling of Brain Metabolism with a Fixed Energy Budget per Neuron: Implications for Neuronal Activity, Plasticity and Evolution. PLOS One. 6 (3): e17514.
- Hsu, J. How Much Power Does The Human Brain Require To Operate? Popular Science. Posted on popsci.com November 6, 2009.
- Reimann, M. W. et al. 2017. Cliques of Neurons Bound into Cavities Provide a Missing Link between Structure and Function. Frontiers in Computational Neuroscience. 11 (48).
- Osborne, Brain Architecture: Scientists Discover 11 Dimensional Structures.
- Rahnama, M. et al. 2010. Emission of Mitochondrial Biophotons and their Effect on Electrical Activity of Membrane via Microtubules. Journal of Integrative Neuroscience. 10 (1): 65-88.
- The Puzzling Role of Biophotons in the Brain. MIT Technology Review. Posted on technologyreview.com December 17, 2010.
* Dr. Tomkins is Director of Life Sciences at the Institute for Creation Research and earned his Ph.D. in genetics from Clemson University.