Memory Capacity of the Human Brain 10 Times Larger than Thought Before

Posted January 22, 2016

The human brain – one of the most fascinating biological systems to ever come about by means of natural selection – has just become even cooler: a new study, led by researchers from the Salk Institute, found it to be capable of storing up to ten times more information than previously thought.

New study finds the human brain to be highly energy-efficient and capable of storing up to ten times more information than thought before, while avoiding common computational errors. Image credit: geralt via pixabay.com, CC0 Public Domain.

Using rat neurons as a proxy for human brain cells, the team has determined that on average each synapse can hold about 4.7 bits of information. Scaled up to the size of the human brain, this number implies a storage capacity of about one petabyte, or 1,000,000,000,000,000 bytes.
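For scale, a rough back-of-envelope check lands in the same petabyte ballpark. This is only a sketch: the 4.7 bits per synapse comes from the study, but the synapse count used below is an assumed order-of-magnitude estimate for the human brain, not a figure reported in the paper.

```python
# Sketch of the scaling arithmetic behind the "about one petabyte" figure.
# The synapse count is an assumed order-of-magnitude estimate, not a number
# reported in the study.
BITS_PER_SYNAPSE = 4.7        # average information per synapse (from the study)
SYNAPSE_COUNT = 1e15          # assumed ~10^15 synapses in a human brain

total_bytes = BITS_PER_SYNAPSE * SYNAPSE_COUNT / 8
print(f"~{total_bytes:.1e} bytes, i.e. ~{total_bytes / 1e15:.1f} petabytes")
# -> ~5.9e+14 bytes, i.e. ~0.6 petabytes (the petabyte range)
```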

The study also sheds light on how the brain manages massive amounts of data with very little energy while avoiding conceptual traps that can stymie machine-learning algorithms.

“In neuroscience, we’ve known for a long time that when one neuron sends a signal to another, it’s only able to get a message across 10 to 20 percent of the time on average,” said study co-author Tom Bartol. “And the percentage it’s able to get a message across is proportional to how strong the synapse is.”

According to Bartol, the new study indicates that this might be a way to conserve energy: if a neuron is active only 10 to 20 percent of the time, its energy use is 80 to 90 percent lower than if it were active non-stop.

The study found that the success rate of synaptic connections is averaged over time, which makes the system more efficient and reduces the effect of individual synaptic failures.

“Our hypothesis is that building something that’s accurate on every event is more (energy) expensive and harder to do than having something that is highly accurate and stable over many events,” said Bartol.

The research team also found the brain to have at least 26 distinct synapse sizes, compared with only three in previous estimates. Since synapses adjust their size according to the signals they receive, the more sizes that are available, the more information each synaptic state can encode.
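The two numbers are directly related: the information a synapse can encode grows with the logarithm of the number of distinguishable size states it can occupy, and log2 of 26 is where the roughly 4.7-bit figure comes from. A minimal check:

```python
import math

# 26 distinguishable synapse sizes correspond to log2(26) bits of
# information per synapse -- the source of the ~4.7-bit figure above.
print(f"{math.log2(26):.2f} bits per synapse")  # -> 4.70
```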

The study, titled “Nanoconnectomic Upper Bound on the Variability of Synaptic Plasticity,” was recently published in the journal eLife and can be accessed in its entirety online.

Sources: study, sandiegouniontribune.com, medicalxpress.com.
