
Brain's Memory Capacity Rivals World Wide Web

Illustration of the human brain (Image: Pixabay / public domain)

Neuroscientists say the human brain can store 10 times more information than previously thought.

The researchers estimated storage capacity by measuring the connections between brain cells, then translated that figure into bytes, the units of computer memory. One byte consists of 8 bits (each a 1 or a 0), and by the new estimate the human brain can hold more than one quadrillion (1 followed by 15 zeroes) bytes of information – a petabyte.
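
To make the units concrete, here is a minimal sketch of that conversion in Python; the figures are the article's round numbers rather than measured values.

```python
# Unit conversion using the article's round numbers:
# 8 bits per byte, and a petabyte taken as 10**15 bytes.
BITS_PER_BYTE = 8
PETABYTE_IN_BYTES = 10**15            # one quadrillion bytes

capacity_bytes = PETABYTE_IN_BYTES    # the revised estimate: at least a petabyte
capacity_bits = capacity_bytes * BITS_PER_BYTE

print(f"~{capacity_bytes:.0e} bytes = ~{capacity_bits:.0e} bits")
# prints: ~1e+15 bytes = ~8e+15 bits
```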

As Terry Sejnowski of the Salk Institute for Biological Studies in La Jolla, California, a lead author of the recent study, said in a press release: "Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web."

After examining a small cube of rat brain tissue under an electron microscope, the scientists created a 3D reconstruction of the centre of learning and memory, the hippocampus, along with the connections among its neurons (brain cells). Each neuron resembles a tall tree, with numerous branches called 'dendrites' leading to a long trunk, the 'axon'. Information – in the form of electrical signals – is transmitted from one neuron's axon to another cell's dendrites across a chemical junction, the 'synapse'.

Each neuron has thousands of synapses, and the amount of information a brain stores is partly determined by the strength of the connections between neurons, which is influenced by the size of synapses. Traditionally, it had been assumed that synapses come in just three sizes – small, medium and large – but it now seems there are many more gradations. The researchers could detect these because neurons have twig-like spines on their dendrites that let a cell form multiple synaptic connections with the same neighbour, allowing the sizes of duplicate synapses to be compared directly.

The scientists identified 26 different spine sizes, which corresponds to roughly 4.7 bits of information per synapse (distinguishing 26 sizes requires log2 26 ≈ 4.7 bits). Multiply that by the brain's trillions of synapses and the total storage comes out an order of magnitude greater than previous estimates.
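
As a back-of-the-envelope check on that multiplication, here is a short Python sketch. The 26 spine sizes and the 8-bits-per-byte conversion come from the article; the whole-brain synapse counts are commonly cited rough figures (around 100 trillion to a quadrillion) and are an assumption here, not numbers from the study.

```python
import math

# ~4.7 bits per synapse: the number of bits needed to tell 26 sizes apart.
bits_per_synapse = math.log2(26)
print(f"bits per synapse: {bits_per_synapse:.2f}")       # 4.70

# Assumed whole-brain synapse counts (rough, commonly cited range).
for synapse_count in (1e14, 1e15):
    total_bits = bits_per_synapse * synapse_count
    total_bytes = total_bits / 8                          # 8 bits per byte
    print(f"{synapse_count:.0e} synapses -> ~{total_bytes:.1e} bytes")

# The upper end of that range approaches 10**15 bytes, i.e. roughly a petabyte.
```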

An adult brain runs on only around 20 watts of power, equivalent to a dim light bulb. By mimicking biological connections, engineers could design more powerful and energy-efficient computers. Maybe these machines will even have enough memory to store all the information on the web today.
