Researchers at the Salk Institute for Biological Studies have made a startling discovery, finally putting a number on just how much storage capacity our brain has in everyday computing terms. With the help of GPU computing, they estimated that the brain can hold around one petabyte of information, nearly ten times more than previously thought.
Modeling the full function of the brain isn't easy, and even with clusters of GPU-powered servers it takes time to accurately model the synapses, the connections between neurons. The team reconstructed rat hippocampus tissue in striking detail and identified 26 distinct sizes of synapses. Size matters, apparently: the bigger the synapse, the more information the accompanying neurons can store. They found that a single synapse can hold around 4.7 bits of information, far more than previously believed.
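That 4.7-bit figure falls straight out of the 26 distinguishable sizes: a component that can sit in any one of 26 states carries log2(26) bits of information. A quick sanity check (the only input here is the study's reported count of 26 sizes):

```python
import math

# 26 distinguishable synapse sizes, as reported in the study
states = 26

# Information capacity of anything with N distinguishable states is log2(N)
bits = math.log2(states)

print(f"{bits:.2f} bits per synapse")  # ≈ 4.70
```

For comparison, a conventional bit has only 2 states; a synapse with 26 states packs more than four times as much information into a single connection.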
The amount of information stored and transferred across those still poorly understood synapses doesn't translate directly into computing terms, however. We obviously don't store our memories as GIFs or H.265-encoded video. In fact, the brain is remarkably efficient: a synapse successfully transmits a signal only around 10-20% of the time.
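To get a feel for what that unreliability looks like, here's a minimal Monte Carlo sketch. The 20% transmission probability is the upper end of the range above; the trial count and everything else is purely illustrative:

```python
import random

random.seed(42)

TRANSMISSION_PROB = 0.2   # a synapse fires successfully ~10-20% of the time
TRIALS = 100_000          # incoming signals to simulate

# Count how many incoming signals actually make it across the synapse
transmitted = sum(1 for _ in range(TRIALS)
                  if random.random() < TRANSMISSION_PROB)

print(f"{transmitted / TRIALS:.1%} of signals transmitted")
```

The energy saving comes from the roughly 80% of signals that never fire at all, while repeated signaling over time averages out the unreliability of any single synapse.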
That translates into some serious power savings, too. The researchers estimate that the brain draws only around the equivalent of 20 watts while we're awake and actively doing stuff. It's an incredible finding that should spur much more research into just how our brain works and why it works the way it does. Better yet would be the development of systems that mirror the brain's actual neural network, making for more efficient and powerful computers capable of who knows what.