Newsletter Subject

The Power of a Human Brain

From

brownstoneresearch.com

Email Address

feedback@e.brownstoneresearch.com

Sent On

Tue, Jun 18, 2024 08:24 PM

Email Preheader Text

The Power of a Human Brain By Jeff Brown, Editor, The Bleeding Edge Earlier this month, we saw the A

The Bleeding Edge

The Power of a Human Brain

By Jeff Brown, Editor, The Bleeding Edge

Earlier this month, we saw the Aurora supercomputer ascend the rankings to become the second most powerful supercomputer on Earth. It clocked in at an incredible 1.012 exaflops. An exaflop is one quintillion (10^18) floating-point operations per second (FLOPS) – a number so large, it's hard to comprehend.

Exascale computing, once the realm of science fiction, is now possible on two supercomputers based in the U.S. Aurora is in Illinois at Argonne National Laboratory, and Frontier is in Tennessee at Oak Ridge National Laboratory. Both are under the U.S. Department of Energy (DOE) – which is kind of funny considering how much electricity is required to run a computing system of this size…

Source: Argonne National Laboratory

Replicating the Human Brain

Frontier currently holds the top spot at 1.206 exaflops. It's a monster of a supercomputer built on Hewlett Packard Enterprise (HPE) Cray systems – technology HPE owns through its 2019 acquisition of Cray. But at its core, Frontier is powered by Advanced Micro Devices (AMD) central processing units (CPUs) and graphics processing units (GPUs). These powerful chips give Frontier the capability for remarkable computations, allowing scientists to tackle subjects like nuclear fusion, cosmology, complex climate models, and subatomic particle research.

And to do that, Frontier requires 21 megawatts of electricity.

Source: Oak Ridge National Laboratory for the U.S. Department of Energy

Now we know why Aurora and Frontier are housed under the DOE! That's enough electricity to power roughly 20,000 homes.

This has long been a point of consternation in computing. After all, the human brain – a supercomputer in its own right – can perform tasks no supercomputer can. And it requires only about 12 watts (W) to operate. That's less than a normal light bulb.
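A quick back-of-the-envelope calculation puts that efficiency gap in perspective. This sketch uses only the figures cited above (1.206 exaflops and 21 MW for Frontier, ~12 W for the brain):

```python
# Back-of-the-envelope comparison of Frontier's power draw vs. the human brain,
# using the figures quoted above.
frontier_flops = 1.206e18   # 1.206 exaflops = 1.206 * 10^18 FLOPS
frontier_watts = 21e6       # 21 megawatts
brain_watts = 12            # ~12 W, as cited above

# Frontier's energy efficiency in floating-point operations per watt
flops_per_watt = frontier_flops / frontier_watts
print(f"Frontier: {flops_per_watt:.2e} FLOPS per watt")  # ~5.74e+10

# How many times more power Frontier draws than a single brain
print(f"Power ratio: {frontier_watts / brain_watts:,.0f}x")  # 1,750,000x
```

In other words, Frontier draws as much power as roughly 1.75 million brains – which is the consternation the article describes.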
The human brain is remarkable, with roughly 100 billion neurons and the capacity for 100 trillion parameters. No computing system has been able to match that yet, which is why the human brain is such an intense area of study. In a perfect world, we would have massively powerful computing systems that require very little energy to operate. Biological computing is the field exploring these kinds of systems.

A big part of the challenge has been understanding how the brain functions. The field of connectomics focuses on how each brain cell is connected to the others, in an effort to understand how the human brain performs so well with so little energy. And last month, Google finally published some remarkable research in Science after a decade of work in connectomics. The research shed a little light on this dark corner of our understanding of the human brain…

AI Advances Human Understanding

The team at Google, with the help of researchers at Harvard, was able to map a single cubic millimeter of human temporal cortex using electron microscopy. Put more simply, the team imaged a piece of brain tissue about the size of half a grain of rice with an extremely high-resolution microscope.

The data collected is remarkable, amounting to 1.4 million gigabytes (1.4 petabytes). For comparison, an average smartphone holds about 128 gigabytes; the connectomics data for this tiny portion of the brain is roughly 11,000 times that amount.

Excitatory Neurons | Source: Google Research

Such a small volume of brain tissue revealed 16,000 neurons, 32,000 glia, 8,000 blood vessel cells, and 150 million synapses. The mapping was done with specialized software, specifically machine learning (ML), to make such a daunting task possible.

I remember back in 2020 when the same team at Google released their connectome for a portion of the brain of a fruit fly.
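As an aside, the "11,000 times" figure above checks out from the quoted numbers alone:

```python
# Sanity-check the "11,000 smartphones" comparison from the figures above.
dataset_bytes = 1.4e15   # 1.4 petabytes of connectomics data
phone_bytes = 128e9      # a 128 GB smartphone

ratio = dataset_bytes / phone_bytes
print(f"{ratio:,.0f} smartphones' worth of storage")  # ~10,938
```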
That was a monumental achievement at the time, also aided by machine learning, revealing the connections of 25,000 neurons in a portion of the fruit fly brain. It makes Google's latest research that much more remarkable, showing the progress – enabled by ML – in just four years.

One of the more interesting discoveries was a rare class of synaptic connections, shown below. A portion of one neuron (blue) makes more than 50 connections (yellow) with another neuron (green).

Source: Google Research

We don't yet understand the significance of this, or how it all works. This is just the starting point. Once we understand how the brain is structured and interconnected, we will ultimately be able to figure out the secrets of how it works.

And in the meantime… the semiconductor and computing industries are in an accelerated race to build technology capable of performing like the human brain – not so much matching the brain's energy efficiency, but managing a trillion bits, or a trillion parameters of information, or more.

The Tech Behind the World's First Ultra-Intelligence AI Computer

One of the possibilities for Frontier is to facilitate mapping the human brain. The significance of exascale computing is that the electron microscopy data for an entire brain would be measured in exabytes, not petabytes. The task will therefore require an exascale supercomputer.

And while the efforts to map the brain continue, a much faster race is taking place: developing a computing system capable of exceeding the human brain. This requires both semiconductors and computing systems designed and optimized for neural networks, a form of artificial intelligence (AI). This kind of design is closer to our brains than supercomputers like Aurora or Frontier.

Graphcore is a private company that has been on this mission, developing its intelligence processing unit (IPU).
Graphcore released its Bow IPU in 2022, with a unique stacked wafer-on-wafer design enabling a 3D semiconductor architecture.

Source: Graphcore

This design enables parallel processing on a massive scale. A simple reference, shown below, is the difference between a CPU (like AMD's or Intel's), a GPU (like Nvidia's or AMD's), and an IPU. GPUs are designed to ingest large blocks of contiguous data: a single instruction applied to multiple data inputs in parallel.

Source: Graphcore

In the case of the IPU, the design supports multiple instructions and multiple data inputs. In this way, each processor in an IPU can function independently of the others, maximizing throughput on complex computations.

2024 is significant because it is the year in which Graphcore is scheduled to deliver what it calls the world's first ultra-intelligence AI computer.

The Good Computer

Called the Good Computer, it's named after Jack Good, the computer scientist who, back in 1965, described a machine more capable than the human brain. The Good Computer will be powered by the next generation of Graphcore IPUs and will be capable of more than 10 exaflops of AI-specific computational power. And it will support AI models with 500 trillion parameters – something we haven't yet seen.

Not only would a system like this provide a path toward artificial general intelligence (AGI), which is right around the corner, but it's the kind of horsepower that will lead to artificial superintelligence (ASI).

And what's equally incredible is that the system is expected to cost "only" $120 million. I say "only" because today's most advanced large language models are multibillion-parameter models (OpenAI's GPT-4o is reportedly a 200 billion-parameter model) that cost hundreds of millions of dollars to train.

We're only halfway into the year in terms of AI breakthroughs. It's our brain's natural thought process to assume that AI will improve in small, incremental steps. But this is precisely where our brains are weak.
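As a side note, the GPU-versus-IPU execution models described earlier – single instruction over a block of data versus many independent instruction streams – can be sketched in plain Python. This is a conceptual illustration only (real GPUs and IPUs do this in hardware, and the function names here are invented for the sketch):

```python
from concurrent.futures import ThreadPoolExecutor

# SIMD (GPU-style): ONE instruction applied to a whole block of data in lockstep.
def simd_scale(data, factor):
    # Every element undergoes the same operation.
    return [x * factor for x in data]

# MIMD (IPU-style): each worker runs its OWN instruction stream on its own data.
def mimd_run(tasks):
    # tasks: list of (function, argument) pairs; each "processor" is independent.
    with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
        futures = [pool.submit(fn, arg) for fn, arg in tasks]
        return [f.result() for f in futures]

print(simd_scale([1, 2, 3, 4], 10))                               # [10, 20, 30, 40]
print(mimd_run([(sum, [1, 2, 3]), (max, [4, 5]), (len, "abc")]))  # [6, 5, 3]
```

The MIMD sketch is the key property the article attributes to the IPU: each processor can execute a different instruction on different data at the same time.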
The advancements in semiconductor technology, specifically related to AI, are happening at an exponential pace. In fact, they are happening even faster than Moore's Law right now. And these AI-specific semiconductors, and related computing systems like the Good Computer, are declining in cost per unit of computing power. That means those developing AI software can do it faster and far cheaper than the year before. And that means accelerated innovation in AI is happening at a pace none of us has ever seen.

Soon, very soon, the entire world will be alive with a network of intelligence more powerful than what the collective brains of the human race can produce together. And there won't be just one AGI; there will be millions. Just imagine the implications of having intelligent "machines" capable of working around the clock, performing research and development autonomously.

We're at the bleeding edge of technology and at the outer limits of what's possible. And we're all on this incredible ride together.

I'm back,

Jeff Brown
Editor, Brownstone Research

---------------------------------------------------------------

Like what you're reading? Send your thoughts to feedback@brownstoneresearch.com.

In Case You Missed It

We invite you to watch Jeff Brown's return message to his readers. You can view it right here…

FAQ

Editor's Note: We recognize readers may have questions regarding the transition to the new Brownstone Research with Jeff Brown. Our team has prepared some helpful answers in our FAQ. We welcome all reader questions and feedback – we want to hear from you! Please write to us at feedback@brownstoneresearch.com. After reading the FAQ, if you still have questions, you may reach our dedicated Customer Service team at 1-888-493-3156.
Brownstone Research
55 NE 5th Avenue, Delray Beach, FL 33483
www.brownstoneresearch.com

To ensure our emails continue reaching your inbox, please add our email address to your address book. This editorial email containing advertisements was sent to {EMAIL} because you subscribed to this service. To stop receiving these emails, click here.

Brownstone Research welcomes your feedback and questions. But please note: The law prohibits us from giving personalized advice. To contact Customer Service, call toll-free (Domestic/International): 1-888-512-0726, Mon–Fri, 9am–7pm ET, or email memberservices@brownstoneresearch.com.

© 2024 Brownstone Research. All rights reserved. Any reproduction, copying, or redistribution of our content, in whole or in part, is prohibited without written permission from Brownstone Research.

Privacy Policy | Terms of Use


