The Next Generation Of Brain Mimicking AI

From New Mind.


The tech industry’s obsession with AI is hitting a major limitation: power consumption. Training and running AI models is proving extremely energy intensive. A single GPT-4 request consumes as much energy as charging 60 iPhones, roughly 1,000 times more than a traditional Google search. By 2027, global AI processing could consume as much energy as the entire country of Sweden. The human brain, by contrast, is far more efficient: 17 hours of intense thought uses about the same energy as one GPT-4 request. This gap has spurred a race to develop AI that more closely mimics biological neural systems.

The high power usage stems from how artificial neural networks (ANNs) are structured: input, hidden, and output layers of interconnected nodes. Information flows forward through the network, and training uses backpropagation to adjust weights and biases until output errors are minimized. At scale, ANNs demand massive computation: the GPT-3 language model has 175 billion parameters, and training it consumed roughly 220 MWh of energy.
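The forward pass and backpropagation described above can be sketched in miniature. This is a hypothetical toy example, a 2-input, 2-hidden, 1-output sigmoid network learning XOR, not anything resembling GPT-3; the point is that every training step touches every weight, which is why the computation (and energy) scales with parameter count.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# randomly initialized weights and biases (2 hidden neurons, 1 output)
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
lr = 0.5  # learning rate

def forward(x):
    # information flows forward: input -> hidden -> output
    h = [sigmoid(sum(w * xi for w, xi in zip(w_h[j], x)) + b_h[j])
         for j in range(2)]
    y = sigmoid(sum(w * hj for w, hj in zip(w_o, h)) + b_o)
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial = total_error()
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # backpropagation: chain rule carries the output error
        # backward to adjust every weight and bias
        d_y = (y - t) * y * (1 - y)
        for j in range(2):
            d_h = d_y * w_o[j] * h[j] * (1 - h[j])
            w_o[j] -= lr * d_y * h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h * x[i]
            b_h[j] -= lr * d_h
        b_o -= lr * d_y

print(initial, total_error())  # error shrinks as weights are adjusted
```

Even in this tiny network, every example in every epoch recomputes all nodes and updates all parameters; multiply that by 175 billion parameters and trillions of tokens, and the energy figures above follow.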

To improve efficiency, research is shifting to spiking neural networks (SNNs), which communicate through discrete spikes like biological neurons. An SNN neuron accumulates a membrane potential and fires only when that potential exceeds a threshold, then enters a refractory period before it can fire again. Because computation happens only when spikes occur, SNNs can use far less energy than ANNs, which recalculate every node on every pass. This event-driven behavior also yields dynamic, time-dependent outputs. However, SNNs are difficult to train with standard ANN methods, since discrete spikes are not differentiable in the way backpropagation requires.
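The membrane-potential dynamics above can be sketched with a leaky integrate-and-fire (LIF) neuron, one common SNN neuron model. The parameter values here are illustrative assumptions, not taken from any particular chip or paper.

```python
# Illustrative LIF parameters (assumed, not from a specific system)
THRESHOLD = 1.0   # membrane potential that triggers a spike
LEAK = 0.9        # fraction of potential retained each timestep
REFRACTORY = 3    # timesteps the neuron stays silent after spiking

def simulate(inputs):
    """Return the spike train (0/1 per timestep) for a stream of input currents."""
    v = 0.0               # membrane potential
    refractory_left = 0
    spikes = []
    for current in inputs:
        if refractory_left > 0:
            refractory_left -= 1   # refractory period: input is ignored
            spikes.append(0)
            continue
        v = v * LEAK + current     # integrate input, leak old potential
        if v >= THRESHOLD:
            spikes.append(1)       # emit a discrete spike...
            v = 0.0                # ...and reset the membrane potential
            refractory_left = REFRACTORY
        else:
            spikes.append(0)       # below threshold: no event, no output
    return spikes

print(simulate([0.3] * 20))  # steady input -> periodic spikes
print(simulate([0.0] * 20))  # no input -> no spikes, no work done
```

The second call shows the event-driven efficiency argument in miniature: with no input there are no spikes, so downstream neurons have nothing to compute, whereas a conventional ANN would still evaluate every node.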

SNNs run inefficiently on traditional von Neumann computer architectures. Instead, neuromorphic computing devices are being developed that recreate the properties of biological neurons directly in hardware. These use analog processing in components such as memristors and spintronic devices to achieve neuron-like behavior at low power. Early neuromorphic chips from IBM and Intel have supported millions of simulated neurons with 50-100x better energy efficiency than GPUs. As of 2024, no analog AI chips are commercially available, but a hybrid analog-digital future for ultra-efficient AI hardware seems imminent, one that could enable revolutionary advances in fields like robotics and autonomous systems in the coming years.

Denis Dmitriev –
Jay Alammar –
Ivan Dimkovic –
Carson Scott –