Neuromorphic Computing — Where silicon meets synapses

The brain is wilder than the sky

Shaunak Inamdar
5 min read · May 15, 2024

“Nature is the Mother of All Information. She is the source, the keeper, the database, the memory bank of all information.” — Originemology

A thought illustrated

The evolution of human thought unfolded over millions of years. Complex cognitive abilities that we would recognize as human-like thinking likely began to develop with Homo habilis around 2.3 million years ago and became more prominent with Homo sapiens, who appeared around 300,000 years ago. It took tens of thousands of evolutionary events before we could form thoughts at all.

The Spike Symphony

Neural activity manifests as Spikes

Higher cognitive functions are believed to take place in the neocortex, the layered outer sheet of the brain. It perceives the world for the brain through the senses, and it works closely with the hippocampus, the region responsible for memory and learning. The connection between the neocortex and the hippocampus is especially interesting when looked at in the context of computing.

Conventional computing offers no account of how humans come up with creative thought. For that, we need to draw inspiration from human intelligence itself. The perpetually evolving electrochemical soup that makes up the brain is elusive but fascinating to study. To understand neuromorphic chips, we first have to understand neurons. The brain operates by producing spikes; think of spikes as the language of the brain, the neurons’ way of communicating with each other. When a neuron receives chemical or electrical signals from other neurons at its inputs (dendrites), it integrates them and, if the combined input is strong enough, fires a spike lasting only a fraction of a second.
Neurons operate in an event-driven way, processing information only when relevant events occur, and in doing so they form memories.
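To make this concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models. The threshold, leak rate, and input values are illustrative choices, not biological constants:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# integrates incoming current, leaks toward rest each step, and emits a
# spike when it crosses a threshold. All parameters are illustrative.

def simulate_lif(input_current, threshold=1.0, leak=0.9, dt=1.0):
    """Return the time steps at which the neuron spikes."""
    v = 0.0                        # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in * dt   # leaky integration of the input
        if v >= threshold:         # strong enough input -> spike
            spikes.append(t)
            v = 0.0                # reset after the spike
    return spikes

print(simulate_lif([0.3] * 10))    # → [3, 7]
```

Running it on a steady weak input shows the event-driven character: the potential accumulates silently, and output occurs only at the moments it crosses threshold.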

The Memory Problem

A “memory” being formed in a living brain

In the brain, working memory resides within dynamic patterns of activity among neurons. When neurons encounter a signal resembling previous ones, they tend to respond similarly. This ability to adapt is crucial for encoding useful information while filtering out random noise. Synaptic weights, the strengths of connections between neurons, play a pivotal role in this process. They inherently adjust to store valuable patterns, enhancing memory retention. It’s as if they’re stretching and reshaping themselves to fit the new information. This ability allows our brain to strengthen important connections and weaken less important ones, which is crucial for learning and memory.
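A toy version of this weight adjustment is the classic Hebbian rule ("neurons that fire together wire together"). The learning and decay rates below are illustrative, not derived from any biological measurement:

```python
import numpy as np

# Toy Hebbian update: connections between co-active neurons strengthen,
# while a small decay term gradually weakens unused connections.
def hebbian_update(w, pre, post, lr=0.1, decay=0.01):
    return w + lr * np.outer(post, pre) - decay * w

w = np.zeros((2, 3))                  # synaptic weights, initially silent
pre = np.array([1.0, 0.0, 1.0])       # presynaptic activity pattern
post = np.array([1.0, 0.0])           # postsynaptic activity pattern
w = hebbian_update(w, pre, post)      # only co-active pairs strengthen
```

Repeated exposure to the same pattern keeps strengthening the same entries of `w`, which is one simple way to picture how useful patterns get encoded while uncorrelated noise decays away.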

Neuromorphic computing tackles this challenge with a remarkable component called the “memristor.” Like a resistor, a memristor regulates electrical flow, but it possesses a unique trait: it can “remember” how much charge has passed through it. Memristors can thereby emulate the synaptic plasticity of biological neurons, storing synaptic weights that change over time based on activity patterns. Because they retain their resistance state, they can store large amounts of information in a compact physical space while remaining exceptionally energy-efficient.
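A rough sketch of this behavior, loosely inspired by linear ion-drift memristor models: resistance depends on an internal state that is moved by the charge flowing through the device, and the state persists after the current stops. The constants here are illustrative, not real device parameters:

```python
# Toy memristor: resistance depends on the total charge that has flowed
# through the device, so it "remembers" past current even when idle.
class Memristor:
    def __init__(self, r_on=100.0, r_off=16000.0):
        self.r_on, self.r_off = r_on, r_off
        self.x = 0.0               # internal state in [0, 1]

    def resistance(self):
        # Interpolate between the fully-on and fully-off resistances.
        return self.x * self.r_on + (1 - self.x) * self.r_off

    def apply_current(self, current, dt, k=1e3):
        # Charge shifts the internal state; the state then persists,
        # which is what lets the device act as a stored synaptic weight.
        self.x = min(1.0, max(0.0, self.x + k * current * dt))

m = Memristor()
before = m.resistance()
m.apply_current(1e-4, dt=1.0)      # pass some charge through
after = m.resistance()             # resistance has dropped and stays
```

The stored state doubles as both memory and computation substrate, which is why memristor crossbars are attractive for holding synaptic weights in place rather than shuttling them between separate memory and processor.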

The Temporal Dimension

A message through time

In conventional machine learning, such as in regular feed-forward neural networks, the concept of “time” is often overlooked. These networks process inputs, like images, and produce outputs, such as classifications, without accounting for the passage of time. While they perform crucial computations, time is not a primary consideration.

In contrast, the neural networks in our brains operate as spiking neural networks (SNNs), in which the temporal domain holds significant importance. These networks must adapt to the various “time-scales” of incoming signals, making the temporal aspect vital to their functioning.

The ability to change, and remember, their resistance based on how long a current has been applied gives memristors the unique ability to capture some “temporal” characteristics of an input signal. Memristors can also perform computations that take the temporal dynamics of input signals into account.

When incorporated into artificial neural networks, memristors can be used to implement temporal coding schemes, in which information is encoded not only in the strength of connections but also in the timing of neural spikes.
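One widely studied timing-based learning rule is spike-timing-dependent plasticity (STDP): the sign and size of a weight change depend on the relative timing of pre- and postsynaptic spikes. The sketch below uses illustrative amplitudes and time constants:

```python
import math

# STDP sketch: if the presynaptic spike arrives just before the
# postsynaptic one, the synapse strengthens (potentiation); the reverse
# order weakens it (depression). Constants are illustrative only.
def stdp_delta(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    dt = t_post - t_pre
    if dt > 0:     # pre before post: potentiation, decaying with the gap
        return a_plus * math.exp(-dt / tau)
    else:          # post before (or with) pre: depression
        return -a_minus * math.exp(dt / tau)

print(stdp_delta(10, 15))   # pre leads post -> positive weight change
print(stdp_delta(15, 10))   # post leads pre -> negative weight change
```

Because memristor conductance changes with the timing and duration of applied pulses, overlapping pre- and post-spike pulses can realize an update of roughly this shape directly in the device physics.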

The Utopian Future

Biological neurons, Silicon networks.

Once these designs are scaled up to large network sizes, they will profoundly shape the future of devices. Imagine laptops, mobile phones, and robots equipped with ultra-low-power neuromorphic chips capable of processing various sensory inputs, from audio to visual information. Medical sensors and devices could continuously monitor vital signs and treatment responses, dynamically adjusting dosages or detecting issues early. Your smartphone could evolve into an intuitive assistant, providing background information on someone you’re about to meet or reminding you to leave for your next meeting precisely on time. And the advances don’t stop there: breakthroughs in bio-memristor technology, using DNA and even blood cells as computational units, are blurring the line between silicon and biological systems. This convergence opens up a realm of possibilities in which neuroprosthetics could harness hybrid organic and digital semiconductors.

A man-made organ.

As analog computing methods that mirror the enigmatic behavior of the human brain merge seamlessly with physical devices, we witness the dawn of a new era: the simulation of the human cortex. This gives us a hint of the hardware that can produce highly efficient, purpose-driven sequences of spikes and connections leading to conclusions grounded in reality. Yet its significance extends far beyond mere technological advancement; it beckons us to embrace immense opportunities and to reshape our understanding of human cognition.

Thank you very much for reading :) Please drop a comment with your thoughts, subscribe for more articles like this, and follow me on Medium and LinkedIn.


Shaunak Inamdar

Shaunak Inamdar is a CS undergrad with a passion for writing about technologies and making them accessible to a broader audience. www.shaunak.tech