
Hebbian Learning and its Variants

Learn about Hebbian Learning and its Variants as part of Neuromorphic Computing and Brain-Inspired AI

Hebbian Learning: The Foundation of Brain-Inspired Adaptation

Hebbian learning, often summarized as 'neurons that fire together, wire together,' is a fundamental principle in neuroscience and a cornerstone for understanding adaptive algorithms in neuromorphic computing and brain-inspired AI. It proposes that the connection strength between two neurons increases if they are simultaneously active.

Simultaneous activation strengthens synaptic connections.

Imagine two friends who always meet up at the same time. Over time, their bond strengthens because they consistently share experiences. Hebbian learning applies this to neurons: if neuron A and neuron B are active at the same time, the connection between them becomes more robust.

Donald Hebb's 1949 book 'The Organization of Behavior' introduced this concept. He hypothesized that learning occurs by modifying the efficacy of synaptic connections: if a presynaptic neuron repeatedly or persistently takes part in firing a postsynaptic neuron, the synaptic connection between them is strengthened. This provides a biologically plausible mechanism for how memories and associations form in the brain.

What is the core principle of Hebbian learning?

Neurons that fire together, wire together.

Mathematical Formulation of Hebbian Learning

The simplest mathematical representation of Hebbian learning updates the synaptic weight $w_{ij}$ between a presynaptic neuron $j$ and a postsynaptic neuron $i$. The change in weight, $\Delta w_{ij}$, is proportional to the product of the pre- and postsynaptic activities ($a_j$ and $a_i$, respectively).

The basic Hebbian learning rule can be expressed as $\Delta w_{ij} = \eta \, a_i a_j$, where $\eta$ is the learning rate. If both neurons are active ($a_i > 0$ and $a_j > 0$), the weight increases. If either neuron is inactive, the product, and hence the weight change, is zero; weight decreases arise only in variants of the rule. This simple multiplicative rule captures the essence of correlated activity leading to potentiation.
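To make the update concrete, here is a minimal NumPy sketch of the basic rule. The learning rate and activity values are illustrative choices, not taken from the text.

```python
import numpy as np

# Basic Hebbian update, Delta w_ij = eta * a_i * a_j, applied to a whole
# weight matrix at once. Activities and learning rate are illustrative.

eta = 0.1                                # learning rate (eta)
a_pre = np.array([1.0, 0.0, 0.5])        # presynaptic activities a_j
a_post = np.array([0.8, 0.2])            # postsynaptic activities a_i

W = np.zeros((a_post.size, a_pre.size))  # W[i, j] = w_ij

# The outer product computes eta * a_i * a_j for every (i, j) pair.
W += eta * np.outer(a_post, a_pre)
print(W)
# Only synapses whose pre- and postsynaptic units are both active change;
# the column for the silent presynaptic unit (a_j = 0) stays zero.
```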


Variants and Extensions of Hebbian Learning

While the basic Hebbian rule is powerful, it has limitations, such as unbounded weight growth. Several variants have been developed to address these issues and incorporate more nuanced biological observations.

| Variant | Key Feature | Mechanism | Advantage |
| --- | --- | --- | --- |
| Basic Hebbian | Correlation of pre/post activity | $\Delta w_{ij} = \eta \, a_i a_j$ | Simple, biologically inspired |
| Hebbian with Decay | Synaptic depression | $\Delta w_{ij} = \eta \, a_i a_j - \beta w_{ij}$ | Prevents unbounded growth |
| Spike-Timing-Dependent Plasticity (STDP) | Temporal order of spikes | Potentiation if the pre-spike precedes the post-spike; depression if the order is reversed | More biologically realistic; captures temporal dependencies |
| Covariance Rule | Correlation of deviations from mean activity | $\Delta w_{ij} = \eta (a_i - \bar{a}_i)(a_j - \bar{a}_j)$ | Supports both potentiation and depression |
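The sketch below illustrates two of the variants from the table: the decay rule, whose $-\beta w_{ij}$ term keeps weights bounded, and one common exponential form of the STDP window. The constants ($\eta$, $\beta$, the STDP amplitudes, and the time constant) are illustrative assumptions, and the window shown is a standard textbook choice rather than the only form used in practice.

```python
import numpy as np

def hebbian_with_decay(W, a_post, a_pre, eta=0.1, beta=0.01):
    # Delta w_ij = eta * a_i * a_j - beta * w_ij: the decay term pulls
    # weights toward zero, so correlated activity no longer grows them forever.
    return W + eta * np.outer(a_post, a_pre) - beta * W

def stdp_weight_change(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    # One common exponential STDP window; dt = t_post - t_pre in milliseconds.
    # Pre-before-post (dt > 0) potentiates, post-before-pre depresses.
    if dt > 0:
        return a_plus * np.exp(-dt / tau)
    return -a_minus * np.exp(dt / tau)

# With constant, fully correlated activity the decay rule converges to the
# fixed point eta * a_i * a_j / beta instead of diverging.
W = np.zeros((1, 1))
for _ in range(1000):
    W = hebbian_with_decay(W, np.array([1.0]), np.array([1.0]))
print(W)                         # approaches eta / beta = 10.0

print(stdp_weight_change(5.0))   # pre leads post by 5 ms -> positive change
print(stdp_weight_change(-5.0))  # post leads pre by 5 ms -> negative change
```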

Applications in Neuromorphic Computing and AI

Hebbian learning and its variants are crucial for developing self-organizing systems, unsupervised learning algorithms, and adaptive neural networks. They enable hardware to learn and adapt in real-time, mimicking the brain's efficiency and flexibility.

Hebbian learning is a key enabler for 'on-chip' learning in neuromorphic hardware, allowing systems to adapt to new data without explicit retraining in the cloud.

These principles are applied in areas such as pattern recognition, associative memory, reinforcement learning, and creating more energy-efficient AI systems that operate closer to biological neural networks.
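As a concrete illustration of the associative-memory application, the sketch below stores binary patterns with a Hebbian outer-product rule (Hopfield-style) and recalls one of them from a corrupted cue. The pattern size, the number of stored patterns, and the synchronous recall rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))   # three random +/-1 patterns

# Hebbian outer-product storage: each pattern strengthens the weights
# between units that are active together (and weakens anti-correlated pairs).
W = sum(np.outer(p, p) for p in patterns) / patterns.shape[1]
np.fill_diagonal(W, 0)                         # no self-connections

cue = patterns[0].copy()
cue[:16] *= -1                                 # corrupt a quarter of the bits

state = cue
for _ in range(5):
    state = np.sign(W @ state)                 # each unit aligns with its input
    state[state == 0] = 1                      # break ties deterministically

print(np.mean(state == patterns[0]))           # overlap with stored pattern
```

With only a few stored patterns relative to the network size, recall from a partially corrupted cue typically converges back to the stored pattern within a few updates.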

Learning Resources

Hebbian Learning - Scholarpedia(wikipedia)

A comprehensive overview of Hebbian learning, its history, mathematical formulations, and biological relevance.

Spike-Timing Dependent Plasticity (STDP) - Scholarpedia(wikipedia)

Details the crucial STDP variant, explaining how the timing of neural spikes influences synaptic plasticity.

Neuromorphic Computing - An Overview(blog)

An introduction to neuromorphic computing, highlighting the role of brain-inspired learning principles like Hebbian learning.

Introduction to Hebbian Learning(video)

A video tutorial explaining the fundamental concepts and mathematical basis of Hebbian learning.

The Organization of Behavior - Donald Hebb (Book Excerpt/Review)(blog)

Discusses Donald Hebb's seminal work and the introduction of the Hebbian learning principle.

Learning and Memory: The Hebbian Hypothesis(paper)

A scientific paper exploring the biological underpinnings and implications of the Hebbian hypothesis for learning and memory.

Artificial Neural Networks: A Tutorial(tutorial)

Provides a broader context of neural networks, including how learning rules like Hebbian learning are implemented.

Deep Learning for Neuromorphic Computing(paper)

A Nature article discussing the intersection of deep learning and neuromorphic computing, often leveraging Hebbian principles.

Hebbian Learning in Artificial Neural Networks(paper)

A research paper detailing various implementations and applications of Hebbian learning in artificial neural networks.

Introduction to Computational Neuroscience(tutorial)

A tutorial that covers foundational concepts in computational neuroscience, including synaptic plasticity and Hebbian learning.