26th EAAAI (EANN) 2025, 26 - 29 June 2025, Limassol, Cyprus

Brain Inspired Learning for Neural Networks

Hewavitharana Jayani, Anand Amida, Giese Peter, Ierardi Carolina, Steinhofel Kathleen

Abstract:

  Artificial neural networks (ANNs) have achieved remarkable success across AI applications, yet their learning mechanisms remain fundamentally different from those of biological systems. Whereas conventional ANNs rely on global weight updates via backpropagation, biological learning operates through more localised, energy-efficient synaptic modifications. Inspired by these principles, this study investigates two alternative learning rules, modelled on long-term potentiation (LTP) as observed in young brains and on the multi-innervated spines (MIS) observed in ageing brains. We implement both mechanisms in a bipartite artificial neural network and analyse their impact on learning speed, network adaptation, and the specificity of output representations. Our results show that LTP-based learning yields rapid convergence, with 85% of input patterns achieving the learning objective in the minimum number of iterations. In contrast, MIS-based learning is slower and incremental but enables weight redistribution, supporting greater flexibility. Despite these differences, both mechanisms produce highly distinct representational spaces, with approximately 79% of output patterns being unique. Moreover, the representations learned by the two approaches overlap by only 21%, highlighting their fundamentally different learning trajectories and their potential benefits for ensemble learning and hybrid architectures.
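To make the contrast with backpropagation concrete, the following is a minimal illustrative sketch of a local, LTP-style potentiation rule in a tiny bipartite network. It is not the authors' implementation; the network size, learning rate, firing threshold, and target-gated co-activity rule are all assumptions introduced purely for illustration.

```python
# Hypothetical sketch of LTP-style local learning in a bipartite network
# (input units -> output units). NOT the paper's actual model; all
# parameters and the gating rule are illustrative assumptions.

def ltp_step(weights, pre, post, lr=0.5):
    """Hebbian-style potentiation: strengthen only synapses whose pre- and
    post-synaptic units are co-active. The rule is purely local -- no
    backpropagated error signal is needed."""
    for i, a in enumerate(pre):
        for j, b in enumerate(post):
            if a > 0 and b > 0:  # co-activity gates potentiation
                weights[i][j] += lr * a * b
    return weights

def forward(weights, pre, threshold=1.0):
    """Binary outputs: a post-synaptic unit fires if its summed drive
    reaches the firing threshold."""
    n_out = len(weights[0])
    drive = [sum(weights[i][j] * pre[i] for i in range(len(pre)))
             for j in range(n_out)]
    return [1 if d >= threshold else 0 for d in drive]

# Toy run: repeatedly potentiate co-active synapses until one input
# pattern produces its target output pattern.
w = [[0.2, 0.0], [0.0, 0.2], [0.1, 0.1]]  # 3 inputs x 2 outputs
x, target = [1, 0, 1], [1, 0]
for _ in range(5):  # few iterations suffice, mirroring fast LTP learning
    if forward(w, x) == target:
        break
    w = ltp_step(w, x, target)
```

Because only co-active synapses change, weights onto inactive output units stay untouched, which is one way a local rule can preserve the specificity of other learned representations.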

*** Title, author list and abstract as submitted during Camera-Ready version delivery. Small changes that may have occurred during processing by Springer may not appear in this window.