Neuromorphic Computing and Its Convergence with Right-Brain AI (RAI)
J.Konstapel, Leiden, 25-11-2025.

Executive summary
Neuromorphic computing is moving from a niche research topic to a strategic pillar in the search for energy- and data-efficient AI. It replaces the classical von Neumann separation of memory and processing with brain-inspired architectures that co-locate storage and computation, operate event-based in time, and exploit the physics of devices rather than abstracting it away.
Three developments make this space strategically relevant now:
- The energy crisis of AI and HPC – leading researchers and industry actors (Intel, IBM, many academics) explicitly frame neuromorphic as a response to the unsustainable compute and energy cost of large-scale AI.
- The maturation of enabling devices and architectures – phase-change memory, memristive arrays, spintronics, photonics and large digital neuromorphic platforms (Loihi, SpiNNaker, BrainScaleS) provide multiple technical paths with different risk/return profiles.
- The emergence of integrated roadmaps and master plans – the 2022 Roadmap on Neuromorphic Computing and Engineering and the 2022 Nature paper Brain-inspired computing needs a master plan move the field into the realm of strategic technology planning, comparable to quantum.
Parallel to this, the Right-Brain AI (RAI) framework proposes a more radical shift: from probability-driven, “left-brain” AI (LLMs, transformers) to resonance- and coherence-based architectures organised as a “Resonant Stack” of oscillatory layers, with explicit coupling to existing LAI systems.
In this report:
- Sections 1–2 define neuromorphic computing and trace its history.
- Sections 3–4 describe the current state, key actors and their visions.
- Section 5 sketches technical and market futures.
- Section 6 links neuromorphic computing to Right-Brain AI / RAI and outlines how neuromorphic platforms can underpin resonant, right-brain architectures.
- Section 7 extracts strategic implications.
1. What is neuromorphic computing?
Definition.
Neuromorphic computing refers to hardware and systems whose architecture and dynamics are inspired by biological nervous systems. Rather than executing neural networks as software on a general-purpose processor, neuromorphic systems:
- Co-locate memory and computation (often in synapse-like devices or arrays).
- Use spikes or events in continuous time rather than global clocked steps.
- Exploit device physics (e.g., conductance changes, phase transitions, spin dynamics) as part of the computation.
The goal is not only to imitate the brain, but to achieve orders of magnitude better energy efficiency and throughput on tasks such as perception, control and associative memory than conventional digital systems.
Key characteristics vs. conventional AI hardware
- Architectural: classical systems separate CPU/GPU and DRAM (the von Neumann architecture). Neuromorphic systems embed local memory in synapse-like devices and reduce expensive memory traffic.
- Temporal: neuromorphic circuits are usually event-driven and asynchronous; they process spikes or events when they occur, saving energy in idle periods.
- Physical: computation is analog or mixed-signal at the device level, even when the system is digitally orchestrated. An example is a phase-change memory cell that accumulates conductance changes as part of a correlation computation.
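The event-driven principle can be made concrete with a minimal leaky integrate-and-fire (LIF) neuron – an illustrative Python sketch with made-up parameter values, not the circuit-level model of any particular chip. The membrane state decays continuously, work is done only when an input event arrives, and an output spike is emitted on a threshold crossing:

```python
def lif_neuron(input_spikes, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0, w=0.4):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    continuously but only jumps when an input event arrives."""
    v = 0.0
    out_spikes = []
    for t, spike in enumerate(input_spikes):
        v += dt / tau * (-v)      # passive leak toward rest
        if spike:                 # event-driven update: no event, no work
            v += w
        if v >= v_thresh:         # threshold crossing -> emit output spike
            out_spikes.append(t)
            v = v_reset
    return out_spikes

# A burst of input events drives the neuron over threshold; silence costs nothing.
inputs = [1, 1, 1, 0, 0, 1, 1, 1, 0, 0]
print(lif_neuron(inputs))  # -> [2, 7]
```

Idle periods produce no output and, on event-driven hardware, would consume essentially no dynamic power – which is the point of the temporal characteristic above.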
2. Historical development
2.1 Origins: Carver Mead and analog VLSI
Neuromorphic engineering originates from work in the 1980s by Carver Mead at Caltech. Mead’s book Analog VLSI and Neural Systems (1989) and his 1990 paper Neuromorphic Electronic Systems framed the idea of building electronic systems that emulate the physics of neural computation using analog transistors operating in subthreshold.
Early work targeted silicon retinas, cochleas and simple neural circuits, using continuous-time differential equations implemented directly in circuits rather than in software.
2.2 2000–2015: from circuits to systems
In the 2000s and early 2010s, neuromorphic engineering expanded from individual circuits to more complex spiking networks and sensory-motor systems:
- Indiveri and others developed libraries of analog/digital neuron and synapse circuits and demonstrated small autonomous cognitive systems.
- Reviews such as Neuromorphic Electronic Circuits for Building Autonomous Cognitive Systems (Chicca & Indiveri, 2014) argued that neuromorphic circuits can implement working memory, decision-making and sensory processing in real time at very low power.
- Large-scale digital neuromorphic platforms (e.g. early SpiNNaker and BrainScaleS efforts in Europe) explored how to scale spiking simulations to millions of neurons on custom hardware.
2.3 2015–today: devices, platforms and roadmaps
From roughly 2015 onwards, three strands converged:
- New devices and materials
- Phase-change memory (PCM) arrays and resistive memories were explored as “computational memory” where the same devices that store data also perform operations such as correlation and matrix–vector multiplication.
- Spintronic devices (e.g. magnetic tunnel junctions, spin-torque oscillators) were proposed as synapses and neurons with non-volatility and rich dynamics.
- Industrial-scale digital neuromorphic systems
- Intel’s Loihi and Loihi 2 research chips, and the 2024 Hala Point system with 1,152 Loihi 2 processors (≈1.15 billion neurons, 128 billion synapses, ~20 peta-operations/s at >15 TOPS/W), position neuromorphic hardware as a candidate for mainstream AI workloads.
- Large-scale spiking array processors such as SpiNNaker provide a software-programmable platform for spiking neural networks and brain models, emphasising flexibility and scale.
- Strategic framing and roadmaps
- The 2022 Roadmap on Neuromorphic Computing and Engineering provides a broad, multi-author assessment from materials through devices, circuits, algorithms, applications and ethics. It highlights energy-efficient edge computing and a shift of control from data centres to embedded systems as key application niches.
- Mehonic & Kenyon’s Brain-inspired computing needs a master plan argues that brain-inspired computing requires the same level of coordinated investment and strategic planning as quantum technologies, or it will remain fragmented and fail to reach impact.
More recently, Indiveri’s 2025 Neuromorphic is dead. Long live neuromorphic reframes neuromorphic not as narrow brain mimicry but as a broader movement toward event-based, energy-efficient computing architectures that may look quite different from early neuromorphic visions.
3. Current state of the field
3.1 Devices and materials
Phase-change and resistive memories.
PCM and related resistive memory technologies (RRAM, OxRAM) are central in IBM’s and others’ neuromorphic work. In “computational memory”, arrays of such devices implement operations in situ, such as weighted sums or correlation detection, by exploiting their analog conductance states and dynamics.
This enables:
- High-density synapse arrays for spiking networks.
- Low-precision but massively parallel analog compute, particularly suited for inference or sensory preprocessing.
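The in-situ weighted sum can be illustrated with a toy NumPy model of a crossbar array – a deliberately simplified sketch (illustrative sizes, Gaussian programming noise) of analog matrix–vector multiplication, not a model of any specific PCM or RRAM device:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target weights, stored as analog conductances in a 4x8 crossbar.
W = rng.uniform(0.0, 1.0, size=(4, 8))      # 4 outputs x 8 inputs
x = rng.uniform(0.0, 1.0, size=8)           # input voltages

# Ideal in-memory multiply: per-device currents sum along each line
# (Ohm's and Kirchhoff's laws do the multiply-accumulate "for free").
y_ideal = W @ x

# Real devices drift and are noisy: low-precision but massively parallel.
G = W + rng.normal(0.0, 0.02, size=W.shape)  # actually programmed conductances
y_analog = G @ x

print(np.max(np.abs(y_analog - y_ideal)))    # small but nonzero analog error
```

The error term is why such arrays suit inference and sensory preprocessing, where modest precision loss is acceptable, better than exact numerical workloads.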
Spintronics.
Spintronic devices are attractive as they combine non-volatility, high endurance and rich non-linear dynamics. Grollier’s review Neuromorphic spintronics identifies multiple neuromorphic roles: synaptic elements (multi-level conductance), neuron-like oscillators, and stochastic units for probabilistic computing.
Towards photonic and hybrid platforms.
The roadmap highlights photonic neuromorphic approaches – using integrated optics for ultrafast, low-latency multiply–accumulate operations – as a promising pathway especially for high-bandwidth sensing and communication-heavy workloads.
3.2 Circuits and architectures
Analog / mixed-signal neuromorphic circuits.
Work by Indiveri, Chicca and others has produced families of neuron and synapse circuits operating in continuous time with biophysically relevant dynamics and plasticity rules.
These circuits are:
- Extremely power-efficient (sub-milliwatt for networks).
- Suitable for embedded sensory systems and robotics.
- Harder to scale and program than digital arrays, which limits industrial adoption so far.
Digital neuromorphic platforms.
Digital platforms (Loihi, SpiNNaker, BrainScaleS-2) trade some biological realism for programmability and industrial-grade tooling. Key trends:
- Increasing neuron and synapse counts, moving into the 10⁸ – 10⁹ neuron range.
- Support for both spiking networks and more conventional deep learning workloads, allowing neuromorphic hardware to act as a drop-in accelerator.
3.3 Algorithms and applications
On the algorithmic side, the field is heterogeneous:
- Spiking neural networks (SNNs) that aim to exploit temporal coding and sparsity.
- Event-based sensing (e.g. dynamic vision sensors) where the sensor itself produces sparse spikes; neuromorphic hardware processes streams with microsecond latency.
- Reservoir computing and oscillator networks using coupled oscillators (electronic, spintronic, optical) as physical recurrent networks.
- Hyperdimensional computing and associative memories implemented in computational memory arrays.
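As an illustration of the hyperdimensional style of computing mentioned above, the following sketch uses standard bipolar-hypervector conventions (dimensionality and names are illustrative): role–filler pairs are bound by elementwise multiplication, bundled into one composite memory vector, and a filler is recovered by similarity search – exactly the kind of associative operation computational memory arrays can host:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000  # hypervector dimensionality

def hv():            # random bipolar hypervector
    return rng.choice([-1, 1], size=D)

def bind(a, b):      # binding: elementwise multiply (self-inverse, XOR-like)
    return a * b

def bundle(*vs):     # bundling: majority vote via sign of the sum
    return np.sign(np.sum(vs, axis=0)).astype(int)

def sim(a, b):       # normalized similarity in [-1, 1]
    return float(a @ b) / D

# Encode two role-filler pairs into one composite memory vector.
role_color, role_shape = hv(), hv()
red, circle = hv(), hv()
memory = bundle(bind(role_color, red), bind(role_shape, circle))

# Unbinding with a role recovers a noisy copy of the stored filler;
# nearest-neighbour search identifies it.
query = bind(memory, role_color)
print(sim(query, red), sim(query, circle))  # high vs near-zero similarity
```

Because binding is its own inverse and random hypervectors are nearly orthogonal, the query is far closer to `red` than to `circle` despite the superposition.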
Applications under active exploration include:
- Low-power edge intelligence (IoT, wearables, autonomous sensors).
- Robotics and autonomous systems needing continuous perception–action loops.
- Efficient inference for speech, vision and anomaly detection in constrained environments.
A consensus across roadmaps and reviews is that there is no single “killer app” yet, but energy-efficient perception and control at the edge is the most immediate opportunity.
4. Global actors and their visions
4.1 Academic and roadmap leaders
Carver Mead
Mead’s original view – and his recent reflections Neuromorphic Engineering: In Memory of Misha Mahowald – emphasise neuromorphic engineering as a fundamental shift: using physics-level computation rather than digital abstraction to approach brain-like efficiency.
Giacomo Indiveri
Indiveri has been central in framing neuromorphic as both brain-emulation and a broader event-based computing paradigm. In Frontiers in Neuromorphic Engineering and later work, he highlights real-time spiking implementations for cognition and interaction with the physical world.
In his 2025 NeuroView piece Neuromorphic is dead. Long live neuromorphic., he argues that the field must move beyond narrow brain mimicry and integrate with mainstream computer engineering, focusing on robust, scalable, event-based architectures.
Christensen et al. – 2022 Roadmap
The Roadmap positions neuromorphic as a stacked endeavour:
- materials → devices → circuits → algorithms → applications → ethics,
- with energy-efficient computing and edge autonomy as the main strategic benefits.
It stresses that progress in one layer without alignment with the others (e.g. devices without algorithms, or algorithms without tooling) will not create impact.
Mehonic & Kenyon – “master plan” vision
Mehonic and Kenyon’s Nature article explicitly compares brain-inspired computing to quantum technologies and calls for:
- Flagship-style, long-term funding.
- Coordinated roadmaps and centres.
- Integration of materials science, device physics, architectures and applications.
Their core message: without an integrated master plan, the field risks being perpetually promising but structurally under-delivering.
4.2 Corporate and industrial actors
Intel – Mike Davies and the Neuromorphic Computing Lab
Intel’s strategy is to bridge neuromorphic and mainstream AI:
- Loihi and Hala Point demonstrate that neuromorphic hardware can run both spiking and conventional deep learning workloads with much higher energy efficiency for certain tasks.
- Davies openly frames neuromorphic as a response to the “unsustainable” compute cost of current AI and as an exploration of fundamentally different scaling laws.
Vision: pragmatic radicalism – keep compatibility with today’s AI ecosystem while exploring new learning rules and architectures that better exploit hardware dynamics.
IBM – Abu Sebastian and computational memory
IBM Research pursues “computational memory” as a way to move beyond von Neumann constraints. In this view, PCM arrays become active computing substrates for learning and inference (e.g. temporal correlation detection and in-memory vector operations).
Vision: a new kind of memory-centric processor where non-volatile devices serve as both synapses and compute elements, integrated into SoCs and data-centric systems.
Thales/CNRS – Julie Grollier and neuromorphic spintronics
Grollier’s work shapes the spintronic branch of neuromorphic computing. She positions spintronics as a platform for building neuron-like oscillators, stochastic elements and ultra-dense synapses, opening new ways of implementing learning and inference.
Vision: device-physics-driven neuromorphic computing, where properties like magnetisation dynamics and spin-torque oscillations are directly harnessed for computation.
4.3 Centres and ecosystems
CogniGron (University of Groningen)
CogniGron is a prominent example of a materials-to-systems neuromorphic centre. Its mission is to achieve up to 10,000× more energy-efficient chips by co-designing self-learning materials, devices and architectures.
Vision:
- Neuromorphic computing as “future-proof computing” for a world where current chip technology hits physical and energy limits.
- Strong emphasis on education and multidisciplinary talent as bottlenecks.
Similar centres and consortia exist across Europe, the US and Asia, often linked to national or EU-wide flagship projects, as mapped in the 2022 Roadmap.
5. Future directions and scenarios
5.1 Technical convergence
Across devices, circuits and systems, several convergence trends are visible:
- Hybrid digital–physical neuromorphic platforms
- Large digital systems (Loihi, SpiNNaker) act as orchestrators or “outer loops” around arrays of analog or in-memory devices (PCM, RRAM, spintronics).
- Oscillator- and resonance-based architectures
- Spin-torque oscillators, coupled phase-change devices and photonic resonators are used as building blocks for reservoir computing and pattern recognition based on synchronisation phenomena rather than purely on static matrix multiplies.
- Event-based, edge-first designs
- Sensors and neuromorphic processors are increasingly co-designed (e.g. dynamic vision sensors plus on-chip spiking processors), minimising data transfer and latency.
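The reservoir-computing idea behind oscillator-based architectures – a fixed physical dynamical system plus a cheap trained linear readout – can be sketched in software as a small echo-state network. This is a generic illustration with arbitrary parameters; a physical neuromorphic reservoir would substitute oscillator or device dynamics for the tanh recurrence:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 100, 500                    # reservoir size, number of timesteps

# Fixed random reservoir: only the linear readout is ever trained.
W_in = rng.normal(0.0, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

u = np.sin(0.1 * np.arange(T)) + 0.1 * rng.normal(size=T)  # noisy input
target = np.roll(u, 3)                                     # task: 3-step memory

# Run the reservoir and collect its states.
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # nonlinear recurrent dynamics
    X[t] = x

# Ridge-regression readout, skipping the initial transient.
A, y = X[50:], target[50:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ y)
pred = A @ w_out
mse = np.mean((pred - y) ** 2)
print(mse)                             # low mean-squared error
```

The design point is that the "hard" part (the recurrent dynamics) is never trained – it can therefore be any rich physical substrate, with learning confined to a simple digital readout.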
5.2 Market and application outlook
In the next 5–15 years, plausible market trajectories are:
- Short term (0–5 years)
- Neuromorphic hardware deployed as specialised accelerators in research datacentres and high-end edge devices; main value in energy savings and low-latency inference for specific workloads.
- Medium term (5–10 years)
- Integration of computational memory and neuromorphic coprocessors into heterogeneous SoCs for automotive, industrial IoT, robotics and communications equipment.
- Longer term (10+ years)
- Potential shift toward resonant and oscillator-based computing architectures that blur the line between neuromorphic and other non-von-Neumann paradigms, particularly if tools and theory mature.
In all scenarios, the main value propositions are energy efficiency, autonomy at the edge, and robustness in complex environments, rather than raw peak FLOPS.
5.3 Risks and open questions
Key uncertainties include:
- Tooling and programmer experience: programming SNNs and analog arrays remains complex; industrial adoption depends on higher-level abstractions and robust toolchains.
- Competing trajectories: GPUs and ASICs continue to improve; specialised digital accelerators may “eat” much of neuromorphic’s value unless neuromorphic offers qualitatively new capabilities (e.g. on-device learning, continuous-time control).
- Fragmentation vs. master planning: without coordinated programs and shared roadmaps, many promising device concepts may never escape the lab.
6. Neuromorphic computing and Right-Brain AI (RAI)
Right-Brain AI (RAI), as articulated in The Architecture of Right Brain AI (RAI) and follow-up essays, proposes a complementary AI paradigm to today’s “Left-Brain AI” (LAI) such as LLMs and transformers.
6.1 Core ideas of RAI
From the RAI essays, the key elements are:
- Resonant Stack: a multi-layer architecture built around oscillatory subsystems that maintain coherence across time and scales (physical, cognitive, social).
- Oscillatory computing and synchronisation: computation emerges from phase relationships, resonances and synchrony (e.g. Kuramoto-type dynamics), rather than from discrete symbol manipulation or static matrix multiplies.
- Right-Brain vs. Left-Brain AI:
- LAI = probabilistic, language- and symbol-centric, dominated by LLMs that optimise likelihood.
- RAI = pattern-, context- and coherence-centric, focusing on systemic consistency and longer-term stability.
- RAI as meta-controller: RAI steers LAI by feeding it coherent “resonant evaluation vectors” (REV) that bias outputs away from purely probabilistic responses toward systemically coherent ones.
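The Kuramoto-type dynamics referenced above can be simulated directly. The following minimal mean-field sketch (all parameter values are illustrative) shows the core RAI intuition in miniature: coherence, measured by the order parameter r, emerges from phase coupling rather than from any symbolic computation:

```python
import numpy as np

rng = np.random.default_rng(3)
N, K, dt, steps = 50, 2.0, 0.01, 2000   # oscillators, coupling, Euler step

omega = rng.normal(0.0, 0.5, size=N)        # natural frequencies
theta = rng.uniform(0, 2 * np.pi, size=N)   # random initial phases

def order_parameter(theta):
    """Kuramoto order parameter r in [0, 1]: 0 = incoherent, 1 = full sync."""
    return float(np.abs(np.mean(np.exp(1j * theta))))

r0 = order_parameter(theta)
for _ in range(steps):
    # Mean-field Kuramoto update: each phase is pulled toward the others,
    # d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
    coupling = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta += dt * (omega + K * coupling)

print(r0, order_parameter(theta))  # r rises as the oscillators phase-lock
```

Above the critical coupling, the population locks and r climbs from near zero toward one – a quantitative handle on "coherence" of the kind a Resonant Stack layer would maintain.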
Strategically, RAI addresses two problems that also run through the neuromorphic debate:
- The energetic unsustainability of pure LAI scaling.
- The systemic incoherence of AI decisions that lack a physical/structural anchor.
6.2 Conceptual common ground
There is a strong conceptual alignment between RAI and modern neuromorphic visions:
- From discrete to physical computation – both emphasise exploiting the dynamics of physical substrates (oscillators, phase transitions, conductance changes) instead of abstract digital operations.
- From static models to continuous-time systems – neuromorphic circuits and RAI’s Resonant Stack both operate in continuous time with ongoing adaptation, rather than in discrete batches.
- From pure accuracy to coherence and energy – RAI explicitly optimises for systemic coherence and resilience; neuromorphic roadmaps stress energy efficiency and robustness as primary metrics, not just accuracy.
6.3 Neuromorphic hardware as a substrate for RAI
Many of the building blocks required for a RAI-style architecture map naturally onto neuromorphic platforms:
- Oscillatory layers:
- Spin-torque oscillators, phase-change relaxation oscillators and photonic resonators can implement coupled oscillator networks needed for resonance-based computation.
- Associative and hyperdimensional memory:
- PCM-based computational memory and resistive arrays can implement high-dimensional associative memories and similarity search – key for encoding “coherence patterns” at multiple scales.
- Edge-side right-brain modules:
- Neuromorphic edge devices can serve as local RAI layers, capturing context, rhythms and anomalies in physical processes (energy grids, logistics, finance) and feeding higher-level LAI systems with structured signals (REV-like vectors).
- LAI–RAI integration:
- Digital neuromorphic platforms that already support deep learning workloads (Loihi/Hala Point) are plausible candidates for hosting the LAI–RAI hybrid stack: spiking/resonant layers for RAI, dense networks for LAI, on a shared hardware fabric.
Effectively, neuromorphic computing provides the physical implementation space in which RAI’s Resonant Stack could be realised:
- oscillator networks for resonance;
- computational memory for structured coherence;
- event-based interfaces to the physical world;
- digital neuromorphic cores for integration with LLM-style components.
6.4 Strategic complementarity
RAI can be seen as a conceptual and architectural “north star” for neuromorphic efforts:
- Where the Roadmap and master-plan papers provide the materials-to-ecosystem alignment, RAI adds a coherence-centric AI architecture that tells us what to build neuromorphic hardware for, beyond generic efficiency.
- For policy and industry, this combination is powerful: neuromorphic for how to compute, RAI for why and to what end (coherence and systemic resilience rather than isolated point-optimisation).
7. Strategic implications
For an intellectually mature but business-oriented agenda, several implications follow:
- Portfolio approach to neuromorphic investments
- Incremental: support digital neuromorphic platforms and computational memory as near-term accelerators for AI and edge computing.
- Radical: invest in oscillator- and resonance-based neuromorphic components that align with RAI’s vision, even if the use-cases are exploratory.
- Link technical roadmaps to architectural north stars
- Use Christensen et al.’s neuromorphic stack as the technical roadmap.
- Use RAI’s Resonant Stack as the AI architecture roadmap – ensuring neuromorphic developments are driven by coherent system-level objectives, not just benchmarks and demos.
- Frame neuromorphic + RAI as a response to AI’s two crises
- Energy and compute sustainability: clearly articulated by Intel, CogniGron and Mehonic & Kenyon.
- Systemic incoherence and risk: articulated in RAI as a need to move beyond local optimisation of model likelihoods toward global coherence constraints.
- Talent and governance
- Centres like CogniGron show that neuromorphic progress depends heavily on cross-disciplinary talent (materials + devices + computing + AI).
- RAI adds the need for systems thinkers who can handle multi-scale coherence (technical, economic, societal). Governance structures and funding schemes should reflect this.
8. Conclusion
Neuromorphic computing has transitioned from an elegant niche in analog VLSI to a strategically positioned candidate for the post-von-Neumann era. The convergence of new devices, digital platforms and integrated roadmaps indicates that the coming decade will likely see neuromorphic technologies embedded in both edge and data-centre systems, initially as accelerators and later as integral computing fabrics.
Right-Brain AI (RAI) extends this trajectory by providing an architectural and philosophical framework that prioritises resonance, coherence and systemic resilience over raw predictive accuracy. Neuromorphic platforms – especially those built on oscillatory and in-memory devices – are natural physical substrates for such architectures.
For stakeholders who think strategically, the key is not to choose between neuromorphic and RAI, but to recognise that neuromorphic computing is the hardware frontier, and RAI is one of the most promising conceptual frontiers for what that hardware should ultimately enable.
References (selected)
(Non-exhaustive, focused on works cited above.)
- Christensen, D. V., Dittmann, R., Linares-Barranco, B., Sebastian, A., Le Gallo, M., et al. (2022). 2022 roadmap on neuromorphic computing and engineering. Neuromorphic Computing and Engineering, 2(2), 022501. https://doi.org/10.1088/2634-4386/ac4a83
- Mehonic, A., & Kenyon, A. J. (2022). Brain-inspired computing needs a master plan. Nature, 604, 255–260. https://doi.org/10.1038/s41586-021-04362-w
- Mead, C. (1990). Neuromorphic electronic systems. Proceedings of the IEEE, 78(10), 1629–1636.
- Mead, C. (1989). Analog VLSI and Neural Systems. Addison-Wesley.
- Indiveri, G. (2011). Frontiers in neuromorphic engineering. Frontiers in Neuroscience, 5, 118.
- Chicca, E., & Indiveri, G. (2014). Neuromorphic electronic circuits for building autonomous cognitive systems. Proceedings of the IEEE, 102(9), 1367–1388.
- Indiveri, G. (2025). Neuromorphic is dead. Long live neuromorphic. Neuron (NeuroView).
- Neftci, E. O., et al. (2018). Data and power efficient intelligence with neuromorphic learning machines. Cell Reports, 23(12), 2900–2915.
- Grollier, J., Querlioz, D., & Stiles, M. D. (2020). Neuromorphic spintronics. Nature Electronics, 3(7), 360–370.
- Sebastian, A., Le Gallo, M., Khaddam-Aljameh, R., & Eleftheriou, E. (2020). Computational memory: A perspective on computing in memory. Nature Communications, 11, 111. (And related works such as “Temporal correlation detection using computational phase-change memory,” Nature Communications, 2017.)
- Intel Labs. (2024). Intel builds world’s largest neuromorphic system to enable more sustainable AI (Hala Point announcement).
- Davies, M. (2024). Interview: “We’re reaching the boundaries of basic computing.” El País (English edition).
- Roy, K., Jaiswal, A., & Panda, P. (2019). Towards spike-based machine intelligence with neuromorphic computing. Nature, 575, 607–617.
- Poon, C.-S., & Zhou, K. (2011). Neuromorphic silicon neurons and large-scale neural networks. Frontiers in Neuroscience, 5, 108.
- Indiveri, G., et al. (2021). Introducing Neuromorphic Computing and Engineering. arXiv:2106.01329.
- Chicca, E., et al. (2014). Neuromorphic engineering: Recent trends. (Review article on methods, issues and challenges.)
- Campus Groningen / University of Groningen. (2018–2025). CogniGron – Cognitive Systems and Materials. Mission statements and research overviews.
- Indiveri, G., et al. (2018). Large-Scale Neuromorphic Spiking Array Processors: A Quest to Mimic the Brain. Summarised in 2018–2020 reviews.
- Konstapel, H. (2025). The Architecture of Right Brain AI (RAI). Constable.blog, 24 November 2025.
- Konstapel, H. (2025). RAI en de Nieuwste Technologische Ontwikkelingen. Constable.blog, 25 November 2025.
