The field of artificial intelligence (AI) is set to have a profound effect on our lives, with a growing list of applications spanning business, automation, finance, health, transportation and many more. The ability to recognise faces, objects, speech and underlying patterns through machine learning is currently achieved using a conventional von Neumann computing architecture, in which the logic and memory elements are functionally and physically separated.
The vast data sets used for machine learning, combined with the physical separation of computation and memory, result in AI processors that are highly demanding in terms of energy consumption. Dedicated AI tasks can consume more than 10 kW, compared to the human brain, which runs at around 20 W.
For environmental and practical reasons, there is a pressing need to realise AI computation using dedicated hardware and architectures inspired by the brain. In the brain, computation and memory are tightly interwoven, and this points the way to how significant energy savings can be achieved in dedicated processors. Brain-inspired neuromorphic computing aims to address the growing computational complexity and power consumption of modern von Neumann architectures.
In the case of artificial neural networks (ANNs), one route to energy savings is the monolithic integration of non-volatile memory directly above the microprocessor, reducing the dominant energy cost, which arises from the intensive traffic between processor and memory. If the non-volatile memory can also achieve multi-level storage, this allows further reductions in power consumption and processor size.
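To illustrate the idea of multi-level storage, the sketch below quantises continuous ANN weights onto a small set of discrete levels, as a multi-level non-volatile memory cell might store them. The function name, parameter values and the use of evenly spaced levels are illustrative assumptions, not a description of any specific Tyndall device.

```python
import numpy as np

def quantise_to_levels(weights, n_levels=4):
    """Snap continuous weights onto n_levels evenly spaced storable values.

    A 4-level cell corresponds to 2 bits per device; real memory cells
    may have non-uniform, device-dependent level spacings (assumption:
    uniform spacing is used here purely for illustration).
    """
    w_min, w_max = weights.min(), weights.max()
    levels = np.linspace(w_min, w_max, n_levels)
    # index of the nearest storable level for each weight
    idx = np.abs(weights[..., None] - levels).argmin(axis=-1)
    return levels[idx]

w = np.array([0.07, -0.42, 0.88, -0.95, 0.31])
print(quantise_to_levels(w, n_levels=4))
```

Storing 2 bits per cell halves the number of devices needed compared with binary cells, which is the source of the area and power savings mentioned above.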
An alternative AI approach is the use of spiking neural networks (SNNs), which integrate the essential role of time into the learning process. In this case, learning is encoded in the spike-timing-dependent response of the elements that represent synapses and neurons in the brain.
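A common model of this timing dependence is the pair-based spike-timing-dependent plasticity (STDP) rule, sketched below. The amplitude and time-constant values are illustrative assumptions chosen for readability; they do not correspond to a particular synaptic device.

```python
import numpy as np

def stdp_update(dt, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Pair-based STDP weight change for a spike-time difference
    dt = t_post - t_pre (in ms).

    Pre-before-post pairings (dt > 0) strengthen the synapse;
    post-before-pre pairings (dt < 0) weaken it, with an
    exponential fall-off over the time constant tau.
    (Parameter values are illustrative assumptions.)
    """
    if dt > 0:
        return a_plus * np.exp(-dt / tau)
    return -a_minus * np.exp(dt / tau)

# causal pairing potentiates, anti-causal pairing depresses
print(stdp_update(5.0))
print(stdp_update(-5.0))
```

A synaptic transistor implementing this behaviour in hardware would modulate its channel conductance, playing the role of the weight, according to the relative timing of pre- and post-synaptic voltage spikes.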
The Tyndall CMOS++ cluster is developing materials and devices that can function as multi-level non-volatile memory for artificial neural networks and as synaptic transistors for use in spiking neural networks.