
Speaker

Dr. Thomas Kämpfe

Group Manager CMOS Integrated RF & AI, Fraunhofer IPMS

In-Memory Computing - Pathway for Low Power AI Edge Computing

Despite the globally rising demand for data transmission, data-centric local intelligence on the edge node is gaining traction: giving edge devices such as wearables, smartphones, sensors, and cars the ability to analyze data locally and to decide on a course of action autonomously.

The main challenge here is meeting the power and latency requirements of such applications, which cannot be met with state-of-the-art CPUs or microcontrollers. One solution Fraunhofer IPMS is investigating is in-memory computing, an approach in which certain computational tasks are performed in place within the memory itself, reducing the power consumed by memory accesses and the associated latency.

In-memory computing is projected to enable more compute-energy-efficient small-system AI engines, while the high parallelism of operations in the memory macro speeds up the computation of edge AI algorithms such as convolutional neural networks.
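
To illustrate the parallelism described above, here is a minimal conceptual sketch, not Fraunhofer IPMS code: a memory array ("crossbar") holds the weights in place, and applying an input vector to its rows yields all column sums, i.e. a full matrix-vector product, in one parallel step instead of many individual memory fetches. The class and variable names below are illustrative assumptions.

```python
import numpy as np

class CrossbarArray:
    """Conceptual model of an in-memory multiply-accumulate (MAC) macro."""

    def __init__(self, weights: np.ndarray):
        # Weights are written once into the memory array (e.g. as cell
        # conductances) and stay resident; they are never fetched per operation.
        self.weights = np.asarray(weights, dtype=float)

    def mac(self, inputs: np.ndarray) -> np.ndarray:
        # Driving the input vector onto the rows activates all columns at once:
        # each column accumulates sum_i inputs[i] * weights[i, j], so one
        # "memory access" delivers a complete matrix-vector product.
        return inputs @ self.weights

# Example: a small convolutional filter bank applied to one flattened image patch.
rng = np.random.default_rng(0)
patch = rng.random(9)            # 3x3 input patch, flattened
filters = rng.random((9, 4))     # four 3x3 filters, flattened into columns
macro = CrossbarArray(filters)
print(macro.mac(patch))          # four output activations, computed in parallel
```

In a convolutional neural network, each layer's weights can remain stationary in such a macro while successive input patches are streamed through, which is the source of the energy and latency savings the talk describes.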
