Microchip delivers analogue embedded SuperFlash technology

In a move designed to address the power demands of AI inference at the edge, Microchip Technology, via its Silicon Storage Technology (SST) subsidiary, has significantly reduced power consumption with its analogue memory technology, the memBrain neuromorphic memory solution.

Based on its SuperFlash technology and optimised to perform vector-matrix multiplication (VMM) for neural networks, Microchip’s analogue flash memory solution improves the system-level implementation of VMM through an analogue in-memory compute approach, enhancing AI inference at the edge.
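For context, the VMM being accelerated is the same operation that dominates inference in fully connected and convolutional layers: each output is a weighted sum of the inputs. The sketch below is a minimal NumPy illustration with arbitrary layer sizes, not Microchip code; in an analogue in-memory array the same multiply-accumulate is performed where the weights are stored, rather than shuttling the matrix to a digital processor.

import numpy as np

# A fully connected layer reduces to a vector-matrix multiply followed by
# an activation. Sizes here are arbitrary, chosen for illustration only.
inputs = np.random.rand(128)          # activation vector from the previous layer
weights = np.random.rand(64, 128)     # synaptic weight matrix: 64 outputs x 128 inputs

outputs = weights @ inputs            # the VMM step an in-memory compute array performs in place
activations = np.maximum(outputs, 0)  # ReLU, applied outside the analogue array

print(activations.shape)              # (64,)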

As current neural network models may require 50 million or more synapses (weights), it becomes increasingly difficult to provide enough bandwidth to off-chip DRAM, creating a bottleneck for neural network computing and driving up overall compute power. In contrast, the memBrain solution stores synaptic weights in the on-chip floating gate, offering significant improvements in system latency.
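A rough back-of-the-envelope calculation shows why off-chip weight storage becomes a bottleneck. The figures below are assumptions chosen for illustration (8-bit weights, a 30-frames-per-second workload), not Microchip’s numbers: even a modest model generates gigabytes per second of DRAM traffic for weights alone, traffic that keeping the weights in on-chip flash avoids.

# Illustrative arithmetic only; all figures are assumed, not vendor data.
num_weights = 50_000_000                 # 50M synaptic weights
bits_per_weight = 8                      # assume 8-bit quantised weights

weight_bytes = num_weights * bits_per_weight // 8
print(f"Weight storage: {weight_bytes / 1e6:.0f} MB")           # ~50 MB

inferences_per_second = 30               # e.g. a 30 fps vision workload (assumed)
dram_traffic = weight_bytes * inferences_per_second
print(f"Weight traffic to DRAM: {dram_traffic / 1e9:.1f} GB/s") # ~1.5 GB/s for weights alone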

Compared with traditional digital DSP and SRAM/DRAM-based approaches, the memBrain solution delivers 10 to 20 times lower power and a significantly reduced overall bill of materials (BOM).

“As technology providers for the automotive, industrial and consumer markets continue to implement VMM for neural networks, our architecture helps these forward-facing solutions realise power, cost and latency benefits,” said Mark Reiten, vice president of the license division at SST. “Microchip will continue to deliver highly reliable and versatile SuperFlash memory solutions for AI applications.”

The memBrain solution is being adopted by companies looking to advance machine learning capabilities in edge devices. Because of its ability to significantly reduce power, this analogue in-memory compute solution is intended for a broad range of AI applications.

SST offers design services for both its memBrain solution and SuperFlash technology, along with a software toolkit for neural network model analysis.