
Memory Technologies Confront Edge AI’s Diverse Challenges

February 2021

... multidimensional matrix multiplication, lends itself to analog compute techniques in which an array of memory cells performs the calculations. Using this technique, Syntiant’s devices are designed for voice control of consumer electronics, and Gyrfalcon’s devices have been designed into a smartphone, where they handle inference for camera effects.
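The appeal of analog compute-in-memory is easiest to see in the arithmetic it replaces: each cell stores a weight as a conductance, an input is applied as a row voltage, and the current summed on each column wire is one element of a matrix-vector product. The sketch below is a minimal, idealized model of that behavior; the array size, conductance range, and ADC step are illustrative assumptions, not details of Syntiant’s or Gyrfalcon’s designs.

```python
import numpy as np

# Idealized analog compute-in-memory model: weights stored as cell
# conductances G (siemens), inputs applied as row voltages V (volts).
# Each column wire sums its cell currents (Kirchhoff's current law),
# so the vector of column currents I = G.T @ V is one matrix-vector product.

rng = np.random.default_rng(0)

n_rows, n_cols = 64, 16                      # illustrative array size
weights = rng.normal(size=(n_rows, n_cols))  # trained weights (arbitrary units)

# Map signed weights onto a realizable conductance range, e.g. 1-100 microsiemens.
g_min, g_max = 1e-6, 100e-6
w_min, w_max = weights.min(), weights.max()
conductance = g_min + (weights - w_min) / (w_max - w_min) * (g_max - g_min)

# Input activations encoded as row voltages (here simply scaled to 0-0.5 V).
activations = rng.random(n_rows)
voltages = 0.5 * activations

# Analog multiply-accumulate: each column current sums V_i * G_ij over its rows.
column_currents = conductance.T @ voltages   # shape (n_cols,)

# A per-column ADC digitizes the result; the step size is an illustrative assumption.
adc_step = 1e-6                              # 1 microamp per code
digital_out = np.round(column_currents / adc_step).astype(int)

print(digital_out)
```

Real arrays handle signed weights with differential cell pairs and calibrate out the offset introduced by this kind of mapping; the point of the sketch is only that the multiply-accumulate happens inside the memory array rather than in a digital MAC unit.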

The Flex Logix chip will be used in edge AI inference applications that require real-time operation, including analyzing streaming video with low latency. This includes ADAS, analysis of security footage, medical imaging, and quality assurance/inspection applications.

Experiments have applied MRAM’s stochastic switching behavior to binarized neural networks (BNNs), in which the precision of all weights and activations is reduced to 1 bit. Binarization dramatically reduces compute and power requirements for far-edge applications. Tradeoffs with accuracy are likely, depending on how the network is retrained, but in general neural networks can be made to function reliably despite the reduced precision.

... Andy Walker, product vice president at Spin Memory. “We have found that such BNNs can still function with high levels of accuracy.”
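To make the 1-bit idea concrete, the sketch below binarizes weights and activations with a sign function and evaluates the dot product with XNOR and popcount, the reduction that lets a BNN replace multipliers with single-bit logic. The vector length and random values are illustrative assumptions; this is not Spin Memory’s implementation.

```python
import numpy as np

def binarize(x):
    """Reduce values to 1 bit: +1 for non-negative, -1 for negative."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_dot(w_bits, a_bits):
    """Dot product of {-1, +1} vectors via XNOR/popcount arithmetic.

    Encode -1 as bit 0 and +1 as bit 1; XNOR then counts agreements, and
    dot = 2 * popcount(XNOR) - n, with no multiplications needed.
    """
    w = (w_bits > 0).astype(np.uint8)
    a = (a_bits > 0).astype(np.uint8)
    agreements = np.count_nonzero(~(w ^ a) & 1)   # XNOR, then popcount
    return 2 * agreements - len(w_bits)

rng = np.random.default_rng(1)
weights = rng.normal(size=256)       # full-precision weights (illustrative)
activations = rng.normal(size=256)   # full-precision activations

w1, a1 = binarize(weights), binarize(activations)
print("1-bit dot product:   ", bnn_dot(w1, a1))
print("full-precision dot:  ", float(weights @ activations))
```

The two results differ because binarization discards magnitude information, which is why retraining the network at 1-bit precision matters for recovering accuracy.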

One application is as a so-called unified memory, “where this emerging memory can act as both an embedded flash and SRAM replacement, saving area on the die and avoiding the static power dissipation inherent in SRAM,” said Walker.

Recent research by Politecnico di Milano using Weebit Nano’s silicon oxide (SiOx) ReRAM technology showed promise for neuromorphic computing.
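The sketch below illustrates the kind of behavior neuromorphic ReRAM research targets: resistive cells act as analog synapses feeding a leaky integrate-and-fire neuron, and their conductances are nudged up or down when the neuron fires. The cell model, learning rule, and constants are illustrative assumptions, not results from the Politecnico di Milano work.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative ReRAM-as-synapse model: each cell is a conductance that
# programming pulses can move between g_min and g_max (potentiate/depress).
g_min, g_max = 1e-6, 50e-6
n_inputs = 8
g = rng.uniform(g_min, g_max, size=n_inputs)   # synaptic conductances

# Leaky integrate-and-fire neuron parameters (illustrative).
v = 0.0
v_thresh = 1.0
leak = 0.9
scale = 2e4          # converts summed synaptic current to membrane-potential units

def step(spikes_in, g, v):
    """One timestep: integrate weighted input spikes, leak, fire, and learn."""
    v = leak * v + scale * np.dot(spikes_in, g)
    fired = v >= v_thresh
    if fired:
        v = 0.0
        # Hebbian-style update: potentiate cells whose input spiked with the
        # output, depress the others (a crude stand-in for STDP).
        dg = np.where(spikes_in > 0, 1e-6, -0.2e-6)
        g = np.clip(g + dg, g_min, g_max)
    return g, v, fired

for t in range(100):
    spikes_in = (rng.random(n_inputs) < 0.3).astype(float)  # random input spikes
    g, v, fired = step(spikes_in, g, v)
    if fired:
        print(f"t={t:3d}  output spike, conductances (uS): {np.round(g * 1e6, 1)}")
```

The attraction of ReRAM here is that the synaptic weight lives in the same nonvolatile device that performs the analog summation, so the learning loop does not need to shuttle weights to and from a separate memory.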

The exact nature of memory systems for edge AI depends on the application: GDDR, HBM, and Optane are proving popular for data centers, while LPDDR competes with on-chip SRAM for endpoint applications.

  • When comparing GDDR6 with LPDDR,

...the low-power DDR version that has been Nvidia’s approach for most non-data-center edge solutions, from the Jetson AGX Xavier to the Jetson Nano, Ferro acknowledged that LPDDR is suited to low-cost AI inference at the edge or endpoint.