Design & Reuse

NeuReality Boosts AI Accelerator Utilization With NAPU


By Sally Ward-Foxton, EETimes (April 4, 2024)

Startup NeuReality wants to replace the host CPU in data center AI inference systems with dedicated silicon that can cut total cost of ownership and power consumption. The Israeli startup has developed a class of chip it calls the network addressable processing unit (NAPU), which implements typical host-CPU functions, such as the hypervisor, in hardware. NeuReality’s aim is to increase AI accelerator utilization by removing the bottlenecks caused by today’s host CPUs.

NeuReality CEO Moshe Tanach told EE Times that the company’s NAPU enables 100% utilization of AI accelerators.

