Samsung Intros Faster, More Power-efficient NPU For AI

No more relying on the cloud

AI is the buzzword for mobile devices nowadays, with companies like Huawei and Qualcomm releasing their own processors built around the tech. Samsung isn’t about to be left behind in the AI race, and has announced new neural processing unit (NPU) technology that’s faster, more energy-efficient, and takes up less space in a smartphone’s chip.

Samsung accomplished this via Quantization Interval Learning, a technique that compresses a neural network’s data down to 4 bits while retaining the accuracy of a 32-bit network.
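To get a feel for what “4 bits instead of 32” means, here’s a minimal sketch of plain uniform quantization, the basic idea that Quantization Interval Learning builds on. Samsung’s actual QIL algorithm learns the quantization interval during training and isn’t detailed in this article, so the function names and the fixed `interval` below are illustrative assumptions only.

```python
# Illustrative sketch only: Samsung's Quantization Interval Learning (QIL)
# learns the interval during training; this shows the simpler fixed-interval
# uniform quantization it builds on. All names here are hypothetical.

def quantize(values, num_bits=4, interval=1.0):
    """Map 32-bit floats in [-interval, interval] to signed integer codes."""
    levels = 2 ** (num_bits - 1) - 1  # 7 positive levels for 4 bits
    codes = []
    for v in values:
        # clamp to the interval, then round to the nearest integer level
        clamped = max(-interval, min(interval, v))
        codes.append(round(clamped / interval * levels))
    return codes

def dequantize(codes, num_bits=4, interval=1.0):
    """Recover approximate float values from the integer codes."""
    levels = 2 ** (num_bits - 1) - 1
    return [c / levels * interval for c in codes]

weights = [0.31, -0.87, 0.05, 1.4]   # example 32-bit weights
codes = quantize(weights)            # small integers, storable in 4 bits
approx = dequantize(codes)           # approximate reconstruction
```

Each weight now needs only a 4-bit integer code instead of a 32-bit float, which is where the savings in computation and silicon come from; the trick QIL solves is choosing the interval so that this rounding costs almost no accuracy.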

Fewer bits mean fewer computations, and fewer computations mean less hardware – Samsung claims the new tech achieves the same results 8x faster while reducing the number of transistors needed by 40x to 120x.

Because less hardware is needed to run AI computations, the new NPU can do its work on the device itself without sending data to the cloud – important for both privacy and low latency.

Samsung will be bringing the new technology to its upcoming mobile chipsets, possibly the next iteration of its flagship Exynos line.

John Nieves

John is a veteran technology and gadget journalist with more than 10 years of experience both in print and online. When not writing about technology, he frequently gets lost in the boonies playing soldier.
