No more relying on the cloud
AI is the new buzzword for mobile devices nowadays, with companies like Huawei and Qualcomm shipping processors with dedicated AI hardware. Samsung isn't about to be left behind in the AI race, and has announced new neural processing unit (NPU) technology that's faster, more energy-efficient, and takes up less space on a smartphone's chip.
Samsung accomplished this via Quantization Interval Learning, a training technique that produces 4-bit neural networks while retaining the accuracy of a 32-bit network.
Fewer bits mean simpler computations, and simpler computations mean less hardware – Samsung claims the new tech achieves the same results 8x faster while cutting the number of transistors required to between 1/40 and 1/120 of what's normally needed.
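To see why fewer bits shrink the hardware, here's a minimal sketch of uniform 4-bit weight quantization in Python. This is an illustration of low-bit quantization in general, not Samsung's actual method – Quantization Interval Learning additionally learns the clipping interval during training, whereas here it's fixed by hand.

```python
import numpy as np

def quantize_4bit(weights: np.ndarray, clip: float) -> tuple[np.ndarray, float]:
    """Uniformly quantize weights to signed 4-bit integer codes in [-clip, clip].

    Toy illustration only: QIL would *learn* the interval (clip) with the
    task loss rather than fixing it as a hyperparameter.
    """
    levels = 2 ** 4 // 2 - 1                 # 7 levels per sign for 4-bit signed
    scale = clip / levels                    # step size between adjacent levels
    q = np.clip(np.round(weights / scale), -levels, levels)
    return q.astype(np.int8), scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map integer codes back to approximate float weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.5, size=1000).astype(np.float32)   # fake layer weights
q, scale = quantize_4bit(w, clip=1.0)
w_hat = dequantize(q, scale)
# Each weight now needs only 4 bits, and the reconstruction error for
# in-range weights is bounded by half a quantization step (scale / 2).
print("max abs error:", np.abs(np.clip(w, -1, 1) - w_hat).max())
```

A 4-bit multiplier needs far fewer gates than a 32-bit floating-point unit, which is where the transistor savings come from; the hard part, and the point of Samsung's technique, is doing this without the accuracy loss naive quantization causes.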
The reduced hardware requirements mean the new NPU can run AI computations entirely on the device, without sending data to the cloud – important for both privacy and low latency.
Samsung will be bringing the new technology to its upcoming mobile chipsets, possibly the next iteration of its flagship Exynos line.