CPU tech for image sensors!?
Samsung is not resting on its laurels when it comes to improving image sensors. Following the launch of the 108-megapixel ISOCELL HMX with the Xiaomi Mi MIX Alpha and Mi Note 10, the Korean tech giant is exploring pushing the megapixel count further, to 144 megapixels.
To make this possible, Samsung is exploring the use of a 14nm FinFET process—a node used by the majority of entry-level and mid-range processors in 2017 and 2018—to manufacture image sensors. For reference, the 12-megapixel sensor used in its flagship Galaxy S and Galaxy Note phones uses a 28nm planar process.
According to Samsung's research, the shift to a 14nm process should allow a 144-megapixel sensor to be up to 42% more power-efficient at a 10FPS image capture rate. At 12 megapixels, this translates to up to 37% better power efficiency when shooting video at anywhere between 30FPS and 120FPS.
Samsung’s method is unconventional in that it focuses on using smaller transistors rather than relying solely on pixel binning—the technique most high-megapixel sensors have employed over the past year.