During yesterday's press event, Google announced, among other things, two new smartphones in the Nexus line — the Nexus 5X, made with LG, and the Nexus 6P, built in cooperation with Huawei. At first glance the hardware configurations don't hold any striking news, but Google has focused on two aspects worth examining: the camera and a new chip called the Android Sensor Hub.
The name is fairly self-explanatory: the chip works as a hub for all the data collected by the device's various sensors (fingerprint scanner, ambient light, gyroscope, and so on), and thanks to new Marshmallow APIs developers will be able to interface their apps directly with it rather than querying the individual sensors. It is, in a sense, the Android counterpart of Apple's motion coprocessor, now at its M9 generation in the iPhone 6s and 6s Plus.
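The standard Android way an app benefits from a hardware sensor hub is batched sensor registration, available since API 19: events are buffered in the hub and delivered in bursts, so the app does not poll each sensor itself. A minimal sketch (the class name `SensorHubDemo` is illustrative; the sensor constants and `registerListener` overload are from the standard Android API):

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class SensorHubDemo extends Activity implements SensorEventListener {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // Batched registration: with a hardware sensor hub, events can be
        // buffered in the hub for up to maxReportLatencyUs and delivered in
        // bursts, letting the application processor sleep between reports.
        sm.registerListener(this, accel,
                SensorManager.SENSOR_DELAY_NORMAL, // sampling period
                1000000);                          // max report latency: 1 s
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values[0..2] hold the accelerometer axes.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

On hardware without a sensor hub the same call still works; the latency hint is simply ignored and events arrive one at a time.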
One of the advantages of this solution is that the chip can correlate the data from the various sensors, producing more contextualized readings. The consequence is that the device will have a better idea of what the user is trying to do, whether a gesture or a "real" action. The new Nexus phones should therefore be more accurate at triggering Ambient Display, for example, though the real limit is the imagination and skill of third-party developers.
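A concrete example of this kind of contextual detection that already exists in the Android API (since API 18, so it predates these Nexus devices) is the significant-motion trigger sensor, which typically runs on low-power hardware and wakes the app exactly once when fused sensor data says the user has genuinely started moving. A sketch, assuming `sm` is a `SensorManager` obtained as in any Android component:

```java
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.hardware.TriggerEvent;
import android.hardware.TriggerEventListener;

// Significant motion is a one-shot trigger sensor: the hardware decides,
// from combined sensor data, that the user started moving, fires once,
// and then automatically unregisters itself.
void watchForMotion(SensorManager sm) {
    Sensor sigMotion = sm.getDefaultSensor(Sensor.TYPE_SIGNIFICANT_MOTION);
    if (sigMotion == null) return; // not all devices expose this sensor
    sm.requestTriggerSensor(new TriggerEventListener() {
        @Override
        public void onTrigger(TriggerEvent event) {
            // React here, e.g. start location updates or log activity.
        }
    }, sigMotion);
}
```

With a sensor hub like the one in the new Nexus phones, detections of this kind can happen without ever waking the main CPU until the trigger actually fires.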
The benefits for battery life should be substantial: the small processor inside the chip allows the main processor to remain idle even while the sensors are in use. Combined with Marshmallow's new Doze mode, standby power consumption on these devices is likely to drop significantly.
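On the software side, Marshmallow exposes Doze state to apps through `PowerManager` (API 23), so an app can check whether the device has entered idle mode and defer its own background work accordingly. A minimal sketch, assuming a valid `Context`:

```java
import android.content.Context;
import android.os.PowerManager;

// Doze (API 23+): when the device is stationary, unplugged, and the screen
// is off, the system enters idle mode and restricts network and wakelocks.
boolean shouldDeferWork(Context context) {
    PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
    // True while the device is in Doze; apps can listen for
    // PowerManager.ACTION_DEVICE_IDLE_MODE_CHANGED to react to transitions.
    return pm.isDeviceIdleMode();
}
```

The sensor hub complements this: sensor monitoring keeps running on the low-power chip even while Doze holds the main processor in its deepest idle state.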
For the time being, of course, the Android Sensor Hub is only available on the two Nexus phones presented yesterday, even though a similar solution (with slightly different features) had already appeared in Motorola's first-generation Moto X. It remains to be seen whether other manufacturers will adopt it.