Over the last few years, technology has advanced to better understand everyday situations, evolving to perceive its environment in an increasingly human way by recognizing both objects and faces. This kind of technology is known as computer vision, and it is already part of some commercial applications (with varying degrees of success), such as video games, through sensors like the well-known Kinect.
This kind of technology demands significant development in hardware, from the sensors all the way to the SoC, as well as in software. Taking Kinect again as an example, it relies on dedicated hardware capable of processing a large amount of data in a relatively short time. Closer to the iPhone, there is Tango, the system Google designed for mobile phones. The only problem is that, like any technology in its early days, it needs a good push to become a standard. And Apple could provide that push shortly.
The next iPhone could be able to recognize its environment in three dimensions
As I said at the beginning of the article, computer vision has a wide variety of uses, because it gives a device the ability to respond accurately to its surroundings. Apple could be about to integrate features based on this technology through the inclusion of 3D sensors manufactured by the company Lumentum, although it is not yet known exactly what those features would be. It is speculated that they could be related to the facial recognition system the iPhone 8 is expected to bring.
But we should not limit ourselves to security applications. This sensor could enable, for example, even more precise camera operation, or perhaps the use of certain gestures to control the device. For now, we can only imagine what it will do while we wait for the surprise Apple always has in store.
Source | 9to5Mac