Apple plans to equip all of its iPhones with a time-of-flight (ToF) sensor on the back by 2021, in addition to the TrueDepth camera on the front. That is what the Taiwanese trade publication Digitimes claims. Apple is said to have signed a three-year contract with camera maker Sony.
In addition to Apple, manufacturers of Android phones also want to use more ToF sensors, Digitimes reports. Apple has put the ToF sensor on its iPhone 12 Pro models and on last year's iPad Pro. Samsung, Huawei and OPPO already used ToF sensors on some phones, but such a sensor is still found on only a small minority of smartphones. Apple refers to its ToF sensor as a lidar scanner. A ToF sensor emits infrared signals and an infrared camera captures the reflected infrared signals, after which software calculates the distance to objects based on the time elapsed between sending and receiving the signal.
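The distance calculation described above is straightforward: the signal travels to the object and back, so the distance is the speed of light times the elapsed time, divided by two. A minimal sketch (the function name and example value are illustrative, not from any Apple API):

```python
# Sketch of the time-of-flight principle: distance from the round-trip
# time of a reflected infrared pulse.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(elapsed_seconds: float) -> float:
    """Distance to an object given the round-trip travel time of the signal.

    The pulse travels to the object and back, hence the division by two.
    """
    return C * elapsed_seconds / 2

# A reflection arriving ~6.67 nanoseconds after emission corresponds to
# an object roughly one meter away.
print(round(tof_distance(6.67e-9), 2))  # → 1.0
```

The nanosecond scale of these round trips is why ToF sensors need dedicated hardware rather than an ordinary camera.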
It seems plausible that Apple will also equip an AR product with a ToF sensor to scan the environment. Such AR glasses or a headset are rumored for release in 2021. Apple has been working on augmented reality for years. For example, ARKit is available on all iPhones and iPads that can run iOS 13 or higher. Face ID's TrueDepth camera, which has been on many Apple phones since the iPhone X in 2017, is not a ToF sensor; it senses depth from the shape of the projected infrared dots on a person's face, but does not measure time-of-flight.
There have been rumors for years about an AR headset or AR glasses from Apple. The first version would reportedly rely on a user's iPhone for location data and internet connectivity. Apple has said for years that it is very interested in augmented reality, but so far it has only released software for it.
iPhone 12 Pro with ToF sensor