6 Ways Android Phones Could Use The iPhone 12 Pro's LiDAR Scanner Technology

The iPhone 12 Pro has a lot of cool features. We've all heard about 5G connectivity, even if it isn't available everywhere yet, and Apple fans will be familiar with MagSafe as well. But what is a LiDAR scanner? Short for "Light Detection and Ranging," this sensor bounces infrared light off surfaces to create 3D maps. Like so many futuristic technologies, it was developed by the United States military in the 1960s. Apple is now promoting it as the next big thing for mobile devices by including it in the iPhone 12 Pro and iPhone 12 Pro Max, having previously offered it only on some iPad Pro models.

That glosses over the fact that Samsung, LG, and Huawei are already using similar depth sensors. Admittedly, their time-of-flight (ToF) sensors illuminate the scene with a single light pulse, while Apple's LiDAR scanner uses multiple pulses to create a more detailed image. But Google has been experimenting with this exact technology for over a decade. LiDAR has been helping the tech giant's self-driving cars navigate since 2009, and its 2014 Project Tango was a first attempt at bringing augmented reality (AR) to phones. Tango led to the release of two Android phones with depth-sensing camera systems: the Lenovo Phab 2 Pro and the Asus ZenFone AR. Google eventually ditched Tango in favor of ARCore's computer vision, which could do much of the same job without specialized hardware.

But if Apple generates as much excitement around LiDAR as it hopes, Android makers won't want to be left behind, and they could push Google to build better Android support for the sensor. Maybe they should: Google could do even more with it. Here are some ways Android makers could use this technology.
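To make the ranging principle concrete, here's a minimal Kotlin sketch (the timing value is hypothetical, purely for illustration) of the arithmetic behind any time-of-flight or LiDAR sensor: a light pulse goes out, bounces back, and the distance is half the round trip multiplied by the speed of light.

    // Time-of-flight ranging in a nutshell: emit a light pulse, time the
    // reflection, and convert the round trip into distance.
    const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    // Halved because the pulse travels to the surface and back.
    fun distanceMeters(roundTripSeconds: Double): Double =
        SPEED_OF_LIGHT_M_PER_S * roundTripSeconds / 2.0

    fun main() {
        // A reflection arriving ~33 nanoseconds later puts the surface about
        // five meters away, roughly the indoor range Apple quotes for its scanner.
        println("Distance: %.2f m".format(distanceMeters(33e-9)))
    }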

1. Out-of-this-world AR

(Image credit: Google) Apple says the iPhone 12 Pro will support powerful new augmented reality experiences thanks to the LiDAR scanner. In practice, that means apps can build a more detailed map of a room in less time, so they can better integrate virtual objects, for example by hiding them behind real ones. Tellingly, Google added the same occlusion functionality to its ARCore platform in June with just a software update. Google also has years more experience in this area than Apple, which has only recently started investing heavily in AR. So if Android were to embrace LiDAR, it would combine the best of both worlds: feeding more precise depth data into already mature algorithms. That could lead to more immersive AR and VR games, as well as all sorts of new apps.
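For a sense of what that update exposes to developers, here's a hedged Kotlin sketch of ARCore's Depth API: enabling depth mode and grabbing the per-frame depth image an occlusion renderer would consume. Treat it as an outline of the setup, not a complete renderer.

    import com.google.ar.core.Config
    import com.google.ar.core.Frame
    import com.google.ar.core.Session

    // Enable ARCore's Depth API where the device supports it. On phones with
    // depth hardware ARCore can use it; otherwise it estimates depth from motion.
    fun enableDepth(session: Session) {
        val config = session.config
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            config.depthMode = Config.DepthMode.AUTOMATIC
        }
        session.configure(config)
    }

    // Each frame can then supply a depth image; comparing it against a virtual
    // object's depth is what lets apps hide that object behind real furniture.
    fun withDepth(frame: Frame) {
        val depthImage = frame.acquireDepthImage() // 16-bit depth, millimeters per pixel
        try {
            // Hand the buffer to the shader that performs the occlusion test.
        } finally {
            depthImage.close()
        }
    }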

2. Take more striking selfies

(Image credit: Apple) A sample photo provided by Apple for the iPhone 12 Pro. Apple's other big selling point for the LiDAR scanner is that it helps the iPhone autofocus up to six times faster in low light. Depth sensing also improves Night mode portraits after dark. Android could do the same. As with AR, Google leans on machine learning to improve its images: the Pixel's Night Sight mode works amazingly well in near-total darkness. But low ambient light can still add noise to these algorithmically enhanced shots, and the simulated bokeh can do weird things if there's more than one face in the frame. So it's no surprise that several other Android manufacturers have added ToF sensors to improve their shots. Full LiDAR could go even further.
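Android already has a standard hook for this kind of hardware: the Camera2 API advertises depth output. As a sketch (standard Camera2 calls, nothing LiDAR-specific), an app could probe for a ToF-style depth camera like this:

    import android.content.Context
    import android.graphics.ImageFormat
    import android.hardware.camera2.CameraCharacteristics
    import android.hardware.camera2.CameraManager

    // Scan the device's cameras for one that can output DEPTH16 frames,
    // the format phone ToF sensors expose through Camera2.
    fun findDepthCamera(context: Context): String? {
        val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
        return manager.cameraIdList.firstOrNull { id ->
            val chars = manager.getCameraCharacteristics(id)
            val hasDepthCap = chars
                .get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
                ?.contains(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT) == true
            val outputsDepth16 = chars
                .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
                ?.outputFormats?.contains(ImageFormat.DEPTH16) == true
            hasDepthCap && outputsDepth16
        }
    }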

3. Map the world inside and out

The original use of LiDAR was mapping; the Apollo 15 astronauts used it to study the surface of the Moon in 1971. Just as Android users filled in Street View's gaps by uploading spherical photos, their depth scans could yield improvements to Google Earth's 3D terrain. Better yet, they could make Google Earth VR (yes, it really exists) more immersive. Project Tango, meanwhile, focused on mapping interior spaces. That opens up the possibility of Google Maps letting you see the majesty of the Sistine Chapel in 3D as if you were really there, get step-by-step directions to find your airport gate faster, or check whether your stadium seat is behind a pillar before buying tickets.

4. Turbocharged touch-free controls

(Image credit: Future) The Pixel 4's Motion Sense was a groundbreaking idea. Using built-in radar, you could control your phone with a wave of your hand, from muting calls to skipping songs. It could also detect your presence to unlock the device faster than Apple's Face ID. Sadly, the technology never caught on and was dropped from the Pixel 5. But just as LiDAR helps autonomous cars detect obstacles with high precision, it could also help phones detect hand gestures. And since LiDAR sensors are cheaper than miniature radars, Motion Sense-style controls could reach Android devices beyond Google's flagships. With a larger user base, app developers might be more inclined to adopt them and invent new touchless controls.
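As a toy illustration of the idea, the Kotlin sketch below spots a "wave" by tracking the nearest point across successive depth frames. DepthFrame and its fields are made up for this example; a real LiDAR driver would expose something far richer.

    // Hypothetical per-frame summary from a depth sensor: the horizontal
    // position (normalized 0..1) and distance of the nearest detected point.
    data class DepthFrame(val nearestPointX: Float, val nearestDepthMeters: Float)

    class WaveDetector(private val maxHandDistance: Float = 0.5f) {
        private val recentX = ArrayDeque<Float>()

        // A "wave" here is simply the nearest object sitting within hand range
        // while sweeping a wide horizontal arc over the last 15 frames.
        fun onFrame(frame: DepthFrame): Boolean {
            if (frame.nearestDepthMeters > maxHandDistance) {
                recentX.clear() // nothing within hand range; reset the history
                return false
            }
            recentX.addLast(frame.nearestPointX)
            if (recentX.size > 15) recentX.removeFirst()
            val swept = (recentX.maxOrNull() ?: 0f) - (recentX.minOrNull() ?: 0f)
            return recentX.size == 15 && swept > 0.3f
        }
    }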

5. Smarter home security

Amazon and Google have been locked in a war to dominate the smart home for years, and it goes beyond smart speakers like the Echo and the Home: Amazon also owns Ring, and Google has Nest. Just as with photography, LiDAR could drastically improve the night vision of Nest security cameras. Nest could even use the sensor technology to launch a standalone alternative to Ring's new home security drone; after all, Tango was helping quadcopters fly autonomously back in 2014. That, in turn, could open the door to a host of mainstream Android robots.

6. Bring back Google Glass

(Image credit: Google) Google promised the world a head-mounted display back in 2012. But despite partnerships with brands like Ray-Ban and Oakley, technical hurdles and privacy concerns have kept an easy-to-use consumer Glass model from materializing. With new competition from Facebook's Project Aria and rumored smart glasses from Apple, Glass could very well make a comeback. After all, Google Lens' visual search and Assistant voice commands have essentially matured every feature Glass was supposed to offer. LiDAR could give it a competitive edge by improving the performance of its AR display.