The iPhone 12 Pro has a lot of cool features. We've all heard of 5G connectivity (even if it isn't available everywhere yet), and Apple fans will already be familiar with MagSafe. But what is a LiDAR scanner?
Short for "Range and Light Detection," this sensor bounces infrared lights off surfaces to create 3D maps. Like so many futuristic technologies, it was developed by the United States military in the 1960s. Now Apple is promoting it as the next big thing for mobile devices by including it in the iPhone 12 Pro and also in the iPhone 12 Pro Max. only on select iPad Pro models.
This overlooks the fact that Samsung, LG and Huawei are already using similar depth sensors. Admittedly, their time-of-flight (ToF) sensors illuminate the scene with a single pulse of light, whereas Apple's LiDAR scanner uses multiple pulses to create a more detailed image. But Google has been experimenting with this exact technology for over a decade.
LiDAR has been helping the tech giant's autonomous cars navigate since 2009, while its 2014 Project Tango was a first attempt to bring augmented reality (AR) to phones. Tango led to the launch of two Android phones with LiDAR systems: the Lenovo Phab 2 Pro and the Asus ZenFone AR. Google eventually ditched Tango in favor of ARCore's computer-vision approach, which could do the same job without specialized hardware.
But if Apple builds as much buzz around LiDAR as it hopes, Android makers won't want to be left behind, and they could pressure Google to develop better Android support for the sensor. Maybe they should: Google could be doing far more with it. Here are some ways Android makers could use this technology.
1. Out-of-this-world AR

(Image credit: Google)
Apple says the iPhone 12 Pro will support powerful new augmented reality experiences thanks to the LiDAR scanner. Apps can create a more detailed map of a room in less time, so they can integrate virtual objects more convincingly, for example by hiding them behind real ones.
Tellingly, Google added the same functionality to its ARCore platform in June with just a software update. It has years more experience in this area than Apple, which has only recently started investing heavily in AR.
So if Android were to embrace LiDAR, it would combine the best of both worlds, feeding more accurate data into already advanced algorithms. That could lead to more immersive virtual reality and augmented reality games, as well as all kinds of new apps.
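To make that concrete, here is a rough Kotlin sketch of how an Android app opts into ARCore's Depth API today. Treat it as illustrative rather than canonical: exact method names differ between ARCore releases, and the depth-reading helper assumes a tightly packed image plane.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session

// Enable depth when the device supports it. The same API works with or
// without LiDAR; dedicated hardware would simply feed it better data.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Each frame can expose a 16-bit depth image (millimeters per pixel),
// which renderers use to hide virtual objects behind real ones.
fun depthAtPixel(frame: Frame, x: Int, y: Int): Int {
    val image = frame.acquireDepthImage16Bits() // throws if depth isn't ready yet
    try {
        val depths = image.planes[0].buffer.asShortBuffer()
        return depths.get(y * image.width + x).toInt() and 0xFFFF
    } finally {
        image.close()
    }
}
```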
2. Take more striking selfies

(Image credit: Apple)
A sample photo provided by Apple, shot on the iPhone 12 Pro
Apple's other big selling point for the LiDAR scanner is that it helps the iPhone autofocus up to six times faster in low light. Depth sensing also improves Night mode portrait shots after dark.
Android could do much the same. As with AR, Google leans on machine learning to improve its images: Night Sight mode on Android 11 works amazingly well in near-total darkness. But low ambient light can still add noise to these algorithmically enhanced selfies, and the bokeh simulation can do weird things when there's more than one face in the frame. So it's no wonder that many other Android makers have added ToF sensors to improve their shots. Full LiDAR could go even further.
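To see why per-pixel depth helps, consider this deliberately simplified Kotlin sketch of depth-assisted bokeh compositing (our own toy example, not any vendor's pipeline; the tolerance value is made up):

```kotlin
import kotlin.math.abs

// Keep pixels near the subject sharp; swap far pixels for a pre-blurred
// copy. With real depth data, two faces at different distances each get
// a correct decision instead of a guessed one.
fun composeBokeh(
    sharp: IntArray,       // original pixels (ARGB)
    blurred: IntArray,     // pre-blurred copy of the same image
    depthMm: IntArray,     // per-pixel depth in millimeters
    subjectDepthMm: Int,   // focus distance, e.g. from face detection
    toleranceMm: Int = 300 // how much of the scene counts as "in focus"
): IntArray = IntArray(sharp.size) { i ->
    if (abs(depthMm[i] - subjectDepthMm) <= toleranceMm) sharp[i] else blurred[i]
}
```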
3. Map the world inside and out
The original use of LiDAR was mapping: Apollo 15 astronauts used it to study the surface of the Moon in 1971. Just as Android users filled in the gaps in Street View by uploading spherical photos, their depth scans could lead to improvements in Google Earth's 3D terrain.
Better yet, they could make Google Earth VR (yes, it really does exist) more immersive. Project Tango, though, focused on mapping interior spaces. Reviving that idea opens the possibility of Google Maps letting you see the majesty of the Sistine Chapel in 3D as if you were really there, get step-by-step directions to your airport gate, or check whether your stadium seats are behind a pillar before buying tickets.
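Under the hood, building such maps starts with turning each depth pixel into a 3D point. Here's a minimal Kotlin sketch of the standard pinhole-camera unprojection; the intrinsics (fx, fy, cx, cy) stand in for values a real device would report:

```kotlin
data class Point3(val x: Float, val y: Float, val z: Float)

// One depth pixel -> one 3D point. fx and fy are the focal lengths,
// cx and cy the principal point. Accumulating these points frame by
// frame (transformed by the camera pose) yields an indoor point cloud.
fun unproject(
    px: Int, py: Int, depthMeters: Float,
    fx: Float, fy: Float, cx: Float, cy: Float
): Point3 = Point3(
    x = (px - cx) * depthMeters / fx,
    y = (py - cy) * depthMeters / fy,
    z = depthMeters
)
```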
4. Turbocharged touchless controls

(Image credit: Future)
The Pixel 4's Motion Sense was a groundbreaking idea. With its built-in radar, you could control your phone, from silencing calls to skipping songs, with a wave of your hand. It could also detect your presence to unlock the device faster than Apple's Face ID. Sadly, the technology never caught on and was ditched for the Pixel 5.
But just as LiDAR helps autonomous cars detect obstacles with high precision, it could also help phones detect hand gestures. And since LiDAR sensors are cheaper than miniature radars, Motion Sense could spread to Android devices beyond Google's own flagships. With a larger user base, app developers might be more inclined to adopt it and invent new touchless controls.
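As a toy illustration of the idea (nothing like the machine-learned models a shipping gesture engine would use), a crude wave detector over depth frames could look like this in Kotlin, with made-up thresholds:

```kotlin
// Find the image column of the closest valid point in a depth frame
// (a value of 0 mm marks an invalid reading).
fun nearestColumn(depthMm: IntArray, width: Int): Int {
    var bestDepth = Int.MAX_VALUE
    var bestIndex = 0
    for (i in depthMm.indices) {
        if (depthMm[i] in 1 until bestDepth) {
            bestDepth = depthMm[i]
            bestIndex = i
        }
    }
    return bestIndex % width
}

// A "wave" = the nearest point (presumably a hand) swings back and
// forth: at least two direction reversals and a wide enough sweep.
fun isWave(columns: List<Int>, minSweepPx: Int = 40): Boolean {
    var reversals = 0
    for (i in 1 until columns.size - 1) {
        val before = columns[i] - columns[i - 1]
        val after = columns[i + 1] - columns[i]
        if (before * after < 0) reversals++
    }
    return reversals >= 2 && columns.max() - columns.min() >= minSweepPx
}
```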
5. Smarter home security
Amazon and Google have been at war over the smart home for years, and it goes beyond smart speakers like the Echo and Home: Amazon also owns Ring, and Google owns Nest. As with photography, LiDAR could dramatically improve the night vision of Nest's security cameras.
Nest could also use the sensor technology to launch a standalone rival to Ring's new home security drone; after all, Tango was already helping quadcopters navigate on their own back in 2014. That, in turn, could open the door to a host of mainstream Android robots.
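Depth data would also make motion detection trivial in total darkness, since the sensor supplies its own infrared illumination. A minimal Kotlin sketch (our own example, with arbitrary thresholds):

```kotlin
import kotlin.math.abs

// Flag motion when enough pixels change depth between two frames.
// Unlike a brightness diff, this works with zero visible light.
fun motionDetected(
    previous: IntArray,         // last depth frame, in millimeters
    current: IntArray,          // new depth frame, same resolution
    thresholdMm: Int = 200,     // per-pixel change that counts as movement
    minChangedPixels: Int = 500 // ignore sensor noise on a few pixels
): Boolean {
    var changed = 0
    for (i in current.indices) {
        if (abs(current[i] - previous[i]) > thresholdMm) changed++
    }
    return changed >= minChangedPixels
}
```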
6. Bring back Google Glass

(Image credit: Google)
Google promised the world a head-mounted display back in 2012. But despite partnerships with brands like Ray-Ban and Oakley, technical hurdles and privacy concerns have kept a consumer-friendly Glass model from materializing.
With new competition from Facebook's Project Aria and Apple's rumored smart specs, Glass could well make a comeback. After all, Google Lens visual search and Assistant voice commands have essentially perfected the features Glass was supposed to offer. LiDAR could give it a competitive edge by improving the performance of its AR display.