The iPhone 12 and 12 Pro are on sale now, but one of the key differences between the Pro and non-Pro models this year is a new type of depth-sensing technology called lidar. Peer closely at one of the new iPhone 12 Pro models, or the most recent iPad Pro, and you’ll see a little black dot near the camera lenses, about the same size as the flash. That’s the lidar sensor.
The iPhone 12 Pro and 12 Pro Max have three rear cameras.
But why is Apple making a big deal about lidar and what will the tech be able to do if you buy the iPhone 12 Pro or iPhone 12 Pro Max? It’s a term you’ll start hearing a lot now, so let’s break down what we know, what Apple is going to use it for and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and the technology has been around for a while. It sends out laser pulses that bounce off objects and return to the sensor, measuring distance by timing the travel, or flight, of each light pulse.
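The timing principle is simple round-trip arithmetic: light travels at a known speed, so distance is speed times time, halved because the pulse goes out and back. A minimal sketch (illustrative only, not Apple's actual implementation):

```python
# Hypothetical sketch of time-of-flight ranging: distance from the
# round-trip travel time of a single light pulse.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds: float) -> float:
    """The pulse travels to the object and back, so halve the total path."""
    return SPEED_OF_LIGHT * seconds / 2.0

# A pulse returning after roughly 33 nanoseconds hit something about 5 m away,
# which is the stated range of the iPhone 12 Pro's lidar sensor.
print(round(distance_from_round_trip(33.36e-9), 2))
```

Note how tiny the timescales are: at 5 meters, the entire round trip takes about a thirtieth of a microsecond, which is why lidar sensors need very fast, specialized timing hardware.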
How does lidar work to sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar tech sends waves of light pulses out in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
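Conceptually, each infrared dot yields one time-of-flight reading, and together the readings form a grid of distances that software can mesh into 3D geometry. A hedged sketch of that idea, assuming a toy 2x2 grid of timings (not Apple's implementation):

```python
# Illustrative sketch: converting a grid of per-dot round-trip times
# into a depth map. Each dot in the infrared spray contributes one reading.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_map(round_trip_times: list[list[float]]) -> list[list[float]]:
    """Convert a 2D grid of round-trip times (seconds) to distances (meters)."""
    return [[SPEED_OF_LIGHT * t / 2.0 for t in row] for row in round_trip_times]

# Hypothetical timings: nearer surfaces return their pulse sooner.
times = [[6.67e-9, 13.34e-9],
         [20.01e-9, 26.68e-9]]
for row in depth_map(times):
    print([round(d, 2) for d in row])  # distances of ~1, 2, 3 and 4 meters
```

A real sensor produces far denser grids, but the principle is the same: a field of per-point distances that apps can use to map a room or the objects in it.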
Isn’t this like Face ID on the iPhone?
It is, but with longer range. The idea’s the same: Apple’s Face ID-enabling TrueDepth camera also shoots out an array of infrared lasers, but can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar’s already in a lot of other tech
Lidar is a tech that’s sprouting up everywhere. It’s used for self-driving cars,…