… for Visual Effects
Tip #1185: What Does LiDAR in an iPhone 12 Do?
LiDAR is a key technology that makes AR believable.
One of the key new features in the iPhone 12 Pro is LiDAR. LiDAR stands for light detection and ranging, a technology that has been around since the 1960s. It fires laser pulses that bounce off objects and return to the sensor, measuring distance by timing the travel, or flight, of each pulse.
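The distance math behind this time-of-flight technique is simple: light travels out and back, so the range is the speed of light times the round-trip time, divided by two. A minimal sketch in Python (not Apple's actual implementation, just the underlying formula):

```python
# Time-of-flight ranging: compute distance from the round-trip
# travel time of a light pulse.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_flight_time(round_trip_seconds: float) -> float:
    """The pulse travels to the object and back, so halve the path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 33 nanoseconds hit something
# about 5 meters away -- around the iPhone sensor's maximum range.
print(round(distance_from_flight_time(33.36e-9), 2))  # ≈ 5.0
```

Note how short these intervals are: at a few meters, the timing must resolve tens of nanoseconds, which is why this is done in dedicated sensor hardware.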
An iPhone sends waves of light pulses out in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera. It works up to a range of about 16 feet (5 meters).
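Conceptually, each dot leaves the sensor in a known direction, and pairing that direction with the measured distance gives one 3D point in the field. A hypothetical sketch of that conversion (angles and function names are illustrative, not from any Apple API):

```python
import math

def dot_to_point(distance_m: float, azimuth: float, elevation: float):
    """Convert one dot's direction (radians) and range to x, y, z."""
    x = distance_m * math.cos(elevation) * math.sin(azimuth)
    y = distance_m * math.sin(elevation)
    z = distance_m * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)

# A dot aimed straight ahead that measures 2 m lands at (0, 0, 2).
print(dot_to_point(2.0, 0.0, 0.0))
```

Repeat this for every dot in the spray and you get the point cloud that apps can then “mesh” into surfaces.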
The primary purpose of LiDAR in the iPhone is to improve augmented reality (AR). It gives apps more useful and accurate information about their surroundings, for smoother, more reliable AR. Even today, this technology opens up a lot of possibilities, not just for augmented reality but also for games and shopping.
LiDAR actually has many uses across many industries. Archaeologists use it to prepare dig sites, and autonomous vehicles rely on it to construct real-time 3D maps of their surroundings. LiDAR has even been used to create highly realistic and accurate maps of race tracks in video games, like Project CARS. Police speed guns also use LiDAR.
There’s an excellent article at halide.com, from the developers of Halide, an iPhone camera app, that goes into much more detail about what LiDAR can do and how it relates to AR and to mapping the real world through your camera.
As the Halide authors conclude: “Photography isn’t traditionally taking photos anymore; it’s combining all the data and smarts in our devices into allowing totally new interpretations of what ‘photography’ can mean. And if you’re not excited about that, we’re at a loss!”
Here are the references I used for this article: