LiDAR stands for Light Detection and Ranging. The technology uses pulsed laser light to measure variable distances to objects on the earth's surface. These light pulses, combined with other data collected by the airborne system, are used to generate accurate 3D information about the earth's surface and the target object.
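The core idea behind that ranging step is simple: a pulse travels to the target and back, so the distance is the round-trip time multiplied by the speed of light, divided by two. A minimal sketch in Python (the function name and the example timing value are illustrative, not from any actual LiDAR implementation):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT = 299_792_458

def distance_from_pulse(round_trip_time_s: float) -> float:
    """Estimate the distance to a target from a laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way distance
    is half of the total distance the light covered.
    """
    return (SPEED_OF_LIGHT * round_trip_time_s) / 2

# A pulse that returns after roughly 66.7 nanoseconds indicates a
# target about 10 meters away.
print(round(distance_from_pulse(66.7e-9), 2))
```

A LiDAR scanner repeats this measurement many times per second across the scene to build up a 3D point cloud.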
NASA has used this technology to map the surface of Mars.
Now that we know LiDAR is serious technology used in aerospace applications, you might be wondering when we will ever see it in our day-to-day lives.
Well, that day is not far off: Apple has decided to put this technology into its latest iPhone 12 Pro models, which come with a LiDAR scanner.
Apple introduced the sensor in its high-end iPhone 12 Pro and Pro Max flagships to enhance photography. This is not the first time the company has used a LiDAR sensor in its products. In March 2020, the Cupertino-based company included the sensor in its new iPad. However, the primary purpose of the sensor in that device was to aid augmented reality.
With the new iPhone, Apple claims that it'll help you with three things: photo and video effects, precise placement of AR objects, and object and room scanning. The last two applications might be largely for developers and businesses, such as construction firms, that need to map rooms. For an average consumer, photo and video effects might be the most interesting part.
Because LiDAR can work in the dark by shooting lasers to calculate distances, Apple is using it to improve the autofocus on the high-end iPhones. The company claims that, thanks to the sensor, the iPhone 12 Pro phones have six times faster autofocus in low light compared to a camera without it. Plus, the LiDAR sensor enables the device to take better portrait mode photos in low-light conditions.
The concept of using a dedicated depth sensor is not new. Plenty of mid-range phones such as the OnePlus Nord use a depth sensor camera to enhance portrait mode photos. But these sensors work better in daylight than in low light. Some devices such as the Huawei P30 Pro and the Samsung Galaxy S20 Ultra have a time-of-flight (ToF) sensor, which uses infrared rays to map the surroundings. You can read about the ToF sensor in our explainer here.
While both ToF and LiDAR can only scan an area up to a few meters away in phones, some variants of the latter can measure distances of more than 100 meters. However, those are used primarily on top of cars. The advantage of a LiDAR sensor is that it sends many small pulses from one corner of the scene to the other to 'scan' the area. The ToF sensor, on the other hand, sends a single flash pulse to measure the whole area, which can make the calculated distances less accurate.
More Snapchat accuracy?
Having a LiDAR sensor on a phone will probably encourage more developers to find better use cases for AR, ones that are more convincing than one-time amusement demos. Snapchat has already confirmed that it'll ship LiDAR-powered lenses in its iOS app.
Apple might not be able to convince you to buy its high-end models on the basis of the LiDAR sensor alone. However, the company hopes that if you're an AR developer or someone who cares about quality low-light photos, it could be one of the reasons for you to spend a few more dollars.