I’ve always admired Apple’s approach to technology for its practicality and thoughtfulness. Unlike Samsung, which could afford outright reckless experiments in commercial smartphones, such as gaze tracking or 100x zoom, the Cupertino company acted more cautiously and did not offer its users innovation for the sake of innovation. But at some point something went wrong: first Apple decided to turn the tablet into a computer, and then took up technologies with little practical application, AR and lidar.
After Apple released the iPad Pro 2020 with a built-in lidar, it became clear that the tablet was just a testing ground for the computer vision technology ahead of a really important release. After all, a tablet, especially a professional one, is a niche device, while a smartphone, especially a flagship, is a far more mass-market gadget. So from the very beginning nobody doubted that the iPhone 12 would also get lidar support. It is no coincidence that Apple gave the iPhone 11 a camera bump so large that there was room left for one more module.
What a lidar scanner is
In case you don’t know, a lidar is a sensor that can recognize three-dimensional objects using the absorption and scattering of light. It emits a beam of light, which bounces off objects in its path and returns to the sensor. From the round trip it is possible to outline the silhouettes of obstacles in the visible area and identify regions free of foreign objects. Lidars are usually found in autopilot systems, acting as the eyes of a driverless vehicle.
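The ranging principle described above boils down to simple arithmetic: the sensor times how long the light pulse takes to come back, and since the pulse covers the distance twice, the object sits at half the total path. A minimal sketch (an illustration of the principle, not Apple’s implementation):

```python
# Illustrative sketch: a lidar measures distance from the
# round-trip time of a light pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the object that reflected the pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the total path the light covered.
    """
    return C * t_seconds / 2.0


# A pulse that returns after 20 nanoseconds was reflected by an
# object roughly 3 meters away.
print(round(distance_from_round_trip(20e-9), 2))  # → 3.0
```

Repeating this measurement across a grid of directions is what turns single distances into the depth map the rest of the article talks about.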
But why put one in a smartphone? In fact, a lidar can serve at least three purposes. It can be used for face recognition, as Samsung did with the Galaxy S10 5G, whose depth sensor the company’s developers taught to read the topography of a face no worse than Face ID does. But since Apple decided to put the lidar on the rear panel of its devices, it is logical to assume it will be used, if not for photography specifically, then at least as a supporting tool for the camera. That leaves only two options: augmented reality and portrait photography.
Augmented reality on the iPhone
Yes, we know that Apple has a thing for augmented reality, and a lidar can help realize those ambitions by anchoring AR objects to the surrounding space and its obstacles, so that they don’t clip through real-world surfaces. But, as you can see in the images above, the iPhone already does a pretty good job of placing virtual objects, making them look quite material. So about the only things a lidar would improve are measurements in the Measure app, making them more accurate, and more precise positioning when placing virtual furniture from the IKEA app. I can’t come up with other scenarios yet.
Another question is whether users actually want this feature. In my opinion, AR is exactly the kind of story that works only as a wow factor. Just think: how many times would you launch it? Once? Twice? Three times? Surely no more, and even then only in the first week after purchase, while you still want to try out everything the product can do. Really, remember how many times you have tried to run AR on your current iPhone or iPad. A time and a half at most. Lidar doesn’t change that, because the problem is not that augmented reality fits poorly into everyday life, but that nobody needs it.
How to improve portraits on the iPhone
Portrait photography still looks like the most likely scenario for mass use of the lidar. Because it can recognize three-dimensional objects, it could achieve a more natural and correct blur effect. But the telephoto lenses Apple has used in iPhones since 2016 already handled this task fully. I don’t recall any expert speaking negatively about portrait photography even on the iPhone 7 Plus, let alone newer models. On the contrary, Apple’s flagship devices have either surpassed the competition or been on a par with it, but never behind. So, on the whole, the lidar will be useless.
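To see why a depth sensor helps portrait blur at all, here is a toy sketch of the idea (my own illustration, not Apple’s pipeline): pixels whose measured depth is noticeably farther than the subject get smoothed, while the subject itself stays sharp.

```python
# Toy illustration: use a per-pixel depth map to blur only the
# background of an image. Not Apple's algorithm, just the principle.

def portrait_blur(image, depth, subject_depth, tolerance=0.5):
    """Return a copy of `image` where pixels farther than the subject
    are replaced by the average of their 3x3 neighborhood (a crude blur).

    `image` and `depth` are 2D lists of floats of the same shape;
    `subject_depth` is the measured distance to the subject in meters.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if depth[y][x] > subject_depth + tolerance:  # background pixel
                neighbors = [
                    image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))
                ]
                out[y][x] = sum(neighbors) / len(neighbors)
    return out


# A 1x4 strip: bright subject at 1 m on the left, dark wall at 4 m
# on the right. Only the two far pixels get averaged.
image = [[10.0, 10.0, 0.0, 0.0]]
depth = [[1.0, 1.0, 4.0, 4.0]]
print(portrait_blur(image, depth, subject_depth=1.0))
```

With an actual depth map the subject/background boundary follows the real 3D shape, which is exactly where dual-lens depth estimation tends to fail (hair, glasses, gaps between fingers).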