If you have ever used FaceTime, the question of where to look during a conversation is as familiar to you as the eternal "to be or not to be" and "what is to be done." On the one hand, you want to see the other person, their gestures and facial expressions. On the other, to avoid looking like a bored melancholic with hooded eyes to the person on the far end, you have to stare strictly into the camera. Most people cannot decide which matters more, so they keep darting their eyes from the screen to the camera and back, and by the middle of the conversation these marathons leave your eyes begging for a rest. In iOS 13, this problem goes away.

See also: What’s new in iOS 13 beta 3

The iOS 13 settings include a special option that helps you maintain eye contact with the other person. It is called Attention Correction, and it relies on algorithms that artificially adjust the apparent angle of your gaze, so that you seem to be looking straight into the camera all the time even while you are actually watching the other person on the screen of your iPhone or iPad.

Left: looking at the camera; right: looking at the screen

Where to look in FaceTime

Given how realistically Attention Correction changes the direction of the user's gaze, it is reasonable to assume that it relies not only on software algorithms but also on data from the TrueDepth system, the same system that measures the depth of the face for Face ID. Its involvement would explain why Attention Correction is supported only on the iPhone XR, iPhone XS, and iPhone XS Max.
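
To get a sense of what that hardware makes available, here is a minimal Swift sketch. It uses ARKit's public face-tracking API rather than anything from Apple's FaceTime implementation, and simply reads the gaze estimate that the TrueDepth camera exposes to third-party apps:

import ARKit

// A minimal sketch, not Apple's FaceTime code: reading the gaze estimate
// that the TrueDepth camera exposes to apps via ARKit face tracking.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking is only available on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user's eyes converge,
            // expressed in the face anchor's coordinate space.
            print("Gaze estimate:", face.lookAtPoint)
        }
    }
}

Estimating the gaze is only part of the job, of course; actually redirecting it in the video stream is where Apple's undisclosed algorithms come in.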

See also: Apple turned on the radio in iOS 13 for some users

That said, augmented reality technology may also play a role here. If Attention Correction relied on TrueDepth alone, nothing would have stopped Apple from bringing the feature to the iPhone X and the 2018 iPad Pro as well. For now, however, it is available only in the firmware of those three smartphones.
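
As a purely illustrative sketch of that distinction, the public ARKit checks below separate capabilities that need only a TrueDepth camera from ones that also require an A12-class chip; they say nothing about how Apple actually gates Attention Correction:

import ARKit

// Illustrative only: public ARKit checks that distinguish "has TrueDepth"
// from "has an A12-class chip".
func logARCapabilities() {
    // True on every TrueDepth device, including iPhone X.
    print("Face tracking:", ARFaceTrackingConfiguration.isSupported)

    // These ARKit 3 (iOS 13) capabilities additionally require an A12 chip,
    // so they are false on iPhone X but true on iPhone XR/XS/XS Max.
    print("Face + world tracking:", ARFaceTrackingConfiguration.supportsWorldTracking)
    print("Person segmentation:",
          ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation))
}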
