Apple has released a feature that makes it appear as if you are looking at the camera, not the screen, during FaceTime calls. The iOS 13 beta includes an enhancement that makes it look like you are looking directly at the camera, even when your eyes are actually on the screen during a FaceTime call.
Dave Schukin announced the new feature on Twitter: during a call, users look at the screen rather than the camera, so they never appear to make eye contact with the person on the other side, and this is the "problem" the feature solves. Since eye contact matters greatly in conversations, the feature is expected to be a real advantage.
(Left: Looking at the screen in the Camera app / Right: Looking at the screen in the FaceTime app on iOS 13 Beta 3)
Even if you try to look the other person in the eyes during a call, you cannot: their face is on the screen, so looking at them means looking away from the camera. iOS 13 addresses this with a sophisticated image-manipulation technique that eliminates the problem.
According to Dave Schukin, iOS 13 uses the TrueDepth camera together with ARKit to adjust the position of your eyes in the video feed in real time.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
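Schukin's explanation suggests the correction starts from ARKit's face tracking, which on TrueDepth devices reports the position of each eye every frame. As a rough sketch (not Apple's actual implementation; the class name and comments are illustrative), an app could read those eye transforms like this:

```swift
import ARKit

// Illustrative sketch: track the user's eyes with ARKit on a TrueDepth device.
// A feature like Attention Correction would use data of this kind to decide
// how to warp the eye regions of each video frame.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Per-eye transforms relative to the face anchor, updated each frame.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            // A real implementation would warp the camera image around these
            // regions so the gaze appears directed at the camera.
            _ = (leftEye, rightEye)
        }
    }
}
```

This only shows how the depth-based eye positions are obtained; the actual warping Schukin demonstrates (visible as the bending of a straight line passed over the eyes and nose) happens in Apple's private image pipeline.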
The feature is enabled in the app's settings. As The Verge reports, this capability was predicted back in 2017. Testers revealed how it works by noticing that the iPhone subtly warps the image around the eyes and nose.
It is currently unknown whether the feature will be limited to the iPhone XS and XS Max or will also come to older devices.