Yesterday brought with it a new iOS 13 beta, and it includes a pretty nifty new FaceTime feature.
FaceTime Attention Correction is a new iOS 13 feature that adjusts your eyes during FaceTime calls to make it look like you're always looking directly into the camera. Spotted by Mike Rundle, the feature is currently available on the iPhone XS, XS Max, and XR, and you can see an example of it in the images from Will Sigmon below.
According to Dave Schukin, the feature uses Apple's ARKit to capture a depth map of your face, then adjusts your eyes so they appear to be looking at the camera. You can see it in action in the video below: the eyeglass arm distorts slightly as it passes over the eyes.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
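Apple hasn't documented exactly how the correction is implemented, but ARKit's public face-tracking API already exposes the kind of data Schukin describes. The Swift sketch below is only an illustration, not Apple's code: it runs an ARFaceTrackingConfiguration and reads the per-frame face mesh, eye transforms, and gaze estimate that an eye-adjustment pass could build on; the actual warping of the video frame is omitted.

```swift
import ARKit

// Illustrative sketch: read the face-depth and gaze data ARKit exposes.
// Requires a device with a TrueDepth camera (iPhone X or later).
final class FaceTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors.compactMap({ $0 as? ARFaceAnchor }) {
            // Depth/position of the face: a dense vertex mesh in face space.
            let vertexCount = anchor.geometry.vertices.count

            // Where each eye sits and roughly where the user is looking.
            let leftEye = anchor.leftEyeTransform
            let rightEye = anchor.rightEyeTransform
            let gaze = anchor.lookAtPoint  // point in face space the eyes converge on

            // An attention-correction pass could warp the eye regions of the
            // outgoing video toward the camera using this data; that step is
            // not shown here.
            print("mesh vertices: \(vertexCount), gaze: \(gaze)")
            _ = (leftEye, rightEye)
        }
    }
}
```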
When you're on a video call, the person on the other end can always see your face, and if your eyes are focused on the screen rather than the camera, you can appear distracted or as though you're looking at something else. iOS 13's FaceTime Attention Correction aims to fix this, making your eyes appear to be pointed at the camera even when you're actually looking at your screen or somewhere else.
iOS 13 will launch to the public this fall.