Apple's new iPhone update could solve one of the biggest annoyances of video chatting
- Apple added a new feature to the iPhone in the iOS 13 beta that makes it seem like you're making eye contact during a FaceTime call even when you're looking at the screen instead of the camera, according to two people who have used the beta and posted about it on Twitter.
- Apple may be using ARKit, its suite of tools for developing augmented reality apps, to generate this effect.
- The feature only appears to be available in the developer beta of iOS 13, but it could make its way to every iPhone when iOS 13 launches later this year.
Anyone who has ever video chatted using FaceTime or a similar app knows that it can be difficult to maintain eye contact while also looking at the person you're speaking with. That's because establishing eye contact requires you to look into your device's camera, not at the other participant in the conversation on screen.
Apple is apparently trying to solve this in iOS 13 with a new feature called FaceTime Attention Correction, which app designer Mike Rundle and podcast co-host Will Sigmon recently highlighted on Twitter. A screenshot posted by Rundle shows the feature's description, which says only that eye contact with the camera will be more accurate during FaceTime calls.
And the feature actually works, according to Sigmon and Rundle.
"Looking at him on-screen (not at the camera) produces a picture of me looking dead at his eyes like I was staring into the camera," Rundle said in a tweet.
Sigmon's tweet below shows how a FaceTime conversation can look with the feature turned on.
Apple did not immediately respond to Business Insider's request for confirmation and additional details about how the feature works. FaceTime Attention Correction appears to be available only in the developer version of the iOS 13 beta; it does not currently show up in the public beta's FaceTime settings.
The feature was discovered just as Apple released the third iteration of its iOS 13 developer beta on Tuesday.
To achieve the effect, Apple appears to be using ARKit to build a depth map of the user's face and adjust the eyes as needed, Dave Schukin, cofounder of Observant, a company that makes software for monitoring whether drivers are paying attention to the road, wrote on Twitter. The Verge first spotted Schukin's tweet.
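Apple has not confirmed any of this, but Schukin's theory is plausible because ARKit's face-tracking API already exposes exactly the kind of data such an effect would need on iPhones with a TrueDepth camera: a 3D mesh of the face plus per-eye positions and a gaze estimate. The Swift sketch below shows how an app can read that data; the FaceDepthReader class is illustrative only, and the actual eye re-rendering step, which Apple has not documented, is described in a comment rather than implemented.

```swift
import ARKit

// A minimal sketch of the face data ARKit exposes that a gaze-correction
// effect could draw on. Apple has not published how FaceTime Attention
// Correction works; this class and its logging are illustrative
// assumptions, not Apple's implementation.
final class FaceDepthReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit delivers an updated ARFaceAnchor every frame, containing a
    // 3D mesh of the face plus per-eye transforms and a gaze estimate.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // The face mesh is effectively a depth map of the user's face:
        // roughly 1,200 vertices in face-local coordinates.
        let vertexCount = face.geometry.vertices.count

        // Where each eyeball sits and points, and where the user is looking.
        let leftEye = face.leftEyeTransform   // simd_float4x4
        let rightEye = face.rightEyeTransform // simd_float4x4
        let gazeTarget = face.lookAtPoint     // simd_float3, face-local space

        // A gaze-correction effect could compare gazeTarget with the camera
        // position and re-render the eye regions of the mesh so they appear
        // to look into the lens. That rendering step is the part Apple has
        // not documented, so it is omitted here.
        print("mesh vertices: \(vertexCount), gaze: \(gazeTarget), " +
              "eyes at: \(leftEye.columns.3), \(rightEye.columns.3)")
    }
}
```

This is consistent with the hardware limitation Schukin's theory implies: the depth map comes from the TrueDepth sensor, which would explain why the effect would depend on recent iPhone models rather than on iOS 13 alone.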
The feature could come to every iPhone with iOS 13 later this year, along with confirmed additions like Dark Mode and redesigns of apps like Reminders, Photos, and Apple Maps. Apple typically releases the latest version of iOS in September to coincide with the launch of its new iPhones, but it has not announced a release date more specific than this fall. If you own an iPhone but aren't a developer, you can try the software by installing the public beta after registering on Apple's website.