The iPhone 12 Pro and iPhone 12 Pro Max are getting a new accessibility feature called “People Detection,” which helps blind and visually impaired users gauge how far away other people are. The feature is part of a new beta version of iOS 14.2. It uses the LiDAR scanner and wide-angle camera found on the latest iPhone models to accurately measure the distance between users and people nearby. Apple is likely to include it among the headline accessibility changes in the public release of iOS 14.2 in the coming days.
The People Detection feature, available in the Magnifier app, combines augmented reality (AR) and machine learning to help visually impaired iPhone users understand where people and objects are in their vicinity. The addition was first spotted in September; it is now available on the iPhone 12 Pro and iPhone 12 Pro Max via the iOS 14.2 “Release Candidate” build, which reached both developers and public beta testers on Friday.
As TechCrunch reports, the People Detection function is an extension of the People Occlusion capability of Apple’s ARKit, which lets the AR system hide virtual content behind a person identified in the camera feed.
In addition to these software optimisations, the feature uses the LiDAR scanner and wide-angle camera of the iPhone 12 Pro and iPhone 12 Pro Max to accurately measure the distance between users and nearby people and objects. Ultimately, this provides assistance to users who are blind or have very low vision.
The People Detection feature alerts users when there are people in their area. When it detects someone in close proximity, it measures the distance and plays a stereo sound that corresponds to that person’s direction. Users can also set specific sounds that trigger once the iPhone measures a certain distance. This essentially helps users keep their distance from other people, which is especially useful at a time when people are advised to stay at least two meters apart.
Apple has also provided an option for haptic feedback, which pulses faster as a person gets closer, TechCrunch notes. However, haptic feedback is limited to the iPhone and is not available on the Apple Watch. There is also an option to visually indicate how far away the person is.
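The feedback model described above — a stereo cue panned toward the person’s direction, plus haptic pulses that speed up with proximity — can be sketched roughly as follows. This is a hypothetical illustration of the general idea, not Apple’s implementation; the function names, the two-meter threshold, and the pulse timings are all assumptions.

```python
from typing import Optional

# Hypothetical sketch of the People Detection feedback model. The function
# names, threshold, and timings below are illustrative assumptions, not
# Apple's actual implementation.

def stereo_pan(angle_deg: float) -> float:
    """Map the detected person's horizontal angle (-90 = far left,
    +90 = far right) to a stereo pan value in [-1.0, 1.0]."""
    clamped = max(-90.0, min(90.0, angle_deg))
    return clamped / 90.0

def haptic_interval(distance_m: float, threshold_m: float = 2.0) -> Optional[float]:
    """Seconds between haptic pulses, or None when the person is beyond
    the alert threshold (here, the two-meter distancing guideline)."""
    if distance_m > threshold_m:
        return None  # person is far enough away; no alert
    # Pulse faster as the person approaches: 0.1 s at point-blank range,
    # rising linearly to 1.0 s right at the threshold.
    return 0.1 + 0.9 * (distance_m / threshold_m)

# Someone one meter away, slightly to the right: the sound pans right
# and haptics pulse roughly twice per second.
print(stereo_pan(30.0))
print(haptic_interval(1.0))
print(haptic_interval(3.0))
```

A linear mapping is the simplest choice here; a real implementation could just as plausibly use discrete distance bands or an exponential ramp.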
Since People Detection is currently in beta, it has a few bugs and may not always give correct results. Still, Apple will likely fix these before the feature’s public debut, which may come with the release of iOS 14.2.
It’s important to point out that the feature is currently limited to the iPhone 12 Pro and iPhone 12 Pro Max. This means that even if you are using the iPhone 12 or the iPhone 12 mini, you cannot try it. The exclusivity may give some users another reason to choose a more expensive option in the iPhone 12 series.
Source link : https://gadgets.ndtv.com/mobiles/news/iphone-12-pro-max-people-detection-magnifier-app-ios-14-2-apple-blind-visually-challenged-2319235#rss-gadgets-all