The latest beta version of iOS packs a feature that can help visually impaired users maintain social distancing effectively by indicating how far away other people are.
Apart from detecting the presence of people, it also measures the distance to people in the iPhone camera's view, TechCrunch reported on Saturday. The feature should be available on the iPhone 12 Pro and Pro Max running the iOS 14.2 release candidate, the report said.
This useful feature builds on Apple's augmented reality (AR) platform for iOS devices, ARKit.
ARKit 4 introduced a new depth application programming interface (API) that gives apps access to the detailed depth information gathered by the LiDAR scanner on the iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. The scanner measures the distance to surrounding objects.
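As a rough illustration, the depth API is exposed through ARKit's per-frame "frame semantics". The following Swift sketch uses only public ARKit symbols; the DepthReader class itself is an illustrative name, not part of Apple's framework or of the feature described above.

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices,
        // so check for support before opting in.
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.sceneDepth carries a buffer of per-pixel distances
        // (in metres) measured by the LiDAR scanner.
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        _ = depthMap // read values here to estimate distance to objects in view
    }
}
```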
For ARKit, Apple developed a feature called "people occlusion", which detects the shapes of people. The team found that, when combined with the accurate distance measurements provided by the LiDAR units on the iPhone 12 Pro and Pro Max, this tool could be of immense help to visually impaired users, the report said.
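How Apple fuses these two signals in the shipping feature isn't public, but as a hedged illustration, ARKit already lets a developer request person segmentation together with depth in a single session. The sketch below relies only on public ARKit symbols; configurePersonAwareSession is an illustrative helper name.

```swift
import ARKit

// A minimal sketch (not Apple's actual implementation) of requesting
// person segmentation together with depth information in ARKit.
func configurePersonAwareSession(_ session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}

// In an ARSessionDelegate callback, each ARFrame then exposes:
//   frame.segmentationBuffer  - a per-pixel mask marking person pixels
//   frame.estimatedDepthData  - depth estimates for those masked pixels,
//                               which LiDAR-equipped devices can back with
//                               scanner measurements
```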
The iPhone packs several other features that are particularly useful for people with visual impairment or low vision. For instance, "VoiceOver" describes exactly what is happening on the screen, and "Magnifier" works like a digital magnifying glass.