Apple researching real-time LiDAR surface tracking, other tech to record touch sensations

Apple’s augmented reality headset or a future iPhone could use light from a display to track the movement of nearly any surface, while finger-worn devices could tell the system what kind of object a user is touching.

Apple has been rumored to be working on some form of VR or AR headset for quite some time, and various reports and patent filings have indicated the project could take a number of directions. A pair of patents granted by the US Patent and Trademark Office on Tuesday reveals two more ways the system could function.

Light-based AR object tracking

One of the advantages of AR headsets is the inclusion of cameras in the setup. While usually employed to capture an image of a scene and to perform object recognition, the same hardware could also handle object tracking, monitoring how an item changes position in real time relative to the headset.

The benefits of object tracking for AR generally boil down to being able to apply digital graphic overlays to the video feed around the object. This could include app-specific status indicators on a controller that would otherwise not be visible in the real-world view, for example.

However, object recognition and object tracking, along with determining an object's orientation, can place a heavy load on a system. In a field where a massive amount of processing is required to deliver an optimal experience, any method that reduces resource usage is welcome.

In the patent titled “AR/VR controller with event camera,” Apple offers the idea that the camera system may not need to keep track of all of the pixels relating to a tracked object at all times, and could instead cut that number down to a bare minimum when only cursory checks are needed.

This patent image shows how an HMD could need to track a user-held secondary device.

Apple suggests that the system selects specific pixels in an image that relate to a tracked object, each providing readings of light intensity and other attributes. If the camera or the object moves, the light intensity at those few pixels will change beyond a set limit, triggering the entire system to start…
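As a rough illustration of that trigger, here is a minimal Swift sketch, assuming a per-frame intensity sampler. All names here (TrackedPixel, SparseTracker, onMotionDetected) are hypothetical, since the patent describes a concept rather than an API.

```swift
import Foundation

// Hypothetical sketch of the sparse-pixel trigger described in the patent.
// None of these types or callbacks correspond to an actual Apple API.

/// One monitored pixel: its image coordinates and last-known intensity.
struct TrackedPixel {
    let x: Int
    let y: Int
    var baselineIntensity: Double
}

/// Watches a small set of pixels associated with a tracked object and
/// wakes the full tracking pipeline only when a pixel's intensity
/// changes beyond a set limit.
final class SparseTracker {
    private var pixels: [TrackedPixel]
    private let threshold: Double
    private let onMotionDetected: () -> Void

    init(pixels: [TrackedPixel],
         threshold: Double,
         onMotionDetected: @escaping () -> Void) {
        self.pixels = pixels
        self.threshold = threshold
        self.onMotionDetected = onMotionDetected
    }

    /// Called once per frame with a closure that samples the camera's
    /// intensity at (x, y). Until a monitored pixel crosses the
    /// threshold, per-frame work stays at a handful of reads instead
    /// of a full-image tracking pass.
    func update(sampleIntensity: (Int, Int) -> Double) {
        for i in pixels.indices {
            let current = sampleIntensity(pixels[i].x, pixels[i].y)
            if abs(current - pixels[i].baselineIntensity) > threshold {
                pixels[i].baselineIntensity = current
                onMotionDetected() // hand off to the heavyweight tracker
            }
        }
    }
}
```

The appeal of such a design is that the expensive full-image tracking pass runs only after one of the handful of monitored pixels crosses the threshold; until then, each frame costs just a few intensity reads.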
