Apple Vision Pro data privacy: Here's what it gathers, shares from your surroundings

Apple's data privacy overview of Vision Pro and visionOS provides an insight into what exactly the device gathers from our surroundings
An undated image of Vision Pro headset. — Freepik

Apple's recently released, class-leading mixed reality headset, called Vision Pro, has given rise to a range of concerns since its groundbreaking launch earlier this month. 

These range from flaws that paint it as an overpriced gadget to strengths that justify its comparatively high $3,500 price tag. 

However, the industry is currently grappling with whether the tech giant is considerate of individuals' privacy, as reports have appeared in the spotlight claiming that Apple's formidable contender in the sphere of VR/AR acquires users' surroundings data without their consent and shares it with third-party applications.


Thankfully, Apple last week revealed a detailed data privacy overview of the Vision Pro and visionOS, providing us with an insight into what exactly the device gathers from our surroundings and sends to third-party applications.

Environments within the Apple Vision Pro are created using a combination of camera and LiDAR data to provide near real-time viewing of a user’s space. In addition, visionOS uses audio ray tracing to simulate the behavior of sound waves as they interact with objects and surfaces. Applications overlay these scenes or, in some cases, create environments of their own.

Surroundings data Vision Pro collects and shares

  • Plane estimation: Identifying nearby flat surfaces where virtual 3D objects, which Apple calls Volumes, can be positioned. This feature improves the immersive experience by enabling users to engage with virtual objects within their actual surroundings.
  • Scene reconstruction: The process of scene reconstruction entails generating a polygonal mesh that precisely depicts the shape of objects within the user's physical space. This mesh facilitates the accurate alignment of virtual objects with physical elements in the user's environment.
  • Image anchoring: This functionality guarantees that virtual objects stay fixed in their desired locations in relation to real-world objects, even as the user changes position. The WSJ’s Joanna Stern showcased this technology in a video shared on X, where she is observed placing several timers over items boiling on a stove. 
  • Object recognition: Apple claims to utilise object recognition to identify "objects of interest in your space." In a general sense, it is employed by Vision Pro to recognise the contents of your surroundings.
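On the developer side, plane estimation and scene reconstruction are surfaced through ARKit data providers in visionOS. A minimal sketch of subscribing to plane updates might look like the following; `ARKitSession`, `PlaneDetectionProvider`, and `anchorUpdates` are part of Apple's visionOS ARKit API, while the surrounding structure is an illustrative assumption rather than Apple's own sample code:

```swift
import ARKit

// Illustrative sketch: listening for plane-estimation updates inside a
// Full Space. Each update describes a detected flat surface — a candidate
// spot for anchoring virtual content.
let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

func observePlanes() async {
    do {
        try await session.run([planeDetection])
        for await update in planeDetection.anchorUpdates {
            // A PlaneAnchor carries the surface's classification
            // (table, wall, floor, and so on) and its extent.
            print("Plane \(update.anchor.id): \(update.anchor.classification)")
        }
    } catch {
        // Plane detection only runs on-device with user permission.
        print("Plane detection unavailable: \(error)")
    }
}
```

Note that this code only runs on a Vision Pro (or the visionOS simulator), and only after the user has granted the world-sensing permission discussed below.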

Can apps get access to your Vision Pro data?

By default, apps are unable to retrieve information about the environment in Vision Pro. Third-party developers may request access to this environmental data to create more realistic experiences. This is similar to granting an app access to Photos or the Camera on an iPhone; a Full Space in Vision Pro can utilise surroundings data to enhance immersive experiences.

“For example, Encounter Dinosaurs requests access to your surroundings so the dinosaurs can burst through your physical space. By giving an app access to surroundings data, the app can map the world around you using a scene mesh, recognise objects in your surroundings, and determine the location of specific objects in your surroundings,” Apple explains.
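The permission flow Apple describes maps onto an explicit authorization request in visionOS. As a rough sketch (the `requestAuthorization(for:)` call on `ARKitSession` is a real visionOS API; the handling around it is a simplified assumption):

```swift
import ARKit

// Sketch: an app must ask for world-sensing authorization before it can
// read any surroundings data (scene mesh, planes, recognised objects).
let session = ARKitSession()

func requestSurroundingsAccess() async {
    let results = await session.requestAuthorization(for: [.worldSensing])
    switch results[.worldSensing] {
    case .allowed:
        print("App may receive scene mesh and plane data.")
    default:
        // Denied or undetermined: surroundings data stays with the system.
        print("Surroundings data remains private.")
    }
}
```

If the user declines, the app simply never receives the mesh or anchor data; the compositing of passthrough video itself happens at the system level, out of the app's reach.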

However, any app will only get access to information about your surroundings within five metres of where you are.