Overview
The XR Privacy Framework (XRPF) gives App Developers a way to offer their participants control over their data and privacy while still collecting useful insights from their applications. It also serves Tool Developers who want to build a transparent privacy standard across the XR ecosystem.
XRPF is not a privacy policy and is not legally binding. It is a specification for informed consent about the types of data an XR app records. App Developers and Tool Developers should maintain separate privacy policies that specify how data is collected and used, and disclose any applicable subprocessors.
XRPF defines consent between the App Developer and the participant about data being recorded, organized into five categories: Hardware Data, Spatial Data, Location Data, Social Data, and Biometric Data.
What can an App Developer do with data after it is recorded? The developer is free to use data as they see fit—for research, training, app optimization, monetization, and so on. These uses should be indicated in their privacy policy.
Because XRPF is a specification rather than a single piece of software, implementations will evolve as technology changes. This specification will be updated with the following goals:
- Give participants a reasonable understanding about the data they make available
- Allow developers to gain additional value from the data their participants have agreed to provide
Why XRPF?
Many data sources in XR are outside the scope of standard permission popups available on Android devices or PCs. These sources can be important for building immersive experiences:
- Location Data — apps already ask for permission
- Social Data — apps already ask for permission
- Hardware Data — not asked for permission
- Spatial Data — not asked for permission
- Biometric Data — not asked for permission (depends on device)
Participants should be able to have feature-rich XR experiences—powered by these data sources—without compromising their privacy. Correct implementation of XRPF separates data used by the app to deliver experiences from data available to the developer for analysis.
Concepts & Terminology
XR — any augmented reality, virtual reality, or mixed reality experience.
Data Sources — a grouping of data that is reasonably connected to provide a high-level agreement about what the user finds acceptable to collect. This may refer to a technical grouping (e.g., GPU, CPU, and OS are all device-specific elements) or a conceptual grouping (e.g., fixations and heart rate are not directly related, but the source is the user's body).
Data Sources
XRPF organizes potentially sensitive data into five categories. Each describes the types of data available to the app that should be disclosed to the user when collected, along with restrictions on data that should never be recorded.
Hardware Data
Device info, performance metrics, CPU/GPU, OS, battery, and framerate.
Spatial Data
HMD & controller position, room size, gaze direction, and surface detection.
Location Data
Latitude, longitude, elevation, and compass direction for outdoor AR.
Social Data
Non-identifiable multiplayer and social engagement metadata.
Biometric Data
Eye tracking, heart rate, cognitive load, EEG, EMG, ECG, and GSR.
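One way to represent the five categories in an implementation is a set of boolean consent flags, one per data source. This is an illustrative sketch; the field names below are hypothetical and not mandated by XRPF:

```typescript
// Illustrative model of the five XRPF data sources as consent flags.
// Field names are hypothetical; XRPF does not mandate identifiers.
interface XRPFConsent {
  hardware: boolean;  // device info, performance metrics
  spatial: boolean;   // HMD/controller pose, room geometry
  location: boolean;  // latitude, longitude, compass direction
  social: boolean;    // non-identifiable multiplayer metadata
  biometric: boolean; // eye tracking, heart rate, EEG, etc.
}

// Conservative default: nothing is recorded until the participant opts in.
function defaultConsent(): XRPFConsent {
  return {
    hardware: false,
    spatial: false,
    location: false,
    social: false,
    biometric: false,
  };
}
```

Starting with every flag off matches the requirement that the agreement be displayed before any data is recorded.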
⚙ Hardware Data
Covers any hardware used for the experience.
Never record: individual hardware serial numbers, MAC addresses, or images from pass-through cameras.
- Device ID (implemented as a combination of serial numbers) used to track the device across multiple sessions
- IP address (may identify country and city)
- Friendly name of HMD (e.g., Pico Neo 3 Eye)
- CPU manufacturer and type (e.g., Intel i7-1200)
- GPU manufacturer and type (e.g., NVIDIA RTX 2060)
- OS version (e.g., Windows 11 Rev 2)
- Battery level of HMD and controllers
- Date/time when app was launched
- Framerate over time
- Dropped frames over time
- Performance data (e.g., GPU time per frame, memory usage)
□ Spatial Data
Covers HMD and controller movement in XR space and the XR space itself.
- Positions of events in XR space
- Position and rotation of HMD in XR space (can infer height)
- Position and rotation of controllers in XR space (can infer arm length)
- Position and rotation of additional tracking devices (e.g., Vive Trackers)
- Room size (as a rectangular area, to 10 cm accuracy)
- Direction of gaze (including eye direction if hardware is available) at a fixed interval
- Surface sizes and positions detected in AR
● Location Data
Covers location data primarily for outdoor augmented reality experiences.
- Latitude, longitude, and elevation
- Compass direction
♥ Biometric Data
Covers sensors that record biometric data.
Never record: images of the user's eyes.
- Fixations and saccades using eye tracking hardware
- Cognitive load (e.g., from HP Omnicept)
- Heart rate
- Heart rate variability
- Electromyography (EMG)
- Electroencephalography (EEG)
- Electrocardiogram (ECG/EKG)
- Galvanic Skin Response (GSR)
- High fidelity motion tracking (e.g., Teslasuit)
Implementing the Framework
Reference implementations are available for Unity and Unreal Engine.
Guidelines
- Provide non-technical descriptions of features for participants
- Customize the description of biometric data sources (if used)
- Do not disable app features if certain data sources are not selected—only disable the recording of that data
- Display the agreement before data is recorded
- Allow a user to change their agreement from a reasonably accessible menu option
- Include a link to your privacy policy for further information
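The third guideline above can be sketched in code: the feature (here, gaze interaction) keeps running regardless of consent, and only the recording of its data is gated. The class and method names are hypothetical:

```typescript
// Hypothetical example of the "never disable features" guideline:
// gaze data always flows to the app, but is only recorded with consent.
type GazeSample = { x: number; y: number; z: number; t: number };

class GazeRecorder {
  private samples: GazeSample[] = [];

  constructor(private biometricConsent: boolean) {}

  // Called every frame by the (hypothetical) eye-tracking loop.
  onGazeSample(s: GazeSample): GazeSample {
    if (this.biometricConsent) {
      this.samples.push(s); // stored for later analysis
    }
    return s; // always returned, so gaze-driven features keep working
  }

  recordedCount(): number {
    return this.samples.length;
  }
}
```

With consent declined, `onGazeSample` still hands the sample back to the app for interaction, but nothing is retained for analysis.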
Allowances
- You may waive the agreement if you receive informed consent by another method, such as written consent in an academic study or via an employment agreement. Consider including a popup in XR informing your user that data is being recorded regardless.
- You may customize UI visuals and controls for your application
- You may remove data source agreements for invalid options (e.g., the Meta Quest doesn't include biometric sensors, so that option doesn't need to be displayed)
- You may require the user to agree to the hardware data source to continue using your app. This should not include any personally identifying information.
- You may save the agreement for a participant—it does not need to be accepted each session
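Saving the agreement across sessions could look like the following sketch, where a `Map` stands in for platform storage (e.g., PlayerPrefs or localStorage) and the storage key is illustrative:

```typescript
// Hypothetical persistence of a saved agreement so it need not be
// re-accepted each session. A Map stands in for platform storage;
// the key name is illustrative, not part of the specification.
const STORAGE_KEY = "xrpf.consent.v1";

type Consent = Record<
  "hardware" | "spatial" | "location" | "social" | "biometric",
  boolean
>;

function saveConsent(store: Map<string, string>, c: Consent): void {
  store.set(STORAGE_KEY, JSON.stringify(c));
}

// Returns null when no agreement is saved, signalling that the
// agreement UI should be shown before any recording starts.
function loadConsent(store: Map<string, string>): Consent | null {
  const raw = store.get(STORAGE_KEY);
  return raw === undefined ? null : (JSON.parse(raw) as Consent);
}
```

Treating a missing record as "show the agreement" keeps the default behavior consistent with displaying the agreement before data is recorded.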
Supporting the Framework
When a participant is presented with the agreement, their choices should set corresponding flags in the application, and any data recording should respect those flags.
Tool Developers (such as analytics, advertisement, or machine learning companies) should build their SDKs to enable or disable collection of data respecting these flags.
App Developers should verify that the tools they are using correctly implement this standard.
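A Tool Developer's SDK might expose per-source toggles that the host app drives from the participant's agreement. This is a minimal sketch under that assumption; the API names are hypothetical and do not reflect any real SDK:

```typescript
// Sketch of a Tool Developer SDK that drops events for data sources
// the participant has not agreed to. API names are hypothetical.
type DataSource = "hardware" | "spatial" | "location" | "social" | "biometric";

class AnalyticsSDK {
  private allowed = new Set<DataSource>();
  private events: { source: DataSource; name: string }[] = [];

  // The host app calls this whenever the participant's agreement changes.
  setConsent(source: DataSource, granted: boolean): void {
    if (granted) this.allowed.add(source);
    else this.allowed.delete(source);
  }

  // Returns true if the event was kept; non-consented events are dropped.
  record(source: DataSource, name: string): boolean {
    if (!this.allowed.has(source)) return false;
    this.events.push({ source, name });
    return true;
  }

  eventCount(): number {
    return this.events.length;
  }
}
```

Dropping events at the SDK boundary means the host app cannot accidentally record non-consented data by forgetting a check at each call site.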
Supporters & Adopters
Cognitive3D
Cognitive3D is a supporter and adopter of the XR Privacy Framework. Their spatial analytics platform—spanning SDKs for Unity, Unreal, visionOS, WebXR, Android XR, and C++—implements the XRPF consent standard, enabling developers to instrument XR experiences while giving participants transparent control over which data sources are recorded.
Contribute
XRPF is an open standard released under the MIT License. Contributions are welcome:
- Proposals for including additional hardware and software
- CC0 iconography
- Typos or edits for clarity (grammar, descriptions, use cases)
Open an issue or submit a pull request on GitHub.
◉ Social Data
Covers non-identifiable multiplayer gameplay and social engagement metadata.
Never record: real names or usernames of the user or their friends.
License
The XR Privacy Framework is released under the MIT License. Copyright © 2026 Cognitive3D.
See the full LICENSE file on GitHub.