To begin using eye tracking, you’ll need to calibrate the Vision Pro to recognize your gaze. The setup process is straightforward and takes about a minute. Follow these steps:
- Access Settings: Put on your Vision Pro and open the Settings app. Scroll to the Eyes & Hands section (or Accessibility > Interaction for more options).
- Enable Eye Tracking: Select Eye Tracking and toggle it on. A pop-up will guide you through calibration.
- Follow the Dot: A moving dot will appear on the screen, stopping at various points. Look at the dot as it moves so the system can calibrate to your eye movements. For best results, remain stationary and ensure the headset fits correctly, as misalignment can reduce accuracy.
- Confirm Calibration: Once complete, a black dot (the Dwell Pointer) will appear on-screen, tracking your gaze. It acts as a cursor, replacing finger-based input.
If tracking feels off, you can recalibrate by pressing the top left button on the headset four times or by going to Settings > Eyes & Hands > Redo Eye Setup. Siri can also initiate this process with a command like, “Siri, set up eyes.”
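Conceptually, the dot-following routine gives the system pairs of raw gaze readings and known on-screen positions, from which it can fit a correction. The sketch below illustrates that general idea with a simple per-axis least-squares fit in Python; it is an illustration of the technique, not Apple's actual algorithm, and every name in it is hypothetical:

```python
def fit_axis(raw, truth):
    """Least-squares fit truth ~= a * raw + b along one axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(truth) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, truth))
    var = sum((r - mean_r) ** 2 for r in raw)
    a = cov / var
    return a, mean_t - a * mean_r

def calibrate(raw_points, dot_points):
    """Fit an independent scale/offset per axis from calibration samples."""
    ax, bx = fit_axis([p[0] for p in raw_points], [d[0] for d in dot_points])
    ay, by = fit_axis([p[1] for p in raw_points], [d[1] for d in dot_points])
    def apply(sample):
        x, y = sample
        return (ax * x + bx, ay * y + by)
    return apply

# Simulated session: the sensor reads gaze shrunk by 10% and shifted slightly.
dots = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
raw = [(0.9 * x + 0.05, 0.9 * y - 0.03) for x, y in dots]
apply_cal = calibrate(raw, dots)
print(apply_cal((0.9 * 0.5 + 0.05, 0.9 * 0.5 - 0.03)))  # close to (0.5, 0.5)
```

This also shows why the calibration is personal and why recalibration matters: the fitted correction is specific to one user's eyes and one headset fit, so a new wearer or shifted headset invalidates it.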
Using Eye Tracking for Navigation
Eye tracking on visionOS serves as the primary targeting system, functioning like a mouse pointer or touchscreen hover. Here’s how it works:
- Selecting Items: Look at a UI element, such as a button or app icon, to highlight it. A pinch gesture (tapping thumb and index finger together) confirms the selection, mimicking a click or tap. For hands-free control, enable Dwell Control in Settings > Accessibility > Interaction > AssistiveTouch, which makes a selection when you hold your gaze on an element for a set period, indicated by a ring-shaped progress bar.
- Scrolling with Eyes: With visionOS 3, expected in September 2025, users can scroll through lists, emails, or webpages by looking at the top or bottom of a window. This hands-free scrolling, reported by Cult of Mac, eliminates the need for hand gestures like pinching and flicking, making navigation smoother and less tiring.
- Opening Menus: Gaze at a menu bar to expand it, or look at a microphone icon to trigger speech input. You can also glance at your hand to summon the Home View or Control Center.
These interactions rely on the Vision Pro’s internal infrared cameras and LEDs, which project light patterns to track eye movements with precision, enhanced by the R1 chip’s foveated rendering for efficient display processing.
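The gaze-as-pointer model above boils down to two operations: hit-test the element the gaze ray lands on (confirmed by pinch or dwell), and translate a gaze near a window's top or bottom edge into scroll motion. A minimal conceptual sketch in Python (this is not visionOS API code; all names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float  # bounding box corners

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def hit_test(elements, gaze):
    """Return the element the gaze currently lands on, if any."""
    return next((e for e in elements if e.contains(*gaze)), None)

def scroll_direction(gaze_y, win_top, win_bottom, edge=0.1):
    """Map a gaze in the top/bottom 10% of a window to a scroll direction."""
    height = win_bottom - win_top
    if gaze_y < win_top + edge * height:
        return "up"
    if gaze_y > win_bottom - edge * height:
        return "down"
    return None  # gaze is in the middle of the window: no scrolling

ui = [Element("Send", 10, 10, 60, 30), Element("Cancel", 70, 10, 120, 30)]
highlighted = hit_test(ui, (25, 20))   # gaze rests on the Send button
print(highlighted.name)                # Send
print(scroll_direction(95, 0, 100))    # down
```

In the highlight-then-pinch flow, the pinch gesture simply confirms whatever `hit_test` currently returns, which is why targeting stays accurate even though your hands never point at anything.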
Customizing Eye Tracking Settings
To tailor the experience, adjust settings in Settings > Accessibility > Interaction or Pointer Control:
- Dwell Control: Fine-tune the dwell time for selections to match your preference.
- Smoothing: Adjust how closely the cursor follows your gaze. Less smoothing is more responsive but may feel twitchy; more smoothing trades a little lag for stability.
- Snap to Item: Enable this to make the cursor automatically lock onto buttons or toggles when your gaze is nearby, improving accuracy.
- Zoom on Keyboard Keys: Magnify keyboard keys for easier text input when using eye tracking.
- Auto-Hide: Set how long the cursor remains visible before hiding when not in use.
For optimal performance, make sure the headset sits stably and evenly on your face, avoid glare and reflections (especially if you use optical inserts), and consider using Dark Mode to enhance contrast. Cult of Mac notes that the Vision Pro's multiple internal cameras outperform the single front-facing camera iPhones use for eye tracking, which must work without the headset's controlled environment.
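The dwell, smoothing, and snap settings above can be sketched conceptually: smoothing as an exponential moving average over gaze samples, Snap to Item as pulling the cursor to a nearby target center, and Dwell Control as a timer that fires once the gaze stays put long enough. These are illustrative approximations in Python, not Apple's implementation:

```python
def smooth(prev, sample, alpha=0.3):
    """Exponential moving average: lower alpha = steadier but laggier cursor."""
    return (prev[0] + alpha * (sample[0] - prev[0]),
            prev[1] + alpha * (sample[1] - prev[1]))

def snap(cursor, targets, radius=15.0):
    """Snap the cursor to the nearest target center within `radius`."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    nearest = min(targets, key=lambda t: dist(cursor, t), default=None)
    return nearest if nearest and dist(cursor, nearest) <= radius else cursor

class DwellTimer:
    """Fires a selection once the gaze stays on one target long enough."""
    def __init__(self, dwell_seconds=1.0):
        self.dwell = dwell_seconds
        self.target = None
        self.elapsed = 0.0

    def update(self, target, dt):
        if target != self.target:           # gaze moved: restart the ring
            self.target, self.elapsed = target, 0.0
        elif target is not None:
            self.elapsed += dt
            if self.elapsed >= self.dwell:  # dwell complete: select it
                self.elapsed = 0.0
                return target
        return None

cursor = (0.0, 0.0)
for raw in [(10, 0), (10, 0), (10, 0)]:     # noisy samples settle toward (10, 0)
    cursor = smooth(cursor, raw)
print(cursor)                               # approaches (10, 0)
print(snap((48, 51), targets=[(50, 50), (200, 200)]))  # snaps to (50, 50)
```

The `alpha`, `radius`, and `dwell_seconds` knobs here correspond loosely to the Smoothing, Snap to Item, and Dwell Control sliders: each trades responsiveness against stability in the same way the settings do.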
Accessibility and Practical Applications
Eye tracking is a cornerstone of Vision Pro’s accessibility, designed to assist users with physical disabilities who may struggle with hand gestures or touch inputs. As Apple highlights, visionOS supports a flexible input system, allowing control via eyes, voice, or a combination, with features like Switch Control and Sound Actions complementing eye tracking. Ryan Hudson-Peralta, an accessibility consultant, praised Vision Pro as “the most accessible technology I’ve ever used,” noting its seamless functionality for users with limited mobility.
Beyond accessibility, eye tracking enhances everyday use. For instance, hands-free scrolling in visionOS 3 could make browsing emails or websites more comfortable than flicking a thumb on an iPhone. The ability to trigger actions like opening the Control Center or typing on a virtual keyboard by gaze alone streamlines tasks, especially in mixed-reality environments where physical inputs may be impractical.
Why It Matters
Eye tracking on visionOS represents a leap in human-computer interaction, aligning with Apple’s “it just works” philosophy. UploadVR reports that users find this system more intuitive than controller-based VR headsets, thanks to the seamless integration of eye and hand tracking. The Vision Pro’s 14 cameras, including four infrared cameras for eye tracking, and the R1 chip ensure precise, low-latency performance, setting a new standard for mixed-reality interfaces.
For tech users, this technology offers practical benefits: navigating apps without lifting a hand, controlling devices in hands-busy scenarios (like cooking or working), and enabling inclusive access for those with motor impairments. As Apple prepares to unveil visionOS 3 at WWDC 2025 on June 9, the addition of eye-based scrolling could further elevate the Vision Pro’s appeal, potentially surpassing iPhone navigation in ease of use.
Troubleshooting and Tips
If eye tracking feels inaccurate, try these tips from The Verge and Cult of Mac:
- Recalibrate Regularly: Redo the setup if you add ZEISS optical inserts or share the headset with others, as calibration is personalized to one user's eyes.
- Check Fit: Ensure the headset is snug and properly aligned to avoid tracking errors.
- Use Guest Mode: For demos, activate a guest user session via Control Center > Profile Icon so your own calibration isn't reset.
- Lighting Conditions: Avoid bright reflections or glare, which can disrupt the infrared sensors.
If issues persist, contact Apple Support or file a Feedback Assistant request to report specific use-case challenges, as eye-tracking data isn’t directly accessible to developers for privacy reasons.
Looking Ahead
With visionOS 3, expected in fall 2025, eye tracking will become even more powerful, particularly with hands-free scrolling across Apple’s built-in apps and third-party developer support. Bloomberg’s Mark Gurman notes that this feature leverages existing hardware, ensuring no additional cost for users. As Apple continues to refine visionOS, eye tracking could redefine how we interact with spatial computing, making the Vision Pro a must-have for tech enthusiasts and accessibility-focused users alike.