Meta Quest 3 & 3S: A Glimpse into the Future of XR Interaction
Meta continues to push the boundaries of extended reality (XR) with the latest Public Test Channel (PTC) update for Horizon OS v85. While seemingly incremental, the new features – a surface keyboard for the Quest 3 and a remappable Action Button for the Quest 3S – hint at a larger shift in how we’ll interact with virtual and mixed reality environments. These aren’t just about convenience; they’re about making XR feel more natural and intuitive.
The Rise of Surface Computing: Beyond the Keyboard
The Surface Keyboard, currently exclusive to the Quest 3, is arguably the more groundbreaking of the two features. For years, text input in VR has been a clunky experience. Floating keyboards are imprecise, and constantly switching to a physical keyboard breaks immersion. Meta’s solution – turning any flat surface into a responsive keyboard – addresses this directly.
This isn’t a new concept. Meta has been researching this for over six years, and recent breakthroughs combining neural networks and language models, as demonstrated by the TouchInsight software developed with ETH Zurich, have made it a reality. The key is predicting touch events without the need for tracking markers, a limitation of earlier prototypes. Early reports suggest impressive accuracy, offering a significantly improved typing experience.
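To make that idea concrete, here is a minimal sketch of the general approach: combine a noisy per-keystroke estimate with a language-model prior so the most plausible word wins. Everything in it – the toy vocabulary, the probabilities, the scoring – is invented for illustration; it is not Meta’s or ETH Zurich’s TouchInsight code, just the underlying intuition.

```python
import math

# Toy unigram "language model" prior over a tiny vocabulary (made-up values).
WORD_PRIOR = {"hello": 0.40, "help": 0.30, "held": 0.29, "hellp": 0.01}

def keystroke_likelihood(observed: str, candidate: str) -> float:
    """Crude stand-in for a neural touch classifier: reward matching
    characters, penalize slips. A real system would score fingertip
    trajectories; string comparison keeps this sketch self-contained."""
    if len(observed) != len(candidate):
        return 1e-9
    score = 1.0
    for o, c in zip(observed, candidate):
        score *= 0.9 if o == c else 0.05  # assumed hit/confusion probabilities
    return score

def decode(observed_keys: str) -> str:
    """Pick the word maximizing P(word) * P(observed keystrokes | word)."""
    return max(
        WORD_PRIOR,
        key=lambda w: math.log(WORD_PRIOR[w])
        + math.log(keystroke_likelihood(observed_keys, w)),
    )

# The touch classifier misreads the final keystroke ("p" instead of "o"),
# but the language-model prior still recovers the intended word.
print(decode("hellp"))  # -> hello
```

In this toy example, the prior pulls a mistyped “hellp” back to “hello” – the same intuition that lets a language model compensate for imprecise touch detection on a markerless surface.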
But the implications extend far beyond typing. Surface computing, the ability to interact with virtual elements overlaid on real-world surfaces, is poised to become a core pillar of XR interaction. Imagine virtual controls appearing on your desk for a complex VR application, or a collaborative design session where participants manipulate 3D models directly on a tabletop. This moves XR beyond a self-contained headset experience and integrates it seamlessly into our physical spaces.
Customization and Control: The Quest 3S Action Button
While less revolutionary, the ability to remap the Action Button on the Quest 3S is a significant step towards user customization. The original function – toggling passthrough – is useful, but limiting. Allowing users to assign different actions to the button empowers them to tailor the headset to their specific needs and workflows. This aligns with a broader trend in XR towards greater personalization.
This seemingly small change foreshadows a future where XR hardware adapts to the user, rather than the other way around. We can expect to see more customizable controls, adjustable interfaces, and personalized experiences that cater to individual preferences and accessibility requirements. Companies like Varjo are already pioneering this with their professional-grade headsets, offering extensive customization options.
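Conceptually, a remappable button is just a user-editable lookup from a physical control to a handler. The sketch below illustrates that idea in generic Python; the function names and binding table are hypothetical and are not part of any Meta SDK or Horizon OS settings API.

```python
# Hypothetical illustration only: handlers and bindings are invented here.

def toggle_passthrough() -> None:
    print("Passthrough toggled")      # the Action Button's original behavior

def capture_screenshot() -> None:
    print("Screenshot captured")      # one possible user-chosen remap

# A remappable button is effectively a user-editable mapping from a physical
# control to a handler; changing the value changes what a press does.
ACTION_BUTTON_BINDINGS = {"action_button": capture_screenshot}

def on_button_press(button_id: str) -> None:
    # Fall back to the default passthrough toggle if nothing is mapped.
    handler = ACTION_BUTTON_BINDINGS.get(button_id, toggle_passthrough)
    handler()

on_button_press("action_button")  # -> Screenshot captured
```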
Navigator, Horizon Feed, and the Evolving XR Interface
The planned shift to Navigator as the default UI and the removal of the Horizon Feed are further indicators of Meta’s evolving vision for XR. The Horizon Feed, a social-focused discovery platform, hasn’t gained significant traction. Navigator, with its emphasis on quick access to apps and experiences, represents a more streamlined and functional approach. This suggests a move away from trying to replicate social media within VR and towards focusing on core XR use cases.
This mirrors a trend observed in other emerging technologies. Early attempts to shoehorn existing paradigms (like social feeds) into new platforms often fail. Successful platforms prioritize functionality and user needs, building unique experiences that leverage the technology’s strengths.

Future Trends to Watch
- Haptic Feedback Integration: As surface computing evolves, expect more sophisticated haptic feedback systems to simulate the texture and feel of virtual objects.
- AI-Powered Interaction: AI will play a crucial role in understanding user intent and adapting the XR environment accordingly. Imagine a keyboard that learns your typing style and predicts your next word.
- Cross-Platform Compatibility: The ability to seamlessly transition between different XR platforms and devices will be essential for widespread adoption.
- Spatial Audio Enhancements: Realistic spatial audio will further enhance immersion and create a more believable XR experience.
- Hand Tracking Refinement: Improved hand tracking accuracy will reduce reliance on controllers and enable more natural interactions.
Did you know? Meta’s research into surface computing dates back to 2018, demonstrating a long-term commitment to this technology.
FAQ
- Is the Surface Keyboard available on the Quest 3S? Currently, no. It’s exclusive to the Quest 3 in the PTC build.
- What is the Public Test Channel? It’s a beta program that allows users to test pre-release versions of Horizon OS.
- Will features in the PTC always make it to the stable version? Not necessarily. Meta may remove or modify features based on user feedback and technical considerations.
- What is Navigator? It’s a redesigned user interface for the Quest, focusing on quick access to apps and experiences.
Pro Tip: If you’re a Quest 3 user, consider joining the Public Test Channel to experience these features firsthand and provide valuable feedback to Meta.
The features rolling out in Horizon OS v85 PTC aren’t just about incremental improvements; they represent a fundamental shift in how we’ll interact with XR. By embracing surface computing, personalization, and streamlined interfaces, Meta is laying the groundwork for a future where virtual and physical worlds seamlessly blend together. The journey is far from over, but these developments offer a compelling glimpse into what’s to come.
Want to learn more about the latest XR innovations? Explore our other articles or subscribe to our newsletter for regular updates.
