Google Rolls Out Android XR SDK Developer Preview 2 with New Tools

Google has announced Developer Preview 2 of its Android XR SDK, adding a fresh layer of capabilities for developers working on immersive apps using familiar Android tools. The update arrives just in time for Google I/O 2025, where Android XR is taking center stage in two featured sessions.

What’s New in Developer Preview 2

The latest preview focuses on expanding functionality and streamlining development for extended reality (XR) applications. Highlights include:

  • Immersive Video Playback: The Jetpack XR SDK now supports 180° and 360° stereoscopic video playback, with encoding options like MV-HEVC for high-quality visuals.
  • Smarter Layouts in Compose for XR: Developers can define UI layouts that adapt dynamically to different XR displays using new tools like SubspaceModifier and SpatialExternalSurface (see the sketch after this list).
  • More Material Components for XR: Material Design for XR adds support for components like TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping apps maintain consistency across 2D and XR environments.
  • Hand Tracking with ARCore: Jetpack XR apps can now access 26 posed hand joints, enabling gesture-based controls and more natural user interaction.
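To give a feel for the Compose for XR layout model, here is a minimal sketch of a spatial panel based on the developer preview documentation. Treat the exact package paths and modifier names as assumptions against a still-moving preview API, and AnimalListScreen as a hypothetical 2D composable of your own:

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialAnimalList() {
    // Subspace opens a 3D volume that spatial UI can be placed into.
    Subspace {
        // SpatialPanel hosts ordinary 2D Compose content on a panel in space;
        // SubspaceModifier sizes the panel and lets the user move and resize it.
        SpatialPanel(
            SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
                .resizable()
                .movable()
        ) {
            AnimalListScreen() // hypothetical existing 2D Compose screen, reused as-is
        }
    }
}
```

The draw of this model is that existing 2D Compose screens can be dropped into a SpatialPanel largely unchanged, with only the surrounding subspace layout being XR-specific.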

Better Tools, Smoother Dev Experience

The Android XR Emulator continues to improve with stability fixes, AMD GPU support, and tighter integration into the Android Studio UI. Unity developers aren’t left out either: the Unity OpenXR: Android XR package (Pre-Release 2) brings performance upgrades, including SpaceWarp shader support, Dynamic Refresh Rate, and realistic hand meshes with occlusion.

[Image: A development environment showing an 'AnimalList' component for an Android XR app, with a list of animals, a search bar, and a detail view for the Bengal Tiger.]

New Unity samples demonstrate features like hand and face tracking, plane detection, and passthrough capabilities. There’s also a refreshed Mixed Reality template and persistent anchor support, making it easier to build advanced spatial apps.

Firebase and AI Come to XR

The integration of Firebase AI Logic for Unity (now in public preview) marks a new chapter for AI-powered XR. Built to work with Gemini’s multimodal and streaming capabilities, it enables responsive conversational interfaces and rich, intelligent interactions. Developers can also take advantage of Firebase services like App Check and Remote Config for security and customization.
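The preview called out here is the Unity package, but the call pattern is easiest to show in Kotlin against Firebase AI Logic's Android SDK. The sketch below is illustrative only: the model name and builder calls are assumptions based on Firebase AI Logic's published quickstarts, not the Unity API described above.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend

// Minimal single-turn Gemini call through Firebase AI Logic (Android flavor).
// Multimodal input and streamed responses go through the same GenerativeModel.
suspend fun describeScene(prompt: String): String? {
    val model = Firebase.ai(backend = GenerativeBackend.googleAI())
        .generativeModel("gemini-2.0-flash") // model name is an assumption
    return model.generateContent(prompt).text
}
```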

Looking Ahead: Devices and Distribution

Developer Preview 2 is optimized for Samsung’s upcoming Project Moohan headset, set to be the first to ship with Android XR. Shortly after, XREAL will release Project Aura, a portable, tethered device designed to run Android XR apps. Aimed at early adopters and developers, Aura offers access to standard Android apps and XR-specific experiences.

Google Play is preparing to host these apps, including support for preview assets like stereoscopic videos and spatial screenshots. Developers are encouraged to get their apps ready now to be among the first listed on the Android XR Play Store later this year.

Building XR Together

Google is continuing to collaborate with industry partners like the Khronos Group to support open standards, such as the upcoming glTF Interactivity spec. Jetpack XR will begin supporting interactive 3D models later this year, unlocking new possibilities for engaging XR content.

More details, sample projects, and resources are available at developer.android.com/develop/xr.

With new tools, devices, and support from Google’s broader ecosystem, Android XR is shaping up to be a key platform for next-gen immersive apps. Developers ready to test the waters can start experimenting now; headsets are coming, and the stage is set.
