How to Create a Cross-Platform App for Apple Vision Pro
On February 2, 2024, Apple officially released the Apple Vision Pro, their revolutionary spatial computer that aims to change how people work, communicate, and entertain themselves.
The Apple Vision Pro works as a mixed reality headset (AR + VR) and lets you control it with just your voice, hands, and eyes.
It is powered by the M2 chip, while a dedicated R1 chip processes input from the cameras, sensors, and microphones.
Developing Apps for Apple Vision Pro
Given everything the Vision Pro offers, there are a few options for bringing our mobile apps to visionOS when we want to stick with cross-platform tools, whether for a shared code base, flexibility, or faster development:
Adding a native visionOS module to our project and communicating with the cross-platform side.
Running the cross-platform app in a Compatibility Mode.
Using libraries that will help us develop apps for the visionOS from the cross-platform tool.
Let’s explore the final two options featured in this blog post. Ready? Let’s dive in!
How to Use Flutter for Apple Vision Pro Apps
For your Flutter apps to run on an Apple Vision Pro, you only need to add some basic configuration to your Xcode project:
Update your Xcode and install the visionOS simulator
Add “Apple Vision” as one of the Supported Destinations of your app
Be sure your app excludes the Intel x86_64 architecture
And done! Now you can select the “Apple Vision Pro” simulator as the run destination in your Xcode project and run the app.
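If you want to sanity-check the setup from the command line, the sketch below assumes the default ios/Runner.xcworkspace layout that flutter create generates; the visionOS runtime and an “Apple Vision Pro” destination should appear once everything is configured:

```sh
# Verify the visionOS simulator runtime is installed
xcrun simctl list runtimes | grep -i visionos

# Open the Flutter iOS project to add the "Apple Vision" destination
# and exclude x86_64 under Build Settings
open ios/Runner.xcworkspace

# Optionally, confirm the Apple Vision Pro simulator shows up as a destination
xcodebuild -showdestinations -workspace ios/Runner.xcworkspace -scheme Runner
```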
That way, your app will run in Compatibility Mode and look the same as it does on an iPhone/iPad, but with the option to change the resolution and orientation of the window.
Limitations
Have to run the project from Xcode
Loss of the hot reload feature
The app looks the same as on iPad/iPhone
No access to the visionOS libraries
A taste of the future
The Flutter community is already working on plugins that allow you to access Apple Vision features like Hand Detection, Face Detection, and more (https://pub.dev/packages/apple_vision).
It’s still early-stage work for the new OS and may have some bugs.
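If you want to experiment with it, the plugin installs like any other package; note that some features are split into separate sub-packages on pub.dev, so the exact dependency depends on what you need:

```sh
# Add the apple_vision plugin to an existing Flutter project
flutter pub add apple_vision
```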
How to Use React Native for Apple Vision Pro Apps
As with Flutter, the same configuration works for running your React Native apps on an Apple Vision Pro in Compatibility Mode.
However, the React Native community is already building a fork of the main project, called React Native visionOS, which unlocks the platform’s full capabilities and gives your app a transparent look that feels right at home next to other visionOS apps.
It allows you to leverage the power of ImmersiveSpaces and multi-window apps.
This fork is easy to set up and run. You just need to set up your environment as for the original React Native, install the visionOS simulator runtime in Xcode, and start a project using:
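At the time of writing, the template can be scaffolded with a command along these lines (the package name and flags may change as the fork evolves, and “YourApp” is just a placeholder):

```sh
# Scaffold a new React Native visionOS project from the fork's template
npx @callstack/react-native-visionos@latest init YourApp
```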
Then, inside your project folder, you will notice a “visionos” folder, which is pretty much like the “ios” folder. Therefore, you need to install the pods in both folders.
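Assuming the default CocoaPods setup, that means something like:

```sh
cd YourApp
# Install pods for the iOS target
(cd ios && pod install)
# Install pods for the visionOS target
(cd visionos && pod install)
```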
To run your visionOS app, run the following commands:
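As a rough sketch (the exact script names may differ between template versions):

```sh
# Start the Metro bundler
npx react-native start

# In another terminal: build and launch the app in the Apple Vision Pro simulator
npx react-native run-visionos
```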
Conclusion
For both cross-platform tools, we can add support through Compatibility Mode to run the app on visionOS.
Still, without access to the full set of visionOS features and libraries, it’s hard to follow the best practices laid out in the Apple Human Interface Guidelines.
However, the react-native-visionos fork looks promising and is worth keeping an eye on, although it’s still in early-stage development and not production-ready.