Running Flutter and React Native Apps on the Apple Vision Pro
Salatiel Lima | Apr 29, 2026
On February 2, 2024, Apple officially released the Apple Vision Pro, a spatial computer that aims to change how people work, communicate, and entertain themselves.
The Apple Vision Pro works as a mixed reality device (AR + VR) and lets you control it with just your voice, hands, and eyes.
It is powered by the M2 chip, while a dedicated R1 chip processes input from the cameras, sensors, and microphones.
With these capabilities in mind, there are a few options for developing and supporting our mobile apps on visionOS when we want to use cross-platform tools, be it for sharing the code base, flexibility, or faster development:

- Adding a native visionOS module to the project and communicating with the cross-platform side.
- Running the cross-platform app in Compatibility Mode.
- Using libraries that help develop apps for visionOS from the cross-platform tool.
Let’s explore the final two options featured in this blog post. Ready? Let’s dive in!
For your Flutter apps to run on an Apple Vision Pro, you only need to add some basic configurations to your Xcode project:

- Update Xcode and install the visionOS simulator runtime.
- Add “Apple Vision” as one of the Supported Destinations of your app.
- Make sure your app excludes the Intel x86 architecture.
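Before running anything, you can sanity-check that the visionOS simulator runtime is actually installed. The shell sketch below is my own addition, not from the original post; it assumes macOS with Xcode 15+ and degrades gracefully on other machines:

```shell
# Sketch: check whether the visionOS simulator runtime is installed.
if command -v xcrun >/dev/null 2>&1; then
  # List installed simulator runtimes and look for visionOS.
  xcrun simctl list runtimes | grep -i visionos \
    || echo "visionOS runtime missing: try 'xcodebuild -downloadPlatform visionOS'"
else
  echo "xcrun not found: run this on macOS with Xcode installed"
fi
```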
And done! Now you can select the “Apple Vision Pro” simulator as the run destination in your Xcode project and run the app.
That way, your app will run in Compatibility Mode and will look the same as it does on an iPhone/iPad, but with the possibility of changing the resolution and orientation of the window. The limitations are having to run the project from Xcode, losing the hot reload feature, and not having access to the visionOS libraries.
The Flutter community is already working on plugins that give access to Apple Vision features such as Hand Detection and Face Detection (https://pub.dev/packages/apple_vision).
It’s an early development for the new OS and may have some bugs.
Moreover, there is an ongoing discussion about supporting visionOS in Flutter itself: https://github.com/flutter/flutter/issues/128313.
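If you want to experiment with the plugin, one way to pull it in is `flutter pub add` (a sketch of mine, guarded for machines without the Flutter SDK; the package name comes from the pub.dev link above):

```shell
# Sketch: add the community apple_vision plugin to a Flutter project.
if command -v flutter >/dev/null 2>&1; then
  flutter pub add apple_vision   # registers the dependency in pubspec.yaml
else
  echo "flutter not found: install the Flutter SDK first"
fi
```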
As with Flutter, the same configurations work for running your React Native apps on an Apple Vision Pro in Compatibility Mode.
However, the React Native community is already building a fork of the main project, called React Native visionOS, which unlocks the platform’s full capabilities and gives your app a transparent look that feels right at home next to other visionOS apps.
It allows you to leverage the power of ImmersiveSpaces and multi-window apps.
This fork is easy to set up and run. You just need to set up your environment as for the original React Native, install the visionOS simulator runtime in your Xcode, and start a project using:

npx @callstack/react-native-visionos@latest init YourApp

Then, inside your project folder, you will notice a “visionos” folder, which is pretty much like the “ios” folder. Therefore, you need to install the pods in both folders.
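Installing the pods in both folders can be scripted. A minimal sketch (my own, assuming CocoaPods is installed and you are at the project root):

```shell
# Sketch: install CocoaPods dependencies for both the ios and visionos targets.
for dir in ios visionos; do
  if [ -d "$dir" ] && command -v pod >/dev/null 2>&1; then
    (cd "$dir" && pod install)
  else
    echo "skipping $dir (folder or CocoaPods missing)"
  fi
done
```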
To run your visionOS app, run the following commands:

yarn start
yarn visionos

For both cross-platform tools, we can add support through Compatibility Mode to run the app on visionOS.
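The two run commands can also be wrapped in a small guard so they only fire from the right place. This is a sketch of mine; it assumes a project created with the init command, where `yarn visionos` is a package script:

```shell
# Sketch: launch the app only from the root of a React Native visionOS project.
if [ -d visionos ] && command -v yarn >/dev/null 2>&1; then
  yarn start &      # start the Metro bundler in the background
  yarn visionos     # build and run on the Apple Vision Pro simulator
else
  echo "run this from the root of a React Native visionOS project"
fi
```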
Still, Compatibility Mode does not give access to the platform’s full feature set and libraries, which limits how closely we can follow the Apple Human Interface Guidelines.
However, the react-native-visionos fork looks promising and is worth keeping track of, although it is in early-stage development and is not production-ready.
In summary: the Apple Vision Pro was officially released on February 2, 2024. It is a spatial computer that works as a mixed reality device (AR + VR) and allows control through voice, hands, and eyes. It is powered by the M2 chip, with the R1 chip specifically dedicated to processing input from the cameras, sensors, and microphones.
There are a few options for supporting visionOS from cross-platform tools: adding a native visionOS module to the project and communicating with the cross-platform side, running the cross-platform app in Compatibility Mode, or using libraries that help develop apps for visionOS from the cross-platform tool.
To run Flutter apps on Apple Vision Pro, you need to update Xcode and install the visionOS simulator, add 'Apple Vision' as one of the Supported Destinations of your app, and ensure your app excludes the architecture for Intel x86. Then you can select the 'Apple Vision Pro' simulator as run destination in Xcode. The app will run in Compatibility Mode.
The limitations include having to run the project from Xcode, loss of the hot reload feature, the app looking the same as on iPad/iPhone, and no access to the visionOS libraries.
The same configurations as Flutter work for running React Native apps on Apple Vision Pro in Compatibility Mode. Additionally, the React Native community is building a fork called React Native visionOS that fully supports the platform's capabilities, giving apps a transparent look and allowing the use of ImmersiveSpaces and multi-window apps. A project can be started with: npx @callstack/react-native-visionos@latest init YourApp, and then run using 'yarn start' and 'yarn visionos'. However, it is in early stage development and is not production-ready.