Running Flutter and React Native Apps on the Apple Vision Pro
Gustavo Martins | Oct 31, 2024
On February 2, 2024, Apple officially released the Apple Vision Pro, its revolutionary spatial computer that aims to change how people work, communicate, and entertain themselves.
The Apple Vision Pro works as a mixed reality (AR + VR) headset and lets you control it with just your eyes, hands, and voice.
It is powered by the M2 chip, while a dedicated R1 chip processes input from the cameras, sensors, and microphones.
With all this power the Vision Pro offers us, there are a few options for developing and supporting our mobile apps on visionOS with cross-platform tools, whether for sharing a code base, flexibility, or faster development.
Let's explore two of those options in this post: Flutter and React Native. Ready? Let's dive in!
For your Flutter apps to run on an Apple Vision Pro, you only need to add some basic configuration to your Xcode project:
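As a rough sketch (assuming Xcode 15 or later with the visionOS simulator runtime installed, and the standard Flutter project layout), the key step is adding the Apple Vision run destination to the Runner target:

# Open the iOS host project of your Flutter app in Xcode
open ios/Runner.xcworkspace

# In Xcode: select the Runner target > General > Supported Destinations,
# then add the destination labeled "Apple Vision (Designed for iPad)".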
And done! Now you can select the "Apple Vision Pro" simulator as the run destination in your Xcode project and run the app.
That way, your app runs in Compatibility Mode and looks the same as it would on an iPhone/iPad, but with the possibility of changing the resolution and orientation of the window.
The Flutter community is already working on plugins that allow you to access Apple Vision features like Hand Detection, Face Detection, and more (https://pub.dev/packages/apple_vision).
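For instance, adding the plugin to a project is just the usual pub step (a sketch; check the package page for the exact setup and any per-feature packages it requires):

# Add the apple_vision plugin to your Flutter project
flutter pub add apple_vision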
The plugin is an early-stage development for the new OS and may have some bugs.
Moreover, official visionOS support for Flutter is under discussion (https://github.com/flutter/flutter/issues/128313).
As with Flutter, the same configuration lets your React Native apps run on an Apple Vision Pro in Compatibility Mode.
However, the React Native community is already building a fork of the main project, called React Native visionOS, which unlocks the platform's full capabilities and gives your app a transparent look that feels right at home next to other visionOS apps.
It allows you to leverage the power of ImmersiveSpaces and multi-window apps.
This fork is easy to set up and run. You just need to set up your environment as for the original React Native, install the visionOS simulator runtime in Xcode, and start a project using:
npx @callstack/react-native-visionos@latest init YourApp
Then, inside your project folder, you will notice a "visionos" folder, which is pretty much like the "ios" folder. Therefore, you need to install the pods in both folders, as shown below.
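A minimal sketch of that step, assuming CocoaPods is installed (some templates wrap it in bundle exec):

# Install native dependencies for both targets
cd ios && pod install && cd ..
cd visionos && pod install && cd ..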
To run your visionOS app, run the following commands:
yarn start
yarn visionos
For both cross-platform tools, we can add visionOS support through Compatibility Mode.
Still, without access to the platform's full features and libraries, we are limited when trying to follow the best practices in Apple's Human Interface Guidelines.
However, the react-native-visionos fork seems promising and is worth tracking, although it's in early-stage development and not yet production-ready.