
My learnings running Flutter on the Apple Vision Pro
I had the chance to participate in an Apple Vision Pro hackathon (more details in Florian Bauer's article here) and to try out Flutter on the real device. Some things were familiar from Flutter on other Apple devices; others might be problematic if you don't know about them upfront. So here are my first impressions.
If you are more interested in a guide on how to get Flutter running on the Vision Pro, how to set it up, and how to get it working in release mode, I prepared a separate article:
Disclaimer:
I hope the following information is of help to you, but please take it with a grain of salt. I only tested the device for a day and focused more on building something worth showing for the hackathon.
It is quite possible that some of the things I describe as limitations are actually possible and I simply missed them, so please research them yourself before relying on them (or ruling them out) for your app.
Also, please let me know if you find anything. ;)
## My personal opinion about VR/AR/XR
Just for you to know where I am coming from:
I have quite an interest in VR/AR/XR.
I had a Google Cardboard, an Oculus Rift and now also own a Meta Quest 3. So I have spent quite a bit of time on VR/AR devices and know the ups and downs.
I also read quite a bit about it and was very excited to hear that Apple is entering the game.
I think they can push this form factor forward and make it more mainstream, because VR/AR is an amazing and unique experience that I would recommend everyone try.
BUT I don't think VR is the future (at least for now), or that everybody will run around in their glasses all day.
And as a short TL;DR spoiler:
After trying the Vision Pro today, I am even more convinced of that. ;)
## My first impressions of the device
The device is beautiful, feels sturdy and well made, and seems thoroughly thought through.
(apart from the magnetic cover, which almost made us drop the glasses multiple times during the day)
I am not going into all the details because I am not a hardware expert, but to summarize it:
I honestly did not wear it that much. It was very heavy, pinched my nose, and I think by the end of the hackathon we all agreed that we were kind of glad not to have to put it on for a while.
The eye tracking and hand gestures worked amazingly well though, and we were surprised how quickly and naturally we accepted and used them. :)
## Getting Flutter up and running on the device
Getting to work with Flutter on visionOS was super easy and fast.
I wrote a step-by-step guide on how to do that, if you are interested:
https://medium.com/p/4335c3243248/
Just as a summary here:
Setting it up was super easy (~5–10 minutes), and developing with Flutter for visionOS is very similar to developing for iOS and iPad.
## The possibilities with Flutter
As Flutter devs, we can run our iOS apps on the Vision Pro without any hassle. All it takes is adding the device as a new run destination and pairing Xcode with the Vision Pro.
### Sensors & Camera
In our short hackathon session, we already played around a bit with sensors and camera access (see limitations below ;) ).
All in all, most things we tried worked quite well and used the same APIs and packages we would also use for iOS.
So there are already plenty of things you can work with without having to do custom work or wait for packages to drop.
I think most iOS or iPad apps can work out of the box and without any issues on the device, or can be made to with little effort.
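To give a rough idea, here is a minimal sketch of the kind of camera setup we used, based on the community `camera` plugin (the same one you would use on iOS). The helper name is mine and error handling is omitted:

```dart
import 'package:camera/camera.dart';

// Minimal sketch, not production code: the same `camera` plugin used on
// iOS also worked here. `initFirstCamera` is a hypothetical helper name.
Future<CameraController?> initFirstCamera() async {
  // Ask the platform which cameras it exposes to the app.
  final cameras = await availableCameras();
  if (cameras.isEmpty) return null;

  // Open the first available camera, exactly as on iOS/iPadOS.
  final controller = CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();
  return controller;
}
```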
### Inputs
We also tried different input methods (text input, buttons, sliders, checkboxes, …), which worked fine without any additional changes needed.
An interesting part here was seeing how different the same application feels on an iPhone/iPad and then on visionOS, just because of the eye tracking/hand gesture input.
I think it changes the UX completely and allows for very interesting use cases and new possibilities.
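For illustration, here is a minimal sketch of the stock Material input widgets we tested (state handling stripped out for brevity); nothing in it is visionOS-specific:

```dart
import 'package:flutter/material.dart';

// Minimal sketch of the stock input widgets we tried on the device;
// they needed no visionOS-specific changes. State handling omitted.
class InputDemo extends StatelessWidget {
  const InputDemo({super.key});

  @override
  Widget build(BuildContext context) {
    return Column(
      children: [
        const TextField(decoration: InputDecoration(labelText: 'Text input')),
        ElevatedButton(onPressed: () {}, child: const Text('Button')),
        Slider(value: 0.5, onChanged: (_) {}),
        Checkbox(value: true, onChanged: (_) {}),
      ],
    );
  }
}
```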
There were not many things (apart from the points below) that were different on this device or that surprised me much, because I had expected to be far more limited.
Most Flutter gestures are exactly the ones we know and are automatically "translated" into their visionOS equivalents, e.g. the tap/click gesture becomes the "pinch your fingers while looking at it" gesture.
(except hover -> see “Limitations”)
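As a small example, a plain `GestureDetector` with `onTap` is all it takes; the hypothetical widget below responds to a tap on iOS and to look-and-pinch on visionOS without any extra code:

```dart
import 'package:flutter/material.dart';

// Minimal sketch: a plain onTap is enough, since visionOS translates it
// into the look-and-pinch gesture automatically.
class PinchableCard extends StatelessWidget {
  const PinchableCard({super.key});

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      // Fired by a tap on iOS/iPadOS and by look-and-pinch on visionOS.
      onTap: () => debugPrint('selected'),
      child: const Card(
        child: Padding(
          padding: EdgeInsets.all(16),
          child: Text('Look at me and pinch'),
        ),
      ),
    );
  }
}
```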
## The limitations
There are, however, some limitations and things to be careful about that I want to get out there; they are the main reason I am writing this article.
### iPad-style windowed apps instead of immersive worlds
With Flutter, we are currently limited to "Apps designed for iPad", which means our apps run as a floating window in our living room instead of as immersive applications that surround us without any borders.

That makes sense, of course, but I still wanted to mention it here.

### No access to front camera
This is something we have not tried out completely thoroughly, so there might be a way (if so, please let me know in the comments). While camera access works and is straightforward using the same Flutter APIs and packages as for iOS/iPad apps, we could not get the outward-facing front camera working.
It seems like Apple has restricted access to it. This is probably a privacy measure, and maybe there are ways around it for apps in the stores, but keep this in mind and do your research if your app depends on it.
There is, however, access to an "inner camera" that shows your "avatar" with your (not always precise) movements, which you can reach through the camera API. ;)
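If your app depends on the outward camera, a quick probe like the sketch below (again using the `camera` plugin) shows which lenses the OS actually exposes; in our test only the inner/avatar feed showed up, but please verify this yourself:

```dart
import 'package:camera/camera.dart';

// Minimal sketch to probe which lenses the OS actually exposes.
// In our test the outward-facing camera never showed up here, only
// the "avatar" feed. Verify this on your own build before relying on it.
Future<void> listLenses() async {
  final cameras = await availableCameras();
  for (final cam in cameras) {
    // lensDirection is front/back/external as reported by the platform.
    print('${cam.name}: ${cam.lensDirection}');
  }
}
```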
### No access to eye-tracking / eye pointer position
One of the things we were most excited about and had high hopes for was access to the eye-tracking data (we hoped it would be like a mouse-gesture listener that sends us coordinates), so we could build cool things that were not possible at all on other devices (drawing just with your eyes, …).
But we soon realized that there is no access to these listeners or data.
I think this is due to privacy concerns; Apple also states that it is not possible to get hover data unless the user has connected a mouse and moves the pointer around ( https://support.apple.com/en-us/HT214051 ).
Although that is possible, it is probably not how the device will be used most of the time and therefore rather an edge case.
This diminishes a lot of the potential, and it is not possible in Flutter or any other framework so far.
### No hover styles and callbacks
Initially, our reaction was: "ok, then we cannot build the application like this, but what if we just create a grid of InkWells or GestureDetectors and detect the hover event on those to trigger certain actions?" Then we realized that this does not work either.
It really seems like Flutter does not receive the trigger here; custom onHover handlers fire perfectly fine on iOS, but not on visionOS.
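For reference, here is a minimal sketch of the kind of hover probe we tried, using the standard `MouseRegion` and `InkWell` APIs; with a connected pointer on iOS/iPadOS both callbacks fire, but on visionOS neither did for us:

```dart
import 'package:flutter/material.dart';

// Sketch of the hover workaround we tried. With a connected pointer on
// iOS/iPadOS both callbacks fire; on visionOS neither did in our tests.
class HoverProbe extends StatelessWidget {
  const HoverProbe({super.key});

  @override
  Widget build(BuildContext context) {
    return Material(
      child: MouseRegion(
        onHover: (event) => debugPrint('hover at ${event.position}'),
        child: InkWell(
          onTap: () {},
          onHover: (hovering) => debugPrint('InkWell hover: $hovering'),
          hoverColor: Colors.blue.withOpacity(0.2),
          child: const Padding(
            padding: EdgeInsets.all(16),
            child: Text('Hover target'),
          ),
        ),
      ),
    );
  }
}
```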
And that is where it gets problematic.
On a device that has no visual cursor but relies on you looking at a button, not showing a hover effect in Flutter can be problematic: you get no visual feedback that the area you are currently looking at is "clickable" unless you try it.
In practice this worked better than expected, and most of the time we could use the application intuitively, but the UX is definitely flawed compared to SwiftUI applications, where onHover callbacks also don't work, but the visual hover effects do!
## Summary & Opinion
So these are some things that could be of interest to you if you are thinking of adapting or creating an app for visionOS, or just so you know what to tell colleagues, clients, or anybody else about the possibilities of Flutter on visionOS.
As said above: please take the details with a grain of salt, because there might be ways around them or changes in later versions, and I could also simply be wrong.
This is meant as a short intro, so that if you are thinking of working with visionOS, you know what to research and be careful about, including things that might not have crossed your mind as problematic before getting started.
### Would I recommend our clients build their apps for visionOS?
Yes and no. Generally, I think the market is still too small, and as of right now I cannot imagine people really wearing these glasses for longer periods once the hype dies down. But because it was so easy to take an existing Flutter app and deploy it for visionOS, I can think of some use cases where it could make sense.
In case you want to get started with Flutter for visionOS (this also works well if you don't have the device and use the simulator), you can find my step-by-step guide on how to set everything up here:
I will try to get my hands on another pair of glasses soon and might share updates, corrections, and maybe solutions in the future.
I think it always helps to get some insights and share knowledge. So if you are interested in reading more about this topic or about Flutter development in general, follow me here on Medium or on Twitter.
