In a previous project, I created an Augmented Reality environment that uses Firebase as the backend to send information back and forth between the computer, where the geometry is being created, and the phone (a Pixel 1), where said geometry gets populated and rendered. This time around, I decided to build a similar setup, this time using the Magic Leap.
It works as follows:
– The Unity Editor pushes the locations of the camera and the hands to the Firebase Database
– The visual programming tool Grasshopper reads those positions: the large sphere represents the head; the smaller spheres are the hands.
– I used those positions as the input for a responsive environment. A change in those positions triggers a change in the parametric design, which then gets pushed back to Firebase. In this case, the head's movement changes the color of the pieces, while the hands' movement shifts the pieces' positions.
– The Unity Editor then reads those changes and updates the geometry accordingly.

One of the main issues I found is that the database didn't work in real time: some changes took several seconds to be read and reflected visually.
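The round trip above can be sketched as a couple of pure functions. The database paths, axis ranges, and field names here are my own assumptions for illustration; the post doesn't show the project's actual Firebase schema.

```python
# Sketch of the Firebase round trip: the editor PUTs tracking data,
# and the parametric definition maps head movement to a color value.
# Schema and ranges are hypothetical, not taken from the project.
import json

def tracking_payload(head, left_hand, right_hand):
    """JSON body the editor would PUT to <db-url>/tracking.json."""
    return {
        "head": dict(zip("xyz", head)),
        "hands": {
            "left": dict(zip("xyz", left_hand)),
            "right": dict(zip("xyz", right_hand)),
        },
    }

def head_to_color(head_z, z_min=-2.0, z_max=2.0):
    """Map the head's position along one axis to a 0-1 value,
    standing in for the color change the definition drives."""
    t = (head_z - z_min) / (z_max - z_min)
    return max(0.0, min(1.0, t))  # clamp outside the expected range

payload = tracking_payload((0.0, 1.6, 0.5), (-0.3, 1.2, 0.4), (0.3, 1.2, 0.4))
print(json.dumps(payload))  # body for the PUT request
print(head_to_color(0.5))   # 0.625
```

The latency I saw suggests each change was a full read/write cycle rather than a streamed update, which Firebase's realtime listeners would normally avoid.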
Unfortunately, the Firebase libraries turned out to be incompatible with the Magic Leap, so to show a proof of concept I baked one version of the parametric design and loaded it into Unity. I then wrote a Unity script that works similarly to how the parametric definition was set up: it detects the distance from the user's hands to the pieces in the design and moves them accordingly.
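The core of that distance test can be sketched in plain Python. The function name, radius, and strength values are illustrative, not taken from the actual Unity script, which would use `Vector3` math instead.

```python
# Sketch of the baked demo's logic: pieces within a radius of a hand
# get pushed away from it, scaled by how close they are.
# Radius/strength values are made up for illustration.
import math

def move_pieces(pieces, hand, radius=0.5, strength=0.1):
    """Return new piece positions, displacing those near the hand."""
    moved = []
    for p in pieces:
        d = math.dist(p, hand)
        if 0.0 < d < radius:
            falloff = (radius - d) / radius  # 1 at the hand, 0 at the edge
            # Push along the hand-to-piece direction.
            offset = [(pc - hc) / d * strength * falloff
                      for pc, hc in zip(p, hand)]
            moved.append(tuple(pc + o for pc, o in zip(p, offset)))
        else:
            moved.append(p)  # out of range: leave it where it is
    return moved

pieces = [(0.2, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(move_pieces(pieces, (0.0, 0.0, 0.0)))
```

Running this logic every frame against the hand positions reproduces the responsive behavior of the original Grasshopper definition without needing the Firebase round trip.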
The vertex color shader used in the Unity Editor is not supported on the Magic Leap, so the pieces all keep the same color in this demo.
I have written a new Medium post for the Virtual Reality Pop publication where I discuss jazz, parametric design and spatial computing. You can check it out here.
At IrisVR, every quarter we stop everything to work on hack projects for a couple of days. In Q1 of this year, I decided to develop an Augmented Reality app that could stream objects and changes made to their geometry in real time. ARCore, Google's augmented reality SDK for Android, is now available for developers, and Unity has great support for it. I thought it would be fun to test it by creating an Augmented Reality environment that uses Firebase as the backend to send information back and forth between the computer, where the geometry is being created in Grasshopper, and the phone (a Pixel 1), where said geometry gets populated and rendered. Check it out below!
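The phone-side decode step can be sketched as follows; the flat coordinate-list layout is a guess on my part, since the post doesn't show how the geometry was actually stored in Firebase.

```python
# Sketch of turning a Firebase geometry snapshot into renderable data,
# assuming a hypothetical flat-list schema for vertices and faces.
import json

def decode_mesh(snapshot_json):
    """Turn a Firebase snapshot into (vertices, faces) tuples."""
    data = json.loads(snapshot_json)
    coords = data["vertices"]
    # Group the flat [x, y, z, x, y, z, ...] list into 3-tuples.
    vertices = [tuple(coords[i:i + 3]) for i in range(0, len(coords), 3)]
    faces = [tuple(f) for f in data["faces"]]
    return vertices, faces

snap = '{"vertices": [0, 0, 0, 1, 0, 0, 0, 1, 0], "faces": [[0, 1, 2]]}'
verts, faces = decode_mesh(snap)
print(verts, faces)  # three vertices, one triangle
```

Polling or listening on that path from the phone is what makes edits made in Grasshopper appear in the AR scene.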