This past weekend, I participated in a Meta augmented reality hackathon in San Francisco and had a chance to try building for one of the few consumer-available AR headsets. I'll go into more detail about the technical side of the event in an upcoming post, but I've gotten some interest in what we built during the project, so I've decided to share some of it here!
Astro AR is an augmented reality application built on the Meta SDK that brings the solar system into reality through two mini-applications geared towards teaching children (cough, padawans) about the planets of our own solar system and the various star systems of the Star Wars universe.
The first part of Astro AR, pictured above, used the Star Wars API (SWAPI) to pull down data from the web about various planetary systems in the Star Wars universe. We wanted this piece to serve as a proof of concept for consuming existing web APIs from Unity: in this case, we simply parsed the JSON object that was returned and used it to populate the planets in our Unity scene, but the broader point was that existing web APIs can be used easily in AR and VR applications. On top of that we added an MGUI layer (a wrapper around Unity's UI elements made specifically for the Meta SDK), which displayed the data returned by the GET request to SWAPI. When wearing the AR glasses, a finger press-and-hold on a virtual planet would trigger the call and display the relevant planetary information.
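The Unity code itself was C#, but the flow is simple enough to sketch in Python: fetch a planet's JSON from SWAPI, parse it, and surface a few fields for display. The endpoint and field names below come from the public SWAPI; to keep the sketch self-contained it parses a canned sample payload rather than making a live request.

```python
import json

# A trimmed sample of the JSON that SWAPI returns for a planet
# (e.g. GET https://swapi.dev/api/planets/1/). In the app, this payload
# came back from a web request triggered by the press-and-hold gesture.
sample_payload = """
{
  "name": "Tatooine",
  "climate": "arid",
  "terrain": "desert",
  "population": "200000"
}
"""

def planet_summary(payload: str) -> str:
    """Parse a SWAPI planet payload and format fields for the UI layer."""
    planet = json.loads(payload)
    return (f"{planet['name']}: climate={planet['climate']}, "
            f"terrain={planet['terrain']}, population={planet['population']}")

print(planet_summary(sample_payload))
# -> Tatooine: climate=arid, terrain=desert, population=200000
```

In Unity the same idea maps onto a coroutine that performs the web request and a deserializer that turns the JSON into an object the scene can read from.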
The second part of Astro AR was a representation of our own solar system, demonstrating a real-world application of the technology in educational environments. In this scene, the viewer played the part of the sun, watching the planets orbit around them. Although the sample project wasn't to scale, it was a fun, interactive way to see the planets, and the kind of hands-on experience that could become common in classrooms as AR/VR tech becomes more widely available.
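Orbit animations like this usually boil down to rotating each planet around a center point at a rate scaled by its orbital period. A minimal Python sketch of that math, under stated assumptions: the periods below are real approximate values in Earth days, but the radii and planet selection are placeholders, not the values from our Unity scene.

```python
import math

# Approximate orbital periods in Earth days; radii are arbitrary scene
# units, deliberately not to scale (just like the hackathon demo).
PLANETS = {
    "Mercury": {"period_days": 88.0,   "radius": 1.0},
    "Earth":   {"period_days": 365.25, "radius": 2.0},
    "Mars":    {"period_days": 687.0,  "radius": 3.0},
}

def orbit_position(radius: float, period_days: float, t_days: float):
    """Point on a circular orbit around the origin (the viewer-as-sun)
    after t_days of simulated time."""
    angle = 2.0 * math.pi * (t_days / period_days)
    return (radius * math.cos(angle), radius * math.sin(angle))

# After one full Earth year, Earth is back where it started.
x, y = orbit_position(2.0, 365.25, 365.25)
```

In Unity this corresponds to updating each planet's transform every frame from the elapsed (sped-up) simulation time, with the viewer's position as the orbit center.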
Right now, I'm working on an Oculus version of the application, but without camera input to drive gestures, it's not as fun (LeapMotion, if you wanted to throw one of those my way… you know…). That should improve as we build out functionality that works well with an Xbox controller as input. Realistically, there's a lot of opportunity that this sort of tech will open up, so if you're interested, go grab the source on GitHub and let me know what you think!