Hands-on with HoloLens at Microsoft Holographic Academy

Disclaimer: I work at Microsoft as a Developer Evangelist, specifically for virtual and augmented reality technology. This means I get to write about and play with cool new tech, but I do *not* get to know secret insider information. I know, I’m bummed out too. This is also strictly my own opinion, and does not in any way represent the views of Microsoft, the HoloLens team, or any other entity associated with Microsoft other than “Livi The Evangelist”.

In a quiet corner of a San Francisco hotel, I sat under an array of lights in a huge room filled with computer desks, sofas, and coffee tables. I wasn’t in a furniture showroom – I was in Microsoft’s ‘Holographic Academy’, an experience offered to a select few developers at this year’s //Build Conference. The subject of the day? “Holo World”, an overview of the Microsoft Holographic computing platform featuring HoloLens.

Windows Holographic, Microsoft.com

I was fortunate enough to join this event, where developers got a hands-on demo of the HoloLens augmented reality headset and the chance to try building a hello-world application using Unity and the new HoloToolkit libraries. We were each given a mentor and assigned a table, where we could see the devices waiting across the room for our anxious hands. No cameras or devices were allowed in, so this is going to be a text-only account of the workshop, but I wanted to share what was probably the best day of my life as an aspiring AR/VR developer.

That’s only exaggerating a little. I might have been the only person in the room bouncing up and down at the chance to try out a piece of hardware I have been dreaming about since its announcement in January. We got a brief overview of what we would be building that day (a project dubbed Origami): a small application that created a ‘hologram’ to explore the various aspects of the Windows Holographic SDK. This included generating a hologram from 3D object assets and adding spatial sound, voice commands, and gestures to make objects that obeyed the laws of physics, interacted with physical objects, and rendered on the glasses in personal space.

This workshop gave me two opportunities I had been hoping for: the chance to just try out the HoloLens and see how it worked, and the chance to get a sneak preview of what HoloLens development would look like.

First: the hardware. HoloLens isn’t the only mixed reality headset that I’ve tried, but it is hands down the best one that I have seen. The device itself was lighter than I had expected given its size, and rested comfortably on my head both with and without my glasses. The headset was easily and comfortably adjustable, and had a strap on top that carried the majority of the device’s weight on my forehead instead of my nose, which definitely made it more comfortable for the 3+ hours that I was using it.

The “holograms” that HoloLens visualizes are amazingly bright and crisp. As I mentioned earlier, they actually interact with their environment really, really cleanly – before we started coding ourselves, we had a trial app to test out that consisted of an RC car, which drove around flag poles dropped onto the ground with a gesture. It fell off of surfaces where you’d expect it to, and it occasionally got caught in backpack straps on the floor.

After playing with the demo, we finally got to boot up Unity and Visual Studio to try out the Origami app development process ourselves. We started by connecting to the HoloLens headsets with their IP addresses through the browser. The workshop then walked us through adding three assets (two crunched-up balls of paper and a pad of graph paper as a base, with some toys stacked on it) and a holographic camera object, a prefab that simply got dropped into the scene in place of the main camera. That was honestly the only thing we needed to do to get started.

With zero lines of code, we had made our own hologram for the first time!

We then got on to the more interesting stuff. We added a few other elements to the scene using the HoloToolkit, including an ambient sound script and a mesh on the holograms so they would interact with real-world physics, and each time we saw our holograms get more and more awesome.
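For the curious, here is roughly what that kind of setup boils down to in Unity. This is my own after-the-fact sketch rather than the actual HoloToolkit scripts from the lab, and the names (PaperSphere, ambientLoop) are purely illustrative:

```csharp
using UnityEngine;

// Rough approximation of the per-sphere setup from the lab: spatialized
// ambient audio plus standard Unity physics, so the hologram can fall onto
// whatever surfaces have colliders (including the spatial mapping mesh).
[RequireComponent(typeof(Rigidbody))]
[RequireComponent(typeof(SphereCollider))]
public class PaperSphere : MonoBehaviour
{
    public AudioClip ambientLoop;   // hypothetical looping ambience clip

    void Start()
    {
        // Play the ambience from the hologram's position in the room,
        // so it gets louder as the wearer walks toward it.
        var audioSource = gameObject.AddComponent<AudioSource>();
        audioSource.clip = ambientLoop;
        audioSource.loop = true;
        audioSource.spatialBlend = 1.0f;   // fully 3D / positional audio
        audioSource.Play();

        // Let gravity take over; the colliders handle the rest.
        GetComponent<Rigidbody>().useGravity = true;
    }
}
```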

What really impressed me was how natural it was to interact with the holograms we had created. Gestures weren’t perfect, but they worked significantly more consistently than anything that I’d seen with other tracking cameras – there was only one time that I actually couldn’t get the gesture working on the first or second try. The ambient sound was delightful – I’ve been especially interested in spatial sound, and what amazed me was how clearly the sound came across and how I couldn’t hear a single other person’s headset audio, despite being in a room of 60 people all playing with their own holographic scene. The sound was positional, so the device knew where I was relative to the holograms and played it accordingly, and getting closer/further away from the hologram changed the volume as well.

Creating new voice commands with the HoloToolkit library took exactly one line of code – you pass in a string, and when the HoloLens picks up the wearer saying it (I didn’t get ANY overlap from anyone else), it performs the action you specified. I “taught” my two spheres to “Play Dead”, much to the amusement of the attendees around me, in about thirty seconds.
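I don’t have the lab code to share, but the idea maps pretty directly onto Unity’s built-in KeywordRecognizer (which, as far as I understand it, is what the HoloToolkit wraps under the hood). Here is a rough sketch of a “Play Dead” command; the class name and the action are mine, not the lab’s:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch of a "Play Dead" voice command using Unity's KeywordRecognizer.
public class PlayDeadCommand : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, System.Action> keywords =
        new Dictionary<string, System.Action>();

    void Start()
    {
        // The string is essentially all you provide; the action runs when
        // the wearer of *this* headset says the phrase.
        keywords.Add("Play Dead", () =>
        {
            // Drop the sphere: turn gravity on so it falls to the floor.
            GetComponent<Rigidbody>().useGravity = true;
        });

        recognizer = new KeywordRecognizer(keywords.Keys.ToArray());
        recognizer.OnPhraseRecognized += args =>
        {
            System.Action action;
            if (keywords.TryGetValue(args.text, out action))
            {
                action();
            }
        };
        recognizer.Start();
    }
}
```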

One of the coolest parts of the entire lab was being able to view the spatial mapping that HoloLens used to visualize the physical world. There were several varieties of mesh that you could select in the Unity editor, and when we re-ran the applications on our devices, we could see what the device saw: an awesome overlay of polygons that our objects would then interact with.

The lab followed up with a “surprise” – we added a hidden prefab and script to create an explosion effect when the spheres landed on the ground. This projected a holographic hole in the floor to look through – and the objects interacted with the virtual space below the ground, too.
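If I had to guess at what that hidden script was doing, it was probably something along these lines: spawn the effect prefab wherever the sphere hits the floor. Again, this is my own reconstruction with made-up names, not the lab’s actual code:

```csharp
using UnityEngine;

// Hypothetical version of the "surprise" script: when the sphere lands,
// spawn the explosion/underworld prefab at the point of impact.
public class SphereImpact : MonoBehaviour
{
    public GameObject explosionPrefab;   // illustrative name for the lab's hidden prefab
    private bool triggered;

    void OnCollisionEnter(Collision collision)
    {
        if (triggered) return;
        triggered = true;

        // Spawn the effect exactly where the sphere landed.
        ContactPoint contact = collision.contacts[0];
        Instantiate(explosionPrefab, contact.point, Quaternion.identity);
    }
}
```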

I walked out of the workshop with HoloLens on my mind and wanted to share the experience with you. As a developer and avid VR/AR enthusiast, I could not be more excited about what this technology makes possible, and I can’t wait to see what the next iteration looks like.

HoloLens is Hiring! Check out Microsoft Careers for more info
