Monday Musings: Analytics & Design in Virtual Reality

Blog Posts, Random Musings, Virtual Reality
October 26, 2015

Welcome to Monday Musings! These are shorter-form, note-like blog posts that may or may not be related to VR/AR, but on which I want to share some quick thoughts, get some extra insight, or simply publish quickly.

Today’s Monday Musings: Analytics & Design in Virtual Reality

I love exploring VR – not just visiting new places from within the comfort of an HMD, but digging into the underlying technological concepts and trying to figure out how they might be adapted for 3D immersive computing. Recently, I’ve been thinking about the idea of analytics in VR and what forms they might take. What are some things that would be useful to track?

The first thing I wondered about was the best way to measure the percentage of time a user spends looking at and interacting with content in a particular region of the surrounding scene; a rough sketch of one way to track this follows the list below. A quick temperature check on Twitter revealed some consensus around my initial thoughts:

  • Standing VR experiences with enhanced interaction support (e.g. hand/body tracking) tended to have a greater need for interactive content surrounding the user (such as in Owlchemy’s Job Simulator game)
  • Seated VR experiences that may or may not have basic positional tracking (e.g. the Oculus Rift DK2 or Gear VR) were more likely to have users stay focused in one primary direction rather than others.
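
To make the measurement side concrete, here is a minimal, engine-agnostic sketch (in Python; the region names and sampling loop are hypothetical, and it assumes the engine can report which named region the current gaze ray hits each sample):

```python
from collections import defaultdict

class GazeRegionTracker:
    """Accumulates how long the user's gaze falls within named regions.

    Assumes the host engine can tell us, once per sample, which region
    (if any) the gaze ray is currently pointing at -- e.g. via a raycast
    against trigger volumes named "screen", "ceiling", etc.
    """

    def __init__(self):
        self.seconds_per_region = defaultdict(float)
        self.total_seconds = 0.0

    def record_sample(self, region_name, delta_seconds):
        """Call once per frame/sample with the region currently gazed at."""
        self.total_seconds += delta_seconds
        if region_name is not None:
            self.seconds_per_region[region_name] += delta_seconds

    def dwell_percentages(self):
        """Return {region: percent of session time spent gazing there}."""
        if self.total_seconds == 0:
            return {}
        return {
            region: 100.0 * seconds / self.total_seconds
            for region, seconds in self.seconds_per_region.items()
        }


# Example usage with fabricated samples (region, seconds since last sample):
tracker = GazeRegionTracker()
for region, dt in [("screen", 0.9), ("screen", 1.1), ("ceiling", 0.5), (None, 0.5)]:
    tracker.record_sample(region, dt)
print(tracker.dwell_percentages())  # {'screen': 66.7, 'ceiling': 16.7} (percent)
```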

The experiences I’ve seen vary quite a lot along these lines, so I’ve begun referring to them as two different ways of approaching VR application design.

Netflix for Gear VR is an example of an application that has a primary content direction. These applications immerse the user in a 360-degree environment, but there isn’t necessarily anything around that will distract the user from a given direction. There is one primary point of interest in the app (the screen), and the rest of the scene is secondary.

From a behavioral point of view, my observation has been that this type of experience generally starts with someone putting on the headset and looking around, then settling into the primary direction with maybe a few periodic glances elsewhere. As I mentioned earlier, I’ve found this to be especially true of seated experiences. It isn’t limited to apps with 2D content, either: even Henry, the short film from Oculus Story Studio, keeps the majority of the viewing content within one hemisphere.

Job Simulator: The 2050 Archives, in contrast, requires the player to make full use of the entire scene rather than having a primary content direction. There isn’t a specific flow to the scene, and while a lot of content does sit in the “forward” hemisphere, you cannot complete the tasks without objects from around the room. I refer to this as an application with distributed content direction(s). These experiences are particularly amusing to watch, because player behaviors are far less predictable.

So where does this become important? Most applications have varying stages or degrees of primary and distributed content directions, and depending on the events happening in the scene, evaluating user behavior within your application becomes a question of how accurately you guide attention. Do your users spend 97% of their total time in the application looking in one direction, miss a cue for an event, and lose an important part of the context? Knowing which stages of your application should have a primary focus, or understanding the attention order of your user base, is going to be critical for developing experiences in which user attention is first captured and then retained.

Consider an analytics approach where you take the four cardinal directions and lock them into your app. You could track the estimated direction (N/S/E/W) and the time spent facing each one, then match that against your scene elements to determine whether users are consistently looking where you’d expect them to. Improving how you capture attention (and keep it where it’s wanted) helps create experiences where users don’t feel as though they are missing vital parts of the story line. Being able to track how long people look at the different components of your scene helps you make design and content-creation decisions that maximize your application’s impact.
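
As a rough illustration of that idea (a sketch only, in Python; it assumes the headset’s yaw in degrees is available each sample, and the stage names and expected directions are hypothetical things you would define for your own app):

```python
# Hypothetical stage definitions: which compass direction we expect
# users to be facing during each stage of the experience.
EXPECTED_DIRECTION = {"intro": "N", "puzzle": "E", "finale": "N"}

def yaw_to_compass(yaw_degrees):
    """Bucket a headset yaw angle (0 = 'north'/forward) into N/E/S/W."""
    yaw = yaw_degrees % 360.0
    if yaw < 45 or yaw >= 315:
        return "N"
    if yaw < 135:
        return "E"
    if yaw < 225:
        return "S"
    return "W"

def summarize(samples):
    """samples: list of (stage, yaw_degrees, delta_seconds) tuples.

    Returns per-stage time spent facing each compass direction, which can
    then be compared against EXPECTED_DIRECTION to see whether users are
    actually looking where the scene assumes they are.
    """
    totals = {}
    for stage, yaw, dt in samples:
        direction = yaw_to_compass(yaw)
        totals.setdefault(stage, {}).setdefault(direction, 0.0)
        totals[stage][direction] += dt
    return totals

# Example with fabricated samples:
data = [("intro", 10, 1.0), ("intro", 200, 0.5), ("puzzle", 95, 2.0)]
print(summarize(data))
# {'intro': {'N': 1.0, 'S': 0.5}, 'puzzle': {'E': 2.0}}
```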

A few other areas of interest for analytics (a rough event-logging sketch follows the list):

  • Objects that are most commonly interacted with
  • Attempted and incomplete interactions (e.g. device or gesture accuracy)
  • Time spent interacting with a given component of a level
  • Gaze pattern visualizations over the course of the application
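
Here is a minimal sketch of how these might be captured as a simple event stream (Python, engine-agnostic; the event names, object IDs, and fields are hypothetical, not drawn from any particular SDK):

```python
import json
import time

class VRAnalyticsLog:
    """A lightweight, append-only session log of analytics events."""

    def __init__(self):
        self.events = []

    def log(self, event_type, **fields):
        """Append a timestamped analytics event."""
        self.events.append({"type": event_type, "t": time.time(), **fields})

    def export_json(self):
        """Serialize the session log, e.g. to upload after the session ends."""
        return json.dumps(self.events)


log = VRAnalyticsLog()

# Objects most commonly interacted with
log.log("object_grabbed", object_id="coffee_mug")

# Attempted but incomplete interactions (e.g. a gesture that wasn't recognized)
log.log("interaction_failed", object_id="drawer_handle", reason="grab_lost")

# Time spent interacting with a given component of a level
log.log("component_dwell", component="microwave_station", seconds=42.5)

# Gaze samples for later visualization (e.g. a heatmap over the scene)
log.log("gaze_sample", yaw=12.0, pitch=-5.0)

print(log.export_json())
```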

It’s definitely going to be fun to explore these in more depth as the industry advances and more data is collected about how we’re interacting with VR at a psychological level.
