Playing around with Vizor Create

Blog Posts, Programming, Software, Virtual Reality
May 7, 2015

I’ve recently spent most of my time playing with virtual reality web development, so when I was putting together a list of tools I wanted to dive into more deeply, Vizor Create looked interesting, but also like it might be more trouble than it was worth. I’m at the point with one of my other main projects where I might decide to scrap it, so I took a break today and fired up the Vizor editor to see what it was like.

Introduction to Vizor Create on YouTube

It’s been about five and a half years since the last time I used a visual programming interface (LabVIEW, for those of you who may be interested), and the Vizor editor took some getting used to. The general idea behind it is pretty simple: drag and drop various components onto your scene and connect them to other objects to make things happen. The graph can then be used to generate a JSON file that gets loaded into a web player displaying a WebGL canvas.

The Vizor graph is similar to what you would expect from a flow chart, but that can be hard to grasp the first time you use the tool, since you only see one level of the graph at a time in the editor. For my mini-project, I made a simplified flow chart that shows the general breakdown of what my graph needed to look like:

Graph Breakdown

Now, it’s important to note that each level of my flow chart doesn’t correspond to its own unique level in the graph – it’s meant to serve as a logical breakdown of the various objects in my scene and note how components are attached to them. One key difference that took me a few minutes to figure out was animation – in my experience using ThreeJS to generate WebGL/WebVR scenes, I would have a global animation controller function that updated the child elements, but in Vizor Create, the animations are attached to the objects themselves. Coming from a Java/.NET background, that made a lot of sense to me, but it wasn’t immediately obvious from what I’ve been doing recently on the web.
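For comparison, here’s a minimal sketch (names and values are purely illustrative) of the global-animation-controller pattern I was used to from ThreeJS, where a single loop owns the updates for every child object:

```typescript
import * as THREE from 'three';

// Illustrative scene setup: one renderer, one camera, one object to animate.
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;

const globe = new THREE.Mesh(
  new THREE.SphereGeometry(1, 32, 32),
  new THREE.MeshBasicMaterial({ color: 0x3366cc })
);
scene.add(globe);

// One global animation controller updates every child each frame --
// in Vizor Create, this per-object logic lives on the nodes themselves.
function animate(): void {
  requestAnimationFrame(animate);
  globe.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```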

On to creating the first elements in the scene! There are several VR-ready pre-made objects available in the Vizor Create editor, including scenes that have support for movement, skyboxes, and other commonly used elements. To get one of these onto your graph, double-click the item that you want to include and it will appear in the current level of your graph hierarchy. I started with a VR Movement Template preset so that I could move within the scene without having to handle the controls myself.

Note: To delete an object from the Vizor Create editor, click and drag a bounding box around the item to select it, then hit delete. The item needs to have a blue border around it, which doesn’t happen if you just click the box and then try to delete it.

Like the flowchart, some objects are children of others and reside at a lower level of the graph hierarchy. Each object in the editor that has a level below it is represented by a yellowish-orange box, whereas individual inputs or components are represented by gray boxes.


The VR Render Loop, represented by graph nodes and paths

To edit a node that doesn’t have a subgraph, click on the node title to expand the item so it can be changed. Nodes that lead to a new subgraph, such as the Scene node in the image above, have an edit button that opens the new graph in the editor space. You can navigate between graph levels from the ‘Graph Tree’ section on the left side.

Generally, if you’re following along using one of the VR templates, you’ll want to do the bulk of your editing in the Scene graph, since this is where you’ll be placing your various objects and their components. To do this, locate the Scene link in the Graph Tree list and take a look at what’s already there.

Initially, the template I used had two items: a camera and a grid. The camera plays an important role in the scene – it provides a reference to the view that the canvas should have of your objects, and is linked to those objects as an input. Inputs are represented on the left side of an object, and outputs on the right.

For each new object I ended up creating (the spheres and skybox, if you refer back to the flowchart above), I created a link from the camera to the object by clicking the word ‘camera’ on the output side of the camera object and dragging it to the input side of the newly added object. Reminder: the prefabs for objects are found on the left side of the editor under Prefabs & Plugins. The text turns green when the connection is valid, and red if it isn’t. This prevents you from plugging in things that don’t make sense, and is incredibly helpful for poking around and trying new things.

My sphere subgraph with scaling, texture, and animation loop applied

My sphere subgraph with texture applied

I decided to create a simple representation of the Earth spinning, which meant that I needed to apply a scaling effect (the default size of the sphere was tiny), a rotation animation, and a texture. I started with the texture, which is a component of the 3D object that is applied to the mesh – what the scene will actually use to create the sphere. Generally in 3D development, core properties of an object’s appearance are applied to the object’s mesh; I then went back up a level to modify the object created from the mesh.
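In rough ThreeJS terms (just a sketch – the texture path is made up), the split looks something like this: the mesh-level pieces define the appearance, and the object built from them is what gets placed and animated one level up:

```typescript
import * as THREE from 'three';

// Mesh-level pieces: the geometry and the textured material define
// what the sphere actually looks like.
const geometry = new THREE.SphereGeometry(1, 32, 32);
const texture = new THREE.TextureLoader().load('textures/earth.jpg'); // hypothetical texture path
const material = new THREE.MeshBasicMaterial({ map: texture });

// The object built from the mesh is what gets scaled, rotated,
// and added to the scene one level up.
const earth = new THREE.Mesh(geometry, material);
```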

Tip: For slider values, such as the one above labeled H & V Res, you can change the numbers on either side of the bar to get a finer level of granularity.

Grid and Sphere objects within a scene graph. Sphere has two matrix transformations applied: scale and rotate

In the list of presets, I discovered that modifying the object would be fairly straightforward, and double-clicked to add a Scale node. The resulting matrix output was then connected to the Sphere node under “transform”, but when I went to add a loop for a continuous rotation animation, I learned that each input can only accept one connection – which makes sense. I quickly tracked down the Concatenate operation, which allows you to combine multiple matrix transforms into one, and added a loop that would rotate my sphere on the Y axis.

The transform for scaling was very straightforward – a size slider connected to the X, Y, and Z inputs of the Scale node, which then spat out a matrix output. The animation was a little trickier, but I figured out that I could drive any transform I wanted from a timer, and found that feeding the Time: Accumulating Time object into the Y input of the Rotate node created the desired effect while the scene was running.
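For a rough ThreeJS equivalent (again just a sketch, with an arbitrary scale factor), concatenating the scale matrix with a time-driven rotation into a single transform looks something like this:

```typescript
import * as THREE from 'three';

const clock = new THREE.Clock();
const scale = new THREE.Matrix4().makeScale(3, 3, 3); // one slider value fanned out to X, Y, and Z

// Called once per frame from the render loop.
function updateEarth(earth: THREE.Mesh): void {
  // Accumulating time drives the Y rotation, like the
  // Time: Accumulating Time node feeding the Rotate node's Y input.
  const rotate = new THREE.Matrix4().makeRotationY(clock.getElapsedTime());

  // "Concatenate": multiply the two transforms into one matrix, since the
  // object's transform input only takes a single value.
  earth.matrixAutoUpdate = false;
  earth.matrix.multiplyMatrices(rotate, scale);
}
```

Called every frame, that produces the same continuous spin the Vizor graph gives you: the scale stays fixed while the rotation angle keeps accumulating.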

At any time, you can see your scene rendered in real time under the graph by clicking the play button in the upper right corner of the editor. This was an incredibly useful feature, because each action gave immediate feedback and you could see right away whether it did what you wanted. You can’t see it in the image below, but once I hooked up the rotate matrix, the globe started spinning and I was able to make minor adjustments to the speed in real time.

The non-animated view of my scene as I was creating the graph

Once you get your scene finished, or just feel like saving or sharing it, you can do so by signing in with an account (I’m guessing this can be skipped if you’re using the self-hosted version of Vizor Create, but I haven’t tried it myself yet) and saving it to be viewed in the player. Even without sharing the scene (though that is also an option), you can take a look at it in the web player by removing the /edit/ from the URL of your project. Mine is available at http://create.vizor.io/misslivirose/simple_planet.

Although I don’t necessarily see myself using Vizor for production-ready projects any time soon, it was invaluable for teaching myself better ways to visualize 3D content and code – something that’s becoming more common in the industry with the advent of virtual and augmented reality. It was a really quick way to get a simple scene up, although I haven’t had a chance to test the VR part yet. I have seen some of the other projects that people have created with Vizor, though, and it seems really promising for specific types of projects or quick proof-of-concept visualizations. I’ll definitely be following the development of the tool closely to see how it shapes up – I could potentially see this being a great tool for more in-depth projects as it evolves. I also expect it could be a good introduction for teaching students the basic principles behind 3D/VR development – lots of potential here.
