Bring in the Assets!

This week I worked on a way to bring external assets into the Editor.

Importing an external image into the editor and texturing a cube.

This works by means of a new import tool in the UI. The import tool allows the user to select any set of files to copy into the project’s resource directory. Once in the project, resources are shown in the new Library View, from where they can be accessed.

What can you do with the assets you bring into the project?

For 3D assets, dragging one onto the 3D view makes the Editor instantiate a new entity. This new entity will appear in the entities list view and can be manipulated like any other entity.

Image assets, in turn, can be assigned to materials to change the appearance of entities in the scene. The figure above shows how we can bring a JPG file into the Editor and use it to texture a box created from the Entity menu.

Designing the Asset Library UX

I tried different designs for the library view, starting with a simple list that would show the asset’s name, type and preview, but I ultimately decided to simplify the interaction and settled on a tree view.

It was hard to throw away the initial tabular design I implemented, but I believe going with the tree view was ultimately the right choice.

Once I had the library view, the first thing I wanted to try was whether dragging assets from it onto the 3D world and other widgets would be feasible. I thought this feature would be troublesome to land; however, I was very pleased to see the implementation come together rather quickly.

Drag and drop is a great interaction because it prevents taking the user out of context when she wants to assign a value to a widget or other UI element.

Putting the picture together

If we take a step back, we can now start to see how these features are coming together to help bring the WYSIWYG experience to the Editor.

If you recall from our first post in the Vortex Editor series (click here to quickly jump there), one of the main objectives of this project was to simplify the way we assemble scenes for the Vortex Engine.

Now that we have a Resource Library and a way to visually edit Entity properties, look at how much simpler it is to create a scene from scratch:

  1. First, we start the Editor and click File -> Import Assets.
  2. Then, we select the assets to bring into the project (for instance, a few obj files).
  3. After this, we simply drag an imported obj file from the Library onto the 3D view. This will create a new Entity.
  4. Finally, we adjust the new Entity’s properties (position, rotation, scale) to our liking and drag image assets onto it to texture it. That’s it!

The whole process is very visual, allowing the user to see the scene at all times and (hopefully) unleashing her artistic creativity by allowing quick improvisation and trial and error. It feels much more fun than having to tweak a series of float values intermixed with C++ code and recompile every time.

Having reached this initial milestone, the plan for next week will be mostly polish work for these features, improving project resource management and further cleaning up the internal architecture.

Stay tuned for more!

Entity transform via the Properties Panel

This week I finished implementing full editing of Entity transformations by means of the properties panel.

Five boxes showcasing different transforms set via the properties panel.

In the above image, five box instances were dynamically created and then moved, rotated and scaled using nothing but the transform panel. This is a big milestone that will allow us, from now on, to assemble complete scenes to be rendered with the engine without having to tinker with transformation values in code.

Now, a common question I’ve been receiving during this series is whether the editor is capable of rendering something other than boxes. Because the editor is powered by the full Vortex Engine, it can load and render a number of file formats out of the box.

With the ability to assemble scenes dynamically, we can now start bringing in external geometry in the form of entities and build our setup from them. This coming week I’m going to be working on better Entity support for external meshes and start bringing them into the editor.

Stay tuned for more!

Transform Inspection in the Vortex Editor

This week I finished implementing the Transform Panel of the properties view. This is a huge step towards providing a WYSIWYG interface to the Vortex Engine!

The now-complete properties panel. Notice how rotation and scale are now correctly displayed.

This work encompassed being able to select any entity and inspect its properties, namely, its position, rotation and scale.

As I described in my previous post, I reworked the way entities store their transforms in order to be able to keep properties separate, only combining them for rendering purposes. Moving entities to the new Transform construct was simpler than I had anticipated, as entities encapsulate and manage the scenegraph nodes that the Vortex renderers expect.

The next step now in the Properties Panel saga is to add editing capabilities to the UI, so that modifying the values presented in the transform will alter the entities in the 3D world.

The only major issue that I see here is that, as I start to add more editing capabilities to the UI, it becomes ever more pressing to start implementing the undo/redo stack. It will be necessary as the Editor becomes more powerful.

This raises the question of whether commands should go through a centralized front controller, as opposed to having each UI component modify entities on its own. A centralized component for actions must be carefully designed, as it can also provide the native backend for an (eventual) engine scripting system, something I’ve also started to research, but that requires a post of its own : )
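
To make the idea concrete, here is a minimal sketch of what routing actions through command objects could look like. The names are hypothetical and this is not the Editor’s actual code, just the general shape of an undo/redo stack:

#include <memory>
#include <vector>

// Hypothetical command interface: every UI action becomes an object that
// knows how to apply itself and how to revert itself.
class EditorCommand
{
  public:
    virtual ~EditorCommand() {}
    virtual void execute() = 0;
    virtual void undo() = 0;
};

// Minimal undo/redo stack that a centralized controller could own.
class CommandStack
{
  public:
    void submit( std::unique_ptr<EditorCommand> cmd )
    {
        cmd->execute();
        mUndo.push_back( std::move( cmd ) );
        mRedo.clear(); // a new action invalidates the redo history
    }

    void undo()
    {
        if ( mUndo.empty() ) return;
        mUndo.back()->undo();
        mRedo.push_back( std::move( mUndo.back() ) );
        mUndo.pop_back();
    }

    void redo()
    {
        if ( mRedo.empty() ) return;
        mRedo.back()->execute();
        mUndo.push_back( std::move( mRedo.back() ) );
        mRedo.pop_back();
    }

  private:
    std::vector<std::unique_ptr<EditorCommand>> mUndo;
    std::vector<std::unique_ptr<EditorCommand>> mRedo;
};

A scripting backend could then drive the very same command objects that the UI submits, which is part of the appeal of centralizing actions.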

The plan for this week is then: add “write” capabilities to the properties panel and start implementing the undo/redo stack. Stay tuned for more!

Visual Editing of Transformations through the Vortex Editor UI

This week I started implementing the properties panel (sometimes called the “inspector” panel) for the Vortex Editor.

The redesigned Transformation Panel in the Vortex Editor vertical slice.

I originally wanted to go with a table design for the UI. I thought it would give me the flexibility to add as many editable entries as necessary. I even implemented a mockup in the UI that can be seen in previous screenshots of the Editor.

The problem I found, however, was that the difference in font and text spacing between the left and right panels of the UI made the layout look uneven. I knew I wanted something more symmetrical for what’s essentially the “home” view of the Editor, so I came up with a new design, which can be seen in the image above.

Here, the Transform Panel consists of a custom UI component that is created dynamically and parented to the docked panel on the right. This provides a more uniform layout with two advantages: first, any number of property panels can be created and they will be nicely stacked one after the other in the panel (this will be useful in the future). Second, because the properties panel is detachable, it still allows the user to customize the Editor layout to her liking.

Now, one minor roadblock I’ve encountered from the engine standpoint for fully realizing this idea is the way Vortex currently represents Entity transformations.

All transformations in Vortex are represented as a 4×4 Matrix. The driving force behind this decision was to avoid having to convert between rotation representations at render time, thus saving some time during scenegraph traversal passes.

So what does a generic transformation matrix look like in Vortex?

A transformation matrix is a 4×4 array of values in the homogeneous coordinate system:

  \left( \begin{array}{cccc}  sr_{00} & r_{01} & r_{02} & t_{x} \\  r_{10} & sr_{11} & r_{12} & t_{y} \\  r_{20} & r_{21} & sr_{22} & t_{z} \\  0 & 0 & 0 & 1  \end{array} \right)

In this matrix, (tx, ty, tz) correspond to the translation component of the Entity and we can easily extract this information to populate the Transformation Panel. But what happens with the rotation and scale?

Rotation and scale are mixed together in the matrix (represented by the overlapping sr components), so we can’t really extract the original scale and rotation that generated it. This means we will only be able to show and edit the position of the Entity, and not its rotation or scale.
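
To see why, consider the upper-left 3×3 block: assuming the usual translation · rotation · scale composition order, it is the product of a rotation matrix R and a diagonal scale matrix S, so every stored entry already carries a scale factor:

  R \cdot S =  \left( \begin{array}{ccc}  r_{00} & r_{01} & r_{02} \\  r_{10} & r_{11} & r_{12} \\  r_{20} & r_{21} & r_{22}  \end{array} \right)  \left( \begin{array}{ccc}  s_{x} & 0 & 0 \\  0 & s_{y} & 0 \\  0 & 0 & s_{z}  \end{array} \right)  =  \left( \begin{array}{ccc}  s_{x} r_{00} & s_{y} r_{01} & s_{z} r_{02} \\  s_{x} r_{10} & s_{y} r_{11} & s_{z} r_{12} \\  s_{x} r_{20} & s_{y} r_{21} & s_{z} r_{22}  \end{array} \right)

Only the products are stored in the matrix, which is why the original rotation and scale values cannot simply be read back out.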

This is a limitation that has to be lifted.

The plan is to provide a higher-level contraption for describing transformations: a Transform class that will keep separate tabs on position, rotation and scale, but will also provide a convenience method for computing, on demand, the transform matrix it represents.

A tentative interface for the Transform class could be:

namespace vtx {

// Keeps position, rotation and scale as separate properties and only
// combines them into a matrix when the renderer needs one.
class Transform
{
  public:
    void setPosition( float x, float y, float z );
    void setRotationEuler( float rx, float ry, float rz );
    void setScale( float sx, float sy, float sz );

    vtx::Matrix4 matrix() const; // used for the rendering pass

    // Other getter methods omitted
};

} // namespace vtx
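
To make the on-demand computation concrete, here is a minimal, self-contained sketch of what matrix() could do, assuming a translation · rotation · scale composition with XYZ Euler angles and the row layout shown in the matrix above. This is illustrative only; the actual Vortex implementation uses its own Matrix4 type and may follow different conventions:

#include <cmath>

// Illustrative sketch only: compose a 4x4 transform laid out as in the
// matrix above (translation in the last column) from position, XYZ Euler
// rotation (in radians) and scale, i.e. M = T * R * S with R = Rz * Ry * Rx.
void composeTransform( float out[4][4],
                       float tx, float ty, float tz,
                       float rx, float ry, float rz,
                       float sx, float sy, float sz )
{
    const float cx = std::cos( rx ), snx = std::sin( rx );
    const float cy = std::cos( ry ), sny = std::sin( ry );
    const float cz = std::cos( rz ), snz = std::sin( rz );

    // Combined rotation R = Rz * Ry * Rx.
    const float r[3][3] = {
        { cz * cy,  cz * sny * snx - snz * cx,  cz * sny * cx + snz * snx },
        { snz * cy, snz * sny * snx + cz * cx,  snz * sny * cx - cz * snx },
        { -sny,     cy * snx,                   cy * cx                   }
    };

    const float scale[3] = { sx, sy, sz };
    const float trans[3] = { tx, ty, tz };

    for ( int i = 0; i < 3; ++i )
    {
        for ( int j = 0; j < 3; ++j )
            out[i][j] = r[i][j] * scale[j]; // rotation and scale end up mixed together
        out[i][3] = trans[i];               // translation keeps its own column
    }

    out[3][0] = out[3][1] = out[3][2] = 0.0f;
    out[3][3] = 1.0f;
}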

I think this will be a good change for the Engine. Working with Entities using a position-rotation-scale mindset, instead of having to deal with the cognitive overhead of expressing these as matrix operations, will help users be more productive (and save precious keystrokes).

This coming week I will be working on finalizing this implementation and finally exposing full Entity transformation control through the UI. Stay tuned!

The GLSL Shader Editor

This week we take a break from work on the Vortex Editor to revisit an older personal project of mine: the GLSL Shader Editor, a custom editor for OpenGL shaders.

The UI of my custom shader editor.

The idea of the editor was to allow very fast shader iteration times by providing an area where a shader program could be written and then, by simply pressing Cmd+B (Ctrl+B on Windows), the shader source would be compiled and hot-loaded into the running application.

Hot-loading meant seeing the results of the new shading instantly, without having to stop the app or even save the shader source files, which made for very fast turn-around times when experimenting with shader ideas.

As the image above shows, the UI was divided into two main areas: an Edit View and a Render View.

The Edit View consisted of a tabbed component with two text areas, named “Vertex” and “Fragment”, where you could write your custom vertex and fragment shaders respectively. The contents of these two areas would be the shader source that was compiled and linked into a shader program.

The shader program would be compiled by pressing Cmd+B and, if no errors were found, hot-loaded and used to shade the model displayed in the Render View.

The status bar (showing “OK” in the image) would display any shader compilation errors as reported by the video driver.
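
For reference, the compile-and-report step maps to a handful of standard OpenGL 2.0 calls. The sketch below is generic, not the editor’s actual code; it just shows where the driver’s log comes from:

#include <GL/glew.h> // any loader exposing the OpenGL 2.0 entry points works
#include <string>
#include <vector>

// Compile a single shader stage and return its id, or 0 on failure.
// On error, the driver-reported log is copied into outLog, which is the
// kind of text a status bar would display instead of "OK".
GLuint compileStage( GLenum type, const std::string& source, std::string& outLog )
{
    GLuint shader = glCreateShader( type );
    const char* src = source.c_str();
    glShaderSource( shader, 1, &src, nullptr );
    glCompileShader( shader );

    GLint status = GL_FALSE;
    glGetShaderiv( shader, GL_COMPILE_STATUS, &status );
    if ( status != GL_TRUE )
    {
        GLint length = 0;
        glGetShaderiv( shader, GL_INFO_LOG_LENGTH, &length );
        std::vector<char> log( length > 0 ? length : 1, '\0' );
        glGetShaderInfoLog( shader, static_cast<GLsizei>( log.size() ), nullptr, log.data() );
        outLog.assign( log.data() );
        glDeleteShader( shader );
        return 0;
    }

    outLog = "OK";
    return shader;
}

The vertex and fragment stages compiled this way would then be attached to a program object with glAttachShader and glLinkProgram before being hot-swapped into the render loop.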

The application had a number of built-in primitives and also allowed importing models in the OBJ format. It was developed on Ubuntu Linux and supported MS Windows and OS X on a number of different video cards.

Application Features

  • Built entirely in C++.
  • Supports Desktop OpenGL 2.0.
  • Qt GUI.
  • Supported platforms: (Ubuntu) Linux, MS Windows, OS X.
  • Diverse set of visual primitives and OBJ import support.
  • Very efficient turn-around times by hot-loading the shader dynamically – no need to save files!
  • GLSL syntax highlighting.
  • Docked, customizable UI.

Interestingly, this project was developed at around the same time I got started with the Vortex Engine, so it does not use any of Vortex’s code. This means that all shader compiling and loading, as well as all rendering, was developed from scratch for this project.

I’ve added a project page for this application (under Personal Projects in the blog menu). I’ve also redesigned the menu to list all different personal projects that I’ve either worked on or that I’m currently working on, so please feel free to check it out.

Next week, we’ll be going back to the Vortex Editor! Stay tuned for more!

Basic Entity Instancing and Positioning

This week, I’ve continued working on the Editor, adding Entity instancing through the UI and (simple) position editing.

An animated gif showing very basic entity creation and positioning in the Vortex Editor.

The image above is a rough preview showing how any number of Entities can now be created. The next steps will be to add mouse-based selection, as well as a more comprehensive UI that allows finer control over Entity position, rotation and scale.

Once these and a few more Editor features are in, I will create a higher quality video showcasing the creation of an environment by means of Entity editing.

Stay tuned for more!

Entity Management in Vortex

This week I started working on the foundation for entity management in the Vortex Editor.

“Entities” are a new concept I’m introducing in the next version of the Vortex Engine (V3), and their design is part of a larger effort to simplify the engine’s API.

The Updated UI of the Editor.

Entities in Vortex are high level objects that have a transformation, a material, a mesh and (of course) a name.

Down the line, the plan is for entities to expose more properties; custom shading and executable code are strong candidates that come to mind. For the Vortex Editor vertical slice, however, basic properties for placing entities in 3D space and drawing them are all we need.
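
To make the concept a bit more tangible, here is a rough sketch of the shape such an object could take. This is purely illustrative; Transform, Material and Mesh are hypothetical stand-ins and the actual Vortex 3 interface may look quite different:

#include <string>

namespace vtx {

class Transform; // hypothetical stand-ins for the engine's actual types
class Material;
class Mesh;

// Illustrative sketch of the Entity concept described above.
class Entity
{
  public:
    explicit Entity( const std::string& name ) : mName( name ) {}

    const std::string& name() const { return mName; }

    // The high-level properties an entity exposes: where it sits in the
    // world, what it looks like and which geometry it draws.
    Transform* transform() const { return mTransform; }
    Material*  material()  const { return mMaterial; }
    Mesh*      mesh()      const { return mMesh; }

  private:
    std::string mName;
    Transform*  mTransform = nullptr;
    Material*   mMaterial  = nullptr;
    Mesh*       mMesh      = nullptr;
};

} // namespace vtx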

Entities are also first-class citizens in the Editor, so I reworked the Editor’s UI to add two new panels: an entity list view and a property editor. The image below shows a scene set up with two box entities. Note their names listed in the panel on the left.

Entity management in the Vortex Editor.

Because I wanted the contents of the entity list to be taken directly from the entity management component in the engine, I had to refactor the internal architecture of the Editor a bit. As things stood before, the OpenGL View was the one holding the Editor Controller and, by extension, the engine.

This created a weird situation where the entity list view’s controller would have to go through the GL View to query the existing entities. This had to be fixed.

Thankfully, because the Editor Controller was fully encapsulating the engine, it was easy to move the instance of this class upwards in the object hierarchy without introducing undesired dependencies on Vortex’s core in the UI classes.

All in all, the whole rework didn’t take as big of an effort as I had originally anticipated, and I’m pretty satisfied with the results.

The next step now is to expose entity instancing to the Editor UI! Stay tuned for more!

Designing the Editor Architecture

Last week I used the (very little) free time that I had to work on the internal architecture of the Editor and how it’s going to interact with the Vortex Engine.

In general terms, the plan is to have all UI interactions be well defined and go through a Front Controller object responsible for driving the engine. This Front Controller will be a one-stop shop for the entire implementation behind the UI, and at a later stage it will also provide finer-grained control of the engine.
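
As a rough illustration of the idea (class and method names here are hypothetical, not the Editor’s actual code), every well-defined UI interaction would funnel into a single object that is the only thing allowed to talk to the engine:

#include <string>

namespace vtx { class Engine; } // hypothetical handle to the running engine

// Illustrative sketch of the Front Controller idea: the UI never drives the
// engine directly, it only calls into this single entry point.
class EditorFrontController
{
  public:
    explicit EditorFrontController( vtx::Engine& engine ) : mEngine( engine ) {}

    // Each well-defined UI interaction becomes one method here...
    void onCreateEntity( const std::string& name );
    void onSelectEntity( unsigned int entityId );
    void onDeleteSelectedEntity();

    // ...which leaves a single place to later hang undo/redo bookkeeping
    // or to expose the same operations to a scripting API.

  private:
    vtx::Engine& mEngine;
};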

Vortex Framebuffer Object Support: a knight is rendered on a texture that is then mapped on a cube. All rendering is done on the GPU, avoiding expensive copies to RAM.

Other components I’ve been designing include an undo/redo stack (which is super important for an editor application) and a scripting API. It’s still early for both these components, but I think it’s better if the design supports these from early on as opposed to trying to tack them on to the Editor at a later stage.

Finally, last week I took the time to bootstrap a higher OpenGL version on Windows. The Editor now has access to full OpenGL on this platform. This is a significant milestone that opens the door to bringing Vortex’s advanced rendering techniques, such as Framebuffer Objects (FBOs), to Windows, as depicted in the image above.

I’ve only got a short update for this week. Stay tuned for more to come : )

Vortex Editor .plan

Not too long ago, I started working on an Editor for the Vortex Engine. I have been toying with the idea for years and I finally decided to get started, not only because it is going to be an interesting challenge, but also because I feel it’s a good way to improve the development workflow when using the engine.

A very early screenshot depicting a scene with a Box entity. No lighting, no mipmapping, no AA. “Crate” texture image courtesy of lighthouse3d.com

Using the Engine Today

Let’s take a look at the way I can build an App today with the Vortex Engine. First, I would create a new Application (be it a Linux, Mac or iOS App). Then, I would link against the engine, and finally, I would create a scene through the Vortex API manually.

Now, while this approach certainly works and even plays to one of Vortex’s strengths by allowing you to integrate the engine into any App without taking over the application loop, it does become cumbersome to create the scene programmatically.

The reason is that this process usually amounts to repeating a series of steps for every scene in the App:

  1. Start by taking a first stab in the dark.
  2. Build and run the App.
  3. Realize that you want to change the scene layout.
  4. Go back to the code, change it.
  5. Rebuild and re-run the App.
  6. Repeat from step 3 until you’re satisfied with the results.

The idea of the Editor is to tackle this problem head-on. With the Editor, you will be able to “see” the scene you are building, tweak it visually and then save it as a package that can be loaded by the engine.

Bringing Vortex to Windows

Starting a new project for the Editor raises the question of which platforms this App should run on. The Editor will be a desktop App, so ideally, it would work on all three major desktop platforms: Windows, Linux and Mac.

Now, there is no point in making a new renderer for the Editor, as we want the scene we see in it to be as close as possible to what the final user Apps will render. What this means is that the Editor needs to run the engine.

Portability has always been one of the key tenets of the Vortex Engine, so this is the perfect opportunity to bring the engine to Windows, a platform it’s never run on before.

Bringing to Windows a codebase that was born on Linux and then expanded to support Mac and iOS is the ultimate test of source-code level portability. Once finished, the end result will be a flexible codebase that also adheres more closely to the C++ standard.

So far, the two main challenges in building the engine on Windows have been adapting the codebase to build under the MSVC compiler and dealing with Windows’ barebones support for OpenGL.

Building on MSVC

Although Vortex is standard C++ and it builds with both GCC and Clang, building it with MSVC required a few changes here and there to conform better to its front end.

This also meant reconsidering some dependencies of the engine to allow for a non-POSIX environment. Thankfully, the move to C++11 has already helped replace some UNIX-specific functions with now-standard equivalents.
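
As one illustrative example of the kind of change involved (not necessarily one Vortex actually made), timing code built on the POSIX-only gettimeofday can be replaced with the portable std::chrono facilities:

#include <chrono>

// Portable elapsed-time measurement: behaves the same under GCC, Clang and
// MSVC, with no dependency on <sys/time.h>.
double elapsedSeconds( std::chrono::steady_clock::time_point start )
{
    using namespace std::chrono;
    return duration_cast<duration<double>>( steady_clock::now() - start ).count();
}

// Usage:
//   auto frameStart = std::chrono::steady_clock::now();
//   ... do work ...
//   double dt = elapsedSeconds( frameStart );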

OpenGL on Windows

Regarding OpenGL support, the windowing toolkit I’ve chosen to implement the UI in has proven to be more of a problem than a solution. At the time of writing, and mostly because I’m trying to hit a high velocity building the Editor, I haven’t taken the time to bootstrap anything beyond OpenGL 1.1 support.

This would be a problem; however, Vortex’s Dual Pipeline support, which I first described in this post back in 2011, has proven essential by allowing the engine to scale down to OpenGL 1.1.

Dual Pipeline support: a comparison of the rendering pipelines available in Vortex Engine. The image on the left represents the Fixed Pipeline; the image on the right represents the Programmable Pipeline.

The plan is to move forward with the basic Editor functionality and then drop in the programmable pipeline renderer later in the game, retiring the fixed one.

It’s quite amazing to see the fixed pipeline renderer, written about 6 years ago, running unmodified on a completely new platform it had never been tested on before. This is the true virtue of OpenGL.

In Closing

So far work is progressing nicely. As the image above shows, I have a simple proof-of-concept of the engine running inside the Editor skeleton under Windows. This is the foundation on which I will continue building the Editor App.

Stay tuned for more!

Goodbye Hypr3D

It seems the sun has set on the good old Hypr3D app. This was the first commercial app to use a licensed copy of my 3D Engine back in 2011.

Hypr3D’s help page showing the viewer’s recognized gestures.

Hypr3D rendering a reconstructed pineapple model on an iPhone.

Back in the day, the Vortex Engine was at version 1.0 and was pretty much done when the App was developed. Although this made using it almost a “plug and play” experience, there were some interesting problems that had to be solved during the App development process.

One problem in particular was how to integrate an HTML UI with the Objective-C Front Controller of the App. This was way before any of the modern “hybrid app” toolchains existed, and I remember I was on a trip to Seattle when I started to lay down the main concepts.

Goodbye Hypr3D, I’ll always remember the countless hours staring at the cookies reference model while debugging!

Hypr3D’s model viewer rendering a reconstructed model featuring a pile of rocks on an iPad.

I’ve created a dedicated page in the memory of Hypr3D with a short feature list and some more screenshots. You can visit it here.

Although Hypr3D may not be available anymore, Vortex Engine 1.0 is anything but dead. Stay tuned ;)