Building the Engine Scripting API

When we left off last week, we had just implemented a Lua REPL in the Vortex Editor console. This week, I wanted to take things further by allowing Lua scripts to access the engine's state, create new entities, and modify their properties.

Scripting Interface to the Engine. A C++ cube entity is instantiated from Lua code that is evaluated on the fly in the console.

To get started, I added a single, simple function: vtx_instantiate(). This function is available to Lua, but its actual implementation is provided in native C++ code. The image above shows how we can use this function to add an entity to the scene from the console.

This simple example lets us validate two important concepts: first, that we can effectively call into C++ from Lua; second, that we can pass parameters between the two languages. In this case, the function expects a single argument: a string specifying which primitive or asset to instantiate.
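To give a sense of how such a binding can be wired up, here is a minimal sketch using the stock Lua C API. The helper names (SpawnPrimitive, RegisterVortexAPI) are hypothetical and not the actual Vortex code:

```cpp
#include <lua.hpp>

#include <cstdio>
#include <string>

// Stand-in for the engine-side factory; the real Vortex entry point differs.
static void SpawnPrimitive(const std::string& name)
{
    std::printf("Instantiating primitive: %s\n", name.c_str());
}

// Native implementation of vtx_instantiate(name). Lua pushes its arguments
// onto the stack; we read the string argument, forward it to the engine,
// and return zero values back to Lua.
static int vtx_instantiate(lua_State* L)
{
    const char* name = luaL_checkstring(L, 1); // e.g. "cube"
    SpawnPrimitive(name);
    return 0;
}

// Called once when the console's Lua VM is created.
void RegisterVortexAPI(lua_State* L)
{
    lua_register(L, "vtx_instantiate", vtx_instantiate);
}
```

With the function registered, typing vtx_instantiate("cube") in the console is all it takes to drop a cube into the scene.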

With this in place, we can now move on to building a more intricate API that allows controlling any aspect of the scene, responding to user input, and even implementing an elaborate world simulation.

Best of all, because the Lua VM is embedded into the engine, scripts built against the Vortex API will by definition be portable and run on any platform the engine runs on. This includes, of course, mobile devices.

The idea now is to continue expanding the engine API, developing a rich, easy-to-use set of functions. API design should prove an interesting exercise. Stay tuned for more!

New Light Interface

This week, work focused on developing a completely new interface for placing and customizing lights in the scene.

New Light Component Panel in the Vortex Editor.

The history of lighting in Vortex is interesting. In Vortex 1.1, the light system leveraged the fixed-function pipeline, which meant that any single object in the scene could be lit by up to 8 different lights simultaneously.

In Vortex 2.0, the entire light system was replaced by programmable shaders. This meant that a user could define as many lights as she wanted, as long as she created a custom “Visual Effect” that implemented the lighting rig. This was very flexible, but it shifted the burden of lighting to the application.

For V3, we are changing the approach once again to support multiple dynamic light sources while, at the same time, moving most of the work back into the engine. The plan is to effectively shield the application from implementing the lighting logic and let it focus on just light placement and configuration.

In V3, lights are components that can be attached to entities. Being part of entities means that lights will be able to move around just like any other object in the scene. Being custom components gives us the flexibility to expose a rich, declarative interface for configuring the appearance of the light and how it affects the world.
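As a rough sketch of what the component side of this might look like in C++ (every type and member name here is hypothetical, not the actual Vortex API), a light component can be little more than a bag of declarative properties that the renderer reads each frame:

```cpp
// Hypothetical math type standing in for the engine's own vector class.
struct Vec3
{
    float x, y, z;
};

enum class LightType { Directional, Point, Spot };

// A light as a plain component: declarative data that the renderer gathers
// and feeds into the lighting pass. The owning entity's transform provides
// the light's position and orientation, so the light moves with the entity.
struct LightComponent
{
    LightType type      = LightType::Point;
    Vec3      color     {1.0f, 1.0f, 1.0f};
    float     intensity = 1.0f;
    float     range     = 10.0f; // only meaningful for point and spot lights
};

// Usage sketch (CreateEntity/AddComponent are hypothetical helpers):
//
//   auto  lamp  = scene.CreateEntity("Lamp");
//   auto& light = lamp.AddComponent<LightComponent>();
//   light.color     = {1.0f, 0.8f, 0.6f};
//   light.intensity = 2.0f;
```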

Under the hood, the new renderer will take care of processing all the lights in a consistent fashion, ensuring that lighting throughout the scene is uniform.

I think we are about to reach the most interesting parts of the new renderer. This is where V3 will really set itself apart from previous iterations of the engine. Stay tuned for more!

Material Editing via Shader Introspection

Last week I started working on an advanced Material Editor for Vortex. If you remember, one of the objectives of the new V3 Renderer was to completely overhaul the engine's material system so as to support advanced rendering techniques.

New Material Editor exposes the Shader Uniforms so they can be easily tuned.

Building a dedicated editor that allows changing material properties on the fly will make it much easier to tune shaders and achieve compelling visual results.

Ever since I developed the original Shader Composer project years ago, I’ve been puzzled about how one might go about providing a flexible enough means to supply the shader with meaningful values from a UI. The solution to this problem comes in the form of shader introspection.

The idea is that, once the shaders have been compiled and linked, we query the GL for the uniforms the program actually references. This lets us dynamically discover which properties can be supplied to the shader and configure its behavior at runtime, with no need to stop the Editor or App to change values and recompile.
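For reference, this is roughly what uniform introspection looks like against the raw OpenGL API. It is only a minimal sketch of the technique; the Vortex renderer wraps it behind its own abstractions, and the loader shown is just an assumption:

```cpp
#include <GL/glew.h> // any OpenGL loader works; GLEW is an assumption here

#include <string>
#include <vector>

struct UniformInfo
{
    std::string name;
    GLenum      type = 0; // e.g. GL_FLOAT, GL_FLOAT_VEC3, GL_FLOAT_MAT4, GL_SAMPLER_2D
    GLint       size = 0; // array element count; 1 for non-array uniforms
};

// Enumerate the active uniforms of a linked program, i.e. only the uniforms
// the shaders actually reference. The editor can then build one widget per entry.
std::vector<UniformInfo> EnumerateUniforms(GLuint program)
{
    GLint count = 0;
    glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);

    GLint maxNameLength = 0;
    glGetProgramiv(program, GL_ACTIVE_UNIFORM_MAX_LENGTH, &maxNameLength);

    std::vector<UniformInfo> uniforms;
    std::vector<GLchar> nameBuffer(static_cast<size_t>(maxNameLength));

    for (GLint i = 0; i < count; ++i)
    {
        UniformInfo info;
        GLsizei nameLength = 0;
        glGetActiveUniform(program, static_cast<GLuint>(i), maxNameLength,
                           &nameLength, &info.size, &info.type, nameBuffer.data());
        info.name.assign(nameBuffer.data(), static_cast<size_t>(nameLength));
        uniforms.push_back(info);
    }
    return uniforms;
}
```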

The image above shows a basic textfield-based editor that can be used to set the uniform values for the shader associated with the selected Entity. The plan moving forward is to provide specialized editors that help with entering vectors, matrices, or even image references.
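Once a uniform's name and type are known, writing a value back from the editor is straightforward. Again, this is a minimal sketch of the idea rather than the actual editor code:

```cpp
// Write a single float value back into the program, e.g. from a textfield.
// Assumes the value was already parsed and the uniform was reported as GL_FLOAT.
void SetFloatUniform(GLuint program, const std::string& name, float value)
{
    const GLint location = glGetUniformLocation(program, name.c_str());
    if (location >= 0)
    {
        glUseProgram(program);
        glUniform1f(location, value);
    }
}
```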

We’ll continue with this next week. Stay tuned for more!