Procedural Sphere Primitive and Specular Highlights

This week I spent some time adding specular highlights to the Deferred Renderer. The results can be seen below.

A procedural sphere with specular highlights. Notice the dramatic difference that normal mapping makes when computing the highlights.

Since the box was not the best primitive for testing specular highlights, I added a sphere primitive to the engine. This primitive is also going to come in handy when we start adding point lights.

The sphere primitive includes normal and texture coordinate data, enabling shading from both the regular deferred light pass and the normal mapping pass. Both passes are shown in the image above. The difference is dramatic. Notice how normal mapping helps convey the illusion of a rough brick surface, whereas the “geometrical” normal just makes it look flat.
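
In case it's useful, here is a rough sketch of how a UV sphere's vertex data can be generated: sweep two angles and emit interleaved position, normal and texture coordinate data. The struct and function names below are illustrative, not the engine's actual API.

#include <cmath>
#include <vector>

// Illustrative interleaved vertex: position, normal, texture coordinates.
struct SphereVertex
{
    float px, py, pz;   // position
    float nx, ny, nz;   // normal
    float u, v;         // texture coordinates
};

// Generate a unit UV sphere by sweeping latitude (phi) and longitude (theta).
std::vector<SphereVertex> buildSphereVertices(unsigned rings, unsigned sectors)
{
    const float kPi = 3.14159265358979f;
    std::vector<SphereVertex> vertices;
    vertices.reserve((rings + 1) * (sectors + 1));

    for (unsigned r = 0; r <= rings; ++r)
    {
        const float phi = kPi * r / rings;                  // 0..pi, pole to pole
        for (unsigned s = 0; s <= sectors; ++s)
        {
            const float theta = 2.0f * kPi * s / sectors;   // 0..2*pi, around the axis

            SphereVertex vert;
            vert.px = std::sin(phi) * std::cos(theta);
            vert.py = std::cos(phi);
            vert.pz = std::sin(phi) * std::sin(theta);

            // For a unit sphere centered at the origin, the normal is the position.
            vert.nx = vert.px;
            vert.ny = vert.py;
            vert.nz = vert.pz;

            // Simple spherical mapping for the texture coordinates.
            vert.u = static_cast<float>(s) / sectors;
            vert.v = static_cast<float>(r) / rings;

            vertices.push_back(vert);
        }
    }
    return vertices;   // an index buffer then stitches each ring/sector quad into two triangles
}

For a unit sphere the normal is simply the normalized position, which is part of what makes it such a convenient primitive for testing lighting.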

Please don’t mind the glitches in the GIF above. Overriding the mouse cursor makes the Screen To Gif tool I use act up a little bit. I promise these are not Editor bugs ;)

Custom Computer @ SIGGRAPH

I’m back from SIGGRAPH 2017 and I wanted to share with you this amazing custom computer I saw at the expo.

A computer made out of a Tegra K1 SoC, a touch screen and a custom-built enclosure.

This computer consists of an NVIDIA Tegra K1 System-on-Chip (SoC) connected to a touch screen. It is housed in a custom-built enclosure that somewhat resembles an iMac and other “all-in-one” PCs out there.

What I find fascinating is how, using off-the-shelf components, one can build an “appliance” that showcases 3D tech and/or art for a booth.

In this case, the machine is demoing a PBR renderer built for the Qt framework. Of course, I can envision this as an interesting showcase for the Vortex Engine as well. Because the renderer can scale from a dedicated desktop GPU all the way down to an iPhone, and it already supports Linux-based systems, it would be easy to build something similar. This would also give me the opportunity to play with some hardware in my free time : )

I’m excited about this project. I will start looking into it once the renderer is more mature.

Normal Mapping 2.0

This week we switched gears back into Rendering! A lot of work went into building Normal Mapping in the new Renderer. The following image shows the dramatic results:

Normal mapping in the new Deferred Renderer.

Here, I switch back and forth between the regular Geometry Pass shader and a Normal Mapping-aware shader. Notice how Normal Mapping dramatically changes the appearance of the bricks, making them feel less like part of a flat surface and more like a real, coarse surface.

I initially discussed Normal Mapping back in 2014, so I definitely recommend you check out that post for more details on how the technique works. The biggest difference in building Normal Mapping in Vortex V3 compared to Vortex 2.0 was implementing it on top of the new Deferred Renderer.

There is more work to be done in terms of Normal Mapping, such as adding specular mapping, but I’m happy with the results so far. Next week we continue working on graphics! Stay tuned for more!

Putting it all together

This week has been a big one in terms of wrangling together several of the engine’s pillars to provide wider functionality. The image below shows how we can now dynamically run an external Lua script that modifies the 3D world on the fly:

Vortex loading and running an external script that changes the texture of an entity’s material.

In the image above, I’ve created two boxes. Each box has its own material, and each material references a different texture.

In the clip, I “mistakenly” drag a character texture from the Asset Library and assign it as the second box’s texture. Oh no! How can we fix this? It’s easy: just run an external script that assigns the first box’s texture to the second!

I’ve pasted the code of the script below:

function get_entity_material( entity )
	-- get an entity's material via its Render Component
	local RENDER_COMPONENT_TYPE = 1
	local rendercomp = entity:first_component_of_type( RENDER_COMPONENT_TYPE )
	local material = rendercomp:get_material()
	return material
end

ent0 = vtx.find_first_entity_by_name("box0")
mat0 = get_entity_material( ent0 )
tex0 = mat0:get_texture( "diffuseTex" )

ent1 = vtx.find_first_entity_by_name("box1")
mat1 = get_entity_material( ent1 )

mat1:set_texture( "diffuseTex", tex0 )

print("done")

As you can see, the script is pretty straightforward. It finds the boxes, drills all the way down to their materials and then assigns the texture of the first box to the second. The changes are immediately seen in the 3D world.

It’s worth noting that all function calls into the vtx namespace and derived objects are actually jumping into C++. The script is therefore dynamically manipulating engine objects, which is why we see its effects in the scene view.

The function names are still a work in progress, and admittedly, I need to write more scripts to see if they feel comfortable or if they’re too long and therefore hard to remember. My idea is to make the scripting interface as simple to use as possible, so if you have any suggestions, I would love to hear your feedback! Feel free to leave a comment below!

Next week I will continue working on adding more functionality to the scripting API, as well as adding more features to the renderer! Stay tuned for more!

Component Introspection

This week, work on the scripting interface continued. As the image below shows, I can now access an entity’s components and even drill down to its Material through the Lua console.

Introspecting a Render Component to access its material via the Lua interface.

The image above shows an example of the scripting interface for entities and components. Here, we are creating a new Entity from the built-in Box primitive and then finding its first component of type “1”. Type 1 is an alias for the Entity’s Render Component, which is responsible for binding a Mesh and a Material together.

Once we have the Render Component, we use it to access its Material property and print its memory address.

As the image shows, although Lua allows for very flexible duck typing, I am performing type checking behind the scenes to make scripting mistakes obvious to the user. Allow me to elaborate:

For the interface, I’ve decided that all components will be hidden behind the vtx.Component “class”. Now, this class will be responsible for exposing the interface to all native component methods, such as get_material(), set_mesh(), get_transform() and so forth.

The problem is, how do we prevent trying to access the material property of a Component that doesn’t have one, such as the Md2AnimationComponent? In my mind, there are two ways. I’m going to call them the “JavaScript” way and the “Python” way.

In the JavaScript way, we don’t really care. We allow calling any method on any component and silently fail when a mismatch is detected. We may return nil or “undefined”, but at no point are we raising an error.

In the Python way, we perform a sanity check before actually invoking the function on the component, and halt the operation when a mismatch is found. You can see this in the example above: here we’re purposefully attempting to get the material of a Base Component (component type 0), which doesn’t have one. The Engine detects the inconsistency and raises an error.

I feel the Python way is the way to go: it prevents the subtle, hard-to-debug errors that arise from letting any method be called on any component and happily carrying on in the hope of reaching some sort of result.
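
To make the idea concrete, here is a rough sketch of what the “Python way” check could look like on the native side. Only the Lua C API calls are the real thing; the engine-side names (vtx::Component, type(), getMaterial(), pushMaterialTable) and the "__native" field are illustrative stand-ins.

extern "C" {
#include <lua.h>
#include <lauxlib.h>
}

// Engine types (vtx::Component, vtx::RenderComponent, ...) come from the engine headers.
static void pushMaterialTable(lua_State* L, vtx::Material* material);   // illustrative helper, defined elsewhere

// Sketch of the "Python way": validate the component's type before servicing
// get_material(), and raise a Lua error on a mismatch.
static int l_component_get_material(lua_State* L)
{
    luaL_checktype(L, 1, LUA_TTABLE);          // "self": the component wrapper table

    lua_getfield(L, 1, "__native");            // fetch the stored C++ pointer
    auto* component = static_cast<vtx::Component*>(lua_touserdata(L, -1));
    lua_pop(L, 1);

    const int kRenderComponentType = 1;        // per the scripting interface, type 1 is the Render Component
    if (component == nullptr || component->type() != kRenderComponentType)
    {
        // Halt the operation instead of silently returning nil.
        return luaL_error(L, "get_material(): this component does not have a material");
    }

    auto* renderComp = static_cast<vtx::RenderComponent*>(component);
    pushMaterialTable(L, renderComp->getMaterial());   // wrap the material for Lua
    return 1;                                          // one return value
}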

A third alternative would have been to expose a separate “class” for every component type. This would certainly work, but I’m concerned about a potential “class explosion” as we continue to add more and more components to the Engine. Furthermore, I feel duck typing backed by runtime type checks is a good approach for a language like Lua, and well in tune with its philosophy.

Now that we can drill all the way down to an Entity’s material, it’s time to expand the interface to allow setting the shader and the material properties, allowing the script developer to control how entities are rendered by the Engine. Stay tuned for more!

The Vortex Editor turns one!

I hadn’t realized it, but the Vortex Editor turned one year old a couple of months ago. I set out to work on this project in my free time with a clear set of objectives, and it’s hard to believe that a year has already passed since the initial kick-off.

Of course, the Editor is closely tied to the Engine, which has seen its fair share of improvements throughout the year. From building an entirely new deferred renderer to replacing the node-based scene graph with a flexible, extensible Entity Component System model, enhancements have been wide and deep.

This post is a short retrospective on the accomplishments of the Vortex Editor and Vortex Engine through this last year.

Vortex Engine Achievements in the last year:

  1. Kicked off the third iteration of the Vortex Engine, codename “V3”.
  2. Upgraded the Graphics API to Core OpenGL 3.3 on Desktop and OpenGL ES 3.0 on Mobile.
  3. Implemented Deferred Rendering from scratch using MRT. Established the base for PBR rendering.
  4. New Entity Component System model, far more flexible than the old scene graph model and with support for Native and Script Components.
  5. Overhaul of several internal engine facilities, such as the Material and Texture systems.
  6. Completely redesigned engine facilities such as Lights and Postprocessing.
  7. New Lua-powered engine scripting.
  8. Ported to Windows, fixing several cross-compiler issues along the way. The engine now builds with clang, GCC and MSVC and runs on Linux, Mac, Windows, iOS and Android.
  9. Started moving codebase to Modern C++ (C++11).

Vortex Editor Achievements in the last year:

  1. Successfully kicked off the project. Built from scratch in C++.
  2. Built a comprehensive, modular UI with a context-sensitive architecture that adjusts to what you’re doing.
  3. Bootstrapped the project using Vortex Engine 2.0, then quickly moved to V3 once things were stable.
  4. Provided basic serialization/deserialization support for saving and loading projects.
  5. Implemented a Lua REPL that allows talking to the engine directly and scripting the Editor.
  6. Friendly Drag-and-Drop interface for instantiating new Entities.
  7. Complete visual control over an Entity’s position, orientation and scale in the world, as well as the configuration of its components.
  8. Allowed dynamically adding new components to entities to change their behavior.

It has been quite a ride this past year. I believe all these changes have successfully built and expanded upon the five years of work on Vortex 1.1 and 2.0. I’m excited to continue working on these projects and ultimately come up with a product that is fun to tinker with.

My objectives for year two of the Editor include scene and asset packaging, expanded scripting support and PBR rendering.

Stay tuned for more!

Binding C++ to Lua

For the last couple of weeks, a lot of work has been going into developing the scripting API that the engine exposes to Lua. Embedding a scripting language into a large C++ codebase has been a very interesting exercise, and I’ve been able to see first-hand why Lua is regarded as such a strong scripting language.

Introspection of an Entity from a Lua script running in the Console.

Lua offers a myriad of ways to develop a scripting interface for our native code.

A naïve approach would be to expose every function of the engine in the global namespace and have scripts use these directly. Although this method would certainly work, we want to offer an object-oriented API to the engine and its different components, so a more elaborate solution is required.

I ultimately decided to build the interface from scratch, following the Lua concepts of tables and metatables. The reason is that building everything myself allows me to clearly see the costs of the binding as objects are passed back and forth, which will help keep an eye on performance.

Initial Test of the Lua-C++ binding. In this example, we query and rename an Entity.

In order to keep the global namespace as clean as possible, the idea was to create a Lua table procedurally from the C++ side where all functions and types would live. Conceptually, this table is our namespace, so I named it vtx accordingly. It’s really the only global variable that the engine registers.

The next step was to start populating the vtx namespace. Two functions I knew I wanted to expose right away were Instantiate and Find:

vtx.instantiate( string )
-- Create a new entity corresponding to the identifier passed. 
-- Return the new entity created.

vtx.find_first_entity_by_name( string )
-- Find the first entity in the scene that matches the name specified.
-- Return the entity or nil if no matches are found.
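
Here is a minimal sketch of how such a namespace table can be registered from the C++ side using the Lua C API. The native function names (l_instantiate, l_find_first_entity_by_name) are illustrative; only the Lua API calls are the real thing.

extern "C" {
#include <lua.h>
}

// Native implementations living in C++ (illustrative names, defined elsewhere).
static int l_instantiate(lua_State* L);
static int l_find_first_entity_by_name(lua_State* L);

// Build the "vtx" namespace: a single global table that holds the API functions.
void registerVortexAPI(lua_State* L)
{
    lua_newtable(L);                                   // the vtx table

    lua_pushcfunction(L, l_instantiate);
    lua_setfield(L, -2, "instantiate");                // vtx.instantiate

    lua_pushcfunction(L, l_find_first_entity_by_name);
    lua_setfield(L, -2, "find_first_entity_by_name");  // vtx.find_first_entity_by_name

    lua_setglobal(L, "vtx");                           // the only global we register
}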

We now have functions. But how do we do objects? And how do we expose our vtx::Entity objects to Lua?

Let’s recap a bit. We know that entities are “engine objects”, in the sense that they live in C++ and their lifecycles are managed by the Vortex Engine. What we want is to provide a lightweight object that Lua can interact with, but when push comes to shove, the native side will be able to leverage the full C++ interface of the engine.

Lua offers the concept of a metatable that helps achieve this. Metatables can be associated with any table to provide special semantics to it. One special semantic we are interested in is the __index property, which allows implementing the Prototype design pattern.

I won’t go into the details of how the Prototype design pattern works, but suffice it to say that whenever a function is called on a table and the table does not have an implementation for it, the prototype will be responsible for servicing it.

This is exactly what we want. What we can do, then, is wrap our vtx::Entity instances in Lua tables and give all of them a common metatable that we implement on the C++ side. Even better, with this approach Lua takes care of passing the Entity table we are operating on as the first parameter to every function call, so we can use it as the “this” object for the method.
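
As a minimal sketch, setting up such a shared metatable from C++ could look like this: luaL_newmetatable stores the table in the Lua registry, and pointing __index back at the metatable itself is what turns it into the prototype for every wrapped Entity.

extern "C" {
#include <lua.h>
#include <lauxlib.h>
}

static int l_entity_set_name(lua_State* L);   // native method, shown further below

// Create the shared "vtx.Entity" metatable that acts as the prototype for
// every Entity wrapper table handed out to Lua.
void registerEntityMetatable(lua_State* L)
{
    luaL_newmetatable(L, "vtx.Entity");       // created once, stored in the registry

    // __index points back at the metatable itself, so method lookups on a
    // wrapper table fall through to the functions registered below.
    lua_pushvalue(L, -1);
    lua_setfield(L, -2, "__index");

    lua_pushcfunction(L, l_entity_set_name);
    lua_setfield(L, -2, "set_name");          // entity:set_name(...)

    lua_pop(L, 1);                            // leave the stack balanced
}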

Putting it all together, let’s walk through how entities expose the vtx::Entity::setName() function to Lua (a code sketch for the wrapping and the native function follows the list):

  1. From the native side, we create a metatable. Call it vtx.Entity.
  2. We register in this metatable a C++ function that receives a table and a string and can set the name of a native Entity. We assign it to the “set_name” property of the metatable.
  3. Whenever a script requests an Entity (instantiate, find), the function servicing the call will:
    1. Create a new table.
    2. Set the table’s metatable to vtx.Entity.
    3. Store a pointer to the C++ Entity in it.
  4. When a script invokes the Entity’s set_name function, it will trigger a lookup into the metatable’s functions.
  5. The function we registered under set_name will be called. We are now back in C++.
  6. The native function will pop from the stack a string (the new name) and the “Entity” on which the method was called.
  7. We reinterpret_cast the Entity Table’s stored pointer as a vtx::Entity pointer and call our normal setName() function, passing down the string.
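
Continuing the sketch above, steps 3 through 7 could look like this. The "__native" field name and the pushEntity helper are illustrative; the Lua C API calls and the reinterpret_cast in step 7 are as described.

extern "C" {
#include <lua.h>
#include <lauxlib.h>
}

// vtx::Entity is the engine's entity class (from the engine headers).

// Steps 3.1-3.3: wrap a native Entity in a Lua table sharing the vtx.Entity metatable.
static void pushEntity(lua_State* L, vtx::Entity* entity)
{
    lua_newtable(L);                          // 3.1 create a new table
    luaL_getmetatable(L, "vtx.Entity");
    lua_setmetatable(L, -2);                  // 3.2 set its metatable to vtx.Entity
    lua_pushlightuserdata(L, entity);
    lua_setfield(L, -2, "__native");          // 3.3 store the C++ Entity pointer in it
}

// Steps 5-7: the native function registered under set_name.
static int l_entity_set_name(lua_State* L)
{
    // Lua passes the wrapper table we were called on as the first argument.
    luaL_checktype(L, 1, LUA_TTABLE);
    const char* newName = luaL_checkstring(L, 2);   // step 6: read the new name

    lua_getfield(L, 1, "__native");                 // step 6: and the "Entity" itself
    auto* entity = reinterpret_cast<vtx::Entity*>(lua_touserdata(L, -1));
    lua_pop(L, 1);

    entity->setName(newName);                       // step 7: back in regular C++ code
    return 0;                                       // no return values
}

Note how the wrapper table stays tiny: it only carries the pointer, while all the behavior lives in the shared metatable.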

Et voilà. That is everything. The second image above shows, in the console log, how all this looks to a Lua script. At no point does the script developer need to know that the logic flow is jumping between Lua and C++ as her program executes.

We can also see in the screenshot how the Editor’s entity list picks up the name change. This shows that we are actually altering the real engine objects and not some mock Lua clone.

As I mentioned in the beginning of this post, developing a Lua binding for a large C++ codebase from scratch is a lot of fun. I will continue adding more functionality over the coming weeks and then we’re going to be ready to go back and revisit scene serialization.

Stay tuned for more!

Building the Engine Scripting API

When we left off last week, we had just implemented a Lua REPL in the Vortex Editor console. This week, I wanted to take things further by allowing Lua scripts to access the engine’s state, create new entities and modify their properties.

Scripting Interface to the Engine. A C++ cube entity is instantiated from Lua code that is evaluated on the fly in the console.

In order to get started, I added a single, simple function: vtx_instantiate(). This function is available to Lua, but its actual implementation is provided in native code, in C++. The image above shows how we can use this function to add an entity to the scene from the console.

This simple example allows us to test two important concepts: first, that we can effectively call into C++ from Lua; second, that we are able to pass parameters between the two languages. In this case, the single expected argument is a string that specifies which primitive or asset to instantiate.
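
For illustration, wiring up a single global function like this with the Lua C API can be as simple as the sketch below. The engine-side hook (instantiatePrimitive) is a stand-in for whatever Vortex actually does internally.

extern "C" {
#include <lua.h>
#include <lauxlib.h>
}

void instantiatePrimitive(const char* name);   // engine-side hook (stand-in name)

// Native implementation backing the Lua-visible vtx_instantiate(name) call.
static int l_vtx_instantiate(lua_State* L)
{
    // Single argument: which primitive or asset to instantiate, e.g. "Box".
    const char* what = luaL_checkstring(L, 1);

    instantiatePrimitive(what);   // jump into the engine
    return 0;                     // no values returned to Lua at this stage
}

// Expose it to scripts as a global function.
void registerConsoleAPI(lua_State* L)
{
    lua_register(L, "vtx_instantiate", l_vtx_instantiate);
}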

With this in place, we can now move on to building a more intricate API that enables controlling any aspect of the scene, responding to user input and even implementing an elaborate world simulation.

Best of all, because the Lua VM is embedded into the engine, scripts built against the Vortex API will by definition be portable and run on any platform the engine runs on. This includes, of course, mobile devices.

The idea now is to continue expanding the engine API, developing a rich, easy-to-use set of functions. API design should prove an interesting exercise. Stay tuned for more!

Adding a Scripting Engine to Vortex

This week I added scripting support to the Engine. I chose to go with Lua because of how easy it is to integrate into existing C/C++ codebases.

Initial integration of a Lua VM in the form of an updated Console

I’ve mentioned Lua several times before in this blog, but if you’re not familiar with it, it’s a great open-source programming language developed at the Pontifical Catholic University of Rio de Janeiro, Brazil (PUC-Rio). It’s very easy to pick up.

Here’s a 10,000-foot view of the language, courtesy of Coffeeghost:

Lua cheatsheet by coffeeghost.

I’ve been interested in adding Lua scripting to the engine for a while now. I finally decided to take the step while I was revisiting serialization and a friend suggested going directly with Lua for the manifest file instead of JSON.

Moving from a “declarative” manifest to an “imperative” one might seem strange; however, it will give me the opportunity to start fleshing out the Lua-to-Engine interface that will later serve engine-wide scripting.

I am very happy with the way things turned out. In the image above you can see how I refactored the Vortex Editor console to now support a full Lua REPL.

Powered by the Lua Engine in Vortex, the console is no longer a place where the engine just prints messages, but rather a true editor shell with a direct interface to the engine. This is similar to what some popular 3D modeling software products do with Python.
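
The core of such a REPL is surprisingly small. A sketch, assuming a persistent lua_State owned by the Editor and a logToConsole helper standing in for the console’s logging call:

extern "C" {
#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>
}

#include <string>

void logToConsole(const char* message);   // Editor-side logging call (stand-in)

// One persistent VM so variables defined in the console survive between lines.
static lua_State* s_lua = nullptr;

void initConsoleVM()
{
    s_lua = luaL_newstate();
    luaL_openlibs(s_lua);                 // standard libraries: print, math, string, ...
}

// Evaluate one line typed into the Editor console.
void evalConsoleLine(const std::string& line)
{
    if (luaL_dostring(s_lua, line.c_str()) != 0)
    {
        // On error, Lua leaves a message on the stack; surface it in the console log.
        logToConsole(lua_tostring(s_lua, -1));
        lua_pop(s_lua, 1);
    }
}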

I am excited about having Lua scripts as first class citizens in the engine. Expect to see much more Lua in this blog in the upcoming months!

Stay tuned for more!

Deferred Rendering – Part II: Handling Normals

This week work went into adding normal data to the G-Buffer and Light passes. I left normal data out while I was still implementing the architecture for the deferred renderer, so in the previous post we were only doing ambient lighting.

The Stanford Dragon 3D model rendered with a single directional light by the new deferred renderer.

Adding normal data required a few tweaks to several components of the renderer.

First, I had to extend my Vertex Buffers to accommodate an extra three floating-point values (the normal data). If you’ve been following this post series, you’ll remember from this post that we designed our Interleaved Arrays with the format XYZUVRGBA. I’ve now expanded this format to include per-vertex normals. The new packing is XYZNNNUVRGBA, where NNN denotes the normal x, y and z values.
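
As a sketch, the expanded interleaved layout and the matching OpenGL attribute setup could look like the following; the attribute indices are illustrative and must match whatever the shaders declare.

#include <cstddef>
#include <GL/glew.h>   // or the engine's own OpenGL loader header

// Interleaved vertex layout: XYZ NNN UV RGBA (12 floats per vertex).
struct InterleavedVertex
{
    float position[3];   // XYZ
    float normal[3];     // NNN
    float texCoord[2];   // UV
    float color[4];      // RGBA
};

// Describe the layout to OpenGL for the currently bound vertex buffer.
// The attribute locations (0-3) are illustrative.
void setupVertexAttributes()
{
    const GLsizei stride = sizeof(InterleavedVertex);

    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride,
                          reinterpret_cast<const void*>(offsetof(InterleavedVertex, position)));
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, stride,
                          reinterpret_cast<const void*>(offsetof(InterleavedVertex, normal)));
    glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, stride,
                          reinterpret_cast<const void*>(offsetof(InterleavedVertex, texCoord)));
    glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, stride,
                          reinterpret_cast<const void*>(offsetof(InterleavedVertex, color)));

    for (GLuint i = 0; i < 4; ++i)
    {
        glEnableVertexAttribArray(i);
    }
}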

Next, the G-Buffer had to be extended to include a third render target: a normal texture. This is where we store the interpolated normal data calculated during the Geometry pass.

Finally, we extend the Light pass shader to take in the normal texture, sample the per-fragment world-space normal and use it in its lighting calculation.

The test box rendered with a hard directional light with direction (-1,-1,0).

For testing, I used our usual textured box model and a directional light with a rotating direction, as depicted above. The box is great for these kinds of tests, as its faces are aligned with the coordinate planes and opposite faces have opposite normals.

Soon enough, the test revealed a problem: faces with negative normals were not being lit. Drawing the normal data to the framebuffer during the light pass helped narrow down the problem tremendously. It turns out the normal data sampled from the normal texture was being clamped to the [0, 1] range, which meant we could not represent normals with negative components.

Going through the OpenGL documentation for floating-point textures revealed the problem. According to the docs, if the data type of a texture ends up being defined as fixed-point, sampled data will be clamped. The problem was caused by a single incorrect parameter in the glTexImage2D calls used to create the render textures: the data type was being set to GL_UNSIGNED_BYTE for floating-point textures.

The fix was simple enough: set the data type to GL_FLOAT for floating-point textures, even if the internal format is already floating-point.
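
In code, the corrected setup for the normal render target looks roughly like this. The GL_RGB16F internal format and the attachment index are assumptions for the sketch; the important detail is passing GL_FLOAT as the data type.

#include <GL/glew.h>   // or the engine's own OpenGL loader header

// Create the floating-point normal render target and attach it to the G-Buffer.
// Assumes the G-Buffer FBO is currently bound.
GLuint createNormalRenderTarget(GLsizei width, GLsizei height)
{
    GLuint normalTexture = 0;
    glGenTextures(1, &normalTexture);
    glBindTexture(GL_TEXTURE_2D, normalTexture);

    glTexImage2D(GL_TEXTURE_2D,
                 0,            // mipmap level
                 GL_RGB16F,    // floating-point internal format (illustrative)
                 width, height,
                 0,            // border (must be 0)
                 GL_RGB,       // source format
                 GL_FLOAT,     // data type: GL_FLOAT, not GL_UNSIGNED_BYTE
                 nullptr);     // no initial data

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // Attach it as the third render target of the G-Buffer.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT2,
                           GL_TEXTURE_2D, normalTexture, 0);
    return normalTexture;
}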

Our old forktruck scene, loaded into the Editor and rendered using the first iteration of the deferred renderer.

With this fix in place, I could confirm that we are able to light a surface from any direction with our simple Light pass. The image above shows our old warehouse scene loaded into the Editor, with the forktruck being lit by the deferred renderer!

There are now several paths along which work can continue: we can extend our serialization format to include material data, we can continue improving the visuals or we can start testing our code on Mac. Stay tuned for more!