Just wanted to share a game prototype I made a while ago to learn Unreal Engine. You are a good spaceship that shoots a bunch of evil spaceships, and you can jump from planet to planet to encounter more evil spaceships to shoot. Good elevator pitch, right?

Before this thing dies on my hard drive I figured I might as well share it on here.

The thing that took me a while to figure out was how to snap the ship to the planets. I really like Unreal Engine because it allowed me to build a closest-point-on-sphere node, which ultimately made all of this work.

Apologies for the crap audio, it sounds better on my machine.
Let me know what you think!

Been a while since I last posted anything on here but I have not stopped doing cool stuff!

Recently I saw a post by a friend that showed a distance constraint, which inspired me to try it myself. It's pretty simple: the target just stays within a sphere around a source, on the vector between them, and it's super cool when it's chained together.
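A minimal sketch of that idea in Python (the function names and the tuple-based vectors are mine, not from the original post):

```python
import math

def constrain(source, target, length):
    # keep `target` within a sphere of radius `length` around `source`,
    # sliding it back along the vector between them if it strays too far
    offset = [t - s for s, t in zip(source, target)]
    dist = math.sqrt(sum(c * c for c in offset))
    if dist <= length or dist == 0.0:
        return list(target)
    scale = length / dist
    return [s + c * scale for s, c in zip(source, offset)]

def solve_chain(points, length):
    # chaining: each point is constrained to the one before it
    out = [list(points[0])]
    for p in points[1:]:
        out.append(constrain(out[-1], p, length))
    return out
```

Running `solve_chain` every frame with the previous frame's points gives the rope-like behaviour; the springiness mentioned below would come from blending each point toward its constrained position instead of snapping it there.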

In this example below I added a little springiness to each segment. Fun experiment!

Here is some more information, including code samples, if you want to try it out yourself: https://zalo.github.io/blog/constraints/ (huge thanks to Johnathon Selstad for sharing!)

In December 2017 I left Sony Entertainment and moved to sunny California to work for the company I had always wanted to work for: Blizzard Entertainment! I work with some incredibly talented and passionate people and have honestly never felt so inspired.

Since then I have been facing a lot of awesome new challenges and have had a lot to learn. To keep the momentum going, I finally set out to learn something that has been on my to-do list for a long time:

C++! That's right, the reaaaaal stuff!

I had kinda poked around with it in the Maya environment, but I had very little understanding of the basics of the language. Since I am also mostly animation focused, I decided to go even further outside my comfort zone and get into the basics of rendering by writing a pure raytracer in C++, using no additional libraries like OpenGL. By doing so I learned a lot of things I had heard others talk about before but never fully realised what they were, such as various image filtering methods.

I decided to use Qt, but only for a simple UI and for generating the final render image, since I was already familiar with it coming from PyQt. For the most part it is almost identical, so in many cases I didn't even need to look things up in the documentation because I already knew which classes and functions I needed.

Starting out I did a lot of reading, and especially for the topic of raytracing I can strongly recommend the following links, which helped me a lot to get a good overview of the basics:

I developed this project on Mac OSX using VS Code, which I can strongly recommend because it is basically a lightweight Visual Studio and it did pretty much everything I needed it to do. It's very simple to use and has some great debugging tools (unlike with Python, we can no longer just print things and call it debugging!)

Below I just wanted to share a couple of videos that show the progress I made over a period of a month or two.

What you see here is basically just a generated QImage that is being re-rendered with new input values whenever you drag the mouse over it, for instance to move the camera or reposition the light.

And here is the final output I managed to produce. It has diffuse/spec/normal maps at a resolution of 1024x1024 on the spheres, and the plane is textured procedurally. I got the reflections kinda working (I think something in the math is wrong, but hey!). I also cheated a little on the DOF and used Qt's BlurEffect driven by my depth channel, which is not entirely correct since it bleeds edges, but it gave a pretty decent effect for almost no extra work.

My takeaway from this project is that there is never a point in time where you should stop learning new things, especially uncomfortable ones that might seem like a behemoth. Go talk to your programmers, artists and tech artists and steal all their knowledge; most of them are happy to help and generally get excited about people wanting to learn about "their world". They might even learn something in the process just by explaining it to you.

This stimulation of your mind is absolutely necessary, especially in a field where things change so much all the time.

Just a small update to share before the holidays: I added stickiness to the collision deformer.
Wishing you all a Merry Christmas and a happy new year!

Ever since I saw RBF (Radial Basis Function) interpolation I wanted to at least be able to understand it. I invested quite a lot of time into this and had it on my backlog to write about for a while. This node basically interpolates between one range of values based on another range of values.

A Python dictionary is actually quite a good way to explain the core concept. There are keys and values; for instance, let's say our keys are the numbers 1 to 5 and our values are 11 to 15.

keys:    1    2    3    4    5
values:  11   12   13   14   15

Notice how they match in length. Now how can we interpolate between the keys based on the values? For example, what is the value 13 interpolated on the keys? Quite easy, as it's literally just sliding upwards to the keys and seeing the number 3. This could work for any range; all we have to do is remap the values, right? So if the keys are 0-1 and the values are 100-200, it should look like this:

keys:    0.0    0.25   0.5    0.75   1.0
values:  100    125    150    175    200

To find the interpolated value of the key 0.375, we just remap it and should get 137.5. What we have done here mathematically is check the distance between each of the keys and remap that onto the values. This is a super simplified example based on 1D input data. The distance data is what really drives everything here, and it can be applied to any dataset, as long as we have a valid distance function.
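That remapping step as a tiny Python helper (`remap` is my name for it, not anything from the node):

```python
def remap(value, old_min, old_max, new_min, new_max):
    # normalize into 0-1 on the old range, then scale onto the new one
    t = (value - old_min) / (old_max - old_min)
    return new_min + t * (new_max - new_min)

remap(0.375, 0.0, 1.0, 100.0, 200.0)  # 137.5
remap(3.0, 1.0, 5.0, 11.0, 15.0)      # 13.0, the dictionary example above
```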

For 1D data the "distance" is basically one value subtracted from the other, but what if we want to interpolate 3D rotations based on 3D positions? As an example, we have 3 rotations and 3 positions.

rotations:  [ 90.0, 0.0, 0.0 ]   [ 0.0, 90.0, 0.0 ]   [ 0.0, 0.0, 90.0 ]
positions:  [ -1.0, 0.0, 1.0 ]   [ 1.0, 0.0, -1.0 ]   [ 1.0, 0.0, 1.0 ]

How can we find the interpolated rotation for the position [ -0.5, 0.0, 0.5 ]?

If we try to calculate this with just standard arithmetic, we find that as the input data gets increasingly complex there are in fact multiple possible solutions, not just a single one. We do want a single solution though: the best possible interpolation between any number of elements. What if we have 500 rotations and 500 positions?

To find a single solution we build a matrix of the distances between all of our keys and set up a linear equation to solve against our values. Since we have 3 positions, we create a 3x3 matrix holding the euclidean distance from each position to all the other positions.

The distance between two points in 3D is found by summing the squared differences of their components and taking the square root. In Python this may look like so:
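A minimal version, with points as plain 3-element lists:

```python
import math

def distance(p1, p2):
    # square root of the sum of squared component differences
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

distance([-1.0, 0.0, 1.0], [1.0, 0.0, -1.0])  # 2.828...
```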

For our given positions, this is what the distance matrix looks like. Note how the values on the diagonal from top left to bottom right are all 0; these represent the distances of each position to itself.

| 0.0  2.8  2.0 |
| 2.8  0.0  2.0 |
| 2.0  2.0  0.0 |

We then build another 3x3 matrix from the rotations and solve the linear equation Ax = B, where A is the matrix of position distances and B is the matrix of rotations. We are looking for x, which is essentially how much each element of A contributes to form B, and our ultimate solution.

There are many Python packages for solving linear equations, and to be completely honest I haven't fully understood the math behind it; it is actually quite complicated. If you want to read up on it, you can find plenty of information online about solving linear equations using LU decomposition.

In my case I used the mpmath library, which comes with its own matrix class, an LU decomposition solver and a couple of other useful functions.

Now we end up with the following calculation, where we have to find x:

| 0.0  2.8  2.0 |         | 90.0   0.0   0.0 |
| 2.8  0.0  2.0 | ⋅ X  =  |  0.0  90.0   0.0 |
| 2.0  2.0  0.0 |         |  0.0   0.0  90.0 |

Solving this equation will give us back a "weight matrix"

x =

| -32.1   0.0   45.0 |
|  32.1   0.0    0.0 |
|  45.0  45.0  -63.0 |

All we have to do now is get the distances from our interpolation probe point to all input positions and multiply them with their corresponding weights.

So for our interpolation point p at [ -0.5, 0.0, 0.5 ], the distances to all input positions return

[ 54.578, 32.604, 90.15 ]

Now we simply iterate through this distance list and multiply each entry by its total weight: the first item is multiplied by the sum of the first column of the weight matrix, and so on. We are then left with our ultimate solution: [ 40.22, 30.58, -30.65 ].
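The whole pipeline can be sketched end to end with NumPy instead of mpmath (a sketch only; the weights and outputs depend on the solver and the unrounded distances, so they won't match the rounded figures above exactly):

```python
import numpy as np

# the input keys (positions) and values (rotations) from the example above
positions = np.array([[-1.0, 0.0, 1.0],
                      [1.0, 0.0, -1.0],
                      [1.0, 0.0, 1.0]])
rotations = np.array([[90.0, 0.0, 0.0],
                      [0.0, 90.0, 0.0],
                      [0.0, 0.0, 90.0]])

def pairwise_distances(a, b):
    # euclidean distance from every row of a to every row of b
    return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)

# solve A x = B: distance matrix times weights equals the rotation values
A = pairwise_distances(positions, positions)
weights = np.linalg.solve(A, rotations)

def interpolate(point):
    # distances from the probe point to all inputs, weighted by the solve
    d = pairwise_distances(np.atleast_2d(point), positions)
    return (d @ weights)[0]

print(interpolate([-0.5, 0.0, 0.5]))
```

At the input positions themselves the interpolation is exact, which makes a nice sanity check for the solve.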

You can also solve these linear equations online. The distance matrix A can also be filtered through a basis function before we solve the equation, and there are plenty of functions to choose from, e.g. "linear", "gaussian" or "multiquadric", to get interesting interpolation effects for different datatypes.
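As a sketch of that basis-function idea (assuming NumPy; `eps` is my made-up name for the falloff width): apply the basis to the distance matrix before solving, and to the probe distances when interpolating.

```python
import numpy as np

def gaussian(r, eps=1.0):
    # gaussian basis: remaps raw distances into a smooth falloff
    return np.exp(-(eps * r) ** 2)

positions = np.array([[-1.0, 0.0, 1.0],
                      [1.0, 0.0, -1.0],
                      [1.0, 0.0, 1.0]])
rotations = np.eye(3) * 90.0

# distance matrix filtered through the basis function before the solve
dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
weights = np.linalg.solve(gaussian(dists), rotations)

def interpolate(point):
    d = np.linalg.norm(np.asarray(point) - positions, axis=-1)
    return gaussian(d) @ weights
```

Swapping `gaussian` for another basis changes how quickly a sample's influence falls off with distance, while the structure of the solve stays the same.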

I made a quick video to demonstrate this node, written in Python. Towards the end of the video you can see the clavicle joint rotation driving some blend shapes. Hope you enjoy!