Using layers with settings about which SDFs interact with which layers for operations seems interesting. Like put trees in a layer and then have an axe that can negatively deform the trees but not the ground layer or something. Or a predator layer that can absorb things in the prey layer. Haven't really thought through.
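A minimal sketch of what layer-gated SDF operations could look like (all names here are hypothetical, not from the video; assumes plain Euclidean SDFs and standard CSG subtraction):

```python
import math

def sphere_sdf(p, center, radius):
    # Signed distance from point p to a sphere: negative inside, positive outside.
    return math.dist(p, center) - radius

def layered_subtract(scene, cutter, cutter_layers, p):
    """Evaluate the scene at p, subtracting `cutter` only from
    entries whose layer is in `cutter_layers`."""
    d = float("inf")
    for layer, sdf in scene:
        ds = sdf(p)
        if layer in cutter_layers:
            # CSG subtraction: max(d_shape, -d_cutter)
            ds = max(ds, -cutter(p))
        d = min(d, ds)  # union of all layers
    return d

# An "axe swing" carves the tree layer but leaves the ground untouched.
scene = [
    ("trees",  lambda p: sphere_sdf(p, (0.0, 1.0, 0.0), 1.0)),
    ("ground", lambda p: p[1]),  # half-space: everything below y=0 is solid
]
axe = lambda p: sphere_sdf(p, (0.0, 1.5, 0.0), 0.6)
d = layered_subtract(scene, axe, {"trees"}, (0.0, 1.5, 0.0))  # > 0: carved out
```

The same query point returns a negative (solid) distance if the cutter's layer set is empty, which is the whole trick: the operation's effect is scoped entirely by the layer mask.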
Really impressive work. He covers it at the end, but being able to create tunnels into terrain, walk through them and then make the terrain disappear. Or make holes in the ground and move them around to suck items in. Or dynamically erase or add to any terrain in real time. A lot of interesting gameplay opportunities here and surprisingly performant!
Go Mike! Been seeing his progress for a while on Bluesky; I knew exactly what (and who) this was when I saw it on HN.
I was rendering-curious when we overlapped together at Figma. Mike was super patient and giving with his time, answering all my dumb questions and aiding with my Maker Week projects. Excited to see him take on something so ambitious next.
Great video. A new game engine powered by SDFs is exactly the sort of thing I want to find out about, not the next game in a long-running franchise that looks the same as all the others that preceded it. One for the from-scratch game dev nerds like myself!
The physics engine mentioned towards the end, Jolt Physics [0], is used in the frankly blockbuster games Horizon: Forbidden West and Death Stranding 2, and yet opens its description with
> Why create yet another physics engine? Firstly, it has been a personal learning project.
which is really rather wonderful and inspiring to see.
[0] https://github.com/jrouwe/JoltPhysics
Its use in those games is no coincidence, though: the creator of that physics engine, Jorrit Rouwé, has worked at Guerrilla Games since the Killzone days.
https://jrouwe.nl/games.php
It has also become the default physics engine in Godot.
Also increasingly well integrated into Godot.
Dreams on PS4 had an SDF modeler, but I’m not sure if the runtime was SDF. Now that I think about it, the rendering engine had a Gaussian splat look to it years before that paper.
They discuss Dreams in the video and even explain Brick rendering.
GameGlobe from Haptico and Square Enix, the engine of which also powered Project Spark from Microsoft, also used an SDF engine. Former colleagues of mine built the tech in Copenhagen and I remember getting a super impressive demo back then. This was the first time I heard of SDFs.
I wonder if ReLU fields could help reduce cache grid resolution while improving reconstruction precision? See https://arxiv.org/abs/2205.10824
Such impressive demos and great explanations in the video. Mike, if you're reading this, keep making videos!
Dang! Very nice!
Almost every 3D game uses textured polygons for nearly everything (except sometimes fog or clouds), so an SDF engine is nice to see.
However, he doesn't mention animations, especially skeletal animations. Those tend to work poorly or not at all without polygons. PS4 Dreams, another SDF engine, also had strong limitations with regard to animation. I hope he can figure something out, though perhaps his game project doesn't need animation anyway.
I'm not super familiar with this area so I don't follow... Why is animation any more difficult? I would think you could attach the basic 3D shapes to a skeleton the same way you would with polygons.
There are lots of reasons you don’t see a lot of SDF skeletal rigging & animation in games. It’s harder because the distance evaluations get much more expensive when you attach a hierarchy of warps and transforms, and there are typically a lot of distance evaluations when doing ray-marching. This project reduces the cost by using a voxel cache, but animated stuff thwarts the caching, so you have to limit the amount of animation.

Another reason it’s more difficult to rig & animate SDFs is because you only get a limited set of shapes that have analytic distance functions, or you have primitives and blending and warping that break Lipschitz conditions in your distance field, which is a fancy way of saying it’s easy to break the SDF and there are only limited and expensive ways to fix it. SDFs are much better at representing procedural content than the kind of mesh modeling involved in character animation and rendering.
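The cost argument above can be sketched in a few lines (a toy illustration, not this project's code; translation-only "bones" for brevity, and `smooth_min` is the common polynomial smooth-minimum blend):

```python
import math

def sphere(p, r):
    # Distance from p (in the primitive's local space) to a sphere at the origin.
    return math.hypot(*p) - r

def inv_transform(p, pos):
    # Inverse of a translation-only bone transform (rotation omitted for brevity).
    return (p[0] - pos[0], p[1] - pos[1], p[2] - pos[2])

def smooth_min(a, b, k=0.3):
    # Polynomial smooth-minimum blend; k widens the blend region between shapes.
    h = max(k - abs(a - b), 0.0) / k
    return min(a, b) - h * h * k * 0.25

def skinned_sdf(p, bones):
    # Every sample point must be transformed into EACH bone's local space,
    # so cost grows with bone count -- and this runs once per ray-march step.
    d = float("inf")
    for pos, radius in bones:
        d = smooth_min(d, sphere(inv_transform(p, pos), radius))
    return d
```

Note that the blended result is already not a true distance field in the blend region (it underestimates near the seam, and non-uniform scaling or warps make it worse), which is the Lipschitz issue mentioned above: a ray-marcher either shrinks its step size conservatively or risks stepping through surfaces.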