In an attempt to ease myself into writing GPU raytracing shaders for DXR, I challenged myself to write a traditional CPU raytracer from scratch in C++ to find out how raytracing actually works.
The raytracing capabilities of the latest GPU hardware intrigued me upon release. For some people the results left a little to be desired, and they dismissed it as "just fancy reflections". Sure, I think the RTX launch was a bit underwhelming, but I still see a lot of potential when it comes to making wacky shaders - like tech artists like to do - whether it be related to volumetric rendering, GI, simulating black holes or what have you.
So with covid lockdown in full swing a couple of months ago, I sat down and thought about how best to approach the subject, not knowing much about raytracing and not having an RTX or other DXR-capable card on hand. I figured trying to write a CPU pathtracer would be my best bet.
Various stages from my first successful renders to glossy specular to triangular meshes
So I had a dig through all my programming books, which I will link at the end, and got started after naming my renderer Fridtjof - because non-Norwegians have a hard time pronouncing it, and it's the name of this absolute madman of an explorer. I'm not trying to equate my software development escapades to actual adventure, but hey, going outside was pretty much illegal at this point, so I was feeling frisky.
Since I wanted to learn as much as possible, I didn't allow any external libraries, with the exception of windows.h and others related to just getting a window showing for my application, because people's gotta have some windows. And since I wanted this to go fast, I chose C++ as my weapon of choice. This meant, however, that I spent a pretty long time writing very basic libraries, especially for math. Even render output was a struggle, and I surfed on the simplicity of BMP files for a long time. I was so excited when my first ugly, unlit, aliased sphere popped up on my screen.
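If you're wondering why BMP is the path of least resistance for render output: a 24-bit uncompressed BMP is just two small headers followed by raw BGR rows. Here's a rough sketch of a writer - illustrative code, not lifted from my renderer, and the names are my own:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Minimal 24-bit uncompressed BMP writer. `rgb` holds RGB triples with a
// top-left origin; BMP wants bottom-up rows in BGR order, padded to 4 bytes.
bool WriteBmp(const char* path, int width, int height,
              const std::vector<uint8_t>& rgb)
{
    const int rowBytes = ((width * 3 + 3) / 4) * 4;  // rows pad to 4 bytes
    const uint32_t dataSize = rowBytes * height;
    const uint32_t fileSize = 54 + dataSize;         // 14 + 40 byte headers

    uint8_t header[54] = { 'B', 'M' };
    auto put32 = [&](int off, uint32_t v) {
        header[off]     = uint8_t(v);       header[off + 1] = uint8_t(v >> 8);
        header[off + 2] = uint8_t(v >> 16); header[off + 3] = uint8_t(v >> 24);
    };
    put32(2, fileSize);
    put32(10, 54);       // offset to pixel data
    put32(14, 40);       // BITMAPINFOHEADER size
    put32(18, uint32_t(width));
    put32(22, uint32_t(height));
    header[26] = 1;      // color planes
    header[28] = 24;     // bits per pixel, compression left at 0
    put32(34, dataSize);

    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    std::fwrite(header, 1, 54, f);
    std::vector<uint8_t> row(rowBytes, 0);
    for (int y = height - 1; y >= 0; --y) {          // bottom-up rows
        for (int x = 0; x < width; ++x) {
            const uint8_t* p = &rgb[(size_t(y) * width + x) * 3];
            row[x * 3 + 0] = p[2];                   // swap to BGR
            row[x * 3 + 1] = p[1];
            row[x * 3 + 2] = p[0];
        }
        std::fwrite(row.data(), 1, rowBytes, f);
    }
    std::fclose(f);
    return true;
}
```

No compression, no dependencies - dump the framebuffer and open the file in any image viewer.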
One of the first renders with emissive objects where my sampling played nice
A couple of months of on-and-off tinkering later, I have about 7000 lines of code that sum up to a functioning, though very far from complete, pathtracer. Why would you pick a pathtracer, other than when you want stuff to render really, really slowly? Well, I figured I won't beat proper renderers on speed, but at least I could try to get my renders looking nice with GI by default. I've currently implemented rendering of basic primitives, rendering of triangle-based meshes with some crude acceleration schemes, and instancing. We've got textures - though with poor sampling support at the moment - and some basic shaders to get by with, like emissive, Phong, Lambert and a very pseudo-PBR one. There are a couple of sampling techniques implemented, though I still haven't figured out how spatially varying sampling and mapping work, so no roughness textures yet.
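To give an idea of why a pathtracer gets you GI "by default": each bounce just multiplies a running throughput by the surface albedo and shoots a new random ray, so indirect light falls out of the exact same loop as direct light. Here's a heavily stripped-down sketch of that core loop with Lambertian spheres - illustrative code under my own names, not the actual Fridtjof source:

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

struct Vec3 {
    double x = 0, y = 0, z = 0;
    Vec3 operator+(Vec3 b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator-(Vec3 b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    Vec3 operator*(Vec3 b) const { return {x * b.x, y * b.y, z * b.z}; }
};
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
Vec3 normalize(Vec3 v) { return v * (1.0 / std::sqrt(dot(v, v))); }

struct Sphere { Vec3 center; double radius; Vec3 albedo, emission; };

// Nearest positive hit of ray o + t*d against a sphere, or -1 on a miss.
// Assumes d is normalized; the epsilon avoids re-hitting the same surface.
double Intersect(const Sphere& s, Vec3 o, Vec3 d) {
    Vec3 oc = o - s.center;
    double b = dot(oc, d), c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;
    if (disc < 0) return -1;
    double q = std::sqrt(disc), t = -b - q;
    return t > 1e-4 ? t : (-b + q > 1e-4 ? -b + q : -1);
}

// Cosine-weighted hemisphere sample around normal n. With this pdf the
// Lambert BRDF's cosine term cancels, so the loop below only needs albedo.
Vec3 CosineSample(Vec3 n) {
    double u = std::rand() / (RAND_MAX + 1.0);
    double v = std::rand() / (RAND_MAX + 1.0);
    double r = std::sqrt(u), phi = 6.28318530718 * v;
    Vec3 t = std::fabs(n.x) > 0.5 ? Vec3{0, 1, 0} : Vec3{1, 0, 0};
    Vec3 b1 = normalize(cross(n, t));
    Vec3 b2 = cross(n, b1);
    return normalize(b1 * (r * std::cos(phi)) + b2 * (r * std::sin(phi))
                     + n * std::sqrt(1.0 - u));
}

// One path: emission is collected at every bounce, scaled by how much
// energy survived the bounces so far. That accumulation *is* the GI.
Vec3 Trace(const std::vector<Sphere>& scene, Vec3 o, Vec3 d, int maxBounces) {
    Vec3 radiance{0, 0, 0}, throughput{1, 1, 1};
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        double best = 1e30; const Sphere* hit = nullptr;
        for (const Sphere& s : scene) {
            double t = Intersect(s, o, d);
            if (t > 0 && t < best) { best = t; hit = &s; }
        }
        if (!hit) break;                        // escaped into a black sky
        Vec3 p = o + d * best;
        Vec3 n = normalize(p - hit->center);
        radiance = radiance + throughput * hit->emission;
        throughput = throughput * hit->albedo;  // energy lost at this bounce
        o = p; d = CosineSample(n);             // pick the next direction
    }
    return radiance;
}
```

Average many such paths per pixel and color bleeding, soft shadows and emissive lighting all show up without any extra code - that's the appeal, and also why it's so slow.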
So there's a lot to add still, and more than a couple of bugs, but overall I'm happy with my progress. The idea is to "end" the project with some art piece rendered in the renderer, and I have a good idea of what I'm gonna do, so now I just need to wrap up features. Eventually I might open up the repo on GitHub if there's enough interest, and more importantly if I feel the code is good enough to warrant anyone looking at it.
Wow, so fancy
Now, here are a couple of cool books that taught me a lot:
Loads of good tidbits on a wide variety of topics outside of rendering
I'm not good at math, need all the help I can get
Old but good for the basics; unfortunately it doesn't cover pathtracing much
And a good bunch of blogs and articles I'll try to dig up and add.