by ferris
As I mentioned last week, I was doing a quick Easter demo. Now I can reveal that it was an invite for Solskogen :)
Basically I didn't have enough time to finish up the tooling I wanted to do, but I still had live asset reload etc lined up, so really all I had to do for the visuals was take the raymarcher I wrote for Backscatter and make it into something that looked kinda like the website :D
The first thing I did was to remove some of the discontinuity artifacts. These are common in raymarching, and basically come from the fact that you have limited precision while marching along the ray to draw your scene, which results in screen-space discontinuities. It also isn't helped by the fact that, to reduce the number of iterations, you typically use a minimum step size. Here's the test scene I used to isolate the artifacts:
You can see the nasty banding. The fix was simple, though: once a hit is detected, march a few extra times along the ray without the minimum step size (the distance field is signed, so you'll step forwards/backwards along the ray as appropriate). This tends to remove the artifacts quite well with very few extra iterations, and it's easy to implement. Thanks to las/mercury btw for giving me some example code showing how they fix this in their intros; their method is basically the same as mine, but they get a bit more clever with how they adjust their position along the ray iteratively after hitting the surface. Since I had a limited amount of time I didn't read up on exactly why they were doing something more advanced or how it worked, but my quick implementation seemed perfectly sufficient, so it didn't matter much :)
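In CPU-side Rust rather than shader code, the idea looks roughly like this. This is a minimal sketch against a toy sphere SDF, not the actual demo code; the constants and function names are made up for illustration:

```rust
fn scene(p: [f32; 3]) -> f32 {
    // signed distance to a unit sphere at the origin
    (p[0] * p[0] + p[1] * p[1] + p[2] * p[2]).sqrt() - 1.0
}

fn point_at(origin: [f32; 3], dir: [f32; 3], t: f32) -> [f32; 3] {
    [origin[0] + dir[0] * t, origin[1] + dir[1] * t, origin[2] + dir[2] * t]
}

fn march(origin: [f32; 3], dir: [f32; 3]) -> Option<f32> {
    const MIN_STEP: f32 = 0.02; // speeds up the main loop, but causes the banding
    const EPSILON: f32 = 0.001;
    const MAX_DIST: f32 = 100.0;
    let mut t = 0.0;
    for _ in 0..128 {
        let d = scene(point_at(origin, dir, t));
        if d < EPSILON {
            // hit: take a few extra steps *without* the minimum step size;
            // the distance is signed, so this can step backwards too
            for _ in 0..4 {
                t += scene(point_at(origin, dir, t));
            }
            return Some(t);
        }
        t += d.max(MIN_STEP);
        if t > MAX_DIST {
            break;
        }
    }
    None
}
```

The refinement loop is the whole trick: the main loop can stay coarse and cheap, and only confirmed hits pay for the extra precision.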
The next thing I did was some fisheye distortion, because I thought that would look cool with some straight shapes like I had in mind:
This took a bit longer than I thought, as I had to try a bunch of stuff and think about projection to figure out the exact math involved. I think in the end I actually ended up looking it up somewhere (I can't remember where), but anyways, it worked :P
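I can't reconstruct the exact formula the demo used, but one common way to build a fisheye ray is an equidistant-style mapping, where the distance from the screen center becomes the ray's angle off the view axis. A sketch:

```rust
// `uv` is the screen position in [-1, 1] with (0, 0) at the center;
// `fov` is the half-angle (in radians) a ray at the screen edge gets.
fn fisheye_ray(uv: [f32; 2], fov: f32) -> [f32; 3] {
    let r = (uv[0] * uv[0] + uv[1] * uv[1]).sqrt();
    if r < 1e-6 {
        return [0.0, 0.0, 1.0]; // straight ahead at the exact center
    }
    let theta = r * fov; // angle grows linearly with screen radius
    let (s, c) = theta.sin_cos();
    // rotate the forward axis towards the screen direction by theta
    [uv[0] / r * s, uv[1] / r * s, c]
}
```

The returned direction is already unit length, so it can feed straight into the marcher; straight edges in the scene bow outward, which is the look described above.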
Next thing I did was to add an HDR background. Since I was trying to match the solskogen website more or less, I grabbed "Topanga Forest B" from the sIBL archive, mapped it onto a sphere I raytraced for the background, and then added some tonemapping:
On the left you can see what happens if you just take linear HDR RGB values, clamp them to [0, 1], and put them on the screen. On the right is with proper tonemapping and gamma before dumping to the screen, and yeah, it looks much more pro :D This particular tonemapping operator came from this blog post by John Hable; in particular, the optimized formula by Jim Hejl and Richard Burgess-Dawson. You can check the post for the code specifically, or dig up the final demo and check the assets folder.
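For reference, the Hejl/Burgess-Dawson fit from that post looks like this per channel. One thing worth noting: the sRGB-style gamma is baked into the curve fit, so there's no separate pow(1.0/2.2) afterwards.

```rust
// Linear HDR value in, display-ready value out; gamma is part of the fit.
fn tonemap(channel: f32) -> f32 {
    let x = (channel - 0.004).max(0.0); // small toe offset, clamped at black
    (x * (6.2 * x + 0.5)) / (x * (6.2 * x + 1.7) + 0.06)
}
```

The curve maps 0 to 0, rolls highlights off smoothly towards 1, and gives the filmic shoulder/toe that makes the clamped version on the left look flat by comparison.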
Next thing I did was start playing with reflection maps. At this point it was really a lot of tweaking over a day or so to get something that looked nice without straying too far from what I was after. One such example of something I tried:
But since I was going for something softer, I ended up using the blurred envmap version that came with the sIBL and tweaked that a TON to get stuff to look nice. Quick thanks to iq/rgba for the fbm noise and repetition stuff I was using at one point; I don't think I ended up using any of it in the final demo though, but thanks to him anyways :)
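For completeness, sampling a panoramic environment map with a ray direction is just spherical coordinates. This assumes the sIBL map is equirectangular (lat-long), which is the usual format they ship in; the exact lookup in the demo may have differed:

```rust
use std::f32::consts::PI;

// Maps a normalized direction to lat-long panorama UVs in [0, 1];
// u wraps around the horizon, v runs from the top pole to the bottom.
fn envmap_uv(dir: [f32; 3]) -> [f32; 2] {
    let u = 0.5 + dir[2].atan2(dir[0]) / (2.0 * PI);
    let v = 0.5 - dir[1].asin() / PI;
    [u, v]
}
```

The same lookup serves both uses mentioned above: the background sphere (sample with the view ray) and the reflection map (sample with the reflected ray, using the blurred version for a softer look).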
After all that was done I slapped a scroller on it, did some color correction/grading by hand and called it good. Visually, at least.
It turns out audio was a bit of a bitch, though. Rust currently doesn't have any cross-platform audio libraries available. This wasn't a huge problem, though, as we knew the target platform would be Windows anyways. So I switched to that OS for the rest of the demo.
So on Windows I went ahead and gave rodio a shot first. The first problem I had was that I couldn't get it to build. I narrowed it down to link issues while building one of its dependencies, vorbis-rs. I tried a great deal of things before I realized the only real issue: even though I had gcc and ld available on my path when using msys2, that didn't mean I could use them properly. Turns out you still need to run them from the msys2-mingw shells specifically. So that was easy, even though it took me a long time to realize that was the problem :)
Now that rodio was building, time to try to play some audio... and of course, with my luck, it crashed at runtime. This time it had to do with the fact that rodio uses cpal for playback, which uses WASAPI underneath on Windows. WASAPI operates in two modes: exclusive, where your app is the only app that can make sound but you have [almost] full control over what the format of that sound is, and shared, where your app's sounds play nicely with sounds from other apps, but you must conform to the specific sound formats and sample rates everyone else is using. And lo and behold, my system wanted cpal to use a sound format it didn't support. Dangit.
So then I thought I could just use vorbis-rs to decode the audio data and wrap DirectSound myself for playback. Since winapi-rs listed directsound-sys as a crate that has functions in it and should be usable, this seemed like it would work. So I got vorbis-rs up and running and decoding samples without a hitch, then moved on to DirectSound. As my luck would have it, the published directsound-sys crate was actually empty. A quick email exchange with the author confirmed that the latest version wasn't entirely empty, but it hardly had anything in it, and because of how complex DirectSound is, automatically generating the rest of the bindings was basically a no-go. So, onto the next thing.
This is when I considered Fmod, something I've used in the past in demos quite successfully. So I signed up as a developer to download the package (which you apparently have to do now), got fmodex (their slightly older but simpler package), and checked the library files it shipped with. Unfortunately, there was no 64-bit support for gcc on Windows, so this wouldn't work either.
Finally I went for Bass, because at least they shipped a 64-bit compatible .lib file to link against. So then I just needed bindings.
For this the plan was to run bass' header file through rust-bindgen. This sounds like a simple enough process, but of course this project doesn't build out-of-the-box with vanilla Rust, because it uses parts of llvm to actually parse the header file and extract the info it needs. So I dug into the source a bit to find out where it might expect to find an llvm installation, and this was pretty easy to find. Then I just downloaded the latest llvm for Windows, installed it to the right dir (which was the default install dir on Windows, as luck would have it) and bang, a building binding generator.
The generator worked almost perfectly (albeit pretty slowly). The only thing it didn't handle correctly was that some of the enums it generated only had one variant, which rustc rejected, so I had to go through the file by hand and add some Dummy variants to please the compiler, but that was about it.
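The workaround looked roughly like this; the names here are made up for illustration, not actual generated BASS bindings:

```rust
// A #[repr(C)] enum with only one variant was rejected by rustc at the time,
// so a throwaway variant gets added by hand to make the generated file compile.
#[repr(C)]
pub enum ExampleGeneratedEnum {
    SomeGeneratedVariant = 0,
    Dummy, // exists purely to please the compiler; never passed to the library
}
```

Since the C side never produces the Dummy value, it's harmless as long as nothing on the Rust side constructs it.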
So now the code was compiling. Time to test! I threw together a super basic test where I just try to read the lib version and init the lib and ran that. The result? Segfaults. SO MANY SEGFAULTS.
This was actually a bit strange, I thought: clearly the link didn't fail, because otherwise I would've gotten missing-symbol complaints. To make sure this was correct I even removed the lib to double-check, and indeed, missing-symbol errors. So that wasn't it.
Then I remembered calling conventions. Even if the symbols were found, the code calling them could still be wrong if it used different calling conventions than expected (or had some other such mismatch), which could very well access invalid memory at runtime and segfault exactly like I was seeing. I wasn't 100% sure this was the problem, especially after digging around a bit on x64 calling conventions (I'm only really familiar with the x86 conventions on Windows) and finding that on x64 typically everything uses just one calling convention, but I had to try something.
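As a quick illustration of what's at stake (nothing here is real BASS code, the function is made up): in Rust the calling convention is part of the extern declaration, and getting it wrong on 32-bit Windows (stdcall vs cdecl disagree on who cleans the stack) is exactly the kind of thing that segfaults at runtime rather than at link time.

```rust
// "system" means stdcall on 32-bit Windows and the plain C convention
// everywhere else; on x64 Windows they collapse into the same convention,
// which is why this turned out not to be the culprit here.
extern "system" fn made_up_add(a: i32, b: i32) -> i32 {
    a + b
}

fn call_it() -> i32 {
    // an extern fn defined in Rust is callable like any other function
    made_up_add(2, 3)
}
```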
Then I remembered that .lib files are really only meant to be used from MSVC, not necessarily gcc. While gcc can apparently still read these files (as mentioned earlier, it did link; it just generated segfault-y code), it's not gcc's preferred format. So, I decided to look for some info on converting MSVC .lib's into gcc .a's.
I ended up finding this article, which at least gave me the impression I was on the right track. In the second half it mentions that it is possible (and recommended) to convert existing .lib's to .a's, so I tried looking for tools that did that. From the article, it looked like the tool linked was only for x86 dll's, because it mentioned the __cdecl calling convention, which at this point I figured was an x86-only thing. So I skipped that and went looking around the web. This was a mistake, however; the first tool I found (no reason to link it, it was awful) produced a .a file that ld couldn't even load, so I went for the reimp tool mentioned in the article instead. Turns out this worked: I was finally able to call bass functions, and I could move on. Finally :D
So then it was pretty easy to just call the functions I needed to play audio, and all was right with the world. From then on it was really just a matter of tightening some things up, adding fades, etc. The result was Elk-tronic Arts by nazareth (a group I use for not-serious releases). For being the first Rust demoscene prod (that I know of) and done quickly, I think it's pretty nice, and it was really fun to do. It placed pretty poorly at the gathering though, but that was cool too, since I wasn't actually in attendance, and it'd be a shame for a remote entry to place ahead of prods by people who were actually there :)
There was actually another prod I did quickly as well: an executable music entry for Revision with h0ffman. We placed 2nd out of 10 or so entries, which was pretty rad :) Unfortunately I don't have a link atm, as scene.org doesn't seem to have the files available yet, but when I get it I can post it here. Naturally it was done with my 64k synth, as usual :)
So that's about it; until next time :)
Last Edited on Mon Mar 28 2016 23:54:55 GMT-0400 (EDT)
Dude! I was really looking forward to this one and it didn't disappoint, what a mission :D A really inspiring read, I need to pick your brain on some ibl stuff soon. Huge congrats on getting that all to work
on Tue Mar 29 2016 03:35:06 GMT-0400 (EDT)
Thanks man!! :D
on Mon Apr 04 2016 03:32:50 GMT-0400 (EDT)