I doubt I’ll have hardware that supports raytracing any time in the foreseeable future, but I think it looks neat at least. Glad we finally live in an era where graphics card manufacturers decided to put the engineering effort into chasing this supposed holy grail.
I like seeing raytracing dropped into games that weren’t designed for it.
This version of Mario 64 is neat:
Still obviously a work-in-progress, but the refracting water looks wild.
Not a fan of this Wolf3D conversion:
It’s just like… the floors are coated in epoxy and that’s it. Very poor sense of lighting — everything seems to be lit by an omnipresent light source, and the physical light sources in the game don’t seem to be doing much. I’d be interested in another take at this, especially because I think the idiosyncratic placement of lights in the original game would lead to some interesting atmosphere.
In contrast, this Doom one looks much better:
well-defined surface properties, lighting effects, light sources, etc. etc. Very nice showcase of the technology.
There’s this unfortunate thing in a lot of these retro projects where you need to add modern material properties (physically-based rendering with roughness, specularity maps) to get the effect of light bouncing around in a simulated manner, which drastically changes the look of the game. Quake II RTX and Minecraft RTX are the poster children, of course; materials are shiny or bumpy in ways that I never conceptualized them as, which breaks most of their intended aesthetic.
One of the best parts of Minecraft RTX is the simulation of light as it bounces into the holes you dig in the earth; it’s both darker and warmer and everything is more grounded. I have this dream that you could get these effects without the material/texture changes but it doesn’t seem like that’s true.
I feel like the DX raytracing API is probably subtle enough to accomplish that, considering some of the earliest supported games (BFV, Metro Whicheversubtitle) hewed more towards general GI and reflections than full-on affecting everything
but that’s harder to see than “holy shit this surface is having light hit it accurately” (Nvidia’s demo of the RT effects in BFV was very much in the vein of them having to rotate the camera to get a car’s windshield in center view so you could actually see the subtle reflections going on, and that’s great, and it’s definitely going to create that sense of looking-right-ness to your brain, but that was a hell of a performance hit for “I can see my face and the light of a flamethrower on a car’s glass that’s in my periphery”)
it’s not like we don’t have existing solutions for almost everything raytracing can build:
Mirror-like reflections have been done through planar reflections, doubled models, and screenspace reflections since the beginning of 3D. Screenspace reflections are great but have really obvious artifacts. Raytracing allows for a greater scope of reflections with far fewer artifacts.
Objects reacting to light has been steadily improving over time; originally, we had early cubemaps used on things like Metal Mario to represent shiny, light-reactive surfaces. In the past decade, physically-based rendering has brought a pre-calculated reflection map per ‘room’ to let surfaces of different roughness all capture a bit of the color and light. Raytracing makes this higher-fidelity (as you zoom in, you get more detail!) and realtime-reactive.
Shadows have been present forever, too, either baked into the level geometry or rendered in realtime, with artifacts like pixelated or noisy edges, shadows casting through floors, and low fidelity. Raytracing allows for clean shadows that pick up a ton of fine detail and are realtime-reactive without pre-setup.
Light propagation has been steadily improving. From baked lighting to area lights, we’ve recently gotten global illumination, which allows light to ‘bounce’ off surfaces and adjust the color. Think about how orange the light looks bouncing off cliffs in Monster Hunter: World or Red Dead Redemption 2, or the light volumes in Mirror’s Edge. In the past few years, we’ve gone from doing it in baked lighting to calculating it sparingly and in realtime over volumes (1 m³ or smaller). Raytracing can do this at a much higher resolution.
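For anyone curious what’s actually happening under all these effects, here’s a toy sketch in Python (the scene and function names are made up, not from any of the projects mentioned above) of the two primitive operations everything in the list reduces to: intersecting a ray with geometry, and firing a shadow ray from a surface point toward a light.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the sphere, or None on a miss.
    Assumes `direction` is normalized, so the quadratic's a-term is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersections
    return t if t > 0 else None

def in_shadow(point, light_pos, occluders):
    """Shadow ray: does any occluding sphere sit between point and light?"""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    d = [v / dist for v in to_light]
    for center, radius in occluders:
        t = hit_sphere(point, d, center, radius)
        if t is not None and t < dist:
            return True  # something blocks the light before it reaches us
    return False
```

Reflections are the same intersection test fired along the mirrored view direction, and GI is the same thing again fired in random directions off the surface and averaged; real renderers just add acceleration structures, denoising, and BRDF sampling on top of these two calls.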
So…you turn these features on and you get games that look much like games already do, just finer, cleaner, and combining effects you wouldn’t normally see together.
This is the best user-facing guide I’ve seen:
And it’s all subtle! I had trouble defining the difference when flipping it on and off because there are replacements for almost all these effects, but everything is more solid, more placed, and less ‘gamey’ looking in a way that’s really irresistible after you’re used to it (art style permitting).
I’m not sure it makes any actual sense, but it could be cool to see Tron 2.0 filled with ray traced light beams. Turn the black into void space instead of simply a contrast color and I think it could make for a good cyber texture.
The Doom one is using a beta version of a ReShade shader, so it’s very portable/customizable.
I used it on a recent replay of Jedi Knight 2 and it looked quite good the whole way through; I kept it subtle and much less reflective than that Doom example.
I have to admit that, properly applied, it can improve older games that are somewhat ugly by default, like Quake 2 and Jedi Knight 2/3.
It would ruin games with carefully considered aesthetics, so I’m not super hot on that Ultimate Doom vid.
Actually, someone made a video of Jedi Academy with RTGI
Their settings are a bit more in-your-face than I would like; I kept roughness at a much higher level.
Couldn’t really tell a difference between a 670 at 1200p and a 3080 at 1440p with everything turned on in Control. Variable frame rates, though; that should be a constitutionally protected human right, even though it’s something that quickly just becomes normal and unnoticeable.
Surprisingly, the raytracing shader looks good in games with pre-baked lighting once it’s set up. Jedi Knight 2 and Jedi Academy had baked-in shadows, and it consistently looked better than without RTGI, even if I couldn’t put my finger on why.
Oh nuts. I didn’t save my VoD from when I streamed Black Mesa with all of the ReShade plugins and settings used to make that Black Mesa ray-traced test video.
Playing it that way was pretty wild, save for some weird artifacts that resulted from how the Source Engine organizes its depth buffers.
However, what made the biggest impression on me while playing was leaving the dynamic depth-of-field on. I want more games to have really active depth of field. I doubt it would be for everyone but I thought it was quite amazing.
If I do another replay of a ReShade-empowered Black Mesa I’ll consider slamming the VoD(s) onto YouTube when I’m done with it. Alas, it’ll only be 1080p because that’s all I’ve got.
I haven’t really gotten around to figuring out the nuances of ReShade and its myriad plugins.
So in Control with an RTX card I noticed things like realistic reflections on matte metal surfaces you normally wouldn’t even consider trying to look for (like elevator walls) and very realistic reflections of the world in eyes. Is that RTX at work?