Since it first launched hardware-accelerated ray tracing in 2018 with the Turing series of graphics cards, NVIDIA has made the brand-new rendering technology a central focus of every succeeding GPU generation. This has led gaming journalists to believe that ray tracing is the future of gaming.
But what is it exactly, and is it even worth the massive performance penalties? Is "RTX On" just a buzzword NVIDIA uses to sell graphics cards? More importantly, can anyone even tell the difference between ray-traced graphics and conventionally rendered ones?
With NVIDIA recently bringing ray tracing support to the entry-level market with the launch of the RTX 3050 and 3050 Ti, let's dive in and find out whether paying a premium for NVIDIA RTX is worth your hard-earned money.
What is Ray Tracing, Really?
In the simplest terms possible, ray tracing simulates how light behaves in real life as it hits objects, according to WIRED. RT tech replicates this in video game engines by placing a simulated light source in a scene, then using an algorithm to "trace" the paths of light rays as they bounce off the various objects placed in the game world.
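In practice, game renderers usually run the trace in reverse, firing rays from the camera out into the scene and shading whatever they hit against the light source. Here is a minimal Python sketch of that idea, purely illustrative: the one-sphere scene, the point light, and every function name are made up for this example, and it omits the secondary shadow and reflection rays that make real ray tracing so expensive.

```python
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the sphere surface, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # 'direction' is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Hypothetical scene: one sphere and one point light.
SPHERE_CENTER, SPHERE_RADIUS = [0.0, 0.0, -3.0], 1.0
LIGHT_POS = [2.0, 2.0, 0.0]

def trace(px, py, width=64, height=64):
    """Fire one ray from the camera through pixel (px, py) and shade it."""
    # Map the pixel onto a simple view plane in front of the camera.
    direction = norm([px / width - 0.5, 0.5 - py / height, -1.0])
    t = hit_sphere([0.0, 0.0, 0.0], direction, SPHERE_CENTER, SPHERE_RADIUS)
    if t is None:
        return 0.0                                  # missed: background
    hit = [d * t for d in direction]                # camera sits at the origin
    normal = norm(sub(hit, SPHERE_CENTER))
    to_light = norm(sub(LIGHT_POS, hit))
    return max(0.0, dot(normal, to_light))          # simple Lambert shading

image = [[trace(x, y) for x in range(64)] for y in range(64)]
```

A production renderer would fire many more rays per pixel, including bounce rays for reflections and shadow rays toward each light, and that per-ray intersection math is exactly the workload NVIDIA's RT cores are built to accelerate.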
With this technique, developers can create almost lifelike reflections and shadows that are difficult to fake convincingly with traditional rendering. But if you think the idea is a recent invention, you're wrong: ray tracing was first conceived back in 1969, and it has since been employed in the film industry to create realistic simulated lighting and shadows.
Back then, it required a massive amount of processing power. It still does now, but NVIDIA has made it far more portable and accessible to the common man. Developers also gain a lot from the tech, because it lets them reproduce realistic lighting and shadow effects in real time. They no longer have to hard-code those effects into the engine, which saves a great deal of time when developing a graphically intense game.
Massive GPU Market Shift
When the RTX 20 series of graphics cards launched, Team Green was already on its way to abandoning the "GTX" naming scheme (the 16 series lingered on at the low end), a sign that it was absolutely serious about focusing on RT. All of the 20 series chips were equipped with dedicated RT cores whose only job is to accelerate ray calculations; no other GPU in the lineup had that hardware.
Pretty soon, AMD joined the ray tracing push with its RX 6000 series of GPUs, and from there, ray tracing support looks set to shape new graphics hardware for the foreseeable future.
Can Normal People Even Tell The Difference?
Linus Sebastian, namesake and founder of the massively popular Linus Tech Tips channel on YouTube, recently published a very informative video on this very topic. In it, the team asked a handful of people to distinguish between two identical scenes from a single game, each rendered differently: one with RTX on, the other with it turned off.
The results were quite interesting. Those with prior experience in 3D modelling, or who play with ray tracing settings on a daily basis, found it easier to tell the difference. Those without that context, not so much: even when told to study the scene more intently, they simply couldn't see any discrepancies between RT off and RT on.
And here's one more thing: turning on RTX is extremely demanding on hardware, even for the highest-end parts. That's why NVIDIA built a new rendering feature to ease RT's stress on its GPUs: DLSS, or Deep Learning Super Sampling. It allows RT settings to be cranked to their highest without too much of a performance penalty, and it works in a conceptually simple way.
DLSS renders the game at a lower resolution (say, 1440p instead of native 4K), then upscales the result using artificial intelligence to approximate native 4K quality, pixel by pixel. As a result, RT can be left on without the frame rate being cut essentially in half, because shading a lower-resolution image and upscaling it is far less work for the GPU than rendering every pixel at native resolution.
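The sketch below shows where those savings come from, under loose assumptions: real DLSS replaces the upscale step with a trained neural network fed with motion vectors, so the naive nearest-neighbour sampling here is only a stand-in, and all the dimensions and function names are invented for the example.

```python
def render(width, height):
    # Stand-in for the expensive part: shading every pixel (with RT on).
    # Each "pixel" here is just a placeholder RGB tuple.
    return [[(x / width, y / height, 0.5) for x in range(width)]
            for y in range(height)]

def upscale(image, out_w, out_h):
    # Cheap reconstruction: copy the nearest source pixel for each
    # output pixel. DLSS's neural network does this job far better.
    in_h, in_w = len(image), len(image[0])
    return [[image[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

# Shade only 2/3 of the pixels on each axis (the same ratio as DLSS
# "Quality" mode, e.g. 1440p -> 4K), then rebuild the full-size frame.
low_res = render(256, 144)          # tiny numbers keep the demo fast
frame = upscale(low_res, 384, 216)
```

At that two-thirds-per-axis ratio, the GPU shades only about 44 percent of the final frame's pixels, which is roughly the headroom that heavy ray tracing effects eat up.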
Final Thoughts
The fact that NVIDIA had to include DLSS as a "failsafe" shows that RTX as a whole is still a very young rendering technique. It's not perfect yet, and no card is powerful enough to run it fully in a game without upscaling trickery or a massive performance dip. Add to that the fact that ordinary people can't reliably tell the difference between RTX On and RTX Off, and ray tracing remains, for now, a luxury rather than a necessity.
This article is owned by Tech Times
Written by RJ Pierce