Ray tracing this, ray tracing that. Almost everywhere you go during this current gaming generation, you're going to hear about this new technology. NVIDIA and AMD are doing it with their modern graphics cards, and Sony and Microsoft have built this advanced rendering tech into their gaming consoles for the first time.
But what is ray tracing, really, and why should you care? In this guide, you'll learn about the intricacies of this rendering technology and how it might change the gaming industry as we know it.
Ray Tracing 101
In the simplest explanation possible, ray tracing is a graphics rendering technique that makes light in games behave like it does in real life, according to WIRED.
Ray tracing works by simulating individual rays of light and using an algorithm to trace the paths they would take as they bounce around a game's scene. In practice, renderers usually trace rays backward from the camera toward the light sources, since following every ray a light actually emits would be wasteful. The result is a very realistic image that, if implemented correctly, is almost indistinguishable from real life.
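To make that a little more concrete, here is a deliberately tiny ray tracing sketch in Python. It is not how NVIDIA's or AMD's hardware actually works, and every name in it (Sphere, intersect, trace) is invented for this example. It simply shoots one ray per pixel from a virtual camera, finds the nearest sphere that ray hits, and shades that point against a single light, printing the result as rough ASCII art:

```python
# A minimal, illustrative ray tracer: one ray per pixel, one bounce, one light.
# All names here are made up for the sketch; no vendor API is being used.
import math

class Sphere:
    def __init__(self, center, radius, color):
        self.center, self.radius, self.color = center, radius, color

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    n = math.sqrt(dot(v, v))
    return (v[0]/n, v[1]/n, v[2]/n)

def intersect(origin, direction, sphere):
    """Distance along the ray to the sphere, or None if the ray misses it."""
    oc = sub(origin, sphere.center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - sphere.radius ** 2
    disc = b*b - 4*c  # direction is normalized, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(origin, direction, spheres, light_pos):
    """Follow one ray into the scene and return a shaded color."""
    hit, hit_t = None, float("inf")
    for s in spheres:
        t = intersect(origin, direction, s)
        if t is not None and t < hit_t:
            hit, hit_t = s, t
    if hit is None:
        return (0, 0, 0)  # ray flew off into the background
    point = tuple(origin[i] + direction[i]*hit_t for i in range(3))
    normal = normalize(sub(point, hit.center))
    to_light = normalize(sub(light_pos, point))
    # Simple diffuse shading: brightness depends on the angle between the
    # surface normal and the direction toward the light.
    brightness = max(0.0, dot(normal, to_light))
    return tuple(int(c * brightness) for c in hit.color)

# Render a tiny frame: one red sphere lit from the upper left.
width, height = 40, 20
spheres = [Sphere((0.0, 0.0, -3.0), 1.0, (255, 0, 0))]
light = (-5.0, 5.0, 0.0)
for y in range(height):
    row = ""
    for x in range(width):
        # Map the pixel to a ray direction through a simple pinhole camera.
        dx = (x / width - 0.5) * 2.0
        dy = (0.5 - y / height) * 2.0
        color = trace((0.0, 0.0, 0.0), normalize((dx, dy, -1.0)), spheres, light)
        row += "#" if sum(color) > 100 else "."
    print(row)
```

Real games do the same basic thing, just with millions of rays per frame, multiple bounces per ray, and dedicated hardware to keep it all fast enough to play.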
(Embedded video: a demonstration of ray tracing in action in several games that currently support it.)
NVIDIA was the first to bring this technology to video games back in 2018 when it released its first-generation RTX 2000 series graphics cards, and AMD followed suit. While the tech itself is impressive, it's still in its infancy: right now, it consumes far too much hardware power to render its effects, according to Tom's Guide.
This is one of the main reasons why ray tracing is still not available to the wider public: the hardware you'll need to enjoy its realistic lighting effects is still prohibitively expensive.
It's NOT New Tech; Not Even Close
With how NVIDIA basically jump-started the RTX ON fad, you'd think this technology is quite new, right? You'd be dead wrong. Ray tracing as a concept is actually over 50 years old, and it's been in use for decades now, just not in video games.
Ray tracing as it is used and understood today was first conceptualized in 1968 by Arthur Appel of IBM, who described tracing a ray of light from the human eye into a scene. From then on, the technology progressed rather rapidly until its first commercial use in filmmaking was described by Lucasfilm in 1984.
That computational cost is why ray tracing was primarily used in films for so long: a movie studio can afford to render each frame offline over hours, while a game has to produce one in milliseconds. And despite decades of technological advances, ray tracing as a whole hasn't changed much. It still requires an enormous amount of processing power, which modern graphics chips can only provide with dedicated ray tracing cores alongside their normal processing cores.
What Is It Going To Mean for Gamers?
For many years, photorealism has been the aim of the engineers and artists developing real-time 3D computer graphics. Reducing the amount of work needed to make even the biggest video games has been a long-standing goal as well, and ray tracing can help with both for the foreseeable future.
Here's how. With traditional rendering techniques, developers have to light a 3D-rendered scene by hand. They have to account for every camera angle, texture map, shadow map, and lighting map to make an image as realistic as possible. All of the "realistic" reflections and lighting effects you see in-game are faked.
With ray tracing, they won't have to be. It represents a massive technical achievement in the elusive hunt for photorealistic, moving 3D imagery rendered in real time, according to GamesRadar. What Hollywood studios can achieve in weeks can be achieved with RTX in a fraction of the time.
Right now, ray tracing is still early-access tech. But who knows? Maybe with the release of future hardware such as the NVIDIA RTX 4000 series, RTX will be way more widespread and accessible to all.
This article is owned by Tech Times
Written by RJ Pierce