Real-Time Ray Tracing: The Visual Revolution Redefining Games

How ray-traced global illumination transformed realism in modern games

There’s a very specific moment that anyone who played Cyberpunk 2077 with ray tracing enabled probably won’t forget: walking through the streets of Night City at night, rain falling, neon lights reflecting in puddles and on building windows, and the almost unsettling feeling that it all looks too real. It’s not magic. It’s physics. It’s real-time ray tracing, and the technology has irreversibly changed what we expect from modern games.

What is ray tracing, after all?
Before diving into its practical impact, it’s worth understanding the concept. Ray tracing is a rendering technique that simulates the real behavior of light. Instead of relying on the tricks and approximations that dominated games for decades, such as precomputed lightmaps or simplified shadow projections, ray tracing fires millions of virtual light rays from the camera, calculating how each one interacts with objects in the scene: reflecting, refracting, being absorbed, and casting soft or hard shadows depending on the light source.
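To make the idea concrete, here is a deliberately minimal sketch of the first step of that process: one primary ray fired from the camera per pixel, each tested for intersection against a single sphere. All names and scene values here are illustrative, not any engine’s actual API:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    assuming 'direction' is normalized."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Fire one primary ray per pixel of a tiny 4x4 "image" toward a sphere.
width = height = 4
for y in range(height):
    for x in range(width):
        # Map the pixel to a point on a virtual image plane in [-1, 1].
        u = (x + 0.5) / width * 2.0 - 1.0
        v = (y + 0.5) / height * 2.0 - 1.0
        d = (u, v, -1.0)
        norm = math.sqrt(sum(c * c for c in d))
        d = tuple(c / norm for c in d)
        hit = intersect_sphere((0.0, 0.0, 0.0), d, (0.0, 0.0, -5.0), 1.0)
        # 'hit' would now seed shading: shadow rays, reflection rays, etc.
```

A real renderer repeats this intersection test against millions of triangles per ray, which is exactly the workload that dedicated hardware was later built to accelerate.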

The technique itself isn’t new. The film industry has used ray tracing for decades to create photorealistic images in animated movies and visual effects. The problem is that each frame in a film can take hours to render on powerful server clusters. In games, you need at least 30 frames per second. That turns the computational challenge into something entirely different.

The hardware that made it possible
The turning point came in 2018, when NVIDIA launched the RTX 20 Series based on the Turing architecture. For the first time, GPUs included dedicated cores for ray tracing calculations—RT Cores—and others for AI inference—Tensor Cores. The combination of the two is what really paved the way: RT Cores accelerate ray intersection calculations, while Tensor Cores enable DLSS, an AI upscaling technology that helps offset the performance cost of ray tracing.

The RTX 30 Series, released in 2020, marked a significant leap. With the Ampere architecture, NVIDIA introduced second-generation RT Cores with roughly double the ray/triangle intersection throughput per SM (Streaming Multiprocessor), making ray tracing far more viable at higher resolutions. Games like Control, Watch Dogs: Legion, and Metro Exodus Enhanced Edition began using ray tracing more aggressively, with fully ray-traced global illumination and reflections.

AMD wasn’t far behind. With the RDNA 2 architecture, also launched in 2020 in the RX 6000 Series and in the custom silicon of the PlayStation 5 and Xbox Series X, the company introduced its own hardware-accelerated ray tracing implementation. While initially more modest than NVIDIA’s solution, AMD has steadily improved its technology in subsequent generations.

Visual impact: what actually changes
For those who’ve never seen a side-by-side comparison, it can be hard to understand why ray tracing matters so much. The difference becomes clearer when you start noticing specific details. Take shadows, for example. With traditional techniques, shadows often have hard, unrealistic edges or are approximated in ways that look odd in certain situations. With ray tracing, a shadow cast by a diffuse light source—like a window on a cloudy day—has soft, natural edges, just like in real life.
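Those soft edges fall out of sampling: instead of a single shadow ray toward a point light, a ray tracer casts several rays toward random points on the light’s surface, and the fraction that arrive unoccluded gives the penumbra. A minimal sketch, where the `occluded` scene query is a hypothetical stand-in for a real ray cast:

```python
import random

def soft_shadow(point, light_center, light_radius, occluded, rng, n=16):
    """Estimate visibility of a square area light by casting n shadow rays
    toward jittered points on it; the average gives a soft penumbra.
    'occluded' is a caller-supplied predicate (a stand-in for a scene query)."""
    visible = 0
    for _ in range(n):
        # Jitter the target point across the light's surface (toy 2D jitter).
        target = (light_center[0] + rng.uniform(-light_radius, light_radius),
                  light_center[1] + rng.uniform(-light_radius, light_radius),
                  light_center[2])
        if not occluded(point, target):
            visible += 1
    return visible / n  # 1.0 fully lit, 0.0 fully shadowed, in between = penumbra
```

Points that can see only part of the light return fractional visibility, which is exactly the gradual falloff a window on a cloudy day produces in real life.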

Reflections are another transformative element. Before ray tracing, reflections on surfaces like mirrors, windows, and water were created using cubemaps: environment textures captured at fixed points in the scene and stored on the six faces of a cube. The result worked, but was inevitably flawed: moving objects didn’t appear in reflections or showed up with a delay, and objects very close to reflective surfaces were visibly distorted. With ray tracing, reflections are calculated in real time and accurately show what’s actually in the scene, just as you’d expect from a real mirror.
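The difference is easy to state in code: a cubemap lookup returns whatever was baked at capture time, while a ray tracer computes the actual mirror direction and follows it into the live scene. A sketch of that reflection direction, assuming normalized vectors (the names are illustrative):

```python
def reflect(d, n):
    """Mirror an incoming direction d about the surface normal n
    (both assumed normalized): r = d - 2 (d . n) n.
    A ray tracer follows r into the live scene, so the reflection
    shows whatever is actually there at that moment."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(d, n))

# A ray arriving at an angle bounces off a floor whose normal points up.
incoming = (0.6, -0.8, 0.0)
up = (0.0, 1.0, 0.0)
bounced = reflect(incoming, up)  # the x component is preserved, y flips sign
```

A cubemap renderer would instead use this same direction only to index a prebaked texture, which is why anything that moved since the capture is missing from the reflection.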

Global illumination may be the most subtle and at the same time most impactful effect. In real life, light doesn’t just come from direct sources—it bounces off walls, floors, and objects, indirectly lighting everything around it. This indirect lighting is what gives the real world a sense of continuity and visual coherence. In traditional games, this was simulated with precomputed lightmaps that didn’t respond to dynamic changes. With ray-traced global illumination, turning on a red light in a room gives everything around it a subtle reddish tint. It sounds obvious, but it simply wasn’t possible before.
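The red-light example maps directly onto how a diffuse bounce is computed: light that hits a surface is re-emitted filtered by that surface’s albedo, which is what produces color bleeding. A deliberately minimal sketch (real renderers also weight by geometry and sample many bounce directions):

```python
def bounce_tint(light_color, surface_albedo):
    """One diffuse bounce: incoming light is re-emitted filtered,
    component by component, through the surface's own albedo.
    This componentwise product is what tints nearby surfaces."""
    return tuple(l * a for l, a in zip(light_color, surface_albedo))

white_light = (1.0, 1.0, 1.0)   # RGB intensity of the light source
red_wall = (0.9, 0.1, 0.1)      # a mostly-red surface albedo
indirect = bounce_tint(white_light, red_wall)  # reddish indirect light
second_bounce = bounce_tint(indirect, red_wall)  # dimmer and even redder
```

Each successive bounce multiplies through another albedo, which is why indirect light both fades and takes on the colors of the room it has traveled through.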

Path tracing: the next level
If ray tracing is already impressive, path tracing is an even more radical evolution. While ray tracing in games is usually implemented selectively—applied only to reflections, shadows, or global illumination—path tracing replaces the entire traditional rendering pipeline. Each pixel is computed by tracing multiple light paths, accounting for all types of interactions: reflection, refraction, subsurface scattering (how light penetrates semi-transparent materials like skin or leaves), caustics (light patterns created by curved surfaces), and much more.
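Stripped of all scene detail, the path-tracing loop is simple: follow a path bounce by bounce, multiply a running throughput by each surface’s albedo, and add emission whenever the path reaches a light. A toy Monte Carlo sketch with the entire scene reduced to a single stub (the 0.2 escape probability and 0.5 albedo are made-up values for illustration):

```python
import random

def trace_path(rng, max_bounces=8):
    """One path-traced sample in a stub scene: every bounce hits a grey
    wall (albedo 0.5) until, with 20% probability, the path escapes to a
    bright 'sky' -- the only light source in this toy world."""
    radiance = 0.0
    throughput = 1.0              # product of albedos along the path so far
    for _ in range(max_bounces):
        if rng.random() < 0.2:    # path escapes and sees the sky
            radiance += throughput * 1.0  # sky emission
            break
        throughput *= 0.5         # diffuse bounce off a grey wall
    return radiance

rng = random.Random(0)
samples = [trace_path(rng) for _ in range(100_000)]
estimate = sum(samples) / len(samples)  # averaging converges on the true value
```

Individual samples are noisy, which is why path-traced games average many of them per pixel and then lean on denoisers and AI upscaling to clean up the result.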

Cyberpunk 2077 was the first AAA game to implement path tracing at this scale, with the “Overdrive” mode released in 2023. The result is striking in the best way: Night City gains a visual quality that challenges any discussion about the gap between games and photorealism. The practical issue, of course, is performance—the Overdrive mode requires high-end hardware and the help of DLSS 3 with Frame Generation to be playable.

The future of rendering in games
The path forward seems clear: ray tracing and path tracing will become increasingly accessible as hardware evolves and AI upscaling techniques—NVIDIA’s DLSS, AMD’s FSR, and Intel’s XeSS—grow more sophisticated. The challenge is no longer proving the technology works, but scaling it to run well on mid-range hardware, which represents most of the market.

There’s also important work happening on the engine side. Unreal Engine 5, with its Lumen global illumination system, offers a hybrid approach that combines ray tracing with other techniques to deliver results close to full ray tracing at a much lower performance cost. It’s this kind of innovation that will make the next generation of games feel like as big a leap as the one we saw between the PS3/Xbox 360 era and today’s hardware.

Real-time ray tracing is no longer a futuristic promise. It’s here, in today’s games—and anyone who has played with it enabled knows it’s hard to go back. More than a marketing feature, it represents a fundamental shift in how light is simulated in virtual worlds—and that shift is far from reaching its limit.
