Many of you may still be in the dark. About what? I’m talking about the latest advancement in video game graphics, and that’s what this article is about.
Are you a gamer? Do you like games with eye-catching graphics? Well, then you’ll love this new technology. This is a revolutionary moment for all gamers because of the introduction of ray tracing.
Just think how it feels when you’re playing your favorite game and everything is going fine, but suddenly the game stutters while rendering a heavy scene and you die. How would you feel? Miserable, definitely. Not anymore, though, thanks to the arrival of ray tracing technology.
Ray tracing has been getting a lot of hype lately, especially around the Electronic Entertainment Expo (E3). The big companies made game announcements at the show: Microsoft, Nvidia, and AMD all talked about their upcoming releases, and they promised to bring this amazing technology into your home.
“I think it’s paradigm shifting,” says AJ Christensen, a visualization programmer at the National Center for Supercomputing Applications.
“There’s a lot of stuff that we’ve been waiting to be able to do. The imagination far precedes the technology, and I think a lot of people are excited and waiting for it.”
So, what is ray tracing? Why is there so much hype about it? And how will it be revolutionary for gamers?
Let’s talk about it and find out.
What is Ray Tracing?
In simple words, it is a technique that makes light in videogames behave the way it does in real life. Ever since it was announced, ray tracing has been one of the most talked-about things in the gaming community. As games keep getting more realistic, it has become important for companies to bring this technique to players.
How does it work?
It works by simulating actual light rays, using a mathematical algorithm to trace the path a beam of light would take in the physical world. With this technique, game designers can make rays of light appear to bounce off objects, cast realistic shadows, and create real-looking reflections.
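To make that a little more concrete, here is a minimal Python sketch of the core math: testing whether a single ray hits one sphere and shading the hit point with one light. Every name and number here is an illustrative assumption, not any engine’s real code, and real renderers do this on the GPU with far more sophistication.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None if it misses."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c              # direction is assumed to be unit length, so a = 1
    if disc < 0:
        return None                   # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, light_pos):
    """Very rough diffuse shading against one hard-coded sphere."""
    center, radius = (0.0, 0.0, -3.0), 1.0
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return (0.1, 0.1, 0.1)        # background colour
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    to_light = [l - h for l, h in zip(light_pos, hit)]
    length = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / length for x in to_light]
    brightness = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    return (brightness, brightness, brightness)

# Example: one ray pointing straight ahead, with a light up and to the right.
print(shade((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), (5.0, 5.0, 0.0)))
```

A full ray tracer repeats this kind of test for every object in the scene and follows each ray through multiple bounces, which is exactly why the technique is so expensive.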
FYI, the idea itself is not new: the 16th-century artist Albrecht Dürer described an early version of it, and the basic computer algorithm dates back to the late 1960s. The film industry has used ray tracing for years to simulate realistic lighting and shadows. But for a long time the hardware wasn’t up to the task, and even today the technology still needs a lot of work.
Tony Tamasi, Nvidia’s vice president of technical marketing, explained the challenge in a talk:
“A game needs to run 60 frames per second, or 120 frames per second, so it needs to compute each frame in 16 milliseconds,” says Tamasi.
He further added, “Whereas a typical film frame is pre-rendered, and they can take eight or 12 or 24 hours to render a single frame.”
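The arithmetic behind that 16-millisecond figure is simple: one second divided by the target frame rate. A quick back-of-the-envelope check (plain arithmetic, not a benchmark):

```python
# Milliseconds available per frame at a few common frame rates.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
# Compare that with a film frame, which can take 8 to 24 hours to render offline.
```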
Nvidia made its big ray-tracing announcement at Gamescom 2018, and people have been talking about it ever since.
The excitement around ray tracing comes down to one thing: being able to process lighting effects in real time. The new graphics chips going into the next generation of gaming computers and videogame consoles have the rendering power to produce ray-traced scenes on the fly. Just think: if all this happens, it will definitely shift the tectonic plates of the earth. Just kidding!
When is it coming?
The main question is about the arrival date. The answer: ray tracing is already here, kind of. If you own a PC that can handle it, you can use it today. It is already available in some current games, like:
- Minecraft
- Fortnite
- Battlefield V
- Metro Exodus
- Shadow of the Tomb Raider
It will also be available in upcoming games like Cyberpunk 2077 and Wolfenstein: Youngblood.
Last year, Nvidia brought ray-tracing capabilities to consumers with the release of its RTX graphics cards and the GTX 16 Series line. Your PC needs one of these to properly take advantage of the technology. Current consoles, like the Xbox One and PlayStation 4, don’t have the hardware to use it.
For those of us who can’t afford a new graphics card, ray tracing will also be supported by the next generation of game consoles, including the PlayStation 5 and Microsoft’s mysteriously named Xbox One successor, Project Scarlett.
“It’s a new tool in the toolbox,” Tamasi says.
“We have to learn to use that new tool properly. There’s going to be a whole new class of techniques that people develop.”
How is it different?
If you look at the way light works in videogames now, it might seem like all the elements are already there:
- Reflections
- Shadows
- Bloom
- Lens flare
But all of that is a bit of a trick. Developers pre-render the lighting effects and bake them into the scene: essentially a package of animations that always play out the same way. Sometimes these effects can look quite good, but they’re not dynamic.
“The problem with that is that it’s completely static,” Tamasi says.
“Unless you render in real time, the lighting is just going to be wrong.”
If the player alters the environment by—for example—blasting a hole through a wall, the light in the scene won’t change to stream through that hole unless the developers have specifically planned for that possibility. With real-time ray tracing, the light would adjust automatically.
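As a toy contrast (with purely hypothetical names and values, not any engine’s code), the difference between baked and real-time lighting looks something like this:

```python
# "Baked" lighting: the value was computed ahead of time and is simply looked up,
# so it can never react to changes in the scene.
BAKED_LIGHTMAP = {("wall", 0): 0.2, ("wall", 1): 0.25, ("floor", 0): 0.4}

def baked_lighting(surface, point):
    # Whatever happens in the game, this answer never changes.
    return BAKED_LIGHTMAP[(surface, point)]

def dynamic_lighting(surface_normal, light_direction):
    # Recomputed every frame, so if the player blasts a hole in a wall and new
    # light comes through from a different direction, the result changes immediately.
    dot = sum(n * l for n, l in zip(surface_normal, light_direction))
    return max(0.0, dot)

print(baked_lighting("wall", 0))                           # always 0.2
print(dynamic_lighting((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0, for this frame's light only
```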
How does ray tracing work?
Light in real life
First, let me tell you how light works in real life. Light reaches your eyes, and that is how you see everything around you. Light is made up of countless little particles called photons. They shoot out of a light source, bounce across and through a variety of surfaces, then smack you right in the eyeballs. Your brain then interprets all these different rays of light as one complete picture.
Light in ray tracing
Ray tracing works in nearly the same way, except everything moves in the opposite direction. Inside the software, the ray-traced light begins at the viewer and moves outward, plotting a path that bounces across multiple objects and sometimes takes on their color and reflective properties, until the software determines which light source would affect that particular ray. Simulating vision backwards like this is far easier for a computer to handle than trying to trace the rays coming from the light source.
That way, the only light paths that need to be rendered are the ones that fall within the user’s field of view. It takes far less computing power to display what’s in front of you than it would to render the rays emitted from every light source in the scene.
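As a rough illustration of that “backwards” approach, here is a minimal Python sketch that fires exactly one ray per pixel, starting at the camera. The trace() function is only a placeholder for whatever intersection and shading work a real renderer would do; the resolution and field of view are arbitrary assumptions.

```python
import math

WIDTH, HEIGHT = 320, 240
FOV = math.radians(60)                 # horizontal field of view

def trace(origin, direction):
    """Placeholder: intersect the scene and return a colour for this ray."""
    return (0.0, 0.0, 0.0)

def render():
    aspect = WIDTH / HEIGHT
    half_w = math.tan(FOV / 2)
    image = []
    for y in range(HEIGHT):
        row = []
        for x in range(WIDTH):
            # Map the pixel to a point on a virtual image plane in front of the camera.
            px = (2 * (x + 0.5) / WIDTH - 1) * half_w
            py = (1 - 2 * (y + 0.5) / HEIGHT) * half_w / aspect
            length = math.sqrt(px * px + py * py + 1)
            direction = (px / length, py / length, -1 / length)
            # Only rays inside the field of view ever get traced.
            row.append(trace((0.0, 0.0, 0.0), direction))
        image.append(row)
    return image

picture = render()                      # a WIDTH x HEIGHT grid of colours
```

Notice that the work here scales with the number of pixels on screen rather than with how many lights the scene contains, which is roughly the point about tracing from the eye.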
Developers’ thoughts on this
The NCSA’s Christensen said in an interview:
“Thousands of billions of photons enter your eye every second,”
He added, “That’s way more than the number of calculations a computer can do per second, so there’s a lot of optimizing and efficiency and hacking that needs to happen in order to even begin to make something look realistic.”
Rather than trying to trace every single ray of light, developers at Nvidia have another solution: trace only a select number of the most important rays, then use machine-learning algorithms to fill in the gaps and smooth everything out, a step called “denoising.”
Tamasi added:
“Rather than shooting hundreds or thousands of rays per pixel, we’ll actually shoot a few or maybe a few dozen,” he says.
Talking about denoising, he added, “So we use different classes of denoisers to assemble the final image.”
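To give a feel for the idea only (Nvidia’s real denoisers are machine-learning models, and this is not their code), here is a toy Python sketch: take just a handful of noisy samples per pixel, average them, then smooth the result with a simple box filter standing in for the real denoiser. The noisy_sample() function is a made-up stand-in for tracing one random ray.

```python
import random

def noisy_sample(x, y):
    """Stand-in for tracing one randomly perturbed ray through pixel (x, y)."""
    return 0.5 + random.uniform(-0.3, 0.3)

def render_noisy(width, height, samples_per_pixel=4):
    """A few samples per pixel instead of hundreds -> a fast but grainy image."""
    return [[sum(noisy_sample(x, y) for _ in range(samples_per_pixel)) / samples_per_pixel
             for x in range(width)]
            for y in range(height)]

def box_denoise(image):
    """Average each pixel with its 3x3 neighbourhood to hide the remaining noise."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighbours = [image[ny][nx]
                          for ny in range(max(0, y - 1), min(h, y + 2))
                          for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(neighbours) / len(neighbours)
    return out

smooth = box_denoise(render_noisy(64, 64))
```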
Conclusion
All of this might sound exciting, but it will still take a few years before the tech becomes standard. Real-time ray tracing is still in its early days and has proven to be a little temperamental. And as the hardware improves, developers and designers will have to keep up and do their best to come up with something amazing. To sum it up: you’ll have to wait a little longer before you can play games that make full use of ray tracing.