It’s been a little over two weeks now since Nvidia officially revealed its brand new range of 20XX GPUs. Since then, without any actual cards to look at, we’re still left to speculate a little as to exactly how good they will be. The reveal has done very little to quell the rumours and, in fairness, it only really confirmed what we already knew – insofar as Nvidia was indeed launching a new graphics card. Finally knowing the name helped a bit, I suppose!
The key focus of the reveal (which you can check out in full here if you missed it) was the introduction of real-time RTX ray tracing technology. For those unaware, this means that, for the first time, the card’s hardware can trace the paths of light to render lighting, shadows and reflections directly, rather than games approximating those effects with traditional rasterisation tricks.
Initially, we were all blown away. Particularly with the Battlefield V presentation. Since then though, I’ve sobered up a little (metaphorically) and I’ve now got a little bit of a worry about the Nvidia RTX.
Put simply, I’m left wondering if, despite the light show, the RTX ray tracing technology might actually be remarkably limited. In terms of gaming at least!
Before you start the lynch mob, there are certain concessions I’m willing to make. Firstly, I’ll concede that with the Turing architecture, the Nvidia 20XX is likely going to be a significant step up from the 10XX range. Early indications suggest it will be somewhere around 1.5-2 times faster in terms of raw graphical power. Secondly, and rather contradictorily, there are no ‘real-world’ benchmarks out for the 20XX series yet. As such, everything which has been said so far is roughly 20% fact, 40% what Nvidia has said and 40% speculation.
Some of you might think I’ve just picked this title because it’s nice and clickbaity. In fairness, though, have I ever done that with a rant before? At eTeknix, we’ve always steered away from sensationalist titles – and particularly from producing two articles within 24 hours about the same new graphics cards that directly contradict each other. Naming no names!
Finally, I should also be clear that, despite this, I’m not deliberately looking to dump on the 20XX series. In fact, the chances are that I plan to buy one myself (wallet/kidney willing). Bear with me, though, because my theory does hold some water. It’s a theory I’m surprised no one else has seemingly pointed out to date. Well, at least not in any particular detail.
So what is my theory? Well, I’m going to go out on a limb here and say that RTX ray tracing may only work well if you run a game at 1080p. It can’t handle 1440p or better! (Braces for the angry comments!).
I guess you, like me, have really been getting into the ‘RTX On’ videos that have been coming out in the last week or so. The effects are just as impressive as they were made out to be at the Nvidia reveal. I have, however, made a rather interesting observation.
As far as I can tell, every single one of these games running with ‘RTX On’ was at only 1080p resolution (1920 x 1080) – what is generally known as Full HD.
I fully stand to be corrected on this, but to date, I haven’t seen a single gameplay example where it’s specifically noted that the game is running at a higher resolution. Not even 1440p is mentioned, let alone 4K. You might be able to point to one or two examples to the contrary, but don’t forget: just because a video is recorded in 4K does not necessarily mean the game itself was rendered at 4K.
Every single gameplay video I have seen so far has been run at 1080p. Metro Exodus, Battlefield V, Shadow of the Tomb Raider – all in Full HD. It has, therefore, led me to the following conclusion:
Nvidia RTX may only deliver effectively smooth framerates if you crank the resolution down – and if that’s the case, doesn’t it rather defeat the point?
Now, for those of you who are used to, or indeed like, 1080p resolution, this isn’t going to pose much of a problem. If you have, however, made the switch to something higher, it’s hard to go back. While RTX is clearly impressive, if it really is too heavy a workload at higher resolutions, which are you going to pick: the ray tracing or the resolution?
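To put some rough numbers on that workload jump – assuming, very simplistically, that per-frame ray-tracing cost scales linearly with the number of pixels rendered (real engines are messier than this) – a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope pixel counts: if ray-tracing cost scales roughly
# with pixels shaded, higher resolutions multiply the per-frame workload.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count as the baseline

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1080p workload)")
# 1440p works out to ~1.78x the pixels of 1080p, and 4K to exactly 4x.
```

So even under this crude linear assumption, a card that just about manages 60 FPS with RTX on at 1080p would be pushed towards four times the ray-traced work at 4K.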
Is RTX simply going to turn into the next Nvidia HairWorks? – A nice thing to have, but the easiest thing to disable to improve performance?
If you’re asking me to nail my colours to the mast, then I’m just going to come out and say it – I don’t think even an Nvidia 2080 Ti (which, incidentally, is the card used in most of those 1080p videos) will be good enough to handle most games at higher resolutions with RTX turned on.
This leaves you with three options: turn on RTX and be happy at 1080p; ignore RTX and be happy at whatever resolution you like; or turn on RTX and see how potato-ish the graphics have to get before it’ll run smoothly.
Can you find any gameplay videos with RTX on where the resolution has been higher than 1080p? Is it a choice gamers are going to have to make with the upgrade? In addition, are you planning on buying a 20XX graphics card? – Let us know in the comments!
Mike has formed a rather firm opinion here, but one that he stands to be corrected on in the future. Put simply, though, if he’s wrong, then he’s an idiot, not eTeknix.com – just so we’re clear on that point!
Did you enjoy Mike’s Rant and want to check out more? For all of his previous rants, you can check out the link here!
Which one is your favourite? – Let us know in the comments!