New Nvidia RTX graphics cards announced

Chimpzy_v1legacy

Warning! Contains bananas!
Jun 21, 2009
4,789
1
0
So, Nvidia announced its new line of graphics cards yesterday, along with a September 20th release date. The cards unveiled were the RTX 2070, 2080 and 2080Ti. More info here [https://www.pcgamer.com/nvidia-rtx-2080-graphics-card-release-date/].

Good news for any PC gamers looking to upgrade, although the cards are rather pricey. Then again, the new line always is, and we'll likely see a price drop on the 10xx line, so there may be the option of picking up one of those on the cheap for budget-minded enthusiasts.
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
It feels like only yesterday when they were announcing the 1060/70/80 cards...

At this rate, can we expect them to be in the 10,000s by 2022?

2019: 4,000s
2020: 8,000s
2021: 16,000s
2022: 32,000s
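
Or in code, since the extrapolation is just doubling every year (tongue firmly in cheek, obviously not a real roadmap):

```cpp
#include <cstdio>

// Tongue-in-cheek extrapolation: the series number doubling every year,
// starting from the RTX 2000s in 2018. Not a real roadmap.
int main()
{
    int series = 2000;
    for (int year = 2019; year <= 2022; ++year) {
        series *= 2;
        std::printf("%d: %ds\n", year, series);
    }
}
```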

I was thinking of building a new rig ...
 

Chimpzy_v1legacy

Warning! Contains bananas!
Jun 21, 2009
4,789
1
0
As with current GPUs, memory prices are also expected to drop significantly next year, with the cryptocurrency craze reaching a saturation point. The only upgrade I'd really bother with anytime soon would be doubling mine.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Wanna bet that with ray tracing off, the new cards won't be over 20% faster than Pascal? With ray tracing on, Shadow of the Tomb Raider was running below 60 fps at 1080p at all times.
 

Mcgeezaks

The biggest boss
Dec 31, 2009
864
0
21
Country
Sweden
Gender
Male
The cards are insanely expensive, a 2080Ti costs $1500 in my country (Sweden) and I thought the 1080Ti I bought for $1000 was expensive. I might buy a 2080 but I'll probably wait for benchmarks to see if it's even worth it.
 

Kerg3927

New member
Jun 8, 2015
496
0
0
I just built a new 1440p/144Hz rig in June with a 1080 Ti, and I'm getting great framerates at max graphics settings in newer games.

These new cards are designed for 4K/144Hz gaming, IMO. And the new 27" 4K/144Hz monitors are frickin' $2,000. Plus I don't think 4K is really worth it on a 27" monitor anyway. Is the screen even big enough to tell much of a difference from 1440p?

So I'm going to wait a while. When they come out with a 32" 4K/144Hz monitor and it's in the $800 range, I think then I'll be ready to make the jump to 4K. Probably be a couple of years...
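
For what it's worth, the pixel-density arithmetic behind that (diagonal resolution in pixels divided by diagonal size in inches) is easy to sanity-check:

```cpp
#include <cmath>
#include <cstdio>

// Pixels per inch: diagonal resolution in pixels over diagonal inches.
double ppi(int w, int h, double inches)
{
    return std::sqrt(double(w) * w + double(h) * h) / inches;
}

int main()
{
    std::printf("27\" 1440p: %.0f ppi\n", ppi(2560, 1440, 27.0)); // ~109
    std::printf("27\" 4K:    %.0f ppi\n", ppi(3840, 2160, 27.0)); // ~163
    std::printf("32\" 4K:    %.0f ppi\n", ppi(3840, 2160, 32.0)); // ~138
}
```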
 

SupahEwok

Malapropic Homophone
Legacy
Jun 24, 2010
4,028
1,401
118
Country
Texas
Adam Jensen said:
Wanna bet that with ray tracing off, the new cards won't be over 20% faster than Pascal? With ray tracing on, Shadow of the Tomb Raider was running below 60 fps at 1080p at all times.
Yeah, that's been my impression. It looks as if these cards are a standard generational improvement; not even a large jump like the 1000s were, just a regular one. Except for this proprietary ray tracing tech. Now, ray tracing itself is hardly a new tool; it's just that through some combination of their hardware and software, Nvidia have managed to make the process massively more efficient, so a lot more can be done with ray tracing without taxing the hardware. I don't know the technical details, really; I'm guessing they've got some code libraries that can be integrated into games, and those will call specific functions from whatever hardware configuration enables this improvement.
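
If I had to guess at the shape of it: games would talk to this through Microsoft's new DirectX Raytracing (DXR) API and ask the driver whether real hardware support exists before switching anything on. A rough sketch of that capability check (standard D3D12, not Nvidia's actual integration code):

```cpp
#include <windows.h>
#include <d3d12.h>

// Rough sketch: ask an existing D3D12 device whether it supports
// DirectX Raytracing (DXR), which is what the RTX hardware accelerates.
// Needs a recent Windows 10 SDK to compile.
bool SupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false;
    // Anything below TIER_1_0 means no real hardware/driver support.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```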

So what it comes down to is that this is like the TressFX thing that Nvidia were pushing back when Witcher 3 and Tomb Raider were released: a proprietary technique that can be incorporated into development and is only usable on Nvidia cards. Except that Nvidia have increased prices by over FIFTY FUCKING PERCENT for the fucking privilege.

AMD and soon Intel are likely to figure out their own means of realising their own version of this tech in 2 to 4 years, at which point it'll become an open standard. And that's provided it catches on, which is never a guarantee even with the shiniest of new computer tech. Until then, this is extraordinarily naked price gouging, even by Nvidia's standards.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
SupahEwok said:
So what it comes down to is that this is like the TressFX thing that Nvidia were pushing back when Witcher 3 and Tomb Raider were released
Nvidia made HairWorks (which is terrible); AMD made TressFX, and AMD didn't block Nvidia users from using it or try to make it run worse on Nvidia hardware.
 

SupahEwok

Malapropic Homophone
Legacy
Jun 24, 2010
4,028
1,401
118
Country
Texas
Adam Jensen said:
SupahEwok said:
So what it comes down to is that this is like the TressFX thing that Nvidia were pushing back when Witcher 3 and Tomb Raider were released
Nvidia made HairWorks (which is terrible); AMD made TressFX, and AMD didn't block Nvidia users from using it or try to make it run worse on Nvidia hardware.
Whichever it was. Point stands. Same shit.
 

Saulkar

Regular Member
Legacy
Aug 25, 2010
3,142
2
13
Country
Canuckistan
Meh.

I am saving up, hoping to snag a used Vega Frontier Edition off eBay to replace my aging W9100, as it appears they are no longer making the former. I still need the extra VRAM for 4-8K texture painting while still getting good OpenCL performance in Houdini, but I do not need the certification, only the pro drivers for Maya.

I might get an RTX 2070 down the line, if the prices drop, to add to my render node, as the RT cores will be utilised by my rendering engine (Redshift), but that will be at least a year away. Still, that WX 8200 is mighty tempting, but the low VRAM is worrisome. Good thing I still have a few months to think things through.
 

Chimpzy_v1legacy

Warning! Contains bananas!
Jun 21, 2009
4,789
1
0
Food for thought [https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html], although his argument is really grasping at straws for most reasonable consumers. As a concession, I suppose there's always the option to sell the old card before getting the next step up, because that's the catch with technology: there's always something even better right around the corner.
 

Elfgore

Your friendly local nihilist
Legacy
Dec 6, 2010
5,655
24
13
I said I was gonna upgrade my PC when the new graphics cards came out... I don't see that happening right now. But still cool!
 

wings012

Elite Member
Legacy
Jan 7, 2011
856
307
68
Country
Malaysia
As a 3D artist, the ray tracing does somewhat interest me. And if the demo videos they've shown off are truly running in real time (I thought the Star Wars demo was pretty stunning), it could be the next step forward in making games look better. I wasn't that impressed by the BF5 comparison, but I thought the Metro Exodus demo did a good job of showing it off. There hasn't really been much development since the PBR shader/workflow shift, and asset creation doesn't get any more complicated. Being able to ditch prebaked reflection maps oughta be nice. I wonder if we can get rid of baked ambient occlusion as well.
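
On the baked AO point: ambient occlusion is basically "what fraction of random rays leaving this point escape without hitting anything", which bakers precompute per texel. Hardware ray tracing could afford to evaluate it live instead. A toy sketch of the idea (the scene query and sampler are made-up placeholders, not any engine's API):

```cpp
#include <cstdlib>

struct Vec3 { float x, y, z; };

// Made-up placeholders: a real renderer would use its own scene
// intersection test and a cosine-weighted hemisphere sampler.
bool ray_hits_geometry(const Vec3& origin, const Vec3& dir) { return std::rand() % 4 == 0; }
Vec3 random_hemisphere_dir(const Vec3& n) { return n; }  // stand-in only

// AO at point p with normal n: fraction of hemisphere rays that escape.
float ambient_occlusion(const Vec3& p, const Vec3& n, int samples = 16)
{
    int blocked = 0;
    for (int i = 0; i < samples; ++i)
        if (ray_hits_geometry(p, random_hemisphere_dir(n)))
            ++blocked;
    return 1.0f - float(blocked) / float(samples);  // 1.0 = fully open
}
```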

It's going to be a while before this sorta thing becomes widespread, though. It's going to be limited to the cutting edge of cards, and developers might not see it as worth bothering with for a while. 9/10-series cards are probably going to stay in use for a good decade.

But maybe, just maybe this is the sorta leap needed to bring forth a new generation of consoles?
 
I don

Nov 9, 2015
330
87
33
From a 3D artist's standpoint, this is actually helpful. According to OTOY, it could possibly be 8x faster for ray tracing. That's a gigantic performance increase in one generation. The best-case scenario is that it saves you from buying 8 GTX 1080 Tis and all those server-grade parts for the same performance.
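
Back-of-the-envelope, using OTOY's ~8x claim and the prices quoted earlier in the thread (roughly $1000 per 1080 Ti, $1500 for a 2080 Ti; your prices will vary):

```cpp
#include <cstdio>

// Rough cost comparison for equal offline ray tracing throughput.
// Ignores the server-grade parts an 8-GPU farm would also need,
// which only makes the single new card look better.
int main()
{
    const double speedup      = 8.0;     // OTOY's claimed RT speedup
    const double old_card_usd = 1000.0;  // 1080 Ti price from this thread
    const double new_card_usd = 1500.0;  // 2080 Ti price from this thread
    std::printf("1080 Ti farm for equal throughput: $%.0f\n",
                speedup * old_card_usd);
    std::printf("Single RTX card:                   $%.0f\n", new_card_usd);
}
```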

Now for the gamer: don't even bother. Your games will look almost exactly the same, except your framerate will be crippled if you enable any ray-traced shaders. Despite all that NVIDIA marketing, you're decades off if you expect games to look like Hollywood CGI.
 

wings012

Elite Member
Legacy
Jan 7, 2011
856
307
68
Country
Malaysia
I don said:
Now for the gamer: don't even bother. Your games will look almost exactly the same, except your framerate will be crippled if you enable any ray-traced shaders. Despite all that NVIDIA marketing, you're decades off if you expect games to look like Hollywood CGI.
At the very least you'll definitely get better reflections. The value of these is up to the individual, of course, but there's a clear difference between a low-resolution prebaked reflection map or some muggy screen-space reflection and a proper ray-traced reflection. And it's only really a thing that games with realistic graphics are concerned about.
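
To make that concrete: a ray tracer computes the mirror direction r = d - 2(d·n)n at the hit point and fires a real secondary ray along it, so the reflection can include off-screen geometry that screen-space techniques and prebaked cubemaps simply can't see. A toy sketch, not any engine's actual code:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3  operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
float dot(Vec3 a, Vec3 b)        { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror reflection of incoming direction d about unit normal n.
Vec3 reflect(Vec3 d, Vec3 n)
{
    return d - 2.0f * dot(d, n) * n;
}

int main()
{
    Vec3 r = reflect({1, -1, 0}, {0, 1, 0});  // ray bouncing off a floor
    std::printf("reflected: (%g, %g, %g)\n", r.x, r.y, r.z);  // (1, 1, 0)
}
```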

But yeah, we're definitely still a long way off. I do believe this is the next big thing, though, and I hope it becomes baseline somewhere down the line, whether that's in 5 or 10 years, and regardless of how close it ever manages to get to Hollywood CGI. Increasing screen/texture resolution and polygon counts can only go so far.