Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
It's on par (a little below, actually) with the 3070, yes. That's still A LOT of graphics card. It's absurd enough that it wasn't a part I felt the need to upgrade once I got it.

After doing a little bit of digging on that Matrix Awakens demo, there is apparently a single thread in Unreal Engine 5 that bottlenecks everyone from pushing past a certain threshold of detail and framerate. Even the people with 12th-gen i9s (usually the top of single-thread performance in x86) and RTX 3090s aren't ever seeing much over 45fps at 4K on the highest settings. Turning off the ray-traced lighting detail fixes all that, but also makes the demo look more or less like any other game set in a city released in the past eight or nine years. Since there's no way on Windows to bind a thread to a particular core without source code access, it's not something we can really monitor or control, even though GPU utilization seldom goes over 70% on high-end cards and all-core CPU activity rarely goes over 40%.
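
For what it's worth, with engine source access pinning a thread really is a one-line Win32 call; here's a minimal C++ sketch (the core index is just a placeholder, not anything UE5 actually uses):

#include <windows.h>

int main() {
    // Pin the calling thread to core 2; with engine source you'd do this on the
    // bottlenecking worker thread instead. Returns the previous mask, or 0 on failure.
    DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), 1ull << 2);
    return previous != 0 ? 0 : 1;
}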

However, for purposes of comparison, the Matrix Awakens demo isn't a rock-solid 30fps at 4k on a PS5 or whatever the biggest Xbox is, either.

This is the demo in question. If you jump to about the three-minute mark and skip past the bit with the movie stars, you'll see the parts they let our cards render.

 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,456
Location
USA

Even though a 2080 Super is a lot of GPU, it still wasn't nearly enough for me to game at 4K with decent frame rates at or above 60fps. Only once I got a 3080 did I stop noticing frame drops in 4K gaming at 60fps and higher.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
That's about what I thought. I have a 3060 Ti since it was available last year without a crazy price, but it's more of a card for 2560x1440 than 4K. Wouldn't brand-new software be expected to need stronger cards than the 20 series?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
I fully expect that they will.

To be clear, I think there are probably exactly zero games I've played that don't run just fine (i.e. no dips below 30fps) at 4K on a 2080. Cyberpunk 2077 and Battletech are, I think, the two newest games I own, and the genres I actually play, RPGs and strategy games, aren't the first sorts of games people associate with needing high frame rates, either.

At this point, I'm more concerned with getting better performance for Resolve Studio and Capture One than what happens with, say, Baldur's Gate 3.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
Video cards are more freely available now. You could get a 3080 or 3080 Ti. :)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
At some point in the near future, I'll weigh a Lovelace card vs. whatever RDNA3 card vs. a 3080 Ti. I think the world is about to be absolutely flooded with GTX 1660s, which makes me think that sticking a second GPU in my PC for the Topaz software might get me farther than one big GPU.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,456
Location
USA
There will likely be 4000-series GPU announcements in the near future as well, which could bring prices down on the 3000 series.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
The leaks on Lovelace so far suggest that the mid-class GPUs and up are going to be massive power hogs and the top-end will have maximum draw in the 600W range. AMD's next GPUs are efficient chiplets with a fast interconnect, and there's no sign that its products will need those 12-pin power connectors yet.

There's an awful lot of software where GPU optimization is either nVidia-exclusive or where AMD support lags significantly, and I have very bad memories of power-hungry nVidia cards as it is, so next-gen hardware gives me a lot to consider.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,456
Location
USA
I already loathe the power consumption, heat, and noise of my 3080 FE. I wouldn't want a 600W monster in my system any time soon. If AMD can step up their game, that'll be fantastic.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
The increase in power can't be more than the increase in performance, can it? That would be bonkers.
For example, I could understand 40% more power for double the performance, but ~300W would be the most I could handle.
As it is, I have to decide which equipment I can leave on without overloading the UPS.

I read somewhere that nVidia cards were really good for Adobe and other GPU-accelerated video and image processing software. Is that not correct?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
I don't use Adobe-anything myself, so I'm not completely sure. Resolve Studio performance scales not-quite-linearly with GPU performance based on available cores, and since AMD doesn't make anything that competes with xx80- or xx90-series hardware, that establishes the high end.

On the workstation side, there's nothing as extreme as what's in gamer-land, and AFAIK the RTX A-series Quadro cards didn't get a 3rd-generation RTX refresh either, so you have to pick between the fastest possible cores with consumer hardware and RAM amounts, or insane amounts of VRAM on workstation cards that aren't quite as fast.

The best guess for content-creation tools is to look at FP32-based benchmarks. There are different benchmarks out there, but usually they're presented in terms of GFLOPS. The rumor is that the 4090 might be 2.5x the speed of the 3090, which would make the power draw make a certain amount of sense.
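
As a rough sanity check on that rumor, theoretical FP32 throughput is just shader count × 2 ops per clock (FMA) × clock speed. A quick C++ sketch using the 3090's published specs and a purely hypothetical 2.5x part:

#include <cstdio>

// Theoretical FP32 throughput in GFLOPS: cores * 2 FMA ops/clock * boost clock in GHz.
double fp32_gflops(int shader_cores, double boost_ghz) {
    return shader_cores * 2.0 * boost_ghz;
}

int main() {
    double rtx3090 = fp32_gflops(10496, 1.70);               // roughly 35,700 GFLOPS
    std::printf("3090: ~%.0f GFLOPS\n", rtx3090);
    std::printf("2.5x part: ~%.0f GFLOPS\n", rtx3090 * 2.5); // hypothetical next-gen figure
}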
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
Everything is focused on the high end; below that, people are just extrapolating from the traditional placement of the lower-tier SKUs. They're saying the fastest GPUs will at least double the 3090, and the mid tier (4070-ish) will be somewhere between 10 and 30% faster than a 3090 for most purposes.

There's a website called mooreslawisdead.com that a lot of hardware sites suggest is a solid, reliable source.

I think I am in the market for a new card and based on that information, I'd probably want a 4070 as well.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
I don't do the U-Tubes or the socialism medias. The problem is that most of them require you to log in, but I want some plausible deniability, so I refuse to have an account and be involved.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
Pretty sure you can watch Youtubes in a private window. I don't have social media either, and while I do have Youtube content, it's all unlisted videos I've shared with maybe five people. In any case, I'm summarizing the chatter that's out there as I see it.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
Current chatter in the GPU rumorsphere holds that Ryzen 7000 APUs will be approximately on par with a 3060 Mobile part. If that's the case, that's a big step forward for integrated graphics.

I was able to go to my local Best Buy and get an RTX 3070 (not-Ti) for MSRP today. It's getting used in a build for my roommate but I told her to expect to pay $200 more than what it actually cost. I'm in mild shock to find the cards are available locally.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,456
Location
USA
I can't imagine my GPU pumping out 600W. I'm already annoyed with the 350W TDP of my 3080.

Wccftech: NVIDIA GeForce RTX 4090 Gets 24 GB GDDR6X Memory at 21 Gbps & 600W TDP, RTX 4070 Gets 12 GB GDDR6 Memory at 18 Gbps & 300W TDP.

 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
I cannot imagine how that works in a laptop. The 12th-gen CPUs are power-hungry enough.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
It should probably be pointed out that pretty much nothing in the gaming world is going to fully use all the horsepower those 600W+ cards want, because current-gen consoles won't have comparable hardware and more or less all game dev work is going to target those first and foremost. It'll be nice for the people gaming across multiple 4K screens, but there will certainly be another generational refresh of video cards after this one before we see a PS6 or Xbox Pi or whatever.

Maybe the better move in the next generation is to move down a rung on your GPU? You'll still get better frame rates and more features for presumably lower power draw than the thing you have now.

I cannot imagine how that works in a laptop. The 12th-gen CPUs are power-hungry enough.

Intel has Arc graphics, which presumably will be integrated into future CPUs, and AMD's RDNA3 APUs are purported to be on par with an RTX 3060M. What we currently think of as mid-range graphics hardware should be the on-board baseline for a lot of machines very soon.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
But will those perform well in PTGui or DxO?
I always thought that integrated graphics was not good because it generates more heat and ultimately results in lower maximum CPU performance.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
But will those perform well in PTGui or DxO?

It depends on your definition of "well," to be sure. If you have a 3080 and move to a 4070, you're probably saving somewhat on power consumption and getting somewhat better performance than you had. If your only goal is to at least double processing output on video renders, that's probably not happening without buying in at the same tier.

I always thought that integrated graphics was not good because it generates more heat and ultimately results in lower maximum CPU performance.

I think you have that backwards. Integrated graphics are well understood and accounted for in the TDP value for CPUs. They also tend to be something of an afterthought compared to the general-purpose cores in the same product.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,514
Location
USA
My point is that they will have to increase the TDP to get the same performance from the CPU, and that takes better cooling, or it will not perform as well as possible. For example, will the 7950X with an added GPU run at 125W instead of the 105W TDP of the 5950X?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,618
Location
I am omnipresent
Website
s-laker.org
AMD typically restricts their integrated graphics to midrange at most, and I believe they rely on die shrinks to keep the thermals under control. In the past they also used more modest core counts to keep a lid on heat.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,217
Location
Eglin AFB Area
Almost 100%, I'd say. The only thing you really need Win11 for is the P/E-core Alder Lake CPUs -- the scheduler in Win10 has no concept of how to properly make use of that setup and likely never will.
 