Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
It's on par (a little below, actually) with the 3070, yes. That's still A LOT of graphics card. It's absurd enough that it wasn't a part I felt the need to upgrade once I got it.

After doing a little bit of digging on that Matrix Awakens demo, there is apparently a single thread in Unreal Engine 5 that keeps everyone from pushing past a certain threshold of detail and framerate. Even the people with 12th-gen i9s (usually the top of single-thread performance in x86) and RTX 3090s aren't ever seeing much over 45fps at 4K with the highest settings. Turning off the ray-traced lighting detail fixes all that, but also makes the demo look more or less like any other game set in a city released in the past eight or nine years. Since there's no way on Windows to bind a thread to a particular core without source code access, it's not something we can really monitor or control, even though GPU utilization seldom goes over 70% on high-end cards and all-core CPU activity rarely goes over 40%.
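For what it's worth, per-thread pinning is something only the engine itself could do: Win32 does expose SetThreadAffinityMask, but it acts on a specific thread handle, which in practice means calling it from inside the program. A minimal sketch of what that looks like with source access (the choice of core here is purely illustrative):

```c
#include <windows.h>
#include <stdio.h>

/* Pin the calling thread to a single logical core.
   Which core is "best" (e.g. the fastest P-core) is up to the caller. */
static int pin_current_thread(int core_index)
{
    DWORD_PTR mask = (DWORD_PTR)1 << core_index;
    /* Returns the previous affinity mask on success, 0 on failure. */
    return SetThreadAffinityMask(GetCurrentThread(), mask) != 0;
}

int main(void)
{
    if (pin_current_thread(2)) /* hypothetical: pin to logical core 2 */
        puts("thread pinned");
    else
        puts("failed to set affinity");
    return 0;
}
```

From the outside, Task Manager can only set affinity for a whole process, which is why a single hot engine thread can't be steered without the engine's cooperation.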

However, for purposes of comparison, the Matrix Awakens demo isn't a rock-solid 30fps at 4K on a PS5 or whatever the biggest Xbox is, either.

This is the demo in question. If you jump to about the 3-minute mark and skip the bit with the movie stars, you'll see the bits they let our cards render.

 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
It's on par (a little below, actually) with the 3070, yes. That's still A LOT of graphics card. It's absurd enough that it wasn't a part I felt the need to upgrade once I got it.

After doing a little bit of digging on that Matrix Awakens demo, there is apparently a single thread in Unreal Engine 5 that keeps everyone from pushing past a certain threshold of detail and framerate. Even the people with 12th-gen i9s (usually the top of single-thread performance in x86) and RTX 3090s aren't ever seeing much over 45fps at 4K with the highest settings. Turning off the ray-traced lighting detail fixes all that, but also makes the demo look more or less like any other game set in a city released in the past eight or nine years. Since there's no way on Windows to bind a thread to a particular core without source code access, it's not something we can really monitor or control, even though GPU utilization seldom goes over 70% on high-end cards and all-core CPU activity rarely goes over 40%.

However, for purposes of comparison, the Matrix Awakens demo isn't a rock-solid 30fps at 4K on a PS5 or whatever the biggest Xbox is, either.

This is the demo in question. If you jump to about the 3-minute mark and skip the bit with the movie stars, you'll see the bits they let our cards render.

Even though a 2080 Super is a lot of GPU, it still wasn't nearly enough for me to game at 4K with decent frame rates at or above 60fps. Only once I got a 3080 did I stop noticing frame-drop issues in 4K gaming at 60fps and higher.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
That's about what I thought. I have the 3060 Ti since it was available last year without a crazy price. It's more of a card for 2560x1440 than 4K. Wouldn't brand-new software be expected to need stronger cards than the 20-series?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
I fully expect that they will.

To be clear, I think there are probably exactly zero games I've played that don't run just fine (i.e. no dips below 30fps) at 4K on a 2080. Cyberpunk 2077 and Battletech are, I think, the two newest games I own, but the genres I actually play, RPGs and strategy games, aren't the first ones people think of when they think of needing high frame rates, either.

At this point, I'm more concerned with getting better performance for Resolve Studio and Capture One than what happens with, say, Baldur's Gate 3.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
Video cards are more freely available now. You could get a 3080 or 3080 Ti. :)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Video cards are more freely available now. You could get a 3080 or 3080 Ti. :)

At some point in the near future, I'll weigh a Lovelace card vs. whatever RDNA3 card vs. a 3080 Ti. I think the world is about to be absolutely flooded with GTX 1660s, which makes me think that sticking a second GPU in my PC for Topaz software might get me farther than one big GPU.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
There will likely be 4000-series GPU announcements in the near future as well, which could bring prices down on the 3000 series.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
The leaks on Lovelace so far suggest that the mid-range GPUs and up are going to be massive power hogs, and the top end will have maximum draw in the 600W range. AMD's next GPUs are efficient chiplets with a fast interconnect, and there's no sign that its products will need those 12-pin power connectors yet.

An awful lot of software either gets nVidia-exclusive GPU optimization or sees AMD support lag significantly, and I have very bad memories of power-hungry nVidia cards as it is, so next-gen hardware gives me a lot to consider.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
I already loathe the power consumption, heat, and noise of my 3080 FE. I wouldn't want a 600W monster in my system any time soon. If AMD can step up their game that'll be fantastic.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
The leaks on Lovelace so far suggest that the mid-range GPUs and up are going to be massive power hogs, and the top end will have maximum draw in the 600W range. AMD's next GPUs are efficient chiplets with a fast interconnect, and there's no sign that its products will need those 12-pin power connectors yet.

An awful lot of software either gets nVidia-exclusive GPU optimization or sees AMD support lag significantly, and I have very bad memories of power-hungry nVidia cards as it is, so next-gen hardware gives me a lot to consider.
The increase in power can't be more than the increase in performance, can it? That would be bonkers.
For example, I could understand 40% more power for double the performance, but ~300W would be the most I could handle.
As it is, I have to decide which equipment can be on without overloading the UPS.
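To put a number on that tradeoff: double the performance at 40% more power would still be a sizeable efficiency win. A quick worked example using just the hypothetical figures above:

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical numbers from the post: 2.0x performance at 1.4x power. */
    double perf_ratio  = 2.0;
    double power_ratio = 1.4;

    /* Relative performance per watt vs. the older card. */
    printf("perf/W: %.2fx\n", perf_ratio / power_ratio); /* ~1.43x */
    return 0;
}
```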

I read somewhere that nVidia cards were really good for Adobe and other GPU-accelerated video and image processing software. Is that not correct?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
I read somewhere that nVidia cards were really good for Adobe and other GPU-accelerated video and image processing software. Is that not correct?

I don't use Adobe-anything myself so I'm not completely sure. Resolve Studio performance scales not-quite-linearly with GPU performance based on available cores, and since AMD doesn't make anything that competes with xx80- or xx90-series hardware, that establishes the high end.

On the workstation side, the RTX A-series (the Quadro successors) has nothing as extreme as what's in gamer-land, and AFAIK the A-series didn't get a 3rd-generation RTX refresh either, so you have to pick between the fastest possible cores with consumer amounts of RAM, or insane amounts of VRAM on workstation cards that aren't quite as fast.

The best guess for content-creation tools is to look at FP32-based benchmarks. There are different benchmarks out there, but usually they're presented in terms of GFLOPS. The rumor is that the 4090 might be 2.5x the speed of the 3090, in which case the power draw makes a certain amount of sense.
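If you want a ballpark FP32 number without a benchmark, the usual back-of-the-envelope formula is shader cores × boost clock × 2, since one fused multiply-add counts as two floating-point operations per cycle. A minimal sketch, using the 3090's published figures (10,496 CUDA cores, roughly 1.7 GHz boost) as the example:

```c
#include <stdio.h>

/* Theoretical peak FP32 throughput in GFLOPS:
   cores * clock (GHz) * 2 ops per cycle (one FMA = 2 FLOPs). */
static double peak_fp32_gflops(int shader_cores, double boost_ghz)
{
    return shader_cores * boost_ghz * 2.0;
}

int main(void)
{
    /* RTX 3090: 10496 CUDA cores at ~1.70 GHz boost -> ~35,700 GFLOPS. */
    printf("3090: ~%.0f GFLOPS\n", peak_fp32_gflops(10496, 1.70));
    return 0;
}
```

Real applications never hit the theoretical peak, but it's a serviceable yardstick for comparing cards within and across generations.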
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Everything is focused on the high end; below that, we're just relying on the traditional placement of the lower-tier SKUs for comparison. They're saying the fastest GPUs will at least double the 3090, and the mid-tier (4070-ish) will be somewhere between 10 and 30% faster than a 3090 for most purposes.

There's a web site called mooreslawisdead.com that a lot of hardware sites suggest is a solid, reliable source.

I think I am in the market for a new card and based on that information, I'd probably want a 4070 as well.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
I don't do the U-Tubes nor the socialism medias. The problem is that most of them require logging in, but I want to have some plausible deniability, so I refuse to have an account and be involved.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Pretty sure you can watch YouTube in a private window. I don't have social media either, and while I do have YouTube content, it's all unlisted videos I've shared with maybe five people. In any case, I'm summarizing the chatter that's out there as I see it.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Current chatter in the GPU rumorsphere holds that Ryzen 7000 APUs will be approximately on par with a 3060 Mobile part. If that's the case, that's a big step forward for integrated graphics.

I was able to go to my local Best Buy and get an RTX 3070 (not-Ti) for MSRP today. It's getting used in a build for my roommate but I told her to expect to pay $200 more than what it actually cost. I'm in mild shock to find the cards are available locally.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
I can't imagine my GPU pumping out 600W. I'm already annoyed with the 350W TDP of my 3080.

Wccftech: NVIDIA GeForce RTX 4090 Gets 24 GB GDDR6X Memory at 21 Gbps & 600W TDP, RTX 4070 Gets 12 GB GDDR6 Memory at 18 Gbps & 300W TDP.
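Those memory specs translate to bandwidth the usual way: per-pin data rate × bus width ÷ 8. A quick sketch; note that the bus widths here (384-bit for the 4090-class part, 192-bit for the 4070-class) are my assumption inferred from the capacities, not something stated in the article:

```c
#include <stdio.h>

/* GDDR bandwidth: per-pin rate (Gbps) * bus width (bits) / 8 bits per byte. */
static double bandwidth_gb_s(double gbps_per_pin, int bus_width_bits)
{
    return gbps_per_pin * bus_width_bits / 8.0;
}

int main(void)
{
    /* Assumed bus widths -- inferred from the 24 GB and 12 GB capacities. */
    printf("4090-class: %.0f GB/s\n", bandwidth_gb_s(21.0, 384)); /* 1008 */
    printf("4070-class: %.0f GB/s\n", bandwidth_gb_s(18.0, 192)); /*  432 */
    return 0;
}
```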

 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
I cannot imagine how that works in a laptop. The 12th-gen CPUs are power-hungry enough.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
It should probably be pointed out that pretty much nothing in the gaming world is going to fully use all the horsepower those 600W+ cards want, because current-gen consoles won't have comparable hardware, and more or less all game dev work is going to target those first and foremost. It'll be nice for the people gaming across multiple 4K screens, but there will certainly be another generational refresh of video cards after this one before we see a PS6 or Xbox Pi or whatever.

Maybe the better move in the next generation is to move down a rung on your GPU? You'll still get better frame rates and more features for presumably lower power draw than the thing you have now.

I cannot imagine how that works in a laptop. The 12th-gen CPUs are power-hungry enough.

Intel has Arc graphics, which presumably will be integrated into future CPUs, and AMD's RDNA3 APUs are purported to be on par with an RTX 3060M. What we currently think of as mid-range graphics hardware should be the on-board baseline for a lot of machines very soon.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
But will those perform well in PTGui or DxO?
I always thought that integrated graphics were not good because they generate more heat and ultimately result in lower maximum performance from the CPU.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
But will those perform well in PTGui or DxO?

It depends on your definition of "well", to be sure. If you have a 3080 and move to a 4070, you're probably saving somewhat on power consumption and getting somewhat better performance than you had. If your only goal is to at least double processing output on video renders, it's probably not happening without buying in at the same series again.

I always thought that integrated graphics were not good because they generate more heat and ultimately result in lower maximum performance from the CPU.

I think you have that backwards. Integrated graphics are well understood and accounted for in the TDP value for CPUs. They also tend to be something of an afterthought compared to the general processing cores in the same products.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
My point is that they will have to increase the TDP to get the same performance from the CPU, and that takes better cooling or it will not perform as well as possible. For example, will the 7950X with an added GPU run at 125W instead of the 105W TDP of the 5950X?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
AMD typically restricts their integrated graphics to midrange at most, and I believe they rely on die shrinks to keep the thermals under control. In the past they also used more modest core counts to keep a lid on heat.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
Is there any chance that Win 10 will work on the late 2022 AMD CPUs/GPUs?
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,812
Location
Eglin AFB Area
Website
sedrosken.xyz
Almost 100%, I'd say. The only thing you really need Win11 for is the P/E-core Alder Lake CPUs -- the scheduler in Win10 has no concept of how to properly make use of that setup and likely never will.
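For what it's worth, Windows does expose the hybrid topology to applications even where the scheduler doesn't use it well: GetSystemCpuSetInformation reports an EfficiencyClass per logical core (on Alder Lake, P-cores report a higher class than E-cores), so individual programs can steer their own threads themselves. A minimal sketch, assuming a Windows 10 or later SDK:

```c
#define _WIN32_WINNT 0x0A00  /* Windows 10+ for GetSystemCpuSetInformation */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    ULONG len = 0;
    /* First call just reports the required buffer size. */
    GetSystemCpuSetInformation(NULL, 0, &len, GetCurrentProcess(), 0);
    PSYSTEM_CPU_SET_INFORMATION info = malloc(len);
    if (!info || !GetSystemCpuSetInformation(info, len, &len,
                                             GetCurrentProcess(), 0))
        return 1;

    /* Walk the variable-size records and print each core's class. */
    for (char *p = (char *)info; p < (char *)info + len;) {
        PSYSTEM_CPU_SET_INFORMATION c = (PSYSTEM_CPU_SET_INFORMATION)p;
        printf("logical core %u: efficiency class %u\n",
               (unsigned)c->CpuSet.LogicalProcessorIndex,
               (unsigned)c->CpuSet.EfficiencyClass);
        p += c->Size;
    }
    free(info);
    return 0;
}
```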
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Windows 11 has been poorly accepted by the corporate world thus far. This might be one of those things where the features get backported before too long.

The AMD RX 6700 is an awful lot of video card and seems to finally be available below retail price. It's still a bear to find nVidia stuff at MSRP, but the 6700 has 12GB of texture memory, which in theory puts it up in big-boy territory, and it's under $500. To me, that signals the return of the affordable gaming PC.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,812
Location
Eglin AFB Area
Website
sedrosken.xyz
I personally feel like Microsoft doesn't give half a damn about its corporate customers anymore, or Windows 11 as a whole wouldn't have happened. The round of downgrades to the UI -- particularly the taskbar, though the start menu is noticeably crippled too -- feels like them spitting in the face of anyone who actually uses Windows to get any work done. I totally foresee them using the Alder scheduler as a carrot to force upgrades as companies slowly replace hardware. Right now we mandate Win10 and pre-Alder machines (although at the price point we tend to buy, I doubt we'd be getting P/E-core Alder anyway), but that can't last forever.

Most of the UI downgrades can be reverted, at least right now, with the help of a third-party program that I happen to have purchased -- StartAllBack -- but there's no telling when another "update" will break this, and it's frankly embarrassing that something like this needs to be fixed by someone else at all. Microsoft is worth hundreds upon hundreds of billions. It's time they started acting like it.

Back to the subject at hand, I sadly find I will likely never own another Radeon until AMD fires the entire driver team and brings in one that knows what it's doing. They keep reintroducing the same bug with mixed-refresh multi-head setups that they've been dealing with for the last several years, and I'm affected by it. Essentially the bug manifests as micro-freezing: every thirty seconds or so the display locks up for a few seconds, and it doesn't stop the whole time the machine is on. I know it's a driver issue because it's been fixed before. It's pretty much the main reason I jumped at the chance to trade a friend for his old 1070 -- I loathe nVidia as a company but their products at least work.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
I can't think of a single time in the last 30 years when anything branded AMD or ATI didn't have bad drivers. Although to be honest nobody really had GOOD graphics drivers in 1992 anyway.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,812
Location
Eglin AFB Area
Website
sedrosken.xyz
nVidia's drivers aren't a paragon of virtue either -- they waste 99% of the download on GeForce Experience, which I never install, and it just keeps bloating more and more. I remember raising a stink when the driver package passed 300MB a few years back; now that I have a modern nVidia card again, I was surprised to find it's up to 897MB. It's absolutely ridiculous.

Beyond that, the control panel is straight out of 2004 and needs to be dragged kicking and screaming into this decade. Quite literally -- the drivers for my 6800 Ultra have the exact same control panel, minus the G-Sync controls. I shouldn't have to install third-party software to set a fan curve, and I shouldn't have to mess with registry entries (CoolBits is STILL a thing) or use the aforementioned third-party software to fiddle with the clocks on my card, even if I'm no longer inclined to do that. That's probably the one thing AMD has going for it -- I never had a single issue with Wattman, despite my misgivings about the rest of the package.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
I just use a video card naturally with the drivers installed and nothing else. I have not had any issues since XP64, which was an orphaned OS from the start. Sometimes Windows wants to download new nVidia drivers. :(
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,812
Location
Eglin AFB Area
Website
sedrosken.xyz
In my experience, using the baked-in Microsoft drivers leads to sadness: poor performance, and sometimes even a complete lack of OpenGL support. I've always downloaded the latest and installed them.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
That is what I am doing after MS borks the driver that was previously installed.
It just sucks that MS constantly forces software updates for no good reason.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
It's deeply frustrating that the lowest tier of new discrete GPUs, stuff like the RX 6500 XT, is still overvalued on the market. I can get a nine-year-old GTX 950 for $75-ish, which is just barely an improvement over current iGPUs, but nVidia x10 and x30 cards are still by and large commanding a premium price, and there's really nothing released in the last couple of years regularly selling in the $100 to $125 sweet spot. Even the RX 570s and 580s, which are about five years old and laughably underpowered, are still regularly going for more than that.

It's like no one told the budget market segment that entry-level gamer cards are back to reasonable price points.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,497
Location
USA
Maybe they are prioritizing stocks of the higher grade video cards?
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,357
Location
Gold Coast Hinterland, Australia
It's the same here: GTX 1050 Tis are still going for a premium (~AU$250-$300), with the RX 6500 XT going for AU$300+. Mind you, RTX 3050s are selling for AU$500+.

I think it's a combination of:
  • supply/demand (entry-level cards sell a lot more units than mid-range cards), typically selling on price rather than features/performance,
  • retailers most likely purchased the entry-level units at a higher price (6 months ago), so they're very reluctant to drop the price below what they paid for them,
  • iGPUs becoming good enough for entry-level gaming (especially on the AMD side), and
  • low-end cards have little margin in them, so there's little reason to fix the supply issue when it's easier to push someone into the mid tier for a few extra $$.
And with a rumoured RTX 3030 on the horizon, I wonder if there'll be any impact on pricing there.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,374
Location
Flushing, New York
It's the same here: GTX 1050 Tis are still going for a premium (~AU$250-$300), with the RX 6500 XT going for AU$300+. Mind you, RTX 3050s are selling for AU$500+.

I think it's a combination of:
  • supply/demand (entry-level cards sell a lot more units than mid-range cards), typically selling on price rather than features/performance,
  • retailers most likely purchased the entry-level units at a higher price (6 months ago), so they're very reluctant to drop the price below what they paid for them,
  • iGPUs becoming good enough for entry-level gaming (especially on the AMD side), and
  • low-end cards have little margin in them, so there's little reason to fix the supply issue when it's easier to push someone into the mid tier for a few extra $$.
And with a rumoured RTX 3030 on the horizon, I wonder if there'll be any impact on pricing there.
The third point (iGPUs becoming good enough) is what I think is the primary reason. I haven't had a discrete graphics card for a decade. And what I'm using now (A10-7870K) is probably only 1/3 as powerful as the latest APUs from AMD. Hard-core gamers are the main ones who need discrete graphics cards. And they certainly wouldn't buy entry-level cards.
 