Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
I don't know why anyone would expect an xy30 card to be anything but awful. The price is a naked cash grab, but IMO anything xy50 and below should be sitting at $100 on the highest end and $40-$50 for the least expensive SKU.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
No one was expecting the GTX 1630 to perform well, but IMO it's about 3-4x the price of what it should be, and that's the real stink here. (The HUB video posted does a cost-per-frame comparison, and the GTX 1630 significantly underperforms in that category.)

Or does the release of the GTX 1630 just highlight how expensive the rest of the card is vs. the cost of the GPU alone? (A lot of manufacturers have indicated that all components have seen huge cost increases recently, and I can't see the GPU core itself in this instance being expensive enough to justify the US$150+ price tag.)

If we break down the BOM of the card, are we now seeing something like this:
  • GTX 1630 GPU - $10
  • PCB - $5
  • Caps/Chokes/resistors - $20
  • Power regulation ICs - $30
  • Manufacturing - $30 (solder, equipment in factory, labor, etc)
All things being equal, the current cost is in the card and manufacturing, not the GPU. Swap the 1630 out for a 1660 GPU and the BOM only increases by $30-40, yet you get a far superior experience. This could explain the lack of new sub-$150 cards: it's just not worth it to produce entry-level cards anymore.
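
To put rough numbers on that, here's a minimal sketch summing the guessed figures above (these are illustrative guesses only, not real BOM data, and the $35 delta for the 1660 core is an assumption):

```python
# Hypothetical shared-component costs from the list above -- guesses, not real BOM data.
SHARED_COSTS = {
    "pcb": 5,
    "caps_chokes_resistors": 20,
    "power_regulation_ics": 30,
    "manufacturing": 30,  # solder, factory equipment, labor, etc.
}

def card_bom(gpu_cost: float) -> float:
    """Total BOM: GPU core plus everything else on the card."""
    return gpu_cost + sum(SHARED_COSTS.values())

gtx_1630 = card_bom(gpu_cost=10)       # $95 total
gtx_1660 = card_bom(gpu_cost=10 + 35)  # assume the 1660 core costs ~$35 more
print(f"GTX 1630 BOM: ${gtx_1630}, GTX 1660 BOM: ${gtx_1660}")
print(f"GPU share of 1630 BOM: {10 / gtx_1630:.0%}")  # ~11% -- the card dominates, not the GPU
```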

Or is that speculation a little unfounded?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
The thing that I emphasize to people who talk to me about getting a new PC is that almost no one really needs a computer with top-end parts. CPUs and GPUs that cost over $300 new will almost certainly be wasted on someone who is going to play games at 1080p. Even someone with 4K screens who wants to play AAA FPS games with every setting cranked is going to find that their card is limited by a game engine that assumes no one has more graphics horsepower than whatever consoles can do right now. Maybe you get extra draw distance because you have four times the real-world frame buffer.

Enthusiasts really don't have the excuse of gaming to drive their purchases; mid-range hardware in the next generation will get them all the power they need for 4K60 gaming, and until consoles refresh again, that will probably continue to be true. We just have to get out of the habit of buying the super high end just because.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Thought this was relevant given the concerns here about power dissipation:

But it still contains the same general idea that has been espoused for years. We really need to see performance/watt for the 3000 series compared to the 4000 series. All it takes is a graph or two rather than rambling prose or a YouTube video. Then the buyer can make a reasonable decision. If there is no 4070 soon enough, or it is not significantly better, I may just buy the 3080.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
Definitely agree, and once the 4000 series is released, we could easily ballpark performance/watt in a spreadsheet across various categories to help people decide which area they want to focus on.
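
The core of that spreadsheet is one division; a minimal sketch, where the FPS and TDP entries are placeholders rather than measured results:

```python
# Placeholder entries -- avg_fps_4k would come from real benchmark runs, tdp_w from spec sheets.
cards = [
    {"name": "RTX 3080", "avg_fps_4k": 100, "tdp_w": 320},  # hypothetical figures
    {"name": "RTX 4070", "avg_fps_4k": 110, "tdp_w": 300},  # hypothetical figures
]

# Performance per watt: average framerate divided by board power.
for card in cards:
    card["fps_per_watt"] = card["avg_fps_4k"] / card["tdp_w"]

# Rank cards by efficiency, best first.
for card in sorted(cards, key=lambda c: c["fps_per_watt"], reverse=True):
    print(f'{card["name"]}: {card["fps_per_watt"]:.3f} FPS/W')
```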
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
The thing that I emphasize to people who talk to me about getting a new PC is that almost no one really needs a computer with top-end parts. [...] We just have to get out of the habit of buying the super high end just because.

The thing is, 1080p60 is looked at now much the way 720p30 was looked at a few years ago. The new hotness is high-resolution, high-refresh stuff -- hell, I've fallen into the trap myself; my main monitor is 1440p144. My 1070 meets my needs handily, for the foreseeable future anyway -- I don't really play much new stuff that pushes the envelope.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
That is because hardware companies do marketing and make people think they need the bestest-ever new thing, but we've been through this cycle now for at least a couple of generations.

At this point, all PC gaming is limited by what consoles can do, and consoles expect to be hooked up to less-than-ideal TVs rather than proper monitors. Maybe you have a 240Hz OLED TV, but will a PS5 do anything with that? Even when you have full 4K and FreeSync support on your PC and video card, you'll find that a console will run a game at some weird fraction of 4K just to make sure it can hold a locked 30fps.

There's just no reason to mess with big upgrades. Everything is held back by what consoles can or can't do.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Does anyone care about gaming on a PC anymore? The kids have the PlayStations, and I thought most everything else was about the cryptolithic miners or other types of processing. By the 2030s there will be no PCs, just dumb terminals accessing the clouds. I just want to process images locally while it is still allowed.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
The PC gaming scene is alive, well, and possibly the healthiest it's been in the last 20 years, especially with the consoles (aside from the Switch) just being glorified PCs these days. It's rare we don't get a game released for us now -- we even got some formerly Sony exclusives a couple of years back, the God of War reboot and Horizon: Zero Dawn, to name a couple of front-runners.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
Agreed that PC gaming is still an active platform; most of the friends I game with gravitate to PC vs. console. That said, Mercutio's points are valid in that a lot of the limitations in gaming performance come from consoles.

However, real advancement does still tend to happen on PCs first: higher-resolution graphics, higher framerates, and now ray-tracing capabilities have begun to evolve there. PC gamers are mainly limited by the console market, but we do get the advancements earlier, even if they trickle in more slowly these days.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
Gaming is an expensive hobby, but very often the kids who grew up on PlayStations and Xboxes move to PCs for their gaming, because there have always been options available on PCs that the consoles never had, like full mod support, cheats, or community support for titles that got dropped by a publisher. Many of the young women I'm friends with make a gaming computer their first "luxury" purchase. PCs are also to some degree the happy medium where a platform-exclusive console title might get a second release that will never see a port to the other major console. Sony's latest Spider-Man game is an example of that.

On the other hand, I really do think that all consoles do is suck and make gaming worse. I can maybe sorta see the reasoning for a Nintendo Switch, but the better version of that would just be to make games that run on standard mobile platforms, with some kind of little joystick and buttons that could clamp onto a screen, rather than having one more device to keep charged all the damned time.

My roommate has an Xbox One, but she no longer uses it because her laptop is good enough for the games she likes. I haven't seen her TV turned on since January.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
There are also a decent number of games that only ever release on PC initially and/or work best with some kind of keyboard-and-mouse setup. Those tend not to be the AAA titles and/or are from indie developers. I enjoy seeing how a game like Valheim gets released for a reasonable amount of money on PC and builds such a fan following that the devs can ramp up, make changes, and fix issues based on community feedback. I get that we are essentially beta testing an early-released game; however, sometimes an early-access game releases with quality good enough to surpass many of those AAA titles.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
The 4070 seems like a practical choice with plenty of performance at 300W.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
I want to see where the 4060 lands in the overall ranking, but I'm pretty sure it'll wind up being one or the other. I'm still thinking I might do better with two less expensive cards rather than one big one.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
What is the workflow to leverage two GPUs versus one more powerful?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
It's not unusual for me to be working simultaneously in DaVinci Resolve and editing photos. Resolve uses a GPU more or less continuously. Capture One by itself doesn't do THAT much with a GPU, but if I turn things over to Topaz for denoising or sharpening, it's ~4 seconds to process an image with my GPU or ~30+ seconds with my CPU. I'm usually applying those processes to 300 pictures in a batch, and suddenly wanting a second big GPU doesn't seem unreasonable.
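
The batch arithmetic makes the case on its own; a quick sketch using the rough per-image timings above:

```python
# Rough timings from above: ~4 s/image on GPU, ~30 s/image on CPU, 300-image batches.
images = 300
gpu_s, cpu_s = 4, 30

print(f"GPU batch: {images * gpu_s / 60:.0f} min")  # ~20 min
print(f"CPU batch: {images * cpu_s / 60:.0f} min")  # ~150 min, i.e. 2.5 hours
# A second GPU could take the Topaz batch while the first stays busy with Resolve.
```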
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
RTX 4090 and 4080 were just announced. The 4080 calls for a 700W power supply for the base model, which is WELL into insane territory IMO.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
RTX 4090
  • Starting at $1,599
  • Available October 12th
  • 24GB of GDDR6X memory
  • 16,384 CUDA Cores, a base clock of 2.23GHz that boosts up to 2.52GHz, 1,321 Tensor-TFLOPs, 191 RT-TFLOPs, and 83 Shader-TFLOPs
  • Nvidia claims it’s 2–4x faster than the RTX 3090 Ti
  • Requires an 850-watt power supply

12GB - RTX 4080
  • Starting at $899
  • Available in November.
  • 12GB of GDDR6X memory
  • 7,680 CUDA Cores, a 2.31GHz base clock that boosts up to 2.61GHz, 639 Tensor-TFLOPs, 92 RT-TFLOPs, and 40 Shader-TFLOPs.
  • Requires a 700-watt power supply
  • 285W TDP

16GB - RTX 4080
  • Starting at $1,199
  • Available in November.
  • 16GB of GDDR6X memory
  • 9,728 CUDA Cores, a base clock of 2.21GHz that boosts up to 2.51GHz, 780 Tensor-TFLOPs, 113 RT-TFLOPs, and 49 Shader-TFLOPs of power.
  • Nvidia claims it’s 2–4x faster than the existing RTX 3080 Ti.
  • Requires a 750-watt power supply
  • 320W TDP
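
For a napkin comparison, dollars per CUDA core and per shader TFLOP from the figures above (list prices and announced specs only; no benchmark data):

```python
# Specs copied from the announcement list above.
cards = {
    "RTX 4090":      {"price": 1599, "cuda": 16384, "shader_tflops": 83},
    "RTX 4080 16GB": {"price": 1199, "cuda": 9728,  "shader_tflops": 49},
    "RTX 4080 12GB": {"price": 899,  "cuda": 7680,  "shader_tflops": 40},
}

for name, s in cards.items():
    per_1k_cores = s["price"] / s["cuda"] * 1000  # dollars per 1,000 CUDA cores
    per_tflop = s["price"] / s["shader_tflops"]   # dollars per shader TFLOP
    print(f"{name}: ${per_1k_cores:.0f}/1k CUDA cores, ${per_tflop:.0f}/shader TFLOP")
```

By these napkin numbers the 4090 is actually the cheapest per unit of compute (~$98 per 1k cores vs. ~$117-123 for the 4080s), which says a lot about how the two 4080s are priced.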
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
These prices must be middle fingers to everyone to encourage people to eat through last year's 3000 series stock, and I'm assuming once that is done they'll "discount" these new 4000 series cards to a more-sane price.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
I say don't buy a damn thing and let them sweat. I don't know about you guys, but my 1070 does fine for what I need a GPU to do these days. I'm only looking to replace it if it dies, and even then I'm more likely to buy a used 3060 (so as not to alleviate their overstock problem) than anything new.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,168
Location
Flushing, New York
These prices must be middle fingers to everyone to encourage people to eat through last year's 3000 series stock, and I'm assuming once that is done they'll "discount" these new 4000 series cards to a more-sane price.
For those prices I hope they can render photorealistic games in 8K resolution at 120 fps. That said, are there even any games at this point with such content?

I suspect their target audience is cryptocurrency miners more than gamers/video editors/etc.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
Why does the 12GB RTX 4080 really feel like it should have been called the RTX 4070? (Less memory and memory bandwidth, fewer CUDA/Tensor cores, fewer RT cores, and lower shader performance.)
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
I'm curious about the same thing and have read plenty of comments/feedback suggesting the same about the 12GB version of the 4080, giving it some hate.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
I suspect their target audience is cryptocurrency miners more than gamers/video editors/etc.

The big names in crypto have moved to proof of stake rather than proof of work, so these aren't hugely appealing to them anymore.
Gamers aren't going to see much benefit in this generation unless they're gaming across multiple monitors, and relatively few games support that AFAIK. As usual, consoles are going to be the spoiler of all progress.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Why does the 12GB RTX 4080 really feel like it should have been called the RTX 4070? (Less memory and memory bandwidth, fewer CUDA/Tensor cores, fewer RT cores, and lower shader performance.)
In that case do you think the 4080 w/12GB would be a good choice for software processing? I refuse to play the video games.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
I'm curious about the same thing and have read plenty of comments/feedback suggesting the same about the 12GB version of the 4080, giving it some hate.
Just got through watching some of the videos on YouTube about the new cards, and there certainly is a lot of hate for the 12GB model name. (The card itself is fine, but there's hate for it being branded an 80-series card when many consider it more like a 70-series card.)
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
What type of processing are you doing, especially what components are being utilised? e.g. CUDA cores, NVENC, etc.?
I don't really know what the video card does at the hardware level, but the files are Nikon and Canon or PSB.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
Just got through watching some of the videos on YouTube about the new cards, and there certainly is a lot of hate for the 12GB model name. (The card itself is fine, but there's hate for it being branded an 80-series card when many consider it more like a 70-series card.)
On reflection, rather than calling it hate, I now feel that the 4080 12GB is a disingenuous, maybe even borderline immoral, marketing move from Nvidia, because as we all know now it's not just a 4GB RAM difference between the products: it's branded as a 4080, riding on the coattails of the x080 sub-series' historic precedent as basically the top-end consumer card. Unless Nvidia clearly publishes the 4-5 top specs and differences when selling this alongside the 16GB 4080, it's misleading to consumers.

Also to your point, I don't think the 4080 12GB card will be bad; it's just misleading.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
I don't really know what the video card does at the hardware level, but the files are Nikon and Canon or PSB.
What software specifically are you using to leverage a GPU in processing? Maybe if you share what it is, we can give you more insight into the possible benefits (or none) with a newer GPU.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
On reflection, rather than calling it hate, I now feel that the 4080 12GB is a disingenuous, maybe even borderline immoral, marketing move from Nvidia[...]

It absolutely is a bald-faced attempt to make what would be a 4070-class card sound more impressive by bumping it up a tier marketing-wise, obviously so they can squeeze more money out of it. And no, it certainly won't perform badly or anything, although it will probably make a few people run out and upgrade their power supplies and switch to smoothed out "budget" plans for their power bills.

They're getting cocky. It's time for someone to give them a nice gut punch again, and I don't particularly care who. Particularly in light of their CEO's remark that “The idea that the chip is going to go down in price is a story of the past.” We need another 9700Pro.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Sheesh. It's just model numbers. What matters is performance, power, and cost.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
What software specifically are you using to leverage a GPU in processing? Maybe if you share what it is, we can give you more insight into the possible benefits (or none) with a newer GPU.
I don't know about it yet. Maybe DxO, Topaz, Canon, GUI, even the Adobe will be installed if necessary. The computer would be the 7950X. From what I understand we can use it until October 2025.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
They're getting cocky. It's time for someone to give them a nice gut punch again, and I don't particularly care who. Particularly in light of their CEO's remark that “The idea that the chip is going to go down in price is a story of the past.” We need another 9700Pro.
Is it a case that they want to continue the profit margins seen in the last few years (which many would consider an anomaly due to GPU-driven crypto mining, the pandemic, and WFH), thus pleasing shareholders?

Also remember TSMC (and shortly after, most other fabs) discontinued bulk discount rates, and IIRC TSMC even raised silicon prices by 20%. When your main product is reliant on someone else's capacity and prices, you can either pass on the costs or take a hit to the profit margin. I would assume Nvidia are simply passing on the costs...
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
I don't know about it yet. Maybe DxO, Topaz, Canon, GUI, even the Adobe will be installed if necessary. The computer would be the 7950X. From what I understand we can use it until October 2025.
IIRC Adobe products use the built-in encoding engines (Quick Sync on Intel, NVENC on Nvidia), and on Nvidia they use CUDA to accelerate other operations...

So depending on how well Adobe has written its CUDA code, you could see improvements that scale linearly with CUDA core count, or performance may plateau past a certain core count and see no further improvement. If the former, look for the card with the most CUDA cores; if the latter, find what the limit is and look for the card closest to that count.

Only a good set of independent benchmarks that includes your actual workflow will give you the answers you're after.
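
A toy model of those two regimes, as a sketch only (the parallel fraction here is an assumption for illustration; Adobe's real CUDA scaling behavior is unknown):

```python
# Amdahl-style toy model: speedup vs. CUDA core count for a workload that is
# only partly parallelizable. The 99% parallel fraction is an assumption.
def speedup(cores: int, parallel_fraction: float = 0.99) -> float:
    """Amdahl's law: the serial part caps the gain no matter how many cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Core counts of some current cards, low to high.
for cores in (2560, 5120, 7680, 9728, 16384):
    print(f"{cores:>6} cores -> {speedup(cores):.1f}x")
# With a 1% serial fraction the curve flattens near 100x: going from 7,680 to
# 16,384 cores gains under 1%, which is the "plateau" case described above.
```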
 