Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
I don't know why anyone would expect a xy30 card to be anything but awful. The price is a naked cash grab, but IMO anything xy50 and below should be sitting at $100 on the highest end and $40 - $50 for the least expensive SKU.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,243
Location
Gold Coast Hinterland, Australia
No one was expecting the GTX 1630 to perform well, but IMO it's about 3-4x the price it should be, and that's the real stink here. (The HUB video posted does a cost-per-frame comparison, and the GTX 1630 significantly underperforms in that category.)
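For anyone unfamiliar with the metric, cost-per-frame is just the card's price divided by its average FPS. A minimal sketch below; the prices and FPS figures are illustrative placeholders, not HUB's measured data:

```python
# Cost-per-frame: dollars of card price per frame of average performance.
# Lower is better. Numbers below are hypothetical, not benchmark results.
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

# Hypothetical: a $150 GTX 1630 averaging 30 FPS vs a $230 card averaging 60 FPS.
print(cost_per_frame(150, 30))  # 5.0 $/frame
print(cost_per_frame(230, 60))  # ~3.83 $/frame -- the pricier card is the better value
```

Even with made-up numbers, the shape of the argument is clear: a weak card at a high price loses badly on this metric.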

Or does the release of the GTX 1630 just highlight how expensive the rest of the card is vs the cost of the GPU alone? (A lot of manufacturers have indicated that all components have seen huge cost increases recently, and I can't see the GPU core itself in this instance being expensive enough to justify the US$150+ price tag.)

If we break down the BOM of the card, are we now seeing something like this:
  • GTX 1630 GPU - $10
  • PCB - $5
  • Caps/Chokes/resistors - $20
  • Power regulation ICs - $30
  • Manufacturing - $30 (solder, equipment in factory, labor, etc)
With all things being equal, the current cost is in the card and manufacturing, not the GPU. Swap the 1630 out for a 1660 GPU and the BOM cost only increases by $30-40, yet you get a far superior experience? This could explain the lack of new sub-$150 cards, in that it's just not worth it to produce entry-level cards anymore?
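The guessed BOM above can be sanity-checked with a little arithmetic. All figures are the hypothetical numbers from the list, not sourced costs:

```python
# Hypothetical BOM from the list above -- guesses, not real component costs.
bom = {
    "GPU core (GTX 1630)": 10,
    "PCB": 5,
    "Caps/chokes/resistors": 20,
    "Power regulation ICs": 30,
    "Manufacturing": 30,
}
total = sum(bom.values())                       # $95 for the whole card
gpu_share = bom["GPU core (GTX 1630)"] / total  # GPU is only ~11% of the BOM
# Swapping in a 1660-class core at, say, +$35 barely moves the total:
total_1660 = total + 35                         # $130
print(f"1630 BOM: ${total}, GPU share: {gpu_share:.0%}, 1660 swap: ${total_1660}")
```

If those guesses are anywhere near right, the GPU core really is the cheapest line item, which is the crux of the argument.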

Or is that speculation a little unfounded?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
The thing that I emphasize to people who talk to me about getting a new PC is that almost no one really needs a computer with top-end parts. CPUs and GPUs that cost over $300 new will almost certainly be wasted on someone who is going to play games at 1080P. Even someone with 4k screens who wants to play AAA FPS games with every setting cranked is going to find out that their card is limited by a game engine that assumes no one has more graphics horsepower than whatever consoles can do right now. Maybe you get extra draw distance because you have four times more real world frame buffer.

Enthusiasts really don't have the excuse of gaming to drive their purchases; mid-range hardware in the next generation will get them all the power they need for 4K60 gaming, and until consoles refresh again, that will probably continue to be true. We just have to get out of the habit of buying the super high end just because.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,566
Location
USA
Thought this was relevant given the concerns here about power dissipation:

But it still contains the same general idea that has been espoused for years. We really need to see the performance/watt for the 3000 series compared to the 4000 series. All it takes is a graph or two rather than rambling prose or a YouTube video. Then the buyer can make a reasonable decision. If there is no 4070 soon enough, or it is not significantly better, I may just buy the 3080.
 
Last edited:

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,508
Location
USA
Definitely agree, and once the 4000 series is released, we could easily ballpark performance/watt in a spreadsheet across various categories to help people decide which area they want to focus on.
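That spreadsheet would boil down to one calculation per card: average FPS divided by board power. A minimal sketch; every FPS and wattage figure below is a placeholder to be replaced with measured numbers once reviews land:

```python
# Performance-per-watt sketch: average FPS divided by board power draw.
# All numbers are illustrative placeholders, not measurements.
cards = {
    "RTX 3080": {"fps": 60.0, "watts": 320},
    "RTX 4070": {"fps": 65.0, "watts": 200},  # hypothetical -- card not yet released
}
for name, spec in cards.items():
    ppw = spec["fps"] / spec["watts"]
    print(f"{name}: {ppw:.3f} FPS/W")
```

Run it per resolution (1080p, 1440p, 4K) and per workload category and you effectively have the graph LunarMist is asking for.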
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,269
Location
Eglin AFB Area
Mercutio said:
The thing that I emphasize to people who talk to me about getting a new PC is that almost no one really needs a computer with top-end parts. CPUs and GPUs that cost over $300 new will almost certainly be wasted on someone who is going to play games at 1080p. Even someone with 4K screens who wants to play AAA FPS games with every setting cranked is going to find out that their card is limited by a game engine that assumes no one has more graphics horsepower than whatever consoles can do right now. Maybe you get extra draw distance because you have four times more real-world frame buffer.

Enthusiasts really don't have the excuse of gaming to drive their purchases; mid-range hardware in the next generation will get them all the power they need for 4K60 gaming, and until consoles refresh again, that will probably continue to be true. We just have to get out of the habit of buying the super high end just because.

The thing is, 1080p60 is looked at now much the way 720p30 was a few years ago. The new hotness is high-resolution, high-refresh stuff -- hell, I've fallen into the trap myself; my main monitor is 1440p144. My 1070 meets my needs handily, for the foreseeable future anyway -- I don't really play much new stuff that pushes the envelope.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
That is because hardware companies do marketing and make people think they need the bestest ever new thing, but we've been through this cycle now for at least a couple generations.

At this point, all PC gaming is limited by what consoles can do, and consoles expect to be hooked up to less than ideal TVs rather than proper monitors. Maybe you have a 240Hz OLED TV, but will a PS5 do anything with that? Even when you have full 4k and Freesync support on your PC and video card, you'll find out that a console will run a game at some weird fraction of 4k just to make sure it can do locked 30fps.

There's just no reason to mess with big upgrades. Everything is held back by what consoles can or can't do.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,566
Location
USA
Does anyone care about gaming on a PC anymore? The kids have the PlayStations, and I thought most of the rest was about the cryptolithic miners or other types of processing. By the 2030s there will be no PCs, just dumb terminals accessing the clouds. I just want to process images locally while it is still allowed.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,269
Location
Eglin AFB Area
The PC gaming scene is alive, well, and possibly the healthiest it's been in the last 20 years, especially with the consoles (aside from the Switch) just being glorified PCs these days. It's rare we don't get a game released for us now -- we even got some formerly Sony-exclusive titles a couple of years back, the God of War reboot and Horizon: Zero Dawn to name a couple of front-runners.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,508
Location
USA
Agreed that PC gaming is still an active platform; most of the friends I game with gravitate to PC gaming vs console. That said, Mercutio's points are valid in that a lot of the limitations in gaming performance come from consoles.

However, real advancement does tend to happen on PCs for things like higher-resolution graphics, higher framerates, and now ray-tracing capabilities. PC gamers are mainly limited by the console market, but we do get the advancements earlier, even if they trickle in more slowly these days.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
Gaming is an expensive hobby, but very often the kids who grew up on PlayStations and Xboxes move to PCs for their gaming, because there have always been options available on PCs that the consoles never had, like full mod support, cheats, or community support for titles that got dropped by a publisher. Many of the young women I'm friends with make a gaming computer their first "luxury" purchase. PCs are also, to some degree, the happy medium where platform-exclusive console titles might get a second release that will never see a port to the other major console. Sony's latest Spider-Man game is an example of that.

On the other hand, I really do think that all consoles do is suck and make gaming worse. I can maybe sorta see the reasoning for a Nintendo Switch, but the better version of that would just be to make games that run on standard mobile platforms, with some kind of little joystick-and-buttons attachment that clamps onto a screen, rather than having one more device to keep charged all the damned time.

My roommate has an Xbox One, but she no longer uses it because her laptop is good enough for the games she likes. I haven't seen her TV turned on since January.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,508
Location
USA
There are also a decent number of games that only ever release on PC initially and/or work best with a keyboard-and-mouse setup. Those tend not to be the AAA titles and/or are from indie developers. I enjoy seeing how games like Valheim get released for a reasonable amount of money on PC and build such a fan following that the devs can ramp up, make changes, and fix issues based on community feedback. I get that we are essentially beta testing an early-released game, but sometimes an early access game releases with pretty good quality that surpasses many of those AAA titles.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
15,566
Location
USA
The 4070 seems like a practical choice with plenty of performance at 300W.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
I want to see where the 4060 lands in the overall ranking, but I'm pretty sure it'll wind up being one or the other. I'm still thinking I might do better with two less expensive cards rather than one big one.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,508
Location
USA
What is the workflow to leverage two GPUs versus one more powerful?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
20,704
Location
I am omnipresent
Website
s-laker.org
It's not unusual for me to be working simultaneously in DaVinci Resolve and editing photos. Resolve uses a GPU more or less continuously. Capture One by itself doesn't do THAT much with a GPU, but if I turn things over to Topaz for denoising or sharpening, it's ~4 seconds to process an image with my GPU or ~30+ seconds with my CPU. I'm usually applying those processes to 300 pictures in a batch, so suddenly wanting a second big GPU doesn't seem unreasonable.
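The batch math makes the case on its own. Using the rough per-image times from the post (~4 s GPU vs ~30 s CPU, the poster's estimates rather than benchmarks) over a 300-image batch:

```python
# Rough batch-time arithmetic: per-image seconds scaled to a full batch.
# The 4 s (GPU) and 30 s (CPU) figures are estimates from the post, not benchmarks.
def batch_minutes(seconds_per_image: float, n_images: int = 300) -> float:
    return seconds_per_image * n_images / 60

gpu = batch_minutes(4)   # 20.0 minutes for the batch on GPU
cpu = batch_minutes(30)  # 150.0 minutes on CPU
print(f"GPU: {gpu:.0f} min, CPU: {cpu:.0f} min, saved: {cpu - gpu:.0f} min")
```

A 20-minute batch versus a 2.5-hour one is the difference between finishing a job over coffee and losing an afternoon, which is why a second GPU starts to look sensible.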
 