Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
Intel provided a ton of info on Xe2 yesterday, including a bunch of architectural changes that should improve compatibility, plus more official clues about what Battlemage will be offering: the mid-range SKU, the B580, delivering roughly what an A770 can do right now but priced at $200, and the B980 being a 24GB card with approximately 4070 Super performance for $400 at 225W, available MAYBE Q3 2024.

AMD does not seem to be talking about next-gen discrete Radeons right now, which is a little weird given everything else that's come from Computex.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
So that's not providing much competition to push nVidia to up its game? I was hoping to buy one more video card beyond the 4070Ti. I only buy one card every 2 years, so cost is not an issue.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
nVidia is going to have the high end. Everyone else has conceded that. nVidia will also charge a small fortune for everything it makes; it has clearly decided to do exactly that. You say cost is not an issue, but even then, you aren't buying the giant 3-slot GPUs, so you must have a limit as well.
My limit in the world of GPUs is around $500. Past that and I'm out. Enough parts do pass through my hands that I'm sometimes able to work in a "free" upgrade, like the swap between my RX6650 and the A770, but in general I do like it when an OEM sticks to reality-based pricing.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
I'm very happy that Intel is doing GPUs, but game devs still aren't putting in the work to make sure the experience is good enough. Even with AMD cards I feel the need to let the customer know that it is a "value" choice and that some compromises may be experienced.

I can see several reasons why nVidia is pricing their parts with such disdain for their users.

1. They don't need gamers at all, AI developers will buy everything they make for years.
2. Their cards are the most likely to work in whatever scenario you have. Gaming, CAD, rendering, whatever. Their stuff has had a large enough part of the market for long enough that the developers of whatever software made sure it works with nVidia cards before maybe considering AMD or Intel. Is there some backroom dealing by nVidia to encourage this state of affairs with all the big programs? Quite likely.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
Until they fix the drivers, and I really mean fix the drivers, I'm afraid that whatever performance improvements they make in the silicon aren't going to matter. I hate that I have to be this way but as a former Arc user, I'm now a certified hater.

At least you can in general expect games to work on the Radeons. Intel will work fine for extremely new stuff, but classics people still play online are a total crapshoot. I don't think Xenia's even trying to fix their issues on Arc, I believe the official line is "get a better GPU".
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
As usual, my caveat is that I really don't care how well games work. Do they mostly work? That's close enough for me. Developers will sort it eventually. Intel GPUs are probably vastly better on Windows than Linux, driver-wise, just for the usual reasons around the size of the market and the limited number of coders working to support things. Starfield is the only thing I'm aware of that Arc doesn't do well, and I think that's actually intransigence on the part of Bethesda's gang of absolute fucking morons with CS degrees from Incest Hole, Kentucky's Basement Koding Bootcamp Kommunity Kollege rather than lack of effort from Intel. The people who can't ship a game in a playable state even a decade and a half after its original release also can't fix something that doesn't work well on nVidia or AMD? Color me shocked.

I'm not sure what doesn't work with Arc, Windows-wise. Nothing that I've seen anyone try to start up and play in my home. Weirdest behavior I've seen was Doom (2016) randomly picking a screen to start on instead of using the first display like a normal title.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
That's a fair viewpoint for you I suppose. I just had so much either not work right or display weird issues that my friends coined "Arc moment" as a term within our vernacular. If I was lucky the issue would merely manifest as poor performance given the class of GPU I was using, like only just managing 60fps at native res in a title from ten years ago; if I wasn't, stuff would render incorrectly (NFS Heat, Killing Floor 2, Xenia) or crash at startup (BeamNG, GTA5, though granted that got fixed fairly quickly).

Starfield as I understood it ran like shit on everything because they're still using a 25 year old game engine with bolted-on "improvements" to last time's bolted-on "improvements". I lost faith in them after Skyrim, and looking back, even Skyrim isn't very good. Fallout 3 ran counter to every expectation for Fallout I can think of. I never cared for Oblivion. Morrowind, though, is one of my favorite CRPGs of all time. Possibly my actual favorite. It's probably telling that now the only Bethesda games I think are any good are the one from 22 years ago and the one they didn't even make, Fallout New Vegas.

I hear it's the reverse, actually. That the drivers work better on Linux than Windows. Given how the Linux drivers are open source and not reliant on a company team to make improvements, that doesn't really surprise me. That and you're far more likely to have compatibility issues due to the OS than the GPU on there for games.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
My partner has re-purchased Skyrim on every platform, every time it has been released. Same thing with the Bethesda Fallout games. I mentioned this somewhere else, but doing the math from her reported play-times across Playstation, Xbox, Switch and PC, she's spent a meaningful portion of her entire life playing those same four games. I will never not give her shit about it, because she insists on playing the unmodded versions with an eye toward getting every in-game award on every platform. Meaning that every time she re-starts a game, she's also looking into the very real possibility that she's going to hit a game-ending bug. She has literally thousands of hours in Skyrim and has never actually won Skyrim. The mods to fix these things are out there, but not for someone who wants all the achievements. So she keeps restarting the games, hoping this will be the play-through that all the scripted events happen and things spawn in properly where they're supposed to and it is just never so. I think she's actually nuts.

My friend group mostly sticks to League of Legends, Overwatch 2 and Fortnite (I play none of those, but I will watch them being played; I like Fortnite best among those) or Unreal Tournament, which they brought into the rotation specifically for when they want me to play with them. We did a group Baldur's Gate 3 playthrough, which was fantastic. I was a Knowledge Domain gnome cleric.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
We keep UT99 and Quake III in rotation along with Worms (WMD or Armageddon, depending on whether one of us is playing on something old), Killing Floor and Killing Floor 2. We had a bit of a Call of Duty kick for a while (World at War/Black Ops 1/2 through Plutonium, Black Ops 3), mostly playing the various Zombies maps, though we did do a run through the W@W campaign. We also still play a fair bit of Civ 5, Bloons TD6 and Deep Rock Galactic. I've been playing a LOT of Valheim with a friend; we just got through upgrading all our iron gear and are gearing up to fight Bonemass so we can move on to the Mountains biome. We sometimes stream our FalloutNV runs to each other.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,916
Location
USA
I really enjoyed Valheim when it came out. Our group played together for a while, but we have not gone back since, even though they've updated a bunch of areas. V-Rising was also a fun game to play together, and yet another one we played through before the game was fully completed.

Now we're playing through The Planet Crafter which is pretty chill and fun to explore and terraform a planet.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
I went ahead and made a game thread in the pub just to keep this to bitching about nvidia and intel like normal.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
nVidia is going to have the high end. Everyone else has conceded that. nVidia will also charge a small fortune for everything it makes; it has clearly decided to do exactly that. You say cost is not an issue, but even then, you aren't buying the giant 3-slot GPUs, so you must have a limit as well.
I can live with 3 slots, i.e., the 4th slot can have another card in it. Last year I found that all the 4080s were slightly wider than 3 slots. I can live with $1200 and 320W cards, but a 285W card is better.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
Someone posted their experiences with the current flagship Matrox C680 as of June 2024. It's actually an AMD workstation GPU, but Matrox provides all the drivers on its own. The person who made this video does take it seriously as a $50 graphics card and of course it's completely pathetic, but on the other hand, it's pretty sweet if you want to run six 4k monitors on 40W.

 

Santilli

Hairy Aussie
Joined
Jan 27, 2002
Messages
5,257
I remember running 3 monitors off one of those cards.
Think it was the 450.
One DVI, and a DVI splitter for the other two.
One 21", two 17"s.
No games, but crystal clear screen...
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
Ugh.

Got the 3060 in. It's Fine(tm). But the latest drivers break 10bpc color. The 476.whatever drivers Windows pulled by itself worked fine when I set my display to 1440p120 at 10bpc, with integer scaling even, wow, what a concept -- but the latest ones I pulled, 560.38, break in an interesting, psychedelic way.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
Do you have a natively 10-bit calibrated monitor? Years ago I bought an expensive nVidia Quadro to get the 10 bits, but I gave up on 10-bit workflows after finding that most software doesn't support it. I could not see any difference when switching inputs from 8 bits to 10 bits in any application at the time. I don't understand how going from 8- to 10-bit color can make a display go psychedelic.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
I doubt it was calibrated for it, but it did work at least, and I noticed a bit less color banding in stationary gradients with it on, so I would have liked to keep using it.

I don't understand either, but it worked before, and now with the latest driver, it screws up the color palette something fierce.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
It could be screwing up in 8-bit also. And you never know whether the card having been used, and possibly abused, is related.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
I imagine if it were actually a problematic GPU I'd be having much worse issues than the palette freaking out in 10-bit mode. That just sounds like a screwup in the driver, to me. And it's not screwing up in 8bpc, because I'm on it right now and it looks fine.

10bpc, for me, just smooths out a little of the banding you see in gradients on modern LCD monitors -- even if the computer's feeding the display a signal in 32-bit colors, with 8bpc, it's effectively truncated to 24-bit color, or somewhere around 16,777,000 colors. 10bpc brings that up to 30-bit color, which is more on the order of a billion colors. Even if I wasn't getting the full effect before because it wasn't a 10bpc certified display, it did work fine and looked a fair bit better than 8bpc on the same display, so I would have liked to keep it enabled. I'll try it again next time I see the driver's updated. Until then, this is fine.
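If anyone wants to sanity-check that arithmetic, here's a quick back-of-the-envelope sketch in Python (purely illustrative of the bit-depth math, nothing GPU- or driver-specific):

    # Colors per channel and total palette size at a given bit depth.
    for bits_per_channel in (8, 10):
        levels = 2 ** bits_per_channel   # distinct shades per R/G/B channel
        total = levels ** 3              # full RGB palette
        print(f"{bits_per_channel} bpc: {levels} levels per channel, "
              f"{total:,} colors total ({bits_per_channel * 3}-bit color)")
    # 8 bpc -> 16,777,216 colors (24-bit); 10 bpc -> 1,073,741,824 colors (30-bit)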
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
Back in the 2000s we were taught to use monitors with LUTs and (preferably internal) sensors for production to maintain consistency over time and between locations. My current monitor expired during pandemonicium, but it uses the 16-bit LUTs regardless of the input. I have mine set to auto-calibrate every Saturday night. Profiles can be used with various software.
Regardless of whether you have 8- or 10-bit input from the video card, if the LUT is the suck, you will struggle to map correctly. Some displays have stupidly designed color spaces that are way over in one direction and deficient in another. It was nauseating trying to calibrate that 4K Dell I mentioned about 8 months ago, since it was trying to squeeze and stretch to make it fit, which does not help the banding. I have no clue what the designers of the panel were thinking, other than "nobody sees color, let's use some cheap LEDs."
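For anyone who hasn't poked at this, the correction is conceptually just a per-channel lookup table from input code values to panel drive levels. A toy sketch in Python (made-up gamma-style numbers, not how any real monitor's internal LUT is actually built):

    # Toy per-channel 1D LUT: a high-precision (16-bit-style) table mapping
    # normalized input to corrected output, so 8-bit and 10-bit sources can
    # both be re-mapped without piling extra banding on top of the panel's.
    def build_gamma_lut(entries=65536, gamma=2.2):
        return [(i / (entries - 1)) ** gamma for i in range(entries)]

    def apply_lut(code_value, input_bits, lut):
        # Scale the input code value into the LUT's index range and look it up.
        index = round(code_value / (2 ** input_bits - 1) * (len(lut) - 1))
        return lut[index]

    lut = build_gamma_lut()
    print(apply_lut(128, 8, lut))   # mid grey from an 8-bit source
    print(apply_lut(512, 10, lut))  # (nearly) the same mid grey from a 10-bit source

If the table itself is badly built -- the "LUT is the suck" case -- the 8-bit and 10-bit paths both land on the same wrong outputs, which is the point above.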
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
The RX 7900XTX and its associated SKUs are kind of shocking in what they're bringing to the table. It looks like you can get an unopened retail box on Ebay for around $550 vs $1k from the usual retail suspects, and these things are all in the RTX 4080-ish ballpark, performance-wise. I'd never bothered to check on these guys, but now I am definitely glad I did.

I might still hold out for a high-end Battlemage card, but from what I know right now, it's not going to be faster and it's not going to be much cheaper than $550. It's mostly just a question of whether I want to ditch a 350W CPU so I can get a 350W GPU instead, versus something that might use a more responsible amount of electricity to operate.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
Apparently, AMD is letting price cuts happen on RDNA 3 cards because there's a huge oversupply of stock. This explains the ease of finding NIB 7800 and 7900 boards. I haven't seen tons of 40% off 7900XTXs like I did last week, but I HAVE seen tons of Ebay sellers offering them at around 33% off retail. Radeons currently offer the best overall performance for DaVinci Resolve, so I am willing to pay attention to what's happening in that space. AMD does have a next-gen card coming, but I have heard basically nothing about it.

nVidia has also ceased production of high-end RTX 4000 chips. 4080+ hardware prices are actually going up right now.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,916
Location
USA
Rumors keep dropping about 5080 and 5090s maybe showing up at CES, so it makes sense the 4000s are stopping production.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
I also heard that AMD will be leaving the top-end of the market, so nVidia will have no competition at all. Does not bode well for GPU prices in that range.

Hmm... could I sell the 4090 and just hold my breath long enough to get the 5090? No doubt the 4090 price will tank once the new cards are available.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
I also heard that AMD will be leaving the top-end of the market, so nVidia will have no competition at all. Does not bode well for GPU prices in that range.
That would leave a good opportunity for somebody, maybe the Chinese in a few years?
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
So long as we don't start putting huge tariffs on stuff from China.
The Swedes or the Americans? My point is that there will be opportunities for somebody to do it, though that might not intersect with the state of the art for several years. By the time the nVidias come out with 70 or 80 series, it could be anyone.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
And all this while I'm hearing rumors that Intel might not even let Battlemage come to production, given how poorly Alchemist did in sales. With their dire situation -- apparently Qualcomm is looking at them and licking their chops -- it unfortunately makes sense. I so badly wanted Intel to do well here. If all they get out of this venture is a more competent platform for their integrated graphics, I guess that's all they get, but I am sorely disappointed. I put my money where my mouth is, too: I bought an A770LE in early 2023 and ran it in my main gaming machine right up until I just couldn't make excuses for their driver development team anymore.

If AMD couldn't hack it in the high-end space, I'm not holding my breath for any Chinese companies, which would have to either make their own design from scratch to compete or use what is likely 5+ year-old stolen technology. I think they're making remarkable strides, but to say they'll be trading blows with nVidia anytime soon is, I think, perhaps a bit too optimistic. And then there's the tariffs, as already discussed -- no doubt they'll make the Chinese GPUs no better value than the competition, if not worse, and likely with a significantly worse driver development team.

I think we stand to see a good few years of nVidia capitalizing on a market in which they're the only contender, and maybe a renewed focus on value-for-money from the competition and from PC builders. I'm glad I have my 3060 now, because it's not looking like the pricing situation is going to get any better in the next 5 years for me.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
I also heard that AMD will be leaving the top-end of the market, so nVidia will have no competition at all. Does not bode well for GPU prices in that range.

AMD has functionally already left the "high end" since there's nothing that gets close to the 4090, but the 4080 and 7900 are competitive. I do think AMD gets a bad rap for general-purpose gaming GPUs. Its silicon is the basis for the Xbox and the PS5. Even if the Steam crowd aren't fans and the cards don't do CUDA, there's still a lot to like, especially in the $300 - $400 range where the 7600XT, 7700XT and secondhand 6900XT live.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
And all this while I'm hearing rumors that Intel might not even let Battlemage come to production, given how poorly Alchemist did in sales. With their dire situation -- apparently Qualcomm is looking at them and licking their chops -- it unfortunately makes sense.

Intel is making Iris Xe cores regardless, in part because AMD is already there with its APUs and in part because the market is pushing for AI capabilities. Neither AMD nor Intel needs to worry all that much about how their add-in boards (AIBs) do, because they have solid markets from traditional OEMs and game consoles, respectively. It does make sense for them to offer at least perfunctory options to system integrators. The AIBs just need to hit price/performance numbers that make them attractive compared to nVidia.

Arc launched off-cycle. It landed between both AMD's and nVidia's refreshes. It was priced against the RTX 30x0 line, but by the time it hit, it was an also-ran next to the next-generation parts. It's sounding like Battlemage will launch Q1 2025, right in line with everybody else, with the same strategy as before: a solid line of parts reaching up to mid-range at lower price points than the other guys. The cards are already being manufactured (there are leaked samples out there now), so I doubt they won't launch, but I think we all want to know where the high end ends up for Intel.

The main thing I know is that I'm not willing to pay a 33% premium to get an nVidia card with 33 - 50% less RAM. CUDA support is nifty but not THAT nifty.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
The story I read about AMD leaving the high end quoted AMD execs as saying they needed more mid-range market share to get developers to give their architecture more attention, without which they couldn't be as competitive or reliable as nVidia. If AMD has this problem, Intel is really in trouble.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
The story I read about AMD leaving the high end quoted AMD execs as saying they needed more mid-range market share to get developers to give their architecture more attention, without which they couldn't be as competitive or reliable as nVidia. If AMD has this problem, Intel is really in trouble.

I think the specific area of AMD's concern is OpenCL vs CUDA and AI and datacenter workloads rather than whatever is going on with DirectX and gaming; AMD isn't seen as meaningful competition to nVidia; no one is. That's why nVidia is the whateverth-most-valuable company in the world right now. Intel isn't currently competitive in that space but rather an important actor because of the incredibly vast base of notebook computers that aren't going to have Ryzen or Snapdragon Elite chips in them. There might be more Adreno GPU cores out there (because phones) than Iris Xes, but a developer writing applications for end users would be incredibly foolish to ignore either one.

As far as Qualcomm potentially buying Intel, or parts of Intel: that would be really, really stupid of Qualcomm. It would be better to seek a merger of equals. Intel is weak right now, but it also has man-millennia of experienced engineers and business people who have worked in any number of conditions and market positions. Qualcomm is just wildly successful as a stock for the time being, and it's incredibly overvalued for a company that has really just done one thing very well in the last ~20 years. Would Intel Engineering + Qualcomm management lead in productive directions? Would Qualcomm benefit at all from any amount of Intel management? Aren't they both still barking at the feet of TSMC, a company with 3x the market cap of Intel and Qualcomm combined?
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,454
Location
USA
What level is considered the high end, in nVidia's terms?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,232
Location
I am omnipresent
AMD consolidated its product lines from mainstream/gamer and AI/datacenter to a unified architecture for next generation products. I don't think it's giving up on RX x900 so much as $300k GPU compute racks.
 