Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
Intel provided a ton of info on Xe2 yesterday, including a bunch of architectural changes that should improve compatibility, plus more official clues about what Battlemage will be offering: the mid-range SKU, the B580, doing approximately what an A770 can do right now but priced at $200, and the B980 being a 24GB card with approximately 4070 Super performance for $400 at 225W, available MAYBE Q3 2024.

AMD does not seem to be talking about next-gen discrete Radeons right now, which is a little weird given everything else that's come from Computex.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,848
Location
USA
Is that not enough competition to push nVidia to up its game? I was hoping to buy one more video card beyond the 4070Ti. I only buy one card every two years, so cost is not an issue.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
nVidia is going to have the high end. It has been conceded by everyone else. nVidia will also charge a small fortune for everything it makes and it has definitely decided to do so. Cost is not an issue, but even then, you aren't buying the giant 3-slot GPUs, so you must have a limit as well.
My limit in the world of GPUs is around $500. Past that and I'm out. Enough parts do pass through my hands that I'm sometimes able to work in a "free" upgrade, like the swap between my RX6650 and the A770, but in general I do like it when an OEM sticks to reality-based pricing.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,583
Location
Horsens, Denmark
I'm very happy that Intel is doing GPUs, but game devs still aren't putting in the work to make sure the experience is good enough. Even with AMD cards I feel the need to let the customer know that it is a "value" choice and that some compromises may be experienced.

I can see several reasons why nVidia is pricing their parts with such disdain for their users.

1. They don't need gamers at all, AI developers will buy everything they make for years.
2. Their cards are the most likely to work in whatever scenario you have. Gaming, CAD, rendering, whatever. Their stuff has had a large enough part of the market for long enough that the developers of whatever software made sure it works with nVidia cards before maybe considering AMD or Intel. Is there some backroom dealing by nVidia to encourage this state of affairs with all the big programs? Quite likely.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
Until they fix the drivers, and I really mean fix the drivers, I'm afraid that whatever performance improvements they make in the silicon aren't going to matter. I hate that I have to be this way but as a former Arc user, I'm now a certified hater.

At least you can in general expect games to work on the Radeons. Intel will work fine for extremely new stuff, but classics people still play online are a total crapshoot. I don't think Xenia's even trying to fix their issues on Arc, I believe the official line is "get a better GPU".
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
As usual, my caveat is that I really don't care how well games work. Do they mostly work? That's close enough for me. Developers will sort it out eventually. Intel GPUs are probably vastly better on Windows than Linux, driver-wise, just for the usual reasons around the size of the market and the limited number of coders working to support things. Starfield is the only thing I'm aware of that Arc doesn't do well, and I think that's actually intransigence on the part of Bethesda's gang of absolute fucking morons with CS degrees from Incest Hole, Kentucky's Basement Koding Bootcamp Kommunity Kollege rather than lack of effort from Intel. The people who can't ship a game in a playable state even a decade and a half after its original release also can't fix something that doesn't work well on nVidia or AMD? Color me shocked.

I'm not sure what doesn't work with Arc, Windows-wise. Nothing that I've seen anyone try to start up and play in my home. Weirdest behavior I've seen was Doom (2016) randomly picking a screen to start on instead of using the first display like a normal title.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
That's a fair viewpoint for you, I suppose. I just had so much either not work right or display weird issues that my friends coined "Arc moment" as a term in our vernacular. If I was lucky, the issue would merely manifest as poor performance given the class of GPU I was using, like only just managing 60fps at native res in a title from ten years ago; if I wasn't, stuff would render incorrectly (NFS Heat, Killing Floor 2, Xenia) or crash at startup (BeamNG, GTA5, though granted that got fixed fairly quickly).

Starfield, as I understood it, ran like shit on everything because they're still using a 25-year-old game engine with bolted-on "improvements" to last time's bolted-on "improvements". I lost faith in them after Skyrim, and looking back, even Skyrim isn't very good. Fallout 3 ran counter to every expectation for Fallout I can think of. I never cared for Oblivion. Morrowind, though, is one of my favorite CRPGs of all time. Possibly my actual favorite. It's probably telling that by now the only Bethesda games I think are any good are the one from 22 years ago and the one they didn't even make, Fallout: New Vegas.

I hear it's the reverse, actually. That the drivers work better on Linux than Windows. Given how the Linux drivers are open source and not reliant on a company team to make improvements, that doesn't really surprise me. That and you're far more likely to have compatibility issues due to the OS than the GPU on there for games.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
My partner has re-purchased Skyrim on every platform, every time it has been released. Same thing with the Bethesda Fallout games. I mentioned this somewhere else, but doing the math from her reported playtimes across Playstation, Xbox, Switch and PC, she's spent a meaningful portion of her entire life playing those same four games. I will never not give her shit about it, because she insists on playing the unmodded versions with an eye toward getting every in-game award on every platform. Meaning that every time she restarts a game, she's also looking at the very real possibility that she's going to hit a game-ending bug. She has literally thousands of hours in Skyrim and has never actually finished Skyrim. The mods to fix these things are out there, but they're no use to someone who wants all the achievements. So she keeps restarting the games, hoping this will be the playthrough where all the scripted events happen and things spawn in properly where they're supposed to, and it is just never so. I think she's actually nuts.

My friend group mostly sticks to League of Legends, Overwatch 2 and Fortnite (I play none of those, but I will watch them being played; I like Fortnite best among them) or Unreal Tournament, which they brought into the rotation specifically for when they want me to play with them. We did a group Baldur's Gate 3 playthrough, which was fantastic. I was a Knowledge domain gnome cleric.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
We keep UT99 and Quake III in rotation, along with Worms (WMD or Armageddon, depending on whether one of us is playing on something old), Killing Floor and Killing Floor 2. We had a bit of a Call of Duty kick for a while (World at War/Black Ops 1/2 through Plutonium, plus Black Ops 3), mostly playing the various Zombies maps, though we did do a run through the W@W campaign. We also still play a fair bit of Civ 5, Bloons TD6, and Deep Rock Galactic. I've been playing a LOT of Valheim with a friend; we just finished upgrading all our iron gear and are gearing up to fight Bonemass so we can move on to the Mountains biome. We sometimes stream our Fallout NV runs to each other.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,782
Location
USA
I really enjoyed Valheim when it came out. Our group played together for a while, but we have not gone back since they've updated a bunch of areas. V-Rising was also a fun game to play together, and yet again one we played through before the game was fully completed.

Now we're playing through The Planet Crafter which is pretty chill and fun to explore and terraform a planet.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
I went ahead and made a game thread in the pub just to keep this one to bitching about nVidia and Intel like normal.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,848
Location
USA
nVidia is going to have the high end. It has been conceded by everyone else. nVidia will also charge a small fortune for everything it makes and it has definitely decided to do so. Cost is not an issue, but even then, you aren't buying the giant 3-slot GPUs, so you must have a limit as well.
I can live with 3 slots, i.e., the 4th slot can have another card in it. Last year I found that all the 4080s were slightly wider than 3 slots. I can live with $1200 and 320W cards, but a 285W card is better.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,743
Location
I am omnipresent
Someone posted their experiences with the current flagship Matrox, the C680, as of June 2024. It's actually an AMD workstation GPU, but Matrox provides all the drivers on its own. The person who made the video does take it seriously as a $50 graphics card, and of course it's completely pathetic, but on the other hand, it's pretty sweet if you want to run six 4K monitors on 40W.

 

Santilli

Hairy Aussie
Joined
Jan 27, 2002
Messages
5,177
I remember running 3 monitors off one of those cards.
Think it was the 450.
One DVI, and a DVI splitter for the other two.
One 21", two 17"s.
No games, but crystal clear screen...
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
Ugh.

Got the 3060 in. It's Fine(tm). But the latest drivers break 10bpc color. The 476.whatever drivers Windows pulled by itself worked fine when I set my display to 1440p120 at 10bpc, with integer scaling even, wow, what a concept -- but the latest ones I pulled, 560.38, break it in an interesting, psychedelic way.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,848
Location
USA
Do you have a natively 10-bit calibrated monitor? Years ago I bought an expensive nVidia Quadro to get the 10 bits, but I gave up on 10-bit workflows after finding that most software doesn't support it. I could not see any difference when switching inputs from 8 bits to 10 bits in any application at the time. I don't understand how going from 8- to 10-bit color can make a display go psychedelic.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
I doubt it was calibrated for it, but it did work at least, and I noticed a bit less color banding in stationary gradients with it on, so I would have liked to keep using it.

I don't understand either, but it worked before, and now with the latest driver, it screws up the color palette something fierce.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,848
Location
USA
It could be screwing up in 8 bits also. And you never know whether the card being used, and possibly abused, is related.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,660
Location
Eglin AFB Area
Website
sedrosken.xyz
I imagine if it were actually a problematic GPU, I'd be having much worse issues than the palette freaking out in 10-bit mode. That just sounds like a screwup in the driver to me. And it's not screwing up in 8bpc, because I'm on it right now and it looks fine.

10bpc, for me, just smooths out a little of the banding you see in gradients on modern LCD monitors. Even if the computer's feeding the display a signal in 32-bit color, with 8bpc it's effectively truncated to 24-bit color, or 16,777,216 colors. 10bpc brings that up to 30-bit color, which is more on the order of a billion colors. Even if I wasn't getting the full effect before because it wasn't a 10bpc-certified display, it did work fine and looked a fair bit better than 8bpc on the same display, so I would have liked to keep it enabled. I'll try it again next time I see the driver's updated. Until then, this is fine.
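As a quick sanity check on that bits-per-channel arithmetic, a couple of lines of Python (3 channels, so the total is the per-channel count cubed):

```python
# Total displayable colors at a given bits-per-channel depth (R, G, B).
def total_colors(bits_per_channel: int) -> int:
    levels_per_channel = 2 ** bits_per_channel
    return levels_per_channel ** 3

print(total_colors(8))   # 16777216  -- "24-bit color"
print(total_colors(10))  # 1073741824 -- "30-bit color", ~1.07 billion
```

So the jump from 8bpc to 10bpc is a 64x increase in distinct colors, which is exactly where the reduced banding in smooth gradients comes from.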
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,848
Location
USA
Back in the 2000s we were taught to use monitors with LUTs and (preferably internal) sensors for production, to maintain consistency over time and between locations. My current monitor expired during the pandemic, but it uses its 16-bit LUTs regardless of the input. I have mine set to auto-calibrate every Saturday night. Profiles can be used with various software.
Regardless of whether you have 8- or 10-bit input from the video card, if the LUT is the suck, you will struggle to map correctly. Some displays have stupidly designed color spaces that are way over in one direction and deficient in another. It was nauseating trying to calibrate that 4K Dell I mentioned about 8 months ago, since it was trying to squeeze and stretch to make everything fit, which does not help the banding. I have no clue what the designers of the panel were thinking, other than "nobody sees color, let's use some cheap LEDs."
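For anyone curious what a per-channel LUT is actually doing, here's a toy Python sketch. The gamma curve here is made up for illustration; a real calibration LUT comes from sensor measurements. The point is that a 16-bit LUT has 65,536 output levels, so it can remap an 8- or 10-bit input without adding much rounding error of its own:

```python
# Toy per-channel 1D LUT, the kind a monitor applies for calibration.
# build_gamma_lut uses a hypothetical pure-gamma curve as a stand-in for
# a measured calibration curve.

def build_gamma_lut(entries: int = 65536, gamma: float = 2.2) -> list[int]:
    # Map each normalized input level through the curve, back to 16-bit range.
    top = entries - 1
    return [round(((i / top) ** gamma) * top) for i in range(entries)]

def apply_lut(value: int, input_bits: int, lut: list[int]) -> int:
    # Scale the 8- or 10-bit input level into the LUT's index range, look it up.
    index = round(value / (2 ** input_bits - 1) * (len(lut) - 1))
    return lut[index]

lut = build_gamma_lut()
# Black and white pass through unchanged; mid-tones get remapped by the curve.
print(apply_lut(0, 8, lut), apply_lut(255, 8, lut))    # 0 65535
print(apply_lut(0, 10, lut), apply_lut(1023, 10, lut)) # 0 65535
```

With a badly designed panel gamut, the curve has to "squeeze and stretch" much harder, and those big jumps between adjacent LUT entries are what show up as banding.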
 