Video Cards

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
There are a lot of CPUs that don't have integrated graphics for one reason or another. I have a few R7 3700s sitting around, for example, and some of the newer i3s and i5s don't have iGPUs either (and what's up with that, Intel? It used to be that i3 = iGPU. Not any more). Any of those would make someone happy, but I'm not pairing them with dozen-year-old graphics chips and I'm not willing to pay $200+ for a low-end RX 6500.

I worked on a PC not long ago that had a contemporary Matrox-branded graphics card in it. Turns out they sell for around $100 on eBay.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Or $600 new for a 2015 card. LOL
I used the Matrox cards for years on the CRT displays. Most people were using Matrox with the Artisans even after the early screwed-up LCDs arrived.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
Matrox had those super-fast RAMDACs, which made them amazing for driving high resolution CRTs at extremely high refresh rates, like 1600x1200@90Hz (note to sed: that was beastly in 1999). They also had their house in absolute order with regard to driver quality.
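
For a rough sense of what that takes, here's a back-of-the-envelope pixel-clock calculation; the ~1.32x blanking overhead is just a typical GTF-style assumption, not an exact spec.

Code:
# Approximate pixel clock needed to drive 1600x1200 @ 90Hz on a CRT.
# The 1.32x blanking overhead is an assumed, GTF-style figure.
h_active, v_active, refresh_hz = 1600, 1200, 90
blanking_overhead = 1.32
pixel_clock_mhz = h_active * v_active * refresh_hz * blanking_overhead / 1e6
print(f"~{pixel_clock_mhz:.0f} MHz pixel clock")  # roughly 228 MHz

Somewhere around 230 MHz, which is why a fast, clean RAMDAC mattered so much for those modes.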

I did a little bit of research, and the GPU that sits on those G420 cards is an AMD part, but all the software is provided by Matrox rather than using any of AMD's drivers. I'm not quite curious enough to go buy one but I'm definitely intrigued.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
IMO the i3 thing is more a production-binning issue than a marketing one. Why throw away silicon if only the GPU component is faulty and the rest can work fine as an i3?

Back in the day I went from a Trident-based card to a Matrox Mystique 220, and the picture quality and the ability to drive more than 1024x768 @60Hz were awesome. The image produced by the Matrox was far crisper and better defined than the Trident card's. (I later added 2x Creative 12MB Voodoo II's in SLI, and that was an awesome Quake 1/2 machine.)
I will agree about the Matrox drivers from that era being really good (NT4 support was flawless, and their documentation was open enough that even XFree86 had working drivers).
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
But alas, Matrox couldn't compete with 3dfx, ATi and nVidia in 3D acceleration for OpenGL/DirectX, and later morphed into a vendor serving lots of displays off a single card, or ultra-high-res for medical imaging (both niche but profitable areas).
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
Having owned and used a Matrox Millennium from that era, I also have fond memories of its stable drivers and fantastic 2D performance compared to anything else out at the time. Eventually moving to an Nvidia Riva TNT2 was a significant change in 3D gaming that I remember fondly. I still keep that GPU around for nostalgic reasons.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
Matrox had those super-fast RAMDACs, which made them amazing for driving high resolution CRTs at extremely high refresh rates, like 1600x1200@90Hz (note to sed: that was beastly in 1999). They also had their house in absolute order with regard to driver quality.

I did a little bit of research, and the GPU that sits on those G420 cards is an AMD part, but all the software is provided by Matrox rather than using any of AMD's drivers. I'm not quite curious enough to go buy one but I'm definitely intrigued.

I'm well aware of Matrox; heck, for a while I ran a G450 DualHead in my K6. My PPro, after I replaced its Banshee with the Voodoo2, used a Millennium II PCI for its 2D card. Both had absolutely superb image quality compared to what they replaced, and the Banshee and Voodoo3 aren't notably bad on that front to begin with -- I'd say they easily hang with the likes of the GeForce and Radeon in that regard. Matrox just cleans house, and embarrassed everyone in 2D acceleration speed to boot. My Millennium II got 94 MB/s in the DOS Screamer 2 benchmark; it took an AGP card five years newer to beat it. The Millennium II is also the first card where I ever noticed that the background of Windows 98 setup is dithered blue/black rather than just dark blue.

The only reason I went back to the Voodoo3 on the K6 is that Matrox always lagged behind in 3D speed -- the G450 was never meant to be the highest-end part anyway, but even the G400 Max only "competed" against the TNT2 and Rage 128 range. A good Voodoo3 spanked all but the TNT2 and Rage 128 Ultra, and even then it was more of an even fight than you'd expect. The TNT2 could render a 32-bit image and was sometimes a hair faster in Direct3D, but the Voodoo3 supported Glide for even better performance and had that 22-bit box filter on the output, so the final image was mostly comparable anyway.

I'm also well aware that desktop resolutions higher than 800x600 were rare heading into 2000. 1024x768 was available if you spent a bit more on your monitor, but 1600x1200 was CAD-tier, especially at higher refresh rates. 1024x768 didn't really become "standard" until the cheapo tubes heading into 2003/4 could do that at 85Hz.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Maybe I'm confused, but wouldn't you have been a very small child back then?
The Matrox cards were mainly used for work and content creation; Voodoo and such were for the gamers. I had a 1280x display since the 21" Artisans were too bulky and expensive for me at the time. After that the Apple Cinema LCD displays took over and the Matrox cards started to fade as video went fully digital. I started buying Eizos back then and used whatever video card worked well enough in 2D. It didn't matter much until more recent programs started using the graphics accelerators just to do basic image processing.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
sed is unusually interested in retro-computing. It's very challenging to build and maintain 30-year-old hardware, so more power to him.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
You only have to take a look at my signature and post history to know my specialty is with legacy equipment. Most of it predates me, yes.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
RTX 30 series cards are more abundant lately and prices are becoming more reasonable. I'm tempted to go for the 3080 this summer rather than wait for the 4070, unless the 4070 turns out to be smaller and uses significantly less power for the same throughput. When will we have better info on the 40 series?
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
I thought they were all Nvidia? Do you have a link to a store that has them?
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
I thought they were all Nvidia? Do you have a link to a store that has them?
Handy means it's an actual nVidia-manufactured card, PCB and all. Kind of like what 3dfx did around 2000 when they stopped partnering with AIBs, except nVidia of course still does partner with them.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
I think I looked at those a while ago, but they're sold only through Best Buy and not in stock or orderable. :(
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
The most current rumors suggest that the 4070 will be a 300W GPU, with the 4080 at 420W and the 4090 at 450(!)W.
That suggests to me that the upper ceiling on the class of card I'd be willing to buy is the 4070.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
I'm curious to see what the performance numbers will be on the 4070 compared to a 3080, given the reduced wattage. Those numbers are lower than earlier rumors, but that's still a lot of heat to deal with.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
I don't know that I'm buying a modern high-end card at all until they get power consumption back under control. 300W for a tier of card that used to draw only 150 is completely ridiculous, especially now that I'm paying my own electric bill -- first for the power it'd draw itself, and second for the power it'd take to run the AC to bring the temperature of the office back down after running that miniature space heater. Are we seriously entering an era where PC power supplies need the special 20-amp IEC connector that was on the PowerMac G5? Between GPUs and Intel's finest still drawing nowhere near as little power as they used to, I'd say we'll be there within a few years if current trends aren't reversed.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
The 3070 was a 220W GPU, not 150W. The 4070 is slated to have considerably more cores than the 3070 and is rumored to be on par with the 3090 (450W~480W) for performance. If the performance rumors hold true, you're getting a considerable bump in speed with a large reduction in wattage compared to the former 3090.

What you really want to compare the 3070 to is likely a 4050 or 4060 for watt-for-watt performance, once we see benchmark numbers and real-world data.
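
Just to put the rumor math in one place -- these are only the figures floating around this thread, not measurements:

Code:
# Perf-per-watt comparison built solely from the rumored numbers in this thread:
# a 4070 at ~300W delivering roughly 3090-class performance, with the 3090 taken at ~450W.
# Purely illustrative, not benchmark data.
watts_3090, watts_4070 = 450, 300
perf_3090 = 1.0           # normalize 3090 performance to 1.0
perf_4070 = perf_3090     # rumored rough parity
gain = (perf_4070 / watts_4070) / (perf_3090 / watts_3090)
print(f"Rumored perf/W gain vs. the 3090: {gain:.2f}x")  # 1.50x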
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
The 3070 was a 220W GPU, not 150W.
I was referencing my 1070, which is a 150W GPU, and its specific budget placement, not necessarily its performance level. I'm well aware that performance per watt typically goes up; I just find it unreasonable for a mid-to-high-end part to take more than 200W to run. Even that much is a bit of a stretch -- I remember people being up in arms because the R9 390X took 275W, and that was positioned as AMD's highest-end part at the time.

My office is already the hottest room in my apartment. I'd prefer not to make that situation even worse. And power's only getting more expensive around here.
 
Last edited:

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
The point is, power consumption for these parts is getting wildly out of control. A part positioned as mid-to-high-end, like the 4070, has absolutely no business being a 300W card when energy prices are only getting worse, even if it will outperform a 3090 per your rumor source. I won't be buying one for that reason alone, even if I were in the market for a new card. Power supplies are already having trouble handling the peak numbers of the 3000 series, and I doubt nVidia fixed that particular issue for the 4000 series either, so 300W is probably a pretty conservative "most of the time" number.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,168
Location
Flushing, New York
At my current electrical rates a 300W card operating even 4 hours a day would cost $11 a month. That's not counting any additional AC use to remove the extra heat. How the heck do you even cool something like that? Air cooling isn't going to work, not without the system sounding like a vacuum cleaner.
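
For anyone who wants to sanity-check that figure, the arithmetic is simple; the $0.30/kWh rate is an assumption that happens to land near my bill, so plug in your own.

Code:
# Monthly running cost of a 300W card used 4 hours a day.
# The $0.30/kWh rate is an assumed figure; substitute your own utility rate.
watts, hours_per_day, days = 300, 4, 30
rate_per_kwh = 0.30
kwh_per_month = watts / 1000 * hours_per_day * days
print(f"{kwh_per_month:.0f} kWh/month -> ${kwh_per_month * rate_per_kwh:.2f}/month")  # 36 kWh -> $10.80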

If everything is getting more efficient in terms of computations per watt, I would think low- and mid-range cards would be using less power, not more.

This is one reason I'll stick to APUs. My system uses about 50 watts total most of the time according to the Kill-A-Watt. If I'm running a train simulator that might go to 135 watts. Graphics performance is plenty for my needs. The newest APUs have about twice the performance of mine for the same power consumption.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,737
Location
USA
The point is, power consumption for these parts is getting wildly out of control. A part positioned as mid-to-high-end, like the 4070, has absolutely no business being a 300W card when energy prices are only getting worse, even if it will outperform a 3090 per your rumor source. I won't be buying one for that reason alone, even if I were in the market for a new card. Power supplies are already having trouble handling the peak numbers of the 3000 series, and I doubt nVidia fixed that particular issue for the 4000 series either, so 300W is probably a pretty conservative "most of the time" number.
It's a matter of perspective and of realizing that this is the typical evolution and cadence every silicon-making company goes through. Yes, I've agreed that these high wattage requirements are getting out of hand and I don't like it either, but the other side is that the new cards from the 4070 up are going to be insanely fast relative to everything else, including that 1070 you referenced. If you can't make use of their performance, there's no benefit in getting one anyway, regardless of the wattage requirements.

This is unfortunately nVidia's evolution cadence to make progress and sell more product (just as AMD has done in the past). AMD absolutely went through this back in 2013 to stay competitive (barely). I still have my R9 290X that would pull 350-380W during peak usage and run at 100C. AMD (like nVidia) has made huge strides in performance and efficiency since then; it just takes time.

Power supplies are not having trouble keeping pace with the 3000 series. I'm literally using an almost 10-year-old Seasonic PSU with my 3080 and an R9 3950X, both of which are high energy consumers, and there are no issues with the PSU.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
Based on what I've seen, the Mac Studio with the M1 Max (32-core GPU) option is about equivalent to or just below an RTX 3060... (depends on the benchmark, API, etc.)
An M1 Max based Mac Studio draws 115W max according to official specs: https://support.apple.com/en-au/HT213100
A quick glance at some RTX 3060 reviews puts it at 190W peak draw for the GPU alone...

Note: I'm not promoting the Mac over a PC, just highlighting that a lot more can be done to bring down power draw on GPUs.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
The M1 Max with 32 GPU cores is the lower end. The M1 Ultra chip has up to 64 GPU cores so that would be like a 3070Ti?
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,590
Location
Eglin AFB Area
It's a matter of perspective and of realizing that this is the typical evolution and cadence every silicon-making company goes through. Yes, I've agreed that these high wattage requirements are getting out of hand and I don't like it either, but the other side is that the new cards from the 4070 up are going to be insanely fast relative to everything else, including that 1070 you referenced. If you can't make use of their performance, there's no benefit in getting one anyway, regardless of the wattage requirements.

I can make use of it but don't think it's worth the money or the power draw. It may be worth it to you, and more power to you, but I'm not buying until they can get power consumption back below 200W. Granted, you can buy one of these cards and then run it with a power limit, but that's kind of cheating, plus you don't quite get the performance they're touting for these cards with the brakes off.
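
For reference, capping the board power only takes a couple of nvidia-smi calls. This is just a sketch assuming the stock NVIDIA driver tools are installed; the 200W figure is an example, and setting the limit needs admin/root rights.

Code:
import subprocess

# Show the current, default, and min/max enforceable power limits for GPU 0.
subprocess.run(["nvidia-smi", "-i", "0", "-q", "-d", "POWER"], check=True)

# Cap GPU 0 at 200W (example value; must fall within the limits reported above).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "200"], check=True)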

This is unfortunately nVidia's evolution cadence to make progress and sell more product (just as AMD has done in the past). AMD absolutely went through this back in 2013 to stay competitive (barely). I still have my R9 290X that would pull 350-380W during peak usage and run at 100C. AMD (like nVidia) has made huge strides in performance and efficiency since then; it just takes time.

Power supplies are not having trouble keeping pace with the 3000 series. I'm literally using an almost 10-year-old Seasonic PSU with my 3080 and an R9 3950X, both of which are high energy consumers, and there are no issues with the PSU.

That's not what Gamers Nexus has to say about it. Granted, I still think it's a reporting problem on nVidia's end and they're just trying to pass the buck along. I'm glad you're not having issues, Handy, but they still exist.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
At my current electrical rates a 300W card operating even 4 hours a day would cost $11 a month. That's not counting any additional AC use to remove the extra heat. How the heck do you even cool something like that? Air cooling isn't going to work, not without the system sounding like a vacuum cleaner.

If everything is getting more efficient in terms of computations per watt, I would think low- and mid-range cards would be using less power, not more.

This is one reason I'll stick to APUs. My system uses about 50 watts total most of the time according to the Kill-A-Watt. If I'm running a train simulator that might go to 135 watts. Graphics performance is plenty for my needs. The newest APUs have about twice the performance of mine for the same power consumption.
A 300W card will generally have three fans and be over a foot long, so the airflow is spread out and it isn't all that noisy.
I don't know about trains, but the APU in commercial aircraft is way louder than a video card. ;)
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
The M1 Max with 32 GPU cores is the lower end. The M1 Ultra chip has up to 64 GPU cores so that would be like a 3070Ti?
Depending on the benchmark, that's about right... I've seen it range from a 3070 Ti at the bottom end to easily beating a 3090 in some (the wins over the 3090 were mainly in the video encode/decode area). Gaming-wise, I would expect closer to the 3070 Ti as mentioned. IMO that's quite an achievement, since the M1 Ultra Mac Studio only draws sub-250W for the entire system...

IMO nVidia (and AMD, somewhat) are a little stuck in architecture design: they're relying on operating frequency (which typically means higher power draw) for performance and on node shrinks to keep the power under control, rather than rethinking the design to reduce power draw at the current node size. It reminds me of the late P4 era at Intel; it wasn't until the original Core series that power draw (and heat) were brought back to what many would consider reasonable.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,168
Location
Flushing, New York
A 300W card will generally have three fans and be over a foot long, so the airflow is spread out and it isn't all that noisy.
Yeah, but you still have to get that heat out of the case, which implies a lot of case fans. I don't even have one case fan. The fan from the power supply provides enough air flow to remove the heat from the case without unacceptable temperature rise.
I don't know about trains, but the APU in commercial aircraft is way louder than a video card. ;)
APU = Accelerated Processing Unit (AMD's fancy name for integrated graphics), although I get the joke.

Trains generally don't have APUs. There's a separate transformer to supply heating/AC/lighting to the passenger cars; typically this is referred to as "head-end power". For example, the ACS-64 has 970kW of head-end power. That and the power for traction come from the overhead catenary and are stepped down to the proper voltages by the traction and head-end transformers.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,562
Location
I am omnipresent
nVidia is at the point where AMD was about five years ago. They are currently trying to brute force a low-efficiency design into better and better performance by relying on die shrinks for real improvement. TSMC is delivering, but it's pretty clear they're going to need a new architecture if they're going to get any better. AMD is doing chiplets with a fast interconnect now, but I don't know enough to say whether they're still figuring out how to squeeze everything out of that or if it's also approaching a dead end.

I suspect that the end goal is going to have to be a move to something more like an SoC with tighter component integration, but I don't know how that will work given the way x86 works. We did get a huge performance change when the memory controller moved inside the CPU and external communication changed to QPI/HyperTransport, but I don't know how much more of the chipset can go inside a CPU while we still have modular RAM and expansion. Apple ditched all that, and I think that's where it gets its performance and efficiency gains. I just don't want to deal with a system where I have to spend many thousands up front to get all the RAM and storage I'll ever need.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
16,624
Location
USA
Yeah, but you still have to get that heat out of the case, which implies a lot of case fans. I don't even have one case fan. The fan from the power supply provides enough air flow to remove the heat from the case without unacceptable temperature rise.

APU = Accelerated Processing Unit (AMD's fancy name for integrated graphics), although I get the joke.

Trains generally don't have APUs. There's a separate transformer to supply heating/AC/lighting to the passenger cars; typically this is referred to as "head-end power". For example, the ACS-64 has 970kW of head-end power. That and the power for traction come from the overhead catenary and are stepped down to the proper voltages by the traction and head-end transformers.
I've seen a few electric trains in Eastern cities and of course in Europe. However, I thought most US locomotives burned fuel.

I built a µATX system last year with the AMD 5700G. I'm not sure it counts as an APU, but it has integrated Radeon graphics. It's only a 65W CPU all in, so the Wraith cooler is fine.

My main system is in an older, relatively open-meshed case with two fans in the front, one huge, slow fan on the side to cool the SSDs and PCIe cards, and one fan in the rear. The rear fan is in line with the two fans on the Noctua NH-D15 cooler. All three are linked to the CPU temperature and ramp up when the CPU is under load. The CPU is idling most of the time, so only a little airflow is usual. Likewise, the slow side fan and the front fans don't make much noise. I have so far avoided the newer cases that have that stupid glass side to display the guts and dust bunnies.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
And nVidia saw how poorly the AMD RX 6400 performs, said "hold my heatsink", and released the GTX 1630, which costs more than an RX 6400 and performs worse.

Most benchmarks put the GTX 1630 in a similar performance bracket as the old GTX 1050 (non Ti).

Interesting take: https://www.neowin.net/news/best-bu...s-gtx-1650-vs-rx-6400-vs-6500-xt-vs-arc-a380/

(At the time of writing there are retailers selling the GTX1630 for US$199).
 