Video Cards

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
At our place and in most of the cities in Denmark we get hot and cold water from the city. The water traditionally has been heated by burning waste, but as there is less waste these days some cities are implementing massive building-sized heat pumps to heat the water. Once the hot water reaches your house or apartment you use water-water heat exchangers to pull the heat from the city water and heat your underfloor heating and any hot water requirements. That is why I didn't do anything like that for my place.

And I haven't done anything with my PC yet, as we aren't sure where we'll end up. Current visa is good for another 3 months, working on options.

My friend lives in the country, and still has an independent heat pump water heater. The heat pump units are way more energy efficient than the tankless units, but take longer to heat the water, so they still need tanks.

The heat pump tech is pretty cool.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
I'm surprised they don't use tankless water heaters in Europe.
I looked into that first for the kitchen sink. They have ludicrous power requirements, starting at well over 5 kilowatts. That doesn't even get you a great flow rate. If your incoming water is 40°F (typical in NYC winters) and your outgoing is 110°F, 5 kW only gives you ~0.5 gpm, maybe twice that with summer water temps. OK to wash your hands, maybe clean lightly soiled dishes, but that's about it. To get enough for washing dirty pans and stuff you're probably talking 15 kW or more. For showering (30 to 60 gpm) you get well into the 100 kW to 200 kW area. Granted, a tankless system using heat pumps instead of resistance heating can cut these numbers by a factor of 3 or 4, but in most cases you're still looking at installing a 240VAC circuit.

Tankless water heaters with batteries might be an idea. For every minute of hot water at 1 gpm you would need around 100 to 200 W-hrs of battery capacity, depending upon incoming water temperature. Not sure if the cost/benefit ratio would pan out but at least you could use a 120VAC, 15A circuit.
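
For anyone who wants to sanity-check those figures, here's a quick back-of-the-envelope sketch. It assumes plain resistance heating at 100% efficiency and a nominal 2.5 gpm shower head; those assumptions are mine, just for illustration.
Code:
# Quick sanity check of the tankless numbers above: plain resistance heating,
# 100% efficiency, US units (gallons per minute, degrees F).
BTU_PER_GAL_PER_F = 8.34        # a gallon of water weighs ~8.34 lb; 1 BTU heats 1 lb by 1 F
WATTS_PER_BTU_PER_HR = 0.293    # 1 BTU/hr is about 0.293 W

def heater_kw(gpm, delta_f):
    """Electrical power (kW) needed to heat `gpm` gallons/minute by `delta_f` F."""
    btu_per_hr = gpm * 60 * BTU_PER_GAL_PER_F * delta_f
    return btu_per_hr * WATTS_PER_BTU_PER_HR / 1000

def battery_wh_per_minute(gpm, delta_f):
    """Battery energy (Wh) drained per minute of hot water at that flow and rise."""
    return heater_kw(gpm, delta_f) * 1000 / 60

print(round(heater_kw(0.5, 70), 1))         # ~5.1 kW: 0.5 gpm with a 40F -> 110F winter rise
print(round(heater_kw(2.5, 70), 1))         # ~25.7 kW: one shower head, assumed at 2.5 gpm
print(round(battery_wh_per_minute(1, 70)))  # ~171 Wh per minute at 1 gpm with winter inlet water
print(round(battery_wh_per_minute(1, 40)))  # ~98 Wh per minute with warmer summer inlet water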
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
At our place and in most of the cities in Denmark we get hot and cold water from the city. The water traditionally has been heated by burning waste, but as there is less waste these days some cities are implementing massive building-sized heat pumps to heat the water. Once the hot water reaches your house or apartment you use water-water heat exchangers to pull the heat from the city water and heat your underfloor heating and any hot water requirements. That is why I didn't do anything like that for my place.

And I haven't done anything with my PC yet, as we aren't sure where we'll end up. Current visa is good for another 3 months, working on options.

My friend lives in the country, and still has an independent heat pump water heater. The heat pump units are way more energy efficient than the tankless units, but take longer to heat the water, so they still need tanks.

The heat pump tech is pretty cool.
Sounds like a much better idea than every home having a furnace and water heater. Are there any heat exchangers to recover some heat from wastewater? In theory, if you could recover most of the heat, then once you "prime" things with some hot water to get started you would need to add very little additional heat to maintain a continuous flow of hot water.

Heat pump water heaters are catching on in the States, but for now gas, oil, or resistance heating is still far more common.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
Of course we are now totally off the topic, but I meant like the Natural Gasses. Electric tankless is good for a sink or something.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
We are totally off topic, but to answer JTR's question, the hot water from the city is an isolated closed-loop system. You control its flow rate through the heat exchangers at your place to heat your own water to your desired temperature. The city hot water is supplied at 80°C or above, and they want you to return it at 40°C or below to maximize the efficiency of the infrastructure.

As such, the hot water is actually billed in kWh, since all you are doing is extracting energy.
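
To make the kWh billing concrete: the energy you pull out is just water mass × specific heat × temperature drop. A minimal sketch, where the 80°C/40°C figures are the ones above and the volume is just an example:
Code:
# Energy extracted = mass x specific heat x temperature drop.
# The 80C supply / 40C return figures are from the post above; the volume is just an example.
WATER_KJ_PER_KG_C = 4.186   # specific heat of water
KWH_PER_KJ = 1 / 3600

def extracted_kwh(litres, supply_c, return_c):
    """Heat (kWh) pulled from `litres` of district water cooled from supply_c to return_c."""
    kj = litres * WATER_KJ_PER_KG_C * (supply_c - return_c)   # 1 litre of water ~ 1 kg
    return kj * KWH_PER_KJ

print(round(extracted_kwh(1000, 80, 40), 1))   # ~46.5 kWh billed for one cubic metre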
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
Maybe they can run some city cold water like air conditioning in the summer. ;)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Leakers seem to think that the RTX 5090 and/or 5080 will be announced this June, with availability in Q3 2024. The stated reasoning for this is increasing competition from AMD for AI hardware necessitating a speedy launch of next-gen nVidia chips more generally. So if you really want to get the plumbing ready for your turbo water heater upgrade, here it comes.

I popped an A770 in the Ryzen 5900 PC in my living room over the weekend. My partner says Lightroom AI Denoise is a couple seconds faster per image (~16s rather than ~19s, averaged over 100 processed files) than it was with the RX 6650, and otherwise she can't tell the difference for creative software. It's also subjectively worse at Fortnite, though, which I thought was one of those runs-on-a-potato esports titles.
 
Last edited:

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
I'm still thinking about performance per watt and absolute power rather than the product level numbers. 300W is about my personal limit, so that may not be a 5080.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
It might be if money isn't as much of a thing? I've seen really good performance from my rig even with the max power draw turned way down. It is certainly not performance/$, but performance/watt is reasonable, and if you do find yourself in a hurry at some point you can just turn it up. Your mileage may vary, as I've found myself only turning it down on hot days.

It is likely this tweak will also work on the 5090:
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
I ruled out the 4090 years ago since the behemoth won't fit without hogging the adjacent 8x/16x PCIe slots. Power is mostly a UPS and home circuits issue, not costs. My GPUs are either at very low power or at full power when DXO is processing the files. I'm not playing video games like the German on the U-Tubes. :LOL:
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
It strikes me that we might all wind up using eGPUs if the future is four-slot 600W graphics hardware. Just build the water cooling and 6 kg of finned copper radiator directly into the device and call it a day.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
I'm thinking the reason for these monster graphics cards might have to do with the fact iGPUs have gotten good enough for most people to not need a graphics card. They essentially fill the niche which used to be taken up by lower end graphics cards. I guess the reasoning goes, why make a $50 or $100 graphics card which is only twice as fast as an iGPU? Probably not much of a market there. They figure if you're going to get a separate graphics card, then you want to go BIG. At least 10x the performance of the best iGPU, with a heat sink to match.

Also, these heat sinks are works of art in and of themselves. I'm sure that has something to do with it as well.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
I suspect that the day we get APUs that are actually somewhat competitive with current-gen consoles, we'll be able to say they're good enough for most people. That isn't the case yet, but supposedly AMD Strix Point, scheduled for release in 2025, will be able to deliver 4070-like performance on an APU. For comparison, a PlayStation 5 operates at somewhere around RX 6700 or RTX 2070-level performance.

Of course, by 2025 we should be looking at some kind of successor to current consoles anyway, but it's looking like things are going to get better for everyone pretty soon.

I look at videocardz.com as a rumor aggregator.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
I'm thinking the reason for these monster graphics cards might have to do with the fact iGPUs have gotten good enough for most people to not need a graphics card. They essentially fill the niche which used to be taken up by lower end graphics cards. I guess the reasoning goes, why make a $50 or $100 graphics card which is only twice as fast as an iGPU? Probably not much of a market there. They figure if you're going to get a separate graphics card, then you want to go BIG. At least 10x the performance of the best iGPU, with a heat sink to match.

Also, these heat sinks are works of art in and of themselves. I'm sure that has something to do with it as well.
Most all consumer computers will need an NPU function for AI by 2025, so the older wimpy ones will be obsolete.
The chiplet model might have a CPU, GPU, and NPU in one package like Merc said. The hardware/OS cabal is demanding that you upgrade. :LOL:
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,719
Location
Horsens, Denmark
I've read in many places that entry-level standalone GPUs are a sucker's game already, with iGPUs beating or nearly beating them in most scenarios. The current AMD iGPU seems good enough for a lot of use cases, as seen in many of the handheld gaming consoles. The last several laptops I've had used some kind of technology where they switched from dedicated graphics to the iGPU when 2D content (web browsers or productivity apps) was in use. I wouldn't mind such things for desktops.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
I'm not knowledgeable about the efficiency gained by having a single package, but does a 65-105W "APU" outperform the video graphics of a 115W or 160W discrete card plus a 65W CPU, for example? Maybe it's not fair to compare a 2022 40-series GPU to 2024 integrated video technology, so think of a 50-series with the same power limits. The question is how many watts of integrated graphics equal how many watts of discrete.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
I'm not knowledgeable about the efficiency gained by having a single package, but does a 65-105W "APU" outperform the video graphics of a 115W or 160W discrete card plus a 65W CPU, for example? Maybe it's not fair to compare a 2022 40-series GPU to 2024 integrated video technology, so think of a 50-series with the same power limits. The question is how many watts of integrated graphics equal how many watts of discrete.
Isn't quite a bit of power used just sending data to and from a discrete GPU? When everything is on one chip data transfer is a lot more efficient. It's easily possible that a 65 to 105 W APU can come close to the performance of a 65W CPU and ~100W graphics card as a result. Even if the discrete solution is better, the question is by how much? 1.5x? 2x? Probably not worth the hassle of a separate graphics card unless you're at least maybe 3x better than an APU. There's the extra power use, plus another noisy fan. Back in the day when you had no choice but to have a separate graphics card, I always went to the passively cooled ones. There's literally nothing passively cooled these days. You just can't do it when you're talking at least 100 watts of heat, at least not in any reasonable size.

My system (less the monitors) uses about 45 to 60 watts from the wall most of the time. My ear has to be within 6 inches of the case to hear anything at all. That's how I like it.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Most all consumer computers will need an NPU function for AI by 2025, so the older wimpy ones will be obsolete.

I still wanna know why we need an NPU. Much as I love letting ChatGPT write 90% of a script for me, it's not like I'm running a local LLM to do that.

I've read in many places that entry-level standalone GPUs are a sucker's game already, with iGPUs beating or nearly beating them in most scenarios... I wouldn't mind such things for desktops.


They are a sucker's game, with the nVidia x50 and sub-x500 Radeon hardware largely unchanged year over year. We still have Xeons, Threadrippers, -F Intel SKUs, and AM4 Ryzens that don't have internal graphics, but chances are that if you bought one of those CPUs on purpose, you probably weren't planning to use an iGPU anyway. Maybe you needed the budget card because the iGPU didn't support three outputs or you had to have NVENC instead of Quick Sync, but that's about it.

Even if the discrete solution is better, the question is by how much? 1.5x? 2x? Probably not worth the hassle of a separate graphics card unless you're at least maybe 3x better than an APU.

A handy thing to know about is the TechARP GPU database.

But as an example, the Ryzen 5 7600 (non-G) has 2 RDNA 2 compute units in it, which should put it somewhere in the generational ballpark with a Radeon RX 5x00 or 6x00. Even the crappiest discrete card, the RX 5500, had 22 RDNA compute units and 4GB of dedicated GDDR6, so you know it's not going to be a pretty comparison. Even estimating roughly, the iGPU is probably going to be around 1/10th the performance in the best case, and that's generously assuming we'd buy a card we already know is a poor value for money.
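
If you want to see where that rough 1/10th figure comes from, a crude compute-units × clock proxy gets you most of the way. It ignores memory bandwidth (shared DDR5 vs. dedicated GDDR6), which only makes the iGPU look worse, and the clocks below are approximations rather than official specs:
Code:
# Crude first-order proxy for the "around 1/10th" estimate: compute units x clock.
# Ignores memory bandwidth, which only hurts the iGPU further. Clocks are approximate.
def relative_throughput(compute_units, clock_ghz):
    return compute_units * clock_ghz

igpu = relative_throughput(2, 2.2)      # Ryzen 7000 desktop iGPU: 2 CUs at roughly 2.2 GHz
rx5500 = relative_throughput(22, 1.7)   # RX 5500: 22 CUs at roughly 1.7 GHz

print(round(rx5500 / igpu, 1))          # ~8.5x, i.e. the iGPU lands in the ~1/10th ballpark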

Is it worth the 150W or 220W to move up to a 7600 or 7700 over the integrated graphics? Probably, if you remotely care about graphics/3D/AI. It's not just 2x or 3x performance benefit, at least not for desktop CPUs and discrete GPUs.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
I still wanna know why we need an NPU.
Same here.
Is it worth the 150W or 220W to move up to a 7600 or 7700 over the integrated graphics? Probably, if you remotely care about graphics/3D/AI. It's not just 2x or 3x performance benefit, at least not for desktop CPUs and discrete GPUs.
I'm going by how often you would actually need that extra performance. Take my use case. About the only thing I currently do which taxes my iGPU is play Open Rails. Even then, 90% of the time I'm getting 40 to 60+ FPS, with 30s most of the rest of the time. Maybe a few percent of the time frame rates drop annoyingly low, like into the teens, in scenery-dense areas. Keep in mind this is with an A10-7870K, which was state of the art close to a decade ago. On my laptop with a newer Intel iGPU, those same scenes which were ~15 FPS on my desktop only drop to 25 FPS. All the other stuff is proportionally higher, as in close to or over 60 FPS much of the time. A current-generation Ryzen APU would probably put me over 60 FPS all the time. So why would a person like me waste money and a few hundred watts of power when there would be little or no benefit? Even my desktop APU only gets annoyingly limited for a few percent of the time I'm using one particular program, or well under 1% of my total use.

I could imagine hard-core gamers playing the latest games might use the extra graphics horsepower often enough to justify it, but that's not me. It's not even most people.

A big factor which might keep me from even wanting to go with a better APU and motherboard for my desktop in the near future is the fact that drivers for Win7 don't exist for anything much newer than my current setup. I'm not starting from scratch again with Windows 11. I went through that when I went from XP to 7. It takes literally months to get the system set up. And some things stopped working going from XP to 7. I hate to think how much would stop working going from 7 to 11. A faster system where I can't use half my current software isn't particularly useful to me. The only reason I can see to upgrade is if I suddenly have a use case for software which needs a much more powerful system. And then I'd still use my current desktop for most stuff, but fire up the new rig when I'm using software which just won't run well on the older machine.

Thinking about it, for now my laptop fills the need for a more modern Win 11 system if I find software that just can't work on my current desktop. And the 3K OLED screen is really nice.
 
Last edited:

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
I'm going by how often you would actually need that extra performance.

The standard line for AMD for a long time has been that its integrated graphics tend to be substantially better than whatever Intel happened to drop in a diaper. It was entirely valid for a long time: AMD integrated graphics was double or triple what Intel offered for the same money. That's not actually true any longer. Iris Xe isn't substantially worse than what's been stapled on to contemporary AMD CPUs (it's something like 40 - 50% worse, where it used to be 2x or 3x worse, a big real world improvement).

We don't know what people are doing with graphics generally. We also don't know what people will be doing in three years (AI! We're all gonna have Google Bard running everywhere! Is it still called Bard this week? Who cares? CoPilot will be in our watches by 2026! Also we're going to go back to wearing watches, so we can get extra CoPilot!). But bear in mind that your GPU also does things like decode video and in some cases even render effects on a desktop. We can say that it's probably better to have some extra capability for those things even if the end user doesn't directly or immediately care. From that line of thinking, it's better to have something than not, especially for a system a user is planning to keep to the point of actual obsolescence, which does tend to be the way home users are with their systems nowadays.

The difficulties present in moving from one Windows system to another are a whole other subject that I'll admit I don't typically see as a problem, but dealing with those transitions has been my job for my entire adult life. It's probably worthy of a different topic, especially given that probably none of us nominally interested hobbyists and pros are at all enthusiastic about contemporary OS offerings from Microsoft.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
The standard line for AMD for a long time has been that its integrated graphics tend to be substantially better than whatever Intel happened to drop in a diaper. It was entirely valid for a long time: AMD integrated graphics was double or triple what Intel offered for the same money. That's not actually true any longer. Iris Xe isn't substantially worse than what's been stapled on to contemporary AMD CPUs (it's something like 40 - 50% worse, where it used to be 2x or 3x worse, a big real world improvement).
Well, that explains a lot. Honestly, when I first tried running Open Rails on my laptop, I crossed my fingers that the performance would at least match that on my desktop. When I saw it was probably 2x better, I was pleasantly surprised. I didn't know Intel closed the gap with AMD this much.
We don't know what people are doing with graphics generally. We also don't know what people will be doing in three years (AI! We're all gonna have Google Bard running everywhere! Is it still called Bard this week? Who cares? CoPilot will be in our watches by 2026! Also we're going to go back to wearing watches, so we can get extra CoPilot!). But bear in mind that your GPU also does things like decode video and in some cases even render effects on a desktop. We can say that it's probably better to have some extra capability for those things even if the end user doesn't directly or immediately care. From that line of thinking, it's better to have something than not, especially for a system a user is planning to keep to the point of actual obsolescence, which does tend to be the way home users are with their systems nowadays.
Here's the thing. Yes, having that extra capability if needed makes sense. The question is at what cost in terms of extra money, noise, and power consumption. Here's my idea for that. First off, keep the power dissipation under maybe 150 watts maximum. And use the entire side of a case to passively cool the GPU. Yes, it'll be a huge hunk of metal weighing probably 10 pounds, but you can passively cool 150 watts while keeping temperatures reasonable with something that size.
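
As a rough feasibility check (my own illustrative temperatures, not measurements), all the heatsink needs is a low enough sink-to-air thermal resistance for the temperature rise you're willing to tolerate:
Code:
# Rough feasibility check for passively cooling ~150 W off a case-sized heatsink.
# The temperatures here are illustrative assumptions, not measurements.
def required_thermal_resistance(power_w, max_temp_c, ambient_c):
    """Sink-to-air thermal resistance (C/W) needed to hold max_temp_c at power_w."""
    return (max_temp_c - ambient_c) / power_w

# Allow the heatsink to reach 70C in a 30C case interior:
print(round(required_thermal_resistance(150, 70, 30), 2))   # ~0.27 C/W
# Large fanless-case heatsink panels get down into roughly that range,
# so a side-panel-sized slab of finned metal is plausible, just heavy.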

Next, still have an iGPU to use most of the time. The idea is the discrete GPU gets turned completely off, as in 0W, unless you need graphics HP beyond what the iGPU can deliver. That saves a lot compared to a card which might idle at 20 or 30 or 40 watts.

So we've taken care of the noise and power consumption issues with a separate GPU. Last issue is cost. I'm neglecting the cost of the huge heatsink. If we went this route, you would get a case with a heatsink, but be able to use that same heatsink if you changed GPUs. So basically you have a board that bolts on to the heatsink, with some sort of connection to the motherboard. If you can make the board for $50 or less, $75 tops, you have a winner that even people like me would buy.

EDIT: Another option is the board goes in a slot same as any other graphics card, but uses heat pipes to transfer the thermal load to the PC case heatsink.

A noisy, power-hog GPU costing several hundred dollars or more? Forget it. I'm not putting one of those in my system. Electricity costs $0.35+ per kW-hr here in NYC. I just can't afford the power, never mind the purchase price.
The difficulties present in moving from one Windows system to another are a whole other subject that I'll admit I don't typically see as a problem, but dealing with those transitions has been my job for my entire adult life. It's probably worthy of a different topic, especially given that probably none of us nominally interested hobbyists and pros are at all enthusiastic about contemporary OS offerings from Microsoft.
Definitely another topic. I just mentioned it because starting from scratch after you've "moved in" to a system is really difficult for people who don't do that stuff for a living. And IMO Windows OSes reached a pinnacle with 7. If the newer OSes kept a similar user interface, ditched the spyware, ditched crap like CoPilot, OneDrive, mandatory MS accounts, etc., plus allowed you to upgrade while keeping all your existing apps, we might all be more amenable to them. I don't want my PC to look and act like a smart phone.
 
Last edited:

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
If you are using some kind of archaic video games from pre-pandemonic eras, then the iNtel/AMD/NVidias probably don't care much about your market segment. Much of the new iNtel and AMD complicated integrated stuff is for thin/light laptops or low-grade desktops that don't get a video card. I'm definitely interested in what can be done to improve the GPU in the 28W segment. The ARC in the Ultra Meteoric Lake laptops is only moderately better than the Xe in the Raptor Lake-P, at least for content workloads. But faster than really slow is better than nothing. Denoising a 61MP image with AI is painful on my i7-1360P.

Is there some reason you are more than mildly concerned about power draw in a desktop system? I'd suggest a current MAC for an excellent balance of performance and efficiency.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
Is there some reason you are more than mildly concerned about power draw in a desktop system? I'd suggest a current MAC for an excellent balance of performance and efficiency.
Yes, noise and the cost of electricity. Plus in summers it's that much more heat the A/C needs to deal with. With my intermittent but all day long use patterns, my PC is on 24/7. The monitors shut off after 10 minutes. At ~45W while idling, that's OK. It comes to only ~30 kW-hr/month. Something with a humongous video card and larger power supply might idle at three times that or more. At current rates that's another $25+ a month for electricity.
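
The arithmetic, in case anyone wants to check it (the $0.35/kW-hr rate is the one I quoted earlier; the tripled idle figure is just a hypothetical):
Code:
# Idle-power cost arithmetic: 24/7 uptime at NYC-ish electric rates.
HOURS_PER_MONTH = 24 * 30
RATE_PER_KWH = 0.35   # $/kWh, the rate quoted earlier in the thread

def monthly_cost(idle_watts):
    kwh = idle_watts * HOURS_PER_MONTH / 1000
    return kwh * RATE_PER_KWH

print(round(monthly_cost(45), 2))                      # ~$11.34/month at a ~45 W idle
print(round(monthly_cost(135), 2))                     # ~$34.02/month if idle tripled to ~135 W
print(round(monthly_cost(135) - monthly_cost(45), 2))  # ~$22.68 extra, i.e. the "$25+" ballpark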

Then of course all this stuff costs $$$$ to buy. Same reason why I don't like MACs. For what they are they're crazy expensive.

I don't do stuff like denoise huge images, or anything remotely similar. If I ever need a graphics card, I'll know it.

Yes, I'm also interested in what can be done in the low-power area. Anyone can make a fast computer by throwing the output of a nuclear reactor at it. From what I've read, we're still about 9 orders of magnitude above what the theoretical minimum would be for computing power. In other words, systems like today's could in theory use microwatts.

With my mom passing, I'm on my own paying for everything. I kicked the heat off in early March, for example. 50° to 55°F indoor temps take some getting used to. I just insulated the attic, so hopefully things will be better next winter. I also got cellular shades for the windows, and I'm considering adding 2" of foam board insulation on top of the outside-facing interior walls. The walls are only around R6. The foam would bring that to about R20. The goal is a passive house which doesn't need heating except on the coldest days.
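
Rough math on what the wall upgrade buys (the R-values are the ones above; the wall area and temperatures are made-up examples):
Code:
# What going from ~R6 to ~R20 walls does to conduction loss. R-values are in
# Imperial units (ft^2*F*hr/BTU); wall area and temperatures are made-up examples.
def wall_loss_watts(r_value, area_sqft, delta_f):
    btu_per_hr = area_sqft * delta_f / r_value   # Q = U * A * dT, with U = 1/R
    return btu_per_hr * 0.293                    # BTU/hr -> watts

area_sqft, delta_f = 1000, 40   # e.g. 1000 ft^2 of exterior wall, 55F inside vs 15F outside
before = wall_loss_watts(6, area_sqft, delta_f)
after = wall_loss_watts(20, area_sqft, delta_f)
print(round(before), round(after))                 # ~1953 W -> ~586 W through the walls
print(f"{1 - after / before:.0%} less wall loss")  # 70% reduction (1 - 6/20)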
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Rather than look at Apple (once again, LM, can you at least not hold down the shift key when you type that?), we're very close to seeing mainstream ARM for Windows. Qualcomm says they have something that's competitive with Intel laptop CPUs that can get double the battery life and half the per unit cost. I'm not sure where those will be with regard to RAM or storage upgrades, but ARM Server boards use normal parts. I'm guessing these parts will be comparable to Snapdragon 1v3, which at least puts ARM/Windows systems in the same ballpark with i7 mobile and Apple Silicon.

It's something to keep an eye on.

There is precedent in the world of Windows for getting binary translation up and running. DEC Alphas running NT4 could chew on an x86 binary for a minute or so and then output a reassembled version native for that platform, which they would use instead of the provided one. I don't know if that trick came from DEC's CPU wizards (who got sold to Compaq, then HP, then traded to Intel and wasted on Itanium if they didn't move on) or Microsoft, but since we're talking about something that happened in the 90s, it's probably at least understood in concept by the engineers working on these things.

Anyway, Windows on ARM might be a big step forward on lightweight hardware.

(mostly for sed's benefit: NT4 ran on MIPS, PowerPC, and Alpha as well as x86, and there was at least one preview build of Windows 2000 for Alpha; some of my early work in IT was integrating Honeywell industrial control equipment with non-Intel NT4 workstations, mostly because customers expected RISC workstation hardware but the guys who had to look at it were only familiar with Windows and/or Solaris)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Windows 10 and 11 had ARM versions, although I suspect you'd have a hard time buying it that way. Standard product keys work if you happen upon a copy, though. If your memory is long enough, Windows 8 and 8.1 ALSO had ARM releases, but that was called Windows RT.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
Do you think I ever would have stooped to using 8 or Vistas for that matter? ;)
The numbnuts at MS destroyed the Nokia phonees with that unholy alliance. We had to use the Blackberries in that era for security purposes. Nobody trusted Windows phones.
Windows 10 and 11 had ARM versions, although I suspect you'd have a hard time buying it that way. Standard product keys work if you happen upon a copy, though. If your memory is long enough, Windows 8 and 8.1 ALSO had ARM releases, but that was called Windows RT.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Windows RT was actually used on low-power Surface tablets. I got familiar with them because a lot of POS systems tied into Microsoft Dynamics used them for ordering and credit card processing. They were weird little guys, but to their credit, they could play any media file you threw at them and they shipped with a full copy of MS Office 2013.

Windows 10/ARM is notoriously poorly optimized and awful. Apple Silicon reportedly ran x86-64 Windows in a VM faster than it could run Windows 10/ARM. I can only hope Windows 11/ARM is less of a joke, four years later.

Finally, give Windows 8 some credit. It had bar none the best desktop search ever to be found on Windows. Everything since has been a regression. I'd put it on par with Spotlight on a Mac. Between Windows 8.1 and 10, the biggest userland differences are 10's Start Menu and the Win-X shortcuts. Vista was a terrible OS, but 99% of everything people complained about with Windows 8 could be fixed with one of about a dozen 3rd party Start Menus.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
I've read many places that the entry-level standalone GPUs are a suckers game already, with iGPUs beating or nearly beating them in most scenarios. The current AMD iGPU seems good enough for a lot of use cases, as seen by many of the handheld gaming consoles. The last several laptops I've had used some kind of technology where it switched from dedicated graphics to the iGPU when 2D content (web browsers or productivity apps) are in use. I wouldn't mind such things for desktops.

I don't even have one of the "good" APUs -- mine's a Ryzen 4500U with I think Vega 8, and for everything I do on the regular -- mostly playing BTD6 and Fistful of Frags with my friends, on up to modded OpenMW and Skyrim Special Edition -- it's plenty. I haven't actually turned on my main desktop in about a month. My power bill has thanked me. It's a good thing I have a better GPU available when I do need it, but I haven't actually needed it for a while and I'm trying to go out of my way not to use it through this summer.

I've never had cause to play around with any incarnation of Windows for ARM. I kind of want an RT to screw around with, albeit not with Windows. I don't know that I ever will considering my budget for that particular misadventure is exactly zero dollars. I still think as long as Qualcomm has Windows exclusivity for ARM the idea's doomed, though, regardless of how well it performs.

Windows 8 was remarkably well optimized, I will give it that much. For all its UI/UX woes, it usually ran better than 7 on the same hardware and was still tolerable to use on spinning rust, which is a lot more than I can say for any version of 10 save perhaps LTSB 2015. I look back on it semi-fondly given you can get around the worst of its UI problems by just installing Classic Shell.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,348
Location
Gold Coast Hinterland, Australia
I still wanna know why we need an NPU.
To make stockholders happy that {insert company} is delivering on AI.
At a more fundamental level, LM mentioned the current primary reason (at least in the x86 world): AI-accelerated workflows, especially image and video cleanup or enhancement. Adobe especially seems to be adding generative functionality to their products (e.g., crop a couple out of one photo and place them in a scenic beach location).

If you want to look at gaming, on-device AI acceleration will allow greater interactivity with NPCs, letting more dynamic content be delivered back to the player and bringing greater depth to the world the player is in.

The thing is, Apple has been adding what they are now calling NPUs (Apple's Neural Engine) since the A11 (iPhone 8), and in all M-series CPUs. In Apple's case they are using it for a lot of assistive technologies in their products, e.g. a blind person can hold up a phone, point the camera at something, and have the phone tell them what it is seeing, or alternatively use voice generation functions to read books in the person's own voice. (Basically you let the iPhone sample your speech, it'll create a voice/speech model, and then, using the text recognition in the camera app, it can read back in that voice. This is a super handy feature for people who have very limited vocal functionality.) Apple have been very quiet about the technology side of it, instead focussing on the outcome for the user.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,348
Location
Gold Coast Hinterland, Australia
DEC Alphas running NT4 could chew on an x86 binary for a minute or so and then output a reassembled version native for that platform, which they would use instead of the provided one. I don't know if that trick came from DEC's CPU wizards (who got sold to Compaq, then HP, then traded to Intel and wasted on Itanium if they didn't move on) or Microsoft, but since we're talking about something that happened in the 90s, it's probably at least understood in concept by the engineers working on these things.
IIRC, the first Alphas did it all in software (called FX!32, all done by DEC), but later versions added some additional CPU instructions to assist the performance of the translated code. (IIRC Apple did the same with Apple Silicon; it has a few extra CPU instructions that assist Rosetta 2 in working as well as it does.)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
I thought Apple's deal was having multiple binaries in the same file, so that whichever architecture you were on could run the software without having to worry about compatibility? IIRC that's how it handled the 68000 to PPC and PPC to x86 transitions.

My partner has been messing with the A770-equipped PC quite a bit and it's been generally fine, albeit a lot better at WQHD than 4K. Is that a surprise? I don't think it is. She filled up a 2TB SSD with Steam and MS Game Pass titles and so far there haven't been any big surprises, at least for the titles she's tried. There has been some weirdness with games starting on the wrong display, and one game that launched to a black screen with audio but no graphics, but I've seen that sort of nonsense on Radeons as well. Adobe Suite stuff generally doesn't seem to be impacted by the switch from AMD to Intel for graphics, other than a few of the AI functions in Photoshop and Lightroom being a little faster, but my Topaz software generally runs a lot worse (like half speed), although it does detect the GPU and says it's using it.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,787
Location
Eglin AFB Area
Website
sedrosken.xyz
I thought Apple's deal was having multiple binaries in the same file, so that whichever architecture you were on could run the software without having to worry about compatibility? IIRC that's how it handled the 68000 to PPC and PPC to x86 transitions.

They do in fact tend to do that when they can -- the practice I believe is called fat binaries. However, Rosetta also directly translated PowerPC applications to run on Intel hardware while they continued to support it for stuff that hadn't yet been ported -- it was handled a fair bit like the Classic environment, from what I understand. People were given a cut-off date, and unless you stuck to an outdated version of OS X that still supported it, the ultimatum was that you had to abandon the applications and games you'd paid for that didn't make the jump. I imagine that's going to be exactly how this pans out, too, but Apple will be hailed as heroes or some-such for handling the transition as well as they did.

Microsoft doing something similar for amd64 to arm64 just means they have a new support layer like Windows on Windows that they have to support for all of eternity and forever, lest they offend the golden calf that is backwards compatibility.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
One thing I am noticing about the A770 is that it is using around 40W just idling. I thought maybe it might be because that system is also my Plex server and therefore has transcode jobs, but no, I killed Plex and all the Adobe stuff on that system and the GPU is still running kind of warm.
On the other hand, I can leave League of Legends open for an hour and it's not using (much) more power for doing that, even rendering hundreds of frames per second.

This is a PC where I have up to this point had well-behaved power management, so I think it's definitely the card. Apparently it's a Sparkle thing rather than an Arc thing. Hopefully they fix it eventually.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,365
Location
Flushing, New York
That's 5 to 10 watts less than what my entire system, less the monitors, pulls from the wall socket for most of my use. Graphics- or compute-intensive stuff can get me to around 125 watts, but that's relatively rare.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
Research says it's because I have three displays plugged into it. There are rules about when the hardware will even attempt to power down the video card, but since there are two 4K displays and a 4K TV connected to that thing, it's just not going to happen without some kind of update to firmware or drivers, even though I have all the settings right on the Windows side of things.
 

LunarMist

I can't believe I'm a Fixture
Joined
Feb 1, 2003
Messages
17,451
Location
USA
Wow, what are you using three 4K monitors for? They must be consuming far more power than the video cards.
I looked at 4K displays a few years ago and the good ones were astronomically priced.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,228
Location
I am omnipresent
My partner convinced one of the people she works for that she needed a pair of 32" Ultrasharp displays. She legitimately does do graphics work, print layouts etc. Does that need a 4k display? Kinda doubtful, but we got 'em.
The TV is where PC games get played.
 