The standard line on AMD for a long time has been that its integrated graphics are substantially better than whatever Intel happened to drop in a diaper. That was entirely valid for years: AMD's integrated graphics offered double or triple the performance of Intel's at the same price. It's no longer true. Iris Xe isn't substantially worse than what's stapled onto contemporary AMD CPUs; it's something like 40-50% behind, where it used to be a half or a third of AMD's performance, which is a big real-world improvement.
Well, that explains a lot. Honestly, when I first tried running Open Rails on my laptop, I crossed my fingers that the performance would at least match my desktop's. When I saw it was probably 2x better, I was pleasantly surprised. I didn't realize Intel had closed the gap with AMD this much.
We don't know what people are doing with graphics generally. We also don't know what people will be doing in three years (AI! We're all gonna have Google Bard running everywhere! Is it still called Bard this week? Who cares? CoPilot will be in our watches by 2026! Also we're going to go back to wearing watches, so we can get extra CoPilot!). But bear in mind that your GPU also does things like decode video and, in some cases, render desktop effects. We can say it's probably better to have some extra capability for those things even if the end user doesn't directly or immediately care. By that line of thinking, it's better to have something than not, especially in a system the user plans to keep until actual obsolescence, which does tend to be how home users treat their systems nowadays.
Here's the thing. Yes, having that extra capability in case it's needed makes sense. The question is at what cost in terms of extra money, noise, and power consumption. Here's my idea for that. First off, cap the power dissipation at maybe 150 watts. Then use the entire side of the case to passively cool the GPU. Yes, it'll be a huge hunk of metal weighing probably 10 pounds, but something that size can passively dissipate 150 watts while keeping temperatures reasonable.
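A quick sanity check on that claim, using my own assumed numbers rather than anything measured: say a 30 °C ambient and a 50 °C allowable rise at the heatsink. The sink-to-ambient thermal resistance you'd need is

$$ R_{\theta} = \frac{\Delta T}{P} = \frac{50\ \text{°C}}{150\ \text{W}} \approx 0.33\ \text{°C/W} $$

Around 0.3 °C/W is ambitious for pure natural convection, but plausible for a finned panel the size of a full case side; the existing big fanless cases play in roughly that class.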
Next, still have an iGPU and use it most of the time. The idea is that the discrete GPU gets turned completely off, as in 0 W, unless you need graphics horsepower beyond what the iGPU can deliver. That saves a lot compared to a card that might idle at 20, 30, or 40 watts.
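For what it's worth, hybrid-graphics laptops already do a rough version of this: on Linux, an idle dGPU gets runtime-suspended into D3cold when nothing is using it. A minimal sketch of how you'd check that state (the PCI address is a placeholder; substitute whatever lspci reports for your card):

```python
# Minimal sketch: check whether a discrete GPU has been runtime-suspended
# on Linux via the kernel's runtime power management interface in sysfs.
from pathlib import Path

GPU_PCI_ADDR = "0000:01:00.0"  # hypothetical slot; check yours with `lspci`

def gpu_power_state(pci_addr: str = GPU_PCI_ADDR) -> str:
    """Read the runtime-PM status for a PCI device.

    Returns "suspended" when the device is powered down (D3cold on cards
    that support it, i.e. effectively drawing ~0 W) and "active" otherwise.
    """
    status_file = Path(f"/sys/bus/pci/devices/{pci_addr}/power/runtime_status")
    return status_file.read_text().strip()

if __name__ == "__main__":
    print(f"dGPU {GPU_PCI_ADDR}: {gpu_power_state()}")
```

When that file reads "suspended" the card is effectively off; launching something on it wakes it back to "active". My proposal is basically this behavior, made universal on the desktop.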
So we've taken care of the noise and power consumption issues of a separate GPU. The last issue is cost. I'm leaving the huge heatsink out of that math because, if we went this route, you'd buy a case with the heatsink built in and keep using that same heatsink whenever you changed GPUs. So basically you have a board that bolts onto the heatsink, with some sort of connection to the motherboard. If that board can be made for $50 or less, $75 tops, you have a winner that even people like me would buy.
EDIT: Another option is that the board goes into a slot like any other graphics card, but uses heat pipes to move the thermal load to the case heatsink.
A noisy, power-hog GPU costing several hundred dollars or more? Forget it. I'm not putting one of those in my system. Electricity costs $0.35+ per kWh here in NYC. I just can't afford the power, never mind the purchase price.
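To put numbers on that, take a card idling at 30 W (the middle of the range I mentioned above) in a machine that's on around the clock:

$$ 30\ \text{W} \times 8760\ \text{h/yr} \approx 263\ \text{kWh/yr}, \qquad 263\ \text{kWh} \times \$0.35/\text{kWh} \approx \$92/\text{yr} $$

That's roughly $92 a year for a card doing nothing, before it renders a single frame. The 0 W-when-idle scheme makes that whole line item disappear.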
The difficulties of moving from one Windows system to another are a whole other subject. I'll admit I don't typically see them as a problem, but then dealing with those transitions has been my job for my entire adult life. It's probably worth its own topic, especially given that probably none of us, nominally interested hobbyists and pros alike, are at all enthusiastic about Microsoft's contemporary OS offerings.
Definitely another topic. I just mentioned it because starting from scratch after you've "moved in" to a system is really difficult for people who don't do that stuff for a living. And IMO Windows peaked with 7. If the newer OSes kept a similar user interface, ditched the spyware, ditched crap like CoPilot, OneDrive, and mandatory MS accounts, and let you upgrade while keeping all your existing apps, we might all be more amenable to them. I don't want my PC to look and act like a smartphone.