Everyone's cutting costs, and no one can make a product that doesn't kill itself to save their life. All anyone wants is your money; nobody cares about making a quality product and earning that money legitimately.
Maybe EVGA did make the right call exiting the GPU business. If the chips were costing that much, on top of all the other issues they had, how could they keep going on such thin profit margins without cutting costs somewhere?
I was surprised they could make and/or continue a viable business on those limited product lines. Maybe they're raising cash to move into other areas over time to offset the "loss" of GPUs. Given the thin margins, it's hard to say whether the GPUs eventually became a loss leader once they'd established a name for themselves.
The 4060 came out. It isn't particularly better than a 3060 Ti in any way until the AI frame generation stuff comes into play. The A770 and 6700 XT are both in the same price range. Intel released better Arc/Alchemist drivers just this week, and while the 6700 XT is 10 - 15% more expensive, it also has 50% more RAM and is flatly better than nVidia's $300 card.
At some point doesn't nVidia want to move everyone to one technology? The 30 series is getting old and lacks DLSS 3 frame generation. I read on the webs that a 50 series isn't expected until 2025. If $300 is too high and sales volumes are poor, then there will be sale prices. I'd really like to add a card like that to a (currently 5700G) system in a µATX case, if it fits fully within two slots.
Some analysis I read showed nVidia selling anything that gamers don't buy to AI startups. The first time they overpriced the gaming market, crypto bailed them out; now it's AI. They just don't need to care about the gaming market, especially not the price-sensitive end where the margins are low anyway.
Everything in the RTX 4000 series can be considered downgraded by one tier relative to past generations' standards of performance and pricing. The 4080, for example, probably should've been a 4070 Ti.
Apparently, the 7600 XT and 7700 XT take nVidia hardware to the woodshed at their respective price points. This is actually fantastic news. A 7700 XT with performance on par with an RTX 4070 that also has 16GB of RAM AND costs $150 less is what we're all looking for in a next-gen upgrade, I think.
It will only ever be available on nVidia hardware, since it's an nVidia technology and I doubt they're willing to license it out, at least (maybe) until nVidia goes completely mask-off and leaves the PC graphics card market for its greener AI pastures. nVidia benefits from leaving everyone else out in the cold, because it means a professional really only has one choice.
The other mainstream machine-learning compute framework is OpenCL, which has pretty pathetic support compared to CUDA. There's a dim possibility that Intel and/or AMD could leapfrog nVidia and CUDA, maybe with some new, hyper-specialized compute units or extra instructions, but given where nVidia's bread is being buttered these days, that's not terribly likely.
AMD has been a pretty good egg in a lot of this: FreeSync works with any combination of video card and monitor; OpenCL is, uh, open. AMD releases open-source graphics drivers itself, and its competitor for DLSS, FSR, is likewise built to work regardless of whose hardware you're running.
No 'probably' about it: definitely. In most software, OpenCL is treated as a fallback path, only used if you absolutely must, and that's if it's supported at all. If you're running any sort of professional compute workload, you're using CUDA or you're basically running in software. The only software I ever saw make decent use of OpenCL was Folding@Home. (My electric bill is bad enough as it is; I don't need to dump more heat into my office.)
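The priority order described above (CUDA first, OpenCL only as a grudging fallback, CPU as a last resort) can be sketched as a trivial selection function. All names here (`pick_backend`, the availability flags) are illustrative, not any real application's API:

```python
# Hypothetical sketch of a typical app's compute-backend selection:
# prefer CUDA, fall back to OpenCL, else run "in software" on the CPU.

def pick_backend(cuda_available: bool, opencl_available: bool) -> str:
    """Return which compute path a typical pro app would take."""
    if cuda_available:
        return "cuda"    # first-class support in most professional software
    if opencl_available:
        return "opencl"  # the back-path option, used only if you must
    return "cpu"         # "basically running in software"

print(pick_backend(True, True))    # cuda
print(pick_backend(False, True))   # opencl
print(pick_backend(False, False))  # cpu
```

The point of the sketch is the asymmetry: OpenCL never wins a tie, it is only ever reached when CUDA is absent.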
Checking the OpenCL box does nothing with the discrete video card in my desktop systems. Now I understand the CUDA option.
The auto setting works best on the ultralight laptop with its puny Xe integrated graphics (no discrete video card).
Apparently, second-hand RTX 3080 Tis are now a very reasonable $400 - $500. If you don't mind the power draw, that's about as reasonable as anything I can see from nVidia. The 4070 Ti is only just barely better for $300+ more, which says nVidia is still insane, even after another round of price cuts.