I don't think the 7000 X3Ds' prices are dropping much more, since they are a special-purpose part that has not been replaced and the regular 9000s are not much better than the regular 7000s. The arrival of the 9000X3Ds should make more of a difference. AMD has to fix the CCX cluster by then, so it might be longer than expected.

I resemble that remark. Your information has certainly made me happy with my choice. I do wonder: if I had waited a bit, would the 7950X3D have taken a huge price drop?
Thank you. Don't think the CPU has ever been over 15%. Memory is pretty consistent at 20% used.
Frankly, the X3D is only useful in certain applications, and the lower TDP means lower performance than the regular CPUs in most others. I would only consider the 9950X3D if it can run well at 170W. 16 cores at 120W is inadequate on the 4nm process they are stuck with, since Apple gets the good 3nm stuff.
Do they need 2 CCDs in 2024 rather than just making one larger one? I'm sure the cost would be more, but what do the more advanced manufacturers do?
Maybe I'm totally confused, but isn't the problem between the CCDs and not between the CCXs? It isn't like the Bulldozer crap of years ago.

AMD's hardware is valid and works well under Linux. That suggests to me that Windows is the place to put blame, for not understanding something fundamental about how AMD's architecture works, or possibly a limitation of what the PPM drivers can do within Windows to keep performance on track. Either way, the fact that the multiple-CCX arrangement is massively better tells me some people need to stare at the code Linux has until they figure out why it's wrong under Windows, not that a small army of computer engineers needs to go back to the drawing board for not making gamers happy enough.
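For what it's worth, on Linux you don't have to wait for the scheduler to get cross-CCD placement right; you can pin a process to one CCD yourself. A minimal sketch, with the caveat that the logical CPU numbering here is an assumption (it assumes the common layout where CPU n and CPU n + total_cores are SMT siblings of the same core; verify against /sys/devices/system/cpu/cpu*/topology/ or `lscpu -e` on your own box), and `pin_to_ccd` is a hypothetical helper name:

```python
import os

def ccd_cpus(ccd: int, cores_per_ccd: int = 8,
             total_cores: int = 16, smt: bool = True) -> set[int]:
    """Logical CPUs belonging to one CCD.

    ASSUMPTION: CPU n and CPU n + total_cores are SMT siblings of the
    same physical core. Check your machine's sysfs topology before use.
    """
    first = ccd * cores_per_ccd
    cores = set(range(first, first + cores_per_ccd))
    if smt:
        cores |= {c + total_cores for c in cores}
    return cores

def pin_to_ccd(pid: int, ccd: int) -> None:
    """Restrict a process (e.g. a game) to a single CCD. Linux-only."""
    os.sched_setaffinity(pid, ccd_cpus(ccd))

# Example (commented out): pin the current process to CCD 0.
# Which die carries the V-cache on a 7950X3D is per-system, so confirm first.
# pin_to_ccd(0, 0)
```

Windows has no `sched_setaffinity`, but the same idea is what `start /affinity` and the Task Manager affinity mask do by hand, which is roughly what the chipset driver and Game Bar are supposed to automate.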
I think you have an unrealistic idea of what AI is doing. It's a useful tool for optimizing and assisting in design, but doesn't "think" like a human to solve complex problems. Does it understand the issues between marketing and engineering and the stockholders?

My guess as a layperson is that MS keeps reusing code which probably dates from the XP days, if not earlier, for large parts of its OS. They just changed the outward appearance of the OS, added some features, but didn't always update their compilers to take advantage of the latest architectures. The end result is layers and layers of inefficient code which technically may work, but not always well. I often wonder why some operations seem no faster on hardware which is 5 or 10 years newer than they did under XP. Sometimes I even get huge lags when I'm typing text on a website. Exactly what are they doing that things should be this slow? Optimizing compilers for each successive software update will probably be a job only AI can handle. The AI will know the end goal from reading the higher-level programming language, then just see how to do the operation in the fewest lines of assembly. Sort of what I do now when I program large parts of microcontroller firmware with assembly instead of relying on a compiler to deliver efficient code. Only difference is the stuff I do is fairly simple. No human can manually optimize the stuff today's systems do.
Granted, but my point is I wonder how well compilers work at outputting a bare minimum number of commands to accomplish a given task. To me it seems all they do is translate whatever the programmer wrote in a high-level language into machine language. If the programmer wrote a convoluted way to arrive at an answer, you'll end up with a slower program. Same if they use a bunch of APIs instead of writing a program from scratch to accomplish the same task. My thoughts were AI could figure out what the end goal is given a set of inputs, and optimize the machine language. In other words, cut out unnecessary intermediate steps. Today's machines are so fast that, barring some compute-intensive tasks, you shouldn't ever be waiting for your machine.
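One small counterpoint: compilers already do more than transliterate. Even CPython's modest bytecode compiler folds constant expressions at compile time, which you can see with the standard `dis` module (the exact bytecode listing varies by Python version, so treat this as a sketch; optimizing C/C++ compilers go much further, with inlining, dead-code elimination, loop transforms, and so on):

```python
import dis

def seconds_per_day():
    # Written the "convoluted" way; the compiler folds the whole
    # expression to a single constant before the function ever runs.
    return 24 * 60 * 60

# The folded value sits directly in the code object's constants,
# so no multiplies happen at runtime.
print(seconds_per_day.__code__.co_consts)  # contains 86400
dis.dis(seconds_per_day)                   # a single LOAD_CONST, not three multiplies
```

So the convoluted-source problem is real, but at a higher level than arithmetic: compilers are very good at micro-optimization and much worse at recognizing that a whole algorithm or API round-trip was unnecessary, which is the part being argued about here.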
That website is probably slow because it is running a ton of scripts that are tracking you to sell you a bunch of stuff, or selling your info to others. It may also be doing a lot of predictions and analytics.
I suspect that the proportion of exclusively Linux users on desktop CPUs is pretty low, so Windows will be their target market.

I agree it is incredibly likely to be an MS issue, and I hope they manage to sort it out. But as I'm not on either engineering team, and running on Linux isn't really an option for my games (yet; it is getting so close), finding an optimal solution in Windows is the goal.
There should also be a way to turn off scripts that add nothing to the user experience. There are times I feel like I'm still on 56K dial-up Internet.
I'll also add that some websites are just stupid. I remember one where I had to enter stuff like my email address. After every character I would get a warning, "Not a valid e-mail address," until I typed the entire thing. Same with the other fields. Why didn't it just check for validity after you moved to another field?
a second-gen i7
You should find a higher class of clients.
It boggles my mind that so many adults play video games, but perhaps AMD will pull something better out of the hat with the X3D for them.
I have no idea what regular people do now nor then, nor is there anything wrong with it. I know that 10 YO children with money had the Odyssey in 1973. (IIRC Pong was an arcade game.) I had the Atari 2600 later in the 70s, but I just assumed that people outgrew video games when they were in school and then working full time, raising their own families, etc. Apparently I was wrong.

LM, even boomers might've bought a Pong console in 1975, when they were already fully adults and on their way to the disco, cocaine and herpes-exposure parties they were undoubtedly having before that generation pulled the ladder of fun up for all time. The most active gamer I know is a retired and divorced woman who spends 12+ hours a day playing MMOs. Relative to other hobbies adults might have, gaming is a lot cheaper and more accessible than things like golf or motorcycle ownership. Or, you know, photography.
Back on track... There are still no Passmarks for the 9700X or 9950X and only the early one for the 9900X. I could upload for my 9700X, but it is not on the internet.
It's hardly a stretch to exceed Zen 5 technology. Whether it will be reliable is another question.
My son's computer feels pretty fast, but he doesn't have Teams or Outlook installed. Every other computer I use lags; even VS Code feels slow.
I'm trying to think of anything I run, on either machine (one gen 4, the other this one), that has ANY software that feels slow. Everything is pretty much blinding. Lots of RAM.

Even if we had 32 x 6 GHz cores and 1TB of RAM, developers would make software that makes the OS lag... more bloat, new frameworks, new versions of Teams and Outlook with AI and whatnot...
The AMDs are rarely over 15%, and RAM use is between 5 and 20%.
M.2s don't exactly hurt...
So they are all dropping now. The 9700X numbers are obviously mostly or all at the stock 65W. The 9950X results (n=15) are less than 10% better than the 7950X in single-thread and about 6% better in overall CPU Marks.
Exactly why I never have AV software. Every single one I've tried slows the machine to a crawl, especially when doing file-intensive operations. It's like the software has to scan every single file in use multiple times.

Install Outlook and Teams and get back to me. Even better, do that, then add some corporate-grade antimalware, MDR, a backup program, whatever other meeting platforms your company's decided to "standardize" on...