This is probably news. What's going on in the world of semiconductors that Intel no longer feels like it needs 10% of the people who work there?
I'm not too sure about that. As time goes on, even if AMD goes under (and I'll be disappointed if that happens), I can't see Intel being treated as a monopoly in the market. They're probably letting AMD stay alive just to avoid the red tape that would come from being declared one.
They've got to get to work on Skynet. What's left after that?
There's always scaling in three dimensions after that, like they're already doing with flash memory. It won't get you higher clock speeds, but it'll get you more parallelism and more transistors on a chip. (This article: https://en.wikipedia.org/wiki/5_nanometer seems to imply 5nm will be it, at least not without a fundamental shift in the materials used.)
I imagine that hubris has caught up with the current management.
E.g.:
Intel Aims To Surpass Samsung In The SSD Market By 2016
I see that their current CFO (bean counter) is being groomed for the top job. To me personally, that's a huge red flag - if I was crazy enough to own Intel shares, I would divest immediately. But that's just my personal opinion.
Given how expensive each new die shrink is, my guess is eventually you'll have everyone making semiconductors sharing a fab, perhaps even sharing IP. It's getting to the point where no single company can bankroll everything. As for R&D, I really think we've reached a point of diminishing returns. It's sort of like what's already happened in the world of mechanical disk drives. There are ways to move forward in terms of platter density, but they're all so expensive/problematic that the cost per GB isn't dropping proportionately to the density increase, if it's dropping at all. To be sure, SSDs will eventually face the same problem, but much later in the game.
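Just to put a rough shape on that cost-per-GB point, here's a quick back-of-envelope sketch in Python. The numbers are entirely made up for illustration, not real drive costs; the point is only that if the per-unit build cost rises almost as fast as the density, the cost per GB barely moves.

# Hypothetical numbers, purely to illustrate the cost-per-GB argument above.
old_capacity_gb = 4000      # drive built on the older platter/process tech
old_unit_cost   = 80.0      # assumed cost to build it

new_capacity_gb = 8000      # density doubles...
new_unit_cost   = 150.0     # ...but the fancier tech nearly doubles the build cost too

old_cost_per_gb = old_unit_cost / old_capacity_gb
new_cost_per_gb = new_unit_cost / new_capacity_gb

print(f"old: ${old_cost_per_gb:.4f}/GB, new: ${new_cost_per_gb:.4f}/GB")
print(f"density up {new_capacity_gb / old_capacity_gb:.1f}x, "
      f"cost/GB down only {100 * (1 - new_cost_per_gb / old_cost_per_gb):.0f}%")

With these made-up figures, a 2x density increase only buys about a 6% drop in cost per GB, which is the "not dropping proportionately" situation described above.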
This all puts us at an interesting juncture. What's next? There are things to move us forwards for maybe another decade. After that, we'll likely need to seriously rethink computing architecture because we won't be getting any more performance increases from faster clock speeds or more cores. I wonder if we'll see a resurgence of some sort of analog computing?
That's an excellent point. I've noted at this stage SSDs are offering "good enough" storage at a low enough price point for 90% of the population to be satisfied. Although mechanical drives are still larger and less expensive per GB by an order of magnitude, most people are not screaming for more storage. They just want faster storage. As a result, increases in density of HDDs have stagnated because of relatively low demand for larger disk drives, combined with increasing difficulty getting each new density increase. SSDs have a much clearer path forwards at this point, but eventually even they will reach a point of diminishing returns. I don't mean to break out the "640k should be enough" argument, but we are reaching a point where there are fewer and fewer applications that would benefit from significant increases in computational performance or storage density.
As the cost rises and the number of potential applications/customers shrinks, there will come a time when literally all the demand in the world isn't enough to drive another iteration.
Same thing with computing power. We have smart phones with the computing power supercomputers had in the 1980s. Other than stuff like 3D games, I can't think of a whole lot of things the masses do which might benefit much from more computing power. I think we'll need a new killer app which everyone wants, and which benefits from more computing power, to bring us to the next generation.
Of course, the more I think about it, the more I feel that killer app might be robots in all their forms, including self-driving vehicles. At this point AI can benefit from all the computing power we can throw at it. It just isn't in heavy demand yet.
Of course there are loads of applications which require lots of computing power. I tend to think those aren't necessarily major drivers of advancing the state of the art, because those applications are often served by supercomputers consisting of lots of off-the-shelf CPUs. I'm not sure there's enough of a market there to support the needed R&D. In fact, I might even say these supercomputers are riding on the coattails of the consumer stuff. Supercomputers have gotten a lot faster, but it's rare these days to hear of a new CPU which was expressly designed solely for the supercomputer market. You might have specialized chips designed to do some problem in hardware much faster than a general-purpose CPU, but at the end of the day I tend to think what drives R&D is markets where you can sell millions of CPUs.

I feel like you're only looking at this from the consumer adoption side of things. From the enterprise, scientific, and research side, dense compute with higher efficiency is key, as is the increase in capacity needed for the cloud-related push that seems to be trendy. As more personal computing devices become phones and tablets, the push is toward cloud services, which really just makes the compute someone else's problem.
On top of that, there are still loads of areas that can benefit from faster processing and faster storage; they just don't happen to be as much on the average consumer side. You have things like big data analysis, machine learning, etc. that can still leverage both faster compute and faster storage. A single example is CERN releasing 300TB of data to the public. How else does a team of people process that much data in any reasonable amount of time without decent compute and storage? Sure, these are examples that don't cover the broad base of customers, but they still show we can easily leverage advancements for some time to come. Or look at the Folding@home project. There's something fundamental about understanding how our proteins work, but even with thousands of people donating compute time, it's still taking incredibly long to understand this stuff. Apply this to the tons of areas in life where we have limited knowledge, and we could advance humanity in better ways (hopefully) if we could solve problems faster with better computing power.
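For a rough sense of scale on that CERN example, here's a back-of-envelope sketch; the per-node throughput is my own assumed number, not anything CERN publishes.

# Hypothetical: how long does it take to chew through 300 TB of data?
data_tb = 300
per_node_mb_per_s = 200                # assumed sustained read+process rate per machine, MB/s

data_mb = data_tb * 1_000_000          # 300 TB expressed in MB (decimal units)
for nodes in (1, 10, 100, 1000):
    seconds = data_mb / (per_node_mb_per_s * nodes)
    print(f"{nodes:5d} node(s): ~{seconds / 3600:.1f} hours of wall-clock time")

Even with these generous assumptions, a single machine would grind for weeks, which is exactly why this kind of work still soaks up all the compute and storage bandwidth you can give it.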
On the consumer side, I agree that more and more people will be satisfied with a basic SSD. The trouble is that as they need more space, they'll also need more performance. That will happen as we expand our media content and creation, along with increases in internet connection speed. If there were ever a day when most people had at minimum a 1Gb connection with no real data cap, we would see a shift in storage needs and performance.
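A quick sketch of why link speed changes the storage calculus; the library size and connection speeds here are just hypothetical round numbers.

# Hypothetical: time to move a 500 GB media library at different link speeds.
library_gb = 500
for name, mbit_per_s in (("25 Mb/s DSL", 25), ("100 Mb/s cable", 100), ("1 Gb/s fiber", 1000)):
    seconds = (library_gb * 8 * 1000) / mbit_per_s   # GB -> gigabits -> megabits
    print(f"{name:>15}: ~{seconds / 3600:.1f} hours")

Going from 25 Mb/s to 1 Gb/s turns a two-day transfer into about an hour, which is the point where people start treating remote or much larger local storage as something they'd actually use.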
I'm guessing those aren't the employees they're laying off. I guess now they have no more use for their $7 an hour H-1B programmers.
The problem with that is I just don't think that's what's going to happen.
Convertible devices have felt, and probably always will feel, gimmicky. Though I guess part of that will be remedied come Windows 10's anniversary update.
What I see more potential in is features like Microsoft's Continuum. Take your phone, plug it into a special dock, and holy crap, there's a whole computer in there! Except unlike the Motorola Droid of years past, this one runs an operating system that most people can wrap their heads around! I see a scary amount of potential in stuff like that. Make the docks wireless, much like how the Wii U handles its gamepad only better, and you have a computer and a companion device running on the same resources for an efficient price.
I think more effort should be put into minimizing size, power consumption and heat output of the chips we already have so we can have something akin to a golly gee-whiz screamer of a workstation... in a phone, that gets pretty good battery life.