Intel lays off 10% of its workforce

timwhit

Hairy Aussie
Joined
Jan 23, 2002
Messages
5,278
Location
Chicago, IL
Sometimes firing the worst 10% of your workforce can be a good thing. However, doing it every year like Jack Welch did might be a bit much.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
Intel has no real competition in the areas it serves at this point. They're probably letting AMD stay alive just to avoid the red tape that would come from being a monopoly. When you've already cornered a large market segment, you can't really grow much. The only way to increase profits is to either raise prices or cut expenses. Raising prices probably isn't an option. The only question is what expenses are they cutting? If it's mostly suits, then it's no big loss. If it's R&D, they'll pay for it down the road. Given the relatively small advances in CPU performance over the last 5 years, I suspect R&D has already been cut quite a bit.

That said, I feel a better long-term option for any company is to buy back its stock. Once it's no longer public, profits to satisfy shareholders are pretty much moot. You just need to make enough to meet your expenses.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,357
Location
Gold Coast Hinterland, Australia
They're probably letting AMD stay alive just to avoid the red tape that would come from being a monopoly.
I'm not so sure about that. As time goes on, even if AMD goes under (and I'll be disappointed if that happens), I can't see Intel being held up as a monopoly in the market.

ARM CPUs are everywhere (I haven't seen the latest figures, but I'm sure ARM-based CPUs outnumber Intel designs shipped when you consider all devices), Intel isn't number 1 in flash, SPARC/POWER are still used in high-end servers, and we have ARM-based designs taking on the low-end desktop/server market... I think nVidia has more to worry about (in regards to being a monopoly) if AMD goes under, as there are very few high-end GPU design/manufacturing corporations.

Could this be a sign that the board at Intel understands its true position in the overall tech market and is trying to stay ahead of the curve through a downsize/restructure? (This article: https://en.wikipedia.org/wiki/5_nanometer seems to imply 5nm will be it, at least not without a fundamental shift in materials used.)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Maybe it'll dump a bunch of the money it has saved into acquisitions. IMO, Intel is already at a point where it's at least competitive with high-end ARM for mobile, and if it ever gets a Skylake or Cannonlake Atom with Iris Pro into shipping mobile devices, I'd even say it's ahead of the game. What's left after that?
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
(This article: https://en.wikipedia.org/wiki/5_nanometer seems to imply 5nm will be it, at least not without a fundamental shift in materials used.)
There's always scaling in three dimensions after that, like they're already doing with flash memory. It won't get you higher clock speeds, but it'll get you more parallelism and more transistors on a chip.
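To put rough numbers on it, here's a back-of-envelope sketch in Python; the density and area figures are invented for illustration, not taken from any real roadmap:

```python
# Back-of-envelope: 3D stacking grows the transistor budget linearly with
# layer count, with no lithography shrink. All figures are hypothetical.

planar_density = 100e6   # transistors per mm^2 at a fixed node (assumed)
die_area_mm2 = 100       # die size in mm^2 (assumed)

for layers in (1, 2, 4, 8):
    total = planar_density * die_area_mm2 * layers
    print(f"{layers} layer(s): {total / 1e9:.0f} billion transistors")
```

Note that clock speed never enters the calculation; stacking buys transistor count (and therefore parallelism), not single-thread speed.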

I think at this point we may need a new CPU architecture to move forward, perhaps something which works similarly to biological brains, with massively parallel connections.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
I feel Intel missed out on some of the shift to the lower-powered devices the world seems so attached to, and because of that, the much lower demand for new home computers is causing them some heartache. Hopefully their newer 3D NAND and 3D XPoint storage will help add to their revenue in meaningful ways so they can continue to R&D in other areas.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
I imagine that hubris has caught up with the current management.

E.g.:
Intel Aims To Surpass Samsung In The SSD Market By 2016

I see that their current CFO (bean counter) is being groomed for the top job. To me personally, that's a huge red flag; if I were crazy enough to own Intel shares, I would divest immediately. But that's just my personal opinion.

I think it's plausible that Intel could catch and surpass Samsung given they've entered the market with a 3D NAND offering. Intel also has a next-gen technology that no one else has (or at least no one else has announced anything concrete yet), and by the looks of it, that will make a significant change to the storage industry over time. Those two technologies combined give them the potential to take the lead. Intel also does a very good job on the software side of things, which I can't say is as true for Samsung. Having spent time trying to manage Intel NVMe SSDs compared with Samsung NVMe SSDs, Intel went the extra mile and a half with their DCT toolkit.
 

time

Storage? I am Storage!
Joined
Jan 18, 2002
Messages
4,932
Location
Brisbane, Oz
I'm not sure OEMs or most users care about utility software for storage drives.

Samsung has a two-year head start delivering 3D NAND. From the article I linked, they owned 45% of the SSD market more than a year back, which was five times Intel's share.
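Quick arithmetic on those numbers:

```python
# Implied market shares from the article's figures.
samsung_share = 0.45                # Samsung's SSD share per the article
intel_share = samsung_share / 5     # "five times Intel's share"
print(f"Intel's implied share: {intel_share:.0%}")   # -> 9%
```

That's a roughly 36-point gap to close.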

The only way they could become equals is if Samsung has a major FU while Intel executes perfectly. History suggests otherwise, but time will tell.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
I think a big question is who can manage the next couple of semiconductor die shrinks and who can afford to build the fabs for turning out chips on that new technology. Samsung is one of a tiny number of companies competing in that space at the moment, but who is doing what in terms of R&D to move forward?
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
Given how expensive each new die shrink is, my guess is eventually you'll have everyone making semiconductors sharing a fab, perhaps even sharing IP. It's getting to the point where no single company can bankroll everything. As for R&D, I really think we've reached a point of diminishing returns. It's sort of like what's already happened in the world of mechanical disk drives. There are ways to move forward in terms of platter density, but they're all so expensive/problematic that the cost per GB isn't dropping proportionately to the density increase, if it's dropping at all. To be sure, SSDs will eventually face the same problem, but much later in the game.
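As a toy illustration of the economics (every number below is invented, not an actual fab figure):

```python
# Toy amortization model: why leading-edge fab costs push companies toward
# shared facilities. All figures are hypothetical.

fab_cost_usd = 10e9          # cost to build one leading-edge fab (assumed)
wafers_per_month = 50_000    # fab output (assumed)
useful_life_months = 60      # months before the node is obsolete (assumed)

lifetime_wafers = wafers_per_month * useful_life_months
print(f"Full utilization: ${fab_cost_usd / lifetime_wafers:,.0f} per wafer")

# A company that can only fill half the line pays twice the capital cost
# per wafer -- exactly the pressure that drives fab sharing.
print(f"Half utilization: ${fab_cost_usd / (lifetime_wafers / 2):,.0f} per wafer")
```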

This all puts us at an interesting juncture. What's next? There are things to move us forwards for maybe another decade. After that, we'll likely need to seriously rethink computing architecture because we won't be getting any more performance increases from faster clock speeds or more cores. I wonder if we'll see a resurgence of some sort of analog computing?
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
Given how expensive each new die shrink is, my guess is eventually you'll have everyone making semiconductors sharing a fab, perhaps even sharing IP. It's getting to the point where no single company can bankroll everything. As for R&D, I really think we've reached a point of diminishing returns. It's sort of like what's already happened in the world of mechanical disk drives. There are ways to move forward in terms of platter density, but they're all so expensive/problematic that the cost per GB isn't dropping proportionately to the density increase, if it's dropping at all. To be sure, SSDs will eventually face the same problem, but much later in the game.

This all puts us at an interesting juncture. What's next? There are things to move us forwards for maybe another decade. After that, we'll likely need to seriously rethink computing architecture because we won't be getting any more performance increases from faster clock speeds or more cores. I wonder if we'll see a resurgence of some sort of analog computing?

Limitations in one area will drive invention in another. One example of this is Nvidia and their Maxwell architecture: they had to find ways to increase performance and reduce power draw without the help of a die shrink, since their fab partner TSMC couldn't get them to a smaller process node for that release, so they invented architectural workarounds that were beneficial in other ways. I suspect we'll see some of the same shift happen with Intel over time, but ultimately you're right that something else will be needed to make the next big leap in advancement.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
I don't mean to break out the "640k should be enough" argument, but we are reaching a point where there are fewer and fewer applications that would benefit from significant increases in computational performance or storage density.

As the cost rises and the number of potential applications/customers shrinks, there will come a time when literally all the demand in the world isn't enough to drive another iteration.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
I don't mean to break out the "640k should be enough" argument, but we are reaching a point where there are fewer and fewer applications that would benefit from significant increases in computational performance or storage density.

As the cost rises and the number of potential applications/customers shrinks, there will come a time when literally all the demand in the world isn't enough to drive another iteration.
That's an excellent point. I've noted at this stage SSDs are offering "good enough" storage at a low enough price point for 90% of the population to be satisfied. Although mechanical drives are still larger and less expensive per GB by an order of magnitude, most people are not screaming for more storage. They just want faster storage. As a result, increases in density of HDDs have stagnated because of relatively low demand for larger disk drives, combined with increasing difficulty getting each new density increase. SSDs have a much clearer path forwards at this point, but eventually even they will reach a point of diminishing returns.

Same thing with computing power. We have smart phones with the computing power supercomputers had in the 1980s. Other than stuff like 3D games, I can't think of a whole lot of things the masses do which might benefit much from more computing power. I think we'll need a new killer app which everyone wants, and which benefits from more computing power, to bring us to the next generation.

Of course, the more I think about it, the more I feel that killer app might be robots in all their forms, including self-driving vehicles. At this point AI can benefit from all the computing power we can throw at it. It just isn't in heavy demand yet.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
That's an excellent point. I've noted at this stage SSDs are offering "good enough" storage at a low enough price point for 90% of the population to be satisfied. Although mechanical drives are still larger and less expensive per GB by an order of magnitude, most people are not screaming for more storage. They just want faster storage. As a result, increases in density of HDDs have stagnated because of relatively low demand for larger disk drives, combined with increasing difficulty getting each new density increase. SSDs have a much clearer path forwards at this point, but eventually even they will reach a point of diminishing returns.

Same thing with computing power. We have smart phones with the computing power supercomputers had in the 1980s. Other than stuff like 3D games, I can't think of a whole lot of things the masses do which might benefit much from more computing power. I think we'll need a new killer app which everyone wants, and which benefits from more computing power, to bring us to the next generation.

Of course, the more I think about it, the more I feel that killer app might be robots in all their forms, including self-driving vehicles. At this point AI can benefit from all the computing power we can throw at it. It just isn't in heavy demand yet.

I feel like you're only looking at this from the consumer adoption side of things. From the enterprise, scientific, and research side, dense compute with higher efficiency is key, as is the increase in capacity needed for the cloud push that seems so trendy. As more personal compute devices become phones and tablets, the push is toward cloud services, which really just makes the compute someone else's problem.

On top of that, there are still loads of areas that can benefit from faster processing and faster storage; they just don't happen to be on the average consumer side. You have things like big data analysis, machine learning, etc. that can still leverage both faster compute and faster storage. A case in point is CERN releasing 300TB of data to the public. How else does a team of people process such data in any reasonable amount of time without decent compute and storage? Sure, these are examples that don't cover the broad base of customers, but they still show we can easily leverage advancements for some time to come. Or look at the Folding@home project. There's something fundamental about understanding how our proteins work, but even with thousands of people donating compute time, it's still taking incredibly long periods to understand this stuff. Apply this to the many areas in life we have limited knowledge of, and we could advance humanity in better ways (hopefully) if we could solve problems faster with better computing power.
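Just to make the scale concrete, here's a quick back-of-envelope in Python (the per-node throughput is an assumption, not a CERN spec):

```python
# Back-of-envelope: wall-clock time for one full pass over a 300 TB
# dataset as a function of cluster size. Throughput figure is assumed.

DATASET_TB = 300

def days_per_pass(nodes, gb_per_sec_per_node=1.0):
    aggregate_gb_s = nodes * gb_per_sec_per_node   # cluster-wide read rate
    seconds = DATASET_TB * 1000 / aggregate_gb_s   # 1 TB = 1000 GB
    return seconds / 86_400                        # seconds in a day

for nodes in (1, 10, 100):
    print(f"{nodes:>3} node(s): {days_per_pass(nodes):6.2f} days per pass")
```

Even before any actual analysis happens, just reading the data once takes days on a single machine.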

On the consumer side, I agree that more and more people will be satisfied with a basic SSD. The trouble is that as they need more space, they'll also need more performance. This will happen as we expand our media content and creation, along with increases in internet connection speeds. If there were ever a day when most people had at minimum a 1Gb connection with no real data cap, we would see a shift in storage needs and performance.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
I suspect that ever-increasing video processing standards will also drive growth for both processing and storage on the consumer side. Samsung isn't going to get to 8k displays and call it a day when there's a solid market for convincing people they need a new $2500 TV every five years.
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
I feel like you're only looking at this from the consumer adoption side of things. From the enterprise, scientific, and research side, dense compute with higher efficiency is key, as is the increase in capacity needed for the cloud push that seems so trendy. As more personal compute devices become phones and tablets, the push is toward cloud services, which really just makes the compute someone else's problem.

On top of that, there are still loads of areas that can benefit from faster processing and faster storage; they just don't happen to be on the average consumer side. You have things like big data analysis, machine learning, etc. that can still leverage both faster compute and faster storage. A case in point is CERN releasing 300TB of data to the public. How else does a team of people process such data in any reasonable amount of time without decent compute and storage? Sure, these are examples that don't cover the broad base of customers, but they still show we can easily leverage advancements for some time to come. Or look at the Folding@home project. There's something fundamental about understanding how our proteins work, but even with thousands of people donating compute time, it's still taking incredibly long periods to understand this stuff. Apply this to the many areas in life we have limited knowledge of, and we could advance humanity in better ways (hopefully) if we could solve problems faster with better computing power.

On the consumer side, I agree that more and more people will be satisfied with a basic SSD. The trouble is that as they need more space, they'll also need more performance. This will happen as we expand our media content and creation, along with increases in internet connection speeds. If there were ever a day when most people had at minimum a 1Gb connection with no real data cap, we would see a shift in storage needs and performance.
Of course there are loads of applications which require lots of computing power. I tend to think those aren't necessarily major drivers of advancing the state of the art, because those applications are often served by supercomputers consisting of lots of off-the-shelf CPUs. I'm not sure there's enough of a market there to support the needed R&D. In fact, I might even say these supercomputers are riding on the coattails of the consumer stuff. Supercomputers have gotten a lot faster, but it's rare these days to hear of a new CPU which was expressly designed solely for the supercomputer market. You might have specialized chips designed to do some problem in hardware much faster than a general-purpose CPU, but at the end of the day I tend to think what drives R&D is markets where you can sell millions of CPUs.

The idea of offloading both storage and computing power to the cloud is interesting, but I wonder how much sense it makes. It made more sense back when consumer hardware was less capable. Now you can have a 500GB or 1TB SSD for a relative pittance, and today's CPUs are capable of doing lots of things fast enough. Maybe offloading specialized problems which execute faster on purpose-built machines makes some sense. So does having copies of your most-used files in the cloud so you can access them from any device (but I'd still want to store them locally as well).

To drive R&D, we'll definitely need a new app which lots of people want and which requires much more CPU power. Video processing and VR might well be two such apps. The only question is how popular they would be if made available to the masses. I can personally see VR replacing vacations.

All that said, at this point there just isn't a clear roadmap to increasing computing power. I wonder if we'll see a long period of stagnation? The closest analogy which comes to mind is what happened to Peltier coolers. 40 years ago we thought performance would improve enough to replace conventional refrigeration. Despite lots of R&D, performance has hardly improved since then although we have made great progress reducing cost. The reduced cost has made Peltier-based picnic coolers, wine coolers, and other niche applications viable. I guess the same can be said for CPUs. We'll probably find ways to reduce the cost of a given amount of computing power even if we can't increase it much. That in turn will open up lots of new applications.

Thinking about this some more, most technologies seem to have some physical or practical limit on performance. Right now LED efficiency is increasing very slowly as we approach the physical limits of what's possible: you can't exceed 100%, and we're already past 60% in production. High-speed trains probably aren't getting much faster than 225 mph for practical reasons of power consumption and noise, even though there have been demonstration runs as fast as 357 mph. In the end the only thing which might follow Moore's law is cost. LED cost per lumen is still dropping despite the plateauing of efficiency. I've little doubt it'll be much the same in the computer world.
 

Stereodude

Not really a
Joined
Jan 22, 2002
Messages
10,865
Location
Michigan
So, Intel requested 8,351 H-1B visas and 5,172 permanent green cards for foreign workers between 2010 and 2015, citing that they could not find enough skilled American workers. Now they're laying off 12,000 workers. I'm sure it's just a coincidence...
 

jtr1962

Storage? I am Storage!
Joined
Jan 25, 2002
Messages
4,373
Location
Flushing, New York
Of course. Intel is just like a lot of other American companies. There's no shortage of skilled American workers, just a shortage of ones willing to work for low wages. I guess now they have no more use for their $7 an hour H-1B programmers.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,275
Location
I am omnipresent
Not ALL Atoms, just the ones meant for mobile devices. I'm somewhat shocked by that, since it's seemed to me for some time that the primary thrust of its consumer CPU development has been to become competitive in that space.
 

Howell

Storage? I am Storage!
Joined
Feb 24, 2003
Messages
4,740
Location
Chattanooga, TN
If, as they believe, the tablet market will shrink in favor of the convertibles market, then this move makes some sense.
 

sedrosken

Florida Man
Joined
Nov 20, 2013
Messages
1,811
Location
Eglin AFB Area
Website
sedrosken.xyz
The problem is that I just don't think that's what's going to happen.

Convertible devices have always felt, and probably always will feel, gimmicky. Though I guess part of that will be remedied come Windows 10's anniversary update.

What I see more potential in is features like Microsoft's Continuum. Take your phone, plug it into a special dock, and holy crap, there's a whole computer in there! Except unlike the Motorola Droid of years past, this one runs an operating system that most people can wrap their heads around. I see a scary amount of potential in stuff like that. Make the docks wireless, much like how the Wii U handles its gamepad, only better, and you have a computer and a companion device running on the same resources for an efficient price.

I think more effort should be put into minimizing size, power consumption and heat output of the chips we already have so we can have something akin to a golly gee-whiz screamer of a workstation... in a phone, that gets pretty good battery life.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,926
Location
USA
The problem is that I just don't think that's what's going to happen.

Convertible devices have always felt, and probably always will feel, gimmicky. Though I guess part of that will be remedied come Windows 10's anniversary update.

What I see more potential in is features like Microsoft's Continuum. Take your phone, plug it into a special dock, and holy crap, there's a whole computer in there! Except unlike the Motorola Droid of years past, this one runs an operating system that most people can wrap their heads around. I see a scary amount of potential in stuff like that. Make the docks wireless, much like how the Wii U handles its gamepad, only better, and you have a computer and a companion device running on the same resources for an efficient price.

I think more effort should be put into minimizing size, power consumption and heat output of the chips we already have so we can have something akin to a golly gee-whiz screamer of a workstation... in a phone, that gets pretty good battery life.

Right, and we won't be able to have this screaming miniaturized mobile workstation because they're cutting back R&D (Tick-Tock-Tweak). That, plus the costs of shrinking the manufacturing process even further to get to your proposed pipe dream, means it's unlikely to happen anytime soon. It'll be years (if ever) before an ARM processor can do all that. Even if ARM could, we'd be stuck in a finger-touching, app-ridden marketplace of software that only does 50% of anything I really ever want it to in the first place.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Here's the situation, as I see it.

Intel is running out of room on high-end processors. They gave up on Tick-Tock because it's becoming too difficult to keep shrinking the lithography. Also, AMD has been out to lunch for the last several years.

On the bottom end, ARM is eating their lunch (and they don't even make chips!). Even though it's generally more powerful, the x86 instruction set just seems to be too fat and slow to work in light and low-power devices. They only need to hold the line here until what people want to do with these devices is more than what ARM-based chips can deliver.
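One concrete piece of that "fat" is instruction decode. A toy illustration in Python (the encodings are completely invented; the point is structural):

```python
# Toy model of instruction decode. With fixed-length instructions the
# boundaries are known up front; with variable-length ones each boundary
# depends on decoding the previous instruction -- a serial chain that
# real x86 front-ends spend transistors (and power) working around.

stream = bytes([0x03, 0xAA, 0xBB, 0xCC, 0x00, 0x01, 0xFF])

# "ARM-style": every instruction is 4 bytes; offsets 0, 4, 8, ... can be
# handed to parallel decoders immediately.
fixed = [stream[i:i + 4] for i in range(0, len(stream), 4)]

# "x86-style" (invented rule): low 2 bits of the first byte give length 1-4.
def variable(buf):
    i, out = 0, []
    while i < len(buf):
        n = (buf[i] & 0b11) + 1
        out.append(buf[i:i + n])
        i += n
    return out

print(fixed)
print(variable(stream))
```

That decode overhead is a fixed tax, and it hurts proportionally more in a phone-class power budget than in a desktop.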

Flash is big and getting bigger, but they're lagging. The guys who work on flash aren't the same guys who work on CPUs.

Networking isn't going anywhere, and it's doing it fast.

So, cut folks from product lines that aren't moving as fast, and refocus on what is. This move is really overdue, as the sectors where they lead do not, and generally have not, needed to be pushed hard. The money can be made without so much effort.
 