AMD ahead of Intel...in the manufacturing process?

CougTek

Hairy Aussie
Joined
Jan 21, 2002
Messages
8,728
Location
Québec, Québec
According to this brief news item at Digitimes, AMD will start manufacturing 65nm cores as early as mid-October. Intel won't before January. How is that possible? I can't recall AMD ever being ahead of Intel in manufacturing process. Better chip designs, yes, but a more advanced manufacturing process, no. With all the additional resources that Intel has over AMD, it's a real shame to lose face like that against a rival that has never been more than a thorn in its side.

Congratulations to AMD. By the time I'm ready to upgrade (replace) my shitty P4 POS, 65nm dual core Athlon64 should be out. Great.
 

Tannin

Storage? I am Storage!
Joined
Jan 15, 2002
Messages
4,448
Location
Huon Valley, Tasmania
Website
www.redhill.net.au
It's not the first time. AMD were first into copper by a mile - like a full year, or roughly that. At much the same time or a little earlier, Intel ran into huge production problems and couldn't supply their spiffy new Coppermine P-IIIs in meaningful quantities. This, in fact, was one of the major reasons behind the success of the original Athlon: a hell of a lot of traditional Intel-only shops went to Athlons because they just couldn't get timely supplies of Intel product. And, naturally, quite a few of them wound up sticking with the AMD product. Not most, but enough to make a long-term difference.

AMD's copper technology, BTW, also came from IBM, at least as I recall it.

Intel's production engineers are the best in the world, but they ain't perfect. Actually, it often occurs to me that they are even better than we usually give them credit for, as it is largely Intel's production expertise that pulls their very lacklustre designs out of the muck and makes them perform decently after all.
 

CityK

Storage Freak Apprentice
Joined
Sep 2, 2002
Messages
1,719
I imagine a fair amount of the reason is also attributable to management.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
AMD's Cu process tech came from Moto(rola). IIRC though, there was some sharing deal between Moto & IBM.

AMD might get the initial tech from elsewhere, but the bods turning it into production ready tech are AMD's process engineers. Why don't they do the lot? The R&D is humungously expensive. AMD partnered with IBM for 90nm SOI and below. Before that, it was AMD/Moto.

Intel's 130nm & 90nm stuff was crap. Tualatin P!!!s were the first, and the leakage current (which caused all the heat-related problems) was >30% of total power (it should be under 10%, and less is even better). The warning signs were there.
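The leakage figures above work out like this. A minimal sketch in Python; all the wattages are made-up round numbers for illustration, not measurements of any real chip:

```python
# Rough illustration of the leakage-fraction figures quoted above.
# Static (leakage) power is burned whether or not the chip does work;
# the fraction of the total budget it eats is the health indicator.

def leakage_fraction(static_w: float, total_w: float) -> float:
    """Fraction of total package power lost to leakage (static) current."""
    return static_w / total_w

# A healthy process: leakage well under 10% of total power.
healthy = leakage_fraction(static_w=3.0, total_w=40.0)

# A leaky process like the one described: >30% of the budget
# is gone before the chip does any useful work at all.
leaky = leakage_fraction(static_w=15.0, total_w=45.0)

print(f"healthy: {healthy:.1%}, leaky: {leaky:.1%}")
```

Same arithmetic either way; the difference is whether the process keeps the static term small.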

There was a posting on RWT a couple of weeks ago comparing process tech, and Intel still looked to be pushing the boundaries.

http://www.realworldtech.com/forums...PostNum=3658&Thread=1&entryID=55337&roomID=11
http://www.realworldtech.com/forums...PostNum=3665&Thread=1&EntryID=55449&RoomID=11

It seems that Intel may be on a good thing on 65nm.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,035
Location
I am omnipresent
It's also interesting to note that all the engineering gods that made Intel the semiconductor powerhouse are out of Intel's management now. Intel is run by MBAs nowadays and it shows.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Tannin said:
Intel's production engineers are the best in the world

I have to disagree. IBM's production engineers are the best in the world.

Unlike Intel, IBM not only makes chips of its own design, but is also a fab-for-hire. Using cutting-edge technology, and doing it well and in quantity, *is* their job.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,862
Location
USA
Why, then, did Apple drop them for Intel? My understanding is that they couldn't produce fast enough chips in large enough quantities. I'm not saying IBM makes low-yielding crap, but there is some implication to draw from this change.
 

Buck

Storage? I am Storage!
Joined
Feb 22, 2002
Messages
4,514
Location
Blurry.
Website
www.hlmcompany.com
Handruin said:
Why, then, did Apple drop them for Intel? My understanding is that they couldn't produce fast enough chips in large enough quantities. I'm not saying IBM makes low-yielding crap, but there is some implication to draw from this change.

Handy, you know you're asking for a logical business explanation for the decisions that Steve Jobs makes? :lol:
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
ddrueding said:
I thought apple's move was more to do with inferior architecture or volume than fab quality.

Perhaps so, although the impression that I've gotten from my friends at Apple and IBM is that it was getting too expensive to make the technological improvements that Apple wanted. The Macintosh is a niche market, especially as far as the processor is concerned; there really aren't any other current uses for the G5. This makes the economics crappy, and Apple didn't want to pay up the steepening curve. Also keep in mind that the PowerPC 970 was never really meant to go into a laptop, and you have plenty of reasons to go to a commodity chip, even if inferior.

This doesn't really have so much to do with production itself, but its costs and chip design.
 

CougTek

Hairy Aussie
Joined
Jan 21, 2002
Messages
8,728
Location
Québec, Québec
Many probably know already, but contrary to what Digitimes reported, Intel will be the one shipping 65nm processors this year, with AMD only following six months later. Since I'm more interested in current AMD chip designs, I find this quite disappointing. I couldn't care less about any Intel CPU still using the Netburst architecture, no matter how small the manufacturing process.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Isn't this because Intel needs to do this and AMD does not?

Intel chips are getting physically large.
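On "physically large": to first order, a full node shrink scales linear dimensions with the node ratio, so die area scales with the square of it. A quick back-of-envelope (my own simplification; real shrinks save less than this ideal, since not everything on the die scales):

```python
# Ideal die-area scaling between process nodes: area goes with the
# square of the linear shrink ratio. Treat the result as an upper
# bound on the saving, since I/O pads, analogue blocks, etc. don't
# shrink with the logic.

def ideal_area_ratio(old_nm: float, new_nm: float) -> float:
    """Fraction of the original die area after an ideal full shrink."""
    return (new_nm / old_nm) ** 2

ratio = ideal_area_ratio(90, 65)
print(f"ideal 90nm -> 65nm die area: {ratio:.0%} of original")  # ~52%
```

So a big 90nm die with 2MB of cache gets roughly halved at 65nm in the ideal case, which is exactly the relief a company shipping large dies would be chasing.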
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
CougTek said:
Many probably know already, but contrary to what Digitimes reported, Intel will be the one shipping 65nm processors this year, with AMD only following six months later. Since I'm more interested in current AMD chip designs, I find this quite disappointing. I couldn't care less about any Intel CPU still using the Netburst architecture, no matter how small the manufacturing process.

Not to worry Coug, even with Intel moving to 65nm, their power consumption is still piss-poor, no comparison to AMD at 90nm:

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2578&p=4
 

CougTek

Hairy Aussie
Joined
Jan 21, 2002
Messages
8,728
Location
Québec, Québec
I saw that yesterday. I'm curious to see if AMD will get better improvements with their 65nm process than Intel did with its own. It's sad that it will take more time to find out than originally expected.
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
Intel use a lot of transistors to get their 2MB L2 cache (and soon, 2MB L2 cache * 2 for the dual core chips).

Personally, I prefer CPUs that don't use L2 to make up for turdy performance :)
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
sechs said:
Tannin said:
Intel's production engineers are the best in the world

I have to disagree. IBM's production engineers are the best in the world.

Unlike Intel, IBM not only makes in-house design chips, but is also a fab-for-hire. Using cutting-edge technology, and doing it well and in quantity *is* their job.

I would have to disagree with both of these statements--to a lesser or greater extent. Horses for courses as the saying goes.

Intel gets great speeds out of bulk Si, but for various reasons they are behind the curve in more advanced processes. Witness their transition to Cu. They were a process generation behind AMD (180nm v 130nm), and when it first debuted--P!!! Tualatin--it was leaky (idle current dissipation was 30% of total!). 90nm is leaky too. It appears that 65nm may be much better--but that is two whole generations later. Notwithstanding, nobody is as aggressive (transistor size, speed) in bulk as Intel.

IBM have fantastic R&D. They really do have cutting-edge processes--probably even better than Intel's--but they simply cannot transfer this to bulk manufacturing. O.K. if you can justify unit costs of $1,000, but no good when your business model needs a unit cost of $20~30. Have a look at IBM's books; their semi-con manufacturing has yet to turn a profit. They lost the NVIDIA (5x00, 6x00??) series to TSMC because they could not generate the volume required. Apparently their yields (IIRC) were in the range of 8~12%. But AMD adopted the process for 90, 65 and 45nm, so it can't be all bad. SOI is proving its worth despite Intel naysaying it--just look at the power draw of A64 v Netburst. Sure, the design plays into this a lot--in fact, at this end of the spectrum, you cannot separate design and process--but SOI designs draw less power than bulk.

AMD, on the other hand, have used their own process (250nm), then Moto's (180nm, 130nm), and now IBM's (90, 65 and 45nm). And they have managed to successfully transform IBM's lacklustre production methods into real commercial manufacturing. But AMD does not innovate--they haven't used an in-house process since 250nm.

As I said, horses for courses.
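A rough sketch of why yields in the 8~12% range are fatal to a $20~30 unit-cost business model: the cost per *good* die is the wafer cost divided by (dies per wafer × yield). The wafer cost and die count below are invented round numbers, purely for illustration:

```python
# Hypothetical yield economics: halve the yield and the cost per
# saleable die doubles, regardless of how good the process R&D is.

def cost_per_good_die(wafer_cost: float, dies_per_wafer: int,
                      yield_frac: float) -> float:
    """Effective cost of each working die off a processed wafer."""
    return wafer_cost / (dies_per_wafer * yield_frac)

wafer_cost = 4000.0   # assumed processed-wafer cost, USD (illustrative)
dies = 200            # assumed candidate dies per wafer (illustrative)

print(cost_per_good_die(wafer_cost, dies, 0.80))  # ~$25 per good die
print(cost_per_good_die(wafer_cost, dies, 0.10))  # ~$200 per good die
```

With those made-up inputs, a healthy 80% yield lands right in the $20~30 window, while a 10% yield puts the per-die cost an order of magnitude above it--boutique pricing only.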
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
CougTek said:
Many probably know already, but contrary to what Digitimes reported, Intel will be the one shipping 65nm processors this year, with AMD only following six months later. Since I'm more interested in current AMD chip designs, I find this quite disappointing. I couldn't care less about any Intel CPU still using the Netburst architecture, no matter how small the manufacturing process.

So?

Intel were first to 90 by a long shot, but power consumption was so bad that they only released 2.4GHz Celerons (lower FSB, 2/3rds the frequency of the top-end chip, less cache, etc.)--and these were pushing the edge of their platform power envelope. The cynic in me says the reason was that the process was rushed. By the time they got power under control and their top-end processors were on 90nm, AMD had released their 90nm and were ramping as well. I say bragging rights only.

If the process does not deliver tangible benefits in power savings and/or speed increases, then it's a pissing contest. You (generic, not you specifically) may be able to claim that it reduces costs, but if you are tuning your process on production samples, then you are wasting fab capacity--get it right in the lab and on pilot lines first.

I still expect AMD to be behind Intel by six months or so from what they are saying, but I suspect this is more to do with slippage in K10 (next gen product) which is the flagship for the process, than due to process problems.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
LiamC said:
IBM have fantastic R&D. They really do have cutting-edge processes--probably even better than Intel's--but they simply cannot transfer this to bulk manufacturing. O.K. if you can justify unit costs of $1,000, but no good when your business model needs a unit cost of $20~30. Have a look at IBM's books; their semi-con manufacturing has yet to turn a profit. They lost the NVIDIA (5x00, 6x00??) series to TSMC because they could not generate the volume required. Apparently their yields (IIRC) were in the range of 8~12%. But AMD adopted the process for 90, 65 and 45nm, so it can't be all bad. SOI is proving its worth despite Intel naysaying it--just look at the power draw of A64 v Netburst. Sure, the design plays into this a lot--in fact, at this end of the spectrum, you cannot separate design and process--but SOI designs draw less power than bulk.

AMD, on the other hand, have used their own process (250nm), then Moto's (180nm, 130nm), and now IBM's (90, 65 and 45nm). And they have managed to successfully transform IBM's lacklustre production methods into real commercial manufacturing. But AMD does not innovate--they haven't used an in-house process since 250nm.

If IBM is so bad, why does AMD use them so well? It seems to prove my point.

And as to why IBM lost the business with nVidia... TSMC said that they could do it cheaper and in the desired volume. IBM couldn't meet both of those. IBM was simply honest about its abilities.

I would also like to point out that IBM doesn't keep money-losing businesses. If fabrication wasn't at least a break-even affair, IBM would have sold it, spun it off, or shut it down. The chip-fabrication business is certainly not a cheap one to be in; you don't want to be in it for a loss.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
sechs said:
If IBM is so bad, why does AMD use them so well? It seems to prove my point.

AMD doesn't use IBM. AMD licensed the technology. AMD does their own manufacturing. AMD and IBM signed a joint development partnership for process tech at the 90, 65 and 45nm nodes. There is a world of difference.

sechs said:
And as to why IBM lost the business with nVidia... TSMC said that they could do it cheaper and in the desired volume. They couldn't meet both of those. IBM was simply honest about its abilities.

I would also like to point out that IBM doesn't keep money-losing businesses. If fabrication wasn't at least a break-even affair, IBM would have sold it, spun it off, or shut it down. The chip-fabrication business is certainly not a cheap one to be it; you don't want to be in it for a loss.

My understanding of the agreement between NVIDIA and IBM is that NVIDIA paid for delivered chips and contracted for x volume. Whether IBM used 10 wafer starts or 100,000 was up to IBM--but it certainly was in IBM's interest (profitability) to have high yields in order to maximise profit. IBM, despite an underutilised plant, could not meet the quantity targets -> yields sucked, no profit.

IBM almost has to have manufacturing tech. I can't see them getting Power or zArch processors fabbed by Intel now can you? ;)

If they can make a profit, even better. To make profits, you need the tech and the manufacturing ability--AMD seem to be better at this than IBM because they have the experience and a lot of in-house developed process tools--which IBM wanted access to; it is a partnership, after all.
 

CougTek

Hairy Aussie
Joined
Jan 21, 2002
Messages
8,728
Location
Québec, Québec
LiamC said:
AMD and IBM signed a joint development partnership for process tech at the 90, 65 and 45nm nodes.

Now you can extend that a little...
As expected, Advanced Micro Devices and IBM announced they had broadened the scope of their technology alliance and the pact now includes early exploratory research of new transistor, interconnect, lithography, and die-to-package connection technologies through 2011. Particularly, the companies are set to help each other developing 32nm and 22nm process technologies.
Source
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
Your assertion that AMD uses IBM does not stand up to Coug's linked article, or to the many more that I have read on the AMD/IBM development agreement in EETimes, DigiTimes, News.com, The Inquirer, The Register and others.

You made an assertion about why NVIDIA moved its business--and your assertion is orthogonal to mine. You also offered no proof. Neither did I. So unless you can offer proof, I see no reason to refute it--from your posting, it is only your opinion so far. If you do a little hunting on the sites mentioned above, you will come across articles and analysis of the deal. You can then form your own opinion.

As for your final point, I would generally agree with you, but semi-con manufacturing is a special case and intimately tied up with IBM's business model. IBM cannot afford not to be in the business. comp.arch might be a good place to start.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
I never made such an assertion. AMD, in general, makes its own chips; it's a chip company. AMD has used IBM for technology.

You brought up the issue of nVidia and its contract manufacturer. I simply explained why it didn't prove your point.

If you agree then why argue against it?
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
I made reference to the slippage of K10 a couple of posts ago. Looks like it has more than slipped.

K10 is dead
http://www.theinquirer.net/?article=27421

This is not good news for AMD. Long live K8L. This would also explain AMD's slippage on 65nm: the new core would have to be designed and validated (in software), masks produced, initial silicon produced, etc.
 