Nvidia problems

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,741
Location
USA
Bummer, I was hoping their most recent cards would storm the market and surpass ATI.
 

udaman

Wannabe Storage Freak
Joined
Sep 20, 2006
Messages
1,209
I think this is also bad news for Nvidia...as notebook sales comprise the majority of computer purchases:

U guys probably missed this thread :D

Apple going 'rogue' >AMD? :D

Nvidia, Intel vie for lead role at Apple



Apple's updated laptop line does focus on Nvidia over sheer CPU, but that's bcuz they wanted longer battery life than the typical PC.

^I disagree with the analyst's analysis on the Mac line being so GPU oriented; mostly, Steve-O has been reading SF (and more specifically Merc's comments about fried testicles in early MBPs), along with Steve-O's known hate for loud fan noise, and so he went for a less powerful, lower-heat, lower-TDP GPU.

The current top MBPs use only mid-level performance ~&lt;25W GPUs. Funny thing is that ATI has some higher-performing mid-range GPUs that Apple could use that consume just about the same amount of battery power as Nvidia's...could be contractual reasons 4 that. If you want higher-end GPU performance, you're SOL with Apple's laptops; gotta get a testicle burner, an Alienware or similar gaming laptop.

Seems to me if the ATI HD5650, which is X11 compatible, 6-monitor capable (lol), and on the new 40nm process = low current consumption, was available for site testing last Dec., Apple could have had time to put it into the latest refresh? Then again, Nvidia did work w/Apple on custom auto-switching btw discrete and integrated GPUs for longer battery life...maybe ATI did not have such an easy solution?

http://www.dvhardware.net/article39903.html

http://www.notebookcheck.net/Computer-Games-on-Laptop-Graphic-Cards.13849.0.html

Nvidia's future may hinge on the outcome of the lawsuits between Intel &amp; them.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
I thought that ATi was falling behind again this cycle. Where did nVidia go wrong?
 

MaxBurn

Storage Is My Life
Joined
Jan 20, 2004
Messages
3,245
Location
SC
They are doing something wrong with their chip design, so the fab yield is really, really low. It's been happening for a while now.
 

Stereodude

Not really a
Joined
Jan 22, 2002
Messages
10,865
Location
Michigan
That sounds like a fab problem, not an nVidia problem. TSMC gives their customers the design rules for their process, and customers use those rules to design their chips. Bad yields mean the rules weren't accurate or the process isn't stable.
 

MaxBurn

Storage Is My Life
Joined
Jan 20, 2004
Messages
3,245
Location
SC
Well, yes and no; there's that, plus a story somewhere that explains the die area for the current chips is HUGE, which drastically reduces the yield. What I got out of it is that even a smaller process won't significantly improve things. They are also running into the whole bad-design spiral of higher required voltages and then thermal problems. Basically it's a setback, and they are going to have some major challenges to overcome or possibly not even survive.
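The die-area-kills-yield point can be illustrated with the classic Poisson defect-yield model, Y = e^(-A·D). The defect density below is an assumed illustrative value (not TSMC's actual 40nm number), and the die areas are only rough ballpark figures for a mid-size vs. big chip of that generation:

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_mm2):
    """Classic Poisson yield model: Y = exp(-A * D)."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

D = 0.002  # assumed defects per mm^2, purely illustrative

# Roughly mid-size vs. big-die chips of that generation (approximate areas).
mid_die = poisson_yield(334, D)
big_die = poisson_yield(529, D)

print(f"~334 mm^2 die yield: {mid_die:.1%}")
print(f"~529 mm^2 die yield: {big_die:.1%}")
```

Because area sits in the exponent, a bigger die doesn't just lose yield linearly; it loses it exponentially, which is why shrinking the process only partly helps if the design stays huge.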
 

time

Storage? I am Storage!
Joined
Jan 18, 2002
Messages
4,932
Location
Brisbane, Oz
That sounds like a fab problem, not an nVidia problem. TSMC gives their customers the design rules for their process, and customers use those rules to design their chips. Bad yields mean the rules weren't accurate or the process isn't stable.

Doesn't AMD/ATI use the same process with the same fab? Sounds more like an nVidia problem.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
21,593
Location
I am omnipresent
I'm under the impression that nVidia's current design is kind of a brute-force-just-keep-adding-transistors solution, where ATI has made some more elegant design choices that led to a smaller chip.
 

Chewy509

Wotty wot wot.
Joined
Nov 8, 2006
Messages
3,327
Location
Gold Coast Hinterland, Australia
I'm under the impression that nVidia's current design is kind of a brute-force-just-keep-adding-transistors solution

NV has been that way since the intro of the GeForce FX series. The added transistors mean more heat, which means lower clocks (to help with heat), which means performance no better than the competition.

I like to think of it this way: a 351 V8 or a 2L twin turbo? Both do the same job, but one draws more juice and gives off more heat...
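The transistors → heat → clocks chain above is basically the standard CMOS dynamic-power relation, P ≈ C·V²·f. A tiny sketch in arbitrary units (illustrative numbers only, not actual GPU figures):

```python
# Rough CMOS dynamic-power scaling: P ~ C * V^2 * f (arbitrary units).
# More transistors means more switched capacitance C, so at a fixed
# power/heat budget the clock f (or voltage V) has to come down.
def dynamic_power(c_switched, voltage, freq):
    return c_switched * voltage**2 * freq

base = dynamic_power(1.0, 1.0, 1.5)         # baseline chip
doubled = dynamic_power(2.0, 1.0, 1.5)      # 2x transistors, same clock
rebalanced = dynamic_power(2.0, 1.0, 0.75)  # clock halved to fit budget

print(doubled / base)     # 2.0 -> twice the heat
print(rebalanced / base)  # 1.0 -> back to the original power budget
```

Halving the clock buys back the doubled capacitance, which is why a transistor-heavy design ends up clocked low enough that the extra transistors don't translate into a performance lead.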

@Uda, in regards to "which is X11 compatible": WTF does DX11 have to do with Mac (which uses OpenGL)? Shouldn't you be spouting OpenGL support, like it supports OpenGL 3.2 or OpenGL 4.0?
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Doesn't AMD/ATI use the same process with the same fab? Sounds more like an nVidia problem.
AMD also uses GlobalFoundries, but I don't get the feeling that they rely upon them heavily for GPU manufacture.

nVidia won't use them on general principles.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
I'm under the impression that nVidia's current design is kind of a brute-force-just-keep-adding-transistors solution, where ATI has made some more elegant design choices that led to a smaller chip.
I don't know about elegant, but, as I understand it, the ATi design is more modular. This means that they can make a low-end part far more easily, but that a very high-end part would have a lot of modules, and, therefore, take up a lot of real estate.

As far as I can tell, nVidia took the Intel path straight to the high-end, with hopes of figuring out the other stuff later. They failed.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
www.semiaccurate.com

You'll find out all you need to know. It looks like ATi will have their next gen parts out before NVIDIA is finished getting all their current gen parts out.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
AMD also uses GlobalFoundries, but I don't get the feeling that they rely upon them heavily for GPU manufacture.

nVidia won't use them on general principles.

No GPUs are coming out of GF. GF is just starting to sample its 40nm bulk silicon; all of AMD/GF's previous processes have been silicon-on-insulator.

Supposedly, the first AMD GPU fabbed at GF will be the on-die GPU in Llano, late this year.
 