Quick question: best GeForce FX5900 card

Tea

Storage? I am Storage!
Joined
Jan 15, 2002
Messages
3,749
Location
27a No Fixed Address, Oz.
Website
www.redhill.net.au
Yes, another one. My customer wants the best GF FX5900 he can get. Doesn't really care about the price; wants it yesterday. He ordered an ASUS nearly a month ago, but ASUS have given us three firm delivery dates and not made any of them. My ASUS dealer told me, frankly, that they now have a fourth firm delivery date from ASUS, but she wasn't even going to tell me what it was because she didn't believe it herself.

We have given up on ASUS. Now my customer has mentioned the 256MB Albatron. Are there others we should consider? If so, are the differences significant or just the usual one-half of one percent stuff? I can actually get the Albatron, which is a powerful argument in its favour. Go with that one? Or is there another?

Thanks guyz.

Small and frazzled hairy one
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
I believe the Gainward "Golden Shower" super secret special edition is currently the fastest?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
"Golden showers" are something that I'd normally associate with nvidia cards. Who knew that's what manufacturers think, too?
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
Mercutio said:
"Golden showers" are something that I'd normally associate with nvidia cards. Who knew that's what manufacturers think, too?

LOL I made up the shower part, it's Golden something.
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
It's amazing that Faud, with his limited grasp of the English language, thought up such a nice title. Tho he does seem to be improving.
 

Pradeep

Storage? I am Storage!
Joined
Jan 21, 2002
Messages
3,845
Location
Runny glass
Oh yes, I remember now: the real name is "Golden Sample". So instead of visions of getting sprayed with urine from some hot chicky babe, we instead have to think about some pee-in-a-cup test.
 

Tea

Storage? I am Storage!
Joined
Jan 15, 2002
Messages
3,749
Location
27a No Fixed Address, Oz.
Website
www.redhill.net.au
Hmmmmm ... all things considered, you'd advise that I should stay with the Albatron, then? :)

Thanks for the link, Pradeep. It seems to be mostly a review of the chipset drivers, but what they say about the card itself seems good.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
Best NVIDIA card? Looks like none. I have been following the NVIDIA driver saga closely and I think NVIDIA have been cooking the books. I should point out that since the TNT2 came out, all of my gaming cards have been NVIDIA (TNT2, GeForce 2 Pro, Ti 4200).

Recently I found this particularly interesting:

http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

and now this:
http://techreport.com/etc/2003q3/valve/index.x?pg=1

Basically, in DX 9.0 games NVIDIA have been cheating: in anything other than a scripted demo, the 5900 Ultra performs at about half the speed of a 9800 Pro when the DX 9.0 eye candy is turned on - which is the whole point of having a DX 9.0 card.

The 9800 Pro is looking attractive, and I am disgusted at the blatant spin NVIDIA is churning out. I don't believe them.

It's a bit like WMD in Iraq. According to news today, a US Government "source" is claiming that Korea has a new intermediate-range ballistic missile. Yeah, right. Does the tale of "the boy who cried wolf!" come to mind? No thanks, NVIDIA.

/rant
 

CougTek

Hairy Aussie
Joined
Jan 21, 2002
Messages
8,729
Location
Québec, Québec
Things like this turn me off nVidia's graphics cards for the moment. It makes no sense to me to lay $400-500 on a card with poor image quality.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
<Insert MP3 of cackling laughter>

The article on HL2 performance is, uh, interesting. I'm an incredibly pessimistic guy, but reading that, I honestly wonder if there wasn't a problem with their testing methodology. 33% - 50% differences in performance? The best "optimization" for the top-of-the-line DX9 card being DX8? Ouch. Even I have to think there's something funny going on there.

On the other hand, I feel really good about my 9700 now. Too bad I have to teach really late the night of the 29th.
 

Jake the Dog

Storage is cool
Joined
Jan 27, 2002
Messages
895
Location
melb.vic.au
That was 4-6 weeks ago. nVidia have since released two sets of drivers without optimisations, and IQ has improved dramatically.

Keep up with the times, brothers!
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
A brief snippet from JC.

http://english.bonusweb.cz/interviews/carmackgfx.html

Hi John,

No doubt you heard about the GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative of future DX9 games (including Doom III), or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack

NVIDIA's latest trick is to force Eidos to pull the Tomb Raider: AOD patch as it is not "representative of performance".

What a crock! Every benchmark and every respected developer says the same thing - NV3x parts just aren't up to snuff. NVIDIA's counter is marketing bull, driver cheats and lawyers. I wish they would expend half as much energy getting their next design to market.
 

Tea

Storage? I am Storage!
Joined
Jan 15, 2002
Messages
3,749
Location
27a No Fixed Address, Oz.
Website
www.redhill.net.au
I don't like reading any of this. What it comes down to is that every time we play with ATI products, we get our fingers burned. I just don't trust ATI.

But it's looking more and more as though we are going to have to switch (unless Nvidia suddenly get their act together), and I really, really don't want the disruption.

At present, we have a broad range of cards, from $80 through to $800 and 64MB through to 256MB, and they all use the same drivers, and the drivers just work. We can even use those same drivers on old TNT-based systems. It's a damn good arrangement, and I am very loath to bugger it up - particularly as ATI cards, when they make their way into the workshop now and then, are quite often problematic.

We have to do stupid things like:

1: Format drive ready for new installation
2: Remove ATI card
3: Insert some other video card (any brand, as long as it isn't an ATI)
4: Install W2K or XP
5: Remove other card
6: Insert ATI card
7: Load drivers.

Now that ain't every ATI card, but Kristi has met two or three that just won't install unless you follow the above procedure - and there have been various other weirdnesses. I don't trust the damn things.

Please Nvidia, get it together.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
Tea,

my opinion is that the NVIDIA cards are still reasonable, but the NVIDIA marketing is sleazy, underhanded and deceptive - and that makes me angry.

My brief synopsis of the situation is this.

NVIDIA worked closely with MS on DirectX 6, 7 & 8. The DirectX code very closely approximates NVIDIA hardware for these versions - which is why NVIDIA was the card to have.

Then NVIDIA got into bed with MS (XBox).

MS was unhappy with how the partnership was working out, and how much they were forking over to NVIDIA - I think MS underestimated how low Sony and Nintendo would slash prices. For future DX versions, MS wanted NVIDIA to hand over the IP rights to their hardware - or at least they wanted to use it for free.

NVIDIA baulked.

Come DX 8.1, which added more advanced pixel shaders, suddenly ATi is back on the scene. I think (though I am a little unclear at this stage) that pixel shader 1.1 is an NVIDIA construct, whilst 1.4 is an ATi construct. Remember, the more closely the hardware resembles the API, the better and faster it runs - this is why Glide ran so well on 3DFX - sorry, 3dfx - cards: Glide was just about at the hardware level. DX 8.1 (which supported pixel shader 1.4) was (I think) a warning shot across the bows to NVIDIA.

NVIDIA decided to play hardball.

MS knew what the NV3x hardware would look like. So they changed the floating point representation from FP16 (16 bits of precision) and/or FP32 (32 bits of precision) - which is what NV3x supports natively - to FP24 (you get the picture). Guess whose hardware supports FP24 natively and whose doesn't?

A couple of wags have brought up: how come NVIDIA seems to do so well on Quake/OpenGL? The answer is that OpenGL supports vendor-specific extensions. So NVIDIA can add API extensions that suit their card, and so long as the software developers implement those extensions, NVIDIA looks good again. NVIDIA can't do this with DirectX because MS controls it. id Software did/does support the NVIDIA extensions, but they are about the only major OpenGL game developer on the market. SPECviewperf is the other major benchmark that uses OpenGL, and NVIDIA have been active in getting support for their OGL extensions into that code. This is why the performance looks so psychotic - good in OGL, bad in DX9. But how many people know which is which? Or care, for that matter?

My beef with NVIDIA is that they have gone out of their way to boost the benchmarks to look like they are competitive. NVIDIA only has to answer to their shareholders, and it is a cut-throat business - but that is no excuse for what I consider deception.

I expect that once NVIDIA gets their hardware sorted, they will admit that NV3x wasn't that great in an effort to mollify consumers. That is not going to cut it with me. I believe it will only be a marketing tactic.

I do agree with you that NVIDIA still has the most trouble-free drivers, and that is your area of concern.

As for ATi/Kristy, tell her to search for the video card model type in the OEM*.INF files - or all *.INF files if it's a generic card - in the Windows\INF folder.

This is the file that tells Windows which card/driver/software to install. Once you've found the ATi INF file, delete it and the PNF file of the same name in the same folder. When you reboot (after uninstalling the card software), you should be prompted to install the "new" card. Bloody brain-dead if you ask me.

Cheers
 

Howell

Storage? I am Storage!
Joined
Feb 24, 2003
Messages
4,740
Location
Chattanooga, TN
LiamC said:
My beef with NVIDIA is that they have gone out of their way to boost the benchmarks to look like they are competitive.

If it is indeed possible to customize an application/game to run better on a piece of hardware than it would in its default state, then I think NV is justified in making the custom part in the drivers if the application developer is not willing to make the customizations.

Based on my mostly ignorant understanding of the problem at this time, I think NV should not be condemned for making these optimizations so long as they are clear about whether or not the real performance gains are dependent on the application vendor moving the optimizations to the app code base from the driver.

All this to say, NV should be allowed to show the potential of the product as long as they are clear you will not see the same level of performance unless the app vendor cooperates.
 

LiamC

Storage Is My Life
Joined
Feb 7, 2002
Messages
2,016
Location
Canberra
Howell, the problem isn't that NVIDIA is making optimisations - they are entitled to do that.

The problem is that the driver detects when a benchmark is being run and renders with lower image settings than were selected in the benchmark/game/control panel. This artificially boosts the scores delivered. When you run the game/software in "normal user" mode, you either won't get the speed the benchmark said you would (because you are rendering in a higher-quality mode), or the settings you specified won't take effect, resulting in a lower-quality image than what you specified in the settings and what the game developer intended.

There is a specific set of optimisations possible where the driver is sped up and no image quality suffers - I have no problem with these. But the furore isn't over this. If I want 4X anti-aliasing and anisotropic filtering, I expect the driver/game to run with it. What I don't want is the graphics driver detecting the game and substituting much lower quality modes so as to achieve an acceptable frame rate - simply because the hardware isn't as good as the marketing people would have you believe.

More info (with pics) and commentary here:

http://www.tomshardware.com/graphic/20030918/index.html

I don't normally like THG anymore, but TP didn't write this and it is well researched.

The major point you raise:
Based on my mostly ignorant understanding of the problem at this time, I think NV should not be condemned for making these optimizations so long as they are clear about whether or not the real performance gains are dependent on the application vendor moving the optimizations to the app code base from the driver.

All this to say, NV should be allowed to show the potential of the product as long as they are clear you will not see the same level of performance unless the app vendor cooperates.

NVIDIA aren't being clear. They are deliberately obfuscating the issue. Either that, or it's a major miscommunication between NVIDIA hardware and marketing that has gone unchecked for months despite repeated calls for clarification from numerous sources.

Also, I don't think the issue is NVIDIA making changes when the software dev won't. NVIDIA cannot make changes to DirectX. And if NVIDIA want to pay the developers to optimise, they can. And in fact they do.

The issue is that they are substituting lower quality images in place of what the developer's code path and the game settings indicate. That isn't an "optimisation", as it has only one purpose: to increase frame rate. Let me make this clear: NVIDIA aren't telling people that they are doing this.
 

InFeRnO

What is this storage?
Joined
Sep 19, 2003
Messages
16
Location
Madison
I must agree with LiamC. Nvidia has not only lost their stranglehold on the market, but they are relentless and unscrupulous in their endeavour to retake the throne from ATI. ATI cards are just pound for pound the best cards on the market. The fact that the designers of one of the two most hyped-up games of the year (HL2) have signed an OEM contract with ATI says a lot. Look at the benchmarks, just read the articles on
www.hardocp.com
for about a week and you'll start to get a feel for how the hard-core gamer market is swaying. If this guy is hell-bent on getting the highest-end vid card on the market, nudge him towards the Sapphire Radeon 9800 Pro 256 (I have it in my comp and get a 10% OC on it!). I truly believe that if you are going to drop over $400 on a single component in your PC, you had better damn well get the best bang for your buck: something that is not only good for now, but will persevere in the face of new competition 3-6 months down the line and doesn't fall to the ass-last benchmark spot on the docket. Well, that is just my pointless rambling. I used to be an NVIDIA and AMD junkie; I now run the Rad 9800 Pro 256 and an Intel P4 3.2C. Call me a sellout, or just call me a guy who can't afford to buy junk!
 

InFeRnO

What is this storage?
Joined
Sep 19, 2003
Messages
16
Location
Madison
I thought that some of you might get a kick out of this. The link for the page is http://www.theinquirer.net/?article=11659 if you want the background on the story. This is absolutely hilarious. Look just below the counter on the Nvidia booth on the right side of the pic; looks like ATI made a move on Nvidia!
:)
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
It's silly to do an "us against them" thing with the graphics-chip makers. Frankly, both sides cheat as much as possible on benchmarks, and I really don't believe there's such a thing as an unbiased source of information about ATI or nvidia.

Both nvidia and ATI make products that perform more-or-less identically at any particular price point.

There's a "bang for your buck" issue, maybe, but honestly, you buy the one you can afford or the one you've had better experience with.

Now, this is coming from someone who is as anti-nvidia as can be, but what really annoys me is that we might be coming to a day or time when particular software, by virtue of the ever-shrinking numbers of GPU makers, will only run on one chip or the other.

I don't want to see that, and neither should anyone else. But it might happen someday soon. The whole "Half Life 2 works better on ATI" promotion is just a sign of unpleasant things to come.
 

InFeRnO

What is this storage?
Joined
Sep 19, 2003
Messages
16
Location
Madison
Mercutio said:
It's silly to do an "us against them" thing with the graphics-chip makers. Frankly, both sides cheat as much as possible on benchmarks, and I really don't believe there's such a thing as an unbiased source of information about ATI or nvidia.

Both nvidia and ATI make products that perform more-or-less identically at any particular price point.

There's a "bang for your buck" issue, maybe, but honestly, you buy the one you can afford or the one you've had better experience with.

Now, this is coming from someone who is as anti-nvidia as can be, but what really annoys me is that we might be coming to a day or time when particular software, by virtue of the ever-shrinking numbers of GPU makers, will only run on one chip or the other.

I don't want to see that, and neither should anyone else. But it might happen someday soon. The whole "Half Life 2 works better on ATI" promotion is just a sign of unpleasant things to come.

I concur; I think that day is approaching at an ever-increasing pace. However, the problem doesn't necessarily lie in the proverbial laps of either the GPU makers or the software companies. I think a lot of it has to do with consumers. Consumers desire, and almost need, to know that what they are buying is the optimum purchase for what they intend to use it for (or at least American consumers are like that). Thus, the marketing groups are targeting audiences that are receptive to the slogan "ATI OEM w/HL2", because they then assume (often impulsively) that that chipset is the best for that game. I think consumers need to make it clear that we want GPUs that are ubiquitous in the software coding industry, so that we aren't left unable to play certain games or run certain software due to specific hardware constraints. Well... I'll just wait until quantum computing reaches the brink and then we won't have to worry about any of this; our boxes will all be the size of soda cans anyway. :mrgrn:
 