GeForce Decision... Which One?

Which GeForce is the best price/performance bargain?

  • GF4 Ti4200 128MB
  • GF4 Ti4200 64MB
  • GF3 Ti200 128MB
  • GF3 Ti200 64MB

Clocker

Storage? I am Storage!
I'm considering upgrading my GF2 GTS 32MB to one of the following (listed in order of preference) but cannot decide which.

GF3 Ti200 128MB
GF4 Ti4200 64MB
GF4 Ti4200 128MB
GF3 Ti200 64MB


Price/performance is my primary consideration. Some questions I have are:

Will there be games that use 128MB RAM before I need a new card anyway?

Are the added DX features of the GF4 really necessary/worth it?

Any of you video card gurus out there...what do you think?

Clocker
 

Prof.Wizard

Wannabe Storage Freak
I think games that take advantage of the 128MB are just around the corner. But everything depends on your monitor...

If 19" or more, seriously consider the 128MB G4 Ti4200... if less, go for the 64MB edition.

Let the G3 Ti200 go. The price difference is not that great. Take the newer product.
 

Clocker

Storage? I am Storage!
Good point, PW. My monitor is 19" and will probably get larger. I'd say the minimum resolution I'll use is 1024x768.

Keep that in mind while voting....

C
 

Mercutio

Fatwah on Western Digital
Just a second:

Nvidia should have a new chip out in what, July? August? Whatever makes the current chip so zippidee-do-dah special, they'll probably toss in a whole bunch more execution units to do more of it, bump up the local memory I/O, and maybe add some special new stuff to make idiots with too much money think it's worth buying right away (cf. GeForce T&L).

Today, there is *nothing* that uses the full power of a GF3. All those neato features in a GF4 aren't doing squat in any game I'm aware of.

In short, hold out until August if you think for some terrible reason you want to pollute your computer with an nvidia chip, or buy whatever is cheapest right now. Doubtless you won't be able to tell the difference anyway.
 

Prof.Wizard

Wannabe Storage Freak
Mercutio said:
...if you think for some terrible reason you want to pollute your computer with an nvidia chip, or buy whatever is cheapest right now. Doubtless you won't be able to tell the difference anyway.
I just knew it... I was betting you were gonna write something like this... :lol:
 

Mercutio

Fatwah on Western Digital
At this point, in 2D, it doesn't matter. A 16MB card does every desktop resolution any monitor can support.
In 3D, it matters a little bit, but only up to the limits of what game designers are assuming people have... which right now is 32 - 64MB.
Memory is mostly used as framebuffer (a place to draw frames prior to display, with higher resolutions requiring more memory per frame - and don't forget that we're talking about a new card being able to render 200 frames of Quake3 every second) and for storing textures - and most games at this point don't load so many textures that there's a good case for 128MB cards, either.
Things like antialiasing also play a part here - people using lower resolutions might have 2x or 4x or nvidia's weirdo quasi-4x AA turned on. The idea is to render at some multiple of the screen resolution and then average out the pixel values for display (that's a good enough explanation; I'm sure it's more complex than that). This expands memory usage and memory bandwidth needs dramatically, but with any AA there's a diminishing return at high resolutions; an angled straight line is probably going to look like an angled straight line (instead of a stair-stepped jaggy line) at 1280x1024 whether it's been oversampled or not.
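To put rough numbers on that, here's a back-of-the-envelope sketch (it models 4x AA as rendering at 2x width and 2x height, which is an assumption, and it counts only a single color buffer, so real usage runs higher):

```python
import math

BYTES_PER_PIXEL = 4  # 32-bit color

def color_buffer_mb(width, height, samples=1):
    """One color buffer, rendered at a supersampled resolution.
    4 samples is modeled as 2x width and 2x height, averaged down."""
    scale = math.isqrt(samples)
    return width * scale * height * scale * BYTES_PER_PIXEL / 2**20

for w, h in [(800, 600), (1024, 768), (1280, 1024)]:
    print(f"{w}x{h}: {color_buffer_mb(w, h):4.1f} MB plain, "
          f"{color_buffer_mb(w, h, 4):5.1f} MB with 4x AA")
```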

Anyway, like I said, most of this stuff just isn't going to matter. You get to 150fps and the action is so fast and furious that it probably doesn't matter whether the arches on the screen have a slight jaggedness at 1024x768, or whether the "water" on the screen does accurate refraction and accurate bump-mapping (which is a silly way to do water anyway).

And yes PeeWee, I do believe that nvidia makes truly inferior products. Cope.
 

Mercutio

Fatwah on Western Digital
32MB will handle 1024x768 in truecolor pretty well, within some reasonable limits for texture size and scene complexity.
 

P5-133XL

Xmas '97
Gimme a break. Even 1600x1200x32-bit color requires only 8MB of RAM; all the rest is related to 3D and texturing. Monitor size has little to do with video card memory requirements, because very little of the memory goes to the increased resolution one can run on a larger monitor.
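The arithmetic behind that figure, as a quick sketch (a single bare buffer; a double-buffered desktop would need twice this):

```python
# Plain 2D desktop framebuffer: width x height x bytes per pixel.
for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    mb = w * h * 4 / 2**20  # 32-bit color = 4 bytes per pixel
    print(f"{w}x{h} x 32-bit: {mb:.1f} MB")
# 1600x1200 comes to ~7.3 MB, i.e. under 8MB as claimed; the rest
# of a 64MB or 128MB card is there for 3D buffers and textures.
```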
 

Prof.Wizard

Wannabe Storage Freak
Modern simulations and games (IL-2 Sturmovik, WWII Online, SOF2*, etc.) could use more than 64MB of RAM on the newest cards.
And these are titles already on the market. Don't get me started on newer ones (Doom III, Warcraft 3, etc.) that are about to be released...

Memory size is a serious bottleneck if you use a resolution above 1024x768 with all of the game's features on. And using such resolutions is not unusual if you have a 19+" monitor...


*it's amazing the things you can tweak in the video settings menu of this title...
 

P5-133XL

Xmas '97
Prof.Wizard said:
Modern simulations and games (IL-2 Sturmovik, WWII Online, SOF2*, etc.) could use more than 64MB of RAM on the newest cards.
And these are titles already on the market. Don't get me started on newer ones (Doom III, Warcraft 3, etc.) that are about to be released...

Memory size is a serious bottleneck if you use a resolution above 1024x768 with all of the game's features on. And using such resolutions is not unusual if you have a 19+" monitor...


*it's amazing the things you can tweak in the video settings menu of this title...

Most modern monitors, even small ones, can easily handle 1024x768+ resolutions, and even a small monitor will look better at higher resolutions, assuming you are not exceeding its bandwidth limitations. So I would assume that those with small monitors will generally choose higher resolutions too. Thus, even the indirect association between monitor size and resolution breaks down.

As one increases the resolution there is far more work to do (a geometric increase), both for the video card and the CPU. However, the amount of RAM required is not the limiting factor; the limit is the basic throughput capabilities of the card, the CPU, the various buses, and even the throughput of the RAM itself. The amount of RAM needed is the amount needed to store the temporary data, which tends to scale linearly with resolution, plus the amount needed to store the basic textures. All modern video cards have plenty of video RAM. I will accept that future games, with their planned exceedingly detailed textures, will probably need lots of RAM, but that has nothing to do with monitor size or monitor resolution; it is simply what is required to store those textures.
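A rough sketch of that split (the three-buffer count - double-buffered color plus a Z-buffer - is an assumption for illustration): the buffer term grows with resolution, while the texture term is fixed by the game's art.

```python
def vram_use_mb(width, height, texture_mb, buffers=3, bpp=4):
    """Per-resolution buffers (assumed: front, back, Z) plus a
    texture pool whose size is set by the game's content, not by
    the monitor's resolution."""
    return buffers * width * height * bpp / 2**20 + texture_mb

for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
    print(f"{w}x{h}: {vram_use_mb(w, h, texture_mb=32):.1f} MB total "
          f"(32 MB of it textures, at any resolution)")
```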
 

Mercutio

Fatwah on Western Digital
Prof.Wizard said:
Mercutio said:
And yes PeeWee, I do believe that nvidia makes truly inferior products. Cope.
Then you're off topic. Check the thread's title again.

And as you'll see, I offered my opinions within the confines of the options given - Wait or buy the cheapest one.
 

Handruin

Administrator
Since I don't know what each card actually costs, and I'm not inclined to go look it up right now, I chose the GeForce 3 Ti 200 64.

Why? Because, in my own experience, I was playing all my games just fine with a Radeon 64 DDR. I've since acquired a GeForce 3 Ti 200 64DDR, and it made a considerable difference in all of my games. I'm able to play WarCraft III @ 1600x1200, 32-bit color, at 85 Hz with no problems. None of the games I have show signs of lag with this card currently.

So let's say 6-12 months down the road Doom III comes out. It may very well swamp the GF3 Ti 200. But in that amount of time, the top-of-the-line GF4 cards will be around the same price as the GF3 is right now. Nvidia will be pushing out the latest and greatest, and if Matrox and Creative have their act together, the competition may help drive some of the costs down.

I think the better question to ask, Clocker, is: what do you need the card to do? If it's more for fun and to have a much faster card, then spend the bucks and get the GF4 that everyone else voted for. If you don't really need it for anything to run correctly at this time, then buy the cheaper option or wait. If the card I voted for makes little sense to buy, then I would hold off for a few more months.

Most of this same reasoning went into my decision to buy the Radeon 64 DDR at the time. I didn't want to shell out high-$300s USD for a GF2, and so far I'm glad I did not. My less expensive Radeon ($180 at the time) has paid for itself, and now I can skip a generation and buy the almost-end-of-life GF3.
 

Groltz

My demeaning user rank is
Mercutio said:
Today, there is *nothing* that uses the full power of a GF3. All those neato features in a GF4 aren't doing squat in any game I'm aware of.

Then clearly you haven't tried a newer game engine with 4XS antialiasing plus 8X anisotropy turned on. I have a GF4 Ti4600 on a 1700MHz Athlon XP with 1024MB RAM, and my frame rates drop into the 20s and 30s in parts of 3DMark SE at 1024x768 when those features are enabled. To a lesser degree, actual games have the same effect: Max Payne, Serious Sam 2, etc. Sure, you can run newer games with anisotropy and antialiasing turned off, but the difference is very apparent if you do some A/B comparisons. They make a game look much better.

Kevin, if you want the ability to run those features and still get usable framerates, go with a GeForce4-chipped card. Also, 2D quality has improved significantly on the GF4 compared to the GF3, as noted by many reviews. There is little difference between the 64 and 128 meg versions of the Ti4200 until you get high up in resolution. The 64 meg card has faster RAM than its 128 meg counterpart (500MHz versus 444MHz). However, a utility like RivaTuner can easily clock the RAM on a 128 meg Ti4200 up to 500MHz with no problems. That would be my choice. I'd go with VisionTek if possible; their cards are produced domestically.
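The reason the RAM clock matters is peak bandwidth, which is just effective clock times bus width. A quick sketch, assuming the Ti4200's 128-bit memory bus and the effective DDR rates quoted above:

```python
BUS_BYTES = 128 // 8  # assumed 128-bit memory bus

def bandwidth_gbps(effective_mhz):
    """Peak memory bandwidth = effective clock x bus width."""
    return effective_mhz * 1e6 * BUS_BYTES / 1e9

for name, mhz in [("Ti4200 64MB (500MHz)", 500),
                  ("Ti4200 128MB (444MHz)", 444),
                  ("128MB clocked up to 500MHz", 500)]:
    print(f"{name}: {bandwidth_gbps(mhz):.1f} GB/s")
```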

--Steve
 

Clocker

Storage? I am Storage!
Thanks for all your input, guys... keep it coming... I'm reading it all....

I'm kind of liking Handruin & Merc's advice at this point, since I have no problem with selling and upgrading as needed or when it makes sense... I don't really game a whole hell of a lot, but this GF2 is 18 months old now... the fan broke, and I have a big passive cooler on it that takes up a PCI slot when I have the fan installed... (don't really need the fan except on really hot days)...

C
 

Mercutio

Fatwah on Western Digital
[sarcasm mode]
Wow, Groltz, you mean there's a benchmark that can make your spiffy new nvidia card as slow as any old 18-month-old card?

How is that possible?
[/sarcasm mode]

OK, really. Now that *I'M* done laughing, show me games that can cause your card to drop to 30fps.

For that matter, show me what difference 4xAA makes on a 1024x768 display.
 

Handruin

Administrator
Mercutio said:
[sarcasm mode]
Wow, Groltz, you mean there's a benchmark that can make your spiffy new nvidia card as slow as any old 18-month-old card?

How is that possible?
[/sarcasm mode]

OK, really. Now that *I'M* done laughing, show me games that can cause your card to drop to 30fps.

For that matter, show me what difference 4xAA makes on a 1024x768 display.

Return to Castle Wolfenstein at 1600x1200 or higher @ 32-bit with 4XS and all details set to the highest; that should slow it down. :)

I think it was Groltz who told me how Morrowind chugs with a GeForce 4. I forget the link, but people were complaining in the forums.
 

Handruin

Administrator
Mercutio said:
[sarcasm mode]
Wow, Groltz, you mean there's a benchmark that can make your spiffy new nvidia card as slow as any old 18-month-old card?

How is that possible?
[/sarcasm mode]

OK, really. Now that *I'M* done laughing, show me games that can cause your card to drop to 30fps.

For that matter, show me what difference 4xAA makes on a 1024x768 display.

I also forgot to add that I can see the difference 4xAA makes in UT, even when I'm playing at a fast pace. I notice the edges of walls and corners are smoother. Maybe it's not a big deal, but it does add a nice touch to not see (as many) jaggies while playing.
 

Prof.Wizard

Wannabe Storage Freak
P5-133XL said:
Most modern monitors, even small ones, can easily handle 1024x768+ resolutions, and even a small monitor will look better at higher resolutions, assuming you are not exceeding its bandwidth limitations. So I would assume that those with small monitors will generally choose higher resolutions too. Thus, even the indirect association between monitor size and resolution breaks down.

It doesn't break down at all. If I had a 19" or 21" monitor, I would surely have set the resolution of Soldier of Fortune II higher than 1024x768.
There are two reasons I'm NOT doing it...
1) I have a 17" monitor (even if its max res is 1600x1200, the sprites then become so small...)
2) My G3 Ti200 64MB cannot cope with "heavy" scenes...

Your own assumption
...even a small monitor will look better at higher resolutions...
proves one thing: if the small ones can handle it, imagine the big ones!!
 

Prof.Wizard

Wannabe Storage Freak
P5-133XL said:
As one increases the resolution there is far more work to do (a geometric increase), both for the video card and the CPU. However, the amount of RAM required is not the limiting factor; the limit is the basic throughput capabilities of the card, the CPU, the various buses, and even the throughput of the RAM itself. The amount of RAM needed is the amount needed to store the temporary data, which tends to scale linearly with resolution, plus the amount needed to store the basic textures.
That's why I said high-resolution games/sims with all features on. Then the textures, even compressed, can easily saturate the RAM.

All modern video cards have plenty of video RAM. I will accept that future games, with their planned exceedingly detailed textures, will probably need lots of RAM, but that has nothing to do with monitor size or monitor resolution; it is simply what is required to store those textures.
First of all, let me repeat: I said monitor size in the sense of monitor resolution.
Exceedingly detailed textures at exceedingly high resolutions will need exceedingly large RAM capacities. The correlation is direct.
If you don't believe it, I suggest you try one of the latest titles (SOF2, RtCW, etc.)...

BTW, why don't you all read this article...?
http://www.xbitlabs.com/video/gf3ti200-128/

The conclusion is the same:
Well, the results obtained give us every reason to say that your GeForce3 Ti200 based card doesn't need 128MB of graphics memory onboard in today's games. However, in the future, when new games with larger textures come out, 128MB of graphics memory may turn out to be a real advantage. By the way, you may remember that when the first graphics cards with 32MB and then 64MB of graphics memory came out, we drew the same conclusions about the onboard memory as we do now. As time passed, we saw that 32MB of graphics memory is sometimes not enough for contemporary games. Moreover, most of today's graphics cards are equipped with the "absolutely useless 64MB graphics memory," as we thought then.

Says it all. Cheers.
 

Prof.Wizard

Wannabe Storage Freak
Mercutio said:
OK, really. Now that *I'M* done laughing, show me games that can cause your card to drop to 30fps.
World War II Online. When I went from a Voodoo5 (a G2-class card) to a G3 Ti200, my frame rate went from a mere 15-20fps to 40-50...
The only change was the video card, so I presume it was all the fault of the old adapter. Same resolution: 800x600 with 4x AA on both cards. Now the same happens if I set a 1024x768 resolution. I think I HAVE reason to believe that the same could happen to a G4 Ti4600 if you enable 4XS AA at 1600x1200...
For that matter, show me what difference 4xAA makes on a 1024x768 display.
Jeez. When was the last time you played a sim?! 1980?

------------------------------------------------------------------------------------
Well, I can't say I don't understand you... Tux plays great even with an old TNT. Why would you ever need a G4? :lol:
 

Mercutio

Fatwah on Western Digital
Prof.Wizard said:
Jeez. When was the last time you played a sim?! 1980?

Unless you're willing to count the Mechwarrior games, never.
Er, OK... "GATO". It ran on XT-class machines and featured exciting CGA graphics. Or maybe "Jet", which was a wireframe CGA game that ran on PCjr. systems.

I'd like to see screenshots of the purported differences between high-res and high-res+AA.
 

Sol

Storage is cool
A couple of my friends recently purchased GF4 Ti4400s. They leave antialiasing on pretty much the whole time, and boy is it pretty. If the 4200s had been out at the time, they might have gotten those instead, since the price/performance ratio does seem better just looking at benchmarks.

By the way, antialiasing at high resolution is a big RAM user, and on a big monitor antialiasing is more noticeable.
 

Tea

Storage? I am Storage!
When it comes to games, I like Quattro Pro quite a lot.

But while any comment I could make about the relative merits of the cards themselves would be merely a distorted echo of the things I have learned from you guys (especially Sol, who helps me out quite a lot), I can comment on the relative longevity and estimated resale value of the cards.

As I see it, Clocker, you will be keeping the card for quite a long time as video cards go: 18 months maybe, and pushing it only moderately hard.

That to me suggests that it's worth your while to get one you will be happy with: that you don't want to be looking back in six months time and saying "I wish I'd spent the extra $50". Argument for a 128MB GF IV.

On the other hand, you won't get any benefit to speak of from the extra RAM for a good while yet, and by the time you do you will be already thinking seriously about a Parhelia II or a GF VI. Argument for a 64MB GF IV.

And if it comes to that, I doubt that you'll pick much difference between the GF III and the GF IV. Argument for a 64MB GF III.

On the other hand, resale value tends to be overly affected by the amount of RAM on the card (because buyers are stupid). (Except us, of course. :)) Argument for a 128MB GF III.

Then again, you will get more for a second-hand GF IV than a GF III because IV is a bigger number. Argument for a GF IV.

But, contrary to that, the extra $$ you spend now, to get that resale value, will never ever come close to coming back to you when you sell it - i.e., if a 128MB GF IV costs $100 more than a 64MB GF III, it's reasonable to expect that you will get an extra $20 or $40 back when you sell it - a net loss of about $70. Argument for a 64MB GF III.

And, making it still more complicated, there are relativities to take into account: often the 128MB version is barely any dearer than the 64MB version, so you will get your money back when you sell it.

Bottom line: you have two choices.

(a) Keep your eyes open for a bargain. Don't worry too much about whether it's a III or a IV, or how much RAM it has; just look out for something of quality (Leadtek, ASUS, MSI, Hercules, etc.) in a GF III or IV, with whatever amount of RAM it happens to have, at an especially attractive price. That is, just buy the one that makes the best value-for-money proposition to you. Be patient, wait for your opportunity.

(b) Say what I say: "To hell with it! It's my money isn't it? I can waste it if I want to!" And race out and buy the best one in the shop. Tell them if you can't have it delivered by yesterday you don't want it.

But what about me? I'm happy with my G450s, thank you. Despite their not actually being made by God. (Refer other thread.)

I have ECS 1.0 fixpack 1, a 32MB G450, Mozilla/2 1.0, Quattro Pro and a spanking new Mitsubishi 21-incher.... I'm happy.

PS: why am I running an incredibly expensive (~AU$400) 32MB G450 when, with the apps I run, I could get the exact same performance and the exact same picture quality out of my two 16MB G450s, my 16MB G200, or even my old 8MB G200?

Just because I can. :wink:
 

P5-133XL

Xmas '97
PW,

The article you mentioned uses only one manufacturer's storage algorithms (Nvidia's). By no means is that the only, or necessarily the best, method of memory management. The numbers they are quoting can be revised downward significantly with relative ease. For example, consider the Savage 4 with its compression algorithms, or the upcoming Matrox Parhelia: there is no inherent need to apply 4xAA to every pixel (increasing the buffer from 1600x1200x4 to 3200x2400x4), only to those pixels that form diagonal lines. There are many possible AA algorithms that don't require extraordinary amounts of RAM and don't harm picture quality or slow down the card.

The games are saturating the bandwidth of the RAM, not the amount. If you took a current game and added 100GB of RAM to any reasonably modern card, it would not run any faster. If anything, adding extra RAM slows down the card, because of the capability of doing more to the picture. You don't need more RAM; you need faster RAM, or a faster CPU, or a faster bus. The games are not even coming close to using up the amount of RAM they currently have access to, yet. The amount of RAM is irrelevant as long as you have enough. Thus, the amount of RAM is disassociated from the monitor size.

The future may be different: with the addition of very large and very detailed textures, the amount of RAM needed will become far greater. Note that textures have to be stored at the resolution at which they were created, not the resolution they are displayed at (again, this dissociates the RAM requirement from the monitor's resolution). The texture is then applied to the surface, and it does not require extra RAM if displayed at a higher or lower resolution than its native resolution. There is a big difference in the storage requirements if the textures are 2000x2000x4 as opposed to 64x64x4, and there are literally thousands of them in a game that need to be stored in the video card's local memory. There really isn't extra RAM required to apply a texture (the resolution doesn't change before vs. after applying it); the extra RAM requirement comes from the original storage. Currently we are at the 64x64x4 level; maybe someday soon we will be at the 2Kx2Kx4 level, but not yet.
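Putting numbers on that last comparison (the count of 1,000 textures is made up purely for illustration):

```python
def texture_pool_mb(size, count, bytes_per_texel=4):
    """Storage for `count` square textures at their native size;
    display resolution doesn't appear anywhere in this formula."""
    return size * size * bytes_per_texel * count / 2**20

for size in (64, 2048):
    print(f"1,000 textures at {size}x{size}x4: "
          f"{texture_pool_mb(size, 1000):,.0f} MB")
```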
 

Prof.Wizard

Wannabe Storage Freak
Mark,
I'm not interested in FPS. I'm interested in the amount of rendered textures at a given resolution.
I know and understand that FPS hasn't got anything to do with the RAM's capacity, but rather, as you rightly said, with its bandwidth. However, modern texture-rich games can be limited by exactly that amount. There's no doubt about that.

Also, regarding the methods used by X-bit: I find them justified. Using the manufacturer's algorithms is the most objective way to do a test.

Don't get me wrong: thank you for your professional and clarifying answer. :D
 

Prof.Wizard

Wannabe Storage Freak
Screenshots from IL-2 Sturmovik...
I think you can see for yourself, Mercutio, what it means to play this combat simulation with anti-aliasing off:
MercutioBird2.jpg


and

WizardBird2.jpg


I'll give you a hint... :wink:
better plane = better graphics
 

Tea

Storage? I am Storage!
(But Tea, you don't know anything about video cards!)

(Never stopped you before, Tannin.)

Ahem. As I was about to say, Nvidia are famous for having a particular approach to video card problems: essentially it boils down to "never mind the finesse, just throw a lot of transistors at it." Not dissimilar to Intel's approach to CPU design, in fact.

By contrast, consider the finesse approach, exemplified by Kyro.

Work harder? Or work smarter? By inclination, I prefer the latter approach, but it seems that the brute force methods favoured by Nvidia must be adjudged the more successful ones, at least so far. Perhaps, sometime in the not too distant future, they will hit the wall, just as Intel did with the Pentium Pro, and faced with the physical limits of what can be achieved with a bit of purified beach, have to resort to transistor-saving methods instead. Doubtless, they are already thinking along these lines.

(There. Now you've gone and proved it.)

(Proved that I don't know anything about video cards?)

(Yup.)

(Shrug.)

Mark: excellent summary, by the way. Thank you.
 

Prof.Wizard

Wannabe Storage Freak
Tea said:
What's the matter with that second aircraft? I can't even see the propeller properly.
Haha... that was cool, Tea... Thanks for the laugh!
 

P5-133XL

Xmas '97
One correction to my previous post: the numbers 1600x1200x4 and 3200x2400x4 should actually be 1600x1200x3x4 and 3200x2400x3x4. This is because Nvidia applies the 4xAA to the Z-axis too. I really don't know why they apply the AA along the Z-axis, but they do.
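In numbers, taking that x3 buffer factor at face value (a quick sketch of the corrected arithmetic):

```python
def buffers_mb(width, height, buffers=3, bytes_pp=4):
    """Width x height x 3 buffers x 4 bytes, per the correction:
    color buffers plus Z, all at the rendered resolution."""
    return width * height * buffers * bytes_pp / 2**20

print(f"1600x1200 x3 x4 (no AA): {buffers_mb(1600, 1200):5.1f} MB")
print(f"3200x2400 x3 x4 (4x AA): {buffers_mb(3200, 2400):5.1f} MB")
```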
 

Clocker

Storage? I am Storage!
Well, I decided to go for a change. Just ordered a Radeon 8500LE for $107 shipped via NewEgg. This is supposed to have the 3.3ns RAM, according to the Anandtech HotDeals forum...

We'll see....

C
 

Tea

Storage? I am Storage!
Awesome performance, they tell me... will the drivers work? Fingers crossed.

Probably a wise choice, Clocker, at least in value-for-money terms. Which GeForce is it most closely equivalent to, and what price do they go for?
 

Jake the Dog

Storage is cool
I'm considering an upgrade to an ATI 8500 AIW or a GF4 4600. 64MB is fine if I can get such a card. I currently have a GF2 GTS, and I would consider a jump to a GF3 or low-spec GF4 big enough to warrant an upgrade. That's just me, though.
 

Mercutio

Fatwah on Western Digital
Radeon 8500 is neck and neck with the higher-end GF3, IIRC.
Clocker, if you end up not liking that card, I'll buy it off you. For only a slight loss on your part. ;)
 