Gilbo
The "Best GPU for $100" thread that Clocker started reminded me that I have a (ideally budget) video card purchase coming up. The decision has become surprisingly difficult, so I am hoping to leverage some of SF's experience.
Just about the only work I do that is demanding on hardware is Photo Editing, and this pretty much exclusively drives my hardware purchases.
Crucial considerations:
1. Linux support. The computer runs Linux, so this is mildly important. NVidia is certainly the leader with respect to this consideration.
2. 3D performance is utterly irrelevant.
3. 2D analog signal quality is crucial. This is the kicker. The card needs to drive two displays at 2048x1536 at 75Hz (a little headroom in the specs would be nice, though). My research indicates this requires 350+ MHz RAMDACs (which are pretty common these days).
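For what it's worth, the 350+ MHz figure in point 3 can be sanity-checked with a back-of-the-envelope pixel-clock estimate. The sketch below is a hypothetical helper, not a real modeline calculator; the ~32% blanking overhead is an assumption roughly in line with typical GTF-style CRT timings:

```python
# Rough dot-clock estimate for a CRT mode. The blanking overhead is an
# assumed ~32% (GTF-ish); real modelines vary, so treat this as a sanity
# check rather than an exact figure.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.32):
    """Active pixels per second, inflated by blanking, in MHz."""
    active_pixels_per_sec = width * height * refresh_hz
    return active_pixels_per_sec * (1 + blanking_overhead) / 1e6

clock = required_pixel_clock_mhz(2048, 1536, 75)
print(f"2048x1536@75Hz needs roughly {clock:.0f} MHz")
```

That works out to a bit over 300 MHz, so a 350 MHz RAMDAC should just cover the mode, and 400 MHz parts leave the extra headroom mentioned above.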
Now, you can get an NVidia 5200 with dual 400 MHz RAMDACs quite inexpensively these days. Unfortunately, the RAMDACs don't appear to be the be-all and end-all when it comes to 2D analog signal quality. I've read numerous reports of cards --particularly NVidia cards-- delivering terrible analog signals to monitors at high resolutions and refresh rates. I hear it relates to the output filter capacitors? I have mild electrical skills, but please don't ask me to solder audiophile capacitors onto the ass of my graphics card in search of a good deal.
I am very curious to hear what experiences, if any, any of you at SF have had at higher resolutions with analog connections. I currently run 1600x1200@75Hz on a Matrox G550. The image is perfectly fine, and I can make none of the complaints I have heard from NVidia and ATI owners regarding blurriness, fuzziness, and general funniness at high resolutions. I also have never seen the output of an NVidia or ATI video card above 1280x1024, so I have little experience to bring to the table regarding what differences there really are between Matrox and other brands. Is there really a problem with the non-Matrox cards?
Why don't I use Matrox? Matrox does indeed seem to be the logical choice, although there is quite a premium attached to their cards. I discovered during the course of my research (while lurking and searching at the Matrox forums) that Matrox's 2D reputation, in one specific respect, is rather undeserved. It appears that even the Parhelias can't push 24-bit colour, or greater, above 1600x1200 with 2 monitors. You must make various tradeoffs to go above 1600x1200, one of which is dropping to 16-bit colour (I had a better link earlier but I lost it). This does not appear to be an issue with NVidia or ATI cards.

I posted regarding my concerns here, and while the thread looks very nice it is an example of my problem. One, the official Matrox posters avoid at all costs telling users the specifics publicly on the forum --I had to dig up a post by a user regarding the details of the tradeoffs to find out about 16-bit colour, which I can't find anymore. Two, the sales team messed around with me for a couple of e-mails and also avoided giving me specifics. I never extracted anything of value regarding my specific usage. Consequently I would prefer an alternate solution.
So, does anyone think any NVidia 5200-based cards are going to offer decent signal quality at higher resolutions? Any specific brands that might be a good bet? Will I have to go to the more expensive cards (5700, etc.) to get 400 MHz RAMDACs that are strapped to decent filters and produce quality output? Or am I doomed to making a blind purchase and returning and swapping cards until I find one that doesn't make my monitors blurry?