legoman666 said:
So I took the chart and found the lowest price for each card (incl. rebates) on Newegg, then divided the avg PPD by the cost.
The card with the best ratio is the 8800GS at 52.63 PPD/$. The next highest is the 8800GT 256MB at 46.07 PPD/$.
Updated. The numbers previously listed were based on older data.
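
For anyone who wants to rerun that simple ratio with their own prices, here's a rough Python sketch of the PPD-per-dollar calculation. The PPD and price numbers are taken from the table further down in this post; swap in whatever your chart and Newegg show.

Code:
# Quick sketch of the PPD-per-dollar ranking legoman666 describes.
# PPD and prices here come from the table below; edit to taste.
cards = {
    "8800GT 256": {"ppd": 4607, "price": 100},
    "8800GT 512": {"ppd": 4762, "price": 125},
    "9800 GTX":   {"ppd": 5937, "price": 170},
    "GTX 280":    {"ppd": 6901, "price": 400},
}

# Sort by PPD per dollar, best ratio first
ranked = sorted(cards.items(), key=lambda kv: kv[1]["ppd"] / kv[1]["price"], reverse=True)
for name, c in ranked:
    print(f"{name:12s} {c['ppd'] / c['price']:6.2f} PPD/$")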
I'm going to try to incorporate energy costs. I scoured the internet to get approximate system energy usage for the various cards while loaded. I'll also assume a $0.10/kWh cost (your local cost may differ) and then extrapolate all PPD and costs over a two-year lifetime.
		Code:
Card            PPD      W   Cost Energy  Points/$
                              ($) ($/yr)   (2 yrs)
8800GTS        4223    283    150    248      4770
8800GTX        4538    228    170    200      5810
8800GT 256     4607    236    100    207      6540
8800GT 512     4762    269    125    236      5820
8800 Ultra     4950    330    236    289      4440
9800 GX2       9996    289    330    253      8730
8800 GTS 92    5512    264    150    231      6570
9800 GTX       5937    212    170    185      8030
GTX 260        6476    263    245    230      6710
GTX 280        6901    313    400    274      5310
GTX 280 SLI   13002    508    800    445      5620
9800GX2 SLI   19992    461    660    404      9940
8800 GT SLI    9524    271    250    237      9602
	 
 
Most of the loaded system wattage figures come from a variety of AnandTech video card reviews.
The 9800 GX2 has its PPD doubled on the assumption of running two clients.
The SLI versions have their PPD doubled, assuming twice the clients, so the GX2 SLI setup runs 4 clients.
The points per dollar do not include any CPU/SMP clients that may also be running.
The GPU2 PPD assumes a CPU capable of running the GPU client at its maximum.
The energy cost/year calculation = system wattage / 1000 (to convert to kW) x 365 days x 24 hr x $0.10/kWh.
Example GTX 280: $274/year = 313 W x 365 days x 24 hours x 0.1 ($/kWh) / 1000.
To get the points/$ over two years: PPD x 365 days x 2 years / (original cost + energy cost/year x 2 years).
Example GTX 280: 5310 = 6901 x 365 x 2 / ($400 + 2 x $274).
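
If you want to plug in your own electric rate or card price, here's a rough Python sketch of that same two-year math, using the GTX 280 row from the table as the check (the $0.10/kWh rate is the assumption from above):

Code:
# Two-year points-per-dollar including energy, same formulas as above.
RATE = 0.10    # $/kWh -- assumed rate, change to your local cost
YEARS = 2

def energy_cost_per_year(system_watts, rate=RATE):
    # watts -> kW, times hours per year, times $/kWh
    return system_watts / 1000 * 365 * 24 * rate

def points_per_dollar(ppd, system_watts, card_cost, years=YEARS, rate=RATE):
    total_points = ppd * 365 * years
    total_cost = card_cost + energy_cost_per_year(system_watts, rate) * years
    return total_points / total_cost

# GTX 280 from the table: 6901 PPD, 313 W loaded system, $400 card
print(round(energy_cost_per_year(313)))           # 274 ($/yr)
print(round(points_per_dollar(6901, 313, 400)))   # 5312 (table rounds to 5310)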
I'm making this so that others can judge a more realistic cost for GPU2 folding. The wattage numbers are at best an estimate: they weren't measured while actually folding, they cover the entire system, and different computers are more or less efficient. So take the numbers with a grain of salt. That said, I think they may have value to some.