e_dawg said:
I don't care that the reliability results for all drives are off by a factor of ten as long as all drives are more or less off by a factor of ten. It's the relative differences I care about. And I think that even the SR reliability survey preserves enough of these differences to be able to say that the X15 (98th percentile) is more reliable than the Maxtor DM+9 (21st percentile).

This assumes that whatever reporting biases are present in the survey are spread across all makes of drives. It also assumes that the samples for each drive are large enough to draw reliable conclusions. While you might make a good case for the first part, the number of people self-reporting for each different type of drive simply isn't enough in many cases to give decent results.

If you remember from your statistics courses, the margin of error gets smaller as the sample size increases. If you flip a coin 1000 times, the fraction of heads will almost always land within a few percent of 50%. Not so if you only flip a coin ten times. You may turn up heads 3 times or 7, and the chances of that happening are fairly high. Same with the SR Drive Reliability survey. If only 10 of each size and type of drive are in the database, I personally don't think I can draw any conclusions worth a damn from this. It may be better than nothing, but not by much.

The only meaningful conclusions I think will come from the manufacturers of the drives themselves, but that is most likely privileged information that will never see the light of day. Data from someone selling large numbers of drives of the same type is almost as good. Presumably any failures due to mishandling will be spread equally among all models since it is the same customer base, so if the sample sizes are large enough any differences in RMA rate will be due to differences in reliability.
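To put rough numbers on the margin-of-error point, here is a minimal Python sketch using the textbook normal-approximation formula for a proportion's 95% margin of error. The drive counts and the 2% failure rate below are made up purely for illustration; they are not taken from the SR survey or from any post in this thread.

```python
# Minimal sketch: how the margin of error of an observed rate shrinks
# with sample size, using the normal-approximation formula
# moe ~= z * sqrt(p * (1 - p) / n). All sample sizes are illustrative.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p seen in n trials."""
    return z * math.sqrt(p * (1.0 - p) / n)

# Coin flips: 10 tosses vs 1000 tosses.
for n in (10, 1000):
    moe = margin_of_error(0.5, n)
    print(f"{n:5d} coin flips: 50% +/- {moe:.1%}")

# Hypothetical drive samples: a 2% failure rate observed in 10 drives
# vs 10,000 drives (both numbers invented for illustration).
for n in (10, 10_000):
    moe = margin_of_error(0.02, n)
    print(f"{n:6d} drives: 2% +/- {moe:.1%}")
```

With ten coin flips the uncertainty is around thirty percentage points either way; with a thousand it is about three. The same effect is what makes a ten-drive entry in the survey nearly useless and a ten-thousand-drive sample meaningful.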
Radboy said:
http://www.drugstore.com/qxp40608_333181_sespider/leight_sleepers/ear_plugs.htm

Earplugs always become uncomfortable for me after short durations.
Mercutio said:
I'd love to, for example, get ahold of Newegg's RMA rates for hard disks.... now that I think of it, has anyone ever asked a company like newegg for that information?

That's a pretty good idea. Newegg would be great from a large sample size perspective.
Also, would a large retailer have to enter into any sort of NDA about releasing any kind of failure information about the products they purchase?
jtr1962 said:
This assumes that whatever reporting biases are present in the survey are spread across all makes of drives. It also assumes that the samples for each drive are large enough to draw reliable conclusions. While you might make a good case for the first part, the number of people self-reporting for each different type of drive simply isn't enough in many cases to give decent results. [...] If only 10 of each size and type of drive are in the database, I personally don't think I can draw any conclusions worth a damn from this. It may be better than nothing, but not by much.
The only meaningful conclusions I think will come from the manufacturers of the drives themselves, but that is most likely privileged information that will never see the light of day.
Data from someone selling large numbers of drives of the same type is almost as good. Presumably any failures due to mishandling will be spread equally among all models since it is the same customer base, so if the sample sizes are large enough any differences in RMA rate will be due to differences in reliability.
e_dawg said:
I disagree. Sometimes it's true, but the problem is that different drives sold by the same retailer can take different paths through the marketing channel: different distributors, different couriers, and different handlers. And that's not all. What about differences in packaging? For example, Seagate's SeaShell clamshell packaging (often used even for OEM drives) offers their drives greater protection during their journey through the channel than the plain ESD baggies used for some of the other drives. Seagate has data that shows their SeaShell packaging makes a significant difference in the number of returned drives. Packaging standards are not uniform either. A nationwide survey like SR's is more likely to randomize any such effects from packaging, shipping and handling differences.
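The channel argument lends itself to a tiny Monte Carlo sketch: two models with identical intrinsic reliability, one of which happens to take a rougher path through a single retailer's channel. Every rate and count below is invented for illustration and is not based on any real retailer, courier, or drive data.

```python
# Minimal Monte Carlo sketch: two models with the SAME intrinsic failure
# rate can show different RMA rates at one retailer if they travel
# different paths through the channel. All rates and counts are made up.
import random

random.seed(0)

INTRINSIC = 0.01          # assumed true failure rate of both models
ROUGH_HANDLING = 0.02     # assumed extra failures added by one rough path
N = 20_000                # drives of each model sold by the retailer

def failures(n: int, rate: float) -> int:
    """Count simulated failures among n drives failing independently at `rate`."""
    return sum(random.random() < rate for _ in range(n))

# Single retailer: model A ships in clamshells via a careful path,
# model B ships in bare ESD bags via the rough path.
rma_a = failures(N, INTRINSIC) / N
rma_b = failures(N, INTRINSIC + ROUGH_HANDLING) / N
print(f"one retailer : model A {rma_a:.2%}, model B {rma_b:.2%}")

# Nationwide survey: both models arrive through a mix of channels,
# so the handling penalty is spread roughly evenly across them.
mixed_rate = INTRINSIC + ROUGH_HANDLING / 2
rma_a = failures(N, mixed_rate) / N
rma_b = failures(N, mixed_rate) / N
print(f"many channels: model A {rma_a:.2%}, model B {rma_b:.2%}")
```

The first comparison prints a noticeably higher RMA rate for model B even though both models are equally reliable; the second, with the handling penalty spread across both, prints roughly equal rates.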
Adcadet said:
For all I know every WD drive in their database was sold through Vendor X who puts every bare drive through a clothes dryer before shipping.
Tea said:
Tannin, would you mind not answering my questions?
Ahem ....
Bozo: 4 failures out of 75 drives is not an acceptable failure rate these days. That is about 5.3%. You should be getting a failure rate about ten times lower than that. Our Samsung failure rate currently stands at less than 0.3% and has been stable on or around that figure for quite some time, on a sample that is now gradually approaching 3000 drives. (7 failures within the three-year warranty period; I haven't counted the total number of drives shipped lately, but over 2500, I think. I'll try to find time to check exactly sometime soon.)
A sample of 75 isn't definitive, but 4 failures is sufficient to give a reasonable indication, and your Western Digital drives are failing more than fifteen times as often as our Samsungs.
That 5.3% number doesn't really surprise me, as our own WD failure rates (back before we stopped selling Seagate and Western Digital) were somewhere in between your numbers and Mercutio's.
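As a rough sanity check on whether 4 failures in 75 drives really can be compared against 7 in roughly 2500, here is a minimal sketch that puts Wilson score 95% confidence intervals around the two quoted rates. The Wilson method is chosen here only because it behaves sensibly for small counts; it isn't something anyone in the thread used, and the 2500 figure is the rough "over 2500" count from the post, not an exact number.

```python
# Minimal sketch: Wilson 95% confidence intervals for the two failure
# rates quoted above (4 of 75 vs 7 of ~2500; the 2500 is approximate).
import math

def wilson_interval(failures: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = failures / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

for label, k, n in (("WD (Bozo)", 4, 75), ("Samsung (Tannin)", 7, 2500)):
    lo, hi = wilson_interval(k, n)
    print(f"{label:17s}: {k}/{n} = {k/n:.2%}, 95% CI {lo:.2%} - {hi:.2%}")
```

Even with only 75 drives, the lower end of the first interval comes out above the upper end of the second, which is consistent with the claim that 4 failures out of 75 is enough to indicate a genuinely higher failure rate, even if the exact ratio is uncertain.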
Bozo said:
Western Digital: 68+ installed. This is a mix of BB and JB drives. Most have been running for almost 3 years. We have had 3 failures.

time said:
At the risk of sounding anally retentive, Bill, it's not a self-selecting sample.
ALL of the drives he sold are included, rather than a sample selected by the whim of individual customers.
The only real issue is the fact that he buys them from a single supplier.

As you may have guessed, Dave, I know little to nothing about statistics. Thank you for the explanation. I'm glad it was only Tea that I was teasing and not Tony. :mrgrn: (Mum's the word, Tea.)