I finally bought a basic NAS and some drives. Should the drives be fully tested separately outside of the device or does the NAS do that? Thanks.
I suppose it depends on how good your backup infrastructure is. If losing the entire array would be inconvenient (meaning more drives fail within the rebuild window than you have redundancy for), run the tests. I don't, and have been lucky enough to never lose an array, but that luck will change eventually.
Of course, there is still a chance of failure (backups!), but if restoring from those backups would be a PITA, may as well run the tests.
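For what it's worth, a common burn-in before trusting a new drive is a destructive badblocks pass plus a SMART long test. A rough sketch of the idea (this assumes Linux with smartmontools installed, run as root, and /dev/sdX is a placeholder for the new, *empty* drive):

```python
import subprocess

DEVICE = "/dev/sdX"  # placeholder; triple-check this points at the new, EMPTY drive

# Destructive 4-pass write/read surface scan; wipes everything on the drive.
# -b 4096 avoids badblocks' 32-bit block-count limit on large (8TB) drives.
subprocess.run(["badblocks", "-b", "4096", "-wsv", DEVICE], check=True)

# Then a SMART extended self-test. It runs inside the drive's firmware in the
# background, so come back hours later to read the results.
subprocess.run(["smartctl", "-t", "long", DEVICE], check=True)

# Look for reallocated/pending sector counts above zero.
subprocess.run(["smartctl", "-a", DEVICE], check=True)
```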
Out of curiosity, which NAS and drives did you get?
It is the Synology DS1515 and 4x 8TB WD Red.
The DS1515 or the DS1515+? The DS1515 is enough for your needs IMO.
If you want fast rebuild times and 10GbE that doesn't cost a fortune, what you want is called a PC.
So a faster system would run at the full hard drive speed, 10-15 hours?
Yes, depending on the number of drives in the system, but that's a reasonable ballpark figure.
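Back-of-the-envelope (my assumed sequential speeds for this class of drive, not measured on this unit): capacity divided by throughput lands right in that range.

```python
capacity_mb = 8e6                  # one 8TB member, fully rewritten on rebuild
for speed_mb_s in (150, 220):      # assumed sequential speed range for 8TB drives
    hours = capacity_mb / speed_mb_s / 3600
    print(f"{speed_mb_s} MB/s -> ~{hours:.0f} h")
# 150 MB/s -> ~15 h, 220 MB/s -> ~10 h
```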
I'll second that. Synology does have several great NAS units that have 10GbE (or support adding cards) and higher end CPUs.
QNAP's NAS units are typically cheaper at any given performance level.
For instance, this QNAP NAS is a very powerful unit for only $1,227. You'll need to buy the 10GbE NIC separately, but there are several compatible models and you could probably get one cheap on eBay. You'd save money with a pair of SFP+ NICs IMO, one in the NAS, one in your PC.
There are cheaper NAS units with 10GbE ports, but they are often under-powered and cannot fully benefit from the faster network connection.
I've almost completed backing up data to the RAID, but now realize that one of the drives in the array is the wrong one. :crap:
Parity consistency is now at 95%. I assume that and the copying should finish before I replace the drive.
Then it will take a few days to rebuild if I understand correctly.
Yup. It will be a good test of the redundancy: just pull the drive and swap it live. Make sure you know which drive first.
It's the one drive not rated for NAS use, so it was pretty obvious.
The NAS beeped incessantly once the replacement drive was installed. I had to disable the beep and start the rebuild manually from the web interface.
The "Repairing (Checking parity consistency...)" status is at 11% after about 100 minutes.
It completed successfully after 16.5 hours. I'm not clear on whether that checks all disks completely or only the data.
RAID operates at the block layer (or byte/bit layer for some RAID levels), so the controller and disks generally don't know which blocks contain data relevant to the filesystem/OS. A rebuild therefore requires reconstruction from all disks in a RAID 5/6 array, or from the appropriate mirror disk in a RAID 1/10 array.
The exception to this rule might be SSDs, which track the mapping between LBAs and the underlying flash so they can implement wear leveling and TRIM optimizations. However, even though an SSD may know that an LBA is no longer used, this information is probably not tracked by the RAID controller in a manner that allows optimizing the reconstruction process.
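To make the block-layer point concrete: RAID 5 parity is just XOR across the stripe, so rebuilding a failed member means reading the corresponding block from every surviving drive and XORing them together, whether or not the filesystem considers those blocks in use. A toy illustration (plain Python, nothing Synology-specific):

```python
from functools import reduce

def xor_blocks(*blocks):
    """XOR equal-length byte blocks together -- the whole of RAID 5 parity math."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# One stripe on a 4-drive RAID 5: three data blocks plus one parity block.
d0, d1, d2 = b"AAAA", b"BBBB", b"CCCC"
parity = xor_blocks(d0, d1, d2)

# The drive holding d1 dies; its block is rebuilt from ALL surviving members.
# The array can't skip "empty" blocks because it has no idea which those are.
assert xor_blocks(d0, d2, parity) == d1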
Thanks. If I understand correctly, during the repair the normal drives are all read and the new drive is fully written. 16.5 hours is not much longer than the time required to write a full 8TB drive directly.
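That checks out if you run the numbers backwards (rough arithmetic, assuming the whole 8TB was rewritten):

```python
capacity_bytes = 8e12                          # full 8TB member rewritten
rebuild_seconds = 16.5 * 3600
print(capacity_bytes / rebuild_seconds / 1e6)  # ~135 MB/s, near sequential write speed
```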
The drive contents sure are strange after NAS usage.
The saved NAS password is lost every time the NAS reboots. Rebooting the computer has no effect. Is that normal?
It is annoying, because one must log into the NAS to power it down.
Windows 7 Pro.
Open up Credential Manager. Delete whatever weird crap you have saved in there for that IP/hostname. Try again.
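If the GUI route doesn't stick, the same credential store can be driven from a script with Windows' cmdkey tool. A quick sketch (the host value is a placeholder for your NAS's IP or hostname):

```python
import subprocess

NAS_HOST = "192.168.1.50"  # placeholder; use the IP/hostname you actually connect with

# Show everything Windows has cached for network logins.
subprocess.run(["cmdkey", "/list"], check=True)

# Drop the stale entry for the NAS (ignore the error if none exists).
subprocess.run(["cmdkey", f"/delete:{NAS_HOST}"], check=False)

# Re-add it so the credential persists across reboots (fill in real values).
subprocess.run(
    ["cmdkey", f"/add:{NAS_HOST}", "/user:your_user", "/pass:your_password"],
    check=True,
)
```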