I finally put together the parts for my next NAS and I'm working through some configuration and performance tests. I decided to go with ZFS on Linux so that I have a playground to learn on and to increase my skills in this area. This was not intended to be a budget build so I'm sure you'll have some head-scratching moments when you look over the parts list.
CPU/MB/RAM
I decided to go with a full socket 1150 motherboard vs. one of the Intel Atom setups that are popular. I found a combo deal on Newegg combining the Supermicro X10SL7-F-O and an Intel Xeon E3-1270 v3 (Haswell). I added 16GB (2 x 8GB) of ECC RAM, and I plan to increase that to 32GB shortly, which is why I listed 32GB in the build list below. I decided to go with a bit more CPU power than I originally planned because I wanted to give ZFS enough and still have headroom for other work in the future. I'll be running Samba (CIFS) and/or NFS to share data with other systems in my house. I also plan to run some media server components once I've digitized movies onto this NAS.
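For the Samba piece, a minimal share definition could look something like the following sketch (the share name and valid user are hypothetical; the path matches the pool name shown further down):

```ini
# /etc/samba/smb.conf (excerpt) -- hypothetical share definition
[nas]
   path = /nfspool
   read only = no
   valid users = doug
```

On Ubuntu 14.04, restarting smbd (`sudo service smbd restart`) picks up the change.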
The Supermicro board sold this config for me. I did a lot of reading and research on popular configs, and this board won me over. It comes with a built-in LSI 2308 adapter, giving me 8 SAS/SATA 6Gb/s ports on top of the 6 SATA ports built into the motherboard. The LSI adapter runs in IT mode, so I don't have to configure the drives as 8 x single-drive RAID 0 arrays for the OS to see them; they're all presented as plain JBOD. All the drives were seen right away by the OS and it was painless. The other configurations I priced out were more expensive and quirkier than this setup. The X10SL7-F-O also comes with a full IPMI 2.0 BMC for easy remote management (and it works awesome). There are three 1Gb NICs on the back, one of which is dedicated to the BMC. I can eventually team the two non-BMC NICs with my layer 2 switch to play with higher amounts of concurrency.
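On Ubuntu 14.04 that teaming can be done with the ifenslave package and an LACP (802.3ad) bond, assuming the switch supports it. A sketch (the `eth0`/`eth1` interface names are assumptions, not confirmed from this build):

```text
# /etc/network/interfaces (excerpt) -- requires the ifenslave package
# and an 802.3ad (LACP) port channel configured on the layer 2 switch
auto bond0
iface bond0 inet dhcp
    bond-mode 802.3ad
    bond-miimon 100
    bond-slaves eth0 eth1
```

Note that 802.3ad balances per-flow, so a single client connection still tops out at 1Gb; the win is concurrency across multiple clients.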
Case
After a long search for the right case, I chose the Rosewill L4411. It's an interesting mix of good space, a decent number of hot-swap bays, good cooling, and a relatively low price. The case comes with 3 x 120mm fans, 2 x 80mm fans, 12 SATA cables, a front dust filter, and a metal locking front panel. It's very roomy inside and can even be rack-mounted if needed. With everything installed and running, the case is very quiet; I'd have no issue putting it on my desk next to me if I wanted. The drives seem to be cooled very well by the 3 x 120mm fans (I posted temps a bit further down). The one negative I've seen with this case is that the hot-swap bays won't recognize the Samsung 850 Pro SSDs. This isn't a huge issue because I wasn't originally planning to mount them in the bays, but it was a surprise nonetheless, since everything I read said the hot-swap bays were simple pass-through. The SSDs are free-floating at the moment, but I plan to mount them with sticky Velcro for simplicity.
HDDs
I chose to go with 8 x HGST 4TB NAS drives for this build. I've had good luck with these in other builds and I've seen other decent reviews of them. I may decide to max out the bays on my case and add 4 more to the config down the road. If I decide to grow this larger than 12 drives, I'll look into replacing the case. That will also mean adding another adapter into the x8 PCIe 3.0 slot, which gives me further expansion if needed.
SSDs
I imagine several of you will question the reason I added two higher-end Samsung 850 Pro SSDs to a NAS device. I did this to experiment with various things. The 128GB SSD is being used as a boot drive for now, and I'll likely use it to stage other media-related work. The 256GB SSD is intended for experimenting with a ZFS SLOG and also an L2ARC configuration. It's way too big for a SLOG alone, but an L2ARC could take advantage of the size. Neither is likely needed in my home environment, but I'm using them to learn and experiment. I chose the Samsung 850 Pro because of the increased durability and 10-year warranty; given the nature of the L2ARC and SLOG, these drives will possibly see more IO than normal, so I went with a more durable model.
Power supply
I went with a Seasonic 650W 80 Plus Gold unit for this build. This gives me decent efficiency and some room to grow. It's a bit overkill; I know.
OS
I'm going to play around with the OS but for now I chose Xubuntu 14.04.1 LTS 64-bit since I'm familiar with it. It may not be the best option but I'd like to experiment and find out for myself before putting this into full production in my house.
ZFS config
raidz2
zfs_arc_max=8589934592 (8 GiB)
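That zfs_arc_max value caps the ARC at 8 GiB, half of the 16GB currently installed. A quick sketch of where the number comes from, and how ZFS on Linux typically persists it across reboots (the module option file is standard ZoL practice, but adjust to taste):

```shell
# zfs_arc_max is expressed in bytes; confirm 8589934592 is exactly 8 GiB
arc_max=$((8 * 1024 * 1024 * 1024))
echo "$arc_max"    # 8589934592

# To persist across reboots (the zfs module reads options at load time):
#   echo "options zfs zfs_arc_max=8589934592" | sudo tee /etc/modprobe.d/zfs.conf
```

On a running system the value can also be changed live via /sys/module/zfs/parameters/zfs_arc_max.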
Code:
doug@doug-X10SL7-F:~$ sudo zpool list
NAME      SIZE  ALLOC   FREE    CAP  DEDUP  HEALTH  ALTROOT
nfspool    29T  45.1G  29.0T     0%  1.00x  ONLINE  -
doug@doug-X10SL7-F:~$ sudo zpool status
  pool: nfspool
 state: ONLINE
  scan: none requested
config:

        NAME                        STATE     READ WRITE CKSUM
        nfspool                     ONLINE       0     0     0
          raidz2-0                  ONLINE       0     0     0
            wwn-0x5000cca24ccd25c9  ONLINE       0     0     0
            wwn-0x5000cca24ccc6404  ONLINE       0     0     0
            wwn-0x5000cca24cc6129e  ONLINE       0     0     0
            wwn-0x5000cca24ccd0a9b  ONLINE       0     0     0
            wwn-0x5000cca24ccd5e18  ONLINE       0     0     0
            wwn-0x5000cca24cccb387  ONLINE       0     0     0
            wwn-0x5000cca24cccb39d  ONLINE       0     0     0
            wwn-0x5000cca24cccb370  ONLINE       0     0     0
(note, I haven't added the SLOG or L2ARC yet)
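When I do add them, the plan sketched below is one way to do it: partition the 256GB SSD so a small slice serves as the SLOG and the rest as L2ARC. The by-id device names here are hypothetical placeholders, not the actual device paths from this build:

```shell
# Hypothetical device names -- substitute the real /dev/disk/by-id paths.
# A small partition (a few GB is plenty for a SLOG on 1Gb networking):
sudo zpool add nfspool log /dev/disk/by-id/ata-Samsung_SSD_850_PRO_256GB-part1
# The remaining space as L2ARC:
sudo zpool add nfspool cache /dev/disk/by-id/ata-Samsung_SSD_850_PRO_256GB-part2
```

Both can be removed later with `zpool remove`, so this is easy to experiment with.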
Performance
I haven't had time yet to go through things in thorough detail, but when I configured all 8 drives in a basic zpool and ran a very simplistic Linux "dd" write test (all zeros), I topped out at 1.2GB/sec writing a 60GB file (sized to surpass page caching). Reads topped out at 1.3GB/sec on the same file using the same "dd" approach. (If compression were enabled on the pool, an all-zeros test would be meaningless since the blocks compress to almost nothing, but it's off here.) I know this is far from real-world, but I wanted to see what it was capable of in the most optimal run.
Code:
sudo time sh -c "dd if=/dev/zero of=/dctestpool/outfile bs=64k count=900000"
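For reference, bs=64k x count=900000 works out to roughly 59 GB, comfortably past the 16GB of RAM. A scaled-down, self-contained sketch of the same write-then-read pattern, with the cache-drop step an honest read pass needs:

```shell
# Scaled-down version of the dd pattern above (the real run used
# count=900000, i.e. ~59 GB, to blow past the page cache)
dd if=/dev/zero of=/tmp/ddtest bs=64k count=16 2>/dev/null

# On the real pool, drop the Linux page cache before the read pass
# (the ZFS ARC is separate; it takes an export/import or reboot to clear):
#   sync && echo 3 | sudo tee /proc/sys/vm/drop_caches

dd if=/tmp/ddtest of=/dev/null bs=64k 2>/dev/null
stat -c %s /tmp/ddtest    # 16 * 65536 = 1048576 bytes
```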
I have iozone running more purposeful performance benchmarks. I'll post those details once I have them.
Example drive temperatures (during an iozone benchmark and after 18+ hours of uptime)
Code:
doug@doug-X10SL7-F:~$ sudo hddtemp /dev/sd[cdefghij]
/dev/sdc: HGST HDN724040ALE640: 33°C
/dev/sdd: HGST HDN724040ALE640: 33°C
/dev/sde: HGST HDN724040ALE640: 34°C
/dev/sdf: HGST HDN724040ALE640: 35°C
/dev/sdg: HGST HDN724040ALE640: 33°C
/dev/sdh: HGST HDN724040ALE640: 34°C
/dev/sdi: HGST HDN724040ALE640: 34°C
/dev/sdj: HGST HDN724040ALE640: 33°C
Here is the album of images on imgur showing the build.