Linux + FreeNAS + Crashplan setup

Adcadet

Storage Freak
Joined
Jan 14, 2002
Messages
1,861
Location
44.8, -91.5
Hey Gang,
My main machine is 4 years old (a 2600K running at 4.4 GHz) and still on Windows 7. Its performance has some issues and it feels glitchy. I'm also tired of Windows, and my reasons for sticking with it are fading. So I recently rebuilt my previous Linux machine with an i7-4790K, 32 GB RAM, and a Samsung 850 Pro, installed Linux Mint, and now it's my main machine. So far I'm very happy with it and want to fully migrate away from my Windows machine. I'd like your advice on how to set up a backup system using FreeNAS and CrashPlan.

I have a dedicated machine running FreeNAS - a 2600K with 16 GB RAM. It has two arrays (zpools): one RAID-Z that stores media to stream via Plex (running in a FreeNAS jail), which I want to keep, and one RAID-Z with a hot spare (maybe RAID-Z2 would have been better) that has been a backup target for CrashPlan on Windows (I mapped it as a drive to trick CrashPlan into thinking it was local). Both zpools have folders shared via CIFS (the only protocol I really knew about when I set things up), but I'm now thinking sharing with NFS might be better - faster? Perhaps I could mount some NFS shares from the FreeNAS machine on my Linux machine, and my Linux CrashPlan client could then back up what's in those shares to the CrashPlan cloud and elsewhere. Or perhaps I should keep all my storage local and let CrashPlan back it up to an NFS share on FreeNAS (vs. installing a CrashPlan client in a FreeNAS jail to receive backups). Or maybe I should just use rsync or similar to keep a copy on the FreeNAS machine and let CrashPlan handle the cloud and elsewhere.
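For what it's worth, mounting an NFS export from FreeNAS on a Linux Mint box is only a couple of commands. A rough sketch - the IP address and paths here are made up for illustration, and it assumes you've already enabled the NFS service and export on the FreeNAS side:

```shell
# On the Linux client (assumes the FreeNAS box at 192.168.1.10
# exports /mnt/tank/backups over NFS)
sudo apt-get install nfs-common
sudo mkdir -p /mnt/freenas-backups
sudo mount -t nfs 192.168.1.10:/mnt/tank/backups /mnt/freenas-backups

# To make it permanent, add a line like this to /etc/fstab:
# 192.168.1.10:/mnt/tank/backups  /mnt/freenas-backups  nfs  defaults,_netdev  0  0
```

The `_netdev` option just tells the boot sequence to wait for the network before trying the mount.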

I wonder what I should do with my old 2600K. My wife is happy with her 2500K. Another server of some kind? Put it in our new detached garage for semi-off-site backup?

Thoughts? Any guidance would be much appreciated.
Thanks!
Adcadet
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,931
Location
USA
I use a similar setup in my home environment. We have three main workstations used for different tasks. Two of the primary workstations run CrashPlan under different accounts. They each back up to the CrashPlan cloud for off-site protection. Since that sync can take a while given the amount of data that churns (photos/video), we also point each CrashPlan client at my local Linux-based NAS. I'm running Xubuntu on the NAS with ZFS in a single large zpool (8 × 4 TB RAID-Z2). I carved out a spot in the pool for the workstations' CrashPlan backups so that at least one copy completes quickly while the cloud takes its time to sync. I also use the NAS as a Plex server, as well as for hosting a web-based front end for gaming servers and a Mumble chat server. It also functions as a CIFS/NFS server for some other projects.

I have additional plans in the works to put my old NAS at a family member's location and sync my two NAS devices together. I may use something like rsync over an SSH tunnel or BitTorrent Sync to manage that backup, but I haven't decided yet. I'll likely pre-seed the backup and then move the device so that it doesn't take a month to replicate.
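For the rsync-over-SSH option, it's basically one command plus a cron entry. A minimal sketch - hostname and paths are placeholders, and it assumes key-based SSH is already set up between the two boxes:

```shell
# Pre-seed locally first, then run the same command after the
# device is moved off-site.
# -a preserves permissions/times, -z compresses over the wire,
# --delete mirrors deletions, --partial resumes interrupted files.
rsync -az --delete --partial \
    /tank/backups/ \
    backup-nas.example.home:/tank/backups/

# Then schedule it nightly on the source NAS, e.g. in /etc/crontab:
# 0 3 * * * root rsync -az --delete --partial /tank/backups/ backup-nas.example.home:/tank/backups/
```

The trailing slashes matter to rsync: `/tank/backups/` syncs the directory's contents rather than creating a nested `backups/backups/`.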
 

Adcadet

Storage Freak
Joined
Jan 14, 2002
Messages
1,861
Location
44.8, -91.5
Handy, how did you do ZFS in Linux? FUSE? Kernel port?

I'm thinking I might keep my local copy in Linux on Ext4 for speed and simplicity. But maybe not. I've enjoyed using FUSE.

I've deployed a Linux Mint machine using ZFS-FUSE to my parents' home, running a CrashPlan client for off-site backup. Every now and then they've had the power go out and forgotten to turn the machine back on. About a month ago this happened and my Dad said the machine wouldn't turn back on - I wonder if the power supply died. It was a slick setup.

We could always try to arrange the StorageForum CrashPlan offsite cooperative.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,297
Location
I am omnipresent
I've been repurposing fast spare machines (and out-of-warranty 3 TB drives) as Plex servers rather than backup targets. You don't need a super CPU for CrashPlan, but the ability to support eight threads is super-helpful for Plex's video transcoding tasks. If you have the disk space and interest, you can certainly do both, since Plex is a CPU hog (with enough client utilization) but not a RAM hog, while CrashPlan is a RAM hog but not a CPU hog.
 

Adcadet

Storage Freak
Joined
Jan 14, 2002
Messages
1,861
Location
44.8, -91.5
And I've also got a 3570k laying around now. Any reason to favor a 3570k over a 2600k for Plex?
 

Adcadet

Storage Freak
Joined
Jan 14, 2002
Messages
1,861
Location
44.8, -91.5
Oh man, I also have my old HTPC upstairs - an AMD X4 640 (Propus) with 4 GB of RAM. The tiny little fan was way too loud, so I replaced it with an Amazon Fire TV, which plays movies via Plex well. Any obvious use for the X4 640 these days?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,297
Location
I am omnipresent
And I've also got a 3570k laying around now. Any reason to favor a 3570k over a 2600k for Plex?

The extra threads are more advantageous than the per-thread execution speed. An i7-980X is a better Plex host than a Haswell i3 for the same reason.
The AMD X4 could probably be refurbished and passed along to another family member. You have plenty of fast machines on hand.

The Fire TV as a Plex client has one downside: you really need to configure an external player (I use XBMC) to get full-fidelity audio, since Plex on Android only wants to output stereo. I talk about that more in the Media Player Appliances thread.
 

Adcadet

Storage Freak
Joined
Jan 14, 2002
Messages
1,861
Location
44.8, -91.5
Thanks, Merc. Stereo is just fine for now - we're still using the built-in speakers in the TV.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,931
Location
USA
Handy, how did you do ZFS in Linux? FUSE? Kernel port?

I'm thinking I might keep my local copy in Linux on Ext4 for speed and simplicity. But maybe not. I've enjoyed using FUSE.

I've deployed a Linux Mint machine using ZFS-FUSE to my parents' home, running a CrashPlan client for off-site backup. Every now and then they've had the power go out and forgotten to turn the machine back on. About a month ago this happened and my Dad said the machine wouldn't turn back on - I wonder if the power supply died. It was a slick setup.

We could always try to arrange the StorageForum CrashPlan offsite cooperative.

I installed ZFS using the ubuntu-zfs package under Xubuntu 14.04. My understanding is that it's based on the ZFS on Linux project, which is the native kernel port (not FUSE). So far this setup is one of the fastest I have in the house in terms of raw MB/sec. My Samsung 850 Pro SSD on SATA 3 isn't fast enough to read from to drive my pool of disks to their limit. :)
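If it helps anyone following along, the setup was roughly this. Pool and disk names below are placeholders, and I'm writing the PPA details from memory, so double-check before running anything:

```shell
# Add the ZFS on Linux PPA and install the DKMS-built kernel modules
sudo add-apt-repository ppa:zfs-native/stable
sudo apt-get update
sudo apt-get install ubuntu-zfs

# Create an 8-disk RAID-Z2 pool (using /dev/disk/by-id names so the
# pool survives device reordering across reboots)
sudo zpool create tank raidz2 \
    /dev/disk/by-id/ata-DISK1 /dev/disk/by-id/ata-DISK2 \
    /dev/disk/by-id/ata-DISK3 /dev/disk/by-id/ata-DISK4 \
    /dev/disk/by-id/ata-DISK5 /dev/disk/by-id/ata-DISK6 \
    /dev/disk/by-id/ata-DISK7 /dev/disk/by-id/ata-DISK8

# Carve out a dataset for the workstations' CrashPlan backups
sudo zfs create tank/crashplan
```

With RAID-Z2 you give up two disks' worth of capacity but can lose any two drives without losing the pool.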

There were some discussions in other threads (or maybe in PMs) about trying BitTorrent Sync among people to share some files. That would make a non-centralized cloud for SF users. If you're interested in trying something like that, I'd certainly play around with it. We could probably even do this in an experimental virtual machine that could be shared with everyone to play with.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,297
Location
I am omnipresent
I've found that BTSync chokes and dies if you throw 4 TB of files at it. We could be each other's CrashPlan partners easily enough. Or just start a communal Google Drive/OneDrive account.
 