WHS+AnyDVDHD+MyMovies+WMC+PowerDVD9

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,729
Location
Horsens, Denmark
Trying a new method of HTPCing that's cleaner in use, if not in implementation.

Server Side:

1. Windows Home Server running the My Movies Add-on
2. My Movies requires AnyDVD HD to unlock discs
3. The new LG 10x BR drive connected via eSATA to WHS

This part is really slick. Just drop a DVD or BR in the tray: movie info is automatically pulled from the internet, and the rip happens without touching a button. When the rip finishes, the tray pops open.

HTPC Side:

1. Workstation running Windows 7 x64 Ultimate with the My Movies add-on for Windows Media Center
2. SlySoft's Virtual CloneDrive to mount the ISOs
3. PowerDVD 9 Ultra to do the actual playing

Complaint #1
Ripping a DVD takes 45 minutes; ripping a BR disc takes 5-6 hours. CPU limited? Still working on it.

Complaint #2
DVD playback quality is not great, and BRs show crazy vertical interlacing issues in grayscale.

Once I sort those out, though, it looks pretty darn awesome.
 

MaxBurn

Storage Is My Life
Joined
Jan 20, 2004
Messages
3,245
Location
SC
Complaint 1:
We discussed in the other thread how AnyDVD HD ripping straight to ISO has some sort of speed limit to it; I think you're seeing that, only much worse. It should take just minutes for a DVD and, I'm guessing, an hour or less for Blu-ray with that drive. Change it over to file mode and see if there is a difference? RDP in and pull up Task Manager; see if you can track something down there.

Complaint 2
I used to do exactly the same thing, only with Daemon Tools, and I didn't have a problem with Blu-ray.

For DVDs, don't bother with either of those; just rip the movies in file mode and add the folder to the Media Center movie library. It will pick them up and play them fine, with menus, good sound, etc.
 

Santilli

Hairy Aussie
Joined
Jan 27, 2002
Messages
5,278
I assume in step 2 you are using AnyDVD HD to do the ripping to ISO?

Your Blu-ray times are WAY off the charts. It should MAYBE be around 45 minutes, and that's usually while doing two at a time.

I still use RipIt4Me, DVD Decrypter, and DVD Shrink, and that combination works on almost all my DVDs.

The Beast really likes DVD Shrink, making short work of the projects.

Shrink used to take 100% of the dual Xeons. The Beast takes about 45 seconds to read Lord of the Rings 3.

The rip takes 1.7GB of RAM and 7-10% of the processor cores. I just kicked the process up to Real-Time priority; that upped the processor use to 9-14%, and it should be done in about 13 minutes.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,729
Location
Horsens, Denmark
One of my complaints about WHS is that there are no performance-enhancing RAID levels available; your options are essentially RAID-1 or a single drive. My new plan involves creating two-drive stripes, then adding those as 4TB "drives" in WHS. I'm doubling the chances of failure, but since I specify redundancy on everything anyway, this is effectively RAID-01 (see the sketch below). It should significantly improve STR while still letting me add and remove drives from the system, 4TB at a time.
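Roughly what I have in mind, assuming four 2TB drives (my numbers, just for illustration):

Code:
# Stripe A (RAID-0): drive1 + drive2 -> one 4TB "drive" presented to WHS
# Stripe B (RAID-0): drive3 + drive4 -> another 4TB "drive"
# WHS folder duplication keeps a copy of each share on both stripes
# Usable space: 4TB. Any single-drive failure takes out its whole stripe,
# but the duplicate on the other stripe survives -- effectively RAID-01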

I'll let you know how it goes...
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,927
Location
USA
I installed WHS last night and found that to be a complaint as well. I know its market isn't typically geared toward you or me, but the options would be nice.

I had some real issues with network performance when transferring large files. I mentioned in another thread that I could get about two-thirds of the way through transferring a large file (about 5GB), and the remaining third was extremely slow. I don't know if this is because of incompatibilities with my hardware, but it was frustrating nonetheless.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,729
Location
Horsens, Denmark
I haven't experienced that issue, but I was seeing transfer rates in the 50-60MB/s range off the single 2TB WD Green drives. If that were closer to 100MB/s, I would feel much better about streaming multiple BR movies off it.
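Back-of-envelope math for why I want the headroom (these bitrates are approximate worst-case figures, not measurements):

Code:
# A BD stream tops out around 48 Mbit/s, call it ~6 MB/s
# 3 simultaneous streams = ~18 MB/s of reads, but they seek against
# each other, so a drive doing 50-60 MB/s *sequential* has far less
# real headroom than the raw number suggests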
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,927
Location
USA
I was just doing some tests right now with OpenFiler, and I'm able to get about 85-90MB/sec uploading large monolithic files to the NAS now that my software RAID 5 has finished syncing.

At first I thought I was wasting RAM by having 6GB in my NAS, but OpenFiler actually uses it for cache (the 64-bit version sees all 6GB, whereas WHS stopped at 4GB due to being 32-bit). After uploading the 3.4GB file, if I proceed to download it, I get about 99% utilization on my GigE network card, likely because of the cache.
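If you want to check the same things from a shell (assuming OpenFiler gives you one and uses standard Linux md for the software RAID, which I believe it does):

Code:
cat /proc/mdstat   # shows the md arrays and resync progress
free -m            # shows how much RAM is sitting in buffers/cache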
 

timwhit

Hairy Aussie
Joined
Jan 23, 2002
Messages
5,278
Location
Chicago, IL
What application did you use to see the memory usage? Is it something built into OpenFiler or is it a freely available FOSS application?
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,927
Location
USA
The memory usage and stats are built into their status page through their web interface.
 

timwhit

Hairy Aussie
Joined
Jan 23, 2002
Messages
5,278
Location
Chicago, IL
That's nice. Do you know what their web interface is built on? Can it be used with other Linux distros?
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,927
Location
USA
I don't know what it's built on. From a general standpoint, the web page does not require any plug-ins (like Flash, Java, etc.). I'm assuming it uses some scripting language like PHP, but I forgot to look. I do know it's managed using rPath/Conary, so I don't know how well that would translate to other Linux distros. The rBuilder project can be found here and their repository of packages is here.
 

timwhit

Hairy Aussie
Joined
Jan 23, 2002
Messages
5,278
Location
Chicago, IL
Let me know if you figure out what the package is called; I wouldn't have the first clue what to look for.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,927
Location
USA
I'll see what I can find. I don't know what to look for either, but I was hoping there would be a package group for the UI. I looked quickly and didn't see anything obvious.
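If Conary works like other package managers (I'm going from memory on the exact command, so treat this as a sketch), listing what's installed might narrow it down:

Code:
conary query   # should list the packages/troves installed on the system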
 

blakerwry

Storage? I am Storage!
Joined
Oct 12, 2002
Messages
4,203
Location
Kansas City, USA
Website
justblake.com
timwhit said:
What application did you use to see the memory usage? Is it something built into OpenFiler or is it a freely available FOSS application?

That looks like the output of the 'free' command (which really just reads the /proc/meminfo file) turned into HTML with some minor CSS styling.

Something like that (reading, parsing, and formatting) could be implemented in bash (through CGI), PHP, Perl, etc. Then you'd need a web server to serve up the HTML.
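A minimal sketch of that idea (untested, just to show the shape; the filename is mine):

Code:
#!/bin/bash
# meminfo.cgi - wrap the output of 'free' in a bare HTML page
echo "Content-type: text/html"
echo ""
echo "<html><body><pre>"
free -m
echo "</pre></body></html>"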

The OpenFiler page is pretty clean, but you could functionally get a lot more by calling 'top -n 1' or similar through CGI with zero effort. The effort comes into play if you want to make it pretty.

If you're looking at a larger scale, SNMP with something like Cacti can be a great way to monitor many systems and keep a central history.
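For example, assuming the net-snmp tools and a host running snmpd with the usual 'public' read community (the hostname here is made up):

Code:
# Walk the host's storage table (RAM, swap, disks) over SNMP
snmpwalk -v 2c -c public nas.example.com HOST-RESOURCES-MIB::hrStorageTable

Cacti then polls the same OIDs on a schedule and graphs the history for you.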
 

blakerwry

Storage? I am Storage!
Joined
Oct 12, 2002
Messages
4,203
Location
Kansas City, USA
Website
justblake.com
For example, place this CGI file onto a web server that supports execution of CGI files (the file typically needs +x permissions, and the server may additionally need ExecCGI enabled), then access the file in your browser.

top.cgi:
Code:
#!/bin/bash
# Emit the CGI header (content type plus a blank line) before any other output
echo "Content-type: text/plain"
echo ""

# One batch-mode pass of top, suitable for non-interactive output
top -b -n1
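To test it, hit it with a browser or curl (the path depends on where your server keeps CGI scripts; this one is hypothetical):

Code:
curl http://yourserver/cgi-bin/top.cgi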
 