Gaming off the LAN? Feasible?

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
This is a very undeveloped idea that I'm hoping you guys (gals?) can help me work through.

I'm toying with the idea of mapping a drive to a server, then installing a game (UT2004, for example) to the mapped drive, then playing the game off the mapped drive. Clearly this will have some significant requirements in terms of speed (both the LAN and the storage subsystem).

Now, what would the requirements be if I wanted, say 25 systems to all be playing games off the same server (different shares) simultaneously? GbE to the workstations would be enough, but what about on the server side? I'm sure I would require a substantial RAID array (or two, or three) to handle the demand.

Does anyone out there have some numbers I could start with? Or some suggestions on how to start testing the requirements?
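
As a starting point, here's the back-of-envelope arithmetic I'm picturing. Every figure in it is an assumption to be replaced with real measurements:

```python
# Back-of-envelope server throughput estimate. All numbers are guesses,
# not measurements.

def server_requirements(clients, per_client_mb_s, concurrency=1.0):
    """Aggregate throughput (MB/s) the server must sustain."""
    return clients * per_client_mb_s * concurrency

# Guess: each client pulls ~20 MB/s during level loads, and at worst
# 75% of the 25 machines are loading at the same time.
peak = server_requirements(25, 20, concurrency=0.75)
print(f"Server must sustain roughly {peak:.0f} MB/s at peak")
```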
 

freeborn

Learning Storage Performance
Joined
Feb 4, 2003
Messages
131
Location
Longmont, CO
Any reason not to install the game locally? I have my game DVD imaged to my server and have mounted it on 5 machines simultaneously. All machines installed the game locally and then used the image for copy protection validation. This works fine for me. I still need to download the UT2004 dedicated server but I suspect I could run the dedicated server and host the image from the same machine.

Free
 

BooST

Learning Storage Performance
Joined
Mar 31, 2004
Messages
111
I imaged my "Play Disc" from the CD version and I get some weird sort of error... just installed a no-cd patch and it works fine, until they patch the game ;) The machine I use for gaming doesn't have a CD/DVD drive
 

BooST

Learning Storage Performance
Joined
Mar 31, 2004
Messages
111
25 systems running the game at the same time would be impossible IMO, but I haven't explored the wide world of SCSI yet. I guess the main problem I see is getting the files to the client machines fast enough. The real question here is: "How fast do the files need to be transferred in order for the game to run well?"

I'm still trying to figure out the logic behind this. As freeborn stated, it's much more logical to install the game on the 25 machines rather than trying to build a superserver to run the game over the network.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
You didn't hear me, but I just said "ick".

Let's ask a few questions:
Each of your stations has a Raptor, right? GBoC = around 40MB/s in ideal conditions (large files to stream etc). Raptors are what? 35 - 70MB/s?
Which will be faster?

Your "server" is a vanilla PC with some GBoC cards and maybe i875 onboard whatever, right? Your disk controller is on 32-bit/33MHz PCI? That tops out at maybe 100MB/sec. One PCI 1000BT NIC + a 3-disk Raptor array = all your PCI bandwidth, and I'm being generous there.
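
To put rough numbers on that (all of these are ballpark assumptions, not benchmarks):

```python
# Why a single 32-bit/33MHz PCI bus can't keep up. Ballpark figures only.
PCI_BUDGET_MB_S = 100     # realistic shared-bus throughput, ~133 theoretical

nic_mb_s = 40             # one gigabit-over-copper NIC, real-world
raptor_str_mb_s = 55      # assumed mid-range STR per Raptor
raid_drives = 3

demand = nic_mb_s + raid_drives * raptor_str_mb_s
print(f"Demand {demand} MB/s vs. budget {PCI_BUDGET_MB_S} MB/s")
```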

Raptors are fast-seeking drives, but 25 simultaneous requests for different "little game files" (e.g. user pref files, or calls to the main .EXE or .DLLs) would not be pretty.

Now, what about your switches? Do you have a Catalyst 4500 or the like? 'Cause those low-end Linksys/Netgear guys don't have the grapes to tolerate LOTS of Gigabit traffic. I've seen cheapie 24-port switches die (literally) from heavy sustained 100Mbit use. I suspect the current generation consumer 1000BT switches wouldn't fare better.

Segment everything by adding switches? Do you really WANT to build multiple Gbit networks? Ugh.

Are you space-limited on your local PCs? Why not use your Gbit network a little better and set up hard disk images for several sets of games and use Ghost Enterprise to multicast 'em out? You could deploy x copies of a game in around 5 minutes that way.

In short: Your local disks are faster. Doing THAT MANY clients off Gbit would kill your server and its disks, not to mention do awful things to your LAN, and you're probably better off administering the games through ghost images anyway.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
OK, I have some more time...so here's the thought (if any) behind my madness.

This proposal isn't for my current project, it's for the second store (the funding is already in the bag). The problem is twofold:

1. We are currently using 36GB raptors and they are 95% full (UT2004 alone is 5.5GB). Adding another game isn't possible at this point. How big will Doom3 and HL2 be?

2. Licensing. We currently buy a license of every game for every machine. Even the less popular games that will never have more than 2 or 3 users at a time.

Here's the thought:

Have many shares on the server, each containing an install of a game. If we have 20 licenses of UT2004, then we have 20 shares (each with its own key). When someone runs the game, our middleware maps the drive and makes the necessary registry modifications before executing the app. This way we only need to buy as many licenses (keys) as we have users concurrently playing that game.
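
Something like this, in sketch form. The share names, registry key, and paths below are hypothetical placeholders, and the Windows commands are shown as strings rather than executed:

```python
# Sketch of the launcher middleware idea: check out a free license share,
# build the commands to map it and point the registry at it, then launch.
# Share names, the registry key, and the exe path are all placeholders.

def checkout_share(shares_in_use, licensed_shares):
    """Return the first share whose license isn't checked out, else None."""
    for share in licensed_shares:
        if share not in shares_in_use:
            return share
    return None

def launch_commands(server, share, drive="G:"):
    """Windows commands the middleware would run, shown as strings."""
    return [
        f"net use {drive} \\\\{server}\\{share}",
        f"reg add HKLM\\SOFTWARE\\ExampleGame /v InstallDir /d {drive}\\ /f",
        f"{drive}\\System\\game.exe",
    ]

share = checkout_share({"ut2004_01"}, ["ut2004_01", "ut2004_02"])
print(share)  # -> ut2004_02
for cmd in launch_commands("GAMESRV", share):
    print(cmd)
```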

I know it will require some serious hardware; I'm envisioning 10+ drive RAID arrays on PCI-X busses serving segments of the network with GBoC. Possibly even multiple servers hosting a few games each.

And this opens up another possibility, the idea of going diskless on the clients via (that cursed technology) PXE.

If all this sounds a little extreme considering what I'm doing (a lowly game center), keep in mind that the "wow" factor is a major selling point. Going A64 and GbE were primarily sales tools; neither improves performance enough to be cost-effective on its own.

Merc: Thanks for your initial numbers. What kind of server(s) do you think would cut it? If I wanted, oh, 30MB/s to each client (worst case), then I would need 2 GbE cards and a 4-disk RAID10 array for every 4 computers?

I'm just trying to get a grasp on exactly what this concept would involve and cost.
 

freeborn

Learning Storage Performance
Joined
Feb 4, 2003
Messages
131
Location
Longmont, CO
BooST said:
I imaged my "Play Disc" from the CD version and I get some weird sort of error... just installed a no-cd patch and it works fine, until they patch the game ;) The machine I use for gaming doesn't have a CD/DVD drive

SecuROM 5.02 blacklists alcohol.exe and older Daemon Tools installs, but I was able to image okay using the BlindWrite 5 trial and mounted the image without error using the latest Daemon Tools.

DDrueding: maybe you could work some sort of swap? Copy the game directory from your share if it is not on the local machine, while removing another, less-used game directory to make room. This will tax your bandwidth once and might become self-regulating: the lesser-used games would mostly reside on the server, and the popular ones would already be local. If you run out of licenses, remove the directory from a machine not running the game (and one that hasn't run it in awhile) and put it on the one that needs it.

It will cause a delay when a non-local game is called up, but should save your network bandwidth for when it is needed. I assume all machines will be on at the same time and that you will be monitoring usage. If my assumptions are correct, then you should be able to use the local drive as a swap disk for the network shares. Maybe even prioritize files: copy the bare minimum to get the intro going while the remainder is copying. I don't know how well you would run with all files residing on the server.
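
Here's a sketch of the eviction half of that idea. The game names, sizes, and timestamps are made-up examples:

```python
# Sketch of the "self-regulating" local cache: when a requested game
# isn't on the local disk, evict the least-recently-played games until
# there's room to copy it down. Sizes in GB; all data is hypothetical.

def make_room(local_games, needed_gb, free_gb):
    """Return the list of games to evict, oldest last_played first."""
    evict = []
    for name, info in sorted(local_games.items(),
                             key=lambda kv: kv[1]["last_played"]):
        if free_gb >= needed_gb:
            break
        evict.append(name)
        free_gb += info["size_gb"]
    return evict

local = {
    "ut2004":  {"size_gb": 5.5, "last_played": 300},
    "oldgame": {"size_gb": 2.0, "last_played": 10},
    "fleasim": {"size_gb": 3.0, "last_played": 50},
}
print(make_room(local, needed_gb=4.0, free_gb=0.5))  # -> ['oldgame', 'fleasim']
```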

Free
 

freeborn

Learning Storage Performance
Joined
Feb 4, 2003
Messages
131
Location
Longmont, CO
Another thought comes to mind: a self-extracting compressed archive. If I understand the technology properly, the data will remain compressed until it is transferred and expanded locally. That should reduce network bandwidth as well.

Free
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,920
Location
USA
You might also want to consider that by implementing your RAID idea you are essentially creating a single point of failure.

How much business would be lost if the array went down, became corrupted, or suffered a hardware failure? If one PC dies...no big deal, you have 19 others. If the server dies...no business until you fix it.

Just buy an HP superdome... ;)
vista_superdome_1.jpg
 

BooST

Learning Storage Performance
Joined
Mar 31, 2004
Messages
111
I see the need for some new Raptors on the client side, not the server side. Why not just invest in some more 36GB Raptors, or the 74s?
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
Windows does NOT handle being diskless very well. Lots of folks with an interest in AV have tried it, certainly.
The better and cheaper option would be to buy, say, a 160GB drive for each machine with locally contained disk images, or, like I said, ghost over gigabit.
Let's call that $2500 in hardware costs.

On the game-over-LAN route, I think a good starting point would be 3 machines with PCI-X (or some other 533MB/s bus) interfaces for some kind of decent disk controller and some large, fast drives. My instincts say go 15k SCSI for fast seeks and high STR both; not every file you load will be that 100MB UT2k4 player skins file. Of course "Large 15k SCSI drive" is an oxymoron. There are Raptors but I hear they don't do well with serverish loads.

Minimally, these are $1500 PCs. And we haven't talked about the costs involved in additional switches ($150 apiece on the cheap) and the PITA of managing extra servers. Or the fact that startups don't need to be spending money that way.

...and the local raptors are STILL faster.

Now, David, have you ever done anything with Intellimirror?

How about setting the PCs up with a "base" diskload of WinXP + UT + CS + that WWII game - say, a 20GB load of just the most common games.
Intellimirror lets you publish games so they appear to be on your local PCs, but they really aren't installed until a user tries to start them, and they CAN (IIRC, I've never tried this part) be uninstalled when users log off.

Something else to look into.

But I still think you'll be best served with disk images.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Mercutio said:
Windows does NOT handle being diskless very well. Lots of folks with an interest in AV have tried it, certainly.

This would be a quick death to this idea. It really becomes more feasible if there is no drive in the system whatsoever (due to the cost of drives "transferring" from the workstation to the server). I was actually playing with the idea of "swapping" the files out to the local drives (as Freeborn and Merc have mentioned), but was concerned that that kind of massive move would be harder on the network (UT2004 is 5.5GB...this would saturate even a GBoC connection for several minutes), especially considering that only a small percentage of the files would actually be requested during gameplay.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
Let me modify my previous post slightly: I was thinking 2 Gbit NICs per server machine, not just one. That'd make failover a lot easier.
 

Fushigi

Storage Is My Life
Joined
Jan 23, 2002
Messages
2,890
Location
Illinois, USA
Handruin said:
Just buy an HP superdome... ;)
HPs are junk. :lol: You want power? Go with a pSeries 690.

Anyway, on to the task at hand.

I like Merc's idea of a base config with Windows and the 2 or 3 most popular games and using Intellimirror or somesuch to JITI (Just In Time Installation) anything else.

Honestly, though, if you want the least lag time for game delivery, you just have to pony up and install them locally. Raptors, or at least disks with that space restriction, are proving to be your Achilles' heel. 143GB 10K SCSIs would be faster, have a better warranty, and provide sufficient space for some time. Of course, they're also more expensive, probably louder, generate more heat, and may use more power. But it would solve your game delivery issue.
 

Bozo

Storage? I am Storage!
Joined
Feb 12, 2002
Messages
4,396
Location
Twilight Zone
Actually, switching to 200GB, 7200rpm SATA hard drives at each machine sounds like the logical thing to do. With their higher platter densities, load times would be about the same as with a 10,000rpm drive. This would also leave room to expand.
And you wouldn't have to worry about the server going down and putting you out of business until it's fixed.
If the server is the only way for you and you have the money, how about 10 15,000rpm Cheetahs (73GB each) in RAID 10 going out to 4 GbE NICs set up in a load-sharing arrangement? Each box should have a GbE NIC installed if it doesn't already. You might have to add another switch or two to keep things moving.

Bozo :mrgrn:
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,920
Location
USA
I forgot to add, run a dedicated cable from each workstation to each port on all 5 cards.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Bozo said:
Actually, switching to 200GB, 7200rpm SATA hard drives at each machine sounds like the logical thing to do. With their higher platter densities, load times would be about the same as with a 10,000rpm drive. This would also leave room to expand.

Better yet, just *add* a 200GB 7K250 to the workstations and install games there. As we all know, if you load a machine up with memory, the performance of the storage subsystem is less important....
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
What do you guys think of this?

Two Servers with the following spec:

One
Tyan Thunder K8W

With Three
Intel Pro1000MT Quad Server Adapters

And One
3Ware 9500S-12

Handling Twelve
Western Digital WD740GDs


My biggest concern is the total bus capacity of the motherboard; it only has 2 channels (one 100MHz PCI-X and the other 133MHz PCI-X). If I could find a board with 3 PCI-X channels, and enough internal switching power to hack it, I'd prefer to put 6 quad GbE and 3 RAID adapters into the same system.

And for those recommending the switch to bigger internal hard drives, I appreciate that you're looking out for my sanity. However, I won't really be considering those alternatives until I've got the numbers of my evil scheme worked out.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
The 12-port 3ware 8500 is ~ $700 new. I'll assume the 9500 will take that price point. The 1000MT Quad is $500. 12 Raptors @ $225 apiece = $2700 + probably another $1000 for some kind of enclosure, board, CPU and RAM.

That's $5,000, David. Per Server.

How many months of rent, utilities and payroll is that? How much advertising could that buy? 'Cause those are REALLY important numbers.

I know that it's easy to get wrapped up and say "I have the budget and this is technically possible", but that doesn't make it a good idea. There are very real and very good arguments for solving your problems with software, and I really think you need to take a step back and at least justify why you're doing this.

It's a cool project technically. Don't get me wrong.

Maybe e_dawg will come along and talk MBA to you. It's not my department but if I have to I'll dig out my Finance for Engineers textbook and see if I can figure out exactly how bad an idea this is.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Mercutio said:
There are better things for you to spend your money on for your business.

Quite likely. But a pair of servers as mentioned above, with 2GB of RAM and dual Opteron 240s, would cost under $15k.

If we are running diskless, that saves me $2600 on the workstations, and not needing the GBoC 24-port switch saves another $1300. The savings associated with software licensing is likely to be more than $6k*.

The "cool factor" is worth the $5k difference.

Getting back to the feasibility of the aforementioned hardware, what are your thoughts?


*The reason for this is that we insist on offering all the popular games (even though 95% of gameplay is BF and CS). We are currently only offering 20 of them, but that was a limitation of the HDDs. Even putting only 20 games on the systems cost us well over $12k. If we only had to buy a handful of copies of these and the other less-popular games we would like to offer, the savings would be significant.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Mercutio said:
But also look at something like this. Combine with Windows Server-style published applications and your space problems are taken care of.

Looks really interesting. I'm currently wading through Ghost Corporate and PXE servers; this looks like it would be better suited.
 

Handruin

Administrator
Joined
Jan 13, 2002
Messages
13,920
Location
USA
If Deep Freeze works as well as they say, the pricing seems very reasonable, even for the enterprise license.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Perhaps we should question why you're spending $6,000 on licensing <$50 a pop games.

I'd also like to know if your licenses would *allow* you to do what you're planning. Last I checked, it's one copy on one machine; nothing about copies on servers, for any purpose.
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
Some games also require special licensing for public use.
In reality no one cares, but imagine if, e.g. EA or Microsoft decided to crack down on LAN centers.

Once David starts collecting usage stats (assuming he can), perhaps he can sell games/media that he won't be using.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
20 games, 22 licenses of each game, average cost ~$35.

20*22*35=$15,400

We don't want to pirate or steal games; that's why we're spending so much. By doing what I have described above, we are only keeping each license of each game on one machine at a time. This is specifically to avoid licensing issues and reduce the cost detailed here.
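
Here's the per-seat vs. concurrent-pool arithmetic spelled out. The pool sizing (how many concurrent players each game really sees) is my assumption, not measured data:

```python
# Per-seat licensing vs. a concurrent-use pool, using the figures above.
GAMES, SEATS, AVG_COST = 20, 22, 35

per_seat = GAMES * SEATS * AVG_COST
print(per_seat)  # -> 15400

# Assume the 2 big games still need a copy per seat, and the other 18
# rarely see more than 5 concurrent players.
pool_copies = 2 * SEATS + 18 * 5          # 44 + 90 = 134 copies
pooled = pool_copies * AVG_COST
print(pooled)  # -> 4690
```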
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Mercutio said:
Some games also require special licensing for public use.
In reality no one cares, but imagine if, e.g. EA or Microsoft decided to crack down on LAN centers.

That would be an issue. The different companies have different policies, but the safest route is just to buy enough standard copies and not make a fuss.

Mercutio said:
Once David starts collecting usage stats (assuming he can), perhaps he can sell games/media that he won't be using.

I already can track usage, and this move is specifically targeted at our new store (no purchases made yet). Also imagine the benefits of this across multiple stores, maintaining a single "pool" of licenses.
 

sechs

Storage? I am Storage!
Joined
Feb 1, 2003
Messages
4,709
Location
Left Coast
Actually, you're probably already violating licenses. This is from EA's boilerplate license:

The Product is licensed to you only for your personal use and enjoyment and may not be used for any commercial purpose whatsoever. You may not loan, rent, lease, give, sell, offer for sale, sublicense or otherwise transfer the Product, or any portion of the Product or anything incorporated therein, including any screen display, sound or accompanying documentation, to any third party, nor may you permit any other person to use the Product in exchange for remuneration. Further, you may not place the Product on any computer, communications, or other system or network that would allow multiple users to access it. Notwithstanding the foregoing, in one case you may transfer your rights under this License on a permanent basis provided you transfer this License and the Product, including all accompanying printed materials, while retaining no copies, and the recipient agrees to the terms of this License. If the Product is an update, any transfer must include this update and all prior revisions.

It goes on to say:

COMMERCIAL EXPLOITATION PROHIBITED. Without limiting the generality of the foregoing restrictions, specifically you may not offer the Product on a pay-per-play basis or on a computer system or network to which you lease or rent access to others, electronically distribute the Product, or any portion of the Product or anything incorporated therein, including any screen display, sound or accompanying documentation, or, if the Product contains a map or level editor, sell or permit any third party to sell maps or levels you create for use with the Product. Commercial exploitation licenses are available in certain circumstances by contacting EA as set forth at the end of this License.


You need to get appropriate commercial licenses from EA in order to "rent" the games. Presumably, these cost more -- but would likely leave you free to implement this server system.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Does anyone know of a motherboard with three or more PCI-X busses? A couple of Supermicro boards have 6 PCI-X slots, but I think they are all on the same bus. I think I'd need at least 3 busses to handle the load from all 20 machines, correct?
 

blakerwry

Storage? I am Storage!
Joined
Oct 12, 2002
Messages
4,203
Location
Kansas City, USA
Website
justblake.com
I think you're going overboard... games don't rely that much on the hard disk, and even when they do, it's to load a map or something, which usually goes pretty quickly, and the downtime is appreciated anyway (bragging, stretching, taking a bathroom/snack break, etc.).


I think I would personally be inclined to go half and half...

Right now you're doing full installs to the HDDs, and are suggesting doing full installs to the server?


Why not store ISO images of the games on the server and do a minimum install onto the game stations? Use Daemon Tools to mount the CDs over the network.


If this is not possible, I would put some games on the server and some locally. Which games go where depends on the game and how well it runs over the network as well as other issues (licensing like you said).
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
blakerwry said:
Why not store ISO images of the games on the server and do a minimum install onto the game stations? Use Daemon Tools to mount the CDs over the network.
This is what I'm doing right now.

blakerwry said:
If this is not possible, I would put some games on the server and some locally. Which games go where depends on the game and how well it runs over the network as well as other issues (licensing like you said).
I could do this, but it would cost an additional $2600. Ideally there would be no hard drive in the workstations at all.

I have been thinking, however, that dedicating a GbE port to each client is a bit overkill as well.

Working with the numbers provided by Merc earlier: if I assume each NIC in the server is capable of 40MB/s and I use 2 of those 4-port cards ganged together, that's 8*40=320MB/s of throughput. This assumes, of course, that I have a GBoC switch capable of that much throughput and of ganging that many ports at a time.

I'm trying to drive 20 workstations, and assuming a worst-case of 75% trying to pull files simultaneously, that is 320/15=21.3MB/s per machine.

This also assumes I have a hard drive array capable of that speed. Using two of the 3Ware Escalade 8506-12 cards, each supporting a RAID-10 array of WD740GDs, then software-striping the two controllers, should give me a theoretical speed that is ~40% of the aggregate for the drives. Assuming the drive has a minimum STR of 54MB/s, 54*24*0.4=518MB/s. This would also give me a capacity of 74*12=888GB, or 44.4GB per machine. That capacity is 20% higher than I currently have, while this technique should also reduce my storage needs.

Please help me with my math...and the numbers. Merc is right, this is exciting from a technical POV...
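
Here's that arithmetic as a quick script, so anyone can poke at the assumptions:

```python
# Re-running the estimate above with the same assumed figures.
nics, nic_mb_s = 8, 40
aggregate = nics * nic_mb_s                        # 8*40 = 320 MB/s

clients, busy_fraction = 20, 0.75
per_client = aggregate / (clients * busy_fraction) # 320/15 ~ 21.3 MB/s

drives, str_mb_s, efficiency = 24, 54, 0.4
array_mb_s = drives * str_mb_s * efficiency        # ~518 MB/s

capacity_gb = 74 * 12                              # RAID-10 usable: 888 GB
print(aggregate, round(per_client, 1), round(array_mb_s), capacity_gb)
```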
 

Mercutio

Fatwah on Western Digital
Joined
Jan 17, 2002
Messages
22,269
Location
I am omnipresent
It'd be cheaper on your back end to go SCSI. You could go with a bunch of 74GB 10k SCA drives for maybe $175/apiece, save $50/drive that you could apply to getting a chassis with a large SCA backplane, and you could use a midrange SCSI RAID controller instead of that horribly expensive 12-port 3ware card.
 

ddrueding

Fixture
Joined
Feb 4, 2002
Messages
19,728
Location
Horsens, Denmark
Mercutio said:
It'd be cheaper on your back end to go SCSI. You could go with a bunch of 74GB 10k SCA drives for maybe $175/apiece, save $50/drive that you could apply to getting a chassis with a large SCA backplane, and you could use a midrange SCSI RAID controller instead of that horribly expensive 12-port 3ware card.

Thanks for the tip, I'll look into it. To be honest, I didn't even consider it...as SCSI has always = expensive.
 