Chewy509
Wotty wot wot.
Hi Guys,
So I sit here at work waiting for some unit tests to complete (they take about 5 minutes) after doing some code modifications, and I wonder about how resource-heavy our modern software has become.
(Disclaimer: these thoughts were sparked after reading an article on /. today on a ray tracer written in 254 bytes)
So at my desk is a machine (i7-860, 8GB RAM, 250GB SSD, about 4TB of spinning rust, Win7), and on start-up it's using 1.1GB of RAM (no apps open, just logged in); with my IDE open (Eclipse) it's using 1.6GB of RAM. Just about every app I run wants 50MB+, e.g. Chrome with 2 tabs - 220MB, Outlook a measly 35MB... and I count at least 8 auto-update applications (Java, Acrobat, AV, Windows Update, NVIDIA Update, Bing Desktop update, Google Update x2).
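For anyone who wants to tally these numbers themselves, here's a quick sketch (my own hack, nothing official) that sums per-process resident-set sizes from `ps -eo rss,comm`, which works on both Solaris and Linux. The assumption that RSS is reported in KB holds on those platforms, but won't help you on Windows.

```python
import subprocess
from collections import defaultdict

def parse_ps_rss(ps_output):
    """Sum resident-set sizes (KB) per command from `ps -eo rss,comm` output."""
    totals = defaultdict(int)
    for line in ps_output.strip().splitlines()[1:]:  # skip the header row
        parts = line.split(None, 1)  # RSS column, then the command name
        if len(parts) == 2 and parts[0].isdigit():
            totals[parts[1]] += int(parts[0])
    return dict(totals)

if __name__ == "__main__":
    out = subprocess.check_output(["ps", "-eo", "rss,comm"]).decode()
    # print the ten biggest memory hogs, in MB
    for comm, kb in sorted(parse_ps_rss(out).items(), key=lambda x: -x[1])[:10]:
        print(f"{kb / 1024:8.1f} MB  {comm}")
```

Run it after a fresh login and again with your IDE open, and the difference is exactly the kind of number I'm complaining about.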
So when did having 4GB of RAM become the minimum, and WTF does modern software need all these resources for, to accomplish what we did with a 10th of the resources years ago? (S^&t, some might argue, 640K is all we really do need.)
My home desktop (yes, it runs Solaris 11, a server-oriented OS) only uses 480MB on start-up, and yet it now seems to struggle with only 4GB of RAM** when I want to do more than a few things at once. (I'll leave aside the use of VMs; I do have a Win8 VM running that needs at least 2GB to be usable.) But with Firefox, Thunderbird, Pidgin and Eclipse or NetBeans open, I just keep running low on RAM...
I remember the day when having 4MB was huge (for DOOM), and I completely understand the RAM requirements when working with large images or files or databases... but for code, I thought we were better than that. (My current Uni project needs about 120GB of RAM for processing genome sequences, yet the code itself is a measly 120KB of C++.)
I don't know, maybe I'm in the wrong industry, or there are a LOT of really crap people in this industry who still believe that we can just buy more RAM. (I await the day when a virus scanner needs 4GB of RAM just for itself!)
Sorry, this is just a rant whilst I'm waiting for tests to run...
**My current home desktop has 4GB of RAM (4x 1GB Reg ECC DDR2-800), and last I looked maxing out the RAM to 16GB would require roughly the same investment as a new i7 box with new motherboard, SSD, PSU, Case, etc, and with double the RAM! Hard to justify to the wife since I'm still only working part time whilst I finish my degree.