The problem with the way OS X keeps data cached in inactive memory is the assumption that you are going to re-use the same app within a reasonable amount of time. Judging by its current behavior (without knowing exactly how Apple's engineers implemented it), it really feels like a giant garbage-collection system that takes ages to free up its own memory, with no notion that if you haven't used certain apps for a day, chances are it will be a long while before you use them again.
I am one of the devs out there running a MacBook Pro with 8 GB of memory (I wish I could have more, but I have an old 2010 model). For web development, I have at least Firefox/Chrome/PhpStorm/SmartGit/MAMP/Thunderbird/Terminal/Notational Velocity/Dropbox/Alfred/Sophos Anti-Virus open at all times. Along the way, I may open a few other apps that I use rarely, like Photoshop/CyberDuck/VMware Fusion/iTunes/iCal/iOS Simulator/Preview/LibreOffice/Skype. Pretty quickly the 8 GB gets used up, and the system grinds to a halt shortly after.
If I then shut VMware Fusion down for a while and relaunch it later, the system really just can't take it anymore. The last resort? purge.
At least, that's my day-to-day experience with OS X. Personally, I find the memory management really lousy, worse than any other OS I've used in the past (both Windows and Ubuntu/Fedora/Gentoo).
So why the heck do I still use a Mac? Because the driver support is still far better than on Linux. With a Mac, you are less likely to need to blacklist drivers because of freeze-up issues.
Either way, I am definitely not a happy camper with the current memory management system in OS X.
The inactive count is a misleading figure in many ways. It includes both 'dirty' and 'clean' (i.e. identical to what's on disk, and hasn't been altered) memory that's not been used for a long time. Purging 'clean' memory is instantaneous. The only cost you'll see is writing 'dirty' memory out to disk.
Your explicit purging is changing the cost of writing dirty data out from an ongoing cost to a single, longer, upfront cost. Instead of writing only when more RAM is required, you're forcing it all to happen at once.
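The clean/dirty distinction can be sketched in a few lines of Python using a file-backed mapping (a rough illustration of the idea, not of how the kernel actually tracks pages):

```python
import mmap
import os
import tempfile

# Create a small file-backed mapping: four pages of zeroes on disk.
fd, path = tempfile.mkstemp()
os.write(fd, b"\0" * mmap.PAGESIZE * 4)

with mmap.mmap(fd, 0) as m:
    first = m[0]       # Reading leaves the page 'clean': identical to the
                       # file on disk, so the kernel can discard it instantly
                       # and re-read it later if needed.
    m[0:5] = b"hello"  # Writing makes the page 'dirty': it now differs from
                       # disk and must be written back before it can be freed.
    m.flush()          # Forcing that write-back up-front, all at once, is
                       # roughly what purging does system-wide.

os.close(fd)
os.remove(path)
```

The point is the asymmetry: dropping the clean page costs nothing beyond a later re-read, while the dirty page always carries a write-back cost, paid either gradually or (with purge) all at once.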
Incidentally, the OS does try to keep an area of free RAM so that some memory can be allocated instantly; it's not only swapping things out when RAM is absolutely full. It's possible to outrun this process, though, if an app tries to allocate huge amounts of RAM at once (i.e. more than is kept free for this purpose).
For your specific case, presuming the apps are behaving well (see below), you would be better off quitting apps and relaunching them when you need them later. This will free up the dirty RAM they've allocated (just like when you purge), but the 'clean' inactive RAM will not be purged (because that's not necessary; as I said, it's free to purge that kind of memory when it's needed for something else).
You'd also want to run Activity Monitor when your system is in its bad state and see whether one of the apps you're using in particular is allocating lots of memory (check the "Real Mem" column). The OS can't do anything if an app really is allocating and writing to memory; it obviously can't just discard memory that's been written to.
Really though, if you want to run all those things at once, more RAM might be required. Remember, with VMware and the iOS Simulator running, you've got two whole other OSes running at the same time; it's reasonable that they'd require lots of memory to work well!
By the way, the purge command was written to simulate /worst case/ conditions when performance testing. It's designed to flush out caches so that the system has to e.g. load all an app's code from disk when launching.
[Source: I worked analysing this kind of thing at Apple until a couple of years ago].
> So why the heck do I still use a Mac? Because the driver support is still far better than on Linux.
I feel like this is a stereotype that just won't die.
Yes, up till a few years ago you might have had to do some poking in /etc to get things working, but as long as you spend a few minutes looking up basic background info before you buy, those problems just don't happen these days. I haven't had to edit a config file to get hardware working since 2007.
I've had to edit configuration files to get hardware working in the last month.
On a laptop we have here at the office, I had to stop one driver from loading ahead of a different driver, or else the two would squabble over the wifi card and it would never show up.
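For anyone hitting something similar: on most modern distros the usual mechanism for this is a modprobe config fragment. The module names below are made up for illustration; check `lsmod` and `dmesg` for the real ones on your machine:

```
# /etc/modprobe.d/blacklist-wifi.conf
# Prevent the conflicting driver from loading at boot.
# ('acer_wmi' stands in for whichever module is fighting
# over your card.)
blacklist acer_wmi

# If the module still gets pulled in as a dependency of
# something else, this is the heavier hammer: make modprobe
# refuse to load it entirely.
install acer_wmi /bin/false
```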
The other thing, which is more software-related than hardware, is that it's an MS Windows shop: all of the local domains are machine.domainname.local. This conflicts with mDNS, as you can imagine, so the Linux machines are unable to access any of the resources on the machines named machine.domainname.local because mDNS would respond with a failure. I had to modify /etc/nsswitch.conf to fix that issue.
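The usual culprit is the mDNS entry on the `hosts:` line of /etc/nsswitch.conf, which claims every .local name and returns failure before the resolver ever asks unicast DNS. Roughly (exact defaults vary by distro):

```
# /etc/nsswitch.conf -- hosts line only

# Typical default: mdns4_minimal intercepts *.local queries, and
# [NOTFOUND=return] stops the lookup before 'dns' is consulted.
hosts: files mdns4_minimal [NOTFOUND=return] dns

# One workaround when your unicast DNS also serves .local names:
# ask DNS first, then fall back to mDNS.
hosts: files dns mdns4_minimal
```

The trade-off is that reordering (or dropping the mDNS module entirely) can slow down or break discovery of genuine mDNS/Bonjour hosts, so which line is right depends on the network.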
Linux is not without its failures. Saying it "just works" is certainly not the case. Whereas the Mac OS X machines I deploy come out of the box, get configured, and are ready to go. Drivers work, software works, and I don't need to go googling for hours trying to figure out why ping won't resolve a machine.domainname.local address while dig is doing just fine.
I said if you spend a little time scoping these kinds of things out up-front, it's very easy to get a machine on which it will work without issues. Obviously if you found a machine lying around the office and tried to load an OS on it, your chances of it working well are not going to be as good. You can't just load OS X on random hardware and expect it to work either.
After 10 years of Linux desktops, including purchasing a laptop with low-performance hardware just because it used entirely OSS drivers, I still had issues with basic things like multiple-monitor support (it worked, but only when I disabled compositing, which makes redraw suck).
Sorry, from my personal experience that still isn't true, as much as I wish it were.
I've been using dual monitor setups at home and at work for a few years now through several hardware builds, both desktops and laptops first using Ubuntu, then switching to Debian Testing for a rolling distribution about a year ago (after finally realizing I love the latest software, but don't want to spend time to upgrade/rebuild every 6 months to keep up with Ubuntu's release schedule or deal with the hassle of installing everything from source). I've been using Nvidia cards with the nvidia driver and dual monitor support has been fantastic for me.
Well, I've been using Nvidia cards with the nvidia-driver and dual-monitor support as well, and it has been fantastic, until I upgraded my Ubuntu install, got Unity without asking for it, after which multi-monitor support was completely broken. My colleague who has 2 screens of a slightly different type (all are Lenovo ThinkVision) and is running Arch with Gnome 3, has been experiencing random multi-monitor glitches since day one. Sometimes for no apparent reason one of his displays doesn't get a signal after waking his laptop from sleep or hibernate, and the only way to resolve it is a reboot.
To make a long story short: we could exchange anecdotes all day about the state of 'Just Works' on Linux, but at the end of the day, I think no one with enough experience using various Linux distros and OS X can honestly and sincerely say Linux is even close to OS X in that respect.
Myself, I've been using Linux since Slackware 4 and have tried about 10 different distros over time, alongside OS X for the last 5 years or so. To this day, I regularly run into problems that need fixing on Linux, particularly after upgrades or when switching hardware. Whether it's wifi cards, USB hardware, multi-monitor support, network configuration issues, software that stops working, or system library problems: there's always something. With OS X on 3 different machines, from 10.4 through 10.7, I've had only one issue that required maintenance, on a b0rked upgrade. It was pretty nasty, but fortunately OS X has Time Machine and target disk mode, so in no time I was able to pull off any important data just to be sure, re-install the OS, and restore my Time Machine backup, only to find that everything was back to normal, to the point that I didn't even need the files I had pulled before the restore.
I still haven't figured out how to get my phone to tether over USB on my friend's Mac. Works flawlessly on my Debian machine though. It's not like OS X is seamless.
Well, I would say yes, Linux has come a long way and is much more mature and usable than before.
However, to this day, it is not without issues, especially on laptop hardware. Remember the Lenovo ThinkPad T400 from a few years ago? The level of stability from a popular distro such as Ubuntu has been quite up and down with it. In one release (like 11.04), I had trouble getting it to boot and play nice with dual-graphics mode. Today, with 11.10, it is much better. How about that shiny Acer AspireOne 722 netbook? You should check the online threads; there are still discussions about how to prevent freeze-ups and such. All these little quirks here and there are the reason I would still run OS X.
Interesting - I do about the same thing (minus the antivirus... that could be your big pig there), with 8 GB of physical RAM, and I have paging (swapping?) disabled... and I've never had a "not enough memory" crash or anything like it.
With VMware going, iTunes, Xcode building stuff, Skype, Dropbox, iTerm, Mail, a few browsers, a bunch of tools... maybe a video going on the second monitor. On a mid-2009 MBP.