Ideally, I should be able to download an untrusted binary application from the internet, run it on my local machine, and trust that it won't steal my passwords, read my contacts and pictures and documents, use my camera and microphone, drain my battery, use up all my CPU and RAM and network and disk, freeze my system, or do any other nefarious thing without my permission. OSes have failed so miserably at this that web browsers have had to pick up the slack, and the web is slowly morphing into a bad operating system.
Say you have three unreliable programs that occasionally leak memory, spin the CPU, and so on (yes, it would be nice if all our programs were perfect, but they're not). It should be possible to run these three programs on the same server in such a way that the failure of one won't affect the others. For example: you run three servers, each with one instance of each program, you load-balance between them, and some supervising system eventually detects when an instance fails, so as far as the outside world is concerned, all your programs run reliably. At the moment, the most practical way to do this is to run three separate VMs on each server, one per program, which is insane. There are some encouraging recent developments (e.g. docker/lxc), but it should be easy to do that kind of isolation purely at the OS level, as sketched below.
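To be fair, the raw machinery for this already exists on Linux: docker/lxc are largely built on cgroups and namespaces, and you can use cgroups directly without any container runtime. Here's a minimal sketch, assuming a Linux box with cgroup v2 mounted at /sys/fs/cgroup, root privileges, and a hypothetical `./unreliable-program` binary (the cgroup name `unreliable-1` is also made up for the example):

```python
import os
import subprocess

# Assumes Linux with cgroup v2 mounted at /sys/fs/cgroup, run as root.
# You may first need to enable the cpu and memory controllers in the
# parent's cgroup.subtree_control.
CG = "/sys/fs/cgroup/unreliable-1"  # one cgroup per program instance
os.makedirs(CG, exist_ok=True)

# Hard-cap memory at 256 MiB; on a leak, the kernel OOM-kills processes
# in this cgroup instead of taking down the whole box.
with open(os.path.join(CG, "memory.max"), "w") as f:
    f.write(str(256 * 2**20))

# Cap CPU at 50% of one core: 50000 us of runtime per 100000 us period,
# so a busy-loop can't starve the other programs.
with open(os.path.join(CG, "cpu.max"), "w") as f:
    f.write("50000 100000")

# Start the (hypothetical) flaky program, then move it into the cgroup;
# any children it forks inherit the same limits.
proc = subprocess.Popen(["./unreliable-program"])
with open(os.path.join(CG, "cgroup.procs"), "w") as f:
    f.write(str(proc.pid))
```

Pair this with a supervisor that watches the cgroup's memory.events file for OOM kills and restarts dead instances, and you get the "one flaky program can't hurt the others" property without paying for a VM per program.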
Can you elaborate? In what way are they failing here?