That didn't end up working out. Instead, individual big institutions bought mainframes, and then individual small institutions and hobbyists bought minicomputers, and then individual average consumers bought PCs. These days, everyone has at least one computer; there are billions of computers out there.
This isn't all that bad. It's nice that people are able to own and control their own computer.
But this trend of proliferation has not stopped at one computer per person. Instead, almost everyone has at least two computers:
Technology hobbyists today may have even more computers:
The less obvious cost is that today's computers are self-obsessed: they are closed off in their own little worlds and they barely interact with each other.
In a single computer:
All of this becomes immensely difficult in a world with multiple computers. To do any of this, I have to set up special services (which then also have to be administered!) and configure everything to connect to them, and make sure that I use the right service, and on and on.
Instead of doing that, I just have a single computer:
This is great.
If I want to serve a new file on the web,
I just stick it in
If I want to run a new service and make it publicly available,
I just run it.
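To give a flavor of what "just run it" can mean, here is a minimal sketch using Python's built-in static file server. The directory name `public_html` is a made-up placeholder, not the author's actual web root:

```python
import http.server
import socketserver
from functools import partial

# Hypothetical web root; substitute whatever directory your files live in.
WEB_ROOT = "public_html"

def make_server(directory=WEB_ROOT, port=8000):
    """Return a TCP server that serves `directory` over HTTP."""
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    return socketserver.TCPServer(("", port), handler)

# To publish the directory, run the server:
#   make_server().serve_forever()
```

A real setup would more likely use a packaged web server, but the point stands: on a single machine, publishing a file is one local operation, with no deployment pipeline in between.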
You might ask: but what if your single computer goes down? Multiple computers give redundancy!
In the old days, protocols were designed to cope with occasional downtime. Email delivery, for example, will be periodically retried for hours or days until it makes it through; or it will be delivered to a different server which is closer to the end destination and can take responsibility for the final hop. You could imagine the web being designed in a similar way, perhaps with transparent caching, but alas it is not.
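The mail-style retry idea can be sketched as a loop with an increasing delay. This is an illustrative sketch, not any MTA's actual algorithm; `deliver` stands in for a single delivery attempt, and the specific delays are made up:

```python
import time

def deliver_with_retry(deliver, max_wait=4 * 24 * 3600, base_delay=60,
                       sleep=time.sleep):
    """Retry `deliver()` with exponential backoff until it returns True
    or the total time slept would exceed `max_wait` seconds."""
    delay, waited = base_delay, 0
    while True:
        if deliver():
            return True
        if waited + delay > max_wait:
            return False  # give up after days of trying
        sleep(delay)
        waited += delay
        delay = min(delay * 2, 3600)  # back off, capped at one hour
```

Because the sender keeps retrying, the receiving machine can be down for hours without any mail being lost; the web offers no equivalent, so a down server means a down site.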
So if I turn off my desktop, my personal website will go down, along with other services. For the services I care about, that's not a problem.
You might ask: what about security? Multiple computers allow you to isolate services into separate security domains!
Multiple Unix users also allow you to isolate services into separate security domains, in an equally robust and substantially less complex way, one that is enabled by default for any distro-packaged service. The more complexity in the system, the more likely you are to have accidentally left some vulnerability lying around, no matter how many computers you have.
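As one concrete illustration of per-service Unix users, a systemd unit can allocate a dedicated unprivileged user for a service. The service name and path here are hypothetical:

```ini
# /etc/systemd/system/myservice.service (illustrative example)
[Unit]
Description=Example service isolated under its own Unix user

[Service]
# systemd allocates a dedicated unprivileged user for this service,
# so it cannot read other users' or other services' files.
DynamicUser=yes
ExecStart=/usr/local/bin/myservice

[Install]
WantedBy=multi-user.target
```

Distro packages typically do the equivalent with a statically created system user (`User=` plus `useradd --system`), which is the default isolation the text refers to.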
You might wonder: what if you want to use your computer while away from your desk? Luckily for me, I'm a homebody. If I weren't, maybe I'd use a different approach, where my single computer was portable, and served my website and other stuff over the cell network, Manfred-Macx-style.
I do, unfortunately, have a separate phone. Connecting a cell modem to my desktop and getting rid of the separate phone (or something like that) is on my TODO list.
But nevertheless, as of now, I have (essentially) a single computer. It's my personal computing utility.