I know we're probably ten years or more out from it (because it'd break way too much software ... I have an application I want to use but can't, due to a 16-bit installer ... in 2016), but I really hope Microsoft eventually puts out a 64-bit-only operating system. It would be lovely to be rid of SysWOW64, Program Files (x86), the redirected 32-bit registry view, the ~3 GB of space wasted on 32-bit code, etc.
I intentionally leave off the 32-bit compatibility with my FreeBSD systems, and it's lovely. Of course, it's so much easier there with everything being open source.
I feel just the opposite. I have a pile of the most powerful general-purpose commodity hardware ever built, and it's a damning indictment of the industry that it can no longer run software that used to work fine on tiny machines.
I disagree. Things that made sense on a tiny machine, like the original 8086's address wraparound at 1 MB, led to hacks like the A20 gate [1]. Or timing loops that depended on a processor running at 4.77 MHz, requiring things like the turbo button. Compatibility is a balancing game, because you end up keeping bugs around that would otherwise have been fixed in a newer architecture. I'm more of the opinion that we're much better off explicitly emulating those areas, so old software runs in its own context and it's a lot easier to mitigate damage. While compatibility is a positive thing, improvement is a much more positive thing.
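For readers unfamiliar with the wraparound in question: an 8086 computes physical addresses as segment * 16 + offset, which can nominally exceed 1 MB, but the chip only has 20 address lines, so the result wraps to zero. Some software relied on that wrap, which is why later machines needed the A20 gate to fake it. A minimal sketch of the arithmetic (my own illustration, not from the thread):

```python
# 8086 real-mode address arithmetic: physical = segment * 16 + offset.
# The 8086 has only a 20-bit address bus, so bit 20 is silently dropped;
# the A20 gate on later machines emulates that truncation for compatibility.
def phys_addr(segment: int, offset: int, a20_enabled: bool = False) -> int:
    addr = (segment << 4) + offset      # can nominally reach 0x10FFEF
    if not a20_enabled:
        addr &= 0xFFFFF                 # wrap at 1 MB, as on a real 8086
    return addr

# FFFF:0010 wraps to address 0 on an 8086, but reaches 0x100000 ("high
# memory area") once the A20 line is enabled.
print(hex(phys_addr(0xFFFF, 0x0010)))        # -> 0x0
print(hex(phys_addr(0xFFFF, 0x0010, True)))  # -> 0x100000
```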
I agree that virtualization is better than support for old instruction sets in new cores, but users should be able to expect such virtualization to be ever-present and reliable, rather than mutely accepting it when tools randomly stop working.
> and it's a damning indictment of the industry that it can no longer run software that used to work fine on tiny machines
On the contrary, I think it's an indictment of the industry that we still spend hardware effort on running ancient software, and continue to burden the most widespread architecture in the world with 1970s nonsense, when virtualization is the obvious solution.
Plenty of people want to run games for the 6502 architecture on their PCs, yet Intel would be out of their mind if they added a 6502 to the die. Why do we not treat the Intel 8086 the same way?
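And the reason nobody asks for a 6502 on the die is that software emulation of an old CPU is just an interpreter loop. A minimal sketch (real 6502 encodings for LDA/ADC/BRK immediate forms; everything else, including the carry flag, omitted for brevity):

```python
# Toy fetch-decode-execute loop in the spirit of a 6502 emulator.
# Only three opcodes are implemented; a real 6502 has ~151 and this
# version ignores flags entirely -- it's an illustration, not an emulator.
def run(program: bytes) -> int:
    a = 0       # accumulator
    pc = 0      # program counter
    while pc < len(program):
        op = program[pc]; pc += 1
        if op == 0xA9:                      # LDA #imm: load immediate into A
            a = program[pc]; pc += 1
        elif op == 0x69:                    # ADC #imm: add immediate (carry ignored)
            a = (a + program[pc]) & 0xFF; pc += 1
        elif op == 0x00:                    # BRK: stop
            break
        else:
            raise ValueError(f"unimplemented opcode {op:#04x}")
    return a

# LDA #$05; ADC #$03; BRK -> accumulator holds 8
print(run(bytes([0xA9, 0x05, 0x69, 0x03, 0x00])))  # -> 8
```

Emulators like this run 6502 code orders of magnitude faster than the original 1 MHz silicon, which is exactly the argument for virtualizing legacy x86 modes instead of carrying them in hardware.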
There's also no PC manufactured in the past two decades that can run the 8088 MPH demo. Real mode has nothing to do with the challenges of emulating it.
(There's also the fact that the 8088 MPH demo is not a particularly compelling use case, even if the hardware were present.)
If they wanted to, they should have done it in Windows 10. They can't now, because they promised to support all the devices it could be installed on for their lifetime. Some of those devices are 32-bit.