Hacker Times

So who are they talking about? It sounds like a warning, and focuses on emulation-style approaches. They compared it to Transmeta.

Edit: Seems odd to use Transmeta as a comparison if they are talking about software OS-level emulation. Wasn't Transmeta's emulation all built into the chip, with no OS-level support/software needed?



Windows 10 on ARM (to be released later this year) includes x86 emulation to support 32-bit x86 desktop applications on ARM. I assume Intel is talking about Microsoft.

A couple of demos of Microsoft's x86 emulation. It seems to work well:

https://www.youtube.com/watch?v=A_GlGglbu1U

https://www.youtube.com/watch?v=oSXUDKpkbx4


Heh, and Android on x86 includes ARMv7 emulation to support 32-bit NDK apps on x86: libhoudini, written, and still maintained, by Intel.


Though 32-bit apps will get rarer. They will need to have a solution for 64-bit apps.


Maybe if Microsoft stops shipping 32-bit OSes. But if your app needs to run on 32-bit Windows, and your app isn't one that would benefit much from 64-bit, why bother releasing a 64-bit version? It's just one more build to keep track of, and one more source of bugs.


The WOW64 layer has a few caveats. For instance, filesystem virtualization means that users of your app may see different content in a "File > Open" dialog than they would in an Explorer window.
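A rough sketch of how that redirection behaves, as a toy Python function (this is an illustration, not the real WOW64 implementation — the actual mechanism lives in the loader/kernel and has an exception list, e.g. the Sysnative alias and some System32 subdirectories that are not redirected):

```python
# Toy illustration of WOW64-style filesystem redirection: on 64-bit
# Windows, a 32-bit process that opens a path under System32 is silently
# redirected to SysWOW64, so a 32-bit "File > Open" dialog and a 64-bit
# Explorer window can show different contents for the "same" folder.

def redirect_path(path: str, is_32bit_process: bool) -> str:
    """Map a path roughly the way WOW64 filesystem redirection would."""
    prefix = r"C:\Windows\System32"
    if is_32bit_process and path.lower().startswith(prefix.lower()):
        return r"C:\Windows\SysWOW64" + path[len(prefix):]
    return path

# A 32-bit app asking for System32 actually gets SysWOW64:
print(redirect_path(r"C:\Windows\System32\kernel32.dll", True))
# A 64-bit app sees the real directory:
print(redirect_path(r"C:\Windows\System32\kernel32.dll", False))
```

Registry reflection (HKLM\Software vs. Wow6432Node) causes the same kind of "two views of one machine" confusion.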


Well, I would imagine they're hoping we go straight from 32-bit Win32 to UWP which compiles for ARM automatically. I doubt anyone today is releasing 64-bit only apps, and Win 10 for ARM is almost here, so the selection of 64-bit only apps probably will never grow too large. For developers who want to avoid UWP, they can continue to compile for both 32-bit and 64-bit as we've been doing for years.


> For developers who want to avoid UWP

For those paying attention to the BUILD talks, those developers will be gently dragged into UWP thanks to Desktop Bridge, regardless how they feel about it.

The new Office version for Windows 10 will be UWP-only.

CP/M, MS-DOS and Win16 compatibility were eventually removed.


MS-DOS and Win16 compatibility is NOT removed in 32-bit Windows.


I was thinking about 64-bit and future hardware only.

I haven't thought about 32-bit Windows since I left XP.


Maybe they think most of those can be recompiled?


If the developer is convinced there is a big enough market.


For what it's worth, in that first video where they say the video is running "really smoothly", the video is visibly chugging (though maybe that's just due to recording).


There have been several reports of a partnership between Microsoft and Qualcomm to make an "always on" cheap laptop running a Snapdragon 835. So just like a phone, it is always on, can receive messages at any time, and ideally the battery lasts all day. Said project would emulate x86 on the ARM chip to allow backward compatibility.

Chromebooks are having somewhat of the opposite battle. Intel is trying to emulate ARM to run Android apps on Chromebooks. The reports I've seen say Intel is not competing so well: many popular Android apps run better on an ARM Chromebook than on an Intel Chromebook.

In both cases I think the consumer wins, competition is good, and I'd consider a new Chromebook just for the hardware and then I'd install some flavor of Linux on it.


A managed OS like ChromeOS makes the underlying processor irrelevant, for application developers that is, which means more competition among those that deliver CPUs to OEMs.


My bet is it is Apple, and they are looking at ARM emulation of x86 on new Mac laptops slated for next year.

Oops, they are probably referring to Microsoft/Qualcomm's announced project here: https://www.extremetech.com/computing/249292-microsoft-decla...


This reads like they caught wind of something that really scares them. I'm in the "it's probably Apple" camp, too. Not only is Apple doing very well with its A-series chip designs, but they have a ridiculous amount of money in the bank to fight any legal battles.


Yeah, it probably is Apple.

I can't imagine Intel going to war against Microsoft, both sides have too much to lose if they start seriously fighting. It's not called the Wintel monopoly for nothing.

But if Apple were to stop buying Intel chips, and start using their own chips in their Macs, Intel could become very litigious.


> Yeah, it probably is Apple.

Did any public evidence appear, officially or accidentally, on the internet that Apple is working on such a project (for Microsoft the situation is known)? I am just the kind of person who strongly prefers to look at evidence instead of rumors, speculation, and hopes.


Apple is notoriously secretive, so you won't get the evidence you're looking for until such a device is very close to launch.


Sure, but some of this can be learned from monitoring LLVM patches and discussion groups (or attending the usual conferences). It's difficult (or impossible) to keep secrets like this from their underlying communities.


Apple has three (iOS, tvOS, watchOS) of their four platforms on ARM, so LLVM patches related to OS X on ARM could plausibly be disguised as patches related to those platforms.


Often evidence leaks beforehand for new Apple products or plans.


Apple did make some noise about getting rid of support for 32-bit binaries. Reading between the lines, this could be because Intel owns many patents related to x86. 64-bit is less of an IP nightmare because AMD created x86-64, not Intel.


Don't forget they just said High Sierra (ugh) will be the last version to run 32-bit software "without compromises".

In the next version they may run 64-bit clean and use emulation in some sort of Rosetta style environment for "legacy" 32-bit software.


That seems like a pretty bizarre plan to me. The 32-bit emulator is going to be way bigger than the code that handles IA-32e mode. I think it's more likely that they'll reduce the amount of system libraries for which they ship 32-bit variants and/or require separate downloads of them. That's the real cost of keeping 32-bit support, and an emulator doesn't help on that front at all.


It seems odd, but I wonder if it would be useful in some way. At that point you could buy chips that have no 32 bit mode at all. Or you could MAKE them.

Or maybe you could make ARM chips and run a Rosetta-like thing to run the 64-bit code, but since 64-bit x86 is so much cleaner it's not much of a problem compared to the whole 32-bit mess.

This is all pure speculation. I thought the way they phrased things was a little odd. It may simply be that they don't intend to update any libraries in 32-bit mode, so stuff will just stop working as libraries change. Or everything's going to start going through some layer that thunks 32-bit calls into 64-bit calls, possibly reducing performance.

I don't know… I just feel like that statement meant something interesting and I am coming up with fun guesses.


My guess is that the "compromise" will be some fugly alert that pops up when you run a 32-bit app, saying that it's degrading system performance and you should update it and be ashamed of yourself. This is what iOS currently does, so there's precedent.


> but since 64-bit x86 is so much cleaner it's not much of a problem compared to the whole 32-bit mess.

I would not call x86-64 "cleaner" than x86-32. In many senses it is much more messy than x86-32.


I remember people saying that, but I don't have experience with it myself. I'll take your word for it.

They may have been referring to the fact that 64-bit runs in 64-bit only, no choice to drop into/out of 16-bit, etc.


> They may have been referring to the fact that 64-bit runs in 64-bit only, no choice to drop into/out of 16-bit, etc

Even in 64-bit mode it is possible to drop into 16-bit (protected mode) code if one wants to. What does not work is dropping into Virtual 8086 mode.

GNU/Linux never supported 16-bit code. For Windows it would be possible in principle (and it would IMHO even make sense to support this feature on 64-bit versions of Windows). On

> https://hackertimes.com/item?id=14246521

I wrote something about this topic. TL;DR: 64-bit Windows uses 32-bit handles, which causes problems for 16-bit applications.

But there are also people on HN who stated the opinion that while this makes implementing support for 16 bit applications on 64 bit Windows harder for Microsoft, it would have been far from impossible, thus this technical reason is a mere convenient pretense not to implement this feature.
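The handle-width problem can be sketched as a toy (hypothetical handle values, not actual Windows code): a 16-bit application stores window/GDI handles in 16-bit variables, so two distinct 32-bit handle values can become indistinguishable after truncation.

```python
# Toy demonstration of the 16-bit handle problem on 64-bit Windows:
# the OS hands out handle values whose significant bits need 32 bits,
# but a 16-bit app stores handles in 16-bit variables, so distinct
# handles can collide once truncated.

def to_16bit(handle: int) -> int:
    """What a 16-bit app effectively does when storing a handle."""
    return handle & 0xFFFF

h1 = 0x00010042  # two hypothetical 32-bit handle values that differ
h2 = 0x00020042  # only in their upper 16 bits
assert h1 != h2
assert to_16bit(h1) == to_16bit(h2)  # ...yet identical to a 16-bit app
print(hex(to_16bit(h1)))
```

On 32-bit Windows this wasn't an issue because NT deliberately kept the usable handle range small enough for Win16 interoperability.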


Actually, NTVDM on 64-bit Windows is implemented as a prototype - and it works.


x86 -> ARM is waaaay more complicated than 68K -> PPC or PPC -> x86 because ARM's memory model is much weaker than x86's.

It's borderline a non-starter.
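A toy sketch of the memory-model problem (symbolic ops are made up for illustration): x86 guarantees TSO-like ordering (loads are not reordered with other loads, stores not with other stores), while ARM permits such reordering, so a conservative binary translator has to emit barriers around memory operations to preserve x86 semantics.

```python
# Toy sketch of why x86 -> ARM translation is costly: to recreate x86's
# strong (TSO-like) ordering on ARM's weak memory model, a conservative
# translator inserts a "dmb ish" barrier after every translated memory
# access. (Real translators are smarter, but the overhead is the point.)

def translate(x86_ops):
    """Translate symbolic x86 memory ops into symbolic ARM ops,
    conservatively inserting a barrier after each one."""
    arm = []
    for op in x86_ops:
        arm.append(op)          # the translated memory access itself
        arm.append("dmb ish")   # barrier restoring x86-like ordering
    return arm

print(translate(["load [a]", "store [b]"]))
```

Every extra barrier costs cycles, which is why memory-heavy x86 code translated naively onto ARM can be painfully slow; 68K -> PPC and PPC -> x86 went in the "weaker to stronger" direction, where the host already over-delivers on ordering for free.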


Why would they need to emulate x86? They can just compile for ARM.


So that users can run all of their existing binaries.

Both Mac processor architecture changes so far have included emulation or binary translation layers for this purpose.

68k -> PPC: https://en.wikipedia.org/wiki/Mac_68k_emulator

PPC -> x86: https://en.wikipedia.org/wiki/Rosetta_(software)


With NeXT they had "fat" binaries to handle different processors. No emulating.


That does not solve the problem of existing binaries.


Apple doesn't have the same control of the app ecosystem on macOS that they do on iOS, so they'd need to maintain some compatibility system like they've done for their previous architecture migrations.


They have to convince everyone else to as well though. There will need to be a transition period powered by emulation for third parties, similar to Rosetta from the last time they switched architectures.


They’d have had a very hard time making the PowerPC -> Intel transition without Rosetta, but I’m not sure they’re in the same position now. How many apps on the average 12” MacBook are no longer maintained or have significant asm in them? I’d guess very few. Start with the MacBook and, over time, as Ax chips become faster and more software gets an ARM64 port, transition the whole line. (Not saying I’d bet on this scenario happening, but it doesn’t seem impossible like making the move to Intel without Rosetta would have been.)


One of the important things about the Rosetta transition was that the x86 processors were DRASTICALLY faster than the G4s, especially in mobile. The difference wasn't as big against the G5 processors, but on an Apple laptop, even after running through Rosetta, the speed difference was noticeable.

If they were to switch processors and it was only (say) 10% faster, the hit from the transition layer may make it a hard sell.

Apple has been knocking it out of the park with the A-series processors though, maybe they could make a big enough difference that it would work out again.


There is another important difference: there was no GPU back then (and QuickDraw was not well suited to GPUs; OS X changed that).

Applications where CPU time was mostly spent drawing on screen (e.g. text editors) worked a lot better than those where most CPU time was in app code. Today the GPU and graphics code accounts for an even larger share of the processing.

So for a lot of apps it will not be a big problem if the emulation is a bit slow.



