In other bad news, Microsoft developers on Twitter stated that an 8th-gen Intel or 2nd-gen Ryzen CPU will actually be required to install Windows 11 at all by RTM. If you are on 7th-gen Intel or 1st-gen Ryzen, there is no mercy. A 2013 MacBook gets more support than a 2017 Windows laptop.
A security director now says that a blog post "clarifying the floor" is coming. But frankly, if you read those tweets, this could only turn out to be a miscommunication if Microsoft were unbelievably incompetent in its use of words.
Man, even checking their official website makes me want to pull my hair out. Nowhere do they concretely mention either TPM or, after some research, what is apparently called PTT (Platform Trust Technology) on Intel. I can only guess that maybe they hide it behind yet another name (Intel® Trusted Execution Technology, perhaps?), but the information on this is basically impossible to find on the internet at this point.
On the other hand, 4 more years is a lot, and they'll probably continue to release security updates for a year or two past 10's EOL (like they have with Windows 7).
Yes. It's to ensure Microsoft always has full control over the code running on your device, so eventually Microsoft can require all Windows apps to come from its store, and you won't be able to do anything about it, because the hardware TPM module will reject any attempt to modify it.
You mean in the same vein as the huge Epic v. Apple trial? Nah. Sideloading and app installing will be allowed out of fear of government action. The TPM can ensure system files aren't malware-infected, though.
Is there any justification for the cutoff? It might be possible to just patch out the cpuid check and have the OS run.
If it's boot attestation someone will do it. This is probably to force enterprise vendors to move to devices that better support enterprise management, but yes, it drags us kicking and screaming along with it, at nontrivial personal expense.
"Seems like you are assuming there is a specific security feature that defines 8th gen as the CPU floor. The floor is set for a range of quality, performance, support, and reliability reasons to ensure a great experience."
That sounds quite unreasonable: "You don't deserve to 'experience' Windows 11 unless you can afford the newest computers. And if you can't, we won't let you have it, because maybe it won't work perfectly. Even though you got your computer 4 years ago."
Not true. 7th gen gained Mode Based Execution Control which greatly decreases the overhead of VBS. 7th gen also gained SMM Security Mitigations 1.0 and some UEFI hardening.
The list of supported Windows 10 CPUs doesn't even list my current CPU (Xeon E5-1680v2). I'm excited for the release but they've really bungled their messaging and scared people.
I have an i7-3930k on some Asus MB and it has Secure Boot but no TPM. There's a header for it, but it's empty.
My understanding is that the SB keys are known and verified by the BIOS, not by the TPM.
The TPM is a mechanism to verify that the BIOS (among other things) hasn't changed from a known state. If it hasn't, the TPM will then release some key, for example to automatically decrypt the root partition at boot. I've also seen it used as a store for SSH keys.
This is how the default setup of BitLocker works. On my machine, I have to switch BitLocker to password mode, or type in the key at each boot.
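For the curious, the "known state" check works through Platform Configuration Registers: each boot component is hashed into a PCR via a one-way extend operation, and the TPM only releases the sealed secret if the PCRs match the values recorded at enrollment. A toy sketch of the extend math (not real TPM code; the component names are made up):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style PCR extend: new_pcr = H(old_pcr || H(measurement)).
    # Order-dependent and one-way, so the final value summarizes the
    # whole boot chain and cannot be forged or rolled back.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(chain):
    pcr = bytes(32)  # PCRs start zeroed at power-on
    for component in chain:
        pcr = extend(pcr, component)
    return pcr

# Value the disk key was "sealed" against at enrollment time
expected = measure_boot([b"firmware-v1", b"bootloader-v1", b"kernel-v1"])

# An unmodified boot chain reproduces it, so the TPM would release the key
assert measure_boot([b"firmware-v1", b"bootloader-v1", b"kernel-v1"]) == expected

# A tampered bootloader yields a different PCR, so unsealing fails
assert measure_boot([b"firmware-v1", b"evil-bootloader", b"kernel-v1"]) != expected
```

On real hardware this is the property that BitLocker's default mode, or a `systemd-cryptenroll`-style setup on Linux, leans on when binding a disk key to PCR state.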
Nvidia, which people love to hate, has both a console command to generate your X config and a handy GUI that lets you set options and then write the resulting config to a file of your choosing. This worked fine in 2003.
Linux on laptops/desktops will only survive if Microsoft maintains their model of working on different types of hardware and not insisting on being the hardware controller. With so many people embracing Apple's approach of one company controlling every part of the stack, I don't think Microsoft will keep supporting their model for long unless forced to.
That isn't Apple's approach; rather, Apple is the surviving vendor of that approach, which was quite common until the PC came onto the scene, thanks to IBM not being able to prevent Compaq's attack of the clones. They even tried with the PS/2 and its MCA architecture, but by then Pandora's box was open.
macOS has diverged enough from *nix, I think, to be a third camp, given the things they do: read-only system drive, app permissions, signed binaries, no package manager.
There are perspectives other than yours. I've installed Linux for many people; in fact, I have an entire family (three kids, a dad, and a mom) using Linux as their daily driver. Zero complaints.
But I was also smart about what distro I put on their systems. Endless OS or Solus, for example, are not prone to breakage on upgrade.
You've been using Linux for a long time. You should know that Arch and Ubuntu are not the only two choices in this space.
You should also know that when manufacturers weren't threatened by Microsoft to do otherwise, there was a thriving ecosystem of SFF laptops and netbooks that shipped delightful Linux distros that just worked. Moblin and Jolicloud were fantastic explorations of the UI and OS design that might have been.
Same here, most family (except my brother, but he lives abroad) on Linux and no problems. Even better when I switched them from Debian to Manjaro. Debian is still my favorite for tinkering, but to get things done and general stability there are better options.
And here is our main Linux problem: fragmentation, the natural byproduct of its freedom. Everyone and his cat can develop or modify things, which translates into different distributions. That, of course, is a nightmare if you are a software producer with developers paid to port your software to Linux, and it's why a lot of commercial developers either choose one or at most two distributions to support, or ignore Linux entirely.
One thing the Linux Foundation could do about that is mandate a minimum set of requirements that all commercial Linux software could count on, and that all distributions should meet if they want a badge guaranteeing the software will be 100% compatible out of the box, or that the host system can be adapted with extremely low effort.
I'm not sure how much having Microsoft as Platinum Member would help with this, though.
Windows 10 support will continue until at least 2025, so it's hardly like users are being left behind; they have multiple years to upgrade, and Microsoft has typically extended its support window when the time comes.
What MS is doing here is forcing OEMs to package TPM 2.0, which will be a massive win for security.
The intel generation is seemingly a bit arbitrary, we'll see how that plays out. But again, it's not like people are being left behind - they have 4+ years of support left.
So now 3-year-old hardware won't work with Windows 11, which will be the only available option for a lot of people in 4 years.
That's only 7 years; a lot of perfectly usable computers will be gotten rid of for no reason other than a mild security benefit. It seems like they want to go in the mobile direction, forcing people to buy new hardware every 4 years.
If people are going to downvote the other child comment as FUD or whatever, could they at least provide reading material or counterpoints on the benefits, for those of us without an MSc in cryptographic systems?
A TPM holds a secret. It's useful for establishing identity, performing signing, etc. Think about the fact that an attacker on your system can take a session cookie, move to another computer, and connect as you. A TPM can prevent this by establishing a cryptographic, hardware backed identity.
This is a very significant win and critical to 'zero trust', which is now something the government is telling people to embrace.
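To make the session-exfiltration point concrete, here is a toy model (stdlib only, no real TPM APIs; `FakeTPM` and the HMAC scheme are illustrative stand-ins, since a real TPM would hold an asymmetric key so the verifier only ever needs the public half):

```python
import hashlib, hmac, secrets

class FakeTPM:
    """Stand-in for a TPM: the secret never leaves this object."""
    def __init__(self):
        self._secret = secrets.token_bytes(32)  # device-unique, not exportable
    def sign(self, data: bytes) -> bytes:
        # Proof of possession: the caller gets a signature, never the secret.
        return hmac.new(self._secret, data, hashlib.sha256).digest()

device = FakeTPM()

# The server binds the session to this device at login, then periodically
# challenges it; a fresh signature proves the same hardware is present.
challenge = secrets.token_bytes(16)
response = device.sign(challenge)

# An attacker who stole the session cookie but runs on different hardware
# cannot answer the challenge: their TPM holds a different secret.
attacker = FakeTPM()
assert device.sign(challenge) == response
assert attacker.sign(challenge) != response
```

The cookie alone stops being enough; the attacker would also need the physical device that holds the key.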
The whole thing with security is silly. They said it about UEFI - and yet I run Linux on UEFI systems all the time. I also have systems with TPMs that run Linux. Works fine.
edit: People are downvoting you now, though I think your question is very reasonable and worth answering. Such is HN, though.
> They said it about UEFI - and yet I run Linux on UEFI systems all the time.
Not just any UEFI, UEFI with SecureBoot enabled specifically. On SecureBoot-enabled systems, you're booting a Microsoft-signed executable that chainloads Grub, which then loads Linux. This has given Microsoft a power they should not have: the power to refuse to sign future versions of that first executable.
As others have said, this is either FUD or limited to specific systems with broken UEFI implementations.
I think the whole "MS-signed Grub" thing is a way for Ubuntu (and possibly others) to facilitate Linux installation by non-pros. And it's not that bad, I think.
I personally use Arch Linux, and the installer isn't signed with anything; they have a "do your own signing" policy. I had to disable Secure Boot to install it, because I couldn't be bothered to create a new ISO, but then I registered my own keys in the UEFI and boot my own signed kernel directly [0], without using GRUB at all.
I did all this on multiple generation HP "enterprise" laptops and desktops and it worked without any issue. And those systems don't come with any sort of Linux support from HP.
---
[0] For the curious, this is EFISTUB combined with sbsign, that bundles the kernel image, initrd and CPU microcode in a single EFI binary and signs it. This is then booted directly by the UEFI, and you can even register multiple binaries and have them show up in the UEFI boot selection menu.
Uhm, but on my Thinkpad I have to add keys to load Linux in trusted mode (installer asks to choose a password to lock these keys in order to establish their legitimacy when rebooting.)
Also Ventoy needs to install user-defined keys into UEFI to make it work with Secure Boot.
So my assumption is that, at least on Thinkpads, you can add your own user-defined keys; the only Caudine Forks where a Microsoft-blessed loader might be involved is the one used by the distro installer.
When this was first discussed in the early 2000s the very legitimate worry was that it would be introduced as optional and gradually become mandatory terminating in machines where the user had as little control over the hardware/software as a game console.
There is no reason to believe this wasn't implemented thus far because of any high ethical standards Microsoft folks have historically adhered to, and while we are asked to believe Microsoft has turned over a new leaf, much of the current leadership was involved at the executive level in the old, bad Microsoft.
...So my session cookie gets tied to a crypto key that (for all intents and purposes, from what I've gleaned from the marketing speak) is burned into the TPM at the factory and is unique?
I thought we were trying to get away from identifiable fingerprints? This seems like the most unique one you could get.
I'm trying to learn about it, but all Google wants to throw up these days is news articles on Win11's requirement of it, marketing non-speak about "securing the platform" (which, to a layman, does read similar to the other comments' cry of lock-in), or dense whitepapers on threat models and cryptographic math.
So, first off, the entire point of a session cookie is to uniquely identify you. Like, if you log into a website, it should be you, right?
Second, you're close but just a bit off. While the TPM has a secret internally, it wouldn't expose that - it would be used to generate other secrets, or to sign things, basically it's sort of like putting a crypto library into your device so that you can perform cryptographic operations without knowing the secrets involved. But yes, there's a seed that gets soldered into the device itself.
I use TPMs to ensure that no one can access our AWS environments from devices other than those we own, for example. I'm not really an expert on this sort of thing, though I do work in security professionally - hopefully I'm not butchering this. There are quite a few experts on the matter who comment on HN though, maybe one of them can speak more authoritatively on this.
edit: Actually, I misread maybe - you aren't off. The session cookie is tied to the TPM - either existing within it, or requiring re-validation via the TPM.
I don't want most session cookies to know I'm the same person from a previous cookie. And even if they know I'm the same person, I'd prefer they still not know whether the device I just logged in on is the same as a previous device.
Separately from that, using the TPM to lock the session cookie to the device sounds useful.
I think we should differentiate a TPM backed cookie from, say, a tracking cookie. It isn't like an arbitrary website can track you with the TPM. TPMs aren't exposed to the web by any browser, at least to my knowledge - except through intermediary protocols provided by something like webauthn or an extension (Extended Verification, for example).
Again, maybe someone more knowledgeable can explain more - I'm a consumer of TPMs, and I understand the concept, but I'm no expert.
For a web exposed use case: TPMs are used as part of Windows's FIDO2 implementation, to make sure that the secret actually cannot be exfiltrated to other hardware.
That doesn't come with any particular privacy concerns however.
Right, I meant that there's no javascript API to communicate with a TPM directly as far as I know. You can still use a TPM as part of your auth to a website, it just has to go through a protocol where the browser handles the interaction. So a website can't leverage the TPM for tracking purposes afaik.
But web is really not so much my thing, so I'm now at the point where I'm probably going to start saying incorrect things.
Well you can't secure normal cookies against extraction with a TPM, because you have to extract them to use them.
So the question is how a particular protocol works, and whether a site can still identify you after you've deleted your session cookie. As far as I understand it, if you were wantonly using webauthn for all your session cookies, it would be relatively easy for a site to check if you're the same person as before.
You would have the TPM hold a secret that's used to repeatedly re-authenticate the session. Though today I don't know that this is often the supported workflow - for example, most of my TPM-based policies only enforce device-authentication on sign-in and not on every API call.
OK? So you log into a website with an email and password and you're OK with moving the cookie to another computer and using it there, rather than having to re-prove that it's you? I don't think you know what a TPM does, honestly; there's clear ignorance in this thread as to how this identity functions. It does not have privacy or tracking implications.
I don't necessarily care about moving a valid session to other hardware as much as I care about being able to create a valid session on any random hardware without having to trust any 3rd parties or rely on any 3rd parties permission.
TPM would be fine if 2 things were true:
1. Only the end user managed the secrets inside. Or at least they could, without any penalty like "this device is insecure, so you can't use it for X". It's fine if there are default secrets and most people never change them. I'm not sure about this last bit, but ideally any factory-default secrets should also be erasable, with no indelible unique serial number.
2. The interior implementation is fully open and disclosed: how the secrets are generated and managed. Not just a description and a promise, but an actual full set of public specs, such that any manufacturer could produce a functioning device, or even I could out of other parts.
These things would allow the end user to benefit from the mechanism while leaving them in ultimate final control over their property, their identity, and their information. And it requires both; 1 is hot air without 2.
Any objections to this are automatically invalid, as they depend on something they have no right to.
"I need to know that your computer is not doing something I don't want." is just fundamentally invalid.
I think there might have been some disconnect in how we view the term session cookie (a cookie for a singular browser session vs. one specifically for an authenticated user), but the rest of this is starting to make sense.
No. Every TPM has a factory-supplied unique key called the endorsement key, but every other key on the TPM is randomly generated on the TPM. The keys you actively use /can/ be tied back to the endorsement key, but there's no need for them to be - for most cases, any cryptographic material you generate on the TPM will be indistinguishable from material you generate on the CPU, other than to the local client.
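That per-identity property can be sketched with a toy FIDO2-style derivation (made-up names, and stdlib HMAC standing in for the asymmetric key generation a real authenticator performs):

```python
import hashlib, hmac, secrets

class Authenticator:
    """Toy model: one on-device master secret, distinct keys per site."""
    def __init__(self):
        self._master = secrets.token_bytes(32)  # never leaves the device
    def key_for(self, rp_id: str) -> bytes:
        # Derive a stable, site-specific identifier. Real authenticators
        # derive per-relying-party asymmetric key pairs the same way.
        return hmac.new(self._master, rp_id.encode(), hashlib.sha256).digest()

auth = Authenticator()
bank = auth.key_for("bank.example")
shop = auth.key_for("shop.example")

assert bank != shop                          # the two sites cannot correlate you
assert bank == auth.key_for("bank.example")  # but each site sees a stable key
```

So the hardware root can authenticate you to each site individually without handing every site the same fingerprint.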
It’s like having a YubiKey soldered into your machine.
Each identity you want to protect has its own key (so it’s not necessarily de-anonymizing), with physical access - PIN, biometric verification - required for use.
It’s potentially very useful, although I 100% expect it to be backdoored by every geopolitical bloc.
The problem is that the TPM holds a secret that even the user cannot access despite being the owner.
> This is a very significant win and critical to 'zero trust', which is now something the government is telling people to embrace.
If the government is really saying the equivalent of "trust us, but not the people you usually trust", that should be a huge warning sign. The analogies to 1984 are disturbingly close.
Let's say you don't want users to view foobar.com, so you convince Microsoft to configure its browser not to resolve foobar.com.
No problem: the users install their own browser.
Now we ask Microsoft to limit which software it signs, effectively granting permission to run some software but not other software. The alternative browser is right out.
The user installs his own OS. Now we ask Dell to only allow its computers to boot software that implements the same restrictions Microsoft has agreed to.
So the user gets his own computer. Now we simply ask your bank, or better yet your ISP, not to connect to "insecure" devices that don't follow the standards described.
Now you need a substantial unpatched vulnerability to be able to resolve foobar.com.
At its root, to make a user's own device effectively restrict what they are allowed to do, you need something difficult to defeat that you control and they do not.
At one point a US senator wanted this used to allow music labels to remotely destroy users' computers if they believed you were pirating music, and every other restriction described has been implemented or discussed for use on devices belonging to users, with most of them implemented.
You can access TPMs on Linux just fine so this entire "but then what if" chain kinda makes no sense. A TPM restricts nothing on its own, it's just a hardware crypto token.
I mean sure, if Microsoft decided that it wanted to do all of the things you said, and could convince all of the OEMs to help, and a bunch of services and ISPs or whatever, yes, that would suck. It's a totally fabricated hypothetical that would never play out, but sure.
Linux is "insecure", Society^TM supports Windows only, please install Windows to continue using Bank^TM, School^TM, and irs.gov. 'Legacy' tax paying is now unsupported, please use new and improved windows.irs.gov
When trusted computing was first discussed, everything but the "trusted" browser blocking a particular website was described as a feature, not a threat model, and you can absolutely see how many consumer devices are currently locked down in just such a fashion.
Accessing the TPM doesn't mean what you think it means. You can't, by design, get the key out of the TPM, nor necessarily control which keys are trusted by local software or remote machines.
Control of what software is allowed to run is ultimately control over everything else, because that one kind of control gives you every other variety. The fact that you can disable secure boot or control which keys are trusted is a nicety that could be withdrawn next year, along with your ability to run Linux.
Your vendor approved software can then implement any sort of restrictions it pleases and remotely attest to the efficacy of such restrictions.
"Yes sir, Mr. Amazon, sir: user is running trusted Windows 11 and can't save or share this video as per your contract with Disney, nor are they allowed to view any known pirate sites, or sites that look like pirate sites according to our heuristics."
Alternatively. Please upgrade to a trusted windows 11 machine to view this content.
It's not a silly hypothetical; there have already been devices that used secure boot to lock out alternative OSes, and allowing third parties a say in what software is run is literally the entire point of remote attestation.
The last sentence is particularly disappointing. The article's analysis is correct and not particularly complicated; it stands on its own regardless of whatever bias you possess.
Let's not throw words like 'bias' around when linking to Stallman.
> allowing third parties a say in what software is run is literally the entire point of remote attestation
No it isn't.
We're clearly of two very different minds, there's no way we're finding common ground on this, and I've already done the work of explaining to many people what TPM technology is, so I feel like I've done my part.
There are quite a few computers already where Linux only runs as long as Microsoft provides a signed shim for the boot loader. There was quite a bit of a controversy because of this when secure boot came out.
Really? Which computers? I don't know of any. To my knowledge the entire secure boot thing was blown way out of proportion - Linux users still can install Linux, they can even sign their own bootloaders.
At least HP sold some where it seemed to intentionally disable setup mode on consumer-grade hardware [1]. Arm-based Surface devices were locked down to Windows RT using secure boot [2]. The Windows RT case was basically the foundation of the outrage: not only did Microsoft claim it would not abuse its position as the only entity able to sign code that would run on nearly all secure-boot-enabled devices, it immediately released a secure-boot-enabled product that locked out Linux.
> And how is HP fucking their eufi up a Microsoft problem?
My claim was that there were computers where Linux only ran with Microsoft's signature. I listed two systems (HP/Surface) where this is the case, mainly because finding more is a pain, with support requests more likely to go off to nowhere than to outright show that no support exists.
One might make the case that Microsoft claimed it would require the ability to add alternative keys and then conveniently failed to enforce that claim, if the goal were to blame Microsoft instead of just showing that Secure Boot can easily be used to lock users into one top-down approved system.
Not to do the 2021 version of Godwinning the discussion, but this is like getting vaccinated and then concluding COVID was overblown.
There is absolutely at least some money to be made locking down machines, and corporate America would sell our kidneys for a quarter if they could get away with it.
There would likely have been lawsuits against Microsoft if they had taken further steps down that road, but that doesn't mean they won't be interested in boiling us all slowly, now or in the future.
It's a token plus the ability to attest to others that you are running software signed with a key you don't control, making it the root of any effective set of controls.
Without it how do you keep someone from running whatever they please and lying to the machine on the other side of the connection?
You absolutely have to trust whoever manufactured the tpm module not to have back doors.
You have to trust every manufacturer of every other chip the same way too, and we have in fact found incredible breaches of that trust many times over, so it's too late to suggest it's a crazy thing to worry about.
The only saving grace with other chips is that they are generic, and at least you usually have your choice of some range of suppliers.
Will I be able to buy a laptop with an intentionally defective or fake tpm if I want, which will allow me to use modern software without trusting anyone?
Probably not. Plenty of Chinese manufacturers would likely be willing to produce them, like all the HDCP-defeating HDMI switches, but software vendors will probably have some way to detect and invalidate them.
That would depend on future implementations. For example, should Microsoft one day decide to tie session authentication caching to TPM, then a backdoor in the TPM adds a universal chain for a state actor to log into the OS. Worried about state actors? No, but they hire contractors that I probably don't trust to keep this stuff secure. The OS could of course already have a back door, but reversing their code and finding it is higher risk than a closed access chip. I could envision a mandate to relocate the lawful intercept code from the OS to the hardware.
Another implementation could be used to tie DRM to hardware and then brick any hardware that is in license violation. Actually, I can think of many scenarios in which a unique identifier on the hardware could be used or abused by corporations.
It should be noted that I lived through a time when corporations wanted this same setup on televisions. Each TV would have a unique chip that would give corporations control over what a person could watch, or even brick their TV. This was called the V-Chip. [0] And then of course, there was also the Clipper-Chip [1]. In terms of boiling frogs [2], TPM could be the base framework to slowly reintroduce the Clipper-Chip under a different name. i.e. used to boot-strap the pieces of code required to watch movies, buy things, browse the web, anything you can imagine.
The point is that the TPM is an essential element of a set of standards called trusted computing that includes secure boot and more.
It's not that secure boot nor the TPM in any fashion keeps Linux from inherently working. It's that it enables the OEM to decide what software is allowed to boot on the computer and enforce this.
Microsoft typically gives OEMs a substantial break, and the OEMs in turn sell most SKUs to 99% Windows users and make only a small margin on those devices. A small discount on the cost of Windows could make devices disproportionately more profitable, and MS might be apt to see a percentage in offering such a discount to help lock out the competition.
They have in fact engaged in far more unethical behavior over the years, including investing tens of millions in a "partner's" fraudulent lawsuit/pump-and-dump scheme against various Linux vendors, bankrolling an entire list of felonies.
This is a typical hacker news comment. Not everyone upgrades their PC every week. I know many people using computers much older than the newest unsupported hardware and they will still be using it after 2025. They just won't be getting security updates I guess.
Computers can be in sales channels a long while after they were last made, are kept an average of 6 years, and in the case of many machines as much as 12.
Intel stopped taking orders for such chips in April 2020.
I don't really get what you're saying, but Windows 10 was released 6 years ago, and they're committing to another 4, so that's a decade of support at minimum.
Windows 10 has a decade of support but devices that were sold relatively recently will have significantly less time supported.
You could have bought a new laptop just 3 or even 2 years ago that will now be effectively obsolete in 4 years.
It's basically the Chromebook model, if you think of it -- and kind of radical of Microsoft to go that route when historically they've never really had stringent hardware requirements beyond the ones for OEMs to put those Windows Ready stickers on.
> sold relatively recently will have significantly less time supported.
OK but the 2025 EOL wasn't a secret, it was announced alongside the Windows 10 release a decade ahead of time, as well as the extended support date of 2029.
So you had a decade of notice for EOL, and currently you have 4 years of notice that you'll need a TPM 2.0.
Maybe you think that's unreasonable, but I'm not so sure.
99% of buyers don't know what a TPM is or if they have one. They also weren't shown such a notice when they bought their PC at Walmart.
Virtually nobody buying a Windows machine new today knows that their new machine has less than 4 years of life if it's 7th gen, if they even know it's 7th gen.
There may still be machines in retail channels being sold new next year, since Intel merely stopped taking orders for said chips last year.
If it's EOL in 2025 you could have machines sold new that become unsupported in 3 years whereas the average lifespan of a PC is 6 years.
Why are you defending a megacorp planning on creating untold tons of toxic waste and destroying millions of dollars of consumer value?
It ought to be illegal to sell a new computer with less than 6 years of supported life from the date of sale, and buyers ought to be entitled to a partial refund if this promise isn't kept, with the retail seller expecting to get their money back from the OEM, who has a contract with suppliers like Microsoft not to do stupid shit like this and necessitate such refunds in the first place.
I'm not defending Microsoft, personally my belief is that:
a) The intel stuff they're pushing is silly and should be loosened, and made part of their Windows 10 label OEM thing
b) TPM 2.0 will be a massive step forward for security, possibly the most significant since XP Service Pack 2, and I understand why Microsoft is getting aggressive about this.
Frankly, I think most people discussing this don't seem to know what the hell they're talking about. I'm really just presenting facts.
If you're asking about TPM in general, the answer is, among other things, cryptographic identity tied to a device. So, assuming it's everywhere, no more phishing for user credentials, no more password re-use attacks, no session exfiltration, and those are just a few - there are many more. Personally, I would rank those threats as being the top threats for the vast majority of users and organizations, so that's not nothing.
If you're asking about 1.2 vs 2.0, 2.0 makes considerable progress on the original specification both in terms of hardening of the TPM and its capabilities, allowing for it to be used in a much more broad set of authorization and authentication schemes.
If tomorrow every user had a TPM, and major services supported it, it would massively change the security landscape - I really can't stress enough that it eliminates some of the most widely exploited attacker techniques.
> you have 4 years of notice that you'll need a TPM 2.0.
It's very uncharacteristic for Windows. Totally unprecedented, both for consumers with no tech knowledge and for the more technically inclined.
This is the first time in the history of Windows where a large portion of the install base will be (ostensibly) totally unable to upgrade.
When you buy a Mac or a Chromebook, you expect this sort of thing. On Windows, though, the expectation has been that windows would at least try to run on the oldest platform even remotely usable, e.g. Win10 (2015) would run on the old Core 2 processors (2006).
It's not the end of the world, it just feels a bit slimy. I'm not one to cry "planned obsolescence" but this restriction (if it's hard enforced and not "just" requirements for OEM licensing) is clearly an attempt to nudge people to buy new hardware. Which is fine. Just unexpected.
The average PC is kept for 6 years, meaning people would expect to use a new computer received, say, tomorrow as late as 2027. A minority of machines will remain in service as long as 12 years.
Historically, Linux and Windows have both continued to run on previously supported machines for much longer than 4 years, subject to the machine's ability to keep up with current software. For example, not all hardware released in the XP era had Vista drivers, but most XP machines worked OK with Vista if they had sufficient RAM, and while there weren't many machines sold with XP after 2006, MS provided updates to XP for another 8 years.
This means that machines sold in 2006 were either in the OK to update to vista camp or in the OK to stick with XP camp until they were rendered permanently insecure after no less than 8 years of service with machine capable of running vista mostly being capable of upgrading to 7 as well for another 6 years of service.
Even when an old machine has necessarily been retired by the shifting winds of software, it has normally been due to incapable hardware or OEMs not providing the means to keep moving forward.
It is unprecedented in Microsoft or Linux land for such a monkey wrench to be put in the works. It seems likely that they will end up having to move the date back to allow a greater portion of machines being sold (including, as we speak, with incompatible hardware) to age out.
For today's programs and most games it's perfectly fine. War Thunder, Arma 3 and many other AAA titles run at max quality without issue; recently COD Warzone required cutting back clutter a little and that was it.
Synthetic benchmarks are almost irrelevant, we've been in good enough territory for a decade now.
I call bullshit on that one. There's no way you are running a CPU-intensive game like Arma on a Q6600 at any usable FPS, much less max settings; maybe at <720p with a tiny render distance. That game struggled on modern PCs when it came out in 2013, and you're claiming an ancient CPU that's 6 years older. Yeah, no way.
You're correct, and the dangers of this redefinition of "security" were known even further back:
"Unfortunately, the attestation model in TCG's current design can equally effectively prevent the software on a computer from being changed deliberately by the computer owner with his or her full knowledge and consent. While the owner is always free to alter software, attestation adds a new risk: doing so may now eliminate the computer's ability to interoperate with other computers."
Which can already be seen in action on Android: Various apps refuse to work if you're rooted, and Google is now slowly moving over to hardware-based attestation that probably can no longer be easily fooled.
Remember what they said about Stallman 20 years ago...
Alex Jones is associated with fake news and there is plenty of evidence about that; the same can't be said of Techrights, or at least I didn't find any such claims.
> In other bad news, Microsoft developers on Twitter stated that 8th Gen Intel or 2nd Gen Ryzen CPUs will, actually, be required to install Windows 11 at all by RTM. If you are on 7th gen Intel or 1st gen Ryzen, there is no mercy. A 2013 MacBook gets more support than a 2017 Windows laptop.
By the time Windows 10 support is EOL (2025), those laptops and CPUs will be 8 years old. Right now, Monterey (the successor of Big Sur, aka macOS 11) won't support the MBP 2013. So when Catalina (aka macOS 10.15, the last macOS 10.x version) is EOL, which should happen next year, the MBP is only supported by Big Sur, which will probably last one more year than Catalina. So in two years, the 8-year-old MBP 2013 is no longer supported, while it's already only receiving security and reliability fixes, since that's what Big Sur and Catalina receive. All in all that's 10 years of support, quite massive (but then again there has been very little progress on CPU performance between 2014 and 2020). You can still run some other OS on any of this hardware, such as Linux or ChromeOS.
Yes, except that Windows 10 installs and runs absolutely fine on a Core2Duo laptop from 2008 (I know because I have one).
It might not be officially supported but it works just fine. And with an SSD and 8GB of ram it actually runs pretty well.
And actually, macOS runs on much older hardware with some simple patches. I also have a late-2008 unibody MacBook Pro that runs Catalina and receives the latest security updates. Installing it was a tiny bit complicated, but again, it works.
If you meant an Early 2013 MacBook Pro, the current macOS Big Sur doesn't support that. If you meant a Late 2013 MacBook Pro, macOS Monterey releasing later this year drops support for it and all MacBook Air/Pro before 2015 as well as all MacBooks before 2016.
Old versions of MacOS get about 2-3 years of additional security updates. So if you bought a 2017 laptop with a 7th gen processor, you got 8 years of support from 2017 to 2025. Unless you bought Surface Studio 2, which is $3499 from Microsoft and comes with a 7th gen chip so it only gets 4 years of support. If you bought a 2013 MacBook which just got cut off at Big Sur, you'll probably get supported to 2024, or 11 years.
Yes, Apple sucks at legacy support in many respects. Luckily, Macs are only ~10% of the PC market, so they can't create as much of an e-waste disaster.
Two wrongs don't make a right and I expect better of Microsoft.
It's amazing how no amount of altering the deal to be worse for consumers seems to get people to mass-exodus from these proprietary platforms, despite the existence of numerous alternatives
They're captured. They want their games. They want their shiny new peripherals to work. And they want all this seamlessly, without messing with Proton/Wine or config options.
In fact why even go this far? Most people have no idea what an OS is. They just buy a "device" and use whatever software platform comes baked into the device and all of them are proprietary.
It's not consumers that matter much in this game. It's businesses, which will incur massive cost and down time to retool their entire fleet if they switch. Windows has clearly not become unappealing enough for that, sadly.
>It's businesses, which will incur massive cost and down time to retool their entire fleet if they switch.
That's a non-issue, since most businesses upgrade their fleet every 3 years or so, and are also very slow and reluctant to upgrade to newer versions of Windows (my last job switched from Win 7 to 10 only 4 years ago). So by the time they finish waiting for Microsoft to iron out Windows 11, years down the line, their fleet will already have been replaced at least once.
Have you ever trained a few hundred or more 'non-tech' people (and I mean people who don't know what a file is) on a switch from Windows to something non-proprietary? How did that go for you, if so?
As a Ryzen 1800x user that sees no need to upgrade my CPU at the moment (ie. performance per dollar improvement for current gen isn't really worth it to me), this is an outrage.
Maybe time to go back to Linux then. After pragmatically settling on Windows around 10 years ago, I've been happy enough. My 5 year old laptop is perfectly good, and have no need to upgrade it. I guess I could keep using Win 10 for a while, but the drive to move back to Linux just became a fair bit stronger.
What's the EOL for Windows 10? Apparently it's 13 Dec 2022 or 09 May 2023 (education/enterprise) for the currently released versions -- I guess that's it? https://endoflife.date/windows
LTSC versions are officially supported for 10 years, so some version of Windows 10 will be supported for at least a decade. Windows Server 2019/client v1809 LTSC is supported until 2029, for example.
For those considering a new machine for Windows 11, remember that upcoming Intel and AMD and Qualcomm-Nuvia-Mx CPUs will include the built-in Microsoft Pluton (inspired by XBox) hardware root of trust, which will play a role similar to Apple T2 or Google Titan.
> The Pluton design removes the potential for that communication channel to be attacked by building security directly into the CPU. Windows PCs using the Pluton architecture will first emulate a TPM that works with the existing TPM specifications and APIs, which will allow customers to immediately benefit from enhanced security for Windows features that rely on TPMs like BitLocker and System Guard. Windows devices with Pluton will use the Pluton security processor to protect credentials, user identities, encryption keys, and personal data. None of this information can be removed from Pluton even if an attacker has installed malware or has complete physical possession of the PC.
Sad news, but I don't believe they are worried about that. It's good for their profit that hardware gets more and more commoditized and unrepairable, just like phones. If it breaks, you buy a new all-in-one device which, as each year passes, you can customize LESS, and that's it.
I'm actually starting to worry about open-source approaches, since the Year of the Linux Desktop (TM) is getting farther and farther away.
Today I gave Xubuntu 21.04 a go without a VM (which is how I have been consuming my Linux fix recently) and it was okay, but even basic things like suspend didn't work and simply crashed the X window manager, or other non-poweruser-ready shit.
It's getting more and more complicated... sometimes I wonder if it's NOT and it's just me getting grumpier. But it really seems that consumer devices are getting more bundled and less customizable and tinkerable by the year.
I think this is saying that the information can not be extracted and used outside of the processor, not that that the information can't be cleared and the device factory reset.
The biggest concern this presents is if your CPU dies and you want to recover your encrypted disk. All of this is the same situation with T2/M1 Macs or iOS devices today.
MS seems to offer a few ways to back up the decryption key. One of them is uploading the BitLocker key to your account, if you logged into an MS account on that system. Or you could write it down (which isn't very useful IRL; I haven't seen anybody actually write a recovery code down on paper yet).
I suppose that it could still be done by "unlocking" the device by logging into MS servers, similar to the Activation lock on Apple devices: Malware can't "remove" the information… but Apple.com can.
I assume it would be similar to how you sell an old computer with an encrypted hard drive. You unlock it, format it, and then sell it. If you have the credentials to unlock the PC you should have the ability to remove your encrypted state.
Great, we are step by step getting to a place where a government could actually ban software because these machines are no longer under our control.
All for the sake of "consumer safety" when in reality it has to do with control and preventing someone from committing the horrible crime of copying a Netflix show. /s
Great that it can't be removed, but can the malware on the system use it to perform tasks on behalf of the user? In the days of always-connected computers, just remotely controlling the SW on the HW itself is enough to do pretty much all the interesting things you could do (as a criminal) if you were in physical possession of the computer itself.
So Microsoft has chosen to go down the Windows Vista route again. People are notoriously reluctant to update Windows, and if they really want to start this early with FUD about their next OS, it's very likely it's going to be rejected by the masses like Vista and 8. 10 took over 7 only because it was actually what people asked for, worked well and it worked like people expect Windows to behave.
Lots of people keep computers for years now, and it's not that uncommon to see people still holding on 6 or 7 years old machines. Most people don't really need computers that much now that smartphones are ubiquitous, and even an old computer can run a browser or a word processor just fine.
Microsoft still can't realize they are not Apple. People won't just buy a new computer just to run Windows 11. That's not how it works; Microsoft still hasn't really accepted that they have neither the mindshare nor the appeal for moves like this. I thought they had learned from their mistakes after the Vista/8/Kinect debacles, but I guess they just cannot.
> 10 took over 7 only because it was actually what people asked for, worked well and it worked like people expect Windows to behave.
No, it took over only because of the silent forced upgrades that also included some typical malware-like dark patterns. There's plenty of stories here and elsewhere about it. Some examples:
Find me a functional ecosystem of linux laptops that does basic shit such as... suspend and works for 10 years and does not need a linux IT professional to maintain for a small shop.
I hope Win11 will have the same uptake as new versions of Android, i.e. not at all. As long as relatively few people upgrade, it will prevent the rest of the industry from locking out the majority of users who happen to be on 'legacy' Windows. MS is of course playing the long game, and if the frog doesn't boil with Win11, it will by the time of Win12 or Win13.
Microsoft said that the secure boot and TPM requirements will not be enforced by the OS now, but will be by the time Windows 11 hits RTM (which is why the Windows 11 installer enforces them even though the OS runs fine).
What does this even mean? Secure boot has little impact on anything except reducing the complexity of Windows (since it doesn't need as many boot configurations)
It makes Linux more complicated to deploy, for one. And if they ever change their mind and don't allow it on x86 any more, Linux is basically exiled from the PC OEM market.
It shouldn't, as Microsoft made sure that the major distros (Ubuntu, RHEL, Fedora, CentOS, maybe Debian?) have access to a signing key that is trusted by the major OEMs. And you can trust your own keys, per the Microsoft guidelines that require that x86 machines allow their secure boot to be disabled.
> per the Microsoft guidelines that require that x86 machines allow their secure boot to be disabled.
Yeah but then you can't boot into Windows? Who is actually going to go into the firmware settings to switch settings on and off for every single boot to the other OS?
you can boot debian/redhat/... out of the box without disabling it, as the shim used as part of the boot process has been signed by MS
if you want to sign your own kernels: the shim will also let you do that relatively easily ("machine owner keys")
if you want to own your entire boot process you can replace the platform key and sub-keys with your own, and then trust whoever you want (even adding MS' keys if you wish, so Windows can boot in secure mode)
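For the sign-your-own-kernels path, the MOK flow is roughly: generate a keypair, enroll the public half via mokutil, then sign with sbsign. A sketch (file names and the CN are illustrative; the enroll and sign steps need root and a reboot, so they're left commented):

```shell
# Generate a Machine Owner Key pair (names illustrative)
openssl req -new -x509 -newkey rsa:2048 -nodes -days 3650 \
    -subj "/CN=Example Machine Owner Key/" \
    -keyout MOK.key -out MOK.crt
# shim's MokManager wants the certificate in DER form
openssl x509 -in MOK.crt -outform DER -out MOK.der
# mokutil --import MOK.der    # MokManager prompts for enrollment on next boot
# sbsign --key MOK.key --cert MOK.crt --output vmlinuz.signed /boot/vmlinuz
```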
I'm pretty sure we don't want to be in a world where using anything but a few Microsoft-approved Linux distros is a pain in the ass to deal with.
> the major distros (Ubuntu, RHEL, Fedora, CentOS, maybe Debian?) have access to a signing key
Those distros also all use systemd. Do you think they'll sign an image that doesn't include it? We're heading down a path where developers need permission to innovate, and the switching cost of running something non-standard becomes too high for even the most technical users.
And just wait until signing of all executables becomes mandatory, with Microsoft and a few selected partners holding all the keys. Windows 11 will set the tone, and Windows 12 will kill the indies; we will have to go through Microsoft, Steam, EA, Origin, or whatever stores are able to pass the bar.
The Steam folks are as worried about this as anyone. They have more to lose than we do, and the same amount of influence at Microsoft (read: none at all.)
I think for x86 you are (at least very close to) correct, but I've been seeing a serious shift towards ARM chips and (my pipe dream) soon RISC-V. x86 may just need to fall by the wayside at this point. Maybe PineBooks could be released with ARM chips too! (RISC-V as well, but peripherals need to be a real thing(TM) first.) Heck, the Pi 400 is a good start in that direction (though it obviously isn't enterprise-ready yet).
Honestly, I vastly prefer OpenPOWER over ARM from a consistency in implementation perspective. Not to mention, at least with POWER9, no mandatory binary blobs or opaque management engine chips that can't be properly audited.
OpenPOWER is extremely expensive though. There is no universe in which I would pay several times the price for a high-powered x86 gaming PC/workstation just to have an open-source CPU and hardware like Talos. It has to be both open and affordable.
I guess if the firmware allows users to install their own CA it's ok. I wouldn't be surprised if that feature was neglected by the OEMs or intentionally removed with windows 12.
To their credit, Microsoft has signed a secure boot "shim" that allows the user to do that, with explicit prompting. It's being used in the boot flow of many Linux distributions.
In other words, it was Microsoft who effectively "gave permission" for Linux to run.
One OS company has control over whether they allow competitor's OSs, on hardware that the company doesn't even produce. That should be an absolutely horrifying thing to anyone who believes in software freedom.
> You can disable secure boot and add your own keys in the firmware config page.
You can disable secure boot or add your own keys because Microsoft required all manufacturers to allow it. If it wasn't mandated by Microsoft, some manufacturers would not allow it. And for ARM devices, Microsoft required the opposite (https://softwarefreedom.org/blog/2012/jan/12/microsoft-confi...). So yes, the only reason we can run non-Microsoft operating systems on our computers is because Microsoft "gave permission".
Nothing wrong with freedom. Not running the master of all telemetry OS increases the possibility to read and study what you want without feeding the data hungry sensors Microsoft has set in stone for you. These datasets are brought to market with your consent (see the thousands of EULA pages you accepted directly and indirectly).
I'm on the Mac side, and I wanted to reinstall macOS. I messed up the hard drive wipe and ended up breaking the chain of trust. That meant Apple could no longer guarantee the correctness of my install, and no longer allowed my laptop to decrypt my data to reach the login screen directly. I had to input the password for my login at an earlier step during boot, which comes with a litany of small caveats.
I'm sure Microsoft hopes to achieve something similar here at some point: secure boot would give them enough trust to decrypt an install upon boot all the way to the login screen.
At least you could recover by simply typing a login password. A similar screwup on a Windows box might require you to resort to BitLocker recovery keys, which add a fair bit of complexity and some users might not have these at all.
Any "Windows x was more stable than Windows y" (including the legally mandated "No, it wasn't" replies) is mostly rooted in what drivers the person used and if the hardware had issues, and has less to do with Windows.
Reducing the complexity how? Their Legacy boot code is already written, and legacy BIOSes aren't going to change either. That code basically comes for free to MS.
Maintaining old code has a cost. Any changes Microsoft wants to make need to be compatible with the old code unless they remove it
Things like tests/validation on new hardware is also costly. Microsoft (used to?) have an absolutely massive fleet of physical hardware to test Windows on
What's even the point of enforcing these requirements when the OS seems to be running quite fine otherwise? Users who are running without SB or a compliant TPM will simply stay on Windows 10, and maybe stay on it past the official EOL date.
They're doing this to force people to buy new hardware and a new Windows licence. If they let you upgrade from Windows 10 for free, they don't make any money. They've already gotten people used to free updates, so they can't charge money for Windows 11 upgrades directly. Most people buy pre-built computers, so a Windows 11 licence will be included by default for most, so they will make more money.
This is silly. Microsoft doesn't even consider Windows to be their major priority in terms of money - they're investing much more heavily in Azure. They also have given free updates repeatedly, so this is an especially weird argument...
The reason they're doing this is because Microsoft doesn't control OEMs directly. They can't make Dell or whoever put in good hardware unless it's a hard-requirement to run their OS. They obviously want to start leveraging TPM 2.0, probably in order to properly compete with Chromebooks, which all require that tech already.
Chromebooks and GSuite are a meaningful threat to Microsoft - Google has a huge head start in that they've enforced much stricter restrictions from day 1 on Chromebook hardware. Microsoft is just getting aggressive about doing the same. And it's going to take at least 4 years for them to catch up, given that Windows 10 EOLs in 2025 at the earliest.
This fits far more into their business model of O365, Sentinel, and Azure than it does with their Windows business model.
edit: Expanding on this, TPM technology is critical to Zero Trust Networking, which I'm quite sure Microsoft is going to want to push - especially since Active Directory is getting ripped out of networks practically by government order at this point. If they follow through on this, in 4 years Windows networks could be radically more secure than they are today. This fits in well with where Microsoft is taking its business (cloud, security, organization support).
> They can't make Dell or whoever put in good hardware unless it's a hard-requirement to run their OS.
They actually can. They have Windows Logo program, which specifies conditions that your product has to comply with, if you want to qualify. OEMs like Dell want to qualify, that allows them to put the Windows sticker on the box.
How do you think Microsoft made the OEMs ship UEFI and Secure Boot in the first place?
Functionally, how would that change the situation? Sounds like it's just a different method of enforcement, with this perhaps being a stricter one that prevents maybe some sort of off-label selling of the OS?
They're also going to create a large number of new Linux users. This might also allow great hardware to be obtainable for pennies on the dollar. I don't like the direction this is going, but there can be good things to come from this.
Unless Linux PC's are 90% of shelf space at Best Buy/Amazon it won't. HN forgets that 95% of people are computer illiterate and would not be able to install Linux themselves.
First-gen Threadrippers are not supported by Windows 11. Freaking Threadrippers. If they think their owners are going to get new ones for Windows 11, they must be deluded.
Best guess is that Win 11 will require full-disk encryption at some point. With both secure boot and TPM, Microsoft will be able to lock down Windows in ways they simply couldn't before.
Like it's happening on the Mac, it's getting harder for the average user to screw up since software from "unidentified developers" can't run by default.
I can still run anything I want as a power user and that will not change on macOS and won't change on Windows.
That's an early build; maybe Windows 11 RTM will actually always use a TPM (though 1.2 is advertised as the minimum supported).
As for secure boot, I don't see how that could be anything other than policy (it can affect the security model and the associated security measures, granted, but not having secure boot should technically not prevent booting/installation unless enforced by an explicit artificial limitation). But they could at least remove legacy boot support, in which case it just won't work without UEFI.
Because they want to set a minimum configuration they have to test and support for the next ten years. It might work fine today, but will it work after five feature updates?
Yeah, you can run Windows 10 on some pretty ancient unsupported hardware too, but when they break support for a driver a couple years in, you end up with a nonworking machine.
Is Windows 11 likely to perform some sort of attestation for this? Or are we likely to see something like the way the old Windows cracks used to work - a hacked up version of Grub or another bootloader able to patch the necessary firmware and BIOS information before chainloading Windows.
Unless it's doing some sort of remote-code-download-and-execute(!) based on the attestation results, it will always be possible to crack everything locally. All the checks just need to be patched out, and finding them all is the hard part, but it is theoretically possible as long as you still have full control over the hardware.
But such a setup will be very fragile to automatic updates (which are already difficult enough to turn off completely as it is), and with this whole "update mentality" I wouldn't be surprised if they eventually leave in certain security holes and use those as an additional force to coerce people to take their updates --- along with everything else the users didn't want.
You're talking about a fantasy world that doesn't exist where Microsoft has gone to extreme precautions like you see on mobile devices and game consoles where there are constant attempts to steal the ownership from users.
In the actual world they don't even enforce disk encryption right now and correct me if I'm wrong, but currently having your BitLocker "recovery code" is enough to decrypt the disk on another machine and changing that would be a massive issue for many data recovery processes.
Unless they plan to do massive changes to the system this is very likely to not be a problem and cracking is likely to still be quite possible.
It's not as much "doesn't exist" but rather "hasn't been turned on". We can speculate/disagree on the future plans of course, but in general the whole process as described in the previous comment is in place. Enabling you to lock down the whole boot process is the goal of SecureBoot. Right now Windows is not preventing you from modifying the startup binaries and system permissions, but they could easily do that. Apple already partially did that with the forced encryption and T2.
These are not "massive changes to the system". Probably still a lot of effort, but closer to "MS now owns the ActualAdmin user who is the owner of system files; you get Administrator who can't touch them".
On Linux you can already prepare system like that fairly easily, with Arch wiki describing most of the steps.
You’re just replacing one set of problems with a different set of problems. Linux isn’t a panacea. It has its own set of meaningful issues that make it an unattractive option to a lot of people.
I do agree at least that not enough people ever make that consideration that should, but that kind of requires qualities that free software either doesn’t care about or is not competent at (ie marketing)
This is getting downvoted, but honestly? It’s kinda true. Normal tasks like setting up hardware (drivers), using HiDPI displays, and installing software can be a struggle. Many tasks require a user to jump into a terminal and run arcane commands (which sometimes have to be tweaked for your specific distribution - good luck). Even the user-friendly Linux distros can be very challenging.
I mean today compared to 10 years ago it is in a far better spot. But the FOSS nature of everything still makes it difficult to fix every nit picking gui bug. Right now there is an issue I've encountered with Manjaro where it just doesn't register a left click for some reason. No idea why but I need to restart to fix it if it happens. That is enough for any end user to write it off completely.
It's not like Windows is perfect though. For example my mum's PC stopped printing with last month's Windows update then started again with the next update.
It's definitely not. I really do wonder what would happen if Microsoft wasn't able to enforce OEM machines to have Windows or basically give the OS away for free. I think Linux would be a more than comparable OS today.
Why is it acceptable to market an OS in 2021 where throwing out your current machine is “plan A”? Why is no one challenging microsoft to come up with an e-waste strategy?
Has Microsoft publicly made it clear that the true reason for the TPM requirement is all about shepherding users into the Windows App Store? As in, users might be allowed to install non-App-Store programs for a while, but the operating system will complain, hassle them, and require going through several extra steps (a la macOS).
Some future Windows release will likely be locked to only allow Microsoft App Store-signed apps.
All this talk of TPMs has me wondering: most of the premise of a TPM is that it's basically a hardware cryptographic device that lets you generate keys and do various operations with them, but never extract the key itself, right? Then, if you do actually want a TPM-like device that behaves mostly like a normal one but lets you get at the keys through some backdoor, what's stopping you from making and using one? Some manufacturer (model? batch?)-specific private key programmed at the factory that lets it authenticate itself as a genuine TPM?
> Some manufacturer (model? batch?)-specific private key programmed at the factory that lets it authenticate itself as a genuine TPM?
Yes, AFAIK every TPM comes with a unique "endorsement key", signed by the TPM manufacturer, which can be used to prove that it's a real TPM from that manufacturer. A quick web search found https://tpm2-software.github.io/tpm2-tss/getting-started/201... which explains how it's used.
Remote attestation is for identifying yourself to a remote that has already agreed to trust you previously. If the remote has already agreed to trust device foo based on its TPM's EKpub, then foo's TPM can later prove that it's foo by signing something with its TPM's EKpriv. There still needs to be some prior out-of-band registration to register foo's TPM's EKpub with the remote.
It makes sense for, say, an organization that provides the devices its employees use, because the organization can pre-register those devices' EKpubs in its servers and refuse to acknowledge any device that can't attest. But in the case of Windows, presumably MS is not going to become the single source of all Windows computers.
At best, they might register your device's EKpub when you install Windows and create a MS account or something, but if you already had a backdoored TPM at the time, that backdoored TPM is what will get registered.
It is also possible that MS could require a TPM with an EK certificate that is chained to a set of CAs based on some popular TPM manufacturers. That would certainly prevent you from using any device that doesn't have a "real official" TPM, but I feel this would be quite overkill of MS to do. Then again I would've said the same about an OS that requires a TPM in the first place, but here we are...
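To make the shape of that EK flow concrete, here's a toy Python simulation: textbook RSA with tiny primes stands in for the endorsement key (obviously not real crypto, and nothing like the actual TPM 2.0 command set; it only illustrates the register-EKpub-once, then sign-a-fresh-nonce protocol):

```python
import hashlib
import secrets

# Tiny, insecure textbook-RSA "endorsement key"; real EKs are 2048-bit
# RSA or ECC keys burned in at manufacture.
p, q = 61, 53
n, e = p * q, 17                    # EKpub = (n, e), shareable
d = pow(e, -1, (p - 1) * (q - 1))   # EKpriv, never leaves the "TPM"

def tpm_sign(msg: bytes) -> int:
    """Only the device holding EKpriv can produce this signature."""
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)

def remote_verify(msg: bytes, sig: int, ek_pub=(n, e)) -> bool:
    """The remote checks the signature against the EKpub it registered."""
    nn, ee = ek_pub
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % nn
    return pow(sig, ee, nn) == h

# Step 1 (out of band, once): the remote records this device's EKpub.
# Step 2 (each session): the remote sends a fresh nonce, device proves itself.
nonce = secrets.token_bytes(16)
print("attested:", remote_verify(nonce, tpm_sign(nonce)))
```

Note the backdoored-TPM point from the parent still holds: nothing in this protocol distinguishes a "real" EK from one you generated yourself, unless the remote also demands a manufacturer certificate chain over the EKpub.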
Nothing prevents you from making your own TPM implementation. There are even TPM emulators, both at the software level - mssim / ibmswtpm2 (Linux-only) - and at the hardware level - virtualized TPMs for VMs (qemu, and I believe Hyper-V too).
I tried to go back to Linux in one of my laptops recently (old x220 I want to use for ECU programming). I used to be a gentoo user before their wiki disappeared.
It seems to me like Linux is going through a growing phase with several growing pains.
Why are my graphics working worse out of the box with common Intel Graphics than back in the day?
Bluetooth is a terrible experience.
Wifi doesn’t work 60% of time and I don’t know why, the UI won’t tell me.
These used to be issues in the past but I figured that 15 years of progress would’ve changed.
However I am aware that all these problems are open and it’s up to me to fix them and I feel no entitlement or have any expectations out of a free and open product. I wish I was better equipped to assist in fixing them myself.
Unfortunately I ended up putting Windows back on that laptop. For a while I used it as a hackintosh, and to be honest it worked better with High Sierra than it did with Manjaro or Ubuntu.
Windows 10 does have mbr2gpt.exe now, so migrating from Legacy to UEFI isn't a terrible experience, provided you do it from the recovery/boot/troubleshoot Windows screen.
There really shouldn't be a need for mbr2gpt.exe except as an instructional aid to see how they accomplish the deed.
Keep in mind that using an MBR-layout HDD (SSD) is specified to be fully supported by UEFI.
No need for a GPT-layout HDD to begin with, except on some screwy sub-specification UEFI firmware on crummy things like tablets, which have no Legacy CSM and which for a while were too defective to even recognize an MBR-layout USB device for booting.
Recently worked out the latest reference implementation of a dual-boot Windows/Ubuntu HDD that boots on "any" x86_64 hardware whether BIOS or UEFI, using the most recent W10 21H1 and Ubuntu 21.04 released over the last few months.
There's never been a reason to settle for less than a HDD which will boot on the widest variety of PCs that you might need to quickly physically transfer the HDD hardware over to. Just in case the PC fries in the middle of an important session and the HDD is still good, you need to be able to just remove the HDD and place it into whichever backup PC you might have available. In this type of emergency you really need a HDD layout that can accommodate the widest variety of new & vintage PC hardware.
Nothing less is an option.
Unless you want to admit that you haven't really tried to get maximum reliability out of fundamental hardware & software to begin with.
For partition 1 it still works great to have a plain ordinary FAT32 volume (which is preferred by UEFI and still works the regular old way for Legacy BIOS boot) and for that a 32GB size is the traditional maximum amount that is really comfortable for Windows98SE (and the FAT32+DOS it was based on). So I can actually boot to a W98SE DOS floppy if I needed to and (re)format partition 1. This is a bog-standard boot partition layout on the HDD, which will be used by both BIOS or UEFI, whichever one you need at the time.
At this point the HDD is so conventional that you can even install W98SE or at least its underlying DOS version if you tried but it will only boot with UEFI+LegacyCSM (or a real BIOS mainboard), and you may have to use IDE mode for the HDD to load W98 (like often needed with WXP), plus maybe even more changes to BIOS settings. DOS alone still works OK in SATA mode, and DOS can also handle USB drives recognized by the BIOS if they are plugged into the PC before you boot to DOS. No need for trying to load USB hardware drivers in DOS unless you need to plug in USB drives after you have booted the DOS PC. DOS handles FAT format volumes only, and not even NTFS. And this is on a high-performance layout for improved reliability operating the latest Windows (plus dual-boot Linux to boot) for current mainboards at the same time.
But normally you don't need DOS or W98 so I just make the first partition 30GB in size and format it FAT32 using Windows 10 or using the command line when booted to the W10 setup media.
Backward-compatibility in so many ways, exactly along these lines is the main thing that makes Windows worthwhile, without it why bother?
Last time I checked, mbr2gpt would only handle about a 1GB FAT32 boot volume, so if I did want a quick conversion that would seem helpful, but I have tested this app for reliability well within this limitation and it failed miserably even on simple layouts like I am posting here (using a much smaller FAT32 volume for testing). There is also another comment about a failed conversion on what is probably a regular Dell business machine. Not good enough for general use, as can be seen. So might as well treat your HDD, Windows, Linux and yourself to the roomy & useful 30GB FAT32 volume of your '90's dreams. Even if all we're really going to put there are some non-sizable boot files & folders, I still would use the full 30GB unless the HDD is smaller than about 120GB. Then any night you want to in the future you could actually party like it's 1999 if you get a wild hair.
Could also add a bootable live Linux distribution right there on the FAT32 volume independently of the Linux which will be fully installed to its own EXT-formatted volume later. Live distributions usually boot on FAT32-formatted USB sticks anyway. This would be another optional OS you don't really need on the (functionally hidden) FAT32 boot volume of the basic Windows/Ubuntu dual-boot HDD, unless the installed Ubuntu itself turns out to be unsatisfactory for something.
So none of that at this point either, no extra distribution(s), no W98, nor DOS.
Just a regular valid binary Master Boot Record with its accompanying partition table stored in sector 0.
The partition table defines a 32GB-or-less Active-marked partition1; this boot partition is formatted FAT32 and has the appropriate Volume Boot Record for either Windows or Linux bootability under Legacy BIOS.
On the FAT32 volume, boot files and all accompanying boot folders readable from the filesystem for both Windows and Linux, supporting BIOS and UEFI booting for either OS. Concentrated in this particular choice of default locations, all the boot files for everything can easily be backed up, restored, or manually modified even when booted to an OS as simple as ordinary DOS if perhaps that might be needed as a last resort or something. If needed, the entire FAT32 partition can easily & quickly be reformatted and the boot files & folders replaced from backup, without ever touching anything on the main Windows or Ubuntu partitions, each of which simply stands by waiting to be booted from the appropriate boot files, whether those boot files are executed by BIOS or UEFI. To support both Windows and Ubuntu for either BIOS or UEFI, that makes a total of four complete sets of independent boot files on the Active bootable FAT32 volume.
The same partition table also defines an NTFS partition2 and an EXT partition3 in addition to FAT32 partition1.
Windows ends up on NTFS partition2 except for its boot files and maybe the Recovery console folder on partition1.
Ubuntu gets completely installed to EXTx partition3 except for its boot files on partition1. No separate /swap, /usr, none of that.
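Assuming disk 0 and the sizes described above, the whole layout can be sketched as a diskpart script (sizes in MiB; the labels and partition 2's 100GB size are arbitrary examples, and partition 3 is left for the Ubuntu installer to format as ext4):

```shell
rem layout.txt -- run from the W10 setup media with: diskpart /s layout.txt
select disk 0
clean
convert mbr
rem partition 1: 30GB FAT32 boot volume, marked Active for Legacy BIOS
create partition primary size=30720
format fs=fat32 quick label=BOOT
active
rem partition 2: the Windows system volume
create partition primary size=102400
format fs=ntfs quick label=WINDOWS
rem partition 3: the rest of the disk, for Ubuntu
create partition primary
```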
Works like a charm and in Windows 10 the defects which could cause difficulties when physically moving the HDD to a different PC have been largely overcome. You may need to go into safe mode when including IDE mainboards or jumping to way different graphics, but usually any new drivers needed are autoloaded and you can do what you were doing as long as the alternative mainboard is not lacking some unique showstopping hardware your apps need.
Both Ubuntu and Windows are continuously improving the reliability of physical HDD relocation, and Windows actually seems to be pulling ahead of Linux in this respect, but it's still neck & neck.
Of course this is only a fundamental layout for a full stand-alone PC without any dependence on web access or even networking to install, boot and be fully functional. Taking this into consideration in your approach when it comes to file management and things like that, you can join networks and webs to taste while maintaining full functionality during times when ethernet and wifi are disconnected, whether the disconnection is intentional or not. I actually enjoy plugging & unplugging my ethernet cord without any hesitation all the time. That's how I got completely comfortable leaving it unplugged almost always when in Windows except when I really need it.
There is a stepwise outline and a full installation procedure to build the Windows/Ubuntu HDD if there is any interest.
It's very probable that Windows 11 will run on your P51. You may have a warning advising you to stay on Windows 10, but it will fit the hard floor so you will be able to upgrade regardless.
IIRC the hard minimum reqs that differ from Win 10 are a TPM, a 64-bit CPU with at least two cores, I think UEFI + Secure Boot, and WDDM >= 2.0. I just checked on a Kaby Lake and I have everything needed.
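A rough way to check the same items from an elevated PowerShell session (Confirm-SecureBootUEFI errors out on a Legacy BIOS boot, which is itself an answer; the WDDM version is easiest to read from dxdiag's Display tab):

```shell
Get-Tpm | Select-Object TpmPresent, TpmReady     # TPM present and initialized?
Confirm-SecureBootUEFI                           # True => booted UEFI with Secure Boot on
(Get-CimInstance Win32_Processor).NumberOfCores  # want 2 or more
[Environment]::Is64BitOperatingSystem            # want True
```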
The published list of processors is probably for the soft floor and/or for OEMs.
Now, knowing MS, and especially the situation with some processors following the Win7 -> 10 migration, there is always the risk they fuck up support even more for unlisted processors, voluntarily or not...
I think wait and see. That kind of CPU req published now sorta makes sense for OEMs, but for computers to upgrade, just WTF. Does MS really want an install base split between mostly Win 10 machines and just a few on 11? And BTW there is this "little" shortage and high demand that is supposed to continue for years; and Skylake or even earlier chips are (let's say, at least for Core ones: more than) perfectly capable of running Win 11.
So a hard cut-off at Intel 8th gen and Ryzen 2nd gen really makes no strategic sense whatsoever. I have no doubt MS employs plenty of clueless people, but maybe not going that far to insanity-land.
I mean I was starting to consider switching to a Mac, undecided still, but if MS persist this will really be a no-brainer.
The specs, especially for CPUs, are aimed at manufacturers and never list discontinued CPUs.
If you look at the same specs for Win 10 you would assume it needs modern hardware, but it does not; it runs on a 10-year-old laptop.
I tested the leaked version on a single-core CPU with 1GB of RAM and it just works (probably not enough RAM to install updates, but it boots).
Are you sure it doesn't have PTT? AFAIK recent-ish intel CPUs should have TPM support using the trusted computing capabilities of the CPU itself, without the need for a discrete TPM chip.
Microsoft has clarified on Twitter of all places that TPM 2.0 isn't the only requirement. It must be 8th Gen Intel or 2nd Gen Ryzen or newer regardless of whether it has a TPM.
>Firmware TPMs are firmware-based (e.g. UEFI) solutions that run in a CPU's trusted execution environment. Intel, AMD and Qualcomm have implemented firmware TPMs.
For the Legacy BIOS piece, has anyone tried using Clover? It's a bootloader designed for Hackintosh systems. macOS is and always has been EFI-only on Intel computers, and when Clover was released EFI was still uncommon on PCs. So, Clover has its own EFI implementation that can be started from a BIOS boot.
The problem is that Windows updates overwrite the EFI partition very frequently. I have updates disabled in my dual boot hack in case I accidentally pick Windows at boot. I have Windows just in case I mess something up on the Mac side and I NEED to get something done
I wonder if an AME version of W11 will get released. Requiring me to have a Microsoft account to log into my own computer is a do-not-pass-Go unacceptable condition.
I thought they said that Windows 10 would be the only version of Windows forever, and that everything would be updates of Windows 10. Did they change their mind or did I misunderstand in the first place?
It is a free update. That quote is out of context and has become a meme.
The continuous build integration system for windows makes it so that new builds come out every week. Under the hood W10 and W11 are the same thing barring UI refresh and regular feature updates.
It's Windows 11 because marketers know most people are technically illiterate and want higher version numbers. It's mainly about the UI refresh to pull more people away from Apple and build out the MSFT store.
This has nothing to do with Apple. Big Sur, released in 2020, cannot possibly have any impact on the naming of Windows that started in 2009 with Windows 7.
Big Sur is when they went from "everything is 10" to 11. The accusation is that moving off of TenVer is following them, as was getting onto TenVer in the first place. And Microsoft was definitely on TenVer. If they were merely incrementing from 7 they wouldn't have skipped 9 and they wouldn't have stuck on 10 nearly as long.
You don't know why they skipped 9? Your argument makes no sense.
The comment above is 100% correct.
Just because Apple left v10 it doesn't absolutely mean Microsoft is just doing it to copy it. You're talking absolute nonsense. Apple doesn't even call it macOS 11 publicly so it doesn't even make sense from a marketing perspective.
They skipped 9 because they thought ten sounded better. Because TenVer, which is what apple was also doing.
Got some other reason in mind? I'm very confident it wasn't backwards compatibility, by the way, so please name a specific program that would have errored if you claim that. Windows lies about its version to old programs.
Hi I already addressed that in my post. That code would not fail.
The only plausible version of that I've seen was Java code, and the Java runtime would not have returned "Windows 9" to that API unless deliberately changed to do so.
Not to be rude, but did you skip the second line of my post after reading the first one? Your reply confuses me if you didn't.
Yes I agree that the public reason for skipping Windows 9 doesn't hold. If marketing wanted to release Windows 9, the Compatibility tab is well capable of handling old programs that look for Windows 9*.
Yes because the best way to tell people to run 15-year-old software is to tell them to visit a UI they’ve never seen in their life as opposed to, say, just skip a version number.
No need to visit anywhere, compatibility shims are not only deployed regularly in Windows updates, they're also automatically enabled when heuristics indicate a compatibility issue.
I usually upgrade my computer every 3 to 5 years. It won't run Windows 11 because of the CPU, but I can buy a new CPU and motherboard and RAM for a faster system that can run Windows 11 and transplant my other hardware to it.
I think Microsoft wants people to buy newer hardware so OEMs can profit from it.
I remember when a Pentium 4 could run Windows 7, 8.1, and 10 in the 32 bit edition.
Also you can "deploy" Windows 11 on any SSD as Windows To Go and boot from it directly if you just want to test it out on real hardware. All checks are skipped this way, and you can put the SSD in any toaster; as long as it has a 64-bit CPU it will most likely run.
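A sketch of the manual way to do that, assuming the Win 11 ISO is mounted at D: and the SSD has a FAT32 boot volume S: plus an NTFS volume W: (drive letters and the image index are examples; check the index with dism /Get-WimInfo):

```shell
:: elevated Command Prompt: apply the image straight to the SSD's NTFS volume
dism /Apply-Image /ImageFile:D:\sources\install.wim /Index:1 /ApplyDir:W:\
:: then write both BIOS and UEFI boot files to the FAT32 volume
bcdboot W:\Windows /s S: /f ALL
```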
Out of curiosity, would Win 11 install on a system with UEFI, Secure Boot and TPM2 enabled, but where you supplied your own keys, or does it require hardwired keys from Microsoft?
What I am most concerned about for Win11 is privacy... I suspect we are going to see some major consternation on that front as soon as Microsoft can be cornered on it.
Looks like Microsoft will be getting a lot of people to move over to Linux. My PC is more than capable with 2 Xeon CPUs, but my Dell T5500 has no Secure Boot!
Can't you just install it in VirtualBox and copy and boot from the created virtual disk? I was just thinking about this, because I needed Windows 10 on an old laptop.
Yep. i7 6700K on a relatively-high-end Asus matx board. No cpu support and no TPM slot. Kinda crazy, since it’s plenty powerful - I’ve felt no need to upgrade the processor or motherboard.
You’d think that they could build a USB TPM or something.
TL;DR: Put the Windows 10 installer on a USB drive, then replace sources/install.wim with the one from the Windows 11 ISO. Boot off that USB drive and it'll install Windows 11.
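In commands, with example drive letters (D: = the mounted Win 11 ISO, E: = the Win 10 installer USB):

```shell
copy /y D:\sources\install.wim E:\sources\install.wim
```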
https://linustechtips.com/topic/1351028-microsoft-makes-thin...