There's a reason Microsoft is aggressively deprecating "older" CPUs that work perfectly fine. Heck, I have one laptop with Windows 11 that worked great, but won't update from 22H2 to 24H2 because CPU support was dropped between versions, leaving me with only the glib suggestion from the Windows Update UI to "Buy a new device".
Ironically, installing Windows 10 and activating ESU would lead to longer hardware life.
Of course, I didn't. Instead, I installed Linux on that laptop too. My partner had no issues switching.
TPM wasn't the only reason older CPUs were dropped. The biggest factor in where Microsoft drew its line in the sand for Windows 11 support was Spectre/Meltdown [0] mitigation. Windows 10 added a bunch of intentional slowdowns to mitigate that disaster, and people incorrectly blamed Windows 10 for being slow rather than the CPUs and their CVEs. Windows 11 seems to have wanted a clean slate: no slowdown mitigations in the codebase, and a whole class of "Windows 11 is slow on my machine" complaints eliminated.
I'm not sure Microsoft took the best approach. I might have opted into a "Windows 11 Slow CPU" SKU if it was marketed right. That might have been a little kinder than "all these CPUs with this awful series of bugs are trash, even though we have a successful workaround".
Yes, Microsoft is blocking CPUs which lack the ability for Virtualization Based Security. Given OS security is important to Microsoft (surprise, I know), enforcing VBS is a priority.
I think "chooses to" is doing a lot of work there in your understanding. Spectre exploits were found in the wild even in JS code submitted to ad networks. I suppose a user could choose to uBlock all ad JS and never visit webpages they don't trust. Those are choices, sort of.
But also that's a bit victim blaming isn't it? Do you want to explain to your grandfather or partner or child "Oh sorry, you had a password stolen because you chose to visit Google.com on a day where Google let an ad buyer attach Spectre exploit malware"? (Google could also chose to not let ads attach JS at all, but that's a very different problem.)
Computers have millions of places they get code from to run. Is "your CPU has a data leaking bug in it" the user's problem or the OS's problem? When there's a mitigation the OS can manage? When security-in-depth is an option?
I installed Bazzite on my own old Desktop not supported by Windows 11. One of the first things the Linux kernel spits out on boot if I have the boot console up is about running with Spectre mitigations. The Linux kernel also thinks it is important to mitigate (as Windows 10 did, but Windows 11 doesn't include and so doesn't support this old Desktop).
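For the curious, you don't even need the boot console to see this: on a modern Linux kernel, the mitigation status the kernel applied for each known CPU vulnerability is exposed under sysfs. A minimal way to check (paths are the standard kernel interface; exact entries vary by kernel version and CPU):

```shell
# Print the kernel's per-vulnerability mitigation status for this CPU.
# Each file is named after a vulnerability (spectre_v1, spectre_v2,
# meltdown, ...) and contains "Mitigation: ...", "Not affected", or
# "Vulnerable". 2>/dev/null in case an entry is unreadable.
grep . /sys/devices/system/cpu/vulnerabilities/* 2>/dev/null
```

On an older Intel or AMD part you'll typically see `Mitigation:` lines for spectre_v1/spectre_v2, which is exactly the work Windows 10 was also doing and Windows 11 declined to carry forward for those CPUs.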
Sure, I might have been a wee bit too bitter writing that.
The point I want to make is that allowing remote code execution is such a big attack surface that it makes all the other security measures look silly, which suggests that signed execution contexts are, in themselves, an attack on privacy and control.
If there were any actual security concerns, there could be a push for server-side rendering or something.