No. There's still -important- stuff missing, and "works but you have to stand on your hands while singing" kinds of things. It's nowhere near friendly to an end user unless someone picks and sets up the hardware and the software for them.
> and "works but you have to stand on your hands while singing" kind of things.
You mean "works if you don't actively try to break it".
Most stories about Linux breakage involve esoteric distros (especially bleeding edge "rolling" distros or one-man Ubuntu-derived distros) and users trying to "customize" shit they shouldn't.
Can confirm, I try it every couple of years and the jank and crashiness/glitchiness (application-level, not the kernel) still make me regret it in a matter of hours. I ran Linux for years on laptops and desktops but just had no idea how much time I was wasting, working around things it couldn't do, fixing broken shit, or just living with some stuff not functioning correctly. Every time I go back I find it's the same. The whole foundation of the GUI stack and the UI toolkit(s, which is part of the problem) needs a tasteful, high-quality, ground-up rethink, and Wayland sure as hell ain't it.
Linux is problematic on hardware designed for other operating systems. My Librem 15 works flawlessly, including suspend and WiFi (without even any binary blobs).
The biggest thing keeping me from using Linux for my desktop PC right now is the lack of HDR support.
Apart from that, the whole display server world is just not working well. Some things still don't support Wayland, you can't share your screen easily, X has its own issues, etc. HiDPI is still problematic with both; you get weird scaling issues with apps and the occasional humongous mouse cursor. Using multiple displays with different DPIs, you again get scaling issues. Setting up multiple displays still requires fiddling with xrandr, Xinerama, etc.
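To give a taste of the fiddling: here's a minimal sketch of the usual mixed-DPI workaround on X11, driven from Python. The output names (eDP-1, HDMI-1) are assumptions for the example; check `xrandr --query` for yours.

```python
# Sketch of the mixed-DPI X11 workaround: render the low-DPI external
# monitor at 2x via xrandr's --scale so both screens share one logical
# scale factor. Output names here are hypothetical.
import subprocess

subprocess.run([
    "xrandr",
    "--output", "eDP-1", "--auto", "--pos", "0x0",  # 4K HiDPI panel at 1:1
    "--output", "HDMI-1", "--auto",
    "--scale", "2x2",              # pretend the 1080p monitor is 4K
    "--pos", "3840x0",             # place it to the right of the laptop
], check=True)
```

And even then, fonts, cursors, and some toolkits disagree about the result.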
Just let me know if you need more examples. I tried to fully switch to Linux many times, spending lots of time + money (bought one of the "recommended" devices).
> The biggest thing keeping me from using Linux for my desktop PC right now is the lack of HDR support.
HDR is just a marketing term for software that increases the colour and black saturation of content beyond the actual input. For photography that means more saturation than what you see in real life, for video that means tweaking it after you receive the signal.
An OS can't "support" or not support HDR; if a video has HDR and your display has the dynamic range then you'll see it in "HDR". Unless you actually want your OS to post-process everything on the screen to have marginally higher contrast than normal?
> HDR is just a marketing term for software that increases the colour and black saturation of content beyond the actual input. For photography that means more saturation than what you see in real life, for video that means tweaking it after you receive the signal.
Not even remotely correct. It is definitely not a marketing term. HDR is a set of specifications and formats for capturing, storing, and displaying visual data with a higher-than-"Standard" dynamic range. It basically carries more data and it requires proper color management. It has to be supported by the relevant drivers and the display server. Right now there is no way to view HDR content (video, photos, or games) under Linux.
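To make "more data" concrete, here's a minimal sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 and Dolby Vision use. It maps a 10-bit code value to an absolute luminance in nits, with a ceiling of 10,000 nits, where SDR reference white sits around 100 nits; this is why the whole pipeline, drivers included, has to understand it.

```python
# Sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10 / Dolby Vision:
# decode a normalised code value into absolute luminance (nits).
m1 = 2610 / 16384          # constants from the ST 2084 spec
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Decode a PQ code value (0..1023 for 10-bit) to luminance in nits."""
    v = code / (2**bits - 1)   # normalise to [0, 1]
    vp = v ** (1 / m2)
    return 10000 * (max(vp - c1, 0.0) / (c2 - c3 * vp)) ** (1 / m1)

print(round(pq_to_nits(519)))   # ~100 nits, roughly SDR reference white
print(round(pq_to_nits(1023)))  # 10000 nits, the PQ ceiling
```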
If you are interested: there is some effort going on to enable this, but it will probably take a couple more years.
"HDR" is just metadata to pass along to the device to tell it how to optimally adjust the colours (or not adjust them). It's main use seems to be so that HDR displays can oversaturate "SDR" content in HDR mode.
> carries more data
Not really. The metadata isn't more colours or something, it just passes info to the device about certain aspects of the video.
HDR content will be HDR if the display has enough range regardless of whether or not the OS has HDR "mode". It's more about adjusting all the other content.
Dunno if you ever had a plasma TV back in the day, but they could display way more contrast than the LCDs of the time, so you could adjust the settings to change the way all content looks. HDR metadata is basically that, but on a per-video basis. Useful enough, but not some game changer... which is why there are tons of articles about what HDR does and doesn't do.
Edit - where HDR really is great is photography... You take multiple photos of the same scene with different ISO and shutter settings, then stitch them together for more colours than would otherwise be captured. But a screen can either display colours or it can't.
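The stitching part is a few lines these days; here's a sketch using OpenCV's Mertens exposure fusion (the filenames are made up, and Mertens is just one of several ways to merge a bracket):

```python
# Sketch of "stitch the bracket together" via Mertens exposure fusion.
# Note the output is tone-mapped back to a normal 8-bit image, which is
# the point: the extra captured range still has to fit the screen.
import cv2

bracket = [cv2.imread(f) for f in ("under.jpg", "normal.jpg", "over.jpg")]
fused = cv2.createMergeMertens().process(bracket)  # float image, roughly [0, 1]
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```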
You're getting somewhere, but that's still not complete. The HDR formats we have today use 10-bit or 12-bit depths, so that alone is a big difference. As for the metadata: HDR10 uses static metadata, meaning the whole piece of content shares one set, while HDR10+ and Dolby Vision use dynamic metadata.
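To put numbers on the bit-depth part, per colour channel:

```python
# Steps per colour channel at each bit depth; the jump from 8-bit SDR
# to 10/12-bit HDR is a big part of the extra data being carried.
for bits in (8, 10, 12):
    steps = 2**bits
    print(f"{bits}-bit: {steps:>5} steps/channel, {steps**3:,} colours")
```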
HDR content "will not" be HDR if you're playing it under Linux. It will however be HDR if you play it under Macos or Windows (keeping everything else like the screen and content same). There is no way around this right now. So in basic terms you will not see the dark blacks and bright whites under Linux.
> Edit - where HDR really is great is photography... You take multiple photos of the same scene with different ISO and shutter settings, then stitch them together for more colours than would otherwise be captured. But a screen can either display colours or it can't.
It's the same for video or games, really. With HDR you can record and display more detail and contrast.
And that HDR photo you just described will not be shown as HDR under Linux, even if your display supports HDR.
HDR as we know it today has always meant 10-bit colour. Before that, some screens emulated HDR in 8-bit colour by dithering, to -poorly- create the illusion of a wider luminance range (toy sketch below).
edit: and 10-bit colour support on its own is not enough for HDR. I mean, I don't know why we are even discussing this, honestly. HDR under Linux is not supported at all. It's not an opinion; it's an objective fact which you can confirm in a couple of different ways. You cannot view HDR content if you're using Linux, no matter what content or display you have. There is an active effort in the community to get HDR support into Linux, but it will probably take a couple more years.
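The dithering trick, as a toy: quantise 10-bit values down to 8-bit, but add sub-step noise first so neighbouring pixels average out to the in-between shade (real panels do this temporally, a.k.a. FRC):

```python
# Toy sketch of dithering 10-bit values down to 8-bit: naive truncation
# bands, noise-before-truncation trades the banding for fine grain.
import numpy as np

rng = np.random.default_rng(0)
ramp_10bit = np.linspace(0, 1023, 1920).round()      # smooth 10-bit gradient

naive_8bit = (ramp_10bit // 4).astype(np.uint8)      # visible banding
noisy = ramp_10bit + rng.uniform(0, 4, ramp_10bit.shape)
dithered_8bit = (noisy // 4).clip(0, 255).astype(np.uint8)
```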
Multiple displays have worked out of the box for me for over a decade. Nothing fancy here, just X11 with the plain-jane MATE desktop (and GNOME 2 before that).
Multiple displays work well if both displays have the same or similar DPIs. When they're different, like one is HiDPI and the other one is not, you get scaling issues.
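Concretely, with typical (made-up but realistic) panel sizes, since X11 exposes a single global DPI/scale:

```python
# DPI for two common panels side by side: one global scale factor
# cannot be right for both at once.
from math import hypot

def dpi(w_px: int, h_px: int, diagonal_in: float) -> float:
    return hypot(w_px, h_px) / diagonal_in

print(dpi(3840, 2160, 13.3))  # ~331 DPI HiDPI laptop panel
print(dpi(1920, 1080, 24.0))  # ~92 DPI external monitor
```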
I second the "no". I have given Linux 3 tries (several months, and a lot of patience) over the last 10 years and each time I've had to ship back to windows at some point. It's not there.
Ever checked in on the youth lately? It's all smartphones, game consoles, iPads.
Students will use MacBooks/laptops/Chromebooks for school if their course requires one, but "devices" are the main computing platform nowadays. Heck, even many non-technical business people have switched to iPads. Kiosks? Tablets.
The PC is no longer "the" personal computing device...