I just started a new job where I'm subjected to Windows 11. They gave me a behemoth of a laptop. 64GB of RAM, absolute screamer of a CPU, big GPU, the whole deal.
Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all.
Context menus take a noticeable amount of time to appear.
I'm getting used to a new keyboard, so I keep hitting Print Screen by accident. Half the time I can smack Esc and Snipping Tool will go away. The other half of the time, I have to mouse over and click the X to close it. There is no pattern to when Esc does/doesn't work.
If my computer goes to sleep, WSL becomes unresponsive. I have to save all my stuff and reboot to continue working.
If Windows 11 struggles this badly on a brand new laptop that I'm certain would retail for $4000+, I can only imagine how miserable it is for everyone else. All my colleagues who have been here for a bit longer got last-generation laptops. oof.
Edit... and besides, what does Windows 11 even do that KDE Plasma 5 wasn't doing a decade ago? How did it take this long to get a tabbed file browser?
> Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all.
> Context menus take a noticeable amount of time to appear.
I can almost guarantee this is from some endpoint management software your company installed.
I have a Windows 11 workstation that I use all the time for some CAD software and the occasional game. Everything is fast. There's no lag with context menus or browsing directories with a lot of files.
If I have to browse network CIFS shares with a lot of files, Windows does it better than my Mac or Linux boxes by a mile. I've switched over to Windows a time or two just to deal with high-file-count shares.
> If Windows 11 struggles this badly on a brand new laptop that I'm certain would retail for $4000+, I can only imagine how miserable it is for everyone else.
I put Windows 11 on an old low powered laptop for a family member. FYI you can easily circumvent some of the Windows 11 requirements and put it on old hardware.
It's fast. It doesn't have any of the problems you're describing.
I do wonder how many of the "Windows 11 is painfully slow" comments are coming from people with corporate laptops with extremely laggy endpoint management overhead.
>> Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all
> I can almost guarantee this is from some endpoint management software your company installed.
You can repro this on demo Surface laptops at Costco. It’s not a good look when expensive laptops render their darn File Explorer slowly.
Also, re endpoint management: corporate Macs also have endpoint management and still provide a better experience than corporate Windows PCs.
Microsoft isn’t a mute participant in the corporate device market. Their recommendations and best practices carry enormous weight. Windows division can work with security vendors and customers to improve UX. But they maybe haven’t done enough. Maybe because Windows is an increasingly small fraction of Microsoft’s bottom line? Who knows.
But today you’ll see increasing numbers of Macs in even super-Windows-heavy workplaces, especially in digital/cyber/AI/leadership roles. That’s not a one-company quirk.
I don't share his experience entirely, but even on my desktop built for gaming I can notice the right-click menu is delayed in comparison to Windows 10. Even more heinous, before you remove it, the AI button would lazy-load, causing you to sometimes hit it by accident when you mean to hit something else. God forbid I'm not 80 years old and click my menus with any sort of speed.
Also, if I'm going to have to adjust anything to use an operating system, I might as well use Linux. The only value prop for me to use Windows was gaming, but at this point I'm just completely ripping the band-aid off because it doesn't seem like Microsoft is going in a better direction.
> I can almost guarantee this is from some endpoint management software your company installed.
I disagree. I've got windows defender as the only endpoint software on both my daily driver machines, and I see the same issues.
In 2019, I was working for a place that installed Carbon Black on my desktop and it went from fast to unusable overnight. I've since changed jobs, and I've seen a decay in the baseline of the OS over the last 6 years.
> I can almost guarantee this is from some endpoint management software your company installed.
I know you're getting hammered on this, but this is also an indicting statement. If your brand-new OS requires you to have endpoint management that locks it down so much that it affects how long it takes to open files, that's on the OS, not the endpoint management.
Okay, it's on both... endpoint management as a rule is horribly written software, which is shocking knowing how intrusive it is into the system. But, if the OS has so many vulnerabilities that you're required to have endpoint management, that's not a good look on the OS.
My current and former $JOB both required endpoint management on Macs (and a limited amount for folks who used Linux), so it's not a blanket statement. But the impact of the endpoint software on Mac and Linux was still much lower. That is, once I figured out that a certain (redundant) enterprise firewall was crashing my work Mac anytime I plugged in a USB network adapter.
>>> Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all.
>>> Context menus take a noticeable amount of time to appear.
> I can almost guarantee this is from some endpoint management software your company installed.
This can also be due to OneDrive / Sharepoint / Teams etc. Which I suspect supports your point.
A hard lesson I learned was that cloning a git repository into a directory managed by OneDrive is a recipe for interesting behavior.
Windows seemingly hates many tiny files, even in sharded directories; many ecosystems have suffered because of this: node_modules, .git, the examples are many.
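If you want to put a rough number on that claim, a minimal sketch like the one below (file count and contents are arbitrary choices, and results will vary wildly with the antivirus and filesystem involved) makes the difference easy to measure on Windows vs. Linux on the same disk:
    import pathlib
    import tempfile
    import time

    # Create and then delete many tiny files - roughly what unpacking
    # node_modules or checking out .git objects does to a filesystem.
    N = 10_000
    with tempfile.TemporaryDirectory() as tmp:
        root = pathlib.Path(tmp)
        start = time.perf_counter()
        for i in range(N):
            (root / f"f{i:05d}.txt").write_bytes(b"x")
        created = time.perf_counter()
        for p in list(root.iterdir()):
            p.unlink()
        deleted = time.perf_counter()
    print(f"create {N} tiny files: {created - start:.2f}s")
    print(f"delete {N} tiny files: {deleted - created:.2f}s")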
Yeah, I remember trying to delete a fully loaded Python installation that had found its way onto a OneDrive-managed folder. After a chat with IT, I learned that OneDrive can only delete X number of files at once. We agreed that the most practical solution was for me to spend an hour deleting files by hand, and choose another drive next time. Fortunately I don't really depend on OneDrive as a backup, since GitHub does that job well enough.
The other thing is that both Git and OneDrive are in some sense fiddling with your file system at once.
If I were an assuming feller I'd "almost guarantee" that you haven't been blessed/cursed with anything besides Windows 11.
A lot of my beef, personally, can be chalked up to Windows' aggressively long animation times. It's serviceable with them turned off. But even with animations turned off on an aggressively debloated consumer PC there is either a notable delay or a perception thereof in context menus and file explorer that did not exist with Windows 10, or on my Linux machines.
Speaking of animations, it’s shocking to me how bad they are.
I turned on hiding the taskbar the other day. I don’t think they’ve changed it since Windows 95. I have a modern gaming laptop, and the animation is purely linear, no acceleration. It feels so weirdly unnatural. Even worse, it’s not smoothly animated! I have a 120Hz monitor but it seems to be animated at 5fps.
Nobody on the Windows team seems to give a single shit at all.
The Win 11 look and feel felt like a rushed, me-too kneejerk reflex.
From the comment you're replying to: "Windows does it better than my mac or Linux boxes by a mile"
So I wouldn't assume they've only used Windows. FWIW I also primarily use Windows 11 currently, but have also used other OS'es. I've experienced frustrations with all of them. Just because it's fast for you doesn't mean it's fast for everyone, and vice-versa. I could certainly buy that more people are having problems with 11 than they did with 10, though it hasn't been my personal experience. Just saying we shouldn't assume our own experiences are universal.
The irony of that first line might be lost along the wire because I explicitly called it an assumption where the gp did not.
"I have a Windows 11 workstation ... There's no lag with context menus or browsing directories with a lot of files."
You have the same Windows updates as everyone else and it will be painful. Also you should be keeping those CAD and games up to date and that will be very painful. Updates often happen at unfortunate times.
The Win 11 start menu has managed to be worse than the Win 10 effort and jumped to the middle of the task bar because ... reasons. Search on it is ever so slow. For some reason Win Server 2025 has decided that I want to use a Welsh keyboard (I'm English and tend to en_GB) when I RDP to one. Cymraeg (soz if I got "Welsh" wrong) is alphabetically first in the en_GB list of keyboard mappings and I didn't even know there is a Welsh keyboard! I suppose they must have some accents and diacritics not found in English. It's all just a bit weird that a bug like that surfaces after well over two decades of me using RDP from a Linux box to a Windows server.
You wag your finger at endpoint management in the same way that most software vendors used to do at AV back in the 90s and 00s (and 10s and 20s!). It's nothing new and basically bollocks! Modern AV is very good at being mostly asynchronous these days, and besides, we have unimaginably faster machines now: very fast CPUs, gobs of RAM and SSDs. Copy a multi-GB file and yes, AV will take a while, but at least you might be saved from nasties.
There is a good reason that corp devices have to run things like inventory agents, log shippers and the rest too - it's about security. You doing your own IT security is fine and I'm sure you'll be fine.
You can get Win 11 to work on an old machine for now but as you say, you have to circumvent things. When you do that, I think you are storing up issues for later. Perhaps you will be lucky but perhaps not. My dad will soon be rocking Linux instead of blowing a grand+ on a new PC. He will get a secure booting Ubuntu based effort that looks quite similar to Win 11 that is fully supported by the vendor ... and me. I managed to "port" my wife some years ago and she is a much tougher proposition than my dad!
> Also you should be keeping those CAD and games up to date
Not OP, but why? I have a perpetual license and a 12-year-old copy of a corpo CAD package and it works fine. I see no reason to compulsively update something that's feature-complete and functional.
Updates break shit or make shit worse for me all the time. See: Windows 11, macOS Tahoe, and KDE next year when they drop my working X11 session and expect me to use busted-ass Wayland that's missing functionality I use daily.
Why do I need updates? "Security?" I'm not exactly a nation-state hacking target. I don't run random pirated software. I'm firewalled to hell, and behind CGNAT on Starlink. I'll keep my browser up to date, fine, but I'm still running -esr.
"that's feature-complete and functional."
I get where you are coming from. That was my stance roughly 20 years ago too. I also note that you are quite clearly not daft!
You and I have different "jobs". I worry about thousands of systems on many sites, one of which is my home. I'm an IT consultant and am the managing director of my company. I think you are an engineer, perhaps retired ("12-year-old copy of a corpo CAD package and it works fine")
If Solidworks, Catia, AutoCAD or whatever (?) works then fine. You might like to firewall off whichever vendor's website/security systems might want to stop a 12 year old copy of a corpo CAD from working if it isn't licensed. It probably is because all of the above generally need a license service.
I worry about many 1000s of PCs and I think updates, patches etc are a good idea. If you are an engineer, then you will have to do your own "deploy, fix issues" cycle. IT is just the same.
>> Context menus take a noticeable amount of time to appear.
>I can almost guarantee this is from some endpoint management software your company installed.
It seems to be a common complaint online, dating back to the launch of 11. I see some of the blame being put on extensions but what changed in the extensions between 10 and 11 to cause this?
I know on my work computer I was experiencing this, plus I almost always have to click "Show more options" and wait for that lag to finish.
I was able to edit the registry to show the full menu at the cost of one lag... so I guess a step forward?
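For reference, if that's the classic context menu tweak that gets passed around online, it's an empty InprocServer32 default value under a specific CLSID. I can't say that's exactly what was done here, but a rough sketch with Python's winreg would look like this (restart explorer.exe afterwards):
    import winreg

    # Widely circulated tweak: an empty (Default) value under this CLSID is
    # reported to bring back the full Windows 10 style right-click menu.
    KEY = r"Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32"
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as key:
        winreg.SetValueEx(key, "", 0, winreg.REG_SZ, "")  # set the empty default value
    print("Done - restart explorer.exe or sign out/in for it to take effect.")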
I guessed the same thing. Probably the fault of employer-mandated software gumming things up. But since the only reason to run Windows is because my employer mandates it, almost all of my Windows experiences involve enterprise-managed lag in the extreme.
It may not exactly be Microsoft's fault, but it piles nicely onto a predisposition against them and all pro-MS IT departments who can't seem to tie their own shoelaces. It takes a maturity that is sometimes lacking in moments of frustration not to blame all the world's problems on MS.
I have similar suspicions. I have a decent but not spectacular company Thinkpad. When I first got it, it was super-fast; it didn’t matter that sleep very quickly turned into an automatic shutdown, as it booted in mere seconds.
Gradually, over the past 9 or so months, it’s just become progressively worse and worse in a range of ways. It might be Windows updates, but the magnitude makes me suspect it’s layer upon layer of corporate management and security nonsense.
Could also be a temperature throttling problem caused by dust or a stuck fan. My old work Laptop suffered from that, and recovered after I cleaned it.
The endpoint stuff kills laptop performance. I left my previous job and they let me keep my X1 Nano (1st gen; 16GB memory) which was performing abysmally towards the end.
Deleted all the partitions and did a 100% clean install (multi boot Win11/Fedora), and it's suddenly what feels like 2-4x as fast. Made sure to disable some of the Copilot and Internet content in search menu rubbish etc with a few registry tweaks (yay for having admin access to get rid of the bloat/junk).
Fedora/Wayland/Plasma still feels faster though - I just had some issues getting my video to work properly across all of Teams and Zoom.
Back in the times of Windows 95 and Windows XP, reinstalling the OS at least once per year made Windows noticeably faster. Then it degraded month by month. And yet I still remember how incredibly faster the same laptop was with Ubuntu 8.04. Faster than a newly installed Windows.
How about a right click on the desktop? I have a very fast computer with no bloatware on it, yet it takes half a second for the desktop context menu to appear when I do this repeatedly. The first time takes 1 second or more.
Compare with a right click menu in a browser which is instant.
HKEY_CURRENT_USER\Control Panel\Desktop > set MenuShowDelay to 100 (ms), close regedit, reboot.
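If you'd rather not click through regedit, the same tweak can be scripted; a minimal sketch with Python's winreg (the value is a string of milliseconds, and you still need to sign out or reboot):
    import winreg

    # MenuShowDelay is stored as a REG_SZ string of milliseconds (default "400").
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "MenuShowDelay", 0, winreg.REG_SZ, "100")
    print("MenuShowDelay set to 100 ms - sign out/in or reboot to apply.")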
Lol. You made my day. I was doing that kind of Registry mangling 25 years ago. Brought back good memories, it's been a while.
I'm so happy I haven't had to use a Windows machine in more than 10 years.
For me, MacBookPro for coding, Linux Mint for home desktop and Steam + Xbox Live online for gaming. We live in excellent times
Good god.
Why is this set to 400ms?
Any reason it's not 0?
> Why is this set to 400ms?
I can't test it on my current computer, but does the setting affect menus that are triggered by hovering over something?
If so, then 400ms makes more sense, and the real bug is that menus summoned by an explicit click should be exempt from the delay.
The reason for a hover-delay is that it allows someone to flick their mouse to their real destination without triggering a "trap" of content which pops up on the way to obscure their goals.
But why?
> Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all.
There's a guy who has written their own version of Explorer that's so fast in comparison to the built-in one that you'd think they were cheating somehow, given everyone's experience with Explorer.
And someone has written an IDE for C++ that opens while Visual Studio is on its splash screen.
And another that has written a debugger with the same performance.
And a video doing the rounds of Word ('97?) on spinning rust opening in just under 2 seconds.
Basically, everything MS is doing is degrading performance. Opportunities for regular devs to go back to performant software, and MS is unlikely to fix theirs in the foreseeable future.
> version of explorer that's so fast
$250 for a version with updates past a year? yikes
For a lifetime license incl updates forever that seems quite reasonable to me. It's a bit over a year of Netflix.
In fact, given that it includes perpetual priority support (within a business day!) I expect the author's gonna change that soon, once he gets one of those infinitely demanding customers and realizes what a terrible mistake he made (inf support for a one-time payment, oops!). So better bite while it's hot!
The €40 option for one year of updates is a lot more economical and is still a perpetual license for the software itself.
Now I'm shocked by the cost of Netflix.
The monthly subscriptions always sound cheaper than they are
Imagine paying for a file browser. This is why windows will always win. They have the most docile userbase ever. They'd rather pay 250 bucks for a file picker than to change OS.
Hey Total Commander is free/shareware (if you can live with the nag screen) and superior to anything on any OS
My solution to the nag screen was that I never turned off my computer, just put it to sleep, so Total Commander was always running.
Interestingly, TC was one of the few pieces of software that I considered paying for, but in the end I didn't because they asked for too much information at the time. Not long after, I switched to Linux, and I couldn't use TC there.
This is more of a macOS thing.
Windows users just don't pay and keep using Explorer.
If you use software that is $10k/year and Windows only, a few bucks here and there to improve your quality of life is a rounding error
Double Commander is open source and no cost.
some folks about to make a decent amount of money if the trend wrt win11 continues
> $250 for a version with updates past a year? yikes
It cannot handle CJK encodings either. What a joke.
I've tried this a few times. Windows 10. Downloaded the 2MB file, double-clicked on it, and nothing happened. Same thing when I tried it a few months ago. Ran it from a command prompt and got no output, not even an error.
I'm starting to worry I just launched something malicious.
The latter is normal on Windows. Executables have a header flag which specifies whether they use the terminal or not. If a terminal program is opened from outside a terminal, it opens a terminal window. If a non-terminal program is opened from a terminal, it instantly detaches.
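That flag is the Subsystem field in the PE optional header; if you want to check it yourself, a rough sketch along these lines should do it (offsets per the PE/COFF layout):
    import struct
    import sys

    # Read the Subsystem field from the PE optional header: 2 = GUI, 3 = console.
    def pe_subsystem(path):
        with open(path, "rb") as f:
            f.seek(0x3C)                          # e_lfanew: offset of the "PE\0\0" signature
            pe_off = struct.unpack("<I", f.read(4))[0]
            f.seek(pe_off)
            if f.read(4) != b"PE\0\0":
                raise ValueError("not a PE file")
            f.seek(pe_off + 4 + 20 + 68)          # signature + COFF header + Subsystem offset
            return struct.unpack("<H", f.read(2))[0]

    if __name__ == "__main__":
        code = pe_subsystem(sys.argv[1])
        print({2: "GUI subsystem", 3: "console subsystem"}.get(code, f"other ({code})"))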
After downloading, did you open its properties and "unblock" it?
woa!!
The problem is that on Windows you're competing directly against the guys who own the operating system. So even when there is a gap for a better file manager, the one that Microsoft makes is so entrenched, and Microsoft can make sure they always win. It sucks.
That was the argument used for IE/Edge. But eventually it got so terrible that the first thing everyone does now is install Chrome/Firefox/Brave.
They obviously have an advantage, but it’s not insurmountable to being garbage.
It was VERY common in the spinning rust era to already open (Office, etc.) applications in the background. I think the launch operation only allocated window resources and finished the job; all the hit-the-disk work was already precached in memory while the OS was doing the slow starting-up / logging-into-the-network steps and the user was off getting a coffee or something.
I found chrome was putting itself into "eco mode" on my Lenovo (work) laptop all of a sudden. Meant that waking up took FOREVER, and accessing a web page (required as part of a daily login) took 15+ seconds to load when first logging in, as opposed to a few seconds, which caused our password app to timeout at times, etc. Who the heck comes up with these ideas? "Eco mode" by default? And no way to disable it easily? I had to add an obscure switch to the chrome startup to make it run normally again.
A similar example: Microsoft's Windows Search function is so pathetic and slow, yet there's another little company offering a blazing fast file search tool that's been available as (portable) freeware for 15+ years.
Everything Search: https://www.voidtools.com/
Everything Search uses the NTFS indexes to do blazing fast file or folder searches. It has a neat and clean interface, and no nagging ads (unlike.. cough, cough.. Windows 11). Everything Search is one of the first tools I install on any new Windows PC.
Same experience here, but I'm not sure it's just MS fault; companies have a way of installing a bunch of stupid software on top of one another, that you can't get rid of without admin rights, that continuously do things that slow the system down.
(And, you can have a tabbed file browser on Win7. I still have a Win7 box at home that works perfectly well and that does have tabs in file explorer. I think it was an addon I installed a while ago; don't remember exactly, but it works perfectly.)
It seems industry-wide these days.
What I’ve seen as an older-than-average developer is that the Agile movement has made it increasingly difficult to make time for paying attention to some of the more subtle aspects of user experience such as performance. Because I can’t predict how much work it will be accurately enough to assign story points to the task, and that means that this kind of work frequently results in a black spot on our team performance metrics.
CD makes it even harder because this kind of work really does need some time to bake. Fast iterations don’t leave much time to verify that performance-oriented changes have the intended effect and no adverse side effects prior to release.
That’s not agile’s fault. That’s the orgs fault.
We used to have a few days set aside regularly to fix things that would never get prioritised.
Microsoft is progressively making everything an instance of Chrome. They've seemingly altogether given up the notion of native platform rendering. The Win32 API for native UI elements hasn't been touched in two decades. There have been a few failed attempts to move on from it like Silverlight, WinForms, UWP, LightSwitch, etc., but they never bothered to revisit their native UI library. So now everything is a Chrome instance.
> what does Windows 11 even do that KDE Plasma 5 wasn't doing a decade ago? How did it take this long to get a tabbed file browser?
Management features like application based firewall. Dedicated views in explorer tailored for common file types, automatic view type based on content, plus tile view. Proper title bar customisation. Contextual Ribbon toolbars and to be honest, good menus in general. Have a professional UI designer try KDE and I'm pretty sure they'll have a migraine. Also Win32 in general, Wayland still has a loooong way ahead of it.
More a reply to a few fellow comments: I have a few Win 11 hosts that I use. No management software other than just antivirus. 90% of them are super slow, on every hardware. The latest super-feature on one of them is an empty Task Manager. It just doesn't display a single process. Of course Process Explorer works, opens faster and shows all the data, without spending dozens of seconds working out how to display a table with 500 rows.
Similar story here.
Started a new job about a year and a half ago and got a powerful laptop with a really top-of-the-line CPU and GPU and 64 GB of RAM (now upgraded to 96GB, needed for my work; even with these specs compile times are longer than I'd like...), and it was a terrible experience, coming from someone who's used to Linux, having used it for a while (started in 2013 with Ubuntu on a dual boot, moved all-in to Arch in 2016, distro-hopped and played with different desktop environments/WMs after that (recently switched to niri), all of which are leagues ahead of Windows 11 IMO; only occasionally ran Windows on a spare device or a VM on the rare occasion I needed to, eg for work / school).
Tons of issues, slow in some operations, weird bugs (in the explorer like you, or with my Bluetooth headphones, or other issues), and even occasional blue screens! It's not just my setup too, my coworkers have similar issues. Plus, it just isn't a nice environment to use.
At first, I tried to set up a nicer environment (as much as IT would allow). I installed PowerToys for QOL improvements, GlazeWM to emulate a tiling window manager setup, I tried debloating as much as I can, I installed Wezterm for my terminal (Why is Windows Terminal so hyped up? It seems like an extremely basic terminal emulator to me...), oh-my-posh theming for my shell, and several other things.
But every convenience program I added just noticeably slowed down my laptop, to the point I just gave up some of the niceties and lived with it. Why is such basic functionality able to be run so smoothly on a much weaker device on Linux, but struggle on Windows on a much more powerful device? I can only think of one reason...
Your first gripe kind of sounds like DLP software is installed on the system and it is scanning files you're "accessing".
I don't know, my personal windows install which I use for photoshop, lightroom, and the occasional game also has similar issues, and it only has the included windows defender. I've noticed on many computers that whenever there are a bunch of files in a directory, the explorer grinds to a halt.
At work we use clownstrike for our driving-around-with-the-handbrake-on needs, which I have installed on both Linux and Windows, and the former flies while the latter lags all the time (I dual boot, so it's the same exact hardware). Doing something which is fully equivalent, like installing an IntelliJ update takes around a minute on Linux and many more on Windows.
The fan also comes on much more often on Windows than Linux, even though most of my job is done on remote servers via SSH. Under Linux I only hear the fan when I compile something. This morning I booted windows and the fan was running constantly while I was just catching up with a few mails in outlook.
If it's DLP then using alternative file browsers should also be affected, right? Which at least in my case it isn't.
On my company provided laptop with Windows 11 (previously Windows 10), the top three CPU usage was and is usually from Antimalware Executable, Microsoft Defender and MS Teams (or Crowdstrike). I don’t download files or get files from other sources often, yet these things keep doing busywork and slowing things down. Despite virus and threat protection running quick scans often and forcing a full disk scan every couple of weeks or so.
It’s almost as if these programs are people who ought to show that they’re doing something even though they’re just heating the room and running the fan.
Same here. ng install takes 2000x as long as on my similarly priced Mac. Installing a package for any language locks up the laptop for indexing.
> If my computer goes to sleep, WSL becomes unresponsive. I have to save all my stuff and reboot to continue working.
Try wsl --shutdown. Works for me when WSL hangs for no apparent reason.
I've also noticed that, in my case, these hangs are somehow tied to Docker for Windows. Couldn't figure what triggers them so far, though. I just restart DFW and kill WSL when that happens.
Restarting the vmcompute service sometimes helps. Doing so completely blue/blackscreened my machine this morning so it just makes me more confident in WSL's low level hooks.
My wife recently got a new laptop. She mostly just uses Office and the browser, so I gave her some specs to look for: SSD, 16GB RAM, and a Lenovo should be good (fatal mistake: I didn't specify the CPU). She went out and bought a cheap Lenovo laptop with a dual-core Celeron, 16GB RAM and an SSD. It can barely run Windows 11. Everything slows to a crawl; she can't be on a video call and have a Google Doc open at the same time. It's insane and frankly should be criminal to sell such a poorly performing piece of hardware.
It's so bad that she actually switches to her old laptop from 10 years ago (still on windows 10, also a dual core) for video calls, and it performs way better.
The engineers working on Windows should be embarrassed. I may just try to load ChromeOS on it. Would be nice to get Windows out of my house for good.
> a cheap Lenovo laptop with a Celeron dual core
Yeah, those things are born e-waste. I'm surprised Intel even bothers. Even on Linux they would barely play an HD YouTube video if it weren't for the hardware acceleration. A dual core from several years ago, assuming it's a proper i5 or i7, will do a lot better.
Windows 11 doesn't make things any better.
> Even on Linux they would barely play an HD YouTube video if it weren't for the hardware acceleration.
Video decoding has always been a brutal workload, but that isn't Microsoft's fault. I had to replace my Thinkpad X220 with an X270 for no reason at all except h.265. It's a ULV i5 too, so the perf is almost identical to the old laptop's... until you watch video on it.
h.265 can get annoyingly heavy, but the better compression isn't coming from nowhere. But for less powerful machines, H.264 works just fine and works back far enough that even I got surprised. Even LGA 775 machines will do that just fine, and h.264 is more or less a constant lowest common denominator.
If it can't play a blu-ray off CPU, then it's either so old that DVDs were the media du jour, or it was never meant to be in a general-purpose computer. They'll do e-mail and Office in a pinch, and they'll play video within their limits, but venture outside for anything else, and it all comes crashing down.
Me and my fiancé both bought Lenovo laptops with 16GB RAM and 5000-series Ryzen, 500 GB SSD. They were on sale, and the price seemed nice.
Some of the Windows 11 features are laughably, hilariously slow. If I enter anything in the taskbar search, it will take a solid 6-7 seconds for the app to appear in the result. The result window will just be blank. If I press enter after having typed in, the app will start - but still, it is so, so laggy.
And some weird flicker when running certain applications. It was like that out of the box, and I feared I had gotten a defective screen - nope, only certain apps.
> I may just try to load ChromeOS on it.
If your wife would be OK with ChromeOS, basically all she needs is a browser. I just installed Linux on the computer my wife is using. For a while she was on Ubuntu and then, once she got used to it, I replaced Ubuntu with Debian (because I use Debian everywhere: NUCs, laptops, desktops, servers, hypervisor (Proxmox, which is Debian), etc.). It's easier for me to just slap Debian everywhere but YMMV.
People have no idea how many people nowadays only need a browser (and working sound/microphone: but that nowadays Just Works [TM] on Linux).
It's never been easier to switch people to Linux than it is today.
I vibe-coded my own Apple System 6 style shell in Rust and use that. If I don't like a feature, I change it. It is lightning quick, in its (emulated) 1-bit glory. There's no requirement for you to use the built-in explorer.exe to launch things, even for games. The graphics are decoupled from the shell so I use it for Windows and Linux.
If vibe coding your personal GUI utopia is too much, you can use something like Cairo - https://cairoshell.com/
Windows does allow you to enter Chinese characters when you say your language is Chinese.
I have moved to Bazzite at home but it is ridiculous that you can't just use your language right away. A normal person would just bail out.
If you choose Chinese, all of the content is localized but you have to enable an input method and that doesn't even actually allow you to input Chinese characters!
It shows you a scrolling(!) pop-up that tells you you have to enable a different option in order to type in the language that you chose at the very beginning. Hope you didn't choose the wrong one out of ibus, fcitx and fcitx(wayland experimental) – whatever happened to scim, and xim, I miss editing environment variables.
Even the cheapest Android gadget can do this out of the box.
The Explorer file browser is just a disaster. I am forced to use a third-party browsing app when a directory contains hundreds of media files.
It has been like this since Win 7 or earlier, but it's still not fixed.
I'm definitely not defending Windows but when I ran Windows 10 Pro for 11 years straight, I had no problem with performance.
We're talking a 4 core i5-4460, with 16 GB of memory and an SSD running WSL 2, Docker Desktop, real work loads, video editing, etc..
It was very performant and never got in my way. I'd leave the computer on 24 / 7 and only the monitors turned off. It only got rebooted for forced Windows patches.
With that said, my hardware can't run 11 and even if I did patch around that, I'm choosing not to run 11 so Windows for me was over near the end of 2025.
I'm running Arch now on the same box and except for GPU memory leaks, it's quite snappy. CPU intensive tasks finish faster and disk I/O feels even faster than Windows. There's also unlimited flexibility to tweak things however I see fit. Gaming performance is substantially worse for the few games I play. No regrets, except for gaming.
If you had 16 GB of RAM, yeah Windows 10 was mostly fine. The laptop I had in high school was rocking 4 GB and a first-gen i5. Windows 7 was rock solid on that machine, but 10 brought it to its knees. Sandy Bridge fared much, much better, to be fair, but the jump from "4GB is enough" to "8GB is pushing your luck" was not pleasant, as I recall it.
Oh yeah, 4 GB would be really rough on Windows 10.
The machine I had before this one had 2 GB of RAM and I ran Windows 7 on it. It had a Core 2 Duo E6420 CPU and GeForce 9800 GTX+. I remember things being mostly ok for playing a bunch of games back then but struggled when I tried to run VMs which makes sense given 2 GB of RAM.
It was that machine that taught me not to skimp on RAM. The last few years of that machine's life started to get pressured pretty hard with memory requirements and having 4 GB would have solved all of those problems. That's why when I built this machine I went with 16 GB, back in 2014 still running Windows 7.
I have the same specs in my work machine.
Task manager takes 10 seconds to load the list of processes. Right-click on the desktop takes about 1.5-2 seconds to show the 'new' context menu. Start menu is actually fast to start drawing but has a stupid animation that takes about half a second to fully load.
I sort of understand how the anti-consumer 'features' (ads) get added to a piece of software. But I have no idea how they manage to continuously degrade the experience of existing parts of the system for seemingly no one's benefit.
Please try https://filepilot.tech/ It's like Explorer but so much faster in every single way. I have it pinned on the taskbar; it launches more quickly than a new Explorer window.
The main problem is no multiline support at the moment, which means the icon view does not show the full file name; list view etc. are perfectly fine though.
Another problem I have with it is no native zip support, which means you must use 7-Zip, WinRAR etc. and set them as the default viewer for zips. Otherwise, double-clicking a zip opens explorer.exe.
Had the same issue with slow file explorer in Windows 10. A couple of things helped a bit, such as disabling "Show recently used files" and "Show frequently used folders". I also cleaned up the Quick access list. For some reason if you have a network share there it makes browsing local dirs slower, go figure. It's still not instant but a lot faster than the 3+ second delay.
I tried OneCommander and it's super fast, so it's not something slowing down disk I/O; it's purely File Explorer.
Now I'm still struggling with closing chrome tabs being super slow sometimes.
I'm using it on a work-issued ThinkPad with 8 gigs of RAM and an Intel i3. It's fucking horrible
An i3 is a damn near OSHA violation.
Honestly - I think it must be a laptop thing. I have a similar spec laptop [0] for work, and it's... borderline unusable. I can ignore the upsells, always online, etc but the OS is just fundamentally falling apart more and more every other release. If I unplug the power cable, it throttles itself to the point of being neutered, and if I try to put it to sleep it _very regularly_ will just stay awake and drain its own battery.
My work desktop on the other hand (i9, 64GB ram, 4080 GPU) is absolutely screaming and I have some of the same problems but they're nowhere near as bad.
I don't think I could genuinely buy another windows laptop.
[0] https://www.dell.com/en-uk/shop/laptops-2-in-1-pcs/dell-xps-...
> if I try to put it to sleep it _very regularly_ will just stay awake and drain its own battery
Sadly Linux isn't any better on this front because it's a hardware issue. Laptop vendors have gotten terrible at managing sleep states. S0 sleep is a joke. I changed my laptop to hibernate on lid close after the second time it nearly cooked itself in my bag.
I noticed significant slowdown on my home computer, so I did some optimization - namely turning off some services.
AI related things, one drive (this could be one of the reasons file browser is slow), widgets on the screen like news and weather, some other optional/not needed things.
They added a lot of unneeded crap to the file manager. I think it's almost better to install a third-party one.
Your experience is very far away from mine. I don't experience any of these issues on a standard 32gb office laptop, or my home gaming machine.
IIRC the cause of the context menu delay is an invisible animation.
The system is animating the menu opening, except there's no visible animation. So it just waits for a while doing nothing, then the menu pops open.
> If Windows 11 struggles this badly on a brand new laptop that I'm certain would retail for $4000+, I can only imagine how miserable it is for everyone else.
I purchased a Surface Pro at one point, with the intention of using it for sketches and running Windows on it, thinking it'd be better because the hardware and OS are from the same organization. Nope! Slow out of the box, horrible thermal management and software bugs galore. Installed Arch with GNOME on it and it ran better in most respects. However, pen and touchscreen usage, even after I tried my best to adjust it, was worse than on Windows, so I guess they won there.
I didn't use to hate Windows, because at least it wasn't utterly broken, just stupid, but they're really in a deep decline lately. Maybe it's time for one of those "stop the press" type of moments where they have to stop, take a serious look and fix what they have as it currently feels impossible to use reliably.
I have a dual boot on a decent laptop. On Windows, even when it's doing nothing, the fan is always on, as if it's computing something. On Linux it is just silent.
Maybe investigate the background apps that are running on your laptop?
By the way, I just opened a directory that I hadn't accessed in months. It contains 10945 log files, and Windows Explorer displayed them instantly.
Your work laptop might have an excessive amount of "security" software installed that causes it to lag far more than it normally would. That kind of bloated software runs in the background and slows down practically everything you do with the machine.
I am also subjected to Windows at work and I hate it. WSL is an okay experience until it just crashes and stops working.
The only reason I was forced onto Windows was because they couldn't find an endpoint management system for Linux.
> There is no pattern to when Esc does/doesn't work.
It's non-deterministic, as if developed with LLMs....
> Windows 11's file browser lags when opening directories with more than 100-ish files. Windows 11's file browser takes a few seconds to open at all.
I've got bad news for you. Nautilus also lags when opening some directories.
I swear Windows is just full of sleeps and it doesn't matter how fast your system is.
It's more likely network calls that are taking a long time or timing out. A lot of developers insert function calls that under the hood hit HTTP servers, and it can take a few hundred milliseconds to stand up a new TLS connection and then however long it takes to send the request and get the response. It's also probable that the endpoints form an accidental microservice architecture in which case everyone is always hitting a different set of connections. This creates a perfect storm of having to reconnect to everything you hit occasionally which can create little slowdowns all over the place all without actually using CPU so it doesn't show up in any resource monitors.
HTTPS calls should be treated as calls to sleep() with undefined timings.
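You can put numbers on that sleep(): a quick sketch (example.com is just a placeholder host) that times DNS + TCP connect and then the TLS handshake separately, before a single byte of the real request is sent:
    import socket
    import ssl
    import time

    host, port = "example.com", 443  # placeholder host

    start = time.perf_counter()
    raw = socket.create_connection((host, port), timeout=5)    # DNS lookup + TCP connect
    tcp_done = time.perf_counter()
    ctx = ssl.create_default_context()
    tls = ctx.wrap_socket(raw, server_hostname=host)           # TLS handshake
    tls_done = time.perf_counter()
    tls.close()

    print(f"TCP connect:   {(tcp_done - start) * 1000:.0f} ms")
    print(f"TLS handshake: {(tls_done - tcp_done) * 1000:.0f} ms")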
Microsoft Windows: Accidental Microservice Architecture Edition
The real question is why is my file browser blocking on an http call? Oh right, tracking/telemetry/ads.
If it's Intel then it might not be fully down to Windows 11. The PC laptops are universally crap. I had a few of the latest ones, Ultra 9, and they are atrocious. The experience reminds me of using a netbook in the early 2010s.
I would refuse to work anywhere without a Mac. If x86 then it would have to be linux, as that would be passable (apart from fan noise).
Panther Lake and Lunar Lake are pretty good I'd say. Intel has been improving.
The Lunar Lake and Arrow Lake platforms are finally really good again, comparable to AMD. Fast and power efficient. Before Lunar Lake they were pretty much crap, I agree.
KDE is bloated garbage too! Half of the OS didn't work. The only DE's that I haven't had poor experiences with are xfce, bspwm, i3/i3gaps, and xmonad. Note how 3 of these are tiling WMs.
KDE has been great for me on Fedora. What problems have you encountered?
KDE is the best. Everything they make is awesome, Im a big fan.
So much so I donate to them monthly.
Corporate probably loaded up the laptop with work monitoring software, and some terrible AV software. Among other bloatware. A PC of your spec should run without noticeable delay, something else is going on there.
Having said that, Windows has made a lot of the basic functionality way too resource heavy.
Another common issue on corporate-issued workstation laptops is that they don’t install the proper GPU drivers. The basic ones that ship with the OS are awful, but work just well enough that people don’t notice that they’re missing something important.
This was me in 2022 or 2023. I have posted on HN about my shift a few times. I gave up with Windows 10 because you needed Windows Pro in order to make an "offline" account, I spent $2000+ for a gaming rig, and I couldn't add new users, one program told me to use the other program which brought me back to the original program... I had to go out of my way, buy a license just to make it work. I just went and installed Linux finally. I was on POP_OS! for a good year, but been on Arch Linux for one year plus now.
I know it's a "meme" to talk about how great Arch is, but when you want the latest of something, Arch has it. I use EndeavourOS since it had a nicer, simpler installer (idk why Arch doesn't invest in what's standard in every other major distro) and if you just use "yay" you don't run into Pacman woes.
Alternatively, I'm only buying Macs as well, but for my gaming rigs, straight to Arch. Steam and Proton work perfectly, if you don't sell your games on Steam or in a way I can run them on Linux I am not buying or playing them.
> if you don't sell your games on Steam or in a way I can run them on Linux I am not buying or playing them.
So much this. People like to moan about "oh game XYZ doesn't run so it's not reasonable for gaming". More games run on GNU / Linux than any gaming console. There are simply too many games that do run to give a second thought about the ones that don't, and it's been that way for years.
The giant bugbear in this conversation is always multiplayer. That's because almost all of the big players in that space currently favor rootkits in the form of overly invasive anti-cheat, which the Linux wrappers (mostly the wine project) refuse to support for security reasons.
If you don't play PvP specifically, the rest of the library is significantly more open to you. Personally I have always favored single player experiences and indie games from smaller studios, and for the most part those run great.
It's unfortunate but at the same time if enough people switch to Linux then they'll be forced to change their ways.
So if you can go without those games, or don't play MMOs (that is, rootkits), then switch to force their hand.
Besides, them installing a rootkit on your machine is not an acceptable practice anyways. It's a major security issue. Sometimes we need to make a stand. Everyone has a line, where's yours?
MMOs are actually fine. WoW, FFXIV, RuneScape all work great on Linux. They're not really games that rely on hidden information, they're not PvP-first, and they need to simulate stuff on the server anyway, so they can verify moves are valid there.
It’s the competitive progression shooters and ranked esports games that go in for the restrictive anti-cheat
Even within competitive shooters there’s still plenty that run great on Linux. 90% of my time spent gaming is on Overwatch or CS2, and I’ve found that both ran significantly better on my Debian 13 installation than they ever did on Win11.
And it's worth noting that CS2 is still the most played game on Steam. It has double the players of the second most played game, Dota 2, which also works on Linux. And that has double the player base of the number 3 game, Arc Raiders, which also works great on Linux.
The idea that you'll be missing out is ill founded. Yes, there are some games that won't work. PUBG, Bongo Cat, Rust[0], and EA Sports FC 26 are the ones on the top 10 multiplayer list. But it's also not like you don't have plenty of massively popular games to choose from.
I'll even say don't switch to Linux, just stop playing these abusive games. Honestly, if you're unwilling to change OSes but willing to do this then people that want to jump ship can. We all win from this behavior. Even you as it discourages Windows from shoving in more junk and discourages publishers like EA from shoving in massive security vulnerabilities like rootkits. I mean we've all seen how glitchy many AAA games are, you really think their other software isn't going to be just as unpolished and bug ridden?
[0] Apparently works with Linux servers? https://www.protondb.com/app/252490
P.S. If anyone wants to check for yourself:
- Steam multiplayer rankings: https://steamdb.info/charts/?tagid=3859
- Proton support: https://www.protondb.com/
This is true in principle but most gamers are just gonna take the path of least resistance. If they can't play fortnite on Linux (I'm using an example, I don't know if it's actually unplayable on Linux) then they will use whatever OS lets them play.
People have been saying "vote with your wallet" every time gaming companies do something anti consumer like day one dlc or buggy releases (don't pre-order!) or $90 games, but gaming companies continue to push the envelope on what gamers will pay for because gamers keep paying for it.
It's a sad reality.
Take a step back. Why do people want to play Fortnite so much and not anything else?
Because their friends play Fortnite, for example? Multiplayer is often social, so "just play something else" turns into "just get new friends".
There's another way. Only a small portion of friends need to change to pull the rest of the group. Pull them to a game that runs on Linux.
Don't do it like "let's play this game because it runs on Linux" do it like "let's play this game because it's fun".
If you want to be the one to lead this change you have to do extra work. Dual boot Linux and find a game that's fun that you can play online. Find the other friend or two in your group who will do the same (at least play the game; Linux is optional but encouraged for this subset). Just play together for a bit, give it a trial run.
Then when playing the other game with the larger group, say "hey, so-and-so and I have been playing this game, you guys should play with us sometime". They don't have to install Linux, just play a new game that their friends are already playing. That's why they're there, to play games with their friends. Don't try to get them to switch to Linux, just play games with your friends. You might have a holdout, but if most people move then everyone will.
But if you want to make that move, you have to find what works and at least one other friend to give it a trial (who won't need to do as much work as you). That's how you do it. No crazy scheme and honestly not massive amounts of work either. Just the normal process of finding new games to play, with one constraint. It just seems complicated because I stated the process explicitly.
I don't play a lot of online games anymore, but when I did, it wasn't just because friends were playing it. It was because it was fun, it was part of the cultural zeitgeist, it's popular, the community is fun, etc. You can't really replace something like that with just "another game," no matter how fun the other game is.
> It was because it was fun
I agree. But I think there are a lot of fun games. Plenty of them on Linux.
This is the harder part, but we are in an age where there are a lot of games. I think you'll be surprised to see the games that do work on Linux[0]. Looking at the most played multiplayer games on Steam[1] (in order): (1) Counterstrike, (2) Dota 2, (3) Arc Raiders, (5) Terraria, (8) Grand Theft Auto, and (9) Marvel Rivals all have good Proton support. What doesn't work in the top 10 are (4) PUBG, (6) Bongo Cat, and (10) EA Sports FC 26. (7) Rust supposedly works, but only on Linux-supported servers (smaller user base).
> it was part of the cultural zeitgeist
The point I'm making here is that while you may not get to be part of every cultural zeitgeist, you can still participate in the 3 most popular ones and more than half of the top 10. Frankly, most people won't be able to participate in every zeitgeist for any number of reasons (cost, hardware, restrictions, etc). But I think considering this you don't have to fear being left out.
> it's popular
Maybe you're obsessed with PUBG or Battlefield and then yeah, Linux isn't going to work for you. That's okay! But looking at the numbers, most people can still be a part of all the cultural excitement. It's not going to work for everyone, and that's okay! If it doesn't work for you, it doesn't work for you. But I want to make sure we can distinguish real blockers from ones Microslop and EA want you to believe in.
> the community is fun
I think this is less of a blocker than you might think. Honestly, in my experience smaller communities tend to be more fun. They develop their own close-knit culture. You've been on HN a long time and seen it grow. Isn't that a similar reason you come here?
> You can't really replace something like that with just "another game," no matter how fun the other game is.
You're right, but again, I think there are fewer blockers than you think. I can't tell you if those blockers are real or not, because what is a blocker comes down to you and your personal interpretations of all those variables. But if you're frustrated with Windows and the system, why not give it a try? You don't even have to switch to Linux to pressure the studios to change. Just spend more time playing games like Counterstrike or Arc Raiders than games like PUBG or Battlefield. And if you play more games like the former, you make it easier for others that are thinking about making the jump. But hey, if PUBG or Battlefield is your jam and you don't want to try anything else, then no worries. You do you.
There's one more important thing I want to bring up. I think it is important to ask "where is your line?" How much junk can Winblows shove in before you're willing to make sacrifices? Is EA installing a rootkit enough of a security concern where you won't take it? What is? You don't need to tell me what the answers are to these questions. What's important is that you yourself know where these lines are beforehand. The lines are personal and unique to you. People are going to have other lines than you, and that's completely fine. I just ask that you think about what conditions would cause you to make sacrifices. That way, if they happen, you can respond.
> Looking at the most played multiplayer games on Steam
Note that this is skipping over some extremely popular games which aren't on Steam. Notably Fortnite, Roblox, League of Legends, Valorant, and everything else from Riot Games, none of which work on Linux. From the Steam examples there's also some grey areas, GTA5 singleplayer works but multiplayer does not, and Counterstrike works on official servers but not on Faceit servers, where a lot of serious competitive play happens.
League of Legends is the only thing keeping me on Windows
You're right to bring up the limits, but I think you're missing what I'm saying. I'm not trying to say that everyone can and should switch. But I am saying that the costs are probably less than one might think.
The costs of switching can only be answered at the individual level. No one can answer for you. But people can state their experiences and help you understand the costs and benefits.
Let's make sure we can accurately understand the costs and benefits and differentiate from imaginary ones.
I also said that you can take a stand without switching to Linux. Maybe the costs are too high for you right now. But maybe the costs of meeting up with your friends to play Dota rather than League is easy. At the end of the day the costs are due to the network effects. You can reduce those costs slowly and make it easier for others to jump ship without you needing to, which makes it easier for you to jump ship in the future if things change. The same is true for social media. Maybe you can't break from Instagram as you have too many contacts where that's your only way to communicate with them. But you can still encourage others to text you, Signal message you, or whatever. This still reduces the power of that network.
Here's the thing: the less sticky platforms are the better it is for everyone. I'm not going to tell you that you aren't going to have to put in more effort, but I will criticize you if you think that effort is insurmountable. I will also say that this is also part of our social duty. If something like using Signal instead of Instagram to communicate with your friend because they want to is "too hard" for you, then I envy the life you have where such trivial actions are your biggest concerns. If trying new games with friends who want to try new games is "too much" for you, then I think you should question if you're an addict.
I'm not saying you have to switch. I'm not saying you have to play certain games and not others. But you do have to be open to changing things and recognize that if you don't then you're creating a doomed self-fulfilling prophecy. If you're unwilling to have the slightest inconveniences then the enshitification and dystopia is on you. If you are unwilling to have the slightest inconveniences then you have no right to complain as you are the one preventing that change. But also, if you don't have any of those concerns of enshitification and tech dystopia then you have every right to stand your ground and not be inconvenienced. But I want to make the conditions clear. We live in a society. The society has a duty to you and you have a duty to the society. You don't just get to take and give nothing back.
First off, a hell of a lot more of those top 50 are unplayable. But more importantly the thing that you are ignoring is that every single one works on Windows. By choosing to use Linux you are choosing to not be able to play these games and an unknown number of future games for... what? If you only have a PC to play games with your friends, what could possibly be more important to you than the ability to play games with your friends?
This isn't about me though. I game on Linux. I love it. My original reply was to this in your comment
> It's unfortunate but at the same time if enough people switch to Linux then they'll be forced to change their ways.
The whole point of this subthread is that companies are not going to make Linux-compatible games as long as there are customers OK with installing rootkits on their computers to play those games. And most gamers are OK with that line being crossed. It sucks for the rest of us, but capitalism gonna capitalism.
I apologize for misunderstanding your comment, but I hope mine still stands to help others recognize the issues you brought up aren't as large as some may actually believe. I agree with you, companies that abuse us, the users, want to amplify that fear. It empowers them. It's why I am encouraging anyone who reads my comment to ask themselves where that line is. Personally I'm with you, the line has already been crossed. I've made the move and don't regret it for a second. Nothing changes for the better when no one is willing to take the first step.
No worries, and no I completely agree with you on all fronts.
There are lots of fun games in the cultural zeitgeist.
> Don't do it like "let's play this game because it runs on Linux" do it like "let's play this game because it's fun".
Indeed! i have some online and irl buddies who aren't on linux that i've got playing games like veloren with me, simply because they are good games. i've got loads of hours in games like veloren, luanti, xonotic, pokeMMO, and osu! for example lol and encourage everyone to check those games out if they're up their alley. :)
The market for multiplayer games, shooters especially, is already a mess, because people don't want to play a game that doesn't have an infinite pool of players to matchmake into, or a game that doesn't have all their cosmetics, or... etc. etc.
So this ends up being easier said than done. I've had success, but that's my friend group out of however many.
Try to find a shooter with a playerbase that doesn't use EAC/etc. - it's a crapshoot, unfortunately. You've got Valve's stuff and one or two outliers, but if those don't meet your group's genre needs, you're whomped.
You assume I have friends. Or at least, friends that care about video games.
Besides, more likely is that I leave to do my own thing, 0-1 peers join me for a bit, then we all kinda drift away. Friendships in this era are much more ephemeral.
I want to step away from the conversation about Linux
Honestly, I'm disheartened to hear this. Frankly, those don't sound like friends, or at least close friends. If a friendship can evaporate by the simple act of wanting to try another game, then it barely seems like a friendship and it seems like those will evaporate as soon as the next popular game comes about. I don't want to tell you to abandon your existing friends but I would encourage you to find friends you can have stronger bonds with. To have closer relationships. Hard truth is you need to put in work to make this happen. It doesn't matter what games you play or on what platform: everyone deserves to have deep human relationships. I really do hope you can find some friends. I hope the friendships you do have are stronger than you have conveyed because frankly, as humans, we all need close friends.
That's how most of my life has been, in a nutshell; it isn't limited to games. Schools, college, old coworkers. A lot of the glue comes undone the moment you need to move on. I'm a late Millennial, but the advent of social media among Gen Z gave us the ability to connect more intimately than ever over the most niche topics. But at the cost of losing the deep bonds you'd normally form back then, bundled with a community based on proximity.
I've had long conversations with some former guild mates yet can't point you to a name or face. I know quite a few never even lived in my country. But things loosen up once the game shuts down or one of us needs to move on. It's neat in some ways, hollow in others.
On the larger scale, it's why local community is also weaker than ever. No one really puts in the effort to come out to community events, or they may come once or twice and never again. Those gatherings are also less frequent than ever, often once a month. You can't really form a deep bond meeting once a month. So meetups end up frustrating in their own way (at least the tech meetups; maybe a run club would be different).
I've even heard notions that it's easier to find a mate than a close friend these days. I can completely believe it.
I feel your pain, as someone in their mid-30s, and also a new dad. It's trite to say that it's hard to make friends in your 30s but it really is true. Between career, family, and just day-to-day life, it's really hard to form connections that are any more than just superficial.
What works for me is finding communities. I participate in Toastmasters, which meets every week (one club meets twice a month), and it's a good way to make connections with people and get to know them better. It's also fun because there are people from many walks of life. Retirees, college students, business owners, executives, everyone in between. It's a great way to meet people I probably wouldn't have otherwise met in tech circles.
I'm also in my 30s so I get it. The only real advice I can give is that it just takes more work to maintain friends as we get older. I had to be the one reaching out rather than waiting for others to reach out to me.
The other thing is recognize where friendships came from. Most of it was just being physically near people. Sitting in the same classrooms day to day. If work doesn't create that space (or isn't good enough or you want to distance from work) you need something else that does the same thing. Join a club. Set up weekly beers with your friends. Or literally anything that puts you in the same physical space with the same people, routinely. A friend of mine gets together with his gamer friends once a year and they socialize off the game too.
The convenience of social media is also its weakness. The ease of connecting makes it just as easy to disconnect.
Real friendships require work. That's true of any relationship. I'll tell you my friends can be annoying and exhausting, but I love them and I'll gladly put up with their shit to keep them around. After all, who else is going to put up with my bullshit? lol
>Real friendships require work. That's true of any relationship.
Indeed. Perhaps that is part of my frustration. Friendships are also a two-way street, and it can feel like I'm putting in a lot more effort than the other party when it comes to trying to form them. I don't expect 50/50 effort, but when it's 95/5, are they a friend or simply a familiar guest?
Maybe I need to accept that the tech circles here aren't going to have that deep bond and expand. But I'll admit that the job hunt also slowed down my efforts to break out of my comfort zone.
i like a few multiplayer shooters, fortnite being one of them. i also exclusively use gnu/linux on my machines, so i got around the issue of games like fortnite/battlefield/etc a long time ago by simply doing what i've done for years: playing them on xbox. i even 'stream' these games to my linux machine from my xbox if i want to play them from the computer with the xbox controller, and can join and create xbox live parties through the xbox web interface.
i only do this for those couple of games i play with friends that won't support linux because of the aforementioned rootkit it wants to run on windows machines. console for those games, and all my other games run happily either through steam+proton or natively on linux, and there are a fair few FOSS games with amazing multiplayer. i love luanti, xonotic, openarena, veloren, etc, and play them frequently with some friends. :)
I empathize with the question. But you are essentially asking *why do people want to use instagram and not any other one of the millions of social media apps?*
I can't answer that, but probably similar reason why anyone plays any game. It's fun, their friends play it, etc.
I don't personally play fortnite. But substitute fortnite for any DRMd multi-player game (or MMO).
1. The audience is mostly kids. They can't buy any premium games easily (and is the lens for the rest of my points)
2. Network effects. Works as well on them as any of us. Especially in a world that makes it more and more hostile to have them meet IRL.
3. It's a generation raised on "forever games". They are used to games they pick up and will continually play for years. Games that will always provide new stuff for them. They fundamentally have different habits from Millennials.
4. Mobile support. So many kids play on mobile. So they are even more isolated from the console market.
Even this framing is silly, if you have a PC to game there are not enough pros to choose Linux. You are giving up the ability to play some popular games and increasing the amount of effort required to play another chunk of them in exchange for what? A snappier file browser? Fewer anti-consumer dark patterns? It's not about "path of least resistance" it just flat out isn't worth it.
> Fewer anti-consumer dark patterns
> isn't worth it
This is a gross reduction of why people choose Linux. People don't choose it just for a snappier file browser and fewer anti-consumer dark patterns.
1. Games that install what amount to rootkits on my computer are not OK.
2. Windows potentially spying on my data without my consent is not OK.
If you wanna label these as dark patterns, that's fine, but let's not pretend that this behavior is ok.
I like playing games. But I like privacy and security more than playing games, which is why I have a linux gaming machine and a PS5. Some people would rather just play games and not worry about the other stuff, which is understandable for the reasons you mentioned.
This is overestimating the amount of effort involved to game on Linux, imo. It is true that there are a couple games using kernel-level anticheat which preclude their working on linux, but for the most part the effort required to play games on Linux now is zero if it's a Steam game and almost zero elsewhere.
Rust on Linux only works for Linux based servers https://www.protondb.com/app/252490
Apex Legend used to work but doesn't anymore (still marked Silver) https://www.protondb.com/app/1172470
Delta Force used to work but also doesn't anymore (still marked Bronze), people are tinkering with config files but nothing seems to work https://www.protondb.com/app/2507950
NARAKA: BLADEPOINT is working but requires custom Proton, some tweaked settings, launch options, etc https://www.protondb.com/app/1203220
GTA V public lobbies don't work, requires you to tweak launch options, disable battleeye anticheat, seems to just not work for some people. https://www.protondb.com/app/271590
BG3 also seems to require a custom Proton and tweaked settings for some people https://www.protondb.com/app/271590
It goes on and on; these were just from the first few games sorted by player count. Much of the tweaking seems to differ person to person: sometimes it just works, sometimes it's Nvidia's fault, sometimes it's something totally different. There's a "recommended for tinkerers" option for reviews. To be clear, every single one of these works right out of the box, first time, on Windows.
I was not shocked that the top comment mentioned they used Claude, because the config line is dumb.

> but requires custom Proton, some tweaked settings, launch options, etc

The line is:

    PROTON_DISABLE_D3D12=1 PROTON_HIDE_NVIDIA_GPU=1 %command% -force-d3d11

Here's what they mean:

- PROTON_DISABLE_D3D12=1: Disables DirectX 12. There are also D3D11, D3D10, and D3D9 options.
- PROTON_HIDE_NVIDIA_GPU=1: Tells the game you have an AMD GPU instead of Nvidia. The default setting is that Proton hides the GPU, so this option here is superfluous.
- -force-d3d11: Forces usage of DirectX 11. This is already going to happen because you disabled DirectX 12.

People are copy-pasting and sharing settings but not actually looking at any docs. Disabling DirectX 12 is going to give you a pretty good success rate of making a game work if it doesn't work out of the box. Here is the sane equivalent line:

    PROTON_DISABLE_D3D12=1 %command%

Alternatively:

    %command% -force-d3d11

Here's a useful resource for understanding the settings. Use this before you ask the AI: https://github.com/GloriousEggroll/proton-ge-custom
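For what it's worth, if a game genuinely does need a custom Proton build like GE-Proton, the manual install is just unpacking a release into Steam's compatibility tools directory. A rough sketch (the release filename is illustrative, and a Flatpak Steam install uses a different path):

    # grab a GE-Proton release tarball from the project's releases page first
    mkdir -p ~/.steam/root/compatibilitytools.d
    tar -xf GE-Proton9-20.tar.gz -C ~/.steam/root/compatibilitytools.d
    # restart Steam, then pick the new tool per game under
    # Properties > Compatibility > "Force the use of a specific Steam Play compatibility tool"

Tools like ProtonUp-Qt automate the same thing if you'd rather click a button.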
Also, let's be clear about what those rankings mean on ProtonDB
I'm not trying to say everything works on Linux. It doesn't. But let's also not pretend that it is worse than it is. That's the same error in the other direction. Linux is not the right choice for everyone, but it is a good choice for many people.

- Native: Just works, i.e. the devs are cool
- Platinum: Just works (but is using Proton), i.e. Valve has got this shit handled
- Gold: Works, but you either need to use Proton Experimental or change an option that someone has already figured out, i.e. the community has figured it out and Valve is tweaking. Note that many people are on Proton Experimental by default, so possibly that's why it "just works" for them.
- Silver: Very likely to work with a setting someone has listed, i.e. community and Valve are working on it
- Bronze: People are figuring it out; leave it to your friends that know Linux, i.e. sorry, you're probably out of luck, leave it to the tinkerers
- Borked: Publisher is actively working against the community, i.e. EA hates you

You're implying that 'clicking the cog icon > Properties > and then copy-pasting some text into a text box' is overly burdensome. To be frank, if you believe that then not only is Linux not for you, but neither are computers, and I really am curious why you're on a website called "Hacker News".
Yeah, this is why everyone has such a low opinion of Linux nerds.
I switched to console gaming years ago. I can still play any major release while having whatever OS I want on my computers.
I did this and was happily Windows-less for quite a few years. I ended up building a PC with a big GPU and so I switched back to PC gaming with a Windows installation alongside Linux, but I still think the console route is a great option.
At this point, I think quite a few people are basically treating their Windows desktop as a console.
I'll have to remember that one, that's a good way to put it.
>It's unfortunate but at the same time if enough people switch to Linux then they'll be forced to change their ways.
Nope. Not Nadella. He'll kill windows in a heartbeat.
>Sometimes we need to make a stand. Everyone has a line, where's yours?
I just don't really play multiplayer to begin with. So I was never on the spectrum.
But tens of millions are. They won't even be aware of what's happening. That's why this remains.
But standing on principle is too hard!
> which the Linux wrappers (mostly the wine project) refuse to support for security reasons.
It's more that there's no sensible way they could do it even if they wanted to. Emulating the Windows kernel internals is well beyond the scope of what WINE is trying to do, and even if they did do it, there would be no way for the anticheat vendors to tell the difference between the AC module being sandboxed for compatibility versus sandboxed as a bypass technique. Trying to subvert the AC in any way is just begging to get banned, even if it's for benign reasons.
As a competitive old school arena FPS guy, I have also had a very hard time getting the same smoothness and low latency (input, output, whatever it is) on Linux. The games I play are very fast and twitchy, and milliseconds matter.
There seems to be too many layers and variables to ever get to the bottom of it. Is it the distro itself? Is it a Wayland vs. X11 thing? Is it the driver? The Proton version? Some G-SYNC thing? Some specific tweak that games based on this game engine needs?
I've had better luck since the switch to Wayland. I don't play many FPS games but mouse input & overall smoothness for strategy games has been great. Check your mouse settings, you might need to set a higher USB sample rate. Piper is a frontend for adjusting them.
I know what you mean, though I have a device running SteamOS and it runs extremely smoothly; the latency is no different from my Windows PC (on titles where it can achieve the same framerate).
I'm sure that it must be possible to replicate whatever optimisations SteamOS has on other distros, but unfortunately I am not sure what those are exactly.
> Is it a Wayland vs. X11 thing?
Yes, most likely. Without a compositor I get lots of stuttering on x11, whereas KDE and GNOME's wayland sessions are both buttery smooth out of the box.
Might be my Nvidia GPU, but I've never gotten x11 to work flawlessly for gaming.
> Without a compositor I get lots of stuttering on x11... Might be my Nvidia GPU, but I've never gotten x11 to work flawlessly for gaming.
Weird. I don't use KDE's compositor, and -AFAIK- WindowMaker doesn't have one. When in either KDE or in WindowMaker I don't have stuttering with either fullscreen, borderless "fullscreen", or windowed games... everything is as smooth as it is in Windows. Having said that, I do know that -when using KDE- some fullscreen games get jittery as all shit if a notification pops up and remain that way until the notification disappears. I expect that that performance problem would go away if I was using the compositor... but I don't want to spend the VRAM on it.
I use AMD graphics cards, so it might be an Nvidia thing that you're seeing. It also might be a "Your Linux distro simply stopped shipping good xorg installs" thing. I'm running Gentoo Linux which continues to ship updated versions of xorg and supporting software. [0]
[0] I've heard people running Debian and Debian-derived distros report X11 behavior that absolutely does not match what I've been seeing for years... so some percentage of the "X11 can't do $THING" when it really, really can must be coming from distros that ship either dramatically out-of-date or severely crippled xorg installs.
X11 has basically no development anymore. That means regressions are entirely ignored.
I switched my Gentoo box from X11 to Wayland three years ago at this point.
It's shocking that people still install X11 as a default in 2026 except with very old hardware.
> X11 has basically no development anymore.
Odd. Every few months, I see a new xorg-server version in my distro's package manager.
> That means regressions are entirely ignored.
Should I ever actually have a problem, and it's something that I can't (or CBA to) fix, and my distro's maintainers don't want to try to fix (and then tell me that upstream will never fix), then I'll look more closely at XLibre. XLibre may or may not be a dumpster fire at that point, who knows? If it is a dumpster fire, then I'll look around for other alternatives.
> It's shocking that people still install X11 as a default in [TYOOL]
Nah. It works fine for what I'm doing. I don't do anything that depends on Wayland. The shocking thing would be if I were to waste a ton of time chasing the new shiny... especially when those responsible for the new shiny have been lying for the past 10+ years about how it's ready for everyone's general use. [0]
[0] Perhaps it's ready now, after nearly eighteen years in development. I can't rely on the statements of those responsible for the project to tell me, and I CBA to go searching for (and evaluating the trustworthiness of) information on the topic.
> Odd. Every few months, I see a new xorg-server version in my distro's package manager.
Yea, these are security updates, but the ecosystem requires a lot of desktop manager scaffolding in user space. That has basically stopped. It's baffling why you would run X11 today. The X11 emulation layer for Wayland works great too, by the way.
Just as one example when you screen share from discord or zoom or Google meets there's now a pop-up that asks you to select the screen or window you wish to latch on to for streaming. This provides some security. With X11 anything can just take a screenshot at any time. Sure that's convenient but so many apps don't even support X11 anymore. As someone that made the switch three years ago I get how you might think the old system is better but in reality you haven't tried the new one so you don't really have a way to compare. I noticed so many quality of life fixes that I can't even imagine running X11 anymore.
> It's baffling why you would run X11 today.
As I mentioned:
It works fine for what I'm doing. I don't do anything that depends on Wayland.

> Sure that's convenient but so many apps don't even support X11 anymore.

Really? If true, I don't seem to run any of them. I've certainly not noticed anything I've been running over the past couple of decades suddenly stop working on X11. Given that QT, GTK, FLTK, and other cross-platform GUI toolkits support X11, these must be particularly special programs.
> Just as one example[, screensharing...]
Sure, it is a bit nicer to be able to control which windows which other programs can see. I've been watching the slow-moving, many-years-long shitstorm that has been "actually get screensharing that works the way ordinary people need it to". It's been quite a show.
Thing is, I do know that the X Access Control Extension was standardized in ~2006 and updated through 2009 with the aim to make additional fine-grained access control modules [0] easy. I don't know how long it would have taken to use what existed (or even write something new) and update the major Desktop Environments with tooling to manage it... but I suspect it would have taken far less than seventeen years.
> I noticed so many quality of life fixes...
I'm sure that were I 16, I'd believe that I cared very much about that. Now, -mumble decades later- the fanciest things I want are OpenGL and Vulkan support with performance at least on par with what you get from Windows, a window manager that lets me Alt+mouse-button to move or resize a window, functioning global hotkeys that I can command to run arbitrary programs (and that I can permit any arbitrary program to hook into... permanently), and functioning screen-sharing (that I can permit any arbitrary program to hook into... permanently). And it's so, so silly for me to feel the need to mention anything other than Alt+mouse-button. You'd think that the rest would be "table stakes", but the Wayland development process has demonstrated that many folks disagree.
[0] Ones that could -for instance- prevent undesired keyloggers and screenshot tools
The problem is as soon as you run something new and it doesn't scale properly in X11 you're gonna be making a bug report instead of using what everyone else is using. Currently just with the screen sharing thing it's not even just graphical. There's also updates for Pipewire so you can select the audio output you wish to stream with your video feed. That dialog simply doesn't exist at all in X11. You probably don't even know it exists. It's been feature complete now for YEARS. There's a reason that Valve is using Wayland on SteamOS. It's cause it's feature complete now and they are working on stuff like HDR which won't work at all on X11. I'm guessing that X11 support will start to be dropped in the next few years by major code bases. It's hard for me to even explain some of the bugs I saw with X that disappeared overnight when I switched to Wayland. You talk about OpenGL and Vulkan support but hilariously that's what I'm trying to explain to you has *better* performance now than even Windows.
Just basic stuff wayland has that X11 will never have:
- No screen tearing by default
- Proper vsync
- Lower latency for input → display
- Per-monitor refresh rates (144Hz + 60Hz works correctly)
- Fractional scaling is actually correct (no blurry hacks)
Seriously, move on.
> The problem is as soon as you run something new and it doesn't scale properly in X11...
QT, GTK, FLTK, and friends handle scaling correctly. Perhaps in the future there will be a Wayland-only GUI library, but I'm not sure why anyone would bother when there exist Wayland backends for the major existing ones.
> Pipewire
I don't use it. I use JACK2 with a PulseAudio fallback for Steam games and other programs that don't know how to hook into JACK.
> - No screen tearing by default
If you're using an AMD graphics card, the TearFree option gives you this. If your distro hasn't enabled it by default, then it's two minutes work, and work that I did years ago.
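For anyone curious what those two minutes look like, a minimal sketch, assuming the amdgpu X11 driver (the file name and Identifier are arbitrary):

    # drop a config fragment enabling TearFree for the amdgpu driver, then restart X
    sudo mkdir -p /etc/X11/xorg.conf.d
    sudo tee /etc/X11/xorg.conf.d/20-amdgpu.conf <<'EOF'
    Section "Device"
        Identifier "AMD Graphics"
        Driver "amdgpu"
        Option "TearFree" "true"
    EndSection
    EOF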
> - Per-monitor refresh rates
The rest of your concluding list is just as poorly-informed.

    $ xrandr | grep -A2 DisplayPort
    DisplayPort-0 connected primary 3840x2160+0+0 (normal left inverted right x axis y axis) 698mm x 393mm
       3840x2160     60.00 +  60.00    50.00    59.94    30.00*
       2560x1600     59.94
    --
    DisplayPort-1 connected 1200x1920+3840+0 left (normal left inverted right x axis y axis) 546mm x 352mm
       1920x1200     59.95*+
       1920x1080     60.00    50.00    59.94    59.99
    --
    DisplayPort-2 disconnected (normal left inverted right x axis y axis)
    HDMI-A-0 disconnected (normal left inverted right x axis y axis)
You should only ever be using Wayland from now on.
> The games I play are very fast and twitchy, and milliseconds matter.
Out of curiosity, what games are those? I wonder if I also play a subset of them.
> That's because almost all of the big players in that space
To the OP's point-- there are soooo many games nowadays, that if you and your friend group can skip some of those "big players," there are still hundreds of multiplayer games to play.
Even PVP is starting to “just work” via Proton. Arc Raiders runs just fine on Linux and is a strictly PvP game. Over time I think this will be less and less of a problem.
Arc Raiders is a PvPvE game, like most extraction shooters.
Still has an anti-cheat, they just bothered to allow Linux support.
Companies don't do this out of laziness/incompetence, but even some large anti-cheats work on Linux and some games simply choose to not enable it (cough, Tarkov, cough). Their problem, I'm no longer gonna play games that don't work on Linux.
Funnily enough the best FPS game ever (Counter-Strike) runs absolutely fine on Linux. Thanks Valve!
As far as I know, all the anti-cheat options for Linux are not kernel-level, which means that they are drastically less effective at their intended purpose. That's why so many competitive multiplayer games choose to not enable it.
Vote with your wallet, as the saying goes. If you quit paying money for the privilege of installing a rootkit, maybe they'll stop selling rootkits.
Lot of wallets are voting for AC, sadly. Sometimes the tyranny of the majority is a real thing.
BattlEye works on linux nowadays, so there definitely is progress in this direction!
> ...which the Linux wrappers (mostly the wine project) refuse to support for security reasons.
I mean, several of the major anticheats can be configured to work just fine on Linux. [0] It's up to the game dev whether or not it's permitted. So, yeah, unless the game is one where its dev makes huge blog posts about how "advanced" its anti-cheat is (like Valorant or the very latest CoD/Battlefield games) it's quite likely that multiplayer games will work just fine on Linux.
And if they don't, and the faulty game is a new purchase on Steam, then ask for a refund and tell them that the game doesn't work with your OS. Easy, peasy.
[0] I have 100% solid, personal knowledge that Easy Anti Cheat can work on Linux. On Linux, I play THE FINALS, Elden Ring, and a couple of other EAC-"protected" games without any troubles. I have perhaps-unreliable memories that at least one of the games I play uses Denuvo, which is only sometimes used as anti-cheat but does use many of the same techniques as kernel-mode anticheat.
> I have 100% solid, personal knowledge that Easy Anti Cheat can work on Linux.
That's no secret, but the catch is that the Linux version is much, much easier to bypass. That's why some developers choose not to enable it, or in the case of Apex Legends, enabled it but later backtracked and disabled it again.
> That's why some developers choose not to enable it
That's an excuse. It's mostly incompetence or more often than not the company doesn't think it's worth the effort. With more Linux users, the balance will eventually shift from "fuck them" to "we have to figure out a way".
Kernel-level anti-cheats are considerably more complicated to make for Linux for obvious reasons like lack of ABI stability in kernel space.
Well yeah, it always comes down to money. Even on an indie level Linux support is a commitment.
https://reddit.com/r/gamedev/comments/e2ww5s/mike_rose_linux...
Now if you do care about quality, having a committed, technical audience giving quality bug reports is a godsend. But that's not where we are in this decade, rife with layoffs and rampant outsourcing in the industry.
You’re posting an argument from 6 years ago. Not including Steam OS, the Linux market share has almost quadrupled since then (to ~3.2%); including Steam OS, it’s up to ~24%. And continues to trend upwards.
You also don’t need to arbitrarily support Linux. It’s not difficult to say “this has only been tested on Fedora, Ubuntu, POP, and SteamOS; other distributions are unsupported officially”.
> You also don’t need to arbitrarily support Linux.
Right. The most one needs to do is to support Proton and let Valve sort the rest out.
> ...but the catch is that the Linux version is much, much easier to bypass.
Shrug. Rumor has it that the Windows version is already fairly trivial to bypass.
Oh, it absolutely is; if your product doesn't update its EAC bits regularly then it may as well not use EAC at all. Even still, there are known ways around it.
The greatest PvP game, DOTA, works on Linux, and once you get hooked on that you'll never want to play another PvP game.
> oh game XYZ doesn't run so it's not reasonable for gaming
People tend to generalize, but what they probably mean is "it's not reasonable for gaming for the games I play."
I haven't fully switched over yet because the combo of the hardware I have + the games I play regularly still gives me issues vs. Windows. Getting them to run isn't the problem, but I haven't been able to solve miscellaneous crashes, lag, lower frame rates, etc.
My next PC upgrade will probably be getting rid of my Nvidia 1660 super and getting something AMD for less headaches.
> People tend to generalize, but what they probably mean is "it's not reasonable for gaming for the games I play."
This. The corollary is also that people take such quips way too literally.
I, personally, don't play that many games, and those that I do play tend to run faster on Linux (with an AMD GPU, which I bought specifically to avoid nvidia headaches).
But I still game on Windows. Why? Because I still have a Windows box, "because Linux is not reasonable for photo editing". I actually daily drive Linux, but I can't be assed to move from Lightroom and photoshop, so I still keep a windows pc under my desk. I just play games on it because it's much beefier than my 5 yo ryzen U laptop, and since I don't interact with that box all that much, I didn't feel like partitioning my smallish drive for no tangible benefit. My laptop is more than enough for all my other needs.
Ok, if you want to be stubborn about it then leave Windows on a partition and only start it when you want to play that one game. Problem solved.
In many ways, moving to Linux is like starting to live on your own. Your mommy might be a better cook than you, but is that a good enough reason to keep living in your parents' basement?
Win partition will make you want to cry.
Win insists on BitLocker/secure boot, meanwhile most Linux distros don't boot with it, or you have to go through hell and back to install unsigned drivers (nvidia, gentle-yall).
I’d also say that Linux is like living in a car with 0 euros and saving up for a house. A simple user can scrape by, but moving your dev work life to Linux is much harder than to Mac. VPNs, inconsistent distro support for weird work stuff, and such will make you spend days to weeks of unpaid overtime to get comfortable.
Linux can handle BitLocker & Secure Boot just fine. The problem with dual booting in that configuration is rather that every time Linux updates the boot loader, Windows will freak out and stop booting until you enter the recovery key for BitLocker. This can be prevented by first booting into Windows to disable BitLocker until the next reboot and then installing the Linux updates, but in practice I find that I forget about it all the time with my dual-boot laptop (which spends most of its time booted into Linux).
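If it helps anyone doing the same dance, the Windows side of it can be done from an elevated prompt rather than clicking through Control Panel; a sketch, assuming C: is the BitLocker-protected system volume:

    :: suspend BitLocker protection for exactly one reboot,
    :: then boot into Linux and apply the bootloader updates
    manage-bde -protectors -disable C: -RebootCount 1

PowerShell's Suspend-BitLocker cmdlet takes an equivalent -RebootCount parameter.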
For quite a while I've found that the much easier answer is to have a physical drive per OS and make sure it's the only drive connected during install, or at least one for anything that doesn't play entirely nicely with multi-boot. Obviously there's downsides to that, buying another drive or you might be using something like a laptop which is less friendly to extra drives, dis/reconnecting M.2 drives isn't as trivial as SATA either.
This is a solvable problem, and there are even pacman hooks around to do it for you.
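To show what that looks like, here's a sketch of the kind of hook people use on Arch-style systems that sign their boot chain with sbctl (the hook filename and Target list are illustrative, and I believe sbctl packages often ship their own hook anyway). It only covers the "Linux touched the boot chain" half; the BitLocker half still has to be handled from the Windows side.

    sudo mkdir -p /etc/pacman.d/hooks
    sudo tee /etc/pacman.d/hooks/95-secureboot-sign.hook <<'EOF'
    [Trigger]
    Operation = Install
    Operation = Upgrade
    Type = Package
    Target = linux
    Target = systemd

    [Action]
    Description = Re-signing kernel and EFI binaries for Secure Boot
    When = PostTransaction
    Exec = /usr/bin/sbctl sign-all
    EOF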
But also don't blame Linux. Even your comment says the problem is Microsoft. We need to be collectively mad at the right entity if we're going to get them to change. Otherwise they'll keep bullying people and they've found that they can bully people so much it gives them Stockholm Syndrome, where they feel they can't leave.
https://wiki.archlinux.org/title/Unified_Extensible_Firmware...
Bazzite supports secure boot just fine, it's actually enabled by default. I'm sure others do too.
Secure boot mainly gets annoying if you have an Nvidia card, since the akmod needs to be self-signed. It's not insurmountable but you have to load your keys into the UEFI before it'll work.
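For reference, the key dance is roughly this on Fedora-style setups (file names and the subject string are just illustrative; distro akmod tooling often has helpers that generate the key for you):

    # generate a signing key pair for the out-of-tree module
    openssl req -new -x509 -newkey rsa:2048 -nodes -days 36500 \
        -subj "/CN=Local akmod signing key/" \
        -keyout MOK.priv -outform DER -out MOK.der
    # queue the public key for enrollment; you confirm it in the firmware's MOK manager on the next boot
    sudo mokutil --import MOK.der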
Bazzite builds the Nvidia driver into its kernel, so you don't need to do anything special. Post installation it prompts you to do key enrollment, so all the user needs to do is select "Enroll MOK" and type "universalblue".
I’ll be honest I’m really struggling with this analogy.
Running two systems has cons of its own
Which are?
I've had Windows in one disk and Linux in another for maybe a decade and use the boot selection to pick what I want. Never had a single issue.
Although I haven't opened Windows in months, so I'll likely nuke it soon and give more space for my Linux.
Over the last year or so, nVidia support for the 3+ series of hardware has gotten pretty stable.
With that said, I'm probably going to grab an AMD or Intel card once my 3060 becomes too much of a pain to continue using. It's a little ridiculous that the 5060 gives very little reason for me to upgrade my 5-year-old video card.
I only update my rig every 8-10 years. It saves money, though I tend to then play older games, which is OK for me. I've had a 3080 for 3 years and it still feels like a new card.
FWIW, I've been gaming with a 1660 on Nobara OS for the past 3 months w/o issue.
FWIW my 4070 Ti Super has had zero headaches in Linux. It’s only older Nvidia cards I’ve had issues with. Seems like there was a major driver change starting with the RTX 20xx series.
Up until last week I was running a 960 on mint and had absolutely no problems, nor did I even have to think about drivers. I also have a server running Tesla M10s and they're great too, little more fiddly getting the right driver, but that's moreso on the cards being weird.
Post last week I put in an Arc B580 and I had some issues at the start, but that's more to do with the fact that my workstation has a Haswell Xeon v3... Otherwise it was just turning CSM off.
Linux probably became first-class for them because a lot of ML workflows rely on NVidia in the cloud, and I don't think anyone really uses Windows for that.
It's telling how Nvidia released an ARM driver for Linux, but has not for Windows.
I've gamed since 1979 and have used nVidia on Linux since the early 2000's...without issue.
There have certainly been issues (I've been on Linux with mostly nvidia GPUs since 2004), but they've almost always been caused by the module living outside the kernel, with a kernel update sometimes breaking compatibility, understandably. That has always been fixed quickly on nvidia's end though. And early Wayland issues and the current DX12 -> Vulkan translation performance issues in more recent times.
But overall I've also had a mostly stable experience during that time. New hardware is supported mostly at release. Not always supporting all the latest features straight away mind you, but still. Meanwhile I seem to hear about issues with support for Intel and AMD cards at release frequently in comparison.
Again, all these games are available on console (mostly), so the choice not to support Linux is a conscious one. Those ARE Linux machines. Essentially. (Yeah yeah, they have their own tool chain and rendering.) But if they are using Vulkan, DX12, DX11, and a window, it can run on Linux.
Of course, and it's mostly DRM and/or anti-cheat. The studios want full control over the device running their IP, and they can't achieve that with desktop Linux, but they also don't want to leave the PC gaming market behind entirely to launch exclusively on consoles. Hence why the Windows versions of these games install rootkits on your PC, they aren't cooperating with the PC ecosystem, they are forcibly turning your computer into a locked-down console.
I hope they won't start providing their own tailored heavily locked encrypted operating system versions as a requirement to run their games.
They did, it's called Xbox and PlayStation
Does xbox and playstation run on my custom PC?
and for a good reason. Do you want cheaters, already a bad problem, to become an even bigger one????
Giving users that much freedom makes multiplayer games more unbearable, since it's human nature to compete and come out on top of others ???? Who would have guessed.
Technically the PS5, and I think the Switch 2, are based on the BSD kernel, probably because of the license. Xbox is not exactly Windows, but it's using an NT kernel.
Playstation is FreeBSD, yeah, but the Switch runs a completely bespoke microkernel. Nintendo did borrow the BSD networking stack, which led some to infer from the license disclosure that it runs a BSD, but it's been extensively reverse engineered now and it doesn't even vaguely resemble Unix.
Interesting, I didn't know that! Thanks.
The fun thing about it being a true microkernel is that although there's zero official public information about it, it was small enough to fully reverse engineer and more or less reconstitute the original source code. You can see it here, it's tiny: https://github.com/Atmosphere-NX/Atmosphere/tree/master/meso...
I’ve been trying to train a model to do this kind of work. Take a black box and try to reverse engineer its functions back into something usable (not necessarily identical). Obviously on things that are out of copyright or copyleft.
I find this mostly applies to the competitive games due to most standard anti cheat apps not working outside win32.
> My next PC upgrade will probably be getting rid of my Nvidia 1660 super and getting something AMD for less headaches.
Then you'll have AMD headaches. NVidia is the only accelerated graphics card fully supported on Linux.
You only get acceleration in AMD if you use their binary-only drivers and they only support cards for about a year.
AMD drivers are now open and in the mainline kernel. They dropped their proprietary driver and now use the upstream MESA stack. Nvidia also still suffers from a 20-30% performance drop on DX12 games on Linux, while AMD does not.
It used to be the reverse as you stated, but that hasn't been true since about 2015.
NVIDIA is currently improving as well! Of course AMD is still the safer bet, but I think things look bright for NVIDIA in the future. The kernel driver was open sourced, and they are currently working on the DX12 performance issues.
Okay, but AMD isn't accelerated. It's godawful slow for anything to do with video, and really you just need an NVidia card if you're doing anything to do with video editing or motion graphics.
The built-in amdgpu drivers are awful, constantly crashy and with very poor hardware support of anything more than a couple of years old.
> AMD isn't accelerated
This is a bewildering assertion.
It doesn't have CUDA because that's NVidia-only, and it doesn't have OpenCL unless you use the binary-only drivers, which only work on a handful of very new cards.
What good is it?
You can't use it for editing video.
> CUDA .. nvidia-only
Well, duh.
> unless you use the binary-only drivers
Which you also have to do with nvidia cards.
> which only work on a handful of very new cards
So get a new card?
> You can't use it for editing video.
Yes I can.
----
I actually switched from a 7900 XTX to a 4090 BITD because I wanted CUDA, so I get that angle, but that doesn't mean I go around telling people "AMD isn't accelerated," because it's not true and it's a silly thing to try to claim.
I run both operating systems. But I have to say it either runs the game you want to play or it doesn't. This is especially true if you play games with friends.
> But I have to say it either runs the game you want to play or it doesn't
Can you elaborate on this?
For example, it was convoluted getting StarCraft 2 to run. Then it did eventually work, though it felt ever so slightly laggy.
Anno 1800 ran though it occasionally slowed way down, occasionally crashed, and multiplayer never worked.
Hogwart's Legacy ran but crashed, and ran massively slower / lower quality settings than on the same hardware but in Windows.
All of those were not binary "runs / doesn't".
That's not what I am saying, sorry if it was confusing. The parent was implying that if it doesn't run a game, just pick a different game. But I was pointing out that isn't always an option, and sometimes you just want to play a specific game.
Gotcha - yeah I'm on the same page.
I used Linux Mint for 2 full months, 99% of my personal computing. Really like it. BUT... not all games my gaming group plays work on it, and social gaming is very important to me.
That doesn't mean I'm sour on Linux PC gaming. I think it's great, and will work for a lot of people, and it's so close for me. And I might switch, since my gaming tastes are shifting.
I do understand the premise but … people want to play the games they want to play.
For example I am a good customer for streaming services because I don’t care about specific titles - I will watch a series or a movie because it is available. I will most likely not go through a hassle to watch some specific show if it is not on streaming I already have.
Gaming doesn’t really work like that for me. I usually want to play specific titles - not just some game.
But I fully understand someone has the same approach to games as I have for movies/series.
I'd quote your own example from Anno -> "multiplayer never worked". That's the "doesn't run" part. I always play Anno 1800 with friends. It has been my experience with linux gaming for a while: anything that involves multiplayer usually doesn't work, either because it's just broken (less likely) or because it's specifically blocked by the developer (anticheat, etc.). Reality is, though, that most mainstream games (as in, the biggest player counts and as such the games most people are playing) do not support linux. If my Valorant or League of Legends or Counter-Strike or Rust or ARC Raiders or Marvel Rivals don't allow me to play on linux, then the state still is "linux can't really run games yet".
How do you fix this? I don't know - most of these are the developers refusing support because of anticheat or just support overload, but it's insane to suggest that linux works for gaming when the most played games in the world straight up do not work. I'd love it if linux were more viable though, can't wait to ditch the slowness of windows.
It's like this. You eventually got Starcraft2 to work. That means Linux can run Starcraft2, it's in the "Runs" category. Games like League of Legends, which have kernel level anti cheat, are in the "Won't Run" category.
But you don't want to sacrifice comfort or other things. The game should work just right on Linux.
I have an Nvidia card and use mostly Ubuntu (MATE), also for gaming. It's even a problem now, because I would benefit from a hard divide between the gaming and working/studying system (I have a gaming user in the backlog). On Linux it's mostly KSP and Factorio, but sometimes Deep Rock Galactic, Valheim, Euro Truck Sim, or Warhammer: Total War 1/2/3. These games work flawlessly or with a <10% fps hit.
There are games that kind of work - Ancestors: Humankind Odyssey, Cyberpunk, Hunt: Showdown. But you lose comfort and I'd rather just play them on Windows, than suffer decreased functionality on Linux. I know that some of it (definitely Cyberpunk) is only because of NVIDIA.
When buying games I usually don't buy Windows-only games unless there is a very good reason. And I quit League of Legends and WRC rally because of the anti cheat scam. I feel scammed after putting a lot of money into a game and suddenly losing the ability to play it.
This shifting of goalposts just to cater to linux explains it all.
Come on. If a customer bought a game that says it runs on linux, they should be able to play it on linux well, not just launch it and quit within 5 mins.
I get you have the ideology up in your head, but don't lie and embellish linux to this degree. The attitude just turns people off.
> If a customer bought a game that says it runs on linux, they should be able to play it on linux well
None of those games say they run on Linux.
The fact that you can play most games on Linux these days is due to the Wine developers, Valve, and CodeWeavers. But those efforts are completely unrelated to the developers of those three games. Buying Starcraft 2 is not, in any way, purchasing a Linux game or transferring money to anyone working on Linux support.

- Starcraft 2 is available for windows/mac: https://starcraft2.blizzard.com/en-us/
- Anno 1800 is available for windows: https://store.steampowered.com/app/916440/Anno_1800/
- Hogwarts Legacy is available for windows: https://www.hogwartslegacy.com/en-us/pc-specs

Every game I've purchased that actually says it runs on Linux has worked beautifully on Linux (stellaris and factorio come to mind). Most windows games work beautifully on Linux too, but Blizzard isn't lifting any fingers to make it that way.
Yeah I hope I'm clear in that I'm not "against Linux" or "against people choosing to use Linux." I think Linux is awesome.
And I choose to use Windows for most of my personal computing, due to my gaming preferences, some needs (concussions + poor eyesight means things like scaling and brightness controls and refresh rate matter a lot to me), and my preference for DxO PhotoLab (which isn't Linux compatible.)
"Linux" is really a family of operating systems, so people need to be more specific. It might run perfectly out of the box on consumer/gamer focused operating systems like Bazzite or SteamOS while perhaps requiring more work on something like Red Hat or NixOS. Those different operating systems all have wildly different approaches to how the OS actually works despite generally being able to run a largely overlapping set of programs.
It's like saying something works on "laptop" without specifying whether it's a Thinkpad or a Chromebook or a Macbook.
I can't comment generally but I use NixOS and have had no issues playing games on Steam. The setup was laughably simple, just `programs.steam.enable = true;` and Steam handles compatibility so well that I buy games without thinking "will this run".
Actually there was one thing I couldn't do but this isn't unique to NixOS. I tried to install a GTAV mod that allows you to ride your smart bike trainer in game: GTBikeV. The mod can be installed, but the Bluetooth doesn't work. This is a WINE limitation.
Fwiw I've been playing Hogwarts Legacy lately, though single player. Only problem I ever face is sometimes in a cave if I'm facing a certain direction I'll get blinding light as if I have ray tracing enabled and it's badly implemented. Though considering it's a AAA game and other things I've seen, I don't think that's exactly a Linux problem. Much like Starfield...
I ran Starcraft 2 through Lutris and it was a piece of cake. No lag that I could discern. There was a little mini launcher and everything. The multiplayer also worked just fine, although the matchmaking system seemed to think I was an expert-level player for some reason and kept matching me with dudes who were way better at the game than I was.
To me, this is the one thorn in Linux (and the Linux online community) that gives me pause.
For the people it just works for, well, it just works.
For anyone else, apparently they are the problem? Not Linux?
Well sorry no. I did get StarCraft 2 working with Lutris... once. Then I couldn't get it to start again. Eventually I switched to running Battle.Net from Steam and for some reason that did work. But it wasn't a "just works" or "piece of cake." It was a puzzle.
Maybe the difference is that I am running Ubuntu? Personally I think it's a common mistake for new users to jump on some obscure distro because they read something online where someone says it's the best. Even if that's true there is value in being on a popular distro in that bugs tend to be discovered and fixed quicker and there's almost always someone who has had the same problem you did and often figured out the solution just a web search away.
I think Canonical and the Gnome foundation have made some really bone headed decisions over the years, but I stick with Ubuntu because the mass of users on it means I never get left high and dry. Or at least I'm not alone when I run into a problem.
Yeah, I was using Linux Mint at the time. Which is based on Ubuntu... So that's often where I'd look for help.
Though any kind of documentation is like Linux, scattered and inconsistent. And I'm "OK" with that, as in I think the way that Linux came to be and is maintained, and provides user choice is also the reason why it's not "user-friendly" in every scenario. You can choose your distribution, and a lot of other things. And then look in a wide variety of places for bug reports, user questions, etc. You'll get a variety of answers from "it just works for me" to "change your distribution that you chose" to "even though some guides say to use Lutris, it's easier to just put it in Steam's external program launcher and choose Proton version x.yz."
Even then, not everything will work because it wasn't written to work (for Linux). It was written to work for Windows, and then some smart people rolled up their sleeves and found ways to make a great many things work for Linux, and it's all amazing. And I find using Linux (mostly) quite pleasant. But when things don't work... there's going to be friction. It will take user effort to find a solution, or a solution might not be found.
And for me personally, being someone who really likes to poke and customize and do things my way, Linux is a blessing and a curse, because I can guarantee I'll hit "weird edge cases" like trying to use the online multiplayer part of a game instead of just single player, or try to use my laptop's brightness controls, but they don't work, or I'll want fractional scaling to work, but it won't. And maybe there's a fix out there, or maybe not. Fixes like "it works for me" or "change your distribution", though, are non-fixes. They just frustrate people. If changing my distribution fixes an issue, how many new issues does it create for me?
Not saying you didn’t experience this, but I’ve definitely run StarCraft 2 in the past, and I play Anno 1800 regularly fine (thanks to the mods I’ve been playing it’s even got 50% more sessions than the base game)
Did multiplayer LAN work in Anno 1800 for you out of the box, or did you make adjustments? I couldn't figure out how to get it to work.
StarCraft 2 worked, oddly enough, run from Steam as an external program. (Lots of search results tried to get me to use Lutris/bottles, but I couldn't get it to work consistently under Lutris.)
In Lutris it'll try to run on Wine 8 by default, I had to set it to use the latest Proton GE.
Was also able to get WoW, Diablo 4, WC3 and SC1 running well this way, since they're all in a single Wine Battle.net install.
I’ve done multiplayer internet play rather than LAN play, but that worked just fine without any changes from my part.
Ah yes that's what I meant. But yes unfortunately I could not figure out how to get multiplayer to connect. No idea why or how to troubleshoot and fix.
They don't mean all games through all time; they mean "the latest $70 release", which can still have problems if it's a multiplayer DRM/anticheat-ridden one.
I haven't booted windows in months but there are definitely some caveats for gamers.
On the other hand, those who find contemporary game development trends distasteful might find much to like about the fruits of the Debian Games Team's work on game-data-packager.
https://game-data-packager.debian.net/available.html
The games on that list have native ports that can be integrated into the Debian environment just by installing packages, and the game data packages can be automatically generated from each game's official install media.
It doesn't have to be "the latest $70 release", there are plenty of games that are many years old now that still don't work because of KAC.
This. I’d move to Linux in a heartbeat if certain anticheats for certain competitive games had support for it (i.e. the faceit anticheat).
Play Premier instead! I suck and have high trust so I never see any cheaters.
how do you actually accept having a rootkit installed on your system?
it's easier for me to just have two different systems for work and entertainment honestly
Even easier is to just give up on competitive games, at a certain point it becomes another job and you know... not fun?
I think Starcraft 2 broke me from this habit, once you're studying various metas rather than having fun you need to take a step back and reevaluate.
Studying metas can in fact be fun for people. It's mentally stimulating.
Many of us play with friends and don't dictate every game in the rotation. My time with my friends is more important to me than operating system purity.
Do the top sellers from the past year work on Linux?
I've been meaning to set up Bazzite on an older desktop.
Basically all games work, except some multiplayer games with kernel anticheat. You can look up the status of games here: https://www.protondb.com/
And specifically the state of multiplayer games with anticheat here (which is a much less favorable % of working games):
https://areweanticheatyet.com/
I personally wouldn't install any kernel anticheat on a computer that I intend to use for anything important, so I would personally refuse to install the incompatible games even if I was using windows.
Take ProtonDB with a grain of salt, Apex Legends still has a Silver rating ("Runs with minor issues") despite being 100% unplayable on Linux for over a year now.
"Just trust us, bro! Our security is better than the banks, governments, and major services and we would never let anyone exploit or abuse the gaping hole we're deliberately installing in your security profile! It's just our perfectly secure rootkit that won't ever be used for anything bad!"
It's so weird to me that people just allow this, or even defend it. Game companies should be legally obligated to scale human moderation and curation of multiplayer games, and if you're paying for service that gets moderated and curated, there should be some legal expectation of process - a requirement that the service provider lay out a specific "due process" framework, even if it ends up mediated, that gives a customer legal recourse. Instead, they try to automate everything, which has notoriously indiscriminate collateral damage with no recourse.
If you pour a significant chunk of your private time and money into a game, you should be entitled not to arbitrarily lose an account or gameplay progress because some poorly configured naive Bayes classifier decided you did something wrong, without corresponding evidence or recourse to undo bad bans.
For some reason companies are entitled to infinitely expand their reach without concurrently expanding their responsibilities in providing service to individuals. Must be nice.
From Steam’s 2025 top X charts (https://store.steampowered.com/charts/bestofyear/2025?tab=3)
11/12 top selling new releases (the exception is battlefield 6, because the anticheat blocks Linux)
9/12 top selling (COD, BF6 and Apex block Linux)
11/12 most played (Apex blocks Linux)
So if you’re into competitive ranked games (especially fps), you might face problems due to anti cheat blocks, but practically everything else works
Well I used to game a lot when I was younger.
Initially I hated that Linux was so niche in 2005 or so.
Meanwhile now, I don't have time for games anyway. I still think gaming should be better on Linux, but I don't miss Windows anymore either (though I have it as secondary operating system on another computer; I just don't really care about it, it could die tomorrow and I would not miss it one iota).
The only pain point I've found is VR. I've bounced off trying to get it working multiple times with the best results getting about 10% functional (video working on one or two games, input broken on all).
That said, I haven't tried getting the same kit working on windows so I can't say if it's any better.
VR is rough at the moment, but one would hope that Valve is prepping an overhaul for SteamVR on Linux since they're launching a standalone VR headset which runs Linux soon.
I suspect that this might not ship given the recent dramatic change in memory prices.
I ran into the issue where I didn't know that you can tell Steam to always prefer NATIVE LINUX programs over Proton. This was causing a ton of issues with VR. I haven't gone back to try it yet though, haven't found the time.
It was very broken for a long time. Fairly recently, though, WiVRn arrived (specifically wivrn-dashboard on Arch), with Oculus headsets being the better-supported ones, and I'd dare say it works better than SteamVR used to for me on Windows.
Hardware for flight sim games is also in a similar boat. It's hard to configure most of the newer hardware, but a lot of the old low quality joysticks work alright out of the box.
I have both a Reverb G2 and a Pimax both working great via Monado.
That's great to hear as a fellow Reverb G2 user. Starting with Windows 11 24h2 they dropped all Windows MR support. It looks like there's also a driver called "Oasis" now which restores functionality on Windows.
or DRM for old games that check stuff like the CD being present
I have owned the Index for a few years, running it on Ubuntu/Mint. It is a pain. But VR is a pain generally. I go months without using the thing. Then when I do use it, some bit of software has been updated and I inevitably have to spend an hour getting it to work correctly again. Honestly, VR on Linux feels like using Windows again.
VR is bad because nobody cares much about it. The hardware is clunky, the market is tiny, and the costs are high. As the hardware improves it will get more attention from the FOSS community, and so too will the overall experience.
Does a game "run on Linux" when it has 100% feature parity? 90%? 80%? What are you willing to cut? Some performance? A few graphical effects? Multiplayer?
When you look at the details, Linux gaming is not as good as it might seem.
But I'm still gaming on Linux!
I can't remember the last time a game did anything other than run (this is with only trying games that have been documented to work). I think the worst I've had is audio not working in cut scenes in some game, but I don't remember what game it was.
What you sacrifice in feature parity, you gain in user freedom and principle. To me, that is a worthwhile tradeoff. Especially since it's really not that much different at this point. You're not sacrificing much in most cases now. It's really quite remarkable.
> More games run on GNU / Linux than any gaming console.
Not for long. The Steam Machine aka "GabeCube" is also a gaming console, and will run all these games.
You and most of the other people in this thread clearly do not understand what's going on here. I and everyone else you're griping about do not give a shit about Slop Spoogers 7 from 1998 running great on Linux, we care about the games that we play with our friends being playable.
https://www.protondb.com/explore?sort=playerCount
This is what matters, on Windows every single one of these is Native. Switching to Linux will be painful at best until every single one is at least Gold if not Platinum or Native.
Most of the games I play would work fine, but it’s the damn anti cheat and multiplayer games that forces Windows down my throat, and I’m not happy about it. I only use my gaming rig for gaming so I have no other requirements, which kind of makes it even worse.
I play multiplayer games with anti cheat all the time. The only ones that don't run are straight up malware.
I’m not disagreeing, it’s just how certain very popular games operate nowadays. I would never play them on a computer I used for anything but gaming.
There it is, the classic “just change what you enjoy then!!”. Linux will take off when the community stops trying to force new users to conform to the Linux way of life and instead respect that other people have other needs and wants that are valid, and not a moment before.
While I agree it's unreasonable, it's also kind of a chicken and an egg thing. These things won't change until Linux becomes big enough to ignore. I'm not sure what the solution is though, as I don't think it's realistic to make people give up what they enjoy to get there. That's not gonna happen. But Valve has at least made a dent with the Steamdeck and Proton in general, and maybe more with the upcoming Steam Machine. Devs actively target the Steamdeck nowadays for games where it makes sense, so it is taken into consideration at a whole new level compared to years past.
Not much else to do. You either convert people, convert the companies to support Linux, or convert the government into cracking down on whatever makes it difficult for Linux to be supported. The latter is highly unlikely, and the 2nd only cares if people shift their habits.
So there's only one channel left.
Games are more and more consolidating towards services, so it really only takes one game for the lion's share of gamers. You can bet GTA V is a big draw away from Linux and that GTA VI will eventually be the same when it hits PC.
As for me, I'm still stuck for professional reasons. I do intend to develop natively on Linux when time comes to make my own game.
I've been using Fedora+KDE for over a decade, Windows 8 was last version of Windows I had installed at home, and we all know what a squarified mess that was.
Gnome is fine, but it's just not for me.
For everyone on here who complains about Windows requiring an 'online' account: macOS does as well, but the perception is different. macOS just kind of quietly does it, with no ceremony, while Windows makes a Ballmer-esque right-in-your-face demand. I couldn't possibly comment on Windows 11 as I've yet to use it, but Win10 felt a lot worse than Windows 7, which was probably the last high water mark for Windows after Windows 2000.
Plasma 6 is really polished and simple. I think anyone familiar with windows would be able to grab and run with it immediately.
No hate for anyone that likes other desktop environments, I as a long time windows user just really appreciate how familiar KDE feels.
The familiarity is great but the thing that really draws me to Plasma over Gnome is that the KDE developers seem to have an attitude of just implementing the features people want even if it's not perfect yet. Gnome is polished, but it's missing so many basic configuration options out of the box.
It's kind of funny because when I first got into linux it was practically the opposite story. Back in the day of KDE 2 or 3 and Gnome 2, KDE was the slow one to bring in features while Gnome felt like the wild wild west.
Now it seems like Gnome has gone down a practically walled garden path which I don't love. Last I tried it, I wanted to launch an app focused and in full screen on startup. The gnome response for that was basically "You're not allowed to do that".
To be fair, it has a thousand different settings and you have to manually click Apply to see them in action.
It's flexible and popular, but I don't know that I'd call it simple. It still feels 90s in a lot of ways.
Afaik, you can choose to not sign into icloud when creating an account on your mac. It's not a hard requirement like it is on Windows, though they do obviously strongly nudge you to login to icloud.
I didn't know that. Thanks. Setting up my mac once in 5 years means it isn't a screen I've seen very often!
At least on the latest Sequoia, there has been no hard requirement for an online account. They nudge you towards it, but you can decline and continue. As far as I can remember, macOS has never required an online account to set up a Mac.
You might need it for the App Store if anything, but even then... You don't need the app store for installing software. Mac is at its peak currently, though the new glass UI stuff is a little over the top for me. I miss the old simpler UI. I'm sure I'll get used to it eventually.
I only use KDE, though it has weird instability from time to time. They just changed which gcc version I'm on so I am not sure if I've noticed the same instability or not. Overall though KDE is the perfect DE for me.
The Mac 100% does not require you to sign in to an Apple Account to use it. You can go about with a local account only just fine and you can easily do it right from the OOBE setup UI — no tricks involved whatsoever.
> Windows requiring an 'online' account, MacOS does as well
This has never been my experience. Is that new in Tahoe?
It isn't. There's no such thing in macOS. Local and iCloud accounts are not necessarily linked, never been.
Yes, as pointed out, I was mistaken, but then in my defence, I've only ever set up one Mac, 5 years ago, so I've only seen 'that screen' once.
> if you don't sell your games on Steam or in a way I can run them on Linux I am not buying or playing them.
Agree 1000%, and recently Steam Community Support pissed me off, so I am now looking into GOG (I have my first GOG game now and am playing it), Epic and Luna. In fact, the GOG game I got was free through Luna, ironically. Even more ironic: the excellent Heroic game launcher lets you mark the game to show up in Steam; then, when you start Steam and run it from there, it uses the config settings from Heroic, but you can use screenshots, etc. in Steam.
The gaming landscape on Linux is great, except for those companies that refuse to support anti-cheat.
I run Kubuntu btw (and Ubuntu since 2006).
PS I keep Snap disabled.
While you _can_ use their launcher, you don't _have_ to. Once you buy the game, you can just download the installer and run it to install on your box. If you want you can save the install package somewhere if you think you'll enjoy the game for years to come and don't want to be dependent on GOG.
I also found out that they have quite a few fairly recent games. Maybe not the top-10 big budget (however, they partner with CD Projekt Red, so they do have Cyberpunk), but they have plenty of solid indie games from 2015 - 2020, and some more recent.
This was me in 2005. I can't believe people say that M$ started to suck in 2025. It always did.
I suspect this is less about when Windows declined and more about individual computing journeys. Early exposure (home, school, work) tends to set a baseline that’s hard to shake.
Microsoft had really good engineers and talent. Microsoft internally has gone to shit. They hire an army of H-1Bs and all the talent has left. Shell of a company on the Windows side, as anyone working with them can see. It started a couple of years ago, but it's really gone off the deep end and will just get worse. I say this as a Windows expert and someone who thinks Linux is crap.
>This was me in 2005.
Ha, same. Windows XP for me had a horrible habit of booting into a blue screen randomly after updating video card drivers (happened with both ATI and Nvidia). Trying to do a repair install wouldn't work. The only option was a full reinstall.
Installation from the disk took an hour. Then (if you were going about this the legal way) you'd have to call the Microsoft number to register your install, but be on hold for another 30 minutes. Then it was multiple hours of: install your favorite video player, reboot. Install video codecs, reboot. Install Firefox, reboot. Apply all of your registry tweaks, reboot. Install all your games from CD-ROM, more rebooting. And multiple hours of that.
I moved to linux back in 2006 or so and never looked back. Documented part of the journey here https://net153.net/ubuntu_vs_debian.html
I started using Linux in like 2007 but the GPU was always an issue. Then it was running games. Linux changed for me around 2013+ when I would install it on my laptops and get a heck of a performance boost. Heck those laptops still turn on to this day. Windows just bloats all hardware.
Been on / off Linux for the desktop since about the same time. Recurring theme across my AMD and NVIDIA gpus. Support has always sucked!
Over the years it felt like a game of whack-a-mole finding the right combination of driver versions, open or closed source. R9 390 owners back in the day will understand... Fast forward to now, and the same problems keep occurring, albeit better off than they were.
It's been an unfortunate recurring issue for me as well. Recent hardware is much better about this, and I too have seen the performance bumps at the cost of software compatibility. I feel like if Adobe brought their CC suite to Linux I'd have no reason to ever use Windows outside the random game that _needs_ it.
> I started using Linux in like 2007 but the GPU was always an issue.
Were you running Nvidia hardware? I've been running Linux since like 2000-ish, have always run ATi/AMD hardware on my desktop machines, and (aside from overheat issues brought on by the undersized replacement fan attached with bread ties to that one board) haven't had troubles. On the other hand, I don't suspend my desktop or servers to RAM or disk, so maybe that has intermittently or always been broken... I'd never know.
I've only had Intel hardware in my laptops, and I can't remember ever having trouble suspending those to RAM or disk.
My first distro I booted from was Ubuntu 4.04.
This was me in 2006 as well. Long live Edgy Eft!
Yes, but it took some time before the suck became bad enough that people started to notice, and those people weren't tech people.
Most people had never even heard of Linux. It has taken a lot of very bad things on Windows for it to get to this point. It’s classic frog in a slowly heating up pot territory.
>Most people had never even heard of Linux.
My experience is that people fear Linux, rather than not knowing about it. I've been the lone Linux user since c. 2005, and people see that half my screen is always a console, the other half a browser. So they fear Linux is for console wizards, not for regular users. Nothing will convince them otherwise, even when they are using online webapps 100% of the time. I have some coworkers using browser + VS Code + WSL2 all the time, but they don't switch because they fear having to configure everything from the console instead of a Control Panel.
I don't know, man. In my experience, people make no distinction between "Windows" and "the PC". I think the vast majority of "regular people" have no idea there are alternatives to "Windows", other than "Macs".
So much of it is a problem of execution. If people could use Linux without ever having to know what a terminal is (much like the average Windows user doesn't know what PowerShell is), then it would actually be quite successful. It has gotten better over the past decade, but it still suffers from endless paper cuts and the odd issue that requires a shell session to fix. I will say that Valve's SteamOS has come the closest to avoiding this trap. You can use a deck without ever having to touch a CLI.
> Steam and Proton work perfectly
I am a hardcore DayZ player. DayZ does not work on Proton[0]. I cannot use Linux as my main gaming platform. Battlefield 6 does not work. Latest Call of Duty does not work. You can talk about voting with your wallet, but when millions of people are buying the game, your one non-vote means nothing.
So either you punish yourself and refuse to play with friends, or you punish yourself and install windows. It’s a damned situation regardless of your choice
[0] point me to as many compatibility databases as you want, the game will not start on my vanilla Ubuntu build
This is really just a subset of competitive shooters. Arc Raiders, The Finals, Hunt Showdown, Halo Infinite all play fine.
I have a Windows drive for Battlefield but I stopped booting into it after interest in the game waned.
Playing on console is also an option. Most games allow you to alternate between keyboard/mouse and controller. Discord works fine, and every game is cross-play.
Oh ok. I’ll just stop playing my favorite games
If it doesn't work for you, it doesn't work for you, but don't assume everyone values DayZ over control of their system.
Sounds like it might be an issue with your setup, considering that other people have no problems running it. Hard to tell what the problem is, but definitely a frustrating situation.
Isn't Call of Duty the game where Nicki Minaj shoots the cat from the Simpsons? I think I'll pass.
That's Fortnite, and while you can pass, I don't want to pass... I want to play it!
This is why a lot of people run Arch, and why Valve based SteamOS on Arch instead of Debian like the previous version was: you need a newer kernel and other packages to really play games on Linux with the least friction possible.
That's not a kernel issue, it's an anti-cheat issue. No kernel except the Windows kernel is going to allow him to play Battlefield and CoD.
Yeah, yay works until it doesn't anymore, because the pacman library dependency it uses was updated but yay was not... and then you need to recompile yay manually. I mean, I'll still use it (or rather paru, which works basically the same way), but it's very annoying, when it happens every few months.
You can download a precompiled yay/paru from their Github pages btw.
I don't understand, yay updates itself. I've never once had this problem.
That's assuming you do system upgrades through paru/yay. However, you may not want to upgrade the packages you've obtained from the AUR and so you upgrade using pacman. That may cause the updated libalpm to become incompatible with the installed yay/paru.
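For reference, getting out of that state is just the normal manual AUR flow — a minimal sketch, assuming git and base-devel are already installed:

  # rebuild yay against the currently installed pacman/libalpm
  git clone https://aur.archlinux.org/yay.git
  cd yay
  makepkg -si    # build the package and install it via pacman

The same steps work for paru, and both projects also offer prebuilt -bin variants in the AUR if you'd rather skip the compile.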
yay used to be in the official Arch Linux repository for some time, wonder why it was removed.
Iirc it was to force the extra step necessary for the user to acknowledge that the AUR can bootstrap malware if used blindly.
This seems to be a relatively consistent discussion surrounding AUR helper development; for example, adding UX to incentivise users to read PKGBUILDs, lest the AUR becomes an attractive vector for skids.
No one wants the AUR to become NPM, and the thing that will incentivise that is uneducated users. Having the small barrier of not having helpers in the main repos is an effective way of accomplishing that.
https://wiki.archlinux.org/title/AUR_helpers
AUR helpers like yay are not supported officially. The other commenter sheds some light as to why.
I assume they mean having to recompile the AUR package they were trying to install using yay.
If a user's mental model is mostly "yay is like pacman but can also install packages from the AUR the same way", without thinking deeper about the difference, then I think using it is very risky and you should just stick to pacman + git/makepkg. Only consider helpers once that's become second nature and routine. Telling people to "just yay install" is doing them a disservice. An upgrade breaking the system isn't even that bad compared to getting infected with malware because an old package you were using was orphaned and hijacked to spread malware, or getting a bad copycat version due to a typo.
I think EndeavourOS is doing users a disservice if they provide something like yay preinstalled and ready to use out of the box. It isn't installing packages from a shared repo: it's downloading code from arbitrary locations and running it on your machine in order to produce a package. Being able to read and understand a shell script (PKGBUILD) is kind of a prerequisite to using it safely.
> idk why Arch doesn't invest in whats standard in every other major distro
They largely have now, archinstall.
It's still text based/TUI but it's pretty simple and intuitive, anyone already familiar with installing a Linux distro (especially any sort of -server variant) will be comfortable with the archinstall script.
Came here to say this. Archinstall rocks.
Regarding why Arch doesn't "invest" in a graphical installer, it's worth mentioning that Arch's installation image has a different design philosophy than most installation media.
The image is a fully functional arch environment that copies the entirety of its contents to RAM on boot, giving you special installation opportunities such as the ability to install Arch to the same flash drive that booted the installer. Having no graphical dependencies lets this image remain small enough to pull this off, as well as allowing for fully remote installations over SSH out of the box, since archinstall is a TUI.
I don't believe there are any serious technical obstacles to providing a graphical installer in something like an initramfs environment. Many distros do provide graphical installation mechanisms using PXE, which loads the kernel and installer-initramfs over the network (and is similar in the sense that it won't touch local storage unless you tell it to)
I don't have a way to quickly check right now, but I thought the Arch install media used squashfs? In which case I wouldn't have thought it was safe to blow away the backing store.
> anyone already familiar with installing a Linux distro (especially any sort of -server variant) will be comfortable with the archinstall script.
To be fair, that's not _generally_ the audience we tend to think about when we talk about the enshittification of Windows. We're usually talking about regular consumers / computer users and "gamers", the latter of which covers a wide range, from people who can fend for themselves given instructions to people who cannot.
Fair enough, but I wouldn't generally direct that audience to vanilla arch linux as "gamers first distro" anyway.
I'd direct them to something like Bazzite (Immutable), or CachyOS for staying arch-based but providing a GUI installer and tools, Endeavor OS, even Fedora, etc.
Agreed. I know in some circles it's a meme, but if the Steam Gaming Console actually makes a debut any time soon, I think we'll see more of a jump from the "Gamer" crowd away from Windows. My (some say naive) hope is that it will make game devs try to design games that aren't only locked in on Windows and have more Proton support.
It's really a (good, IMHO) sign of the times that us old hands have to remind ourselves that most newcomers to Linux today aren't necessarily adept at installing another OS, let alone using the command line. The first time I installed Arch was maybe four years ago, but the very first dual boot setup I made was between Win 3.1 and OS/2 2.1 in 1993 when I was 10, and I've been playing with Linux since the mid-late 90s. When I first installed Arch the "hard way" I said to myself, "I don't understand why it has this reputation... this is all stuff I've done before countless times." Frankly, I'm still trying to figure out the distribution graph of Linux knowledge and how to engage with different skill levels.
I agree. I also think that not everyone (I couldn't say if this is generational, I see this among peers sometimes too) has the same appetite for problem solving. People hit a problem or a wall and say "So I tried X and now I see Y. I dont know what to do" and then they just sit there. The reason that LMGTFY and RTFM come off as "elitist" is because people are frustrated by others' willingness to just "stop trying" whenever they hit a road block.
Not that this is going to matter to you because you've left Windows behind, but I refuse to buy License Keys any more and I try to steer people away from buying "Gray Keys" to avoid the ridiculous costs. Using the MS Activation Scripts[0] is the much better go-to.
[0] - https://massgrave.dev/
Given the push to monetize user data, it seems Microsoft is de-emphasizing their focus on key piracy. I bought a computer with a 55" touch screen. The company selling it said it was a Windows 11 computer. The computer was a 14-year-old Intel CPU/mobo that was never designed to run Windows 11. The company selling it had hacked Windows to run on this old computer. They didn't have a license key. I reported it to Microsoft and... crickets. The company ghosted me on the issue. In 2003, with XP in its prime, they were cracking down hard on piracy... now it's part of the business model...
Absolutely. I would also think that the amount of money "lost" on license keys specifically on the "regular consumer" side pales in comparison to the data that they get once you're on their operating system. How many non-power users bother with disabling telemetry and other data that MS collects through their operating system? How many people bother configuring a Local Account? All of that is probably worth way more than a ~$200 license key.
On the business side, businesses make it a focus to be in compliance with licensing agreements so they still see whatever oodles of money from companies that have fleets of computers that run Windows.
Because it's supposed to be stripped down, to serve as a base to create things like Endeavour, Manjaro, or Cachy.
> idk why Arch doesn't invest in whats standard in every other major distro
There's still a lot of utility to doing things the hard way. I do suggest people who want to actually learn Linux install Arch and live in the terminal. You learn a lot very fast because you're forced to. But it's not for everyone and that's totally okay too. That's the beauty of Linux after all. That's the beauty of computing. You can't build a product for everyone, but you can build an environment that can become what anyone needs.
But I'll second your point. I've been on Endeavour on my main machine for about 3-4 years now and only had one problem where I just got a mismatch in a new kernel and new Nvidia driver so I couldn't load the desktop. Easy rollback (from the cache) and a day or two later the issue was solved so I could upgrade without a problem. Took no more than 10 minutes to solve and that's the worst problem I've had the entire time. I will also give the advice that if you have an Nvidia card give your boot partition like 5GB instead of 1GB
I don't think it is because they can't do it or that they want to be a base for other distros. They simply let the user choose what the user wants. And if you don't know what you want then you learn it.
I switched to arch 15 years ago to learn Linux. And it is by far the best way to understand it.
Having used Arch I can easily maintain almost any distro out there, but it doesn't work the other way around.
I think this is an important thing to recognize. It's exactly why I tell people who want to learn Linux to do it (but not people who want to use Linux). The struggle is real, but the struggle is part of the learning process. The truth is that distros are not that different from one another. The main difference is in the package manager and the release schedule of their package databases.
> Having used Arch I can easily maintain almost any distro out there, but it doesn't work the other way around.
I'd also like to tell any Linux newbies: the Arch Wiki is your best friend. It doesn't matter if you're using Ubuntu, Mint, or whatever. The Arch Wiki is still usually the second place I go when I need help. The first is the man pages (while there's some bad documentation out there, it is quite surprising how well most man pages are written; Linux really has shown me the power and importance of writing good documentation).
I switched my gaming laptop over to CachyOS (which is more or less "Arch with some good defaults for gaming and a curated runtime environment") because I literally couldn't play Stellaris on my $1800, year-old gaming laptop without regular hard crashes that locked up the entire system and required a hold-the-power-button-down hard reboot. This is apparently a rare but known issue on the Paradox forums, affecting many of their games, and it seems to be due to some problem with the 24h2 windows update on some machines, but there's no clear resolution. Eventually I got mad enough to just pave my entire gaming laptop and switch wholly over to Cachyos.
Since switching, I have not experienced a single problem with Stellaris, even running larger galaxies in longer games with more mods. I haven't had any compatibility issues or bugs or anything with my other games either. It was so painless that I switched my desktop over as well, and I no longer have a windows device. I've been really pleasantly surprised by how many games support Linux now.
I installed Fedora yesterday. Instead of Steam, I am hoping that GOG with the Heroic games launcher will work nicely. Idk, I want to support DRM-free software, so if it's on GOG, I buy it there.
[obviously YMMV, take me with a grain of salt etc]
I actually tried Fedora first (thinking dev-first workflows) but ended up switching to Ubuntu w/x11 for gaming. A lot of that had to do with Fedora's release schedule (rather than Ubuntu's 2-year LTS) breaking working GOG/steam/wine-based apps on a rotating basis. Since switching to a defaults lifestyle / Ubuntu with x11 I deal with NVIDIA driver compatibility issues every 6 months or so instead of once/month. The 22 -> 24 upgrade was better than I expected and I didn't lose more than a couple of hours of life to appease the shell gods.
In any case Fedora and a once/month problem would still beat the Windows update nonsense, which I am still supporting since my spouse hasn't switched yet :/
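On the recurring NVIDIA driver pain specifically: one hedged option on Ubuntu is to pick one driver branch and hold it so routine updates can't swap it out underneath you. A sketch, assuming Ubuntu's versioned nvidia-driver-NNN packaging (550 below is just an example branch):

  # show the driver branches Ubuntu offers for the detected GPU
  ubuntu-drivers list
  # install one branch explicitly, then hold it so apt upgrades leave it alone
  sudo apt install nvidia-driver-550
  sudo apt-mark hold nvidia-driver-550

It's a trade-off: you give up automatic driver bumps in exchange for fewer surprise breakages between point releases.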
I've used Ubuntu since 2006 and started using Kubuntu (I prefer KDE) about 2 years ago. Ubuntu (or Kubuntu) are very solid for gaming. It puzzles me how often I see highly customized distros like Bazzite and CachyOS touted for gaming after looking into some of the wild tweaks those distros do; it's amazing to me that they run at all.
PS I keep Snap disabled.
What wild customisations are you talking about?
As someone who used Linux (Ubuntu, Fedora, OpenSuse, Arch) exclusively from 2010 and recently moved to bazzite, I only see positives from the switch.
Most of my usecases work OOTB, and for everything else I use a container workflow. I like that there are fewer ways to mess up upgrades. I like that flatpaks are well integrated.
Fedora Silverblue user here. Lutris (from flatpak) can play GoG games fine (*).
(*): Apparently achievement support even on single player games requires the gamestore client (GoG client in my case) and Lutris doesn't support that yet. Am old enough to not care :p
> idk why Arch doesn't invest in whats standard in every other major distro
It could be a deliberate measure to set the bar high and filter out people who don’t want to troubleshoot themselves.
The arch advantage is that your system gets setup exactly how you want it and you have to consciously choose the software set you want to work with.
That minimalism is somewhat the point of the OS.
> (idk why Arch doesn't invest in whats standard in every other major distro)
Trust me it was far more involved of a process 10 years ago, and that's why people liked it.
The modern install process is pared down to something like 10 steps. Start the ISO, configure your partitions, mount your root and boot, and use the delightful arch-chroot tool to enter and install in those partitions. Set up your user, configure your boot manager, exit the chroot, reboot, remove the install media, and boot into your bare-bones system.
The install ISO has all the networking drivers and other tools you may need to bootstrap your new install; you just need to remember to do it. It's obviously not for total newbies, but it's no Gentoo, LFS, or even old Arch.
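To make those steps concrete, here's a heavily abbreviated sketch from the live ISO — assuming a UEFI machine with the ESP on /dev/sda1 and the root partition on /dev/sda2 already created; the wiki remains the real guide:

  mount /dev/sda2 /mnt                         # root filesystem
  mount --mkdir /dev/sda1 /mnt/boot            # EFI system partition
  pacstrap -K /mnt base linux linux-firmware   # install the base system
  genfstab -U /mnt >> /mnt/etc/fstab           # record the mounts
  arch-chroot /mnt                             # then: locale, hostname, users,
                                               # boot loader, exit, reboot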
My first distro was Slackware, which I set up all by myself. I just don't see any true value in what could be a simple GUI.
I'm currently on Pop, but have an install of Cachy ready once I have some time and a stable connection. My main gripe of Pop (other than the COSMIC issues) was mostly audio issues with how they set up PipeWire and regressions with some releases. Do you find Arch to be a bit less of a headache when dealing with drivers?
There actually is an installer for arch. I haven't tried it myself, but there should be an application included on the ISO called archinstall which helps with basically everything that's part of the install guide on the Arch wiki.
I use Arch but don't want to fiddle with stuff anymore.
Installing via the archinstall command was pretty easy. Not quite as easy as a Fedora or Ubuntu install, but for someone familiar with Linux, it's negligible.
Yeah, my most recent Arch install was my last. It was fun to rice my system, and setting up everything from scratch 4 or 5 times taught me a lot about operating systems and computers. Ultimately my setup is not significantly different from any other distro's; it's just that I installed the packages and did the configs myself. I'll be fine with a minimally riced system, if I ever even need to install an OS again.
My gaming needs are more than fulfilled with Steam/Proton and Xbox Live. Both of them work on Linux (Mint is my flavour of choice) and Mac.
> I know its a "meme" to talk about how great Arch is, but when you want the latest of something, Arch has it
I love my Arch installs to death, but I feel like I'm the odd one out when it comes to the mess that is the AUR. The main repositories have a lot of things, but I always end up getting pushed to the AUR, and then it just feels like I bolted on a hack, rather than pacman/the Arch base supporting the AUR as just another package source.
My spare PC runs Win10. Was able to install it without internet and thus get an offline account.
Since they stopped full updates for it, it's a lot less annoying. Almost all the nags were at reboot time, usually triggered by the update giving it a new thing to nag about. Only thing now is it'll ask me once a month about either OneDrive or Win11, which is bad but tolerable.
What made you switch from Pop OS? I just installed it on a couple of old PCs I had lying around for my kids to play around with/learn from.
There was some 3D printer slicer software I needed that wouldn't run; when I finally figured out why, it had to do with GLIBC being out of date. I have used Debian since like 2008, and Ubuntu since the mid 2010s, so I am accustomed to doing PPAs and whatnot, but something in me broke and I wanted to finally try something more bleeding edge. I nearly went for Fedora, but the version I wanted to try didn't even boot (I don't like to waste any time on command line incantations anymore), so I looked up EndeavourOS. I don't remember how I found it; I think a friend said someone they knew used it (turns out they don't, LOL), so I gave it a shot.
I had bad experiences with Arch before because of Manjaro, but in hindsight, I think the main issues I had were more to do with how pacman can get insanely nuanced. When you update packages you have to know what you're doing, or it will update things in weird ways; it's not like Debian or Ubuntu upgrades, which install/uninstall what you do and don't need unless you tell them to be more nuanced.
Long term stability is less important for gaming computers than having the most cutting edge (and theoretically highest performance) drivers. That's why the community leans so heavily towards arch.
Probably same reason most folks who are capable of running Linux don't stay on Ubuntu, etc.
I'm genuinely curious as to what the key differences are (especially those that would cause someone to switch), as someone who is pretty tech savvy but whose use of Linux as a daily driver is admittedly pretty weak.
You usually try a few distros, until you find the one that does whenever you needed, and then you stick with it for 15 years ;)
From my own experience: 15 years ago, when (except for academia) Linux was very niche, it was hard to use. Random rare errors would pop up. On Windows you would know someone who knew what to do, but with Linux? So I chose Ubuntu, because it had the most support. A solution to any error could be found on the askubuntu (?) forums. But if you had a friend, you would choose his system and get help from him. I once had university admins very happy to help me with something and even give me some tips.
Nowadays it really doesn't matter that much, other than extra-easy installation (with an LLM everything is already easy) of drivers (Pop!_OS?) and of the initial programs you used on Windows (on MATE it takes 10 minutes thanks to a special GUI app store).
BUT there are reasons to switch. Like Ubuntu's pushing of very annoying snaps, making it very hard to get Firefox without a snap. Snaps are annoying because they don't have a cleaning mechanism and old versions just clog your hard drive. They take forever to launch, and it's just not a good idea for a browser. I don't mind snaps for other things. There is also desktop environment support and support for hidpi monitors and such.
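To be fair, the disk-clogging part is at least scriptable. A hedged sketch of the commonly shared cleanup (adjust the retain count to taste):

  # keep at most 2 revisions of each snap going forward
  sudo snap set system refresh.retain=2
  # remove old, disabled revisions that are already taking up space
  snap list --all | awk '/disabled/{print $1, $3}' |
    while read name rev; do sudo snap remove "$name" --revision="$rev"; done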
Other than that, there is a bit of philosophy. Like super-FOSS and idealistic, like Debian (I guess? Please correct me if I'm wrong). Or more business-aligned, like Red Hat/Fedora. Or elitist distros that like to waste their users' time and make them read manuals for fdisk, like Arch, where you have to format your hard drive without GParted or any other GUI.
I'm no pro, but that's a little that came to mind if you wanted to know what mattered in the past.
Not OP, but for some it might be the availability of the latest package versions (say, you've heard about a new major version of Bash or Vim being released today, and you're wondering how soon it might show up in your distro's packages), and, as someone else mentioned, less update stress due to the lack of "major version bumps" - just remember to subscribe to https://archlinux.org/news/ and watch out for entries requiring "manual intervention".
I would say EndeavourOS is the "Ubuntu" to "Arch" if you will. The installer is easy, and it comes with "yay" out of the box which is a frontend to Pacman which holds your hand in just the right ways. If I want to update my OS I type "yay" into a terminal, hit enter and confirm the packages needing updating (or select which ones I want) and type my password, and that's it. In the past with Manjaro I did a system update with Pacman, and problems ensued.
Folks capable of running linux pick the best distro for the job at hand. They are tools, there is no progression like you're implying.
My homeserver is Ubuntu, my gaming PC is Arch.
If I am understanding correctly, you were using Windows without a licence? I think that's more the problem here, as Windows does provide a way to have offline accounts, you just didn't want to pay for it.
I would love to switch from Mac. But Mac hardware is so resilient, and I haven't seen that in the PC world.
I just got a new work laptop: the ThinkPad X1 Carbon gen13. It's gorgeous: weighs a bit over 900 grams, has an amazing matte OLED screen, Intel Lunar Lake that sips power (1-2W idle) and is fast enough to compile Rust if needed, amazing keyboard, touchpad is great but I just use the trackpoint, everything works from the box on Linux (they even deliver it with either Fedora or Ubuntu, but I installed CachyOS).
Suspend: works always. Battery life: great, the whole day. Wifi: works always, connects fast, works fast.
The build quality is really nice, especially the carbon fiber body that doesn't feel so cold/hot to touch.
If you have an older Mac (based on an Intel CPU), then it may actually already work out of the box for you to run Linux. I'm running Debian on a MacBook Pro 2015; I fully replaced the original system and I haven't looked back.
Dell, HP and Lenovo have been phenomenally resilient for us, going back more than 2 decades.
You can run Linux on Apple Silicon with Asahi Linux
There's a whole lot of asterisks that you're leaving out of that statement.
M1 and M2: yes, with slight caveats. M3-M5: not really (at least yet).
What do you mean by that? As a long term windows user I've never had any issues running my laptops and PCs for years and years.
> gave up with Windows 10 because you needed Windows Pro in order to make an "offline" account, I spent $2000+ for a gaming rig,
If you are spending $2000 on a gaming rig, Windows Pro is like $200. Makes no sense.
Also, Apple is no better than Windows, so your post doesn't make sense.
The only game I regularly play refuses to pay their anti-cheat vendor for Linux support. After Windows 10 support ends, my gaming days are probably over.
> idk why Arch doesn't invest in whats standard in every other major distro
Simplicity, among other reasons. Installers force the user's hand and need maintenance. Having no installer, but rather a detailed installation guide, offers unlimited freedom to users. Installation isn't difficult either: you just pacstrap a root filesystem and configure the bootloader, mounts and locale.
ArchLinux does now have an installer called archinstall, but it's described more as a library than a tool. It allows you to automate the installation using profiles.
Just to paint an example, if I am installing Arch I like to have:
* A user configured through systemd-homed with luks encryption
* The limine bootloader
* snapperd from OpenSUSE with pacman hooks
* systemd-networkd and systemd-resolved
* sway with my custom ruby based bar
* A root filesystem in a btrfs subvolume, often shared across multiple disks in raid0
If you were to follow the installation guide it will tell you to consider these networking/bootloader/encryption options just fine. But trying to create an installer which supports all these bleeding edge features is futile.
Also if you want 'Arch with sensible defaults' CachyOS is basically that, people think of it as a 'gaming distro' but that's not an accurate characterisation. I use it as a daily driver on my personal machine mostly for non-gaming work and it's an excellent distro.
There is the TUI installer now, though; it's not like it used to be, where the commands were typed in by following the wiki. Not that there was anything wrong with the 'manual' mode; it gave you insight into the basic building blocks/configurations right from the start.
It's been a very long time since I moved to Arch, but I swear that something like 12 years ago it did have some form of menu-driven installer.
Nowadays, there are so many ways to partition the drive (lvm, luks, either one on top of the other; zfs with native encryption or through dm-crypt), having the efi boot directly a unified kernel image or fiddle with some bootloader (among a plethora of options)...
One of the principal reasons why I love Arch is being able to have a say in some of these base matters, and I would hate to have to fight the installer to attain my goals. I remember when Ubuntu supported root on ZFS but the installer didn't; it was rather involved to get the install going. All it takes with Arch is to spend a few minutes reading the wiki and you're off to the races. The actual installation part is trivial.
But then again, if you have no idea what you want to do, staring at the freshly-booted install disk prompt can be daunting. Bonus points for it requiring internet for installation. I would have to look up the correct incantation to get the wifi connected on a newer PC with no wired ethernet, and I've been using the thing for a very long time.
> One of the principal reasons why I love Arch is being able to have a say in some of these base matters
Exactly. Arch allows you to do many bleeding edge things. An installer would never keep up or give you that freedom.
> I remember when Ubuntu supported root on zfs but the installer didn't it was rather involved to get the install going.
That's why many installers allow you to drop to a shell when it's time to partition.
> I would have to look up the correct incantation to get the wifi connected on a newer PC
To be honest, that would largely be helped if archiso started using NetworkManager.
>It's been a very long time since I moved to Arch, but I swear that something like 12 years ago it did have some form of menu-driven installer.
Yep, removed in 2012 as the last maintainer quit. Maintaining an installer seems like one of the least fun hobbies.
What kinda graphics card do you have in there? I’m considering building one soon.
With the absurd price of RAM and flash storage (and still-fairly-high price of video cards) now is quite a bad time to purchase a new computer.
Having said that, I'm not the OP, but I currently have a Radeon 9070 (non-XT), and previously had a Radeon 5700 XT. Both work great.
How do you figure out Arch but not OOBE?
> I gave up with Windows 10 because you needed Windows Pro in order to make an "offline" account
You quit without even trying.
Windows 10 and 11 can have all local accounts by using RUFUS to create the install media. And if your PC comes with Windows already installed, there are still commands - even with 25H2 - that allow you to bypass the Microsoft Account requirement during setup and configure a local account.
And this is with both Home and Pro.
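For completeness, the commonly circulated version of those commands (treat the exact incantations as assumptions — Microsoft has been changing them between builds): at the account/network screen of setup, press Shift+F10 to open a command prompt, then

  rem newer Windows 11 builds: jump straight to local-account creation
  start ms-cxh:localonly
  rem older builds: restore the "I don't have internet" path, then setup restarts
  OOBE\BYPASSNRO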
pacman -Syyu
One command. That's why I won't use Arch. This one command will fuck your system up, but only if you wait too long in between running it.
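For what it's worth, the failure mode there is usually the partial upgrade rather than the waiting. The supported pattern is a plain full upgrade (a sketch; after a very long gap you may also need to refresh archlinux-keyring first):

  sudo pacman -Syu          # sync databases and upgrade everything together
  # avoid: pacman -Sy <pkg>  -- a partial upgrade that mixes new repos with old libraries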
You spent $2000 on a new machine but wouldn’t shell out another $20-30 for a windows pro key? You’re willing to burn a bunch of time fiddling with getting a completely new operating system setup, but you’re not willing to spend a few minutes fiddling with setting up an offline windows account?
I get that maybe that was the final straw or something, but come on, “I switched to Linux because I didn’t want to take an hour to set up Windows” really sounds like you never really wanted Windows in the first place, you were just looking for an excuse.
The main difference, in my opinion, is that to set up Linux one doesn't need to work around the expected behaviours of the OS.
And why would anyone put so much effort into making Windows usable now, when there is no knowing what Microsoft will do next?
"By the way, I use Arch"
The meme was "I use Arch, BTW," but I think it has mostly died, as enough people have pointed out that Arch isn't really hard-mode Linux or something. It is a barebones start, but
1) very stable due to rolling-release producing small changes
2) the skill barrier to getting a full system is “basic literacy, to read the wiki”
Eventually I switched to Ubuntu for some reason, it has given me more headaches than Arch.
> 1) very stable due to rolling-release producing small changes
Having very frequent updates to bleeding edge software versions, often requiring manual intervention is not "stable". An arch upgrade may, without warning, replace your config files and update software to versions incompatible with the previous.
That's fine if you're continuously maintaining the system, maybe even fun. But it's not stable. Other distributions are perfectly capable of updating themselves without ever requiring human intervention.
> 2) the skill barrier to getting a full system is “basic literacy, to read the wiki”
As well as requiring you to be comfortable with the Linux command line and to have plenty of time. My mom has basic literacy; she can't install ArchLinux.
ArchLinux is great, but it's not a beginner-friendly operating system in the same way that Fedora/Linux Mint/OpenSUSE/Pop!_OS/Ubuntu/elementary OS are.
> Having very frequent updates to bleeding edge software versions, often requiring manual intervention is not "stable". An arch upgrade may, without warning, replace your config files and update software to versions incompatible with the previous.
12 in the last year if you used all the software (I doubt many people are running dovecot and zabbix), so probably actually more like 3 for most users: https://archlinux.org/
That's not too dissimilar from what you'd get running stable releases of Ubuntu or Windows. And of course plenty of Windows software will auto-upgrade itself in potentially undesired ways; Windows users just don't blame the OS for that.
I don't just mean the types of manual intervention mentioned in the news. ArchLinux ships bleeding edge software to users with very little downstream changes. ArchLinux also replaces config files when upgrading. This is inherently different behavior from stable release distributions like Ubuntu.
ArchLinux is not an operating system where you can do an unattended upgrade and forget about it. That's not "bad" or "good", that's just a design choice.
https://wiki.archlinux.org/title/Frequently_asked_questions#...?
Arch replaces _unmodified_ config files when upgrading. It's not uncommon behaviour for software to update defaults to the new defaults.
If you have a modified config file, it puts the new default one in a .pacnew file for you to compare, which seems strictly better than just deleting the new default one.
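A minimal sketch of handling those after an upgrade, assuming pacman-contrib is installed for pacdiff:

  # list leftover .pacnew/.pacsave files under /etc
  find /etc -name '*.pacnew' -o -name '*.pacsave' 2>/dev/null
  # or review and merge them interactively
  sudo pacdiff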
Huh you're right, I must've confused myself by removing/installing instead of upgrading recently.
Anyway I think the discussion boils down to semantics. ArchLinux is not "unstable" in the sense that it is prone to breaking. But it also delivers none of the stability promises that stable release distros or rolling release distros with snapshotting and testing like OpenSUSE Tumbleweed deliver. To call ArchLinux stable would make every distribution stable, and the word would lose all meaning.
Most distributions promise that an upgrade always results in a working system, instead moving the manual maintenance to major release upgrades.
> Having very frequent updates to bleeding edge software versions, often requiring manual intervention is not "stable".
I dunno. I have an arch installation that is maybe 4 years old, I might update every few weeks, and have only had one issue.
Any issues are usually explained on the front page of archlinux.org: what the issue is, and how to fix it.
> without warning, replace your config files and update software to versions incompatible with the previous.
This is just nonsense, pacman doesn't do this. If you'd modified a config file, it will create a .pacnew version instead of replacing it. Otherwise you'll get the default config synced with the version of the software you've installed, which is desirable.
It's pretty rare to modify any config files outside of ~/.config these days anyway. What few modifications I have at the system level are for things like mkinitcpio, locale, etc and they never change.
> very stable due to rolling-release producing small changes
Can you elaborate on the chain of thought here? The small changes at high frequency means that something is nearly constantly in a <CHANGED> state, quite opposite from stable. Rolling release typically means that updates are not really snapshotted, therefore unless one does pull updates constantly they risk pulling a set of incompatible updates. Again, quite different from stable.
It's the same train of thought as the modern cloud software notion that deploying small changes more often is safer than bundling "releases"; if you upgrade 3 packages 3x a week (or deploy 50 lines of code 3x a week), you catch small issues quickly and resolve them immediately, rather than upgrading 400 packages 1x a year (or deploying 50,000 lines of code 2x a year), where when things break you have a rather tall order just to triage what failed.
I think there are advantages to both, but I will say that I've found modern Arch to be quite good. The other huge benefit of Arch is the general skill level present in the user base and openness of the forums; when something breaks it's usually easy to google "arch + package name broken" and immediately find a forum thread with a real fix.
I don't think I'd use Arch for a corporate production server for change management reasons alone, but for a home desktop and my home server, it's actually the distribution that's required me to do the _least_ "Linux crap" to keep it going.
It's stable in the way that a person taking small, predictable steps is stable compared to somebody making large, random, lurching steps. Sure, the system is often changed, but if only a few packages have changed, should there be a problem it is easy to identify the culprit.
Although it is hard to say. Ubuntu also has, I guess, intentional behavior that is hard to distinguish from a bug, like packages switching from apt to snap. So it might just be that my subjective experience feels more buggy.
I think op meant the subjective feeling of having a system that runs in a stable manner. I don't quite follow their reasoning either (maybe the smaller changesets expose compatibility bugs before affecting general ux?), but I agree that arch was a joy for me to use and felt "stable".
>the skill barrier to getting a full system is “basic literacy, to read the wiki”
if GenZ knew how to read they would be very disappointed right now
in the age of tablets and tiktok, basic literacy is quite a big ask
It really is nothing new. People quickly close windows with errors; they go out of their way to avoid reading the actual message.
That's what they said about GenX, Millennials, and probably every other generation before them. Something something, "OK boomer."
they absolutely did not say that TikTok and tablets were destroying basic literacy about GenX or Millennials
if anything, they said the kids were good with technology
Yeah cause that's what he was talking about
I know. I was emphasising that this time is not like before. That there are major differences, and things look similar only on a very superficial level.
If Ubuntu had stuck with APT for software installs instead of snap and whatever else, it would be a lot less headachey.
I started my Linux journey a good year ago. It's been fun, and I'm happy that there's such a great community to troubleshoot along with me. Never tried Arch, but I do love a barebones, no-fuss system.
> idk why Arch doesn't invest in whats standard in every other major distro
Because Arch maintainers are a bunch of elitist gatekeepers that don't accept any level of knowledge that is lower than theirs. You can see that through every forum interaction generally and any discussion about the installation process specifically.
Arch is great btw. It could be greater, if all maintainers would quit.
As a long-time Linux user who fairly recently dropped the Windows partition entirely, I do think the remaining chafing points are these:
* UI framework balkanization has always been, and remains, a hideous mess. And now you don't just have different versions of GTK vs Qt to keep track of, but also X vs Wayland, and their various compatibility layers.
* Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
* Anything to do with configuring webcams feels like you're suddenly thrown back 20 years into the past. It'll probably work fine out of the box, but if it doesn't... hoo boy.
* Audio filtering is a pain to set up.
> UI framework balkanization has always been, and remains a hideous mess
I thought you were talking about Windows there. There are 4 (5?) different UI paradigms within Windows, and doing one thing sometimes requires you to interact with each of them.
At least on Linux, with GTK/KDE, you can pick a camp and have a somewhat consistent experience, with a few outliers. Plus many apps now just use CSD and fully integrate their designs to the window, so it's hopeless to have every window styling be consistent.
I never had to mind X vs Wayland when starting user applications tho.
If we're talking about mass adoption of Linux then there really has to be no concept of even "picking a camp". The vast majority of users - even techy people - will not understand what a window manager is, never mind be capable of choosing one.
Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.
Hard disagree. You can run the same programs on any DE or Window Manager or even without one (on pure X11 for example). That's not a hurdle it's a feature.
Users who don't know about the feature can just use a pre-configured system like Mint Cinnamon and never know about any of these things.
Yeah I wanted to say, for people who don't care, there's Linux Mint. (I used to spend all my time tinkering with the DE, now I prefer to spend zero!)
Except even with Linux Mint you have to choose which one ;)
Nope.
Linux user for decades, but headless since the early aughts. Decided to dip my toes back into the desktop space with Mint Cinnamon.
I can mirror or run lots of phone apps on Windows or macOS, but ironically, not Linux. I decide to run an Android emulator so I can use some phone-only apps.
I read up on reviews, then download and install Waydroid as the top contender.
Does Waydroid work? No. It fails silently launching from the shortcut after the install. Run it from the command line, and, nope, it's a window manager issue. Mint Cinnamon uses X11, not Wayland, and Waydroid apparently needs... Wayland support.
OK, I log out, log into Mint with Wayland support, then re-launch Waydroid. My screen goes into a fugue state where it randomly alternates between black and the desktop. Try a variety of things, and I guess this is just how it is. Google and try any number of fixes, end up giving up.
Yes, that's my old pal Linux on the Desktop. Older, faster and wiser, but still flaky in precisely the same ways.
You can't run X11 programs on Wayland without Xwayland.
Likewise you cannot run Wayland programs on X11 without a wayland compositor like Cage (a wayland kiosk) or Weston. Both run as a window on X11 inside of which Waydroid works just fine.
It's an odd complaint that incompatible software is incompatible.
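For the curious, here's a rough sketch of the nested-compositor approach (untested as written; the socket name, window size, and sleep are my own arbitrary choices, and it assumes weston and waydroid are already installed and initialized):

```python
# Rough sketch: run Waydroid inside a nested Weston window on an X11 session.
import os
import subprocess
import time

SOCKET = "nested-0"  # arbitrary name for the nested compositor's Wayland socket

# Start Weston as a regular X11 window; inside it, Weston acts as a Wayland compositor.
weston = subprocess.Popen(["weston", f"--socket={SOCKET}", "--width=1280", "--height=800"])
time.sleep(2)  # crude wait for the Wayland socket to appear

# Point Wayland clients (Waydroid's UI) at the nested compositor instead of the X session.
env = dict(os.environ, WAYLAND_DISPLAY=SOCKET)
try:
    subprocess.run(["waydroid", "show-full-ui"], env=env)
finally:
    weston.terminate()
```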
You read the parent I was responding to, no? You're reinforcing my point.
"Users who don't know about the feature can just use a pre-configured system like Mint Cinnamon and never know about any of these things."
I did. I agree it's not obvious. But you cannot run OpenGL, Vulkan, Glide or DirectX on Windows either without having the proper hardware and software installed. So yeah. Waydroid needs wayland. Anbox runs on X11.
I think that type of user wouldn't go out looking for an Android compatibility layer.
That's cause you're using a distro like mint which is using older builds of stuff.
Get yourself a recent Plasma 6 Wayland setup with PipeWire for audio. It even has an RDP server now.
What's most likely happening is your user space app wants the newer API but you're running old builds from two years ago.
It will continue to degrade for you unless you fully switch to a Wayland DM.
Anything built on X11 is basically deprecated now and no one is building on it anymore.
> That's cause you're using a distro like mint which is using older builds of stuff.
The context here is that I was commenting on the parent's assertion that one "can just use a pre-configured system like Mint Cinnamon and never know about any of these things." Nope!
> It will continue to degrade for you unless you fully switch to a Wayland DM. Anything built on X11 is basically deprecated now and no one is building on it anymore.
That's my impression as well, and again, with the 2nd most popular Linux distro using X11 by default and with "experimental" Wayland support, that only reinforces my rebuttal of parent's claim.
I don't recommend Mint for this reason. SteamOS or Nobara for the white glove premium experience.
Headless daily driver? Hardcore. What do you use for a browser?
I've tried it as a challenge for a couple of days (lynx, mutt, some other TUI stuff) and it made some things like Vim stick (although the challenge may deserve as much credit for that as Tridactyl does). But I couldn't last longer than a week. It does free you from the burden of system requirements. CPU: Optional.
w3m can even display images in a Linux console if you have the proper drivers or use KMSCON. It's unwieldy but surprisingly usable. And my laptop battery runs for 8 hours, which is quite amazing for a Zen1.
> And my laptop battery runs for 8 hours
I imagine your display is almost entirely black for the majority of the time, with your (most probably) LCD backlight blasting away, trying its hardest to get a few thousandths of its light output through the few pixels on the screen that it can escape! XD
The same is true of Linux - GTK3 apps run just fine on Plasma, and so do GTK4, Qt 5, Qt 6, X11 apps, and so on.
Sure they all look slightly different, but it's definitely worse on Windows in that regard.
No, it's not about users picking a camp, it's about developers.
It's been a long, long time since I've seen an application utterly fail to load because it's a GTK/QT/etc framework running under a totally different DE.
Gnome apps look ugly as hell under KDE[0], but they still work. As a user, you don't need to know or care in any way. It'll run on your machine.
[0]I don't know if they're ugly because of incompatibility or if that's just How Gnome Is. I suspect the latter
> Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.
I mean this is a solved problem on linux using modern distributions like NixOS or even 'normal' distros with flatpak, appimage, etc. I haven't had to deal with anything like this in years.
The Windows UIs are way more different from each other than Linux's ever were. There was a time in the 90s when UIs were expected to follow platform conventions. These days, most UIs don't, and they're almost a kind of branding. Thus, this is not as big a deal as you're making it out to be. If anything, things like the GNOME apps and GTK4 are more consistent than any Windows app.
>many apps now just use CSD
If there's something I hate about Linux, it's CSD (Client-Side Decorations, in case people don't know what it is).
If I wanted all my apps to look different from each other, I'd use macOS. I want a clean desktop environment, with predictable window frames that are customizable and they all look the same. CSD destroys that.
Having no CSD at all is unacceptable on small screens IMHO: far too much real estate is taken up by a title bar. You can be competitive with SSD by making the bars really thin, but then they're harder to click on and impossible with touch input. At the moment I have Firefox set up with CSD and vertical tabs, and only 7% of my vertical real estate is taken up by bars (inc. Gnome), which is pretty good for something that supports this many niceties.
Linux doesn't mean GNOME.
KDE favors server-side decorations.
Conversely, I don't want all of my apps to look identical to each other. I want to be able to tell with a split-second glance which app I am working in or looking for, without having to cognitively engage to locate it, breaking my state of flow in the process.
That's what the title bar is for?
> UI framework balkanization has always been, and remains a hideous mess.
At least things look more or less the same over time. With commercial offerings one day you open your laptop and suddenly everything looks different and all the functions are in a different submenu because some designer thought it was cool or some manager needed a raise.
> It'll probably work fine out of the box, but if it doesn't. Hoo boy.
LLMs are actually very useful for Linux configuration problems. They might even be the reason so many users made the switch recently.
Pair-programming Nix with Gemini has taught me a lot about the assistive power of LLMs.
They're still slow and annoying at languages I'm good at. But it's really handy to be able to take one I'm not (like Nix or even C++) and say "write me a patch that does …" Applying a lot of the same thinking/structuring skills, but not tripping on the syntax.
They're pretty good for most things, yes... but man was it rough figuring out getting my IP allocation routing right on my Proxmox server. The system is issued a primary IP, and I need to route my subnet through that to my VMs... wasn't too bad once I got it working... I'd also wanted a DNAT for "internal" services, and that's where it got tricky.
I need to refresh myself as I'm wanting to move from a /29 to a /28... mostly been lazy about not getting it done, but actually making progress on some hobby stuff with Claude Code... definitely a force multiplier, but I'm not quite at a "vibe code" level of trust, so it's still a bit of a slog.
You could just let the VMs be normal IPs on the network....
Where would those IPs route to/from if it didn't have a configured default gateway exactly?
The machine got a single IP, I had to route the CIDR block using that IP as the gateway in the host OS. The VMs wouldn't just get assigned additional real IPs.
KDE & Gnome are both guilty of the same.
> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
I think Windows is the only other one which really does this properly; macOS also does the hack where it simulates fractional scales by rendering at an integer scale at a non-native resolution and then scaling it down.
> I think Windows is the only other one which really does this properly
Windows is the only one that does this properly.
Windows handles high pixel density on a per-application, per-display basis. This is the most fine-grained. It's pretty easy to opt in on reasonably modern frameworks, too; just add in the necessary key in the resource manifest; done. [1]
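If you'd rather opt in from code than from the manifest, the per-monitor-v2 context can also be requested at process startup. A minimal sketch via ctypes (my own illustration, not the manifest approach from [1]; requires Windows 10 1703+, and the -4 constant is DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2):

```python
# Minimal sketch: opt a process into per-monitor-v2 DPI awareness at startup.
# Must run before any window is created; does nothing on non-Windows platforms.
import ctypes
import sys

if sys.platform == "win32":
    # DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2 is the pseudo-handle (DPI_AWARENESS_CONTEXT)-4
    ctypes.windll.user32.SetProcessDpiAwarenessContext(ctypes.c_void_p(-4))
    print("system DPI:", ctypes.windll.user32.GetDpiForSystem())  # e.g. 96, 120, 144
```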
Linux + Xorg has a global pixel density scale factor. KDE/Qt handles this OK; GNOME/GTK break when the resulting DPI is not an integer multiple of 96 (i.e. the scale factor is not an integer) and fall back to raster scaling.
Linux + Wayland has per-display scaling factors, but Chromium, GNOME, and GTK break the same way as the Xorg setup. KDE/Qt are a bit better, but I'm quite certain the taskbar icons are sharper on Xorg than they are on Wayland. I think this boils down to subpixel rendering not being enabled.
And of course, every application on Linux in theory can handle high pixel density, but there is a zoo of environment variables and command-line arguments that need to be passed for the ideal result.
On macOS, if the pixel density of the target display is at least some Apple-blessed number that they consider 'Retina', then the 'Retina' resolutions are enabled. At resolutions that are not integer multiples of the physical resolution, the framebuffer is four times the resolution of the displayed values (twice in each dimension), and then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
On non-retina resolutions, there is zero concept of 'scaling factor' whatsoever; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
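To put numbers on the scheme above (my own illustration of the behaviour as described, not anything from Apple documentation), here is the arithmetic for a 4K panel set to a 'looks like 2560 × 1440' scaled resolution:

```python
# Sketch of the macOS-style "scaled resolution" pipeline described above (illustrative only):
# render the UI at 2x the chosen "looks like" size, then raster-scale the frame down to the panel.
def scaled_mode(looks_like, native):
    backing = (looks_like[0] * 2, looks_like[1] * 2)  # backing store, twice in each dimension
    downscale = native[0] / backing[0]                # final raster scale applied to the whole frame
    return backing, downscale

backing, factor = scaled_mode((2560, 1440), (3840, 2160))
print(backing, round(factor, 2))  # (5120, 2880) scaled by 0.75 to fit the 4K panel
```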
[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...
ChromeOS also does fractional scaling properly because Chrome does it properly. The scaling factor is propagated through the rendering stack so that content is rastered at the correct scale from the beginning instead of using an integer scaling factor and then downscaling later. And it takes subpixel rendering into account too, which affects things like what elements can be squashed into layers backed by GPU textures.
I think Android does it properly too because they have to handle an entire zoo of screen sizes and resolutions there. Although they don't have the issue of dealing with subpixel rendering.
> Windows is the only one that does this properly. Windows handles high pixel density on a per-application, per-display basis.
This is not our [0] experience. macOS handles things on a per-section-of-window, per-application, per-display basis. You can split a window across two monitors at two different DPIs, and it will display perfectly. This does not happen on Windows, or we have not found the right way to make it work thus far.
[0] ardour.org
> macOS handles things on a per-section-of-window, per-application, per-display basis.
No, it does not. If you have two displays with different physical pixel densities, and especially if they are sufficiently different that Apple will consider one 'Retina' and 'not Retina' (this is usually the case if, for instance, you have your MacBook's display—which probably is 'Retina'—beneath a 2560 × 1440, 336 × 597 mm monitor, which is 'not Retina'), then the part of the window on the non-Retina display will be raster-scaled to account for the difference. This is how KDE Plasma on Wayland handles it, too.
In my opinion, any raster-scaling of vector/text UI is a deal-breaker.
I think the only case where raster scaling is not a deal breaker is a window spanning high and low DPI displays. That is unless the app delegates compositing to the OS compositor which could then raster the contents at the different scales correctly. Not all content can be delegated to the OS - video games for example.
I think there's one group of people, those who consider preserving the physical dimensions important, that likes the macOS approach. For me, if a window is across multiple displays then it's already broken up, and I'm not too bothered about that. What I care about is getting application UI to a reasonable size without blurring. macOS doesn't do that.
Actually, the default in macOS is that the window is only on one monitor, and it's the monitor where the cursor was when you last moved the window, so you might have a window appearing invisible because you dragged it near the corner and some sliver ended up on another monitor.
Look at this complicated tinkering macOS makes you do for something as simple as spanning windows across monitors! https://www.arzopa.com/blogs/guide/how-to-make-a-window-span... (OK, this last part is slightly facetious, but Linux gets dinged for having to go into menus because the writer wants something to work the way it does on other operating systems the whole time)
> then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
I don't think this is true. I use non-integer scaling on my Mac since I like the UX to be just a little bit bigger, and have never observed any kind of ringing or any specific artifacts at all around text, nor have I ever heard this as a complaint before. I assume it's just bilinear or bicubic unless you have evidence otherwise? The only complaint people tend to make is ever-so-slight additional blurriness, which barely matters at Retina resolution.
Indeed, these artifacts sound like they're coming from Display Stream Compression [1] rather than scaling. I've had Macs occasionally use DSC when it wasn't necessary; power-cycling the display and/or changing the port it's plugged into usually fixed it. If it's consistently happening, though, it's probably because the display, the cable, the port, and/or the GPU can't handle the resolution and refresh rate at full bandwidth.
[1]: https://en.wikipedia.org/wiki/Display_Stream_Compression
> ringing or any specific artifacts at all around text
There are a few Reddit threads that crop up when one searches for 'macOS ringing artifacts scaling'. For instance, these ones:
https://www.reddit.com/r/macbookpro/comments/1252ml8/strange...
https://www.reddit.com/r/MacOS/comments/1ki58zk/fractional_s...
https://www.reddit.com/r/MacOS/comments/l8oadr/macos_fringin...
All are ringing artifacts, typical of downscaling. I no longer have a Mac (chose one for work to try it out, saw this issue, returned it immediately), but I assure you this is what happens.
> The only complaint people tend to make is ever-so-slight additional blurriness
At no scale factor should there be any blurriness unless a framebuffer resolution is explicitly set. The 'scale factor' should be entirely independent of the physical resolution, which macOS simply does not do.
Apple's understanding and implementation of 'Retina' comes from a singular source: the straightforward doubling in each dimension of the display resolution of the iPhone 4 compared to the iPhone 3GS. It has not changed since, and Apple has applied this algorithm throughout its OS stack.
All of these involve external monitors as far as I can tell, so it seems more likely it's the Display Stream Compression mentioned by the sibling to your comment that is the culprit.
Like I said, absolutely nothing like that happens on my display. I see the ringing in the first link. That doesn't happen to me. Not even a hint of it.
I get you don't like the scaling, but like I said, the very slight blurriness just isn't really noticeable in practice, especially given how Macs antialias text to begin with. Of all my complaints about Macs, this particular one is close to the bottom.
I gotta say, as the guy who brought up DSC, that last Reddit post especially had me doubting. That is not what DSC artifacts look like. DSC subsamples the chroma, which causes distinct color bleeding issues. That is luma bloom, which doesn't happen with DSC.
So I took my Mac Mini, hooked up to a 4K monitor, verified there were no DSC artifacts at native resolution, set it to "2560x1440" and sure enough the same artifacts appeared for me too, but still no telltale signs of DSC. So yeah, I gotta say, this is on Apple. Between this and dropping subpixel antialiasing support for text, it's pretty clear that their only properly supported configuration is 2x scaling on high-DPI displays.
Huh, very interesting.
OK, I just grabbed my loupe to make sure I'm not missing anything, and pulled up an app in dark mode (so ringing should be more visible) on my MBA M4. I'm using its built-in display. I've cycled through all 4 available resolution settings in Display, and absolutely zero artifacts or ringing. Then tried connecting to my LG UltraFine 4K which connects over Thunderbolt, that gives 5 resolution settings instead of 4, and zero artifacts/ringing on any of those either.
So I have no idea what's going on. I don't doubt that you're seeing it, and it's there in that Reddit photo. But maybe it's something specific to external monitors over a certain connection type or something? Seems very strange that Apple would use a different downsampling algorithm under different circumstances though.
I'd normally assume the most likely culprit would be some kind of sharpening setting on a monitor, as that can absolutely cause the type of ringing seen in that Reddit photo. But on the other hand, if you're testing it right now and not seeing it at native 2x, then that would seem to be ruled out, at least in your case. Maybe it's some kind of resolution negotiation mismatch where it's actually the monitor applying a second scaling that has ringing, since monitors can accept signals that don't match their native hardware resolution?
I can get a mild form of it on my M4 MBP's built-in display at "1800x1125"* but it's not nearly as noticeable as it was on the 4K external display at "2560x1440" and honestly I needed my cell phone camera zoomed in to definitively identify it, so that was more of a fishing expedition than a real problem. However, I have tried 2 different Macs, 2 different 4K monitors (both LG UltraFine also, though they differ in firmware version and color reproduction because of course they do), and 2 different interfaces (HDMI, Thunderbolt), and I can reliably replicate it under all of those combinations. I think that exact scaling factor probably has a bad interaction with the scaling algorithm. I do agree that a lot of other scalings do not produce the ringing/halo/bloom effect.
* = You have to go click "Advanced...", enable "Show resolutions as list", then when back on the main Displays page, enable "Show all resolutions", to get this and many other options -- but this is only necessary on the internal display, the external display offers "2560x1440" as a non-advanced choice
> All of these involve external monitors as far as I can tell
This happens on the native displays of MacBooks and iMacs, too. Try any of the 'looks larger'/'looks smaller' settings and it'll show up.
I've repeatedly explained that's not the case. Not on any Mac I've ever owned, and I already explained I thoroughly investigated my current M4 MBA. It's not showing.
That just means you can't perceive it. Sorry, but raster scaling is how Apple's algorithm works, and just because you can't see it doesn't mean it is not the case.
For the record, this isn't just visual. Rasterising to a framebuffer that is considerably larger than the physical resolution and then scaling produces a tangible effect on battery life. Not that it matters much with the impressive efficiency of the M-series SoCs, but it is there nonetheless.
I work a lot with Photoshop. I've studied digital signal processing. Believe me, I will perceive ringing when it's there.
You seem to be fundamentally misunderstanding. Yes, raster scaling is how it works. I haven't disputed that anywhere.
I'm saying the ringing artifact specifically that you're complaining about is not happening on my setup, nor does it seem to be widespread. You were complaining about the specific Lanczos algorithm due to its noticeable ringing, I'm saying that therefore doesn't seem to be the algorithm being used on mine, nor is there any documentation that's the algorithm Apple uses. Your criticism seems to be based on partially wrong information, even if something like it seems to happen on certain external displays -- whatever it is, it's not a universal problem.
If you somehow missed my other comment, please read it:
> The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
To be fair, UXGA was a thing 20 years ago. I don't think it makes sense for Apple to care all that much about low DPI monitors. They don't sell any, and they wouldn't be acceptable to most Apple people, who have had crisp displays available for > 10 years now. I wouldn't be surprised if the number of Apple users on low dpi is single digit percentage.
This is a surprising opinion to encounter, given my experience with scaling on Windows, where simple things like taking my laptop off its dock (going from desktop monitors to laptop screen) causes applications to become blurry, and they stay blurry even when I've returned the laptop to the dock. Or how scaling causes some maximized window edges to show up on the adjacent screen. Or all manner of subtle positioning and size bugs crop up.
Is this more of an aspirational thing, like Windows supports "doing it right", and with time and effort by the right people, more and more applications may be able to be drawn correctly?
[edit] I guess so, I see your comment about setting registry keys to make stuff work in Microsoft's own programs. That aligns more closely with my experience.
Not sure about the underlying reason, but I use Windows for work and the only program I've encountered in the past two years with this behavior is the Eclipse IDE. Everything else deals very well with rescaling and docking / undocking to 4k displays.
> Windows is the only one that does this properly.
How can you say this when applications render either minuscule or gigantic, either way with contents totally out of proportion, seemingly at random?
I don’t have to pull out a magnifying glass to notice those issues.
These were probably written against the old-school Win32. It's pretty easy to fix.
Done:

- Right-click on the `.exe`
- Properties
- Compatibility tab
- Change settings for all users
- Change high DPI settings
- Under the 'High DPI scaling override' section, tick the box for 'Override high DPI scaling behaviour. Scaling performed by'
- In the drop-down box below, select 'Application'

For MMC snap-ins like `diskmgmt.msc`, `services.msc`, or `devmgr.msc`, there's a Registry key you can set. See this ServerFault question: https://serverfault.com/q/570785/535358
The 'doing it right' part is from how it should be done, but it still needs application support.
The thing is X11/Xorg can also theoretically do the same thing (and most likely Wayland too) but it needs, you guessed it, application (and window manager / compositor) support.
That's roughly what I did for my ANSI console/viewer... I started with EGA resolution, and each EGA pixel renders 3x4 in its buffer, then a minor blur, then it's scaled to fit the render area. The effect is really good down to about 960px wide, which is a bit bigger in terms of real pixels than the original... at 640px wide, it's a little hard to make out the actual pixels... but it's the best way I could think of to handle the non-square pixels of original EGA or VGA... I went with EGA because the ratio is slightly cleaner IMO. It's also what OG RIPterm used.
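Roughly this, as a Pillow sketch (the 640×350 EGA mode, the blur radius, and the 960px target are just the values I'd reach for; adjust to taste):

```python
# Sketch of the EGA upscale pipeline described above, using Pillow.
# 640x350 is the high-res EGA mode; the 3x4 block per pixel compensates for EGA's non-square pixels.
from PIL import Image, ImageFilter

ega = Image.new("RGB", (640, 350))                        # stand-in for the rendered EGA framebuffer
big = ega.resize((640 * 3, 350 * 4), Image.NEAREST)       # each EGA pixel becomes a 3x4 block
soft = big.filter(ImageFilter.GaussianBlur(radius=0.8))   # the "minor blur"
final = soft.resize((960, 700), Image.LANCZOS)            # scale to fit the render area
final.save("frame.png")
```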
I have precisely one Windows thing I use regularly, and it has a giant window that needs lots of pixels, and I use it over Remote Desktop. The results are erratic and frequently awful.
> * UI framework balkanization has always been, and remains a hideous mess
I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".
> It'll probably work fine out of the box, but if it doesn't.
Drivers are a pain point and will probably stay so until the market share is too large for the hardware vendors to ignore. Which probably isn't happening any time soon, sadly.
This is not a driver issue I'm talking about. It's a "best way to adjust the white balance is with this GTK+-2.0 app that hasn't seen maintenance since the Bush administration" issue.
Yes, this one is quite a problem as well.
> I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".
The UI framework for macOS has not changed in any substantial design-update-requiring ways since OS X was first released. They did add stuff (animations as a core concept, most notably).
The UI framework for Windows has changed even less, though it's more of a mess because there are several different ones, with an unclear relationship to each other. win32 won't hurt though, and it hasn't changed in any significant ways since dinosaurs roamed the silicon savannahs.
The UI framework for Linux ... oh wait, there isn't one.
I'm a lifelong Mac user, but a gaming handheld has gotten me into some of these topics. I dual-boot SteamOS and Windows.
On SteamOS, my 5.1 stereo just works.
On Windows, apparently there was some software package called DTS Live (and/or Dolby Live) needed to wrap the audio stream in a container that the stereo understands. There was a time when there was a patent pool on the AC-3 codec (or something like that - I'm handwaving because I don't know all the details). So Microsoft stopped licensing the patent, and now you just can't use AC-3 on Windows. I spent an evening installing something called Virtual CABLE and trying to use it to jury-rig my own Dolby Live encoder with ffmpeg… Never got it to work.
It's easy to fall deep into the tinkerhole on Linux, which has kept me away for a long time, but as mainstream platforms get more locked down, or stop supporting things they decide should be obsolete, it's nice to have a refuge where you're still in control, and things still work.
(Insert meme about the Windows API in Proton being a more stable target than actual Windows.)
I had to dump a perfectly fine c.2012 workstation recently because of video driver limitations. I could no longer stay current on my flavor of Linux (OpenSUSE) and still have better-than-hideous display resolution; I was limited to just one monitor. NVIDIA's proprietary drivers are great, but the limited support lifecycle plus poor open source coverage is actually making Linux turn fine systems into trash just the way Windows used to do.
>poor open source coverage is actually making Linux turn fine systems into trash just the way Windows used to do.
I'd blame Linux for only a very small percentage of the problem here. This is on NVIDIA ensuring their hardware doesn't last too long and forcing you to throw it away eventually. The open source drivers can make the monitor 'work' but really aren't efficient, and really can never be efficient, because NVIDIA doesn't release the needed information and the open driver directly competes with their proprietary one.
Couldn't you swap in a lower-end AMD GPU now? An RX 6600 should be under $200 and likely at least as good as what you were running... unless you were doing specific CUDA workloads. Even on PCIe 2/3, it should be fine.
I'm using KDE with Wayland and 2 non-standard DPI monitors (one at 100%, the other at 150% scale). No workarounds needed, nothing is blurry. I think your experience comes from GNOME, which lags behind in this regard.
FWIW, I can do the same with KDE on Xorg with Gentoo Linux.
Since the introduction of the XSETTINGS protocol in like 2003 or 2005 or so to provide a common cross-toolkit mechanism to communicate system settings, the absence of "non-integer" scaling support has always been the fault of the GUI toolkits.
> I think your experience comes from GNOME which lags behind in this regard.
When doesn't GNOME lag behind? Honestly, most of Wayland's problems come from the fact that it is a project that expects protocol implementers and extenders to cooperate in order to work, and it set those expectations knowing that GNOME was going to be one of the parties whose cooperation was required.
Mint/cinnamon here at 150%, X11, not blurry. It’s FUD.
The issue with X11 is that it's not dynamic. Think using a laptop, which you sometimes connect to a screen on which you require a different scale. X11 won't handle different scales, and it also won't switch from one to the other without restarting it.
> The issue with X11 is that it's not dynamic.
No, it is. Maybe you're using an ancient (or misconfigured) Xorg? Or maybe you've never used a GTK program? One prereq is that you have a daemon running that speaks the ~20 year old XSETTINGS protocol (such as 'xsettingsd'). Another prereq is that you have a DE and GUI toolkit new enough to know how to react to scaling changes. [0]
Also, for some damn reason, QT and FLTK programs need to be restarted in order to render with the new screen scaling ratio, but GTK programs pick up the changes immediately. Based on my investigation, this is a deficiency in how QT and FLTK react to the information they're being provided with.
At least on my system, the KDE settings dialog that lets you adjust screen scaling only exposes a single slider that applies to the entire screen. However, I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors... but for some reason the KDE control widget doesn't bother to let you use them. Go figure.
[0] I don't know where the cutoff point is, but I know folks have reported to me that their Debian-delivered Xorg installs totally failed to do "non-integer" scaling (dynamic or otherwise), but I've been able to do this on my Gentoo Linux machines for quite some time.
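For reference, this is the sort of thing I mean (a sketch, not a drop-in config: the 1.5 scale and the integer window scale of 1 are arbitrary choices of mine, and whether apps pick the change up live depends on the toolkit, as noted above):

```python
# Sketch: change the scale for XSETTINGS-aware toolkits on X11 by rewriting xsettingsd's
# config and sending it SIGHUP (xsettingsd re-reads its config on SIGHUP).
# In the XSETTINGS protocol, Xft/DPI is encoded as dpi * 1024.
import os
import signal
import subprocess
from pathlib import Path

scale = 1.5                                    # example fractional scale (arbitrary)
Path.home().joinpath(".xsettingsd").write_text(
    f"Xft/DPI {int(96 * scale) * 1024}\n"      # scales fonts/widgets in DPI-aware toolkits
    "Gdk/WindowScalingFactor 1\n"              # keep the integer window scale at 1
)
pid = int(subprocess.check_output(["pidof", "xsettingsd"]).split()[0])
os.kill(pid, signal.SIGHUP)                    # ask xsettingsd to broadcast the new values
```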
I use whatever package is shipped by arch, so I think I'm fairly up to date.
I did look a bit into this at one point, but I've found that it's mostly QT apps which work fine with different scaling (telegram comes to mind). GTK apps never did, but I admit I never went too deep in the rabbit hole. Didn't know there was supposed to be some kind of daemon handling this. I do run xsettingsd, but for unrelated reasons. I'll have a look if it can update things.
In any case, except for work, I always used everything at 100% and just scaled the text as needed, which worked well enough.
> I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors...
Would you care to go into some details? What systems are those and how do you notify them there's been a change?
On UI frameworks... mostly agree, I say this as a COSMIC user even... so many apps still don't show up right in the tray, but it's getting a bit better, I always found KDE to be noisy, and don't like how overtly political the Gnome guys are. So far Wayland hasn't been bad, X apps pretty much just work, even if they don't scale right.
I'm on a very large OLED 3440x1440 display and haven't had too many issues... some apps seem to just drop out, I'm not sure if they are on a different workspace or something as I tend to just stick to single screen, single display. I need to take the time to tweak my hotkeys for window pinning. I'll usually have my browser to half the screen and my editor and terminal on the other half... sometimes stretching the editor to 2/3 covering part of the browser. I'm usually zoomed in 25-30% in my editor and browser... I'd scale the UI 25% directly, like on windows or mac, but you're right it's worse.
For webcams, I don't use anything too advanced, but the Nexigo cams I've been using lately have been working very well... they're the least painful part of my setup, and even though I tend to use a BT headset, I use the webcam mic as switching in and out of stereo/mono mode for the headset mic doesn't always work right in Linux.
On audio filtering, I can only imagine... though would assume it's finally starting to get better with whatever the current standard is (pipewire?), which from what I understand is closer to what mac's interfaces are. I know a few audio guys and they hate Windows and mostly are shy to even consider Linux.
- Yes. I think big players in Linux should start supporting core functionalities in GNOME and KDE, and make it polished for laptops and desktops and that would be very cool. For a long time, KDE had a problem of having too many things under its umbrella. Now, with separation of Plasma Desktop and Applications, focusing on Plasma Desktop and KDE PIM should be a good step.
- Kind of ties to the old point: KDE on Wayland does this extremely well.
- You're thrown back 20 years because the problems are exactly the ones from 20 years ago: vendors refusing to support Linux with drivers.
- Audio filtering? Interesting. I know people who use PipeWire + JACK quite reasonably. But maybe you have a use case I am not aware of? Would be happy to hear some.
It would help if Gnome wasn't so hostile towards proper cross-DE interop. A famous quote by a Gnome dev goes, "I guess you have to decide if you are a GNOME app, an Ubuntu app, or an Xfce app unfortunately"
They seem to genuinely believe that their way is the right way and everyone else is "holding it wrong" so there's no need for things that would make cross-DE apps easier (or even possible).
the scaling and UI framework issues are by far my biggest pain point. I will inevitably end up with an app with tiny and/or blurry UI elements every few weeks and have to spend a ton of time figuring out the correct incantation to make it better.
This is on a pretty clean/fresh install of current ubuntu desktop
Hardware support for esoteric things such as the new generation of Wacom EMR is still awkward --- I was able to get the previous gen working on a ThinkPad X61T using Lubuntu --- wish that there was such an easy way to try out Linux on my Samsung Galaxy Book 3 Pro 360....
> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry
Not blurry for me on KDE and I wouldn't tolerate blurry, I'd prefer the imperfect solution of using bigger fonts.
KDE Plasma 6 might be the only desktop that does this right on Linux.
I've also been running fractional scaling on Sway for many years now and native wayland applications are not blurry. X11 apps run through XWayland will be blurry, but I don't have any legacy X11 apps remaining on my system.
> Audio filtering is a pain to set up.
Like noise filtering for your microphone? It was pretty trivial to set up: https://github.com/werman/noise-suppression-for-voice
> * Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
This sounds like you're using some old software. GNOME and sway have clean fractional scaling without blurring, though that hasn't always been the case (it used to be terrible).
Fractional scaling on Wayland is only blurry for X apps, and even then, most apps have Wayland support at this point, so for the remaining apps, just turn off Xwayland scaling and use native scaling through env vars and flags, and no more blurriness.
I use Linux as my daily driver, with a Mac laptop. I only use Windows when I absolutely have to (i.e., testing), and usually through a VM.
Some other rough edges in Linux I've encountered:
- a/v support in various apps. We use Slack for everything (I can't just use something else) and a/v support is pretty bad, to the point where my video frame rate is ~1 Hz and screen share shows a black rectangle. I think that's mostly Slack's fault, as Google Hangouts works fine, but it's probably low on their priority list.
- sleep / hibernation is still sometimes flaky. Occasionally it won't wake up after hibernating overnight, and I have to hard reboot (losing any open files, though that's not an issue)
- power management on laptops (and therefore battery life) is still worse than Windows, and way worse than Mac. I tried Framework + Linux for a while and really wanted to love it, but switched to a Mac and am not going back (still run Linux on desktop). There is nothing out there that compares to the M-series MacBooks.
- occasional X/Wayland issues, as mentioned
> UI framework balkanization has always been, and remains a hideous mess.
Amen.
But, which OS doesn't have this problem? I'm currently running windows on a work laptop and even freaking first-party apps have a different look and behave differently from one another. Teams can't even be assed to use standard windows notifications! And don't get me started on electron apps, of which most apps are nowadays, each coming with their own look and feel.
Also, have you tried switching from light to dark mode, say at night? The task manager changes only partially. The explorer copy info window doesn't even have a dark mode! On outlook the window controls don't change colour, so you end up with black on black or white on white. You can't possibly hold up windows as a model of uniform UI.
So while I agree that this situation is terrible, I wouldn't pin it on the linux ecosystem(s).
> Every other major OS can deal with [high dpi].
Don't know about mac os, but on Windows it's a shitshow. We use some very high DPI displays at work which I have to run at 200%, every other screen I use is 100%. Even the freaking start menu is blurry! It only works well if I boot the machine with the high-dpi display attached. If I plug it in after a while (think going to work with the laptop asleep), the thing's blurry! Some taskbar icons don't adapt, so I sometimes have tiny icons, or huge cropped ones if I unplug the external monitor. Plasma doesn't do this.
IME KDE/Plasma 6 works perfectly with mixed DPI (but I admit I haven't tried "fractional" scales). The only app which doesn't play ball 100% is IntelliJ (scaling works, it's sharp, but the mouse cursor is the wrong size).
> Audio filtering is a pain to set up.
What do you mean? I've been using easyeffects for more than five years now to apply either a parametric EQ to my speakers or a convolver to my headphones. Works perfectly for all the apps, or I can choose which apps it should alter. The PEQ adds a bit of a latency, but applications seem to be aware of it, so if I play videos (even youtube on firefox with gpu decoding!) it stays in sync. It detects the output and loads presets accordingly. I also don't have to reboot when I connect some new audio device, like BT headphones (well, technically, on Windows I don't anymore, either, since for some reason it can't connect to either of my headphones at all). I would love to have something similar on windows, but the best I found isn't as polished. It also doesn't support dark mode, so it burns my eyes at night.
macOS and Windows have a much smaller set of variants, and tend to ship a single UI with everything included with OS. Even the best single desktop Linux distros will ship divergent KDE and Gnome apps.
If you want essentially perfect high-DPI support out of the box and can afford higher end displays, use macOS. It just works. I see the comments above about scaling, and to that, I say: most people will never notice. However, a Win32 app being the wrong scale? They'll notice that.
But the real display weak point of Linux right now vs Windows is HDR for gaming. That's a real shitshow and it tends to just work on Windows.
My story is simpler. Microsoft dropped the support for Windows 10 and gave me no upgrade path to Windows 11 because my CPU was 5 years too old apparently.
So I installed Fedora on that machine, I learned the process, I went through the hurdles. It wasn’t seamless. But, Fedora never said “I can’t”. When it was over, it was fine.
If only Microsoft had just let me install Windows 11 and suffer whatever perf problems my CPU would bring. Then I could have considered a hardware upgrade, maybe.
But, “you can’t install unless you upgrade your CPU” forced me to adopt Linux. More importantly, it gave me a story to tell.
There is a marketing lesson there somewhere, like Torvalds’ famous “you don’t break userspace”, something along the lines of “you don’t break the upgrade path”.
I'm in the exact same boat. I was a little unhappy with the ads etc in Windows, but perfectly willing to give Windows 11 a try. But Microsoft decreed that my admittedly a bit old but perfectly workable CPU was incompatible, due to not having a feature I wasn't interested in. I'd need to replace most of my existing hardware to switch. So why not try Linux? It certainly seems reasonable when Windows apparently needs more command-line hackery to maybe work for a while than Linux.
So to Fedora I went! So far, I've been pleasantly surprised. All of the software I want to use installed easily and works, via Flatpak. All of my hardware works fine, and there are actually fewer weird hardware quirks than under Windows. I also appreciate that there are options to turn off behaviors I found annoying in Windows.
It's a bit sad to have to switch due to Microsoft trashing their own OS rather than Linux becoming superlatively awesome, but what can you do.
Linux is superlatively awesome
and I say that as a FreeBSD user.
And I say it as a Linux user. My story is breathing life into old hardware back in the 2010s without having to spend any money, and just enjoying the No Man's Sky style freedom of exploring a whole new world of how a desktop operating system could look in feel and work, unmoored from the background sickness of thinking everything I do is channeled into Microsoft telemetry.
If you're the type of person who's capable of falling in love with software and software ecosystems, there's nothing like a first jump into Linux and understanding it as a world ready and waiting for you.
FreeBSD user here too <3 Mainly because I think Linux is way too aligned (and developed by) big tech these days.
We have Netflix and Sun influence but the former is not really putting its stamp on it and the latter no longer exists (and evil Oracle has zero interest of course)
I prefer the OS aligned with users like me not the big cloud boys.
I would love to try FreeBSD but I'm too addicted to NixOS... If there was a way to have a NixOS-like declarative BSD system I'd give it a serious try.
FreeBSD is kinda declarative. A lot of it is (or can be) configured in a text file called rc.conf
https://man.freebsd.org/cgi/man.cgi?rc.conf
It's not as completely declarative as Nix but it was never intended to be.
More like 3 files.
- /boot/loader.conf for kernel settings to be set only at boot
- /etc/sysctl.conf for kernel settings to be set anytime
- /etc/rc.conf for rest of configuration
I'm in a similar situation, but for a more ridiculous reason. My reasonably modern hardware should support Windows 11, but I get "disk not supported" because apparently I once picked the "wrong" bootloader?
I can't be arsed, if I'm going to have to fiddle around getting that working I might as well move to linux.
I like that you both started with Fedora. Same for me, but almost 20 yrs ago.
Haven't touched it in a long time; Debian 8 was the point in time where it became fine for me to run on desktop and laptop, not only on servers. Ever since then I've had it on all my 20-something machines.
I know you've already switched, but did you try using FlyOOBE to bypass it?
Prior to a lot of apps transitioning to be web apps, this would be more important, but there’s less value now that almost everything is non-native. Even MS Office is online now
You can bypass the warning really easily, I googled it the moment I saw it and it was very easy. A keyboard shortcut to open the command window during the install and one cheeky command. I agree though that it’s silly they don’t offer it officially.
But I get the feeling you were on the edge of transitioning anyway, which is fine! Sounds more like the straw that broke the camel's back.
If you bypass the installer minimum hardware checks then you're making a gamble that the official statement from Microsoft won't affect you:
> If Windows 11 is installed on ineligible hardware, your device won't receive support from Microsoft, and you should be comfortable assuming the risk of running into compatibility issues.
> Devices that don't meet these system requirements might malfunction due to compatibility or other issues. Additionally, these devices aren't guaranteed to receive updates, including but not limited to security updates.
Aren't you guys actually talking about a TPM 2.0 device being present on the machine and not a CPU specifically? Cause the whole Windows 11 thing was (I thought) full disk encryption with TPM 2.0 attestation booted from a secure boot BIOS. That basically just means you can't take the disk and boot it on another machine. There would be no way to decrypt.
Windows 11 officially requires TPM 2.0, secure-boot enabled, and an AMD Zen+ (Ryzen 2xxx) or later or an Intel Core Gen 8 or later.
https://arstechnica.com/gadgets/2021/10/windows-11-the-ars-t...
> ... the best rationale for the processor requirement is that these chips (mostly) support something called “mode-based execution control,” or MBEC. MBEC provides hardware acceleration for an optional memory integrity feature in Windows (also known as hypervisor-protected code integrity, or HVCI) that can be enabled on any Windows 10 or Windows 11 PC but can come with hefty performance penalties for older processors without MBEC support.
> Another theory: older processors are more likely to be running in old systems that haven’t had their firmware updated to mitigate major hardware-level vulnerabilities that have been discovered in the last few years, like Spectre and Meltdown
I have a few machines which lack a supported CPU. There are CPUs only 6 years old which aren't supported. There may be some newer ones even (I didn't bother to look).
If it was 2000 - it'd be like, "OK boss, you gotta upgrade that old dog of a CPU", but software bloat really hasn't kept up with CPU performance. I've got an i3 which is serviceable enough from 2014. Is it going to be able to keep up with modern SQL Server and Teams and VSCode and all that? Probably not all at once. But totally fine for basic computing.
You can use a TPM for disk encryption with Linux if you want. You also get to use your own secureboot keys if you want. Your choice.
I can't be bothered. My 80386 worked fine without any of the above and I still don't need any of it on a Zen%d (except Linux)
Yea I was looking at this for work. We require full disk encryption for all operating systems but linux is the one where it's a passphrase or a yubikey. In my personal life it would just make managing my PC more annoying. Imagine a motherboard failure and boom there goes my entire disk.
You can have automatic unlock with tpm2, with or without a pin, in addition to passphrase, file, fido2, pkcs#11 cert, or whatever else is supported by luks.
I've been using this for a few years now, and never had an issue.
https://wiki.archlinux.org/title/Systemd-cryptenroll
> Imagine a motherboard failure and boom there goes my entire disk.
You can also set a long-ass key in addition to the other methods, and back it up somewhere safe. It works the same as BitLocker: you have a key which can decrypt the drive without external help from a TPM in case something goes wrong.
Yubikeys are very useful. I was pointed to them by a colleague and was a bit skeptical in the beginning but since then I am more than happy to use them, absolutely flawless execution. The only thing that I am a bit concerned about is that it isn't the key that I place on the device that governs all this so you can't be 100% sure that there isn't some kind of supply chain trick that would allow the manufacturer or one or more of their employees to create duplicate keys.
With Linux I think you do have the option of encrypting with your own cert using the PKCS#11 module on the Yubikey.
That's interesting, thank you, I will definitely look into this.
> Imagine a motherboard failure
Hold up, I'm no expert on Secure Boot, but LUKS allows you to have multiple entry keys to the same drive.
This means you can have one key of random gobbledegook which is kept and auto-used by the magic motherboard, and also a passphrase that you can memorize or write down, and either one is totally sufficient on its own.
You don't even need to set them up at the same time, you can start with one and then add the other as an option later.
Secure Boot is something else. It verifies the boot loader at the BIOS/UEFI level. This can be broken by the system itself (like if it's hacked), so it's protecting you against modifications to the boot loader, which is where kernel modules can be injected.
TPM 2.0 is something else again. It's typically soldered onto the motherboard as a physical device, and a key can be generated in it and then used to encrypt the disk. The private key cannot be extracted, only signatures: you can ask the TPM to sign a binary blob with the private key while it provides you the public key to verify. This protects you against physical access to your device. No one can take your disk and decrypt it.
> the key can be generated and then used to encrypt the disk
Right, you can't recover or copy that specific key, but you also don't have to for accessing your data, if you set up some redundancy before disaster struck.
AFAIK: 99% of your storage is encrypted by a giant fixed unchanging master-key, and that is itself encrypted again with a non-master key/LS or passphrase, which is stored in the remaining "LUKS header". There's room to store multiple copies of the same master-key encrypted with different non-master options.
In that model, the TPM is simply providing (in a convoluted way) its own passphrase for one of those co-equal slots, so having one or more alternates prepared is sufficient to protect your drive from motherboard failure.
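To illustrate the model, here's a toy sketch of the envelope idea only (nothing like the real LUKS on-disk format; the passphrase, the TPM stand-in, and the cryptography-library primitives are just what I reached for):

```python
# Toy sketch of the "one master key, many keyslots" model described above.
# This is NOT the LUKS format; it only illustrates the envelope idea.
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def wrap(master_key: bytes, secret: bytes) -> dict:
    """Wrap the master key under a secret; the result is what a 'keyslot' would store."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    slot_key = base64.urlsafe_b64encode(kdf.derive(secret))
    return {"salt": salt, "wrapped": Fernet(slot_key).encrypt(master_key)}

master_key = Fernet.generate_key()   # the one key that actually encrypts the data
header = {
    "slot0": wrap(master_key, b"memorized passphrase"),
    "slot1": wrap(master_key, os.urandom(32)),  # stand-in for a TPM-released secret
}
# If slot1's secret dies with the motherboard, slot0 can still recover the same
# master key, so the data is not lost.
```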
For some reason that risk never seemed larger than the one that Microsoft would force me into subscribing to more services because they hold my data hostage or that they would be more than happy to pass the keys to my machine to the USG.
But if your next move is to go to Linux where all that applies as well, why would that stop you?
You are correct that a Linux installation is ineligible for support from Microsoft. Not that that means anything for private usage.
Also, Linux has a great track record for not dropping support for older hardware. I think that is a lot more informative than whatever statement Microsoft's legal team has managed to come up with.
There's a ton of outdated guides out there because Microsoft has been patching out workaround after workaround. It's likely that the simple solution you used doesn't work anymore.
There's a little bit of considering it already, yeah. Plus what the sibling comments say about it being clearly against what Microsoft wants, so no guarantee they won't disable it or make it even harder in the future. And there's also the factor that doing any of these check-disabling hacks seems to require a full OS reinstall instead of an in-place upgrade. If I need to do a full reinstall anyway, why not do it with an OS I don't need to hack up to get it to install on a system the OS maintainer doesn't want it installed on.
Apparently, fundamentally, Microsoft does not want me as a user. Hacking around their checks won't change that. I'd rather comply with their wishes and use an OS that actually wants me as a user.
I have two laptops that, even being 8 and 4 years old, fit the specs MS decided to set.
I still kicked it in the can. I'm a happy Arch & Ubuntu user nowadays (will probably migrate that one to an Arch derivative as well, though). I still use WIN11 in my day job. And it is an okay OS. I've had worse. I've had better.
What I find interesting is that I gained on average 30-50% more battery time on the laptops I switched to Linux. It is quite unexpected and, to me, quite frankly amazing. I am writing this on my quite expensive day-job Surface machine. I pulled it from the power connection to sit on the sofa about 20 minutes ago. My battery? At 73%. And I am running Firefox and PowerPoint at the moment (plus whatever corp crapware is installed underneath).
Except for exactly one set of tools (older Affinity progs) I have no need for WIN anymore. And as my day job provides a WIN machine...
My daily driver is a second hand W540 that was made in 2014 or so. It's got the maximum RAM that it will take (32G), and a larger HD, other than that it's just the same old box. It's indestructible and rock solid, drives three monitors and I couldn't be happier with it.
I can confirm this.
Honestly, I am really surprised this is a top comment here. This was an extremely easy workaround. We are all mostly curious nerds here.
All this work because one couldn't google an easy workaround?
Last time I tried Linux it sucked for gaming and I've spent hours trying to install a printer.
Not to excuse Microsoft in this situation, Linux is obviously more open.
I can't speak to gaming, but I was warned about printer issues as well. However, after a hasty switch from Win10 to Xubuntu to save my PhD work, I was able to get the office printer working on Ubuntu that I could never print to on Windows. Sure, I installed a driver, but the dialogue literally directed me to do so. My jaw hit the floor when the test page came out flawless.
Yah, I feel like Linux was way worse with printers in the past... now the story is more like: you'll have a different set of printer issues across the major OSes, but no OS is clearly better or worse.
When was the last time you tried any of that on Linux? Printers have been plug and play (which is impressive considering the hoops I had to jump through on Windows), and with the advent of Proton, there's been no game I've played that's had any issues.
It has been a few years to be fair. However, back when I ran into the issue people said the same thing.
I might try it on one of my older laptops which are in the closet.
Linux has been good for gaming for years now. I think I switched 4 or 5 years ago, and in all that time I've almost never had problems running games.
Which still leaves you in a state where your OS could stop updating at any time because they decide to close the "loophole" or remove the "feature".
Steam now supports 1-click install of its entire library, Windows and Linux native, and the majority of games work. The majority of printers either work or don't. It's not a reasonable expectation that all hardware will work, but you won't need hours of work either.
MS is free to deprecate your work around any given Tuesday when you have work to do leaving you in the same spot with less time available to do anything about it.
You are wrongly assessing the value of the alternatives to boot if you think they were just too stupid to google. Based on the article they already viewed Windows negatively prior to this and thus already had a motivation to switch.
Apple does this all the time, though, and seems to get a free pass here. I have four Macs in my home, and they are cut off at Ventura (for the 2017 iMac), Monterey (for the 2014 Mac Mini and the 2015 MacBook Air), and El Capitan (for the 2014 iMac). They are all stuck at 3, 4, and 5 major OS versions back. Nobody really seems to complain about this, though.
I don't think it's the same. On older Apple hardware, it just keeps on running on the older OS version. You don't get some new features or styling of the new OS, but nothing else changes. On Windows, it periodically brings up full-screen notifications that your hardware is obsolete and you need to upgrade, with the only options being to upgrade or "remind me again later".
They also provide security updates for those old OSes for quite a while, AFAIK.
macOS receives 1 year of full support and 2 additional years for security updates for each version with 6-8 years of upgrade eligibility. Windows 10 received 10 years of support (on top of a free upgrade from Windows 7/8.1 for most users).
I'm not sure why you're counting the years of support for a version of the OS and not the years of support for a computer. The interesting thing is: if you bought a computer at year X, does it still receive updates at X+Y?
There's loads of relatively young computers which can't upgrade to Windows 11 and therefore aren't supported anymore. That's the problem, not how long Windows 10 was supported.
That's great, but it's no silver bullet. We have a 4th gen iPad that was used mostly for consumption. Only one of the streaming apps works with its iOS version.
The same issues plague old Android tablets. Lots of unnecessary e-waste out there so OEMs can sell new devices.
There are a lot of Android devices that look tempting until one discovers how out-of-date the firmware is.
With no option to install your own, of course. Boot loaders should be exclusively for running the manufacturer's lone security update from 5 years ago.
A 2013 MacBook Air on Linux Mint is fantastic.
Software is much more tied to the OS, though. For example, Chrome is still compatible with Windows 10, which is more than 10 years old, while on macOS you cannot install it past Monterey (2021). Not to mention that system applications are also updated with the OS, so forget about using Safari.
I just installed OpenCore and run the newer OSes anyway. It will eventually not be an option when they come up with an ARM-only OS, but at the moment it seems to work ok.
Because Apple has been deprecating hardware regularly since the mid-90s. And they change processor architecture every 10-15 years.
Microsoft was the backward compatibility king.
Though it's kinda funny that Wine can run Windows applications that Windows can't.
Windows doesn't support 16bit apps anymore, but Wine (at least Wine <9.0) still does.
They don't get a free pass, I think people have spoken with their wallets and it shows with the user base counts: Windows 66–73%, macOS 14–16%, Linux 3–4%.
Apple seems to support their previous generation OS on older macs for ~8-9 years or so from what I've seen. You just don't get the latest generation features, they cut it off and move on similar to how Microsoft did.
Apple only gets a free pass from folks who are invested in that particular kind of ... relationship.
I think Apple gets a pass because they're a luxury product. For the record, even though Apple has some really impressive hardware, this is one of the reasons I'm not very big on Apple. People praise their phone's longevity all the time, but I think this is crazy. I could be running a 13 year old computer right now and it would work fine if I had Linux. Smartphones don't really have options for this due to the market capture. Apple's PC could be supported longer, but Apple isn't interested in doing it. (and apparently they change architectures every 15 years anyhow.)
> I think Apple gets a pass because they're a luxury product.
No they aren't. They've just convinced everyone that they are.
I've seen people meme about Android being for people who couldn't afford an iPhone when the fact is that a flagship Android costs just as much as an iPhone.
I tried a "flagship Android" phone once (the top of the line Samsung), it was bugging the second I opened the pack. I returned it and got a cheap Pixel budget line phone. Then a few years later I jumped ship to iPhone, and largely am very happy now. Nothing is perfect, but for me, iPhone is the best I've tried.
That's the contemporary luxury market for nearly all goods; signifiers that tell folks "this item is more valuable because it has the magic sigil" or whatever.
That is the reality of pretty much every "luxury product/brand"...
It is convincing people to pay a premium for what is still at the end of the day a stitched leather bag, watch, computer or smartphone made in factories like everything else.
People pay for names, to project their luxury lifestyle.
It is very rare that the actual quality/performance of a "luxury item" is dramatically above a high-quality equivalent. Does a Rolex tell time and look better than a Breitling? Or a TAG Heuer? Or a Seiko? Each of those represents a different price/style point - and ultimately it is subjective to the consumer - who wants to project a certain style/look.
Or several times more, in some cases
> No they aren't. They've just convinced everyone that they are.
What’s the difference?
Either way, Apple consistently makes decisions that I think put them on the side of “luxury item” — even if I often disagree with those decisions.
I ran an iPhone 6 and an 8 well beyond their years. I only replaced them because the batteries had already been replaced once. But otherwise the phones were fine. I've had the same experience with laptops.
I used to install Linux on older Mac hardware and donate it. I don't think that works anymore with the M chips (at least not in the same easy way).
You can install Asahi Linux on some of the M chips. But you are right, it's not as easy as it used to be.
because desktop Apple users have been domesticated for decades now and just accept whatever shows up in the feeding trough.
I would say it's part of the promise/agreement of buying into the ecosystem, and a known caveat. Might be an overly optimistic viewpoint.
They didn't get a pass from me. My Mac Pro has been running Linux longer than it ran macOS. Apple stopped supporting it officially at Mojave, but I jumped ship earlier when I was forced to do a clean install rather than an upgrade because I had a RAID.
idk what other people give passes to, but I had been a Linux desktop user since the mid 90s and a Mac laptop user since ~G3 iBook years, and I finally gave up on their laptops a few years ago; so it's mainly Linux-Linux now.
I think the last straw was the added telemetry that required so much effort to get rid of, but also for years they have made clear moves to make their laptops progressively more iOS-like, which I cannot stand on so many levels.
Apple will provide software and hardware support for any given product for at least 5 years. After those 5 years, you sometimes will still get security fixes.
The reason for this is that newer software will start using hardware features and capabilities that only exist on newer hardware, not because Tim Cook is evilly cackling in his office "hahhahha! Let's force people to buy new Macs!!!"
If only there were a way to write software that uses new hardware features when they're available but falls back to a legacy path when they're not.
Erm, isn't that last bit a key part of Tim Cook's job (getting people to buy new Apple stuff even if they don't really need it)?
Apple's hardware products generally sell themselves.
It's a really bad time for Microsoft to force consumers to upgrade - even computer parts from 5 years ago are price hiking.
Which Microsoft really should have been able to see coming, since it’s largely their money that’s being used to soak up all the supply of computing hardware.
Someone inside Microsoft probably did see it coming. That comic of their org chart being individual bubbles, all pointing guns at each other really does explain that company's behavior.
Microsoft doesn't seem to be one unified entity, but a bunch of smaller competing companies under the same umbrella, each trying to destroy the other.
Allegedly Sony saw the writing on the wall and secured a good amount of GDDR supply for their consoles. Microsoft did not, and has had to raise the price of the Xbox Series X as a result.
I don't think that Microsoft knows what Microsoft is doing.
Yep. Similar thing forced me off of Apple. They stopped making 17 inch laptops. They started soldering parts into place. Made it so you couldn’t open your own laptop to replace the HD.
Switched to Linux 8 years ago and haven't looked back.
I have a similar issue with Windows. The machine already dual-boots Linux, but it is simultaneously demanding Windows 11 and telling me that it doesn't support it. It's a three-year-old Ryzen, it plays every game I've thrown at it flawlessly - which admittedly is only so many things, but if it could manage Oblivion Remastered at launch, it should manage a bloody operating system, surely.
I hear it might be some TPM thing. If so, it still seems like a bad decision to require this thing, and it's telling that I'm working on speculation here - it doesn't _tell_ me that's what it is.
I had an issue with TPM too, enabling PTT (Intel's on-chip TPM) on BIOS fixed that one.
Something is disabled in your BIOS. Ask an LLM to guide you.
I can solve this problem if I have to. Right now, I don't.
But I think you miss my core point. If "Ask an LLM" is the answer to this, Microsoft are doing an unforgivably shit job of maintaining Windows. Why on earth should that be necessary? Surely they can provide the minimum of information about why they think the upgrade isn't possible.
Microsoft asking to upgrade hardware reminds me of that old joke (from memory, so excuse the bad storytelling):
User: hello, my PC smokes and I would like to purchase an anti smoke software
Computer service: sorry it's not possible, you have to replace the hardware
User: no I really want an anti smoke software
(Later)
User: hello I would like to purchase a new computer
Service: see, I told you that an anti smoke software is not possible
User: wrong! I have purchased one from Microsoft. But apparently it's not compatible with my current hardware
.\setup.exe /product server /auto upgrade /EULA accept /migratedrivers all /ShowOOBE none /Compat IgnoreWarning /Telemetry Disable
Yeah, until microsoft says "Sup there lil buddy? Running an unsupported system? Oof. The next update is gonna really turn it inside-out"
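For anyone squinting at that one-liner, here it is broken out with comments. The switch meanings below are as commonly reported rather than anything official I can vouch for; /product server is the undocumented part that makes setup skip the CPU/TPM checks, while the rest are ordinary documented setup switches that pre-answer the usual prompts:

    rem run from the root of a mounted Windows 11 ISO, in an elevated command prompt
    rem /product server        - treat this as a Server upgrade and skip the hardware checks (undocumented)
    rem /auto upgrade          - in-place upgrade, keep files and apps
    rem /Compat IgnoreWarning  - don't stop on compatibility warnings
    rem /EULA accept, /migratedrivers all, /ShowOOBE none, /Telemetry Disable - pre-answer the remaining prompts
    .\setup.exe /product server /auto upgrade /EULA accept /migratedrivers all /ShowOOBE none /Compat IgnoreWarning /Telemetry Disable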
This is exactly my experience - I have a Lenovo W530 from 2013, it has an i7, 32gb RAM and SSDs (RAID0 for performance, backups are off-device) - and it is STILL lightning fast.
However - EVERY single trick I have tried... the above command, LTSC, Enterprise edition, etc, results in a situation where after installation a few days (or hours) and some updates get installed, and... blue-screen-of-death on every boot.
Gave up, installed Linux - still working through some issues (GPU driver compatibility), but overall it is a much better experience...
I think at a certain point you need to just call it quits with that sort of bullshit. I have my dignity. I'm a fucking grown adult. I'm not going to spend my spare time haplessly looking online to unfuck the new current set of fuckery. Just take the fucking bullet. Learn linux. Congrats you're playing whack-a-mole with a trillion dollar corporation and prolonging your misery. This is stupid.
Yeah, microsoft will never change otherwise. People and companies continue to willingly allow themselves to get abused, and then wonder why Microsoft never changes and continues to abuse them.
So long as said abuse never results in a loss of marketshare and revenue, it will continue. Why would they stop if there's no negative repercussions?
Take backups and disable the updates with group policy. OP just wanted to install Windows 11.
Just stay on Windows 10 at this point. The whole point of upgrading to 11 is to not stay on an unsupported OS.
Seriously, if people are willing to learn all this, they can easily learn Linux and simply tell the corporate overlords to fuck right off.
Well that's never happened before (with Windows anyway), so it's not likely to happen now.
It's happened at least three times:
Win8.1 x64 required double-width compare and exchange instruction support, so people who bought Win8 for a CPU or motherboard that didn't support it had to downgrade to the 32-bit version or lose support in 2016.
Win7 updates from 2018 onwards required SSE2 with no warning.
Win11 24H2 and later won't install on x86 processors that don't support the x86-64-v2 baseline.
Has happened:
Core2Duo, Opteron64 and Athlon64 can run W11 RTM
They will bluescreen booting after an update to 24H2 because they are missing the POPCNT instruction.
https://arstechnica.com/gadgets/2024/02/windows-11-24h2-goes...
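If you want to check an older box yourself before an update does it for you, the relevant CPU flags are easy to read from any Linux live USB. A quick sketch (flag names as they appear in /proc/cpuinfo; popcnt and sse4_2 are the ones 24H2 is reported to trip over, cx16 is part of the x86-64-v2 baseline):

    # prints whichever of these flags the CPU advertises; anything missing from the output is unsupported
    grep -m1 -o -w -E 'popcnt|sse4_2|cx16' /proc/cpuinfo | sort -u
    # on a recent glibc you can also ask the dynamic loader which x86-64 levels it considers supported
    /lib64/ld-linux-x86-64.so.2 --help | grep -i 'x86-64-v'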
Athlon 64 is a 20 yo CPU. At some point...
Hey, my X200 has a Core2Duo and still does everything I need.
(No, I don't need gaming or LLMs.)
From my experience it seems to happen all the time. Settings reset, uninstalled apps reinstalled, firewall settings erased. I went looking for the Windows 10 patch that deleted the Documents folder if you had remapped it to another drive, and it was hard to find an article due to all the other times their updates have also deleted people's Documents folder. This was the first time I recall it happening: https://www.engadget.com/2018-10-09-windows-10-october-updat...
Where can one read the source code of setup.exe
That's, e.g., how I would determine what these commands do
I have had HN replies in the past that argued Windows is open source and thus comparable to UNIX-like OS projects where _the public_ can read the source code and make modifications, _for free_
Absent the source code, we can read Microsoft's documentation
https://learn.microsoft.com/en-us/windows-hardware/manufactu...
It seems like WinPE is the most useful version of Windows, e.g., it allows more options to setup.exe
How does one quickly and easily download and install a copy of WinPE, preferably on removable media
The windows assessment and deployment kit is what you need, with the windows pe add-on: https://learn.microsoft.com/en-us/windows-hardware/manufactu...
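Once the ADK and the WinPE add-on are installed, building boot media from the "Deployment and Imaging Tools Environment" prompt (run as administrator) goes roughly like this; the working directory and the drive letter are just examples, and /UFD will reformat the target stick:

    rem copy the WinPE files into a working directory (amd64 build here)
    copype amd64 C:\WinPE_amd64
    rem write it to a USB stick (replace F: with whatever letter your stick got)
    MakeWinPEMedia /UFD C:\WinPE_amd64 F:
    rem or build a bootable ISO instead
    MakeWinPEMedia /ISO C:\WinPE_amd64 C:\WinPE_amd64\WinPE.iso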
You should be aware there's a 3 day limit to uptime, then PE reboots. You can work around that: https://lsoft.zendesk.com/hc/en-us/articles/360011128377-I-n...
The other thing to tell you is that this is not a live version of windows with all the features of the full desktop. It is the windows that runs the windows installer application, so enough windows to do that and no more.
I would personally recommend linux instead.
Who would argue that Windows is open source? That's hilarious.
What's this?
Whenever I see an unexplained command I don't understand from a random internet forum, I hop onto the production server and run it, just in case it might boost performance. Wouldn't want to miss out on that.
Been doing it since I was 12. It taught me all about the ins and outs of `rm`.
Reminds me of that story from an IRC channel:
A: I have a program that will format your hard drive. I just need your IP.
B: Ok, it's 127.0.0.1
A: Ahahaha, it's 56% now! Lol.
A left the chat. Connection reset by peer.
Sounds like me back in the early 80s when I used to war dial, and people used to share "active" prefixes. I learned all about the 911 prefix when I set my dialer and went to sleep. About 20 minutes later the cops were banging on my front door. True story; I was in 6th grade, got arrested for it.
wow did you get a record? this is some Hackers(1995) vibe stuff
I got taken to juvenile hall, put in a holding area with kids that had stolen cars and stabbed other kids in fights. The funny thing is all these "bad kids" were really cool; we talked about video games (Donkey Kong!). I remember one kid got into a fight with his football coach and broke both the coach's legs. He was a big kid, looked like a grown man. He was pretty much in charge of the holding area. But he was cool as hell, cracked jokes with me. I actually kinda enjoyed the holding area.
Anyway, the officials thought I had just called 911 over and over, like to play a prank. They wouldn't hear anything about my computer or whatever (it was the early 80s). They were pissed. I was kept in the holding area for a few hours, then they let me go home. I was ordered to a bunch of community service, cleaning the parking lots of local parks, stuff like that.
A workaround to install on unsupported hardware that works, but is unsupported and could break during a Windows feature update.
At this point I'd say it's more of a "would" than a "could"
A clever way to maximize the chances that your computer gets bricked on a future Patch Tuesday.
It’s really not
Some of the checks are around CPU features that they don’t currently use but may use in the future. And CPUs don’t typically respond super gracefully to being asked to execute instructions they don’t understand.
LoL with the insane backslash crap
I've been using Linux for 20+ years, but I was fairly happy with Windows 11. At its core it did exactly what I needed it to do, and it allowed me to run some commercial software that is harder to install and run on Linux (Davinci Resolve).
But my Dell hardware drivers were flaky in Windows. My bluetooth had extremely variable availability. And then Windows rebooted itself, against my wishes, 3x in one week. And then there was the promise of Recall.
That's when I wiped Windows and installed Ubuntu. All my hardware issues went away (yes, I had to fiddle the sound driver a little so it didn't crack when it woke up from sleep, and I had to make one small change so suspend worked properly.. but both were easily solvable). My bluetooth has been flawless since and I was able to use my Logitech wireless mouse again.
I'm never going back.
I do a bit of napkin math on Apple Silicon single-threaded performance, GPU performance, and battery management against non-Apple laptops at the same price as a MacBook Air/Pro. I follow DHH (who I otherwise object to) on his adventures with the Asus G14 machines... but I'm not sure its GPU performance matches the similarly priced Apple offering.
Less integrated OS, worse battery management, and weaker performance for more money? I'm not sure. But I'll probably still go that way.
The Intel/AMD laptop manufacturers need to get out from under Nvidia's hardware GPU thumb.
I get the intent, but moving to linux for better bluetooth support is... an interesting take
How so? Bluetooth has been working out of the box (no tinkering) for me under Linux for the past ten years now across multiple devices. Including stuff like APT-X and LDAC. All with proper OS integration (I use Gnome). What's the story on Windows?
Same here. The story for windows, IME, is that my work Logitech BT keyboard works fine, but neither my sony nor shure headphones work at all. Windows says connected, but then disconnects right away. On the same PC which dual-boots linux, they both work fine, with LDAC for the sony and apt-x hd for the shure.
At work, we have BT Jabra headsets. I specifically asked for a corded version, I hate the latency for calls. My windows-using colleagues, for some reason, love wearing a wireless headset and talking through the laptop microphone.
Well I won't be buying Dell again.
Say what you will about Macs, I ain't no fanboy, but from this side of the fence, I had forgotten that drivers were a thing.
Agreed. Given my expectation of travel, it's probably the sanest choice. Esp with the Stores for service.
Why did you want to install Windows 11 anyway? I also have a PC stuck on Windows 10, and it makes me happy that it's now stable and not part of the forced rolling releases in Win11. I'm going to run it on Win10 as long as I can.
Lack of security updates is a problem for a computer that's connected to internet.
I am in the same boat. Running Windows 10 on a Ryzen 5 until the cows come home. I run Rocky Linux on my laptop, but I am a gamer, so I'll hold on to Windows 10. Some Linux distros are bringing in AI. Not ready for that.
> “you can’t install unless you upgrade your CPU”
To be fair, I recently had to switch distros for my little Atom-based server because of a similar deal:
https://en.opensuse.org/openSUSE:X86-64-Architecture-Levels#...
Granted, I only had to convert to Tumbleweed (not trivial, but easier than reinstalling), and the open source nature means there will always be lots of other alternatives, too.
How old is the Atom? My CPU was about 10 years old at the time.
I'm sure there's a million reasons not to, but they could even just open-source Windows 10. Leave you alone with the hardware that you rightfully purchased, and let the community police the security gaps that arise. It's beyond me how planned obsolescence especially on perfectly sufficient hardware is even legal.
That’s not what open source means / how it works.
It's likely a large percentage of that code is also used in Win 11 - it's not like 11 was a complete rewrite.
This was very useful for me. Force install Windows 11: https://news.ycombinator.com/item?id=45853012
You can bypass the warning really easily, I googled it the moment I saw it and it was very easy. A keyboard shortcut to open the command window during the install and one cheeky command.
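For reference, the version of the trick I've seen passed around is Shift+F10 at the "This PC can't run Windows 11" screen to get a command prompt, then a few registry values before going back and retrying. A sketch, assuming the commonly shared LabConfig/MoSetup values still work on current ISOs:

    rem from the Shift+F10 command prompt inside Windows Setup (clean install)
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassTPMCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassSecureBootCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassRAMCheck /t REG_DWORD /d 1 /f
    rem for an in-place upgrade from a running Windows 10, the value Microsoft itself once documented:
    reg add HKLM\SYSTEM\Setup\MoSetup /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f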
A future update may restrict access to your Windows install or break its ability to get further updates. It's a matter of time. They just haven't gotten around to it.
And if you’re thinking you’ll just Google another solution then you might as well spend that time googling efforts for Linux. Windows shouldn’t require constant hacking or tinkering to function.
Microsoft is removing these bypasses over time though.
> because my CPU was 5 years too old apparently.
And yet Nadella writes this:
"For AI to have societal permission it must have real world eval impact. The choices we make about where we apply our scarce energy, compute, and talent resources will matter."
Apparently resources are only "scarce" when Microsoft needs them. When it comes to your consumer outcomes you have to throw away working equipment and buy new.
Is there a good Windows RDP client for Linux? I went looking for one a few months back but didn't find anything definitive.
I use remmina, which uses xfreerdp under the hood. It works well, but I haven't managed to get smartcard authentication working, if that matters for your environment.
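If you want to skip the Remmina layer, the underlying xfreerdp client works fine on its own. A typical invocation (hostname and user are placeholders, and the exact option set varies a bit between FreeRDP 2.x and 3.x):

    # prompts for the password; clipboard sharing and resize-on-the-fly enabled
    xfreerdp /v:host.example.com /u:myuser /dynamic-resolution +clipboard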
> There is a marketing lesson there somewhere.
Microsoft once had the IT world at its feet, because not only were its Windows 9x OSes ubiquitous everywhere, but it had many millions of programmers who had become experts at Visual Basic and Visual C++, so almost all corporates used programs written in these easy-to-learn, not-too-difficult-to-master, fun-to-program-in (yay for IntelliSense and drag-n-drop ActiveX controls), and versatile (despite some limitations) languages. This was also the era where many corporates had complicated databases set up in MS Access or MS SQL Server, because they were easily accessible and usable from front-end applications written in VB or VC++.
Microsoft even evolved it all to adapt to and compete with new ideas from rivals, such as COM+ as an alternative to CORBA, ASP.Net as an alternative to JSP, etc.
Then Microsoft did the unthinkable. It inexplicably threw away all these IT dependencies that it had spent decades building across the world.
Microsoft unleashed .Net on an unsuspecting IT world.
And M$ arrogantly expected the world to also throw away all their years of effort building applications and databases revolving around VB/VC++.
To save their careers, millions of VB/VC++ programmers scrambled to learn these new technologies, but Microsoft just kept updating and upgrading the .Net landscape with increasing frequency, leading to more chaos and confusion. And as the learning curve grew steeper and the .Net scope became too large for sane people to master in a short time, it became apparent to the entire IT world (except Microsoft) that it had become too difficult and cumbersome to build applications for corporations using Microsoft's new-age tools. The interest and ambitions of programmers and corporations towards Microsoft tools quickly waned, especially when they realised that .Net was a mess for installations and called for expensive licenses to build and ship.
So programmers and SOHO/medium-scale companies pivoted to alternatives to Microsoft-imposed nightmares. Python, PHP, MySQL, Linux, Perl, Ruby, JavaScript, JSP, etc. took centre stage, even as the IT world moved away from .Net.
The worldwide chaos caused by Windows Vista and Windows 8 did nothing to soften IT people's disdain for all things Microsoft.
And Microsoft's rivals pounced on such golden opportunities, and slowly ate away at Microsoft's dominance in the corporate world.
Yes, there are indeed some lessons for Micro$oft to learn from these debacles.
"Hubris calls for nemesis, and in one form or another it's going to get it, not as a punishment from outside but as the completion of a pattern already started." ~ Mary Midgley
"And on the pedestal these words appear: 'My name is Ozymandias, king of kings: Look on my works, ye Mighty, and despair!' Nothing beside remains. Round the decay Of that colossal wreck, boundless and bare The lone and level sands stretch far away." ~ Percy Bysshe Shelley, Ozymandias
It's been wild watching MS shoot itself in the foot every year for the past 20 years.
>no upgrade path to Windows 11 because my CPU was 5 years too old apparently.
Let's be real. It's because new systems support DRM and Microsoft has been captured by the media company lobby.
We had that announcement of a new "Verifiable" Linux project from Poettering, other kernel devs, and a bunch of ex-Microsoft employees yesterday. Gives me the heebie jeebies.
Yeah I caught a lot of downvotes for coming out early against it.
How dare you think for yourself in 2026!
Remote Attestation of Immutable Operating Systems built on systemd
Its the "remote" thing that has no place in personal computing, or rather, computing that is to extend one's own autonomy, or agency. Its no one's damn business whether my system is attested or not! I mean, sure theres certainly benefits for me knowing if its attested, but the other road is one of ruin, and will basically be the chains of the future.
If you're trying to remotely attest immutable OSs you are definitely not a home user, or if you are, you're definitely very keen at least and likely a raging self-masochist.
If you're NOT trying to remotely attest anything, you're fine. Just use your chosen OS, dawg.
Remote attestation is just generating a random blob on the remote side and then making the tpm 2.0 module on a computer sign the blob with a private key. You then provide the signature and the public key to the remote for verification. That enrolls that device. After that you can "verify" with a new binary blob and validate a new signature came back with the same key. That full loop is remote attestation. The idea is your disk didn't get moved to another computer. It's a security thing that Linux does need and is capable of being fully open source.
It has nothing to do with drm.
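To make that loop concrete, here's the shape of it with plain openssl standing in for the TPM. This is a deliberately simplified sketch: a real attestation wraps the challenge in TPM2 quote structures, PCR values and an endorsement-key chain, none of which appear here, and the private key would never leave the TPM:

    # enrollment: the device generates a keypair (in reality the key lives inside the TPM)
    openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out ak.pem
    openssl pkey -in ak.pem -pubout -out ak_pub.pem   # public half goes to the verifier
    # attestation: the verifier sends a random challenge...
    openssl rand -hex 32 > nonce.txt
    # ...the device signs it with the enrolled key...
    openssl dgst -sha256 -sign ak.pem -out nonce.sig nonce.txt
    # ...and the verifier checks the signature against the enrolled public key
    openssl dgst -sha256 -verify ak_pub.pem -signature nonce.sig nonce.txt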
It has everything to do with DRM. It’s not “dual use” technology. It has one use, and this is it.
We're gonna be using this to validate someone didn't move your login to another device. Which will protect you from session hijacking. Your work stuff will start requiring it. Your media accounts will too. Or else linux will simply be locked out from major services. DRM is already in your browser. And literally has no connection to identity attestation.
Who is “we”? So we can know who to avoid.
It’s way worse than that. It’s for verified identity and attestation.
If they didn't, people might start capturing copyrighted streaming content and sharing torrents of it. We cannot allow that to happen.
Of course it is, there's no real requirement to have a TPM, plenty of people made a version with that requirement patched out and the system works fine.
I saw a vid where they installed a recent version of linux on a Pentium 1.
I came to rely pretty heavily on Docker and WSL(2) in Windows. I was an insiders user for a bit over a decade, and worked with .Net and C# since it was "ASP+" ...
I had set up a dual boot when I swapped my old GTX 1080 for an RX 5700XT, figuring the "open source" drivers would give me a good Linux experience... it didn't. Every other update was a blank/black screen, and me without a good remote config to try to recover it. After about 6 months it was actually stable, but I'd since gone ahead and paid too much for an RTX 3080, and gone back to my Windows drive...
I still used WSL almost all day, relying mostly on VS Code and a Browser open, terminal commands through WSL remoting in Code and results etc. on the browser.
Then, one day, I clicked the trusty super/win menu and started typing in the name of the installed program I wanted to run... a freaking ad. In the start menu search results. I mean, it was a beta channel of Windows, but the fact that anyone thought this was a good idea and it got implemented, I was out.
I rebooted my personal desktop back to Linux... ran all the updates, and it's run smoothly since. My current RX 9070XT is better still; couldn't be happier. And it does everything I want it to do, and there are enough games on Steam through Proton that I can play what I want, when I want. Even the last half year on Pop COSMIC pre-release versions was overall less painful than a lot of my Windows experiences of the past few years. Still not perfect, but at least it's fast and doesn't fail in ways that Windows now seems to regularly.
Whoever is steering Windows development at Microsoft is clearly drunk at the wheel over something that should be the most "done" and polished product on the planet and it just keeps getting worse.
I want to chime in here. It's the advertisements on my desktop that repel me. There is something deeply personal about ads on my desktop that feels like being violated. This is a computer that I paid for, with software that I pay for, that holds all my most personal files and data. Seeing ads in the OS completely eroded my trust.
Of course, I still use Windows for various things, but I have too much "ick" for it to be the system where I check my email, manage my business, keep my important files, etc.
Windows is really great for lots of things, but I don't trust it.
Yeah. The ads in the start menu are a sign that you are no longer the customer, you are the product. Windows has other similar "features".
I do not have ads in my start menu, and no, I didn't "debloat" my PC. This is a base install where I flipped a couple of settings in the start menu options.
It was a test they ran on the Insiders channel to see how people reacted to them. It never made it into GA, or for that matter to all of the Insiders channels... They'll feature-gate things to some Insiders users and A/B test them to see how the user response looks. There was a bit of an uproar at the time from those that saw them, including myself... I ditched Windows altogether (except my assigned work laptop).
I think some form of ads made it into the release channel. I recently did a clean install of Windows 11 25H2 and I could not figure out how to get App Store ads out of the search results in the start menu. That and a game working better on Linux than Windows was the straw that broke the camel’s back for me and I installed Ubuntu.
How generous of them to allow their paying users to disable the ads. It's only a matter of time until this becomes some sort of premium feature.
You're missing the point entirely.
The problem isn't that ads can be disabled. The problem is that a paid operating system ships with ads in the first place. Full stop. There's no universe where that's acceptable product design, and the fact that you can disable them (for now, at least) doesn't make it less offensive.
I don't understand why you're going to bat for a trillion-dollar corporation here. Your settings work now. Great. They won't after the next feature update, this is a well-documented pattern. Windows updates routinely re-enable telemetry, Bing integration, and promotional content that users explicitly disabled. You're not configuring your OS, you're fighting it.
The TPM2 requirement is pure planned obsolescence. Millions of perfectly good machines binned because Microsoft decided hardware from 2016 is suddenly "insecure"... whilst the actual benefit is DRM enforcement and remote attestation.
It's a corporate compliance tool, not a security feature.
The Insiders build being referenced had actual web advertisements in search results. That's where this is headed. If you're comfortable defending that trajectory, carry on flipping those settings.
>whilst the actual benefit is DRM enforcement and remote attestation.
This is not highlighted nearly enough. It's very bad.
You paid for windows 11? They basically give it away to end users.
Yes, I paid for Windows 11. It came bundled with the £1,900 laptop I bought. The fact that the licence cost is hidden in the hardware price doesn't make it free.
And even if it were free, which it isn't, that still wouldn't justify ads. Android is free. Linux is free. Neither ships with gambling app promotions in the system UI.
Microsoft made $20 billion in Windows revenue last year. They're not a scrappy startup looking for alternative monetisation. The ads exist because they can get away with it, not because they need to.
Many variants of Android ship with ads, particularly on cheaper phones. Xiaomi phones for instance.
Let me know when Android base OS has ads in the UI.
If the largest advertising company on the planet isn’t making it part of the base operating system — then that should tell you something, don’t you think?
OEM volume pricing is $20 USD or so for system builders like Dell, HP, etc last time I saw it, but that was a long time ago. So technically yes it was purchased if you bought a system, it was just built into the price.
Not really. Only upgrading an existing Windows 10 installation is free.
> The problem is that a paid operating system ships with ads in the first place.
You never buy a laptop or pre-built? They are often full of ads that are not built into Microsoft Windows but added on by the OEM.
Now I agree that ads in an OS that you paid for are a big no-no. I never understood why Microsoft treats Home and Pro as almost exactly the same. Sell Home cheaper and with ads, but keep the more expensive Pro clean. Microsoft can do that easily, because Windows Server is just that ...
But on the Linux front, I have never been happy with the desktop experience. Often a lot of small details are missing, if the DE itself doesn't outright crash (KDE, master of Plasma/widget crashes!). And so many other desktops feel like they have been made in the 90s (probably were) and never got updated.
And I do not run W11; I'm still on old and very stable W10. There is no reason to upgrade that I can see. Did the same with W7, for years after support ended (and by that time W10 was well polished and less buggy).
The problem is: what does Linux Desktop offer me beyond removing a few annoyances that I can strip out after a fresh install? Often a lot more trouble, with the need to use the terminal for things that were solved ages ago in Windows. That is the problem ... With Apple, you at least get insanely good M-series hardware (yes, memory/storage pricing is insane) in exchange for the OS/desktop switch.
I noticed that the people who switch to Linux are often the ones more likely to spend time fine-tuning their OS, tinkering around, etc... aka people with more time on their hands. But when you get a bit older, you simply want something that works and gives you no trouble. I can literally upgrade my PC here from Nvidia to AMD or vice versa, and it will simply work with the correct full-performance drivers. It's that convenience that is the draw to keep using (even if it's an older) Windows.
For 25 years now, every few years I look at switching to Linux permanently, install a few distros, and go back. Linux Desktop does not feel like you gain a massive benefit, if that makes sense? Especially not if you're like me, someone who simply rides out Microsoft's bad OS releases. What is the killer feature that makes you say: hey, Linux Desktop is insanely good, it has X, Y, Z that Microsoft does not have? That is the issue in my book. Yes, it has no ads, but that is like 5 minutes of work on a fresh install, a 2-minute job of copy/pasting a cleanup script to remove the spyware and other crap, and you're good for a year. So again, killer features?
Often a lot of programs are less developed or stripped down compared to their Windows counterparts, and way too often they have a 90s feel. You can tell they're often made by developers, with no GUI/graphics designers involved lol.
I've said it a 1000 times, but Linux Desktop suffers from a lot of distros redoing the same thing over and over again. Resulting in this lag ...
That is my yearly Linux rant hahaha. And yes, I know, W11 is a disaster, but I simply wait it out on W10 and see what the future brings when the whole AI hype dies down and Microsoft loses too many customers. I am betting that somebody at MS is going to get scared and we then get a better W12 again.
I've been pretty happy with Pop in general, I did upgrade to COSMIC pre-release about 6 months ago, and although there have been rough edges, less than some of my Win11 experiences. I don't really fiddle that much in practice, I did spend a year with Budgie, but only the first week fiddling. Pop's out of the box is about 90% of what I want, which is better than most.
I do use a Macbook M1 Air for my personal laptop and have used them for work off and on over the years... I'm currently using a very locked down windows laptop assigned from work. Not having WSL and Docker have held me back a lot though.
In the end, I do most of my work in Linux anyway... it's where what I work on tends to get deployed and I don't really do much that doesn't work on Linux without issue at this point. Windows, specifically since Win11 has continued to piss me off and I jumped when I saw something that was just too much for me to consider dealing with. I ran insiders for years to get the latest WSL integrations and features. This bit me a few times, but was largely worth it, until it wasn't anymore.
C# work is paying the bills... would I rather work on Rust or TS, sure... but I am where I am. I'm similar to you in that I looked at Linux every few years, kicked the tires, ran it for a month or a couple weeks and always went back. This time a couple years ago... it stuck. Ironically, my grandmother used Linux much longer than I ever did on her computer that I maintained for her. For her, it just worked, and she didn't need much beyond the browser.
> You never buy a laptop or pre-built? They are often full of ads that are not built into Microsoft Windows but added on by the OEM.
This was never acceptable, but we tolerated it because it subsidised the cost of the laptop; OEMs decided on the trade-off, and you could vote with your wallet for cleaner experiences (often with the same manufacturer).
Show me the ThinkPad T or X series (or EliteBook, or Precision/Latitude) that shipped with ads and I'll take it as a valid point. Otherwise, it's not valid.
> if the DE itself doesn't outright crash (KDE, master of Plasma/widget crashes!). And so many other desktops feel like they have been made in the 90s (probably were) and never got updated.
All modern Linux desktops feel more advanced than the corresponding windows version, IMO. I just installed standard Raspbian on a bunch of Raspi5s, and it feels snappier and more advanced than Windows already.
For those that want to remove items, You can quickly disable these options by going into Settings > Personalization > Start and turn off "Show recommendations for tips, shortcuts, new apps, and more".
It's like a 10 second fix and basically everything is gone.
That's not what I'm referring to... it was a beta test that included actual internet ads in the start menu search results... It was literally a product I was looking at on the previous day.
Again, let's clarify here.
Microsoft implemented, in its Beta Windows Insider Channel in 2024, ads in the "Recommended" section of the Start Menu. The very section I just described pretty plainly how to turn off.
I mean I don't understand why everyone is so puffed up about this. You read some internet headline and start screeching about it on social media as if it doesn't take 2 seconds to literally turn off.
AGAIN this is NOT what I was referring to... I'm referring to when you start typing in the name of a program you have installed, and you get a short list of matches, with maybe additional results... not a PRODUCT ADVERTISEMENT (not software) from the internet at the top, which is what I got.
It's not a feature that should EVER exist at an OS level... I didn't even mind the adjacent product ads or the Recommended section you mention that much... but it's emphatically not what I'm fucking talking about.
The fact that this was even something that was implemented and tested means that I'm not someone who will buy or choose Microsoft Windows at all from here forward. I have over 3 decades of development experience on/for/with software that runs on Windows.
And even then... It doesn't matter if I can shut it off; it shouldn't have existed in the first place.
It might be different if they didn't push updates every other month that changed settings like default browser back to their products.
You're right that there are simple fixes, but the point is that Microsoft is no longer on your side. You're now stuck defensively scrambling for value in a product where you are no longer the customer.
The Ass-Fucker 3000 fucks you in the arse when you use your car ignition.
But don't get bent out of shape - you can disable it in settings. Takes 10 seconds. Assuming you know it exists and the option doesn't disappear in a future update.
And if it re-enables itself after the next patch? Well, at least the option to disable it still exists! Probably.
Why would you buy a different car? It's so easy to turn off. What, you want to use a BMW? Be a BMW-user? A sheep? All your tools already integrate with our car anyway. There's no real choice, is there - unless you want to be a try-hard. And maybe it doesn't even work properly. You don't want that hassle, do you? Just accept the Ass-Fucker 3000. Next week they'll add the Wife-Beater 2000, but don't worry -that'll have a toggle too.
Cope harder. I wish I had apologists like you for my software.
This sounds like Boeing MCAS cant.
I mean you can also copy my dotfiles onto your linux machine and have a more advanced system than anything windows would provide, and it'll take less than ten seconds, but this is 'fiddling' or somethin.
It's really hard to maintain a product team where the mandate is just "don't break anything and keep the quality high". Especially something with as big of an installed base as Windows.
The team will look for excuses to build new and exciting stuff and new opportunities to increase revenue. Even if the product is pretty much "done".
I disagree, I think companies mostly just don't want to spend development money on existing "finished" products. That's the smell I'm getting from microsoft.
There are plenty of easily identifiable issues with performance in windows 11. There should be people in the windows team dedicated to eliminating "jank". MS product owners, on the other hand, are much more interested in getting copilot integrations into every menu. That's an "easy" task which looks good on a scorecard when you complete it.
No, it really shouldn't be... You can reduce headcount a lot, which they did, and concentrate on bugs (including security reports), while working with hardware vendors for if/when new features need to be integrated for better usability.
If/when you decide to do a redesign, it should be limited to a specific area, or done in such a way that all functionality gets moved to its new UI/UX in a specified timeframe and released when done. Not: oh, here's a new right-click menu where a third of the time you now need an extra click to reach the old menu that has what you're actually looking for, because the old extension interface was broken.
Want a real exercise in fun ... just for fun, because I know it's not as useful on a laptop, but was fun on desktops... get a screensaver working in windows that runs for an hour or so before going to sleep... just try it... that's a fun exercise in frustration... oh, it's still in there, but every third update will disable it all again. I get it... but you know what, I want my matrix screensaver to run when I'm only away for a few minutes or over lunch.
The mandate seems to be "squeeze everything for a subscription fee and keep the quality... actually, just the first thing".
Every month more and more people switch to Linux and I just love it. I'm tired of one company controlling the core operating system of 85% of desktop computers and users being at their whim.
You want proprietary programs? Alright, fine, one can argue for that. But the central, core operating system of general purpose computers should be free and fully controllable by the users that own them!
It's a form of Stockholm Syndrome for most people. They'll have some bit of software they imagine is irreplaceable because they have some special use case that means that they just have to tolerate the relentless abuse. Or some other excuse. Whatever. It all boils down to people being afraid of change.
Most of that fear is not all that rational. It's not unlike kidnap victims falling in love with their captors. Your mind just tries to make the most of what fundamentally is a really messed up situation. You'll tell yourself it isn't that bad or that the next update will fix it or that you can get some magic software thingy that makes it go faster. Whatever.
Once you realize you are being abused, you can make some choices and do something about it. Most tools can be replaced if you look around a bit and do a bit of research. And virtual machines on Linux can run Windows just fine if you have one or two things that really need it (been there, done that). There's also Wine and Proton, which aren't half bad these days. And they work for lots of things other than games. You can dual boot. Etc. Try it and find out. The absolute worst case is that you have to go back to being a lame abuse victim here. You'll feel extra bad because now you know. The best case gets you out of that abusive relationship for the rest of your life. Life is too short to get subjected to this kind of abuse.
This might have some merit for some people.
But the talk of abuse is also heavy-handed.
I've spent months testing and trying out RAW photo editors, and months trying out Linux gaming.
Linux is incredible, but my experience with Windows is still better. As many that still use it can attest, you can disable almost any annoyance. It's extremely stable. Things just work including brightness controls, fractional scaling, high refresh rates and high FPS gaming, and my favorite RAW photo editing. I could switch to a less enjoyable experience with Linux but I choose not to after extensive evaluation. I don't spend any money on Microsoft services, no Office or OneDrive subscription.
But my decision isn't permanent. My hobbies, software use, gaming selection, etc. can change over time, and Linux is getting better while Windows is getting worse. If it's ever "abuse" and I can have a better experience with Linux, I won't hesitate to change. But it's also a lot of effort to try out alternatives, and dual booting is slow and annoying. Plus when I dual boot to Linux Mint the kernel fails to boot every other time and I have to select an older one, reboot, select a newer one, reboot. It's a huge waste of time. A bad experience and I have chosen to avoid it and try again in another year or two.
Did you try Rawtherapee?
Yes, and darktable.
In the end I love, love, love DxO PhotoLab and would really like to stick with it over the alternatives.
I don't think it's Stockholm Syndrome; rather it's a classic case of the sunk cost fallacy. For me at least, that's what it was. I had invested so much time in Ableton (~14 years) and didn't feel like starting from scratch with another DAW. And let's be real, no one likes that kind of friction.
It had to get worse to finally break the inertia and also make me realize that it's only going downhill.
I've dual booted for over a decade at this point and I still need to switch back and forth for different things.
> Every month more and more people switch to Linux
We've been hearing this for decades and yet the home Linux userbase is microscopic and somehow even smaller than ever. Unless we're going to count Google's Android and Chrome OS. Those are the only Linux-based distributions that have ever gained market share over desktop Windows.
Somehow I think the stars might be aligning this time though. People are genuinely fed up with Windows and governments around the world are loudly thinking about how to reduce dependence on US tech. And then there is Proton which makes it much easier for Gamers to jump ship. To me it feels like there is more momentum than ever for this.
On the other hand I am also a realist and I don't think that Linux will take over the Desktop, but it will certainly have its biggest growth year ever in 2026.
> On the other hand I am also a realist and I don't think that Linux will take over the Desktop, but it will certainly have its biggest growth year ever in 2026.
I _love_ Linux, but I agree with this as well. I don't think Linux will ever be easy enough that I could recommend it to an elderly neighbor. I hope to be proven wrong, though.
What frustrates me about this particular moment is that at the same time Windows is getting worse, I feel that OS X is _also_ getting worse. This _is_ an opportunity for Apple to put a big dent in Windows market share.
> Somehow I think the stars might be aligning this time though
> governments around the world are loudly thinking about how to reduce dependence on US tech
I am definitely sympathetic; after all, I worked for a major Linux company for quite a few years, started using Linux (RH) in 1994, and even wrote some network-related kernel modules.
However, this switch to Linux is not going to happen (apart from where it is already used heavily, from servers to many non-PC systems).
I have been in projects for large companies but also government on and off. Now, I manage the IT of a small (<50 employees) non-IT business with people in several countries.
People who actually comment in these discussions seem to be entirely focused on the OS itself. But that is what matters the least in business. Office is another, and even there most people who don't deal with it at scale are way too focused on some use case where individuals write documents and do some spreadsheeting. It's almost always about a very small setup, or even just a single PC.
However, the Microsoft stack is sooooo much more. ID management. Device management. An uncountable number of little helpers in the form of software and scripts that you cannot port to a Linux-based stack without significant effort. Entire mail domains are managed by Office 365 - you own the domain and the DNS records, you get licenses for Office 365 from MS, you point the DNS records to Microsoft, you are done.
Sure, MS tools and the various admin websites are a mess, duplicating many things, making others hard to find. But nobody in the world would be able to provide soooo much stuff while doing a better job. The truth is, they keep continuously innovating and I can see it, little things just conveniently showing up, like that I now have a Teams button to create an AI script of my conversations, or that if more than one person opens an Office document that is stored in OneDrive we can see each other inside the document, cursor positions, and who has it open.
Nobody in their right mind will switch their entire org to Linux unless they have some really good reasons, a lot of resources to spare, and a lot of experience. Most businesses, for whom IT is not the be-all-end-all but just a tool, will not switch.
But something can be done.
The EU could, for example, start requiring other stacks for new special cases. They cannot tell the whole economy to switch, not even a fraction of it, but they could start with new government software. Maybe - depends on how it has to fit into the existing mostly Microsoft infrastructure.
They could also require more apps to be web-only. I once wrote some code for some government agency to manage business registrations, and it was web software.
The focus would have to be on creating strong niches for local businesses to start making money using other stacks, and on taking the long road: slowly replace US-based stacks over the next two or three decades. At the same time, enact policies that let local businesses grow using alternative stacks, providing a safe cash flow that does not have to compete with US-based ones.
The EU also needs some better scaling. The nice thing about the MS stack is that I can use it everywhere, in almost all countries. The alternative cannot be that a business would have to use a different local company in each country.
I read a month ago that EU travel to the US is down - by only ~3%. Just like with any call to boycott this or that, the truth is that those commenting are a very tiny fraction. The vast majority of people and businesses are not commenting in these threads (or at all), and their focus is on their own business and domain problems first of all. Switching their IT stack will only be done by force, i.e. if the US were to do something really drastic that crashes some targeted country's Microsoft and cloud IT.
> However, the Microsoft stack is sooooo much more. ID management. Device management. An uncountable number of little helpers in the form of software and scripts that you cannot port to a Linux-based stack without significant effort. Entire mail domains are managed by Office 365 - you own the domain and the DNS records, you get licenses for Office 365 from MS, you point the DNS records to Microsoft, you are done.
Is there any bit of this that is not web based or does not support Linux nowadays? Office 365 runs on a browser, and even Intune supports some enterprise oriented distributions, like RH, so device management shouldn't be a problem. But even if none of that was true, there is certainly competition in the IT management space. Defaulting to Microsoft just because of a Windows based fleet doesn't sound like a great idea.
> The truth is, they keep continuously innovating and I can see it, little things just conveniently showing up, like that I now have a Teams button to create an AI script of my conversations, or that if more than one person opens an Office document that is stored in OneDrive we can see each other inside the document, cursor positions, and who has it open.
This is stuff other vendors have been offering for ages now.
The browser versions of the Office apps aren't comparable to the native apps, and also don't support whatever native integrations (like VBA add-ins) companies use.
They may not be, but I can almost guarantee that Microsoft will get rid of them sooner than later.
Trading dependency on a company in Redmond, WA, USA for one in Mountain View, CA, USA does nothing for moving away from the USA in the dependency chain, but it proves that it's possible. And I know it's possible, as there are several billion-dollar companies on Google Workspace that I know of personally. And if it's possible for them, it means it's possible for the EU to get there. The only question is: will they ever? Let's form a committee to schedule a meeting to look into that question.
"Possible" is everything that does not violate any laws of the universe, that is not a useful criterion!
Oh and thanks for ignoring everything I wrote I guess. Not that I expected anything different, it is always the same in these threads after all. Why bother with arguments, especially those of the person you respond to?
But you see, this "laziness" actually supports my point. Not even you want to do the hard thing and bother with what somebody else thinks when there is a much easier path. But you expect others to care about the things that you care about, without spending much effort even merely understanding their position.
Go and download the archives of Reddit; there are plenty of torrents out there. Filter to a sub like r/gaming and plot the relative frequency of Linux mentions. You'll see an order-of-magnitude increase over the last 12 months compared to the years before. It's real.
Must admit, I'm not sure the data torrents are up to date now that Reddit anti-scrapes so hard to raise the premium on its exclusive contract with the highest bidder, OpenAI.
Calling 4-5% marketshare microscopic is not fair. I get it if it was still stuck at 1%, but it's growing, and the rate of growth has been increasing too.
Is the desktop/laptop linux market share really over 4%? What is that based on?
At least according to Statcounter, Linux is currently at 3.86% worldwide: https://gs.statcounter.com/os-market-share/desktop/worldwide.
It's slightly larger in the US at 5.28%: https://gs.statcounter.com/os-market-share/desktop/united-st...
In India, where I live, it's surprisingly at 6.51%: https://gs.statcounter.com/os-market-share/desktop/india
Take this with a grain of salt, because numbers from Statcounter are not fully accurate. However, none of those numbers are small. 3.86% of the entire PC market is not something to scoff at.
There's also the people like me that couldn't historically run certain games well directly on Linux, so we have Windows virtual machines with GPU passthrough. Which would read as me being a Windows user in the Steam stats, but a Linux user in other stats.
The state of gaming has improved drastically since I started doing it that way, though, and I'm considering ditching the VM entirely. Multiplayer games seem to be getting the hint about anticheat exclusion on Linux. ARC Raiders, for example, is a competitive game and runs flawlessly directly on Linux.
The high amount of "Unknown" is interesting. Especially as it doubled in the last 6-8 months.
"Unknown" is always mostly some version of Windows that they couldn't classify for one reason or another.
Last time I looked on stat counter it showed 4 and something percent. That's where I pulled the number from. But it seems they updated it to 3.86 now. It's so over for the Linux community.
The Steam survey has it at 3.6%, although that's obviously skewed towards gamers, and counts Steam Decks in addition to desktops.
According to Statcounter, Linux's share is 3.86% and rising; but I'd imagine that quite a bit of the almost 16% 'unknown' is also Linux.
https://gs.statcounter.com/os-market-share/desktop/worldwide
Not insignificant at all.
Maybe more interesting is that if you switch to looking at just America Linux jumps to 5.25% (unknown 7.4%) (Similar numbers for all of NA), and for Europe it is 4.32% (9.75% unknown).
Again, not huge numbers but also not insignificant. But they are quickly growing and taking share from Microsoft. If we look back at (Dec) 2021 the numbers are 1.8% and 2.2% respectively. Those gains are meaningful.
You can see that while the Windows 10 numbers have been going down over the past few months, the Windows 11 numbers aren't making up for it. About 2/3 of that gap is going to Linux, with the other third going to Mac. So Mac is gaining share at Windows' expense as well. A significant number of disgruntled Windows users have left over the past year.
It's not. The problem is that people don't realize that devices like the Steam Deck are also counted as Linux desktop devices in those numbers. ChromeOS tends to inflate those numbers too. Yes, they are Linux desktops, but not in the way people mean when they compare Windows to Linux.
The real number is somewhere closer to 2.5%, which is still growth, but nowhere near the "year of the Linux desktop".
You tend to see a rather vocal minority that makes it feel like there is some major switch going on, but looking here in the comments, the people that switched 8, 12, or 20 years ago are already part of the old statistics. There are some new converts, but not as many as you'd expect, despite Linux now being far more gaming-compatible.
It still has minor issues (beyond anti-cheat) that involve people fixing things, less than in the past, but it's still not always click-and-play: working at every resolution, no graphics issues, etc. That's the part people often don't tell you, because a lot of them are tinkerers: an issue pops up, they fix it and forget about it.
Ironically, macOS just dominates as the real alternative to Windows in so many aspects. If Apple actually got its act together about gaming, it could become an actual strong contender to Windows.
The Steam Deck is a Linux desktop device. It is literally a thin laptop with a built-in screen and joysticks, running Linux. Does my Linux system stop being a desktop when I turn on Big Picture mode in Steam? You can run the Steam Deck as your daily driver hooked up to a keyboard and a monitor.
The Steam Deck is not a desktop... That is like saying that every Android smartphone is a desktop. Sure, you can use it as a desktop, but 99.99% of people use it as a handheld console.
And nice downvotes... Typical in Linux Desktop topics.
A growth of 4% over 20 years is not an increasing rate. And yes, 4% market share is microscopic. macOS has a bigger share, but you wouldn't say macOS is massive. Posts like this are cheerleading OSes because everything needs to be a zero-sum competition.
But it's also not not an increasing rate, there's not enough information to know if the rate is increasing or not.
As phones replace desktop computers for non-technical users, leaving a concentration of "skilled" users, my suspicion is that the pattern will resemble the quote "Slowly, then all at once."
Have a look at the Steam Hardware [and Software] Survey [0] results. Linux has been trending upwards whilst Windows has been trending down for a wee while. And the population this looks at is primarily interested in gaming, which means this is despite a compatibility layer being needed for a large amount of the software used. I imagine in other communities (software developers, older people) it's trending much faster.
E.g. I recently installed Linux Mint for my grandma so she could use email and an up-to-date web browser on her old laptop that can't run (secure) Windows anymore. The UI differences are marginal for her, and she can do everything she needs to much better than she could before (which was not at all).
I mean, this is literally false? The desktop Linux user base is growing, it's bigger than it has ever been even without including ChromeOS, and more OEMs are shipping devices with desktop Linux than ever before (Valve's suite of devices, multiple laptop vendors including major ones like Lenovo, a few Steam Deck competitors).
More and more desktop apps are just becoming websites. More and more desktop apps are using Electron rather than some native app. Windows is slowly becoming a dumpster fire in terms of usability and issues. Most games these days Just Work on Linux without any tinkering.
While I hardly think this year will be "the year of the Linux desktop" or whatever, if these trends keep going I really do foresee Linux market share growing, slowly, each year, until it's not so microscopic anymore.
I mean, the Steam Deck was a pretty significant inflection point quite recently. Making gaming viable on Linux via a popular consumer product is a huge deal and starts to kill one of desktop Linux's single biggest barriers to adoption.
According to the Steam Hardware Survey (https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...) only ~3.6% of Steam users use Linux, and these statistics include Steam Deck users. SteamOS accounts for ~26% of Linux users, which in turn brings the count down to ~2.6%. For comparison, macOS is at ~2.1% of the survey at the moment. Wake me up when Linux gets to 10%.
> You had unsaved work? Too bad, it's gone, get bent.
This has happened to me a couple of times. I put the PC to sleep and the next morning I discover it has decided to close everything to install an update.
Not using Windows ever again to do any work. Say what you will about Apple but at least they don't do crap like this.
I aspire to have your level of confidence in anything that amounts to leaving unsaved work in any sort of shape or form.
The point is, the user shouldn't have to work around the OS. The OS should work around the user. If there are unsaved files, the OS should not be installing an update and removing unsaved work.
I installed Windows Update Blocker (AKA "WUB") and I've stopped the nonsense shutdowns late at night.
That helped stop the aggravation, but let's see how long I last. I do feel my next computer will run Linux... but I'm not a programmer and I wince at having to do all the Wine installs fresh...
The fact that you leave unsaved work overnight is the actual crazy part.
Why though? On Mac, I have tons of unsaved work: many TextEdit windows which keep their state for many months, even through reboots. And it has been working like this for at least 10 years. It's such a simple, little quality-of-life thing. And Microsoft just doesn't care.
This is what a computer should be doing: helping the user to get their work done, without the user having to worry about insignificant details about saving files. E.g. does Google Docs ever ask where to save a file before closing the browser or shutting down the computer? No you just get an untitled document that is automatically saved. If I want to rename it or save it in a different location, I am free to do so. But as long as I don't, it doesn't get in the way and just persists stuff automatically.
I don't disagree, but you have to know which applications reliably keep their state across restarts. You can't blindly rely on it on any desktop system. The Microsoft Office applications actually do auto-save documents since a couple of years ago, even though the recovery UX can be a bit awkward.
What Microsoft doesn't care about is that you may have applications running that don't do that, when Windows reboots for updates.
On macOS the feature is baked into the OS's APIs; the app developer just opts into using them. If they don't, quitting with unsaved work will prompt the user modally and block the restart, to the point where the OS will time out the reboot process and give up. The only way to purposefully lose unsaved work in almost every app I've ever used on macOS is to yank the power cable or hold the power button down.
Window locations and app state are written to plist files, again, using OS libraries and APIs for app resume. I can reboot my Mac and not even realize it happened sometimes it all comes back the way it was.
The blocking happens on Windows as well, except that the timeout logic is the reverse: it force-quits the applications then, because presumably the potential security update is more important.
Yep. On Mac (and Linux, actually) I know of some applications that do that. I also know that on Windows most applications don't do that. I would also never leave un-saved work open on Windows.
I was replying to: "The fact that you leave unsaved work overnight is the actual crazy part". As long as you know which apps auto-save and know you can somewhat rely on them, it's not so crazy.
Ok, I wouldn't do that because I don't know what random apps are doing.
But if you're happy with your workflow, don't mind me.
Of course, everyone has their own workflow. I won't tell anyone to adjust their workflow. But the exact point I was trying to make is that it's not random apps. It's specific apps that one knows about and how they behave. And once you know those apps (like TextEdit, Google Docs, etc) you can pretty much rely on it to survive reboots and power outages.
Personally it's rare that I leave something unsaved. That said it has never been an issue on macOS in 20 years.
There's plenty of tasks that can take hours that don't save their progress. E.g. running a simulation, training an AI model, rendering video. Or, these days, leaving agentic AI models running in a loop implementing tasks.
Even if the state is recoverable, it doesn't mean that it's simple to recover.
I would be infuriated if my OS decided to shut itself down without permission.
Huh?
I use a Mac and a Linux box. It'd never cross my mind that I can't leave some unsaved changes overnight. I leave unsaved changes for weeks across the many things I am working on.
The flip side of this is I can't count the number of times that cropping a screenshot in Paint and leaving it in the background has partially stopped my PC from rebooting, and I've discovered the following morning that "you have unsaved work" in Paint interrupted the shutdown and I need to do the shutdown _again_.
Happens to me way too often. And it's frustrating when the system doesn't include automatic backup saving. I have disabled auto-update because of this.
Happened to me just a few days ago. Woke up, turned on PC, all my open programs were gone due to a Windows Update...
Not just a couple of times. It happened to me countless times.
Correct. Windows is not a serious operating system. It really never has been. I've been on desktop Linux for decades now. Linux is a serious operating system: nothing happens without you asking it to. My Linux computers are never turned off, from the day I turn them on, except for the occasional kernel upgrade. Otherwise, all upgrades are live. Even most kernel-upgrade reboots can be avoided if you use one of the modern live-patching frameworks.
I literally cannot count the number of times I put my Linux computer to sleep and it just doesn't wake up, and I have to hard reset the power to get it to do anything. I would never leave anything unsaved open for an extended period of time on a graphical Linux system.
Happens 90% of the time on my standard EliteBook laptop when I run Windows: it just crashes and the fan goes crazy. On Linux it's been fine since day one, some 5 years ago.
But this is a bug, and it's very different from the OS voluntarily rebooting without your consent.
Meanwhile on macOS, modern apps will not lose data even if the power is yanked out at any point.
It was a good read until the end...
> For the remainder of 2026, Microsoft is cooking up a big one: replacing more and more native apps with React Native. But don't let the name fool you, there's nothing "native" about it. These are projects designed to be easily ported across any machine and architecture, because underneath it all, it's just JavaScript. And each one spawns its own Chromium process, gobbling up your RAM so you can enjoy the privilege of opening the Settings app.
I'm a little tired of people dunking on React Native when they have no clue what they're talking about (and I'm not even a React Native dev, I'm an iOS dev). React Native doesn't spawn any Chromium process. This is not Electron. React Native doesn't even use the V8 engine. All UI views and widgets are native, the platform SDK is native, and Yoga layout is native C++ and even faster than UIKit layout. The majority of the RN codebase is native; go have a look at the languages section on GitHub. JS is only 19% of the codebase, everything else is C++, Obj-C, Obj-C++, Kotlin, Java.
The problem with startup being laggy, AFAIK, was making HTTP requests to download those ads.
> React Native doesn't even use v8 engine
Are you saying you would use React Native with a language other than JS?
Engines other than v8 exist. React Native uses Hermes or JavaScriptCore (Apple/Safari). [0]
Other engines include SpiderMonkey (Mozilla/Firefox) [1] and QuickJS [2]
[0]: https://reactnative.dev/docs/javascript-environment
I'm a little tired of "hey I installed Linux!" posts. Ok, you installed Linux. Great. Wow! Now what, wanna show a screenshot of your desktop with an anime girl as the baackground and neofetch in a terminal window?
I've been running Ubuntu Linux for a long time now (over a decade, started with 8.04). Linux still has its fair share of bugs, but I'll take having to deal with those over running Windows or macOS any day.
For me the biggest thing is control: with Windows there are some things, like updates, that you have zero control over. It's the same issue with macOS; you have more control than on Windows, but you're still at the whims of Apple's design choices every year when they decide to release a new OS update.
Linux, for all its issues, gives you absolute control over your system, and as a developer I've found this one feature outweighs pretty much all the issues and negatives of the OS. Updates don't run unless I tell them to run; the OS doesn't upgrade unless I tell it to. Even when it comes to bugs, at least you have the power to fix them instead of waiting on an update and hoping it will resolve the issue. Granted, in reality I wait for updates to fix various small issues, but for bigger ones that impact my workflow I will go through the trouble of fixing them myself.
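(For the curious, on Ubuntu-family systems the "updates only run when I say so" part mostly comes down to unattended-upgrades; a minimal sketch, assuming a stock install, worth checking against your release's docs:)

    # stop and disable the background updater
    sudo systemctl disable --now unattended-upgrades.service

    # and tell apt's periodic jobs to do nothing
    printf 'APT::Periodic::Update-Package-Lists "0";\nAPT::Periodic::Unattended-Upgrade "0";\n' | \
        sudo tee /etc/apt/apt.conf.d/20auto-upgrades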
I don't see regular users adopting Linux anytime soon, but I'm quickly seeing adoption pick up among the more technical community. Previously only a subset of technical folks actually ran Linux because Windows/macOS just worked, but I see more and more of them jumping ship with how awful Windows and macOS have become.
I remember when Ubuntu decided to reroute apt installations into SNAP installs. So you install a package via apt and there was logic to see if they should disregard your command and install a SNAP instead. Do they still do that?
It annoyed me so much that I switched to mint.
> Do they still do that?
Yes. I know it's more than Firefox, but I don't have the full list. On 24.04:
    me@comp:~$ apt info firefox | head -n 5
    WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
    Package: firefox
    Version: 1:1snap1-0ubuntu7
    Priority: optional
    Section: web
    Origin: Ubuntu

I agree with the sentiment, but I keep Snap disabled because I like Kubuntu (Ubuntu with KDE) for its rock-solid stability.
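With Snap disabled you need a deb source for Firefox; the commonly shared workaround is Mozilla's PPA, pinned above Ubuntu's transitional package. Roughly like this (a sketch, double-check against current instructions before relying on it):

    # see where apt would pull firefox from, and what's already a snap
    apt policy firefox
    snap list

    # add Mozilla's PPA and pin it above the snap-installing transitional deb
    sudo add-apt-repository ppa:mozillateam/ppa
    printf 'Package: firefox*\nPin: release o=LP-PPA-mozillateam\nPin-Priority: 1001\n' | \
        sudo tee /etc/apt/preferences.d/mozillateam-firefox
    sudo apt install firefox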
The control is both a blessing and a curse. It’s really easy to accidentally screw things up when e.g. trying to polish some of the rough edges or otherwise make the system function as desired. It also may not be of any help if the issue you’re facing is too esoteric for anybody else to have posted about it online (or for LLMs to be of any assistance).
It would help a lot if there were a distro that was polished and complete enough that most people – even those of us who are more technical and are more demanding – rarely if ever have any need to dive under the hood. Then the control becomes purely an asset.
This is literally Linux Mint, Zorin, and several other distros. I haven't had to "go under the hood" on my daily driver machines that run either of these distros for over 7 years.
I think at this point people are just (reasonably) making excuses not to change.
Those and other big distros are better in that regard, but they're still not perfect. Depending on one's machine and needs, there can still be pain.
One recent example I experienced is jumping through hoops to get virtualization enabled in Fedora… it takes several steps that are not obvious at all. I understand not having it enabled by default since many won't need it, but there's no reason that can't just be a single CLI command that does it all.
What exactly did you need to do? All I've ever had to do to get QEMU working properly has been to make sure KVM is enabled in the BIOS (which you have to do on all OSs).
Just running a KVM-based Windows VM (via GNOME Boxes, virt-manager, etc.). On my Fedora install I had to install the @virtualization meta-package and enable daemons, among other things, and the only reason I knew to do that is because I looked it up. Without that, Boxes etc. just throw an unhelpful error that doesn't suggest that more packages or config changes are needed.
I had to enable virtualization features in BIOS too, but that’s entirely separate and not the fault of any Linux distro.
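For the record, the steps boiled down to roughly this (a sketch from memory; group and service names can shift between Fedora releases):

    # pull in QEMU/KVM, libvirt and friends
    sudo dnf install @virtualization

    # start the libvirt daemon now and on every boot
    sudo systemctl enable --now libvirtd

    # let your user talk to the system libvirt socket without extra prompts
    sudo usermod -aG libvirt $USER

None of it is hard, but Boxes could easily have offered to do it for me.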
Things like that can be unbelievably annoying and confusing on Windows or Macs, too. Even worse, they can just turn out to be impossible: the company can actively be preventing you from doing the thing that you want to do, refuses to give you enough access to your own system to do the thing you want to do, and/or sells permission to do what you want to do as an upgrade that you have to renew yearly.
These are things that don't happen in Linux. Doing what you want to do might be difficult (depending on how unusual it is), but there's no one actively trying to stop you from doing it for their own purposes (except systemd.)
Also, as an aside, a reason that Windows and Macs might have easy virtualization (I have no idea if they do) is because of how often they're running Linux VMs.
One needs to go a fair ways off the beaten path before they'll start running into trouble like that under macOS and Windows.
For macOS in particular, most trouble that more tinker-y users might encounter disappears if guardrails (immutable system image, etc.) are disabled. Virtualization generally "just works" by way of the stock Virtualization.framework and Hypervisor.framework, which virtualization apps like QEMU can then use, but bespoke virtualization like what QEMU also ships with, or what's built into VirtualBox and VMware, works fine too. No toggles or terminal commands necessary. Linux does get virtualized a lot, but people frequently virtualize Windows and macOS as well.
> It’s really easy to accidentally screw things up when e.g. trying to polish some of the rough edges or otherwise make the system function as desired.
Similar to Windows' System Restore and macOS's Time Machine, the Linux tool 'Timeshift' can be used to make periodic snapshots of your OS files & settings. (They can be saved elsewhere.) Restoration is a cinch.
Mint program 'Backup Tool' allows users to save and restore files within their home directory (incl. config folder and separately installed apps).
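Timeshift also has a CLI, which is handy for snapshotting right before you try anything adventurous; a rough sketch:

    # take a snapshot before tinkering
    sudo timeshift --create --comments "before tweaking"

    # see what you have, and roll back if needed
    sudo timeshift --list
    sudo timeshift --restore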
There's several distros that are fully usable without ever touching a terminal. The control is a gradient, some distros give you all the control and others (eg. SteamOS) lock down your root filesystem and sandbox everything from the internet.
> I've been running Ubuntu Linux for a long time now...Linux still has it's fair share of bugs...
> I don't see regular users adopting Linux anytime soon...
I can see why you think the second statement is true based on the first statement. When Ubuntu switched their desktop to Gnome, they gave up on being the best Linux desktop distro. I'd recommend you try Linux Mint.
> switched their desktop to Gnome, they gave up on being the best Linux desktop distro.
Mint is also based on GNOME.
I'm curious, can you elaborate on why you believe that changing to Gnome meant they were giving up on being the best desktop distro?
Well, to start, they tried putting Amazon results and ads in Unity's Dash search (which also involved data collection), but removed them after the backlash.
Then they switched to Gnome, meaning they gave up on their own desktop, Unity. They were no longer dictating what their desktop was like, so how much did they care?
Since then they have replaced a number of apps with SNAPs, which are only distributed through Canonical's store, so many people see it as an attempt to corner the Linux market. Many see AppImages and Flatpaks as better than SNAPs.
They are a company. They exist to make money. Of course they are going to decide to do things that make more money and annoy their users.
Let me recommend Fedora to you Timbit.
The Debian family is outdated and ships with bugs baked in upon release.
I too was corrupted by Ubuntu's marketing strategy of being popular and using the misleading word 'Stable'.
I tried Fedora once. On a fresh install, all it did was clog up all the hard drive space with error logs within 3 days.
I'm not interested in any distro that is controlled by a corporation. IBM is a corporation; they already screwed up CentOS and will eventually screw up Fedora someday, because that's what corporations do, and I'm not interested in going through that.
You have your fun running Fedora for now but know you're going to get burned someday.
What exactly is "outdated" about Debian?
Meh, I don't care much about control, I care more about getting my work done with the least amount of friction. Macs do that for me. Linux and Windows have too many barriers to make them a daily GUI driver.
>Linux still has it's fair share of bugs
>Linux, for all it's issues
You are confusing debian-family with Linux. Debian family is designed to be outdated upon release. When they say "Stable" it doesn't mean 'Stable like a table'. It means version fixed. You get outdated software that has bugs baked into it.
Fedora is modern and those bugs are fixed already.
Reminder Fedora is not Arch. Don't confuse the two.
I'm still surviving on Windows but only because over the last four years, as each new annoyance and regression arose, I made the mistake of very gradually, in tiny increments, sinking the time into invoking the arcane incantations necessary to tame each one.
15 minutes to deactivate an entire branch of notification pathways, 20 minutes to (mostly) restore the Right-Ctrl key they hijacked into a CoPilot key. 10 minutes to restore Win10 functionality to the Win11 taskbar with the wonderful ExplorerPatcher. $5 spent on Start11 to sidestep the whole start menu train wreck. And little 3 to 5 minute fixes with WindHawk (an amazing store-like platform to discover, install and manage open source Windows GUI patches).
I'm the stupid frog who didn't leap from the gradually heating pot. I acclimated to the boiling. And it's... okay. At least for now. But I know someday soon, the thousand faceless product managers at MSFT will break something unfixable. Somehow exceed the considerable abilities of the large community finding clever hacks and patches to keep the harsh Win wasteland livable for hardy souls.
While I greatly appreciate Linux philosophically and deeply respect it architecturally, I still really liked what Windows got so close to being - right before MSFT shifted biz models, simultaneously de-investing and turning it into a promotional platform for their other business. When that day comes, it'll suck to leave behind the wonderful third-party tools like Everything search, Ditto clipboard and AHK automation that streamline my day.
The thing I don't understand is why MSFT refuses to just make a version of Windows that's a Product again. I'd gladly pay them $100/yr for an upgraded "Windows Ti Super+" that just wants to be a good operating system for advanced users, instead of a strategic moat or monetization flywheel.
I never realized Windows tuning was a thing (WindHawk, Start11). Thank you for mentioning those.
I recently had to install Windows 11 to play a video game that runs janky on Linux, and I am encountering all the annoying problems people are describing in this thread (forced updates, full-screen ads, etc). It does not bother me too much, since I effectively use Windows only to play that one game. But maybe I can tune Windows 11 into something less obtrusive with the custom hacks. Thanks.
You're welcome. One of the best things I did once I realized I was reluctantly on this journey, was start keeping notes in a cloud backed-up file of every registry setting I tweaked or little utility I installed. Just a quick cut and paste to remind me.
Of course some of the notes become stale or irrelevant as things evolve but it's still invaluable insurance if this Windows install eventually gets crufty and needs to be hosed out for a fresh start (which eventually happens even with diligent hygiene). Also, don't forget ExplorerPatcher which restores some essential Win10 taskbar and explorer functionality MSFT removed from Win11 (promising to eventually replace with new code but now, years later, obviously never will). It's clear at this point MSFT isn't devoting any serious effort toward re-implementing previous functionality or creating new features for power users.
AFAICT, the only things getting meaningful funding now are fixing critical 0days / bugs, reworking the interface to create "more exposure vectors to support corporate initiatives" and a couple under-resourced teams clearing only the most critical technical debt threatening the whole edifice. Meanwhile, every PM or UX designer who suggests "Hey, it'd be easier to just decide that new feature or unrestored functionality will just confuse 'our average user'" gets promoted. I just feel sorry for the engineers still there who joined the Windows team wanting to build toward a state-of-the-art operating system supporting a powerful, flexible, extensible, customizable user interface. Now it's "If it ain't broken, there's no funding to fix it" and "If it just broke, see if we can just remove it instead of fixing it", "if it's there but incomplete, remove what's there, patch over the hole and we'll pretend it was never there." And finally, "You seem ambitious and diligent, we just wish you'd align your interests with current strategic priorities (ie sticking AI where it doesn't belong or pushing Office, cloud, etc subs)."
Shhh! Don’t tell anyone.
Years ago MS depended on Windows. It was the profit center. Everything MS did was a moat to sell more seats. Even MS-Exchange was just a ploy to force enterprises to stop deploying any other operating system.
That all changed with Azure.
MS realized they could make billions in Windows or trillions with Azure.
They changed the org structure. Now Azure is at the top and everything else is a moat or a way to draw people to Azure. They changed the sales commission (your multiplier doesn’t kick in unless you’ve sold enough cloud services).
Windows is no longer a profit center. It’s a cost center.
Anything that scares people away from using Windows is a benefit.
Let those other suckers spend money developing operating systems. As long as it runs on a VM in Azure, Microsoft will profit.
Windows being worse and worse isn’t a bug. It’s a feature.
Apple forced me to switch to Linux!
Linux should consider paying Microsoft and Apple for new customers. Perhaps the customer acquisition funnel is quite long, at least it took 20 years of using Apple in my case before switching to Debian (Xfce), but it was worth it!
As a regular Linux user for the last 20 years, who has used Windows for games for about 25 of the last 30 years: when I got a MacBook Pro for work at a company that was all Apple, there were three things that stood out. The M processors are amazing, the Apple hardware is really good, and macOS is absolutely awful. I have no idea how people use Macs.
> mac os is absolutely awful. I have no idea how people use mac.
I hear this from a lot of people when they get their first Mac. When they get specific about what their issues are, it tends to be that macOS doesn't do a thing the way they are used to doing it, which is more of a learning-curve issue, or rigid thinking. Apple software can be quite opinionated; those who fight against those opinions tend to have a hard time. This is true of any opinionated software.
> Apple software can be quite opinionated, those who fight against those opinions tend to have a hard time. This is true of any opinionated software.
And this is why many like me prefer Linux. We have our own opinions, and Linux enables us to enforce our opinions.
I've been a Linux guy for 25 years, and used Windows at work for the last 15. I now have to use MacOS at work.
I miss Windows. It wasn't totally better, but I managed to overcome most Windows headaches with workarounds. I haven't yet found the equivalent workarounds for macOS.
From my perspective, both Windows and MacOS suck - but in different ways. I think the problem many Linux folks have with MacOS is that it is the "uncanny valley" of Linux. You get happy that you can use your usual UNIX flows, and then you find out that you can't.
I really want a good tiling window manager. I have yet to find one on macOS that has the features AwesomeWM has.
It really sucks not being able to rebind keys to use Ctrl instead of Cmd in many apps. For basic tasks (opening/closing browser tabs), I have to use one set of keys in the daytime (at work), and another at night (at home). Why won't MacOS let me change them?
I got used to the Mac keyboard layout and I think it makes more sense; I now remap all my Linux machines (using keyd) to actually use the Mac layout. The main thing I like is that it's more ergonomic for me to press Command+something with my thumb than to press Control+something with my little finger. So Command+C, Command+V, Command+Tab, Command+` ... are all easily reachable while my fingers are still in the letter-typing position, just slightly moved to the left.
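In case it helps anyone, the keyd config for this is tiny. A minimal sketch, assuming you just want the thumb key (Alt) and Ctrl to swap places; my actual config is a bit more involved:

    # /etc/keyd/default.conf
    [ids]
    *

    [main]
    # thumb key sends Ctrl, the old Ctrl position becomes Alt
    leftalt = leftcontrol
    leftcontrol = leftalt

Then restart keyd (sudo systemctl restart keyd) to pick it up.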
MacOS lets you rebind Caps Lock, Ctrl, Option, Command, and the Globe/fn key in Settings > Keyboard > Keyboard shortcuts... > Modifier Keys. Does that not work for you?
I don't recall what problem I had with it, but it was a case of solving one problem and introducing another.
I don't want to globally swap Ctrl and Cmd. For some apps, the keys are identical to that on Windows. For others, it isn't. I need to be able to do it on a per app basis.
I assume the problem would be in the Terminal (assuming you use it), where Control + C is an often used shortcut, and flipping the modifiers globally would make this Command + C.
macOS uses Command instead of Control for a lot of things, but they didn't change how the shell works.
Most of the stuff isn't really personal preference, more like being temporarily used to a different way.
Btw search "modifier keys" in Mac sysprefs if you want to rebind command to control. I'm also sick of using separate shortcuts at work, but the other way around, gonna rebind Ubuntu.
I use Karabiner to remap keys. macOS makes you work hard to enable it the first time, though.
I can give you a few examples:
Packages are not done well compared to linux. Brew is a poor replacement. It feels like the terminal and everything involved is constantly out of date.
The OS just has a lot of weird things, like the ribbon at the bottom (the Dock) taking up so much space. When I made it smaller and hidden except on mouse-over, it was incredibly rough.
Window management is decades behind Windows or Linux. It doesn't like maximizing windows and doesn't make partitioning screen space easy. I had to download a third-party app to make it better, which was still worse than Windows even back in Windows 7, and miles worse than Linux with i3.
Mac has a lot of rough spots. I have two external monitors, and occasionally after updates one monitor would be fuzzy or at a different resolution, and it wouldn't go back until the next update.
I found myself really frustrated trying to use MacOS at work, because I'm a heavy user of virtual desktops. Turns out, I couldn't find a way to disable animations to switch between virtual desktops on MacOS. If there is a way, I'd be surprised.
Shortening the animation to minimum was not sufficient for my preference.
When did you use a Mac last, 2010?
You can run nix on macOS now. You can also drag windows into corners or edges to tile them, it is almost exactly like Windows 7 or 10. You can even have tiling window managers on macOS that emulate i3.
Your complaints with the dock seem like a personal choice... I like the dock behavior but if you don't, probably not a lot that will fix that, it will always suck.
Sad to report that external monitor support is still terrible.
I switched to Mac as my primary two years ago and I'm still finding myself frustrated at the software a lot.
It's not just that it's opinionated - that's fine. It's that those opinions are often just poor UX.
There's something to it.
On that note, is there any GUI tool that allows me to browse my zip archives without unpacking them, and is also free?
Home/End don't work correctly (external keyboard).
Cmd-Tab switching between applications instead of windows is utterly stupid. (Yes I know there is some magic keystroke that will do it, but who even wants the standard behavior? Like why even do that?)
If there is a window under another window, and you click on something in it, the OS will ignore the click, it will just activate the window.
So now you have to click twice, except what if it's actually active? So now you have to always check if a window is active - which is harder than necessary because of how Macs have the toolbar on top, not near the actual window. (This is especially bad when you have two monitors.)
The toolbar is far from the window, leading to extra mouse movements.
There is no maximize button, instead it's a full screen button.
If you manage to get a window off-screen, there's almost no way to get it back (you have to pick tile windows or something like that to make the mac move it). If you do show all windows, and click on it, nothing obvious happens.
I'm trying to add the screenshot app to the launch bar - I can't, I click on Launchpad and find it, but you can't right click on any of the icons in there to do anything with them.
The finder is an utter disaster - I can not for the life of me figure out how to go up one level in a directory. It's like finder is trying very hard to pretend there's no such thing as directories.
If you have two monitors you can't have an app halfway across both of them, it's always on one or the other.
If I move an app to the bottom right corner the OS will "helpfully" move it back up, even though I moved it down. (This is especially funny when you realize it frequently manages to place windows off screen - why can't it be helpful then?)
When you drag a window sometimes you get this white outline that will resize the window for your screen - I have yet to figure out when this activates and when it doesn't.
When you drag a window from a larger monitor to a smaller one, it will resize it - sometimes. But despite that it manages to place the window offset - so it's the right size, but like 40 pixels to the left.
Every single time I reboot, if I have to unplug my external monitor, and keyboard, login, then plug them back in. Otherwise it refuses to talk to them.
I hate mac os window management as much as the next guy, but I do find that it's much easier to tell which window is actually active than on newer versions of windows where all windows look the same. Hell, I've typed my password in the wrong window more times than I can count, because even though the window which just appeared was on top, had a blinking cursor and everything, it wasn't active. This even happened with the UAC prompt, but I think it's been fixed now.
I also like the first click in a window to not be passed through. I don't want to have to make sure I'm not clicking on some active part which will immediately have an unwanted effect. I've actually configured my Linux WM to behave that way. It still passes through the scroll wheel, though.
> The finder is an utter disaster - I can not for the life of me figure out how to go up one level in a directory. It's like finder is trying very hard to pretend there's no such thing as directories.
You can enable a clickable breadcrumb panel somewhere. Also, cmd+up goes up one level; cmd+down goes down one level, instead of enter. This was very frustrating to me at first.
> I'm trying to add the screenshot app to the launch bar - I can't, I click on Launchpad and find it, but you can't right click on any of the icons in there to do anything with them.
Never tried to do that, but I loved that there were system-wide shortcuts to access it, with an easy switch between modes (cmd+shift+3/4 for screen / area if memory serves).
> Every single time I reboot, if I have to unplug my external monitor, and keyboard, login, then plug them back in. Otherwise it refuses to talk to them.
On Windows, I have the opposite problem, kind of. It only detects my 5k external screen as such if it's plugged in when booting up. Unplug while it's running, or sleep/wake the laptop and it's gone. Linux, again, works fine.
> Every single time I reboot, if I have to unplug my external monitor, and keyboard, login, then plug them back in. Otherwise it refuses to talk to them.
HOLY SHIT, my work Mac does this all the time and my personal Mac does not, I cannot for the life of me figure out why, nobody I have talked to understands it, it drives me absolutely insane.
Everything else in your post is either a personal preference and/or not a problem for my workflow
> Home/End don't work correctly (external keyboard).
macOS tends to use the arrow keys for this, with various modifiers. Command + an arrow moves to the start or end of things (lines with left/right, the whole document with up/down); Option works at the word (left/right) or paragraph (up/down) level. Adding Shift to either of those will highlight those regions.
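If you want Home/End to jump to the start/end of the line the PC way in native text fields, the usual trick is a user key-bindings file at ~/Library/KeyBindings/DefaultKeyBinding.dict. A sketch; it applies to Cocoa text views, while Electron and other cross-platform apps ignore it:

    {
        "\UF729"  = moveToBeginningOfLine:;                    /* Home       */
        "\UF72B"  = moveToEndOfLine:;                          /* End        */
        "$\UF729" = moveToBeginningOfLineAndModifySelection:;  /* Shift+Home */
        "$\UF72B" = moveToEndOfLineAndModifySelection:;        /* Shift+End  */
    }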
> Cmd-Tab switching between applications instead of windows is utterly stupid.
I've never been a cmd-tab user, so I don't notice this. Once Exposé (now Mission Control) came out, I just stuck with that. I bind it to an extra button on my mouse.
> The toolbar is far from the window, leading to extra mouse movements.
The reason for this dates back to the original design of the Lisa. Bill Atkinson explains it in this video. It's a trade off between having issues with menus when windows are small, and having to move more. I believe this is why they added mouse acceleration, so no matter where you were, you could get up to the menus fairly quickly.
https://youtu.be/Qg0mHFcB510?si=yc0uCunQiMufGc75&t=416
> There is no maximize button, instead it's a full screen button.
They're starting to get better on this. The full screen button has a menu to do many things, and one of them is to maximize (they call it Fill). You can also just drag the window to the top edge to maximize it, like Aerosnap on Windows.
> If you manage to get a window off-screen, there's almost no way to get it back
Window > Center will bring the active window to the center of the screen.
> I click on Launchpad and find it, but you can't right click on any of the icons in there to do anything with them.
You can drag icons from LaunchPad to the Dock to add them. They'll still be where they were in LaunchPad, but now also in the Dock for quick access. LaunchPad is gone in macOS 26 though, so you can either right-click it in the Dock while it's there to tell it to keep in there (or just drag it over to the left and it will remember it)... or find it in Finder /Applications/Utilities/Screenshot
> I can not for the life of me figure out how to go up one level in a directory.
I usually show the Path Bar whenever I get a new Mac. In Finder, View > Show Path Bar. This shows your path at the bottom of the Finder window. You can click on any parent directory to go to it.
If you don't want to do that, or want another way, right-click the folder name at the top of the Finder window. This will show you a dropdown menu of all the parent directories, pick however far you want to go up the tree.
> If you have two monitors you can't have an app halfway across both of them, it's always on one of the order.
This one annoys me a bit too, and can lead to that window-off-screen thing you mentioned earlier. It's one of the reasons I went with a large primary monitor instead of having 4 external displays, like I had before.
> If I move an app to the bottom right corner the OS will "helpfully" move it back up
I think this has to do with the horizontal area the Dock is on being "protected" for lack of a better word, so nothing gets trapped behind the Dock. I agree, that having it do this for off-screen windows would be nice.
> When you drag a window sometimes you get this white outline that will resize the window for your screen
This is what I mentioned earlier to maximize. It works pretty much like on Windows. It activates not when the window hits an edge, but when your mouse cursor that is holding a window hits an edge.
Top edge: maximize. Side edge: half the screen. Corner: a quarter of the screen.
By default there will be gaps between these tiled windows, which some people don't like. You can remove the gaps in the Settings.
> When you drag a window from a larger monitor to a small one, it will resize it
I think this has to do with scaling of the monitors, or just that one monitor is dramatically smaller. My main setup is a laptop + a large monitor. The windows on my main display are bigger than the entire laptop screen, so it makes them smaller so they fit.
> Every single time I reboot, if I have to unplug my external monitor, and keyboard, login, then plug them back in. Otherwise it refuses to talk to them.
On my work setup I use a CalDigit dock and I occasionally have this happen after a big upgrade. I don't have to disconnect everything, I just have to log in using my laptop, then trust the dock.
On my home setup, I use my monitor as the dock for my mouse and keyboard. With this, every time I reboot I need to login with the laptop and then approve the monitor as the dock for the other things to work. I don't have to unplug/replug anything (thankfully).
I tried looking into this once or twice. People online talked about various trust settings, but nothing seemed to stick. I really only reboot when there is an update, so it's pretty infrequent. If I was rebooting daily I'm sure it would drive me insane, to the point where I'd stop using the monitor as a dock.
macOS is extraordinarily opinionated about how everything should work and frequently attempts to predict your workflow.
Linux/Windows (historically) were straightforward: each tool did exactly what it said it would do, and it was up to you to learn how to use the tools available.
On Linux/Windows, if a button was "capture image", it would just capture the image on the screen. On a Mac, a "capture image" button could do anything from displaying the image on the screen, to saving it in a Photos folder, to saving and syncing it to an iCloud account. Whatever the Apple PM decided the most common use case was, and god help you if you want to do something different.
If you've been in the mac ecosystem for a while, you've grown used to this and don't notice any longer. You may even occasionally express happiness when a function does something unexpected and helpful!
If you're coming from anywhere else, it's unbelievably painful.
I’d frame it slightly differently.
With Linux/Windows you’re supplied with a toolbox and from that toolbox you’re expected to cobble together a workflow that works for you and maintain it.
I spent a significant amount of time trying to learn Tasks inside of Outlook and come up with a system that would make it remotely useful. I failed repeatedly. They eventually bought Wunderlist and replaced it with that, which still has some rough edges (last I tried) due to the legacy Outlook Tasks integration.
Apple, more often than not, is looking to identify a problem and give an opinionated solution on how to handle it. If you’re ok with their solution, great, problem solved. If you’re not, you end up either fighting with the Apple tools or finding a 3rd party toolbox style app that lets you cobble together a workflow. I found just going with the opinionated solution removes a lot of needless stress from my life. There are some places I do go 3rd party, but I reevaluate often to ask if I really need these things and if they’re worth the trouble.
It ends up being a question of what my goals are with the computer. Am I looking to work on the operating system and apps to tune them to exactly what I want, or am I just looking for the system to fade into the background so I can do other things. When I was younger, I found tweaking and playing with everything to be a bit of a hobby. These days, I just want to do what I need to get done and move on with my life.
Not just this. I've been a Linux/macOS user since the early 2000s and I still sometimes hate macOS, because it has very annoying bugs that are never fixed, and annoying corpo decisions.
E.g. it keeps opening the Music app whenever I connect Bluetooth earbuds. I can't delete the Music app; it just keeps popping up with an imbecilic message about "user is not logged in" or something. I run a script that monitors whether Music.app is running and kill -9s it.
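(The script is nothing clever, roughly this kind of loop; a sketch, adjust to taste:)

    #!/bin/sh
    # kill Music whenever macOS decides to launch it on its own
    while true; do
        pgrep -x Music >/dev/null && pkill -9 -x Music
        sleep 5
    done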
Or the blinking desktop background issue, which has been there for years, has accumulated many support threads, and is still not fixed.
Random services like coreaudiod that suddenly start consuming 100% CPU for no apparent reason.
MacBook throttling (thank God, gone with the M CPUs).
I can keep going, but my point is that macOS has legit problems that can't simply be shrugged off with "they're just holding it the wrong way".
Like any other mass-market product, tbh, except rare ideal products like the game Factorio or SQLite.
I haven't had that Bluetooth issue (but I haven't tried connecting my non-airpods to my mac).
Have you tried this? I saw it as a fix over on Reddit.
Privacy & Security > Bluetooth > Click the + > Add Music from Applications > Toggle to disabled
(This is insane to have to do, but better than running a script to monitor for it and kill it)
Interesting. My experience over a decade was that (expensive) Apple hardware was unreliable or poorly designed... from the IIsi to the iMac. One exception: the murdered Power Computing clone was great. The iMac vertical screen-stripe fiasco (which affected hundreds of users within the warranty period, before they shut down the forum, and then took years to respond to) was capped with a hard-drive failure after a year. My 'never again' is still in effect 15 years later.
My home-made AMD tower is in its 6th year (running Linux) with no, zero, fails.
> and mac os is absolutely awful. I have no idea how people use Mac.
Not sure about other people, but in my case I spend 99% of the time using software made by 3rd parties so my exposure to the OS is very limited.
Latest OS is making life miserable though, compared to all the previous releases.
Anything in particular? I get that it takes some tweaking but so does Linux. The biggest thing that you'll probably never get the way you want is window tiling - it's my personal bugaboo with MacOS. Maybe there's a way to get what I want ...
For me, the biggest pain point is the way it decides which window to bring to the front. If I minimize a window and then click on the application in the Dock, it won't show the window I just minimized; instead it always seems to show an older window. Really annoying when using an app with many windows.
right click on the app and select the window you want...
right, but if you cmd-tab, it brings up ALL the windows: say you had multiple browser windows open, and only want to go back to the one you just used before (think reading some docs while coding).
There's a couple but nothing I've found at the level of i3 or whatever the hyprland equivalent is.
“I get that it takes some tweaking (MacOS)” How times have changed, it used to be as intuitive as drinking water.
Window tiling is a big one for me. I have tried the third-party options, and nothing compares to i3.
There are an absolute ton of very capable tiling window managers for macOS, posted here frequently, from yabai to AeroSpace to fully programmable ones like Hammerspoon. A quick search will turn up plenty more. I would be shocked if none of them meet your needs.
Shouldn't need to install third party stuff for such a basic feature. One more thing that will possibly break with updates or not play nice with something.
Fucking Finder. What a colossal dumpster fire. It drags that entire OS down.
Better than Windows Explorer
No it's not. It's one of very few things that Windows does better
Windows Explorer got significantly better in 11, except for the fucking context menus. Also it's incredibly slow and unstable now and frequently crashes, taking the taskbar down with it (???).
But, at least it has tabs. Jesus Christ, took long enough.
Man, we didn't have this all along.
Six years ago everything was stable and solid, but Apple's board of directors seems to have decided that new Mac users can't handle a computer interface anymore and started merging it with mobile OS interfaces. And the result is absolutely terrible.
They also decided that they have to capture React devs and everything should use a declarative UI, which has brought us the wonderful new Systems Settings.
Windows is such garbage, I can't understand how you think MacOS is worse lol. It's just Unix. Linux is definitely better than both though
They didn't say they think macOS is worse, though.
Windows is absolute garbage, I agree. But the application windows behave normally, maximize when I want them, will take half a screen, quarter screen, etc. with just a quick hotkey. Mac doesn't have that extremely basic functionality without a 3rd party extension, which is absurd. But I don't use windows other than if my work gives me one, I am purely linux otherwise.
Hover on the green button. Also, Mac has a really robust keyboard shortcut system. Including shortcuts for tiling.
https://support.apple.com/guide/mac-help/tile-app-windows-mc...
https://support.apple.com/guide/mac-help/create-keyboard-sho...
> I have no idea how people use mac.
Meh, it has a terminal. Good enough for me. It's worth putting up with MacOS for the hardware.
It's only fair that Linux should pay 10% of the license fee for their software to Microsoft in exchange
For a long time, I had a MBP (this is in Intel days), with a Linux VM. It was like a reverse mullet, party in front (multimedia), work in back (dev).
And then:
- Butterfly keyboard
- Touchbar
- No Esc key
- M-series CPUs, which, while technically awesome, did not allow for Linux VMs.

So I switched to System76/Linux (Pop OS) and that has been wonderful, not to mention much cheaper.
See, I'm an ends-justify-the-means guy:
The more people forced into the beautiful world of Caps Lock as Escape, the better!
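One low-effort way into that world, for anyone curious (X11; Wayland compositors and tools like keyd have their own equivalents):

    # remap Caps Lock to Escape for the current X session
    setxkbmap -option caps:escape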
Your website has stained my screen. lol
background-image: radial-gradient(circle at 12% 24%, var(--text-primary) 1.5px, transparent 1.5px),
radial-gradient(circle at 73% 67%, var(--text-primary) 1px, transparent 1px),
radial-gradient(circle at 41% 92%, var(--text-primary) 1.2px, transparent 1.2px),
radial-gradient(circle at 89% 15%, var(--text-primary) 1px, transparent 1px);
> Linux should consider paying Microsoft and Apple
Who or what is the "Linux" entity in this context?
Joking aside, I often hear people say "they should" when talking about GNU/Linux (for example: "they should just standardize on one audio stack"), as if there were a central authority making those decisions. What many don't realize is that with FOSS comes freedom of choice... and inevitably, an abundance of choice. That diversity isn't a flaw, it's a consequence of how the ecosystem works.
There's free choice for those OSes to use different kernels, but they don't, they all use the same Linux (rather than say BSD). There's a lot of advantage in getting aligned on things, even though anyone can choose not to.
It is true that Linux-based distributions have this thing in common: the Linux kernel. There have been some GNU/Hurd variants though...
I guess Linus Torvalds and co? First they'd need to standardize a Linux desktop OS.
Also who is paying "Linux" and for what?
Maybe the answer ends up being Valve.
Well at least Microsoft is a platinum member of the Linux Foundation for many years...
As much as I love the idea of moving to Linux, Mac hardware is like two years ahead of PC currently in pretty much every regard aside from gaming. I keep looking for an iteration where it makes sense to switch, but currently the Intel Core 3 stuff is at best comparable to the base M5. Strix Halo is much more power-hungry and also not that impressive other than having a bunch of cores. Nothing comes close to the Pro/Max chips in the M4 series. And with RAM/storage pricing, Apple upgrades are looking reasonably priced (TBD when M5 Pro devices launch).
So I can either get a top tier tool when I upgrade this year or I can buy a subpar device, and the power management is going to likely be even worse on Linux.
> Mac hardware is like two years ahead of PC currently in pretty much any regard aside from gaming
...and any contemporary ergonomics. Seriously, MacBooks are an environmental hazard at this point: ultra-glossy screen, hand-twisting keyboard, wrist-cutting sharp edges, lack of modern surge protection, etc. etc. I genuinely don't understand the sentiment that MacBook hardware is good.
I think this mostly only holds if you use local compute in a portable form factor.
Most of my personal development these days is done on my home server - 9995wx, 768GB, rtx 6000 pro blackwell GPU in headless mode. My work development happens in a cloud workstation with 64 cores and 128GB of ram but builds are distributed and I can dial up the box size on demand for heavier development.
I use laptops practically entirely as network client devices. Browser, terminal window, perhaps a VS Code based IDE with a remote connection to my code. Tailscale on my personal laptop to work anywhere.
I'm not limited by local compute, my devices are lightweight, cheap(ish) and replaceable, not an investment.
I'd like to use this kind of setup, but unfortunately every time I try there are just so many annoying edge cases wasting my time. Especially when I need to do FE/mobile work, but even BE has gotchas. I guess it depends on your environment; I'll try making this setup work again sometime in the next few months.
There is the Asahi Linux project for Apple Silicon Macs.
It's not really viable
So whatever resources you have, Apple will use them mostly to render 3D glass effects. With Debian (Xfce), and I can't speak for other desktop environments, you need roughly a third of the resources to run the OS itself.
Or you just don't run Tahoe?
Actually, you don't have this choice anymore.
Apple is disabling downgrading across all of iOS, and starting to do the same with MacOS. So you need to keep old hardware to run older MacOS versions, and it's only a matter of a few years before Tahoe is the latest OS you can run on your Mac.
> Actually, you don't have this choice anymore.
I must have taken some shrooms before I downgraded from Tahoe to Sequoia a few hours ago then.
Oh, I must be clear here: I'm not considering M1 Macs or later, since Apple closed the ecosystem with Apple Silicon.
What you did is a downgrade in what's called the supported OS.
However, if you decide to downgrade to Catalina on an M1 Mac, it's not possible — Big Sur is the earliest version that runs on Apple Silicon.
Anyway, you cannot downgrade to a macOS version older than what your Mac originally came with. So if you buy a Mac now, Tahoe will be the minimum option.
Old Macs can certainly be downgraded. iOS doesn’t allow it though and they pulled the latest security update which fucking sucks. And if you buy a M5, Tahoe is the only OS that’s available.
I have nothing against old Macs and MacOS, but I certainly won't be buying anything since the Apple Silicon switch, because now only Apple controls which OS you can run.
>If you buy a machine that isn't even released yet
Uhh, I guess.
AFAIK iOS has been very locked down wrt rolling back upgrades since forever and isn't super relevant to this thread. Happy to be corrected.
M5 MacBook Pros have been shipping for over three months now.
The M5 Pro/Max variants aren't; but an M5 Mac is a thing you could have bought for a good while now.
That's a very temporary solution to be fair. KDE and even, shudder, Gnome put mac os and windows to shame when it comes to responsiveness, performance, and resource usage.
I mean, KDE does 3x the stuff for 1/3 the cost. That's more memory and CPU for your IDE or, more likely, chrome tabs.
If Linux had a revenue stream and model, this would make sense. But the style of open-source is to make good software, and let others gravitate to you as a result.
If there was an easy and supported way to put linux on a macbook I'd be back on linux but I can't give up the hardware.
I am good laptop hardware away from making the move.
I can also recommend the HP ZBook Ultra G1a. It's probably the closest thing to a MacBook at the moment. It has lower battery life and the latest M chips are still faster, but it's fast enough for me. The hardware is solid and software support is great.
HP ZBook Ultra G1a, 128GB RAM. Add SSD to taste. HP-supported (Canonical OEM) Ubuntu with KDE. Works great as a daily driver with a UGreen GaN charger.
Interesting, I had never even heard of this laptop! Thanks for the tip!
I'm on a frame.work with AMD, 96GB RAM, using it with Fedora+KDE. Absolutely love it.
Do they still use a paddle trackpad? Framework seems like its nearly perfect for me, even if I would miss Apple's displays on the MBP.
Frameworks are built like crap. Sorry for the language. Watch the laptop olympics: https://www.youtube.com/watch?v=M96O0Sn1LXA
thinkpad
The whole upgrade to Windows 11 was forced on me by Microsoft, basically using trickery. I opened windows update and it said there were some updates… Did not make it clear at all that it would be taking my system from Windows 10 to 11. I thought it was just installing security updates. I would’ve had to change it anyway when 10 support ended, but I still had about four months and wanted to wait.
Still reading the article, but early on it says:
"Also, is it weird that I still remember the specs of my first computer, 22 years later?"
My first computer was a TRS-80 Model 1, a 1.78 MHz Z80 with 16 KB RAM.
That was 48 years ago. Is it weird that I remember that?
They stick with you. I remember our first family computer well (an Acer 486 with 40MB drive and 32MB RAM.)
Same for my first computer I built myself out of a TigerDirect order. Made a few mistakes there (K6 generation.)
Having these computers was such a change in our lives that they should be somewhat indelible memories.
>an Acer 486 with 40MB drive and 32MB RAM.
32MB RAM <-- no way. 4 and 8MB were the standard (8MB being grand); you could find 16MB on some Pentiums. So a 40MB drive and 32MB RAM is an exceptionally unlikely combo.
32MB became the norm around the Pentium MMX and K6(-2).