Oops, thanks.
Apparently, this is hardly hyperbole. For example: https://bugs.kde.org/show_bug.cgi?id=377162
Talk about arrogance. In the window paradigm, only a few desktops ever REQUIRED a similar look and feel for all windows. Apple was the worst offender for that. I suggest that if Edmundson wants a similar look and feel, he should go get himself a Mac and stop mucking up KDE.
From a quick look at the proposed patch - and obviously without having the full picture - it’s true that it would add some complexity. But it’s code for the sake of people’s convenience, not the other way around, right? IMHO, as long as:
- shading is off by default,
- users get a clear message about limitations and SSD/CSD complications before enabling it,
- the implementation doesn’t introduce impossible-to-maintain logic and limits some weird edge cases like resizing a shaded window, then it’s worth doing.
TBH this is one reason I got off Ubuntu/KDE Neon.
It kept needing Nvidia+KDE fixes rolled forward (including one I dealt with in their bug tracker), which I had to figure out and maintain manually, and which I kept breaking, until I finally decided: “why don’t I just use a distro where everything Nvidia/KDE is up to date?”
It’s a very Linux thing.
People get very particular about their setups.
The KDE bug tracker now:

I don’t even know what window shading is… What is it?
It looks like it’s still being discussed:
Yeah. I would massively emphasize this too.
Don’t mess around.
Especially don’t mess around with AUR. Discrete packages are fine, but AUR tweaks that mess with the system are asking for trouble, as they have no guarantee of staying in sync with base Arch packages.
Ease of use.
I’ve run the same CachyOS partition for 2 (3?) years, and I don’t do a freaking thing to it anymore. No fixes, no tweaking. It just works.
…Because the tweaks and rapid updates are constantly coming down the pipe for me. I pay attention to them and any errors, but it’s all just done for me! Whenever I run into an issue, a system update fixes it 90% of the time; if it doesn’t, the fix is either coming or it was my own stupid mistake.
On Ubuntu and some other “slow” distros I was constantly:
- Fighting bugs in old packages
- Fighting and maintaining all the manual fixes for them
- Fighting the system, which does not like me rolling packages forward
- And breaking all of that on every major system upgrade, instead of incremental ones where breakage is (as it turns out) more manageable.
I’d often be consulting the Arch wiki, but it wasn’t really applicable to my system.
I could go on and on, but it was miserable and high maintenance.
I avoided Fedora because its Nvidia support is 3rd party, given how much trouble I’d already had with Nvidia.
…It seems like a misconception that it’s always “a la carte,” too. The big distros like EndeavourOS and CachyOS pick the subsystems for you. And there are big application groups, like KDE, that install a bunch of stuff at once.


Hence, Zuckerberg recently fired most of the Llama staff, the lab’s leader is rumored to be leaving for their own startup, and the new lab where all the funding’s going is a bunch of tech-bro egos who are pro-closed models.
…And I suspect PyTorch is too “utilitarian” for Facebook’s leadership to draw enshittification attention.
Llama was an anomaly, and it seems they’re done with that. Which is quite sad. But on the plus side, it could be a death knell for Meta (as all that ego in the new lab will be a catastrophe).


Baloo can be disabled; it’s just the background file-search indexer. So can kaccess (the accessibility service) and KWallet (the secrets manager).
A lot of the widgets, effects and stuff take up RAM too.
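For Baloo, `balooctl disable` (or `balooctl6` on Plasma 6) should do it. For KWallet, my understanding is that Plasma reads a flag from `~/.config/kwalletrc`; a minimal sketch of that fragment, from memory of the config format:

```ini
[Wallet]
Enabled=false
```

Log out and back in (or kill kwalletd) for it to take effect.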


Look up RAM usage in btop and sort processes by memory usage. A lot of it is random services you can disable in System Settings or uninstall with a package manager.
And yeah… it even matters on a higher RAM setup. Sometimes I have most of mine filled with a background thing, and 1GB vs 2 or 3 can make a big difference.
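If btop isn’t handy, a plain `ps` one-liner shows the same thing (standard procps options, nothing distro-specific):

```shell
# Top 10 processes by resident memory (RSS, in KiB), plus the header line
ps -eo pid,rss,comm --sort=-rss | head -n 11
```

Anything near the top that you don’t recognize is worth looking up before disabling.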


And KDE’s RAM usage is very reasonable these days, especially if you opt out of some of the bells and whistles.


I use KDE with Krohnkite.
I.e. I have my cake and eat it too: windows can be dragged around if I want, and anything weird just stays windowed like normal KDE.
Works with mice, and works well OOTB!


Hard disagree.
Installing Debian on Nvidia means you are maintaining Nvidia yourself: you’re just crossing your fingers hoping the 3rd-party repos don’t fall out of sync and that you don’t have to troubleshoot some Nvidia conflict on your own. This is the whole reason I left that ecosystem behind; it was a huge waste of my time…
…Maybe you got lucky and just didn’t run into any Nvidia bugs? But that was not my experience.
(And to be clear this is different if you’re using it headless or something).
Well Mint is technically fine, right? Their Nvidia support is 1st party, so it should work out of the box.
Pretty sure Ubuntu does too.
Debian, specifically, does not though. And I’m not sure how ‘behind’ Mint and Ubuntu are on their DE and Nvidia driver packages these days, which could be an issue sometimes. But I think many remember Ubuntu/Mint from older days when they were worse in this regard.
Nobara, Bazzite, or CachyOS.
I’d say Nobara or Bazzite are better for ‘I install it and it just works.’
Cachy is better for the learning aspect. It’s not hard, but there are more choices to make, and you’re closer to the Arch wiki and all its excellent resources/tutorials.
I am biased, as I run CachyOS and I love it. I also love how much stuff is in its repos, including everything you need to game optimally, and how easy CUDA is (which is part of what you need for CAD).
Whatever you choose, do not, I repeat, DO NOT install Fedora, Debian, or anything that doesn’t explicitly support Nvidia laptops by default, out of the box, or you are in for a world of pain. If any guide starts with ‘install these 3rd-party repos’ or the like, you have entered a danger zone, and you will hate Linux.
VSCodium, or some similar VSCode build/derivative.
I know, I know, but the critical mass is just so useful. As a random example, there are specific extensions to support game modding in Paradox scripting language, or Rimworld XML. Nothing else has so many random niches filled.
It’s fast with big files (faster than anything I’ve tried other than ‘specialized’ log readers and such), its search is fast, it’s got good git support, it’s got support for sudo file editing…


This is interesting because there’s not a ton of direct Windows vs. Linux game benchmarking, and now there’s about to be. GN churns through a lot of hardware and testing.
And that’s excellent, because this being Linux, drawing attention to issues increases the chances of them getting fixed, which is hardly the case for Windows.
Arch (with KDE, I presume?) + Bazzite is not bad either. There’s a lot of handwaving over whether they should have chosen this or that distro, but both are very popular in the gaming space, so I feel that’s fairly representative of many distros.


To be clear, VMs absolutely have overhead, but Docker/Podman overhead is the open question. It might be negligible.
And this is a particularly weird scenario (since prompt processing literally has to shuffle ~112GB over the PCIe bus for each batch). Most GPGPU apps aren’t so sensitive to transfer speed/latency.
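For a rough sense of scale (my bandwidth figure is an assumption, not a measurement: ~32 GB/s effective for a PCIe 4.0 x16 link), the back-of-the-envelope is:

```shell
# Back-of-the-envelope: seconds per batch spent just moving data over PCIe.
# Assumptions (not measured): ~112 GB per batch, ~32 GB/s effective PCIe 4.0 x16.
awk 'BEGIN { printf "%.1f s per batch just on the bus\n", 112 / 32 }'
```

So several seconds per batch go to pure data movement before any compute happens, which is why this workload is unusually sensitive to transfer speed.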
I suppose not. Not yet.
I know people are particular about WMs, but having to minimize a window vs keeping the window decoration in place seems like a… very minor distinction.
Is the use case rearranging a ton of windows? Something like that?