

Tiny Core Linux? I’ve never used it but I’ve seen videos of it running really well on super old machines. Like this one.
Might be to prevent the SD card from running out of write cycles?
Maybe I’m just a young whippersnapper, but I don’t get why people would want cartridges when freely copying the files to the main drive is an option, since this only works with DRM-free games anyway. Cartridges were historically used instead of floppy disks or optical discs for DRM, as you can make them basically impossible to duplicate. Even now, the only reason Nintendo still sells cartridges is to let the same game be played on different devices with different logged-in accounts while ensuring there is only one copy between them.
And so with that in mind the basic idea is that you grab DRM free games (from stores like GOG), and pop them onto SD Cards to turn them into cartridges that Kazeta will detect.
So now instead of storing games on the computer itself, you have to go out of your way to put them on individual SD cards?
Also, is it strictly one game per SD card? That would be pretty wasteful of the available space for smaller games.
I noticed a lot of non-technical people using ChimeraOS/SteamOS getting lost in Steam’s complex menu structure and struggling with basic things like launching and closing games
I feel like someone so nontechnical that they can’t even figure out Steam’s UI, which is developed by a massive company with dedicated UX engineers and comprehensive QA for all their software, probably also isn’t going to figure out installing a Linux OS, especially one that doesn’t boot into a normal GUI by default. It also assumes they’ll have a dedicated computer just for console-style gaming, which nontechnical users probably wouldn’t bother with. Unless they plan on selling devices with their OS preinstalled as dedicated game consoles?
Also, you still have to interact with GOG to get the games, and then be able to find the app data directory GOG downloads them to in order to copy them onto an SD card.
This also directly contradicts a quote later in the article: “Kazeta is definitely not for everyone. It requires a bit of work to get started”
I became disenchanted with digital storefronts and have come back around to appreciating physical media: game cartridges, CDs, DVDs
I have gotten more and more into collecting old physical games and systems and found them to be a much more pleasant experience than what modern gaming offers
Fair enough if you just want physical media in general, but I feel like people collecting physical media would specifically want ones branded by the company and not generic SD cards.
I have become more and more concerned with preserving my digital game collection for play in the future.
But there are things in between digital storefronts and physical read-only media. Why not just have a special directory on the desktop that autodetects games copied into it? I assume that’s basically what happens when you insert an SD card with a game on it.
If you want to keep games atomic and prevent corruption of the directory structure, why not just support game directories in the form of tar or zip files and automatically mount them as a virtual filesystem?
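Something along those lines is doable with just the standard library. Here’s a minimal sketch of the idea, assuming games ship as plain zip archives in a hypothetical Games drop folder (none of this is what Kazeta actually does, and real mounting could go through something like fuse-zip or squashfuse instead):

```python
# Hypothetical sketch: treat every zip archive dropped into a games folder
# as a read-only virtual directory tree, without ever extracting it.
import zipfile
from pathlib import Path

GAMES_DIR = Path.home() / "Games"  # assumption: an autodetected drop folder

def detect_games(games_dir: Path):
    """Yield (game name, virtual root) for each archive in the folder."""
    for archive in games_dir.glob("*.zip"):
        yield archive.stem, zipfile.Path(archive)  # read-only view

for name, root in detect_games(GAMES_DIR):
    # Browse the game's files straight out of the archive; the file on
    # disk stays untouched, so its directory structure can't be corrupted.
    print(name, [entry.name for entry in root.iterdir()])
```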
keeping your games untouched and preserved forever
Doesn’t flash-based storage put your data at risk of corruption if you leave it unpowered for too long? Having the games on the SSD you power on every day sounds like it would be safer.
Though at least the flash’s write-cycle limit wouldn’t matter with read-only cards.
Why? I don’t drive and don’t have a car but I can’t imagine the car itself not already having the exact same features since modern cars already have what is essentially a tablet built in.
Also, why not just have one of those phone holders on your dashboard like people have been doing before car integration was a thing?
Are those actually the only things you find lacking? If so that’s really good, practically the same as using LineageOS without any Google services.
I don’t use any of the stuff you mentioned and might have to consider Linux mobile as a daily driver if it’s that good. Especially if Google kills custom ROMs, it sounds like the people already running them would feel right at home switching to Linux mobile.
More importantly, how’s the app situation? Can people generally expect most of the desktop GTK or Qt apps they’re familiar with to be usable on a phone form factor? Is there a reliable way to run Android APKs on regular Linux now? At the very least F-Droid apps?
More options is more good. The beauty of the open source community is that different offerings in the same product category directly benefit each other instead of competing. Looking forward to running Cosmic system apps on KDE.
Containerized Cat for the Docker version?
Dock Dog would work too.
All of the long-time Linux users have what you perceive as flawless experiences because they already did all the stumbling you did and more. Every operating system has a steep learning curve, and you will struggle with how it does things when first starting out. I recently had to start using Windows again after exclusively using Linux for years (and Windows 11 no less, which I had never used before), and there are plenty of times I’ve failed to do simple things I could do on Linux without even thinking.
What magical company do you work for that gives you UEFI access on your work computer? Mine’s so locked down.
Seconding Fedora.
I wish there were more M.2 cards beyond just SSDs and wireless NICs. The idea of a small-form-factor PCIe interface is underutilized, and things like hardware codec accelerators could keep laptops with older processors usable with new standards for longer. It’s sad that PCMCIA had an entire ecosystem of expansion cards, yet we somehow decided the much higher-bandwidth M.2 is only for storage and networking. Hell, do what sound cards did in the 90s/00s and sell M.2 SSDs specifically designed for upgrading older laptops, with built-in accelerators for the latest media standards. Hardware acceleration is energy efficient and could probably just be bundled into the flash controller the way it’s bundled into processors, and unless you have a top-of-the-line SSD you’re probably not saturating the M.2 interface anyway.
Pre-installed distro needs to support one-click installation (like .app or .exe).
This defeats a lot of what makes Linux secure. The main reason you don’t get malware is that you never run untrusted binaries from the internet and instead install everything from trusted sources like your package manager. A non-tech-savvy person doing this will inevitably hit one of the rare pieces of Linux malware in the wild; a clueless person downloading the wrong installer is the textbook malware entry case. I also don’t see the benefit over just having an app store, which could even show proprietary software by default as long as it can be turned off (I suspect the main reason for wanting one-click installers is downloading proprietary software).
AFAIK it’s a bad idea to use dd or any other wiping tool that just overwrites the logical blocks on flash-based media, and it’s not that effective for security either. SSDs have wear leveling, so what the computer sees does not map 1:1 to what’s actually on the flash chips. They also have extra overprovisioned space, inaccessible to your computer, specifically for shuffling data around during wear leveling. So not only are you wasting write cycles, it’s not guaranteed to actually overwrite all your data on the flash chips themselves.
If you want to wipe an SSD, use secure erase from a tool like nvme-cli, which will tell the controller directly to erase all the data. How well the controller implements that is anyone’s guess, though.
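For reference, a minimal sketch of what that looks like, assuming an NVMe drive at /dev/nvme0n1 (check yours with `nvme list` first; this is destructive and needs root):

```python
# Hypothetical sketch: ask the NVMe controller itself to erase everything,
# equivalent to running `nvme format /dev/nvme0n1 --ses=1` as root.
# --ses=1 requests a user-data erase; --ses=2 requests a crypto erase on
# drives that support it. Destructive: double-check the device path!
import subprocess

DEVICE = "/dev/nvme0n1"  # assumption: replace with your actual drive

subprocess.run(["nvme", "format", DEVICE, "--ses=1"], check=True)
```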
I’d say if you’re going to the effort of fully encrypting your new install, doing a secure erase is in that spirit and won’t hurt. There won’t be any performance benefit, but it will (probably) ensure that none of your previous unencrypted data is still there. Even if you skip it, just writing to the drive in normal use will eventually fill up the free space and make it less and less likely that sensitive information is recoverable, though how long that takes depends on how you use the computer.
Did the Ventoy binary blob thing ever get resolved?
so the web browser will play nicer out-of-the-box with Wayland
Anything really polarizing can end up with a cult following. Just look at Rust.
People would be less mad if you straight up used a stock image with a watermark, so I don’t understand why people go out of their way to use AI when they know people will comment on it and it will detract from the point of the article.
Also, using AI in the thumbnail makes people automatically assume you’re using AI in the text as well. And if you’re not doing that, why would you lessen the perceived value of your writing by making it seem like you are?
It just seems pointless and actively harms your actual goals, because people will get hung up on the fact that you used AI and ignore your valid points. Especially when you’re writing about open source projects, where most people interested in open source are vehemently anti-AI, it really just shows you don’t know your target audience.
TLDR: While Linux is less susceptible to malware in some ways, it mostly boils down to Linux having a more technically minded userbase whereas Windows is a “mainstream” operating system.
Most Windows malware nowadays comes from social engineering scams (complete this “captcha” by pressing Windows+R and pasting in the PowerShell script we conveniently put in your clipboard) or untrusted third-party installers, because Windows doesn’t natively have a package manager. Like others have said, the old-school self-propagating worms and drive-by downloads that activate just by clicking on a link aren’t really possible anymore (outside of state actors with unlimited budgets to buy zero-days) unless your system or browser is horrifically outdated.
In terms of social engineering, Linux is not necessarily better at preventing it than Windows. In fact, sudo on Linux will unquestioningly let you delete the kernel and system software or make unlimited changes to them. Windows, for better or for worse (tbh more worse than better), uses TrustedInstaller to limit access to system files; Windows 11 won’t easily let you delete or modify System32, for example, even if you’re an admin. So in theory it’s easier to do more damage to your system on Linux if you don’t know what you’re doing. But if someone is using Linux full time, they’re most likely technical enough not to be fooled into running random untrusted bash commands.
The biggest thing is to be careful with those Linux terminal tutorial sites that have an “add to clipboard” button: they can put literally anything into your clipboard, including a trailing newline that runs the script as soon as you paste it into your terminal (though whether that works depends on your terminal app). Actually, they don’t even need you to use their copy button; they can just set an event listener for Ctrl+C anywhere on their site and automatically replace the clipboard contents. Double-check everything you copy before running it, especially since Linux users often have to rely on obscure tutorials hosted on untrusted websites.
You also don’t really need to run untrusted installers on Linux, because almost everything you need is in a properly moderated software repository, be it your native package manager, Flatpak, or Snap. Everything is signed by the authors and has a ton of eyes from the open source community on it. The only things to look out for are compiling something from GitHub, random AppImages, ELF binaries, scripts, and, last but not least, third-party repositories that can be added as an installation source to your package manager/Flatpak/Snap. Basically, Linux gets most of its “doesn’t get malware” reputation from the same place Mac does: you rarely have to manually download and run an executable from a random website, which is the norm on Windows. Add to that the fact that even when it is needed, the Linux userbase is more technical and better able to discern which sources are reputable and which are suspicious.
Another major source of malware is pirated versions of Windows or untrusted “license activators” from the internet. This just isn’t a problem on Linux, because there’s no license to activate, and the OS is free to begin with so there’s nothing to pirate. And again, if someone is running Linux, they’re probably technical enough to know not to run random pirated versions of paid software in the first place, helped by the fact that the vast majority of paid software is Windows-only.
Worse, it preserves “special” files like the ones in /dev or /var, which aren’t removable by anyone other than root. Love extracting a system file backup on my file server as a regular user in order to get just a few files out of it, and promptly not being able to fully delete it afterward without SSHing into the server and using sudo.
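For anyone hitting the same thing with Python handy, here’s a minimal sketch of a safer way to grab just a few files (the archive name and paths are hypothetical, and the filter argument needs Python 3.12+ or an older release with the tarfile security backport):

```python
# Hypothetical sketch: extract only the members you actually want, with the
# built-in "data" filter refusing device nodes and absolute paths and
# stripping setuid bits, so nothing lands on disk that a regular user
# can't delete later.
import tarfile

with tarfile.open("backup.tar.gz") as tar:  # hypothetical backup archive
    wanted = [m for m in tar.getmembers() if m.name.startswith("etc/nginx/")]
    tar.extractall(path="restored", members=wanted, filter="data")
```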
I don’t get how a regular user can even create files like that. Sounds like a security vulnerability.