Yep. The reason Windows and macOS are way more accepted than Linux is that they’re essentially idiot-proof. Linux is not, and that’s not necessarily a good thing if you want the year of the Linux desktop to actually happen one day.
Ok, so arch doesn’t break because it’s unstable, it just breaks anyways. And it doesn’t break more in general, it just breaks worse more often. Got it.
I’ll still stay away from the bleeding edge.
That’s still exactly what I meant? Sure, arch may never break even though it’s unstable, but being unstable heightens the risk of it (or some program) breaking because changing library versions can break dependencies.
Dependency issues happen much more rarely on stable systems. That’s why it’s called stable. And I very much prefer a system that isn’t likely to create dependency issues and thus break something when I update anything.
I’d rather have a system that is stable and a few months out of date than a system that is so up to date that it breaks. Because then I cannot, in good conscience, use that system on a device that I need to just work every time I start it.
Second this. I’m not a huge fan of ubuntu itself and I have had issues with other debian-based distros (OMV for example), but mint has always been rock solid and stable on any of my machines. The ultimate beginner’s distro imo.
Larger downstream distros like manjaro (and steamOS for that matter) can be stable. I wouldn’t call manjaro a beginner’s distro though, like mint would be (no Linus, there’s no apt in manjaro), but it’s very daily-driveable.
Although, if you’re most people, just stay away from rolling release distros. There’s so little benefit unless you’re running bleeding edge hardware…
If it’s your first time trying linux, go with mint. It’s stable and almost every tutorial will work for you. If you know your way around a terminal already, the choice is all yours. I personally like Fedora.
That’s why I recommend mint. You have all the benefits of ubuntu but without the corporate stuff. And flatpak instead of snap.
Wasn’t that one of the main critiques of snap/ubuntu/canonical a few years ago already?
Aside from my personal dislike for its shade of purple, that has been my primary reason not to recommend ubuntu for a while, at least.
No, HDR can’t make your monitor brighter than it is. But it can take full advantage of the brightness and contrast of modern displays in a way SDR cannot. In almost every case HDR looks better than SDR, but brighter and/or higher-contrast displays benefit the most.
In a more technical sense, SDR content is mastered with a peak brightness of 100 nits in mind. HDR is mastered for a peak brightness of 1000 nits, sometimes 2000, and for the correspondingly higher contrast.
If you don’t watch movies in HDR on a modern TV, you’re not taking full advantage of its capabilities.
That’s incorrect. While it can be assumed that HDR content supports at least 10-bit colour, it is not required of either the monitor or the content. The main difference is contrast and brightness. SDR is mastered for a brightness of 100 nits and fairly low contrast. HDR is mastered for brightnesses of usually 1000 or even 2000 nits; since modern displays are brighter and capable of higher contrast, they can produce a more lifelike picture from the additional information within HDR.
Of course you need a sufficiently bright and/or high-contrast monitor for it to make a difference. An OLED screen or a display with a lot of dimming zones will produce the best results there. But even a cheap 350-nit TV can look a bit better in HDR.
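For the curious, the “mastered for X nits” part can actually be put into numbers. Here’s a rough sketch of the PQ (SMPTE ST 2084) transfer function, which is what HDR10 uses to map code values to absolute brightness — the function and its constants come from the ST 2084 spec, not from anything said above:

```python
# Sketch of the SMPTE ST 2084 "PQ" EOTF used by HDR10.
# Maps a normalized code value to absolute luminance in nits (cd/m^2).

M1 = 2610 / 16384        # PQ constants as defined in ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Convert an integer video code value to luminance in nits."""
    e = code / (2 ** bits - 1)    # normalize to [0, 1]
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

# The full 10-bit code range tops out at 10000 nits, far above what
# any consumer display reaches -- that headroom is where HDR highlights live.
print(round(pq_to_nits(1023)))   # -> 10000
```

SDR’s ~100-nit white sits surprisingly low in that code range, which is why HDR has so much room left over for bright highlights.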
Well, my internet connection would have to be a lot faster, and they would all need devices that support UHD h.265 and HDR10 playback. But if you have gigabit upload and they all have Shields or similar with just as fast connections, you’re good to go without transcoding (as long as no one wants to access it through mobile).
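To put a rough number on why the upload matters: here’s a back-of-the-envelope check for direct-play remote streaming. The bitrates and overhead are my assumptions (UHD Blu-ray remuxes commonly run somewhere around 50–80 Mbps), not figures from the comment above:

```python
# Rough estimate of upload bandwidth needed for direct-play (no transcoding).
# All numbers here are assumptions for illustration.

STREAM_MBPS = 60      # assumed average bitrate of one UHD HDR remux
OVERHEAD = 1.2        # ~20% headroom for bitrate spikes and protocol overhead

def upload_needed(viewers: int, stream_mbps: float = STREAM_MBPS) -> float:
    """Return the upload bandwidth (Mbps) needed for `viewers` direct streams."""
    return viewers * stream_mbps * OVERHEAD

# Three friends watching remuxes at once already wants a ~200+ Mbps uplink:
print(upload_needed(3))   # -> 216.0
```

So a handful of simultaneous remote viewers saturates most residential uplinks fast, which is exactly why transcoding (or pre-encoded lower-bitrate versions) comes up at all.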
I regularly watch on my server when I’m not home and a few friends of mine also have access to it, so I need the content to be available in SDR and lower bitrates. When I stream from home, I’d like to have access to the full quality and HDR though, so either I need multiple versions of each film or hardware encoding/tonemapping, and a used GTX 1050 Ti was a lot cheaper than the storage required to keep 4 or 5 versions of every film.
But yes, if you’re only streaming within the same network, hardware transcoding isn’t necessary in the slightest. But then an SMB file share might also suffice…
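The “GPU was cheaper than the storage” argument is easy to sanity-check. Every number below is a made-up illustration (library size, encode sizes, and prices are my assumptions, not figures from the comments above):

```python
# Hypothetical cost comparison: storing N pre-encoded versions of every film
# vs. buying a used GPU and transcoding on the fly.
# All figures are illustrative assumptions.

LIBRARY_FILMS = 300
EXTRA_VERSIONS = 4       # extra SDR / lower-bitrate copies per film
GB_PER_VERSION = 8       # assumed average size of one extra encode
EUR_PER_TB = 25          # assumed price of bulk HDD storage
GPU_PRICE_EUR = 80       # assumed used GTX 1050 Ti price

extra_tb = LIBRARY_FILMS * EXTRA_VERSIONS * GB_PER_VERSION / 1000
storage_cost = extra_tb * EUR_PER_TB
print(f"{extra_tb:.1f} TB extra -> ~{storage_cost:.0f}€ vs {GPU_PRICE_EUR}€ GPU")
```

With assumptions in that ballpark, the one-time GPU wins, and unlike the storage it also covers every bitrate a client might ask for instead of just the 4–5 you pre-encoded.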
As I need hardware transcoding, that makes emby immediately non-viable for me. I also usually watch via various apps and on TV, which, if you don’t have emby premiere, are also not free to use.
It’s free and open source. That alone is a big plus. And it works fairly well. What does emby do better, that warrants paying $120 for it?
Yea. I like my MacBook and I like macOS (yes, I know, shame on me). But in a few years, when Apple eventually stops supporting it, I can just put Linux on it and keep using it (or give it to a relative who just needs a working computer). It’s good hardware and, in true Apple fashion, it will probably outlast its software. I also have an old Core 2 Duo unibody MacBook lying around, and while it is possible to put the latest macOS on it (thanks, hackintosh community), Linux is a much better experience. The MacBook is sturdier and has a better trackpad and keyboard than most new laptops, even many that are much more expensive.
I wouldn’t recommend Intel CPUs (at least the last two gens) either but if all that matters to you in a GPU is hardware encoding (quality or codec support), like for a Jellyfin server, Intel ARC is unbeatable.
That’s fairly standard for serif fonts like Times New Roman, Baskerville, etc., though it is uncommon in modern sans-serif fonts and fonts designed to be viewed on a screen.
Sure, give a somewhat intelligent person between 20 and 40 a PC with Linux on it and they’ll figure it out. However, that doesn’t mean they have the patience to find out how to install Linux in the first place. And sure, they’ll figure out how to install apps. Until they try to download the installer.exe for Microsoft Office, because why would they know that it won’t work?
The problem isn’t that they couldn’t figure it out; the problem is that most people just want a working computer, not to relearn what they already know, or to learn what an operating system even is.
(And also, I remember reading some study that a lot of late Gen Z and younger (the ones who didn’t grow up with Windows XP or earlier) are actually less tech-savvy than older generations, because they’re used to tech that doesn’t really require troubleshooting.)
A few things (disclaimer: I’m both a Linux and mac user. Linux on my gaming machine, mac on my work machines):
• Privacy is a big factor. Microsoft’s track record is bad, even among non-FOSS companies.
• Bloatware and ads. Microsoft’s insistence on pushing OneDrive, Edge, 365 and Bing is annoying to say the least. Why do they think I’m going to change my mind about that after a minor update?
• The UX is less than stellar. Why does the OS have 4 different UI styles for different programs that sometimes do almost, but not quite, the same thing, so you’ll have to use both versions?
• It’s almost impossible for me to keep my desktop tidy short of not using it. I’m dependent on macOS’s Stacks feature. On Linux I never had enough random files for it to be a problem.
In short, Windows just annoys me. While Linux and macOS stay out of my way and let me just do my stuff, Windows is constantly pulling my attention away from what I actually want to do, and that was even when I was using my PC solely as a gaming machine.
Edit: formatting
Not having to pay for hardware transcoding/tonemapping is the biggest “selling” point for Jellyfin. I used plex before. It worked well, but I didn’t want to pay 100€ for transcoding. Never tried emby for the very same reason.