What e-waste? Your computer still works, you can install other browsers and apps 🤷‍♂️


podman unshare is your friend. And place :Z at the end of the volume share to make SELinux happy.
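A minimal sketch of both tips, assuming a rootless podman setup; the paths and image are placeholders:

```shell
# :Z relabels the host directory with a private SELinux label
# so the container is allowed to access it (hypothetical paths).
podman run --rm -v /home/user/data:/data:Z docker.io/library/alpine ls /data

# Files written by the container may show up on the host owned by a
# mapped subordinate UID. podman unshare enters the user namespace,
# where "root" (UID 0) is your own user, so this chown hands the
# files back to you on the host side.
podman unshare chown -R 0:0 /home/user/data
```

Use `:z` (lowercase) instead if several containers need to share the same volume.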


"No problem" is a bit of an exaggeration, though. For starters, if you are running as a non-root user, it bites you when trying to share a host volume, especially with SELinux. I can imagine other situations where one has to fiddle with it. But yes, go with podman whenever possible.
I was kinda guessing that images, video and such are your use case - yep, those might really benefit from faster memory and plenty of cores. Not sure if it affects ML much, since that'd be calculated on the GPU. Thanks for the info on gaming.
How time flies, eh? I went searching for more info - https://www.techpowerup.com/339178/ddr6-memory-arrives-in-2027-with-8-800-17-600-mt-s-speeds It also mentions an architectural change to increase the speed further.
Not sure, but I assume a couple of years at least - it might also be affected by stupid external factors, like insane tariffs and such. It will also take some time for DDR6 performance to go up - for both mobos and memory. So even at the start you might still be better off with DDR5, and if you go with DDR6 you might need to replace the mobo and memory later to get better performance. That's my impression of the hardware situation; I might be wrong, though.
What is your use case for threadripper, I’m curious? AFAIK it’s not a good match for gaming at least. Or is it?
Why DDR6 though? It will be expensive at start and won’t bring much to your workstation unless you are really memory perf bound.
He’s talking about non-Android phones. We all know that Android is Linux-based.


While measuring power efficiency is not an easy task, I doubt that AMD is better. Here is an older article https://www.tomshardware.com/news/geforce-rtx-4070-vs-radeon-rx-6950-xt-which-gpu-is-better But you are right, things seem to be changing https://gamersnexus.net/gpus/incredibly-efficient-amd-rx-9070-gpu-review-benchmarks-vs-9070-xt-rtx-5070 Hopefully AMD improves on efficiency in the future.


From just hardware perspective, Nvidia cards are more energy efficient.
Edit: I stand corrected, the 9070 series is much more energy efficient.
My bad, I used the wrong word there. I meant that Windows is very compatible with its older versions and different flavors.
I think I’m cautious enough not to have had that experience, luckily. But why does that matter? I’m still waiting for your rationale on why Linux experiences fewer infections. And you keep asking unimportant questions…
Servers are a different story. I’m both a Windows and Linux user, leaning more towards the latter recently. I’m still wondering why you think Linux is more resistant to malware - besides the incompatibility (mentioned in another reply here). Your experience doesn’t say much about the why, and I wrote my theory.
Ha, yes, incompatibility is the secret defense of Linux 🫣. But even without root access, malware can do a lot of damage.
You sure, though? Windows has more viruses because it’s more popular (on desktop) and monolithic, not because Linux is much better in that regard. IOW, Linux is not magically virus-resistant. If you run an infected file, it will infect both without much trouble. Removing the infection would also be similar. At least that’s my understanding.


NFS lacks security unless it’s NFSv4 paired with Kerberos, AFAIK.
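For illustration, a minimal sketch of what that pairing looks like on the server side; the export path and subnet are placeholders, and the Kerberos realm/keytab setup is assumed to already exist:

```
# /etc/exports - hypothetical NFSv4 export requiring Kerberos
# sec=krb5p = Kerberos authentication + integrity + encryption;
# krb5 (auth only) and krb5i (auth + integrity) are weaker options.
/srv/share  192.168.1.0/24(rw,sec=krb5p)
```

Without the `sec=krb5*` options, classic NFS trusts the client-reported UID/GID (`sec=sys`), which is the security gap being referred to.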


Yeah, that won’t help, but you could still try, just in case. Other than that, are both client and server on the same IP segment? About the router - I’d really suggest using your own router (also a firewall) behind the provided one. Otherwise you are exposing your internal network to the network provider, which you might not want, and at the same time you don’t have control over the core device in your network.
Restic for backups works as well.
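A minimal restic workflow, as a sketch; the repository path and backup directory are placeholders, and the repo password is prompted for (or taken from RESTIC_PASSWORD):

```shell
# One-time: create an encrypted repository (hypothetical path).
restic -r /mnt/backup/repo init

# Back up a directory; restic deduplicates data across snapshots,
# so repeated runs only store changed blocks.
restic -r /mnt/backup/repo backup /home/user/documents

# Inspect and thin out old snapshots.
restic -r /mnt/backup/repo snapshots
restic -r /mnt/backup/repo forget --keep-daily 7 --keep-weekly 4 --prune
```

The same `-r` syntax also works against remote backends (sftp, S3, rest-server), which is handy when the NAS route doesn't pan out.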