

Oh! Thanks! I like that link. Definitely researching that more.
Personally, I think IPv6 is not a good choice for any service you don’t want associated with a specific device. As I understand it, the prefix delegation comes from the ISP, but the interface ID is often derived from the machine’s MAC address, which ties it to specific hardware, can reveal information about the host, and could let the host be deanonymized across networks.
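For the curious, here’s a minimal sketch of the classic EUI-64 derivation I’m talking about (the MAC address below is made up). Modern OSes usually turn on privacy extensions instead, but it shows why a MAC-derived interface ID follows the machine around:

```python
def eui64_interface_id(mac: str) -> str:
    """Classic SLAAC EUI-64: flip the universal/local bit of the first octet,
    then insert ff:fe between the two halves of the MAC address."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                              # flip the U/L bit
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]
    return ":".join(f"{(eui[i] << 8) | eui[i + 1]:04x}" for i in range(0, 8, 2))

# Same MAC -> same interface ID on every network, regardless of the prefix,
# which is exactly what makes the host linkable across networks.
print(eui64_interface_id("52:54:00:ab:cd:ef"))    # hypothetical MAC -> 5054:00ff:feab:cdef
```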
I’d stick with IPv4 because NAT gives a tad more anonymity. Just my $0.02 though.
That’s good news! It would be great if relays made it difficult to be targeted. I last tinkered with Tor almost… Jeez!.. 20 years ago haha!
I ran a relay too way, way back in the day and I remember almost a third of the sites I used blacklisted my IP address within days. It wasn’t cool.
I ended up shutting it down, resetting my cable modem, and spoofing a new MAC address on my router to get a new IP address to get everything working again.
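If anyone wants to do the same on a Linux box acting as the router, here’s a rough sketch (needs root, the interface name is a placeholder, and it only works if your ISP hands out leases per MAC):

```python
import random
import subprocess

def random_laa_mac() -> str:
    """Random locally-administered, unicast MAC: set the 0x02 bit and clear
    the 0x01 bit of the first octet."""
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] = (octets[0] | 0x02) & 0xFE
    return ":".join(f"{o:02x}" for o in octets)

iface = "eth0"          # hypothetical WAN-facing interface
mac = random_laa_mac()
for cmd in (["ip", "link", "set", "dev", iface, "down"],
            ["ip", "link", "set", "dev", iface, "address", mac],
            ["ip", "link", "set", "dev", iface, "up"]):
    subprocess.run(cmd, check=True)
print(f"{iface} now presents {mac}; release/renew DHCP to pick up a new lease")
```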
Using a VPN is smarter. I wouldn’t run that on IPv6 whatsoever.
About 10 years ago, I had just moved and my new neighbor had an open network. Problem was, they were 2 houses away and across the street. I set up a tiny repeater in my car with a battery pack and parked halfway between us.
It worked surprisingly well for about 6 months.
I just did something sort of like what you’re doing, and after a few hiccups it’s working great. My Synology just couldn’t handle transcoding with docker containers running in the background.
A couple of differences from your plan: first, I chose an N100 over the N150 because it uses less power and I wasn’t loading the thing up with CPU-dependent tasks. The N150 is about 30% faster if memory serves, but draws more power. Second, do you really need a second M.2 SSD BTRFS volume? Your Synology is perfectly capable of being the file storage. I’d personally spend the money you’d save by buying a smaller N150 device on a tasty drive to expand the existing capacity, then start a second pool from scratch.
Finally, I wouldn’t worry about converting media unless you are seriously pinched for space. Every time you do, you lose quality.
Ditto to your comment, except on power usage. I moved my Plex/Jellyfin (and hopefully Immich soon) docker containers to an N100 for the hardware acceleration. TDP is 6 watts on some of these devices, and CPU use sits around 2% unless Plex is doing DB optimizations (about 60% for a bit). I haven’t measured consumption on my older server, but I feel moving some CPU-intensive services to the GPU’s hardware encoder is saving a few watts.
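In case it helps anyone, this is roughly how the container gets the iGPU on my box. A sketch using the docker Python SDK rather than my actual compose file; the image is real, but the paths and names are placeholders:

```python
import docker

client = docker.from_env()

# Hand /dev/dri (Intel Quick Sync on an N100) to the container so Jellyfin
# can use VAAPI/QSV for hardware transcoding instead of burning CPU.
client.containers.run(
    "jellyfin/jellyfin:latest",
    name="jellyfin",
    detach=True,
    devices=["/dev/dri:/dev/dri"],
    volumes={
        "/srv/jellyfin/config": {"bind": "/config", "mode": "rw"},
        "/srv/media": {"bind": "/media", "mode": "ro"},
    },
    network_mode="host",
)
```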
There’s an M.2 slot on the mobo, so I’d probably go with an NVMe drive over a SATA SSD.
I second the RAM recommendation. I have 32GB in my Synology and it needed it for all those docker containers and VMs.
As for the mobo: not thrilling, but it could work. If you have to add a PCIe card for more drives, 10G networking, or more NVMe, you’ll run out of slots pretty quickly with a GPU in there too.
I tried to update my lemmy instance and it all went so horribly wrong. The DB never came up, errors everywhere, and searching implied I had updated to a dev branch sometime in the past (I’m not a dev, and I don’t think I did), so it’ll be console work and DB queries for a fix.
Out of time and overwhelmed, I restored backups and buried my head in the sand. Nope, not now. In the future, yes, but oh, not now.
People don’t use a VPN to bypass CGNAT; they use it to protect their IP address. The ol’ saying “Don’t shit where you eat” applies. Probably worth looking into, depending on your location.
Same here. I don’t like some of the recent decisions, but I remember the time I looked at the value and thought “yeah, this is working, valuable, and I can get behind it”, and bought the lifetime pass.
And I used the hell out of it! I don’t regret supporting the developers at all.
But features like plugins disappear, rolled into in-house teams. The in-house versions work better, but cost more to maintain.
It’s ambitious and gives developers plenty of work, but I feel the new redesign bit off more than they could chew and overran its budget. They may be trying to balance the books now.
Jellyfin certainly took off. Great for them. It just wasn’t polished or an option when I set things up way back then.
Same boat here. I chose Plex because the apps were everywhere. Smart TVs, phones, web…
I could switch, no problem. I just don’t want to have to teach my parents a new app. OMFG!
I can understand new features being behind a fee, but this is putting old, old capabilities behind a paywall. Hmmm…
This, along with the recent decision to remove Watch Together, sort of eliminates the whole reason I would have tried Plex so many years ago.
I’m a fan of Plex (it’s worked for me) and understand the Jellyfin crowd too. I’m worried about who is calling the shots at the moment. They aren’t aligning with their users.
Heck yeah! Old desktops or laptops are how most of us got started.
Things to consider:
I’m sort of looking to upgrade, and N100s or N150s are looking good. Jellyfin can do transcoding, so that takes a little grunt. This box would work well for me. It’s not a storage solution, but it can run docker and a handful of services.
Since you can’t use the GPU for rendering or recording, a faster CPU couldn’t hurt and is probably your best bet, but yours isn’t really thaaaat old. (Assuming there are no other hardware bottlenecks, of course.)
There are, cough cough, I think, ways to decrypt and record HDCP-protected streams, but I’d imagine you’d need a separate render unit and recording unit in your setup. Sort of complicated.
Ah, bummer. That’s most of my software tricks exhausted. My next step would be breaking out the hardware equipment, but that’s pricey.
That PC should be able to play a stream just fine. I’d check whether background apps are hogging resources and make sure hardware acceleration is active. Scale the resolution down (no point in upscaling 1080p content) and record to a less resource-intensive format, perhaps H.264 instead of H.265/HEVC.
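As a rough example of that last bit (filenames are placeholders, and your capture tool may expose the same options in its UI), a 1080p H.264 re-encode with ffmpeg looks something like this:

```python
import subprocess

# Downscale to 1080p (keeping aspect ratio) and encode with H.264, which is
# much cheaper to produce than H.265/HEVC on older hardware; copy audio as-is.
subprocess.run([
    "ffmpeg", "-i", "capture_source.mkv",
    "-vf", "scale=-2:1080",
    "-c:v", "libx264", "-preset", "veryfast", "-crf", "23",
    "-c:a", "copy",
    "recording_1080p.mp4",
], check=True)
```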
I use smokeping just for fun. It doesn’t measure throughput, just latency, but they are loosely linked.
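If you just want a quick latency number without standing up smokeping, something like this works as a rough proxy (host and port are just examples; it times TCP handshakes rather than ICMP pings):

```python
import socket
import time

def tcp_latency_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Average the time of a few TCP handshakes as a crude latency estimate."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sum(times) / len(times)

print(f"{tcp_latency_ms('example.com'):.1f} ms")
```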
Bitwarden/vaultwarden is a popular option for selfhosters.