• 0 Posts
  • 36 Comments
Joined 4 months ago
Cake day: November 5th, 2024




  • Yeah, let’s instead install a massive, bloated shit project that the original developers abandoned years ago and whose maintainers can’t make heads or tails of the codebase because it’s too massive to maintain, with enough dependencies to make even a small child feel independent by comparison.

    All so that we can, uh, synchronize a markdown text file across 3 computers.

    These projects exist so that we don’t all have to re-invent the wheel every single time we need something simple. They have a purpose, even if they’re not pushing the envelope. I’ve developed a bunch of software to do extremely simple things for myself because all the existing options are massive and bloated and do a million more things than I need.

    I’m sure your projects look impressive on your resumé, though.



  • Even if that were possible, I don’t want to crash innocent people’s browsers. My tar pits are deployed on live environments that normal users could stumble into, and it’s overkill anyway: if you simply answer what should be a 404 Not Found with 200 OK and serve 15 MB on the “error” page, bots stop visiting your site, because you’re not important enough to deal with. It’s a low bar, but your data isn’t worth someone studying your tactics and even thinking about circumventing them. They just stop attacking you. Something like the sketch below is all it takes.
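
    A minimal sketch of that idea in Python with Flask (my choice of framework for illustration; nothing in the comment says that’s what’s actually deployed):

        # Serve 200 OK plus a huge body wherever a 404 would normally go,
        # so naive scrapers waste bandwidth and lose interest.
        from flask import Flask, Response

        app = Flask(__name__)

        JUNK = b"A" * (15 * 1024 * 1024)  # ~15 MB of filler, built once at startup

        @app.errorhandler(404)
        def fake_ok(_error):
            # Unknown path? Pretend everything is fine and hand over the payload.
            return Response(JUNK, status=200, mimetype="text/html")

        if __name__ == "__main__":
            app.run()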







  • Sending is someone else’s problem.

    It becomes my problem when I’m the one who wants the files and no free service is going to accept an 80 GB file.

    It is exactly my point that I should not have to deal with third parties or something as massive and monolithic as Nextcloud just to do the internet equivalent of smoke signals. It is insane. It’s like someone telling you they don’t want to bike to the grocer five minutes away because it’s currently raining, and you recommending a monster truck.


  • Why is it so hard to send large files?

    Obviously I can just dump files on my server and people can download them from a browser, but how are they gonna send me anything? I’m not gonna put an upload form on my site; that’s a security nightmare waiting to happen (see the sketch after this comment for what even a minimal one has to guard against). HTTP uploads have always been wonky, for me anyway.

    Torrents are very finicky with 2-peer swarms.

    instant.io (torrents in the browser) has never worked right.

    I can’t ask everyone to install a dedicated piece of software just to send me large files very occasionally.
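
    To illustrate the “security nightmare”: a rough, hypothetical sketch of the bare minimum an upload endpoint would need, a shared-secret token and a hard size cap. Flask again, purely for illustration; the /drop route, the X-Upload-Token header, and the paths are all made up:

        # Hypothetical upload receiver: shared-secret token plus a hard size cap.
        import os
        from flask import Flask, request, abort
        from werkzeug.utils import secure_filename

        app = Flask(__name__)
        app.config["MAX_CONTENT_LENGTH"] = 100 * 1024**3  # hard cap: reject anything over ~100 GB

        UPLOAD_TOKEN = os.environ.get("UPLOAD_TOKEN", "change-me")  # shared out of band
        UPLOAD_DIR = "/srv/drop"

        @app.route("/drop", methods=["POST"])
        def drop():
            if request.headers.get("X-Upload-Token") != UPLOAD_TOKEN:
                abort(403)  # no valid token, no upload
            f = request.files.get("file")
            if f is None or not f.filename:
                abort(400)
            # secure_filename strips path tricks like "../../etc/passwd"
            f.save(os.path.join(UPLOAD_DIR, secure_filename(f.filename)))
            return "ok\n"

    And even this says nothing about disk-filling, rate limiting, or cleanup, which is rather the point.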




  • The misunderstanding seems to be between software and hardware. It is good to reboot Windows and some other operating systems because they accumulate errors and quirks. It is not good to power-cycle your hardware, though; it increases wear.

    I’m not on an OS that needs to be rebooted; I count my uptime in months.

    I don’t want you to pick up a new anxiety about rebooting your PC, though. Components are built to last, generally speaking. Even if you power-cycled your PC 5 times daily, you’d most likely upgrade your hardware long before it wears out.



  • To me, the appeal is that my workflow depends less on my computer and more on my ability to connect to a server that handles everything for me. Workstation, laptop, or phone? Doesn’t matter; just connect to the right IPs and get working. Linux is, of course, the holy grail of interoperability, and I’m all Linux. With a little bit of setup, I can make a lot of things talk to each other seamlessly. SMB on Windows is a nightmare, but on Linux, once I set up SSH keys, I can just open a file manager, type sftp://<hostname>, and browse that machine as if it were a local folder (the sketch after this comment shows roughly what that amounts to under the hood). I can do a lot of work from my genuinely-trash laptop because it’s the server that’s doing the heavy lifting.

    TL;DR -

    My workflow becomes “client-agnostic”, and I value that a lot.
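
    A rough sketch of what that sftp:// trick does under the hood, using Python’s paramiko library; “myserver” is a stand-in for whatever host you have keys set up for:

        # Browse a remote machine over SFTP, authenticating with the keys in ~/.ssh.
        import paramiko

        client = paramiko.SSHClient()
        client.load_system_host_keys()   # trust hosts already in ~/.ssh/known_hosts
        client.connect("myserver")       # picks up your SSH keys automatically

        sftp = client.open_sftp()
        print(sftp.listdir("."))         # list the remote home directory
        sftp.close()
        client.close()

    The file manager does the same dance for you; the SSH keys are what make it feel like a local folder.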