

Fuck yeah Randall is based af.
Formerly /u/Zagorath on the alien site.
Honestly I can’t even figure out how to get that alpine-chrome image to work. I edited my Dockerfile to say `FROM zenika/alpine-chrome:with-puppeteer` instead of `FROM node:22`. I tried changing `USER node` to `USER chrome`. I removed all the apt-get dependencies that were needed to get Puppeteer working in Docker on my PC in the first instance, and added `--chown=chrome` to my `COPY package.json` line, all as described in the with-puppeteer example. I also added the `ENV` lines from that example. (I also tried various combinations of some of the aforementioned changes but not others.) Now I get an error with the `npm install` step.
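For reference, after those changes my Dockerfile looks roughly like this (a sketch from memory; the `ENV` values are just the ones the with-puppeteer example uses, and the entry point is a placeholder):

```Dockerfile
# Rough sketch of the edited Dockerfile, reconstructed from memory.
FROM zenika/alpine-chrome:with-puppeteer

# ENV lines as in the with-puppeteer example: use the image's Chromium
# rather than letting Puppeteer download its own.
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true \
    PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser

WORKDIR /usr/src/app

# copy owned by the chrome user rather than root, per the example
COPY --chown=chrome package.json ./

USER chrome

RUN npm install

COPY --chown=chrome . .

# placeholder entry point
CMD ["node", "index.js"]
```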
15.44 npm ERR! code 1
15.44 npm ERR! path /usr/src/app/node_modules/canvas
15.44 npm ERR! command failed
15.44 npm ERR! command sh -c prebuild-install -r napi || node-gyp rebuild
15.45 npm ERR! prebuild-install warn install No prebuilt binaries found (target=7 runtime=napi arch=x64 libc=musl platform=linux)
15.45 npm ERR! gyp info it worked if it ends with ok
15.45 npm ERR! gyp info using node-gyp@8.4.1
15.45 npm ERR! gyp info using node@20.15.1 | linux | x64
15.45 npm ERR! gyp info find Python using Python version 3.11.10 found at "/usr/bin/python3"
15.45 npm ERR! gyp http GET https://nodejs.org/download/release/v20.15.1/node-v20.15.1-headers.tar.gz
15.45 npm ERR! gyp http 200 https://nodejs.org/download/release/v20.15.1/node-v20.15.1-headers.tar.gz
15.45 npm ERR! gyp http GET https://nodejs.org/download/release/v20.15.1/SHASUMS256.txt
15.45 npm ERR! gyp http 200 https://nodejs.org/download/release/v20.15.1/SHASUMS256.txt
15.45 npm ERR! gyp info spawn /usr/bin/python3
15.45 npm ERR! gyp info spawn args [
15.45 npm ERR! gyp info spawn args '/usr/src/app/node_modules/node-gyp/gyp/gyp_main.py',
15.45 npm ERR! gyp info spawn args 'binding.gyp',
15.45 npm ERR! gyp info spawn args '-f',
15.45 npm ERR! gyp info spawn args 'make',
15.45 npm ERR! gyp info spawn args '-I',
15.45 npm ERR! gyp info spawn args '/usr/src/app/node_modules/canvas/build/config.gypi',
15.45 npm ERR! gyp info spawn args '-I',
15.45 npm ERR! gyp info spawn args '/usr/src/app/node_modules/node-gyp/addon.gypi',
15.45 npm ERR! gyp info spawn args '-I',
15.45 npm ERR! gyp info spawn args '/home/chrome/.cache/node-gyp/20.15.1/include/node/common.gypi',
15.45 npm ERR! gyp info spawn args '-Dlibrary=shared_library',
15.45 npm ERR! gyp info spawn args '-Dvisibility=default',
15.45 npm ERR! gyp info spawn args '-Dnode_root_dir=/home/chrome/.cache/node-gyp/20.15.1',
15.45 npm ERR! gyp info spawn args '-Dnode_gyp_dir=/usr/src/app/node_modules/node-gyp',
15.45 npm ERR! gyp info spawn args '-Dnode_lib_file=/home/chrome/.cache/node-gyp/20.15.1/<(target_arch)/node.lib',
15.45 npm ERR! gyp info spawn args '-Dmodule_root_dir=/usr/src/app/node_modules/canvas',
15.45 npm ERR! gyp info spawn args '-Dnode_engine=v8',
15.45 npm ERR! gyp info spawn args '--depth=.',
15.45 npm ERR! gyp info spawn args '--no-parallel',
15.45 npm ERR! gyp info spawn args '--generator-output',
15.45 npm ERR! gyp info spawn args 'build',
15.45 npm ERR! gyp info spawn args '-Goutput_dir=.'
15.45 npm ERR! gyp info spawn args ]
15.45 npm ERR! Package pixman-1 was not found in the pkg-config search path.
15.45 npm ERR! Perhaps you should add the directory containing `pixman-1.pc'
15.45 npm ERR! to the PKG_CONFIG_PATH environment variable
15.45 npm ERR! Package 'pixman-1', required by 'virtual:world', not found
15.45 npm ERR! gyp: Call to 'pkg-config pixman-1 --libs' returned exit status 1 while in binding.gyp. while trying to load binding.gyp
15.45 npm ERR! gyp ERR! configure error
15.45 npm ERR! gyp ERR! stack Error: `gyp` failed with exit code: 1
15.45 npm ERR! gyp ERR! stack at ChildProcess.onCpExit (/usr/src/app/node_modules/node-gyp/lib/configure.js:259:16)
15.45 npm ERR! gyp ERR! stack at ChildProcess.emit (node:events:519:28)
15.45 npm ERR! gyp ERR! stack at ChildProcess._handle.onexit (node:internal/child_process:294:12)
15.45 npm ERR! gyp ERR! System Linux 6.10.14-linuxkit
15.45 npm ERR! gyp ERR! command "/usr/bin/node" "/usr/src/app/node_modules/.bin/node-gyp" "rebuild"
15.45 npm ERR! gyp ERR! cwd /usr/src/app/node_modules/canvas
15.45 npm ERR! gyp ERR! node -v v20.15.1
15.45 npm ERR! gyp ERR! node-gyp -v v8.4.1
15.45 npm ERR! gyp ERR! not ok
15.45
A complete log of this run can be found in: /home/chrome/.npm/_logs/2025-02-18T01_04_35_846Z-debug-0.log
[+] Running 0/1
- Service node Building 18.9s
failed to solve: process "/bin/sh -c npm install" did not complete successfully: exit code: 1
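From that log, my read is that canvas has no prebuilt binary for musl (`libc=musl` in the prebuild-install warning), so it falls back to a native build, which then fails because pixman (and presumably the other native libraries canvas links against) isn’t installed in the Alpine image. I haven’t tested it, but I suspect something like this would be needed in the Dockerfile before `npm install` (the package names are my guess at the usual Alpine equivalents of the apt-get ones I removed):

```Dockerfile
# Untested guess: build tools plus the native libraries node-canvas needs,
# installed as root before dropping back to the chrome user.
USER root
RUN apk add --no-cache \
    build-base \
    g++ \
    python3 \
    pkgconfig \
    pixman-dev \
    cairo-dev \
    pango-dev \
    jpeg-dev \
    giflib-dev
USER chrome
```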
Which I just now (after posting) noticed was already mentioned in a different comment. Sorry!
I’m guessing the user who made that other comment is on lemmy.world? I can’t see any comment other than yours, and LW has known federation issues (issues that would be fixed if the instance weren’t 5 versions behind…) which mean I probably won’t be able to see it for about 2 days. So thanks!
I haven’t looked into the suggestion in great detail yet, but I will say I’m already running as a non-root user (`USER node` is a line in my Dockerfile). I’m not sure what a seccomp profile is, but in case it wasn’t clear from the original post, I just want to emphasise that the current configuration works in Docker on my Windows PC. It’s only on the Synology NAS that it fails.
Personally I’m not enormously worried about SSH, because I’m behind NAT anyway, but yeah it’s definitely still something I’d rather keep off if not in use.
Is there a way to get a terminal on the Synology itself, or is SSH from my PC the only way?
I would love to containerise it. I worked with Docker in a previous job, but honestly I’ve forgotten most of how to work with it. Would be a nice refresher to try and relearn how to create Dockerfiles and docker-compose.yamls.
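For context, the end goal is only something minimal along these lines (the service name and volume path here are placeholders, not my actual config):

```yaml
# Sketch of the kind of docker-compose.yaml I mean; service name and
# volume path are placeholders rather than my real config.
services:
  bot:
    build: .                       # build from the Dockerfile in this directory
    restart: unless-stopped
    volumes:
      - ./data:/usr/src/app/data   # persist any bot state outside the container
```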
Unfortunately I currently have two problems. First: I seem to be completely unable to test this on my desktop. When I open Docker on my PC, it complains that I need to run `wsl --shutdown`, but despite doing that many times, it still complains before immediately closing.
So I was going to try doing it entirely on the Synology. And then I ran into the issue that…I have no idea how to even start with that. When I search for Docker in the Package Manager the only thing that comes up is Synology’s own container manager, and I have no idea how to work with that.
Yeah I’m pretty sure my Synology should be able to run containers. It’s a DS923+. But unfortunately when I search for Docker in the Package Manager the only thing that comes up is Synology’s own container manager, and I have no idea how to work with that.
How do you run a docker container on Synology? I have a DS923+ which AFAIK should be able to run it, but when I search for Docker in the Package Manager the only thing that comes up is Synology’s own container manager, and I have no idea how to work with that.
I can, but that would require manually starting it up every time I restart my computer—which is daily, for the most part. And there are times when I don’t even turn on my computer for the day, or don’t do so before the 2pm time the bot needs to run. It would be better to have it running on a system that’s always online.
I shared my parents’ account, but rather than locking me and my sister out of Netflix when they started bringing in restrictions against account sharing, Netflix just…deleted our account entirely. Maybe something to do with my parents being in a different country from me and my sister.
So anyway, we haven’t had Netflix for like 2 years now thanks to that. Thankfully we’ve got a really nice…alternative streaming site with all the stuff Netflix has, plus anything else available around the web, with the niceties of being streaming-based rather than requiring downloads ahead of time, keeping your watch progress, etc.
I don’t even think that’s remotely true.
I’ve seen two cases that directly impacted my ability to use Firefox, and I can only presume there are many more. Those were support for the column-span CSS property (available since 2010 in other browsers with a vendor prefix, and since early 2016 without, but not until late 2019 in Firefox), and support for iPadOS’s multi-window functionality (introduced mid 2019; Firefox has only had it for a handful of months now). I have first-hand experience telling me very directly that this is true.
There’s also been a lot of talk about Firefox’s lack of support for PWAs. I haven’t experienced that myself, so I can’t comment beyond noting that others have complaints.
The point is that with open source you can effectively leech off of Google for now, while still retaining the flexibility to nope out and do your own thing at any point you decide.
Considering just how severely behind they are already (as I mentioned in my other comment, they’re often 3–5 years behind other browsers in implementing new web standards or operating system features), I see anything they can do to reduce how much they need to maintain independently as a good thing. In an ideal world where they had all the funding and development power they could want I might say sticking with the completely independent Firefox would be great. But that just isn’t where they’re at today.
They wouldn’t be at the mercy of anything. That’s…how open source works. If it changes in a way that breaks things for you, don’t pull that change. At that point, if the change is drastic enough to require it, you can turn that soft fork into a hard fork and hope that Edge, Brave, Vivaldi, Opera, etc. join you; something that would significantly hamper Google’s ability to maintain their dominance of the browser engine market. That’s a choice that they simply don’t have today when being based on Firefox and Gecko means using an inferior browser platform.
Honestly I’ve been saying for some time that Mozilla’s resources would be much better spent making Firefox a soft fork of Chromium. Primarily: use the Blink browser engine and V8 JS engine, with only the changes to those that they deem absolutely necessary, and maintain a privacy-forward Chromium-based browser. Maybe try and enlist the help of Brave, Vivaldi, and other browsers that are currently Chromium but which prefer more privacy than Google offers.
It’s not zero effort, and as Google continues to develop Chromium with assumptions like the removal of Manifest V2, a soft fork might take some work to maintain, but it cannot possibly be as much effort as maintaining an entire browser engine.
I couldn’t tell you for sure, because I don’t use it or its commercial competition very much. That said, personally when I have needed to use it, I’ve always found the gap between Audacity and its pro equivalents in terms of basic usability to be much lower than in other creative fields. GIMP, in particular, is nigh unusable compared to Photoshop.
If you’re interested in seeing more, here’s a video where the new lead announced that he was taking it over. And the official Audacity YouTube channel has been posting overviews of its updates since then. I think it likely that the first two updates (3.1 and 3.2) contain some of the most critical functionality.
FOSS should encourage privacy and freedom. Cloud storage doesn’t normally do that.
Then don’t use it? It’s that simple. If it makes money for them and some users like it, there’s nothing wrong with that.
that’s not the right way to get paid
I don’t know a whole lot about what Audacity is up to these days, but the same company owns MuseScore, and it sounds like they’re doing kinda similar things in terms of monetisation. The core software itself is still free, but there are optional cloud services on top of that which you can pay for.
I don’t see what’s wrong with this. Cloud services provide a convenience. Some people like that convenience and are willing to pay for it. Others might be perfectly ok doing it themselves and won’t pay.
It helps that the new head of design for both of these products is a guy who really knows his shit. He’s already taken MuseScore from an application that nobody in their right mind would use if they could afford the commercial competitors, to a legitimately great music engraving application, and he’s been on Audacity too since 2021.
Hardly the first time Spanish courts have completely fucking failed to understand technology (I will never get over how monumentally stupid the Mario Costeja case was), but this manages to take it to an incredible new level. The idiotic lack of forethought aside, you’d think that once the consequences were made clear and people were actively being prevented from accessing all sorts of legitimate websites, including GitHub, they’d immediately reverse course. How the fuck does one private sporting company gain the right to force a huge swathe of the Internet to be blocked, anyway? Utter fucking nonsense.