

How? If you got hit by this you are looking at restoring the system from a safe previous version.
And the compromised versions get pulled, not superseded by a new release, so once you rebuild you would go back to a safe version…


As long as the bot is not allowed to automatically merge minor version bumps in libraries…


You can mitigate similar attacks by editing your .npmrc:
min-release-age=7 # days
ignore-scripts=true


That part of the speech is perfectly standalone and applies to many regimes outside the US


Every time somebody posts something complacent like this telling us not to worry, I remember a speech by Arnim Zola in the Winter Soldier movie.
"Hydra was founded on the belief that humanity could not be trusted with its own freedom.
What we did not realize is that if you try to take that freedom, they resist.
The war taught us much. Humanity needed to surrender its freedom willingly.
[…]
For 70 years, Hydra has been secretly feeding crisis, reaping war. And when history did not cooperate, history was changed.
Hydra created a world so chaotic that humanity is finally ready to sacrifice its freedom to gain its security."


It doesn't have a dashboard per se for centralized administration. It has a web UI to manually create/upload collections. I personally use it in a very simplistic way and just reupload an updated .vcf file with all my contacts from time to time.
About user management, I don't know how you installed Radicale, but they have docs for that: https://radicale.org/v3.html#authentication


They just need to push for thin clients that rely on cloud computing, as consumer hardware prices make it unaffordable to own stronger hardware yourself.


Yes I do. I cooked up a small Python script that runs at the end of every daily backup:

import json
import os
import subprocess

# Output directory
OUTPUT_DIR = "/data/dockerimages"
os.makedirs(OUTPUT_DIR, exist_ok=True)

# Grab all the docker images. Each non-empty line is a JSON string describing one image
imagenes = subprocess.run(
    ["docker", "images", "--format", "json"],
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
).stdout.decode().split("\n")

for imagen in imagenes:
    if not imagen:
        continue  # skip the trailing empty line
    datos = json.loads(imagen)
    # ID of the image to save
    imageid = datos["ID"]
    # Compose the output name like this:
    # ghcr.io-immich-app-immich-machine-learning:release:2026-01-28:3c42f025fb7c.tar
    outputname = (
        f"{datos['Repository']}:{datos['Tag']}:"
        f"{datos['CreatedAt'].split(' ')[0]}:{imageid}.tar"
    ).replace("/", "-")
    # If the file already exists just skip it
    if not os.path.isfile(f"{OUTPUT_DIR}/{outputname}"):
        print(f"Saving {outputname}...")
        subprocess.run(["docker", "save", imageid, "-o", f"{OUTPUT_DIR}/{outputname}"])
    else:
        print(f"Already exists {outputname}")


26, tho this includes multi-container services like Immich or Paperless, which have 4 containers each.


echo 'dXIgbW9tCmhhaGEgZ290dGVtCg==' | base64 -d


I run changedetection and monitor the sample .yml files that projects usually host directly in their git repos


Bring back my computer as well


People say go back in time to pick the correct lotto numbers
I say go back in time and sell my 8TB disk for 80 billion


For the price of the car I would expect the SSD to grow wheels so you could actually drive it


gpt-oss:20b is only 13GB


This includes everything for a total of 261G



I have 4 Arch machines, so I actually host a local mirror of the entire Arch repo on my homeserver and just sync from there at full speed
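For reference, a setup like that is usually just a periodic rsync job plus a mirrorlist entry; a config-fragment sketch, where the upstream mirror hostname, local path, and homeserver name are all placeholders:

```shell
# Sync the whole repo from an upstream mirror that allows rsync (run daily from cron).
# mirror.example.org and /srv/archmirror are placeholders.
rsync -rtlvH --delete-after --delay-updates --safe-links \
    rsync://mirror.example.org/archlinux/ /srv/archmirror/

# On each Arch machine, point /etc/pacman.d/mirrorlist at the home server:
# Server = http://homeserver.lan/archlinux/$repo/os/$arch
```

The local path just needs to be served over HTTP by any web server for pacman to use it.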


ollama works fine on my 9070 XT.
I tried gpt-oss:20b and it gives around 17 tokens per second, which is about as fast a reply as you can read.
Idk how it compares to the Nvidia equivalent tho


I tried it in the past and it felt too heavy for my use case. Also for some reason the sidebar menu doesn't show all the items at all times, but instead only shows the ones related to the branch you just went into.
Also it seems pretty dead updates-wise


mdBook is really nice if you don't mind the lack of dynamic editing in a web browser


Yes, it's called
export HISTSIZE=10000000
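Assuming bash, HISTSIZE usually wants its on-disk companions set too; these are standard bash variables, nothing specific to this thread:

```shell
export HISTSIZE=10000000       # commands kept in shell memory
export HISTFILESIZE=10000000   # lines kept in ~/.bash_history across sessions
export HISTCONTROL=ignoredups  # drop consecutive duplicate commands
```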