Just a PSA.
Sorry to link to Reddit, but not only is the dev sloppily using Claude to churn out something like 20k-line PRs, they are completely crashing out: banning people from the Discord (actually, I think they've wiped everything from Discord now) and accusing people who fork their code of theft.
It’s a bummer because the app was pretty good… thankfully Calibre-web and Kavita still exist.


And every time the use of LLMs for open source development comes up, we get the same tired spiel from people about how it's just a tool, with the implication that anyone who doesn't embrace it with joy in their heart is just a Luddite.
It seems to me that it's less a tool and more like intentionally infecting your project with cancer. Sure, it shows all the signs of rapid growth, but metastasis isn't sustainable or desirable. Plus, I have yet to encounter a strong advocate for LLMs who isn't a cunt.
I’ll argue that it is a tool, and object to automatic zealous hostility towards anyone using it, but that doesn’t mean criticisms of how that tool is being used aren’t valid. It seems like that is what people are focusing on here, and they definitely aren’t Luddites for doing so.
I think I can offer a good equivalent: firearms. They have utility, but there are people who make them a lifestyle choice, people who make them their whole personality, and plenty of people just desperate for an excuse to use one. I grew up with a couple of farmers in the extended family, so I would never argue guns should be entirely banned, but I am so glad I live somewhere with sane laws around gun ownership. It would be nice if we had similar consideration around regulating LLMs.
The danger to open source, as I see it, is that LLMs degrade the quality and ability of developers while increasing their throughput. I have never once heard someone complain that open source lacks quantity, but I hear a lot of people complaining about the quality.
I think that the problem, in both cases, is culture.
It's not that either of those things is inherently bad, or bad for people; it's that they're bad for people of this culture, this society. It's how the two intersect that is the problem.
It could be a tool that lifts up the worker or creative, but instead it’s a tool to devalue the creative and extract power and wealth.
It highlights that people with power get a different set of rules and laws than the rest of us, and they’re using that to further entrench and enrich themselves.
I will complain about quantity: in many areas where open source projects compete with closed-source commercial products, they have not achieved feature parity or a comparable level of polish, so quantity matters. So does, as someone else touched on, the quality-of-life side of writing code, like the ease of acquiring and synthesizing information. That doesn't mean it's necessarily a worthwhile tradeoff, but how much is really being sacrificed depends on what exactly is being done with an LLM. To me, one part of what's described here that clearly goes too far is using it to automate communication with other people contributing to the project; there's no way that is worth it.
As for the gun thing, I will support entirely banning LLM powered weapons intended to kill people, that’s an easy choice.
I still don't think quantity is lacking, and when quality is there, it's amazing how often open source becomes a de facto standard. How many video tools are just a shim over FFmpeg, for example?
Again, the problem I see is that LLMs are a seductive form of software cancer: it starts as a little help, and before you know it we have projects like Booklore. If open source can't be better, it will be subsumed in slop.
Not disagreeing about LLMs as a weapon. In a functional society, the person who pulls the trigger on any weapon is responsible for the consequences of that action. I wonder how eager the CEOs of these "AI" companies would be to weaponise their creations if they were held personally accountable, by a jury, for every injury caused by their product. Preferably with explicit laws stating they could not be indemnified or granted immunity.
I think it kinda depends on the context. If someone is just making a tool for themselves and slaps on MIT or GPLv3 because, who cares, someone else can have it, then sure. Who cares if it's trash when the stakes are scraping the ground and the user base is expected to be in the single digits.
But when you care about the reputation of your project, or when your project requires people's trust, then yeah, for sure, it's not appropriate to vibe-code/slop it.
I have ethical concerns about the realities of how this tech is used, mainly in what it’s doing to the economic and power dynamics in society. But I don’t have a problem with the tech itself. That said, I have to admit that it may not be realistic to separate the tech from its inevitable impact. Now I have become death, the destroyer of worlds, and all that.
I find an LLM is a great way to shortcut the googling it'd take for me to parse random error message #506 when I'm learning a new language, but that's about it. I'm also in no way writing software meant for mass consumption.
Ergo, it's a tool, a search engine replacement, that we wouldn't need if search hadn't gone to shit due to neglect and active internal sabotage.
Oh 100%.