One of the best pieces of self-hosted software ever to exist.

Edit: This is Immich, for the folks who don’t know.

    • sugar_in_your_tea@sh.itjust.works
      1 day ago

      No, that’s not fascist. Facial recognition software can be used for a variety of reasons, like unlocking a phone or laptop, gaining access to secure areas, or home automation stuff.

      It’s only fascist if used by a government to oppress minorities. The software itself cannot be fascist, but it can be used by fascists.
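
      To make that concrete, here’s a minimal sketch of the kind of neutral use I mean, assuming the open-source face_recognition Python package; the image paths and the unlock_front_door() hook are placeholders, not any real product’s API:

      ```python
      import face_recognition

      # Encode the homeowner's reference photo once at startup.
      owner_image = face_recognition.load_image_file("owner.jpg")
      owner_encoding = face_recognition.face_encodings(owner_image)[0]

      def person_at_door_is_owner(snapshot_path: str) -> bool:
          """Compare a doorbell-camera snapshot against the owner's face."""
          snapshot = face_recognition.load_image_file(snapshot_path)
          encodings = face_recognition.face_encodings(snapshot)
          if not encodings:
              return False  # no face found in the frame
          return bool(face_recognition.compare_faces([owner_encoding], encodings[0])[0])

      def unlock_front_door() -> None:
          # Stand-in for a smart-lock call (e.g. through a home automation hub).
          print("Door unlocked")

      if person_at_door_is_owner("doorbell_snapshot.jpg"):
          unlock_front_door()
      ```
      Nothing in that loop cares who is in power; it just matches one face against one reference photo.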

        • sugar_in_your_tea@sh.itjust.works
          24 hours ago

          The fault lies with the makers and users of the software. Software doesn’t have political opinions; it’s just software.

          It’s like saying Panzer tanks were fascist because they were built by the Nazis. Tanks cannot be fascist, they’re tanks. So despite being made and used by fascists, they’re not fascist, they’re tanks.

          That’s the same exact thing here. Facial recognition software can be used by fascists, but that doesn’t make the software itself fascist.

          • PeriodicallyPedantic@lemmy.ca
            15 hours ago

            The other person deleted their comment so I can’t really know what the argument was, but I would like to make a distinction:

            While tools cannot be political themselves, tools can lend themselves to specific political purposes.

            A tank cannot itself be fascist, but it can make fascism more viable. Surveillance software cannot be political, but it is easily abused by fascists to destroy political opposition.

            What matters are the harms and benefits. Is the harm caused by the tool justified by its benefits? Or are the primary use cases for the tool to prop up fascism?
            (I suspect that “authoritarianism” would be a better term to use here, but I’m continuing the theme of the thread)

            • sugar_in_your_tea@sh.itjust.works
              8 hours ago

              Their argument was that software can, in itself, be fascist, and that’s what we went around and around on. The example given was facial recognition software that can determine race (and later, country of origin).

              Essentially, I said exactly what you’re saying, while they argued the opposite. I wish I had quoted them, but I only directly addressed their claims, if you’ll take my word for it.

              I don’t want the government to have and use facial recognition software (their example) and extensive security camera systems (my example, such as Flock), not because those solutions are fascist in and of themselves, but because they can be used by fascists to accomplish their goals. Even if the current regime uses them purely for good (i.e. completely opt-in facial recognition, cameras inaccessible to police until there’s a warrant, with no passive collection), the next regime may not.

              • PeriodicallyPedantic@lemmy.ca
                4 hours ago

                The extension of the argument I’m making (and maybe them kinda?) is that it’s functionally the same as if the software were political.

                You can make software that nearly exclusively benefits a particular political belief or family of beliefs.

                So even if it’s not actually technically political, it can be functionally political, at which point the argument is splitting hairs.

                • sugar_in_your_tea@sh.itjust.works
                  4 hours ago

                  I think those are important hairs to split.

                  Let’s say there’s a camera system built after a direct public vote and rolled out by a political party everyone agrees defends democracy. The stated goal is catching red-light violations and speeders, and it’s a popular system. As part of its functionality it reads license plates, every read is verified by a human, and no footage is stored if there’s no violation.

                  Is that system fascist? Most would say no, and it exists in many states, like California and Washington.

                  Then the next election, a fascist is elected, and one of the first moves is to repurpose that system to track undesirables, and now it stores a ton of footage.

                  Is that system now fascist? It’s the exact same system as in the previous example; it’s just being used for fascist ends, such as tracking vehicles with certain plates (e.g. illegal immigrants, minorities, etc.). Nothing has changed in the capabilities or programming of the system; the only changes are when footage is captured, what people use it for, and how long it’s stored.

                  Yes, it’s theoretically possible to design a fascist system, such as an LLM that only gives fascist answers, but that’s an incredibly narrow definition.
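
                  To put the “nothing changed but configuration” point in concrete terms, here’s a toy sketch (all names invented, no real system implied) where the exact same code runs under two different policies:

                  ```python
                  from dataclasses import dataclass

                  @dataclass
                  class Policy:
                      record_passively: bool   # keep footage even with no violation?
                      human_review: bool       # must a person confirm each plate read?
                      retention_days: int      # how long stored footage is kept

                  # The voted-on red-light system described above.
                  red_light = Policy(record_passively=False, human_review=True,
                                     retention_days=30)

                  # The repurposed tracking system: identical code, different knobs.
                  dragnet = Policy(record_passively=True, human_review=False,
                                   retention_days=3650)

                  def queue_for_human_review(frame) -> None:
                      print("flagged for a human to confirm the violation")

                  def store_footage(frame, days: int) -> None:
                      print(f"footage stored, deleted after {days} days")

                  def handle_frame(frame, plate_hit: bool, policy: Policy) -> None:
                      """Process one camera frame under whichever policy is in force."""
                      if plate_hit and policy.human_review:
                          queue_for_human_review(frame)
                      if plate_hit or policy.record_passively:
                          store_footage(frame, policy.retention_days)

                  handle_frame(b"raw-frame-bytes", plate_hit=True, policy=red_light)
                  ```
                  Swapping red_light for dragnet changes nothing in handle_frame; the capabilities are identical, only the policy object differs.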

                  • PeriodicallyPedantic@lemmy.ca
                    23 minutes ago

                    Just because a product has a plausibly deniable use case doesn’t really mean that it’s not functionally political.

                    If someone creates a super invasive surveillance system and initially uses it for a seemingly benign purpose, that doesn’t mean the intention all along wasn’t more nefarious, especially if the system was practically irresistible to power structures and its use directly led to authoritarianism. Like giving someone their first hit for free.

                    In a case like that, I would discount the benign use as a red herring, and say that the software is functionally political.

            • sugar_in_your_tea@sh.itjust.works
              22 hours ago

              Again, it’s not the software itself that’s fascist; it’s what it’s being used for that’s fascist. Facial recognition for determining citizenship could absolutely be used for non-fascist purposes, like simplifying border crossings so they don’t require documentation (i.e. completely opt-in). Likewise, surveillance systems can be restricted so they’re only used once there’s an actual warrant (i.e. no passive recording), which can help in catching dangerous criminals.

              The technology itself isn’t fascist; it’s how it’s applied that’s fascist. The mass data collection is fascist, but the tools used to collect that data aren’t fascist, in the same way that guns and tanks aren’t fascist, even though they can certainly be used by fascists.
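
              As a rough sketch of what “inaccessible until there’s a warrant” could look like (the Warrant type and footage store here are hypothetical, not any vendor’s API):

              ```python
              from dataclasses import dataclass
              from typing import Optional

              @dataclass
              class Warrant:
                  case_number: str
                  camera_id: str

              # Stand-in for wherever footage actually lives.
              FOOTAGE_STORE = {"cam-42": b"...encoded video..."}

              def fetch_footage(camera_id: str, warrant: Optional[Warrant]) -> bytes:
                  """Release footage only when a warrant names this specific camera."""
                  if warrant is None or warrant.camera_id != camera_id:
                      raise PermissionError("a warrant naming this camera is required")
                  return FOOTAGE_STORE[camera_id]

              # Denied: no warrant presented.
              try:
                  fetch_footage("cam-42", None)
              except PermissionError as err:
                  print(err)

              # Allowed: a warrant that names the camera.
              fetch_footage("cam-42", Warrant("2024-CR-0001", "cam-42"))
              ```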

                • sugar_in_your_tea@sh.itjust.works
                  19 hours ago

                  If anyone is refusing to engage, it’s you. You provided no argument for your position, whereas I’ve explained mine as best I can, in detail, with examples of similar things. Me not agreeing with you isn’t “refusing to engage”; it’s a good-faith debate.

                  If there’s some point you’ve made that I’ve failed to address, I apologize; I tried to be thorough so as to not waste any time going back and forth.