Software can’t be fascist, it’s just software. The makers or users can be fascist, though. If that statement were true, Lemmy would be tankie.
deleted by creator
No, that’s not fascist. Facial recognition software can be used for a variety of purposes, like unlocking a phone or laptop, gaining access to secure areas, or home automation stuff.
It’s only fascist if used by a government to oppress minorities. The software itself cannot be fascist, but it can be used by fascists.
deleted by creator
The fault lies with the makers and users of the software. Software doesn’t have political opinions; it’s just software.
It’s like saying Panzer tanks were fascist because they were built by the Nazis. Tanks cannot be fascist; they’re tanks. So despite being made and used by fascists, they aren’t fascist, they’re tanks.
It’s exactly the same thing here. Facial recognition software can be used by fascists, but that doesn’t make the software itself fascist.
The other person deleted their comment, so I can’t really know what the argument was, but I would like to make a distinction:
While tools cannot be political themselves, tools can lend themselves to specific political purposes.
A tank cannot itself be fascist, but it can make fascism more viable. Surveillance software cannot be political, but it is easily abused by fascists to destroy political opposition.
What matters is the balance of harm and benefit. Is the harm caused by the tool justified by its benefits? Or are the primary use cases for the tool to prop up fascism?
(I suspect that “authoritarianism” would be a better term to use here, but I’m continuing the theme of the thread)
Their argument was that software can, in itself, be fascist, and that’s what we went around and around on. The example given was facial recognition software that can determine race (and later, country of origin).
Essentially, I said exactly what you’re saying, while they argued the opposite. I wish I had quoted them, but I only directly addressed their claims, if you’ll take my word for it.
I don’t want the government to have and use facial recognition software (their example) or extensive security camera systems (my example, such as Flock), not because those solutions are fascist in and of themselves, but because they can be used by fascists to accomplish their goals. Even if the current regime uses them purely for good (i.e. completely opt-in facial recognition, cameras inaccessible to police until there’s a warrant, with no passive collection), the next regime may not.
The extension of the argument I’m making (and maybe theirs, kinda?) is that it’s functionally the same as if the software were political.
You can make software that nearly exclusively benefits a particular political belief or family of beliefs.
So even if it’s not technically political, it can be functionally political, at which point the argument is splitting hairs.
deleted by creator
Again, it’s not the software itself that’s fascist, it’s what it’s being used for that’s fascist. Facial recognition for determining citizenship could absolutely be used for non-fascist purposes, like simplifying border crossings so they don’t require documentation (i.e. completely opt-in). Likewise, surveillance systems can be restricted so they aren’t used until there’s an actual warrant (i.e. no passive recording), which can help in catching dangerous criminals.
The technology itself isn’t fascist; it’s how it’s applied that’s fascist. The mass data collection is fascist, but the tools used to collect that data aren’t, in the same way that guns and tanks aren’t fascist even though they can certainly be used by fascists.
deleted by creator
deleted by creator
Hahah you mean like Lemmy itself?
deleted by creator