Fawkes, a tool developed at the University of Chicago, can be used to modify your images at the pixel level so that they look the same to humans but can no longer be used for automated image (i.e. person) recognition.
Would people be interested in having this integrated into Mastodon, so it could be applied to images they upload?
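For anyone who wants to try it locally first, here is a minimal sketch of cloaking a folder of photos with the `fawkes` command-line tool before uploading. The directory name is made up, and the flags and output-file suffix are based on my reading of the project README, so check `fawkes --help` against your installed version.

```python
# Minimal sketch: cloak every image in a folder before uploading.
# Assumes the fawkes package is installed (pip install fawkes);
# the exact CLI flags may differ between versions.
import subprocess
from pathlib import Path

IMAGE_DIR = Path("./imgs")  # hypothetical folder of photos to protect

# Run Fawkes over the directory; "--mode low" trades protection
# strength for speed (the README also documents stronger modes).
subprocess.run(
    ["fawkes", "-d", str(IMAGE_DIR), "--mode", "low"],
    check=True,
)

# Fawkes writes cloaked copies alongside the originals with a
# "_cloaked" suffix (per the README; verify locally) -- upload
# those instead of the raw images.
for cloaked in IMAGE_DIR.glob("*_cloaked*"):
    print("ready to upload:", cloaked)
```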
@FrankSonntag yes, yes I would.
@FrankSonntag unfortunately they're looking into patenting it :(
@thufie Thank you. I hadn't picked up on that. Not sure how much that matters given that they already published under a BSD license. But I will contact them and see what's what.
I would be
I can't help but be sceptical towards efforts like these. Not the intentions, but the efficacy. I'm also worried that this would give people a false sense of security, and that this would make them sloppy with regard to pictures of themselves and, worse, pictures of others.
If this were on by default, perhaps it wouldn't give rise to that false sense of security. But would it be ethical to manipulate people's uploads like this without their knowledge?
Just some thoughts.
If just a few people do this to protect themselves, it would not help society, and they would also form a clearly defined group for surveillance capitalism & the surveillance state.
When we have not only Amazon Ring doorbells but also autonomous cars with 20 cameras pointing in all directions, not having identifiable pictures will be necessary to maintain basic human rights.
@FrankSonntag `fawkes` might work for circumventing today's face recognition, but in the end it's just an arms race, like any circumvention technique. There's also a natural limit to this arms race: at some point face recognition tech may match or even surpass human vision. So I actually don't think it's a good idea to deploy this at scale. I'd only use it selectively in high-risk scenarios, and even then it's likely to protect against face recognition software only temporarily.