If social media required ID, you could keep the freedom to use these tools for anything legal, while swiftly detecting and punishing illegal usage. IMHO, you can't have your cake and eat it too: either you want privacy and freedom but accept that people will use these things unlawfully and never get caught, or you accept being identified and having perpetrators swiftly dealt with.
The same is true outside of the Internet. With cameras and face recognition everywhere, criminals can be swiftly dealt with. At least, that's what people tend to believe.
Obligatory Benn Jordan link (YouTube - ~11mins)
This Flock Camera Leak is like Netflix for Stalkers
https://youtube.com/watch?v=vU1-uiUlHTo
This is a really cool idea, nice work!
Is it any more effective than (say) messing with its recognition so that any attempt to deepfake just ends up as garbled nonsense?
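For anyone curious what "messing with its recognition" could look like in practice: roughly, an adversarial perturbation (in the spirit of Glaze/Fawkes-style cloaking) that nudges the image away from what the editing model expects, so edits come out as noise. A minimal PGD-style sketch; the encoder target, loss, and parameters here are all assumptions for illustration, not this project's actual method:

    import torch

    def adversarial_cloak(image, encoder, steps=50, eps=8 / 255, alpha=1 / 255):
        # `image`: float tensor in [0, 1]; `encoder`: assumed to be the
        # editing model's image encoder (the real attack surface depends
        # on the target model).
        original = encoder(image).detach()
        delta = torch.zeros_like(image, requires_grad=True)
        for _ in range(steps):
            # Gradient ascent on embedding distance: push the perturbed
            # image's embedding away from the clean image's embedding.
            dist = torch.nn.functional.mse_loss(encoder(image + delta), original)
            dist.backward()
            with torch.no_grad():
                delta += alpha * delta.grad.sign()  # ascend: increase distance
                delta.clamp_(-eps, eps)             # keep the change imperceptible
                delta.grad.zero_()
        return (image + delta).detach().clamp(0, 1)

The L-inf budget (eps) is what keeps the cloaked image looking identical to a human while looking alien to the model.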
Can't help wondering if the censor models get tweaked more frequently and aggressively (also presumably easier to low-pass on a detector than on a generator, since lossiness doesn't impact the final image).
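To spell out that asymmetry: a detector can blur its own copy of the image to strip the high-frequency adversarial noise and then discard the blurred pixels, whereas a generator has to output the final image and can't afford that lossiness. A hedged sketch, where the classifier and radius are placeholders:

    from PIL import Image, ImageFilter

    def classify_robustly(path, nsfw_classifier):
        # `nsfw_classifier` is a stand-in for whatever detector is in use.
        img = Image.open(path).convert("RGB")
        # Low-pass only the detector's copy: the blur strips high-frequency
        # adversarial noise, and since only the verdict is kept (not the
        # filtered pixels), the lossiness never reaches the image users see.
        smoothed = img.filter(ImageFilter.GaussianBlur(radius=2))
        return nsfw_classifier(smoothed)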
This might prevent the image from being used in edits, but the downside is that it risks being flagged as NSFW when the unmodified image is used in a benign way. This could lead to obvious consequences.
deepfake edits are a feature, not a bug
it's the same as banning knives because they can be used to hurt people. we shouldn't ban tools.
with that analogy, OP's solution is akin to banning the use of knives to harm people, as opposed to banning the knife itself
If I understood correctly, he's unsharpening knives.
Or making knives that turn into overcooked noodles if you try to use them on anything except vegetables and acceptable meats
and who decides if I want to use a knife to cut mushrooms instead? See where I am going? There are (or could exist) legit cases where you need to use a tool in a non-standard way, one the model authors didn't anticipate.
But we do ban tools sometimes: you can't bring a knife to a concert, for good reason.
> we shouldn't ban tools
When I see the old BuT FrEe SpEeCH argument repurposed to impinge on civil rights, I start warming to the idea of banning tools.
Alternately "Chemical weapons don't kill people, people with chemical weapons kill people"
Not really, it's like banning chemistry sets because they may be used to create chemical weapons.
Not sure the comparison works when it does all the work for you
I've had very little success mumbling "you are an expert chemist..." to test tubes and raw materials.
In this case, image generation and editing AI is a tool we managed just fine without until three years ago, and one whose economic value remains extremely questionable despite it being a remarkable improvement in the state of the art.
As a propaganda tool it seems quite effective, but for that it's gone from "woo free-speech" to "oh no epistemic collapse".