I feel queasy every time someone uses an acronym as a subject. More so with this thing.
Regardless of the origin of manipulated pictures, I have noticed that people still have not renounced their confirmation bias: when they see a picture of something they wish to be true/false, they automatically believe/disbelieve the picture. It's amazing how fast the rationalizations kick in.
I'm far from cured of this problem myself. But I am astonished every day that so many people seem to operate under the delusion that they are not manipulable but other people are.
There is really a quick and easy fix for this - if the people call for it. Everything that is Ai generated (a small "i" for intelligence) should carry a watermark so we can tell the difference. I imagine that not having one is going to cause all sorts of problems with our interpretation of pictures; we're just not going to be able to tell whether some of them are real or not. Sometimes I can tell straight away, but in the pictures shown, those details are so subtle that they would still not help us determine that they are Ai generated. Why not have a watermark on them? Corps do that to stop copyright "infringement" (copying is a right), so it can be done on Ai-generated work as well. That is, if and only if the people demand it.
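To make the watermark idea concrete: here is a minimal, hypothetical sketch (not any real standard's scheme) that hides a short "Ai generated" tag in the least significant bits of raw pixel bytes, so the mark is invisible to the eye but checkable by software. All names and the tag format are assumptions for illustration only.

```python
def embed_tag(pixels: bytes, tag: bytes) -> bytes:
    """Hide a length-prefixed tag in the least significant bit of each pixel byte."""
    payload = len(tag).to_bytes(2, "big") + tag
    # One bit of payload per pixel byte, most significant bit first.
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("image too small to hold the tag")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return bytes(out)

def extract_tag(pixels: bytes) -> bytes:
    """Read back a tag embedded by embed_tag."""
    def read_bytes(start: int, count: int) -> bytes:
        value = 0
        for i in range(count * 8):
            value = (value << 1) | (pixels[start + i] & 1)
        return value.to_bytes(count, "big")
    length = int.from_bytes(read_bytes(0, 2), "big")  # 2-byte length prefix
    return read_bytes(16, length)                     # tag starts after 16 length bits

# A fake 256-byte "image" is enough to carry a 14-byte payload.
stego = embed_tag(bytes(256), b"Ai-GENERATED")
print(extract_tag(stego))  # b'Ai-GENERATED'
```

Worth noting: an LSB mark like this is fragile (recompressing or resizing the image destroys it), which is one reason real provenance proposals lean toward signed metadata rather than pixel tricks.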
Understanding that confirmation bias exists is 1/2 the battle!