theoldgreybeard

2 days ago
The interesting tidbit here is SynthID. While a good first step, it doesn't solve the problem of AI-generated content that carries no watermark at all. So we can prove that something WITH the ID is AI-generated, but we can't prove that something without one ISN'T AI-generated.

Like it would be nice if all photos and videos generated by the big players carried some kind of standardized identifier - but you're still left with the bajillion other "grey market" models that won't give a damn about that.
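
To make the asymmetry concrete, here's a toy sketch (detect_watermark here is a made-up stub, not SynthID's actual detector): a hit tells you something, a miss tells you nothing.

```python
def detect_watermark(image_bytes: bytes) -> bool:
    # Stand-in for a real detector; a toy marker check, for illustration only.
    return b"SYNTHID" in image_bytes

def classify(image_bytes: bytes) -> str:
    if detect_watermark(image_bytes):
        # A hit is strong evidence the image came from a cooperating generator.
        return "AI-generated (watermark found)"
    # A miss proves nothing: the image may be real, or it may come from a
    # grey-market model that never embeds a mark (or had the mark stripped).
    return "unknown"

print(classify(b"\x89PNG...SYNTHID..."))  # AI-generated (watermark found)
print(classify(b"\x89PNG..."))            # unknown
```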

akersten

2 days ago
Some days it feels like I'm the only hacker left who doesn't want government-mandated watermarking in creative tools. If politicians 20 years ago had been this overreactive, they'd have demanded Photoshop leave a trace on anything it edited. The amount of moral panic is off the charts. It's still a computer, and we still shouldn't trust everything we see. The fundamentals haven't changed.

darkwater

2 days ago
> It's still a computer, and we still shouldn't trust everything we see. The fundamentals haven't changed.

I think that by now it should be crystal clear to everyone that the sheer scale a new technology enables for $nefarious_intent matters a lot.

Knives (under a certain size) are not regulated. Guns are regulated in most countries. Atomic bombs are definitely regulated. They can all kill people if used badly, though.

When a photo was faked/composed with old tech, it was relatively easy to spot. With Photoshop, it became harder to spot, but at the same time it wasn't easy to mass-produce altered images. Large models are changing the rules here as well.

Politicians absolutely were doing this 20-30 years ago. Plenty of folks here are old enough to remember debates on Slashdot around the Communications Decency Act, Child Online Protection Act, Children's Online Privacy Protection Act, Children's Internet Protection Act, et al.

https://en.wikipedia.org/wiki/Communications_Decency_Act

BeetleB

2 days ago
Easy to say until it impacts you in a bad way:

https://www.nbcnews.com/tech/tech-news/ai-generated-evidence...

> “My wife and I have been together for over 30 years, and she has my voice everywhere,” Schlegel said. “She could easily clone my voice on free or inexpensive software to create a threatening message that sounds like it’s from me and walk into any courthouse around the country with that recording.”

> “The judge will sign that restraining order. They will sign every single time,” said Schlegel, referring to the hypothetical recording. “So you lose your cat, dog, guns, house, you lose everything.”

At the moment, the only alternative is for courts to simply never accept photo/video/audio as evidence. I know if I were a juror I wouldn't.

At the same time, yeah, watermarks won't work. Sure, Google can add a watermark/fingerprint that is impossible to remove, but there will be tools that simply never add such watermarks/fingerprints in the first place.

losvedir

2 days ago
I'm sure Apple will roll something out in the coming years. Now that just about anyone can easily AI themselves into a picture in front of the Eiffel Tower, they'll want a feature that lets their users prove that they _really_ took that photo in front of the Eiffel Tower (since for a lot of people, sharing that you're on a Paris vacation is the point, more than the particular photo).

I bet it will be called "Real Photos" or something like that, and the pictures will be signed by the camera hardware. Then iMessage will put a special border around it or something, so that when people share the photos with other Apple users they can prove that it was a real photo taken with their phone's camera.
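
A rough sketch of what hardware-signed captures could look like, assuming a per-device Ed25519 key; the names and flow here are invented, not Apple's actual design, and real provenance schemes are more involved than this:

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical per-device key pair, provisioned in the camera's secure hardware.
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key()

def sign_capture(photo: bytes) -> bytes:
    """Camera signs a digest of the raw capture before it leaves the device."""
    return device_key.sign(hashlib.sha256(photo).digest())

def verify_capture(photo: bytes, signature: bytes) -> bool:
    """A messaging client checks the signature against the device's public key."""
    try:
        device_pub.verify(signature, hashlib.sha256(photo).digest())
        return True
    except InvalidSignature:
        return False

photo = b"...raw sensor bytes..."
sig = sign_capture(photo)
print(verify_capture(photo, sig))          # True: unmodified capture
print(verify_capture(photo + b"!", sig))   # False: any edit breaks the signature
```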

swatcoder

2 days ago
The incentive for commercial providers to apply watermarks is so that they can safely route and classify generated content when it gets piped back in as training or reference data from the wild. That some users want it is mostly secondary, although it is something they can earn some social credit for by advertising.

You're right that there will exist generated content without these watermarks, but you can bet that all the commercial providers burning $$$$ on state-of-the-art models will gradually coalesce around some means of widespread, by-default/non-optional watermarking for content they let the public generate, so that they can all avoid drowning in their own filth.
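
A minimal sketch of that routing step, assuming some detector a provider would run over crawled data (all names here are invented):

```python
def detect_watermark(sample: bytes) -> bool:
    # Stand-in for a provider's own detector; toy marker for illustration.
    return sample.startswith(b"WM:")

crawled = [b"WM:generated clip", b"organic photo", b"WM:generated image"]

# Route watermarked items away from the training corpus so the model
# doesn't feed on its own (or another provider's) generations.
training_set  = [s for s in crawled if not detect_watermark(s)]
synthetic_bin = [s for s in crawled if detect_watermark(s)]

print(len(training_set), len(synthetic_bin))  # 1 2
```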

slashdev

2 days ago
If there were a standardized identifier, there would be software dedicated to just removing it.

I don't see how it gets you out of the cat-and-mouse game.