Google's DeepMind Trials Invisible Watermark

Publish date: 2024-06-29

Google’s artificial intelligence arm, DeepMind, is trialling SynthID, a digital watermarking tool designed to identify AI-generated images, in an effort to combat disinformation.

The tool subtly modifies individual pixels in an image, embedding changes that are invisible to the human eye yet detectable by computers. While DeepMind concedes it is not “foolproof against extreme image manipulation,” it marks a step toward reliably distinguishing real images from AI-generated ones.
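DeepMind has not published SynthID’s internal workings, but the general idea of pixel-level watermarking can be illustrated with a deliberately simple sketch. The toy scheme below flips the least significant bit of a secret set of pixels, a change far too small for the eye to notice yet trivial for software holding the secret to verify; the function names, shared seed, and threshold are all illustrative assumptions, not DeepMind’s method.

```python
# Toy pixel-level watermark: NOT SynthID's (unpublished, learned) scheme.
# We set the least significant bit of secretly chosen pixels to 1 -- a
# change of at most 1/255 per channel, invisible to the eye but easy
# for a detector that knows the secret seed to spot.
import numpy as np

SEED = 42  # illustrative shared secret selecting which pixels carry the mark

def embed_mark(image: np.ndarray, bits: int = 256) -> np.ndarray:
    """Return a copy of `image` with the LSB of `bits` chosen pixels set to 1."""
    marked = image.copy()
    rng = np.random.default_rng(SEED)
    idx = rng.choice(marked.size, size=bits, replace=False)
    flat = marked.reshape(-1)
    flat[idx] |= 1  # shifts each selected value by at most 1
    return marked

def detect_mark(image: np.ndarray, bits: int = 256, threshold: float = 0.9) -> bool:
    """Report whether the secretly chosen pixels' LSBs are overwhelmingly 1."""
    rng = np.random.default_rng(SEED)  # same seed => same pixel positions
    idx = rng.choice(image.size, size=bits, replace=False)
    fraction = np.mean(image.reshape(-1)[idx] & 1)
    return fraction >= threshold  # unmarked images hover around 0.5

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
    print(detect_mark(img))              # almost certainly False
    print(detect_mark(embed_mark(img)))  # True
```

A scheme this naive would not survive the edits SynthID is reportedly designed to tolerate: lossy compression or a brightness change would wipe the bits out, which is why production watermarks rely on learned, redundantly embedded signals rather than fixed bit positions.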

The growing difficulty of telling AI-generated images from real ones has been underscored by the popularity of AI image generators such as Midjourney, which has more than 14.5 million users.

As such tools become mainstream, concerns about copyright and ownership arise.

The watermark will initially apply to images created with Imagen, Google’s own image generator. The technique differs from traditional watermarks, which are often visible logos or text that can easily be cropped or edited out.

The system modifies images subtly enough that human observers cannot detect the changes, which also makes the watermark itself difficult to alter or remove. The experimental launch is part of Google’s effort to gauge the system’s robustness and effectiveness.

In July 2023, Google and six other major AI companies voluntarily committed to the safe and secure development and use of AI, including implementing watermarks to help identify AI-generated content.

Other companies, including Microsoft and Amazon, have also pledged to watermark AI-generated content, but industry-wide coordination and standardization would go further in addressing disinformation and content attribution.
