Amid the rising number of deepfakes surfacing on the internet, OpenAI, the company behind ChatGPT, has released a new tool to help curb the spread of fake photos and videos. On Tuesday, the company announced an image detection classifier that predicts whether an image was generated with its DALL-E 3 text-to-image model.
OpenAI’s Image Classifier: What It Is And How It Works
“Starting today, we are opening applications for access to OpenAI’s image detection classifier to our first group of testers – including research labs and research-oriented journalism nonprofits,” the company writes in a blog post published on May 7, 2024. The tool predicts the likelihood of an image being generated by OpenAI’s DALL-E 3.
During the testing phase, OpenAI aims to enable independent research that assesses the classifier’s effectiveness, analyzes its real-world application, surfaces relevant considerations for such use, and explores the characteristics of AI-generated content. Although initial results show high accuracy in distinguishing non-AI-generated images from those created by DALL-E 3, OpenAI clarifies that the classifier may underperform in some instances.
How Accurate Is The Deepfake Detector Tool?
“It [the image classifier tool] correctly identified ~98% of DALL·E 3 images and less than ~0.5% of non-AI generated images were incorrectly tagged as being from DALL·E 3,” the company says. The classifier can also handle common modifications, such as cropping, compression, and changes to an image’s saturation. Other modifications, however, can reduce its performance: for instance, changing a photo’s hue or adding moderate amounts of Gaussian noise lowers its effectiveness.
Apart from announcing the new tool, OpenAI also confirmed that it adds metadata to photos and videos created with DALL-E 3 and Sora. Although this metadata can be removed, doing so is difficult for most users. The image classifier will help people distinguish AI-generated images from originals, but for now it appears to detect only images generated with DALL-E 3.
OpenAI also announced that it is joining the Steering Committee of C2PA, the Coalition for Content Provenance and Authenticity. It is a widely used standard for digital content certification, developed and adopted by several software companies, camera manufacturers, and online platforms. “We look forward to contributing to the development of the standard, and we regard it as an important aspect of our approach,” says the company in its blog post.