According to VentureBeat, Google has made the most recent version of Imagen 3, its AI text-to-image generator, available to users in the United States. The tool, which anyone can access through Google's AI Test Kitchen, is meant to produce images with "better detail, richer lighting, and fewer distracting artifacts" than Google's earlier models.
Although Google first revealed the improved Imagen 3 tool at I/O in May, it appears the feature has only been widely accessible through the Vertex AI platform for a few days. Google released a research paper on Imagen 3 on Tuesday, and some Reddit users began experimenting with the tool last week.
Google claimed, “Our best text-to-image model is Imagen 3. Compared to our previous models, it generates an incredible level of detail, producing photorealistic, lifelike images with far fewer distracting visual artifacts.”
Like other AI image generators, Imagen 3 produces detailed images based on your request. You can also edit an image by highlighting a specific area and describing the changes you would like to make.
Some restrictions appear to be in place: the tool refuses to produce images of weapons or of well-known people such as Taylor Swift. And although it won't generate copyrighted characters by name, you can still circumvent this by describing the character you wish to create.
Even with these relatively permissive guardrails, Imagen 3 is still very different from Grok, the AI image generator on Elon Musk's X platform. Grok has been used to create a wide range of bizarre content, including images of drugs, violence, and well-known people engaged in dubious behavior.
Google's AI products have had significant problems of their own, however. Earlier this year, after users discovered that its Gemini AI chatbot was producing historically inaccurate images, the company stopped allowing users to generate images with it.