So there has been a lot of talk about Instagram’s new ‘Made with AI’ tag, which is already being rebranded as ‘AI Info’. I actually don’t think it is a bad idea in principle… viewers deserve some transparency, especially if the person posting the image isn’t offering full disclosure. But if you are going to implement a tool, it needs to work…
I did a little test with two images, both completely AI generated using Photoshop’s Generative Fill feature on a blank canvas. The only thing I did differently was to export the second image (unicorn) and strip out all of its metadata. My expectation was that the first image (raindrops) would be labelled AI and the second image would not be (despite it clearly being AI generated). I mean, how many unicorns do you see running on the beach at sunset?
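If you want to repeat the stripping step yourself, here is a minimal sketch of one way to do it in Python with Pillow. It isn’t necessarily the exact workflow I used, and the filenames are just placeholders: copying only the pixel data into a brand-new image and re-saving drops EXIF, XMP and any embedded Content Credentials that travelled with the original export.

```python
from PIL import Image

# One way to strip metadata: copy only the pixel data into a fresh image.
# A brand-new Image object carries no EXIF, XMP or C2PA/JUMBF segments,
# so nothing from the original export survives the re-save.
# "unicorn.jpg" and "unicorn_stripped.jpg" are placeholder filenames.
src = Image.open("unicorn.jpg")
clean = Image.new(src.mode, src.size)
clean.paste(src)
clean.save("unicorn_stripped.jpg", quality=95)
```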
What I actually got was that neither image was labelled AI. Now, Instagram’s own help centre states that not all AI content will be labelled, but upon exporting from Photoshop it told me ‘A Content Credential will be automatically applied for Adobe Firefly generative AI use transparency.’ So, given I did nothing to strip any metadata from the first image, I was expecting it to be labelled ‘AI Info’, but it wasn’t. I can’t find anything to suggest the label is not applied immediately, unless there is some delay in the scanning of images…
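Before blaming Instagram entirely, it is worth checking that the Content Credential actually made it into the exported file. Here is a rough sanity check, a heuristic rather than a proper C2PA verifier: Adobe’s Content Credentials are stored as a C2PA manifest in JUMBF boxes, so the byte patterns ‘c2pa’ and ‘jumb’ normally show up somewhere in the raw file data.

```python
import sys

# Crude heuristic: Adobe's Content Credentials embed a C2PA manifest in
# JUMBF boxes, whose labels ("c2pa.claim", "jumb", ...) appear as plain
# byte strings in the file. Finding them suggests the manifest survived
# the export; it does not prove the manifest is valid or complete.
MARKERS = (b"c2pa", b"jumb")

def has_credential_markers(path: str) -> bool:
    with open(path, "rb") as f:
        data = f.read()
    return any(marker in data for marker in MARKERS)

if __name__ == "__main__":
    for path in sys.argv[1:]:
        status = "credential markers found" if has_credential_markers(path) else "no credential markers"
        print(f"{path}: {status}")
```

Running this over both exports would at least tell you whether the raindrops file still carries the manifest Photoshop said it was applying, or whether the credential was lost before Instagram ever saw it.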
So, in the end, does it matter? People (myself included) have manipulated images for years with editing software, and those edits have never required a ‘label’, so should AI images or art be treated any differently?