DALL·E 2 is now an open API

Go forth and iterate

Here's your daily briefing:

  • Bookmark this thread of AI tools:

  • "Developers can now use DALL·E directly in their own apps or products through their newly-available API:

  • New AI-powered photo editor just dropped.

  • We'll be following along with the conversation at EmTech 2022. The ethics of responsible AI use is our jam, and the more people yapping about it, the better!
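
The API in question is OpenAI's image generation endpoint. Here's a minimal sketch of what "using DALL·E in your own app" might look like, assuming the openai Python package from around the API's launch (newer versions of the library expose the same endpoint under different names) and an OPENAI_API_KEY environment variable; the prompt and parameter values below are purely illustrative.

```python
import os
import openai

# Authenticate with an API key from the environment (assumed to be set).
openai.api_key = os.environ["OPENAI_API_KEY"]

# Ask the DALL·E endpoint for a single generated image.
response = openai.Image.create(
    prompt="a watercolor illustration of a lighthouse at dawn",  # illustrative prompt
    n=1,                # number of images to generate
    size="1024x1024",   # supported sizes include 256x256, 512x512, 1024x1024
)

# The response contains hosted URLs for the generated image(s).
print(response["data"][0]["url"])
```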

We just read this essay by Andy Baio about a viral Reddit post that spurred a contentious debate about one of the biggest ethical issues raised by generative AI tools: the unsanctioned use of an artist's work to train a generative model. We thought we'd share our take.

Hollie Mengert is a Los Angeles-based illustrator whose work has been commissioned by the likes of Disney and Penguin Random House. Her style is unique and distinctive and, she thought, just that: her style.

Until Redditor MysteryInc152 shared this viral Reddit post: 

In the post, MysteryInc152 shares the DreamBooth model he trained on Mengert's work. 

Reactions to the post were generally positive; it was shared in the r/StableDiffusion subreddit, where the crowd is understandably congenial to SD outputs.

But then someone asked a question, which for a time was the most upvoted comment in the thread:

“Whether it’s legal or not, how do you think this artist feels now that thousands of people can now copy her style of works almost exactly?”

Baio thought this was a great question, so he reached out to Mengert and asked her.

“My initial reaction was that it felt invasive that my name was on this tool, I didn’t know anything about it and wasn’t asked about it,” she said. “If I had been asked if they could do this, I wouldn’t have said yes.”

Hollie Mengert

So right off the bat, you (meaning both MysteryInc152 and all of us who would use these models) are confronted with an inconvenient truth (inconvenient, at least, depending on your opinion):

You're using someone's creative work in a way that they do not approve of and wouldn't allow you to if they had any say in the matter.

“For me, personally, it feels like someone’s taking work that I’ve done, you know, things that I’ve learned — I’ve been a working artist since I graduated art school in 2011 — and is using it to create art that I didn’t consent to and didn’t give permission for... I think the biggest thing for me is just that my name is attached to it. Because it’s one thing to be like, this is a stylized image creator. Then if people make something weird with it, something that doesn’t look like me, then I have some distance from it. But to have my name on it is ultimately very uncomfortable and invasive for me.”

Hollie Mengert

But that's the rub, isn't it? If they had any say in the matter.

Like Napster in the early aughts, generative AI is forcing all of us to make explicit our feelings about artists, their work, and that work's value. Many of us will pay lip service to respecting artists' wishes when it comes to their work, but how many of us will actually respect those wishes?

Because now that these tools are available and the choice is there...how many will resist the temptation to spin up variations of artwork in the style of their favorite artist, even if that artist is opposed?

Reflecting on his interview with the model's creator, Baio comments on the fatalism often found in discussions about AI and art and ethics. It's often not so much about what is "right" or "wrong" as about what is happening:

 “The technology is here, like we’ve seen countless times throughout history.”

Ogbogu Kalu (aka MysteryInc152)

But worth considering, as we move forward into this increasingly generative world, is that age-old heuristic: the golden rule.

“DreamBooth, like most generative AI, has incredible creative potential, as well as incredible potential for harm. Missing in most of these conversations is any discussion of consent: are you treating people the way you would want to be treated?”

Andy Baio

Above all, and as with so many things in our society these days, we believe that the extreme positions, for or against, obscure the truly interesting questions and the moral nuance these tools force us to confront.

To that end, we thought we'd leave you with this, from the author himself: