Interviews and Conversations

Writers and Artists Need a Way to Label AI Use: Here’s What That Could Look Like

Photo by cottonbro studio

Today’s post is by poet and filmmaker Dave Malone, AI-assisted by Claude AI (editing and drafting assistance) and edited by PGB (a human editor).


I was curious to see if Claude AI could write poetry as good as my own. In September, I ran an experiment on TikTok. I shared two poems titled “Autumn Witness”—one written by me, one by Claude AI after I shared my work and asked him to write in my voice and style. I asked my followers to guess which poem was mine.

Seventy-six percent got it wrong. They chose Claude’s poem.

The comments revealed something many might find unsettling. “I resonated with #1 more, more positive, more spiritual,” wrote one follower about Claude’s poem.

Another declared with total confidence: “#1 is you and better, far better.” Others proclaimed the opposite with equal assurance: “#1 is Claude using common phrases. #2 matching helicopter leaves and dervishes is a human mind connecting disparate similarities.” And still others admitted the difficult truth: “I am annoyed that this is as hard as it is.”

A week later, I did a similar experiment at Missouri State-West Plains’ Ozarks Symposium. Fifty-nine percent thought Claude wrote my poem.

These experiments were hatched out of simple curiosity but led me to consider more complex issues. If AI can create art as compelling as human work, then audiences deserve transparency, and artists need to be transparent about their AI use.

I’m a poet and screenwriter from the Missouri Ozarks, and I’ve been AI-curious since the beginning. For me, Claude started out as a research assistant, and I called him “an amped-up Google machine.” Things have changed. I now use Claude almost daily in my writing practice, and he has become a valued sounding board as well as an editor for my poetry, screenplays, film projects, book submissions, and this article you are reading now. I have moved from AI-curious to AI-positive, and I know transparency matters.

The creative industry doesn’t have agreed-upon standards for transparency when it comes to AI use. Artists don’t know how to disclose usage, and subsequently audiences don’t know if they’re getting AI or human work. Some creators are upfront about their AI use while others hide it. Many are somewhere in the murky middle.

Right now, without clear standards, artists don’t know what’s expected of them, and audiences don’t know what they’re getting. Given this problem, I created a solution.

Two-category solution

The AACC (AI Attribution and Creative Content) is an open-source transparency framework with two simple categories.

AI-Assisted means the creator originated and drove the work, with AI contributing along the way.

AI-Generated means AI created the primary content from the creator’s prompts and direction, and the creator’s role was conceptual and editorial. 

That’s it. Two categories that cover the spectrum of how artists work with AI.

Graphic created by the author, titled AI ATTRIBUTION AND CREATIVE CONTENT (AACC): A transparency framework for creators. (This framework is for educational and discussion purposes, not legal purposes.)

Box 1: AI-ASSISTED
  • The creator originates and drives the work
  • AI contributes through generation, modification, or enhancement at any stage
  • The creator makes all major creative decisions and is responsible for the final work
Attribution: By [Creator Name], AI-assisted by [AI Name]
Examples — Writing: “Mercury.” By Sarah Chen, AI-assisted by Apertus (word choice and line break suggestions). Film: Dir. by Chris Taylor, AI-assisted by Runway ML (visual effects and scene generation). Music: “Mars.” By Luna Rivers, AI-assisted by Wondera (harmonic progressions and arrangement). Visual Art: Jupiter. By Jamila Banks, AI-assisted by Midjourney (background textures and composition elements).

Box 2: AI-GENERATED
  • AI creates the primary content from the creator’s prompts and direction
  • The creator’s role is conceptual, curatorial, and editorial
Attribution: AI-generated by [AI Name], concept by [Creator Name]
Examples — Writing: “Saturn.” AI-generated by Apertus, concept by Jordan Lee (Lee provided plot outline and characters; AI wrote 5,000-word short story draft; Lee edited for voice and pacing). Film: Uranus. AI-generated by Runway ML, concept by Sam Rivera (Rivera provided scene descriptions; AI generated video sequences; Rivera edited and color graded). Music: “Neptune.” AI-generated by Wondera, concept by Sofia Razo (Razo specified genre and mood; AI composed and produced track; Razo added transitions and mastered). Visual Art: Pluto Demoted. AI-generated by Midjourney, concept by Kai Thompson (Thompson crafted prompts; AI generated imagery; Thompson selected from 50+ variations and refined details).

Note: In the US, AI-generated content cannot be copyrighted. Credit: AACC 1.1 by Dave Malone.
Creative Commons (CC BY): github.com/dzmalone/aacc · davemalone.net/aacc
Graphic created using Canva; graphic content AI-edited by Claude and Apertus

AI-Assisted

  • The creator originates and drives the work
  • AI contributes through generation, modification, or enhancement at any stage
  • The creator makes all major creative decisions and is responsible for the final work

Authors are already working this way, even without formal labels. In a recent BookBub survey of authors (69% self-published; 6% traditionally published; 25% both), some writers described using AI as an editor for “research, grammar edits and sometimes rephrasing,” or as a developmental editor to “talk about plot, toy with character profiles, work through structural templates.”

One explained using AI for “drafting passages and scene descriptions; rewriting passages.” The author emphasized their use wasn’t “blind” generation but material that was “reviewed/accepted/rejected to stay within my voice and vision.”

AI-Generated

  • AI creates the primary content from the creator’s prompts and direction
  • The creator’s role is conceptual, curatorial, and editorial

Another surveyed author explained: “I see it like an assistant or even a ghostwriter at times … I come up with the idea and play with AI to make it better. Then AI purges a first draft and I take over as the author from there.”

These two AACC categories cover the full spectrum, from using AI to polish a sentence to having it generate an entire first draft. When this type of consistent labeling is missing, problems emerge.
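For readers who want to apply the labels programmatically, the two attribution templates above can be sketched in a few lines of code. This is a minimal illustration of the AACC 1.1 templates as shown in the graphic; the function and argument names are my own invention, not part of the official framework at github.com/dzmalone/aacc.

```python
# Hypothetical sketch of the two AACC 1.1 attribution templates.
# Function and parameter names are my own; the official framework
# defines only the two categories and their attribution formats.

def aacc_attribution(category, creator, ai_name, note=""):
    """Return an AACC-style attribution line for a creative work."""
    if category == "AI-Assisted":
        # Creator originated and drove the work; AI contributed.
        line = f"By {creator}, AI-assisted by {ai_name}"
    elif category == "AI-Generated":
        # AI created the primary content from the creator's direction.
        line = f"AI-generated by {ai_name}, concept by {creator}"
    else:
        raise ValueError("AACC defines exactly two categories")
    # An optional note describes the AI's specific contribution.
    return f"{line} ({note})" if note else line

print(aacc_attribution("AI-Assisted", "Sarah Chen", "Apertus",
                       "word choice and line break suggestions"))
# → By Sarah Chen, AI-assisted by Apertus (word choice and line break suggestions)
print(aacc_attribution("AI-Generated", "Jordan Lee", "Apertus"))
# → AI-generated by Apertus, concept by Jordan Lee
```

A byline like the one at the top of this article is simply the first template applied to prose.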

Why this framework is needed: examples from mainstream artists

AI-assisted novel: Tokyo-to Dojo-to by Rie Kudan

In 2024, author Rie Kudan admitted (after the fact) that 5% of her novel was “lifted verbatim from ChatGPT.” Would transparent labeling from the start have changed the conversation around her award-winning work?

AI-assisted film: The Irishman

In Martin Scorsese’s 2019 film, visual-effects artists used AI-assisted de-aging software to make the movie’s leads, Robert De Niro, Al Pacino, and Joe Pesci, look younger. Film audiences largely accept AI-assisted visual effects, but I argue that transparent disclosure should still be the standard.

AI-assisted or AI-generated music: Telisha “Nikki” Jones

Recently, Telisha “Nikki” Jones as Xania Monet snagged a three-million-dollar record deal for her music created with AI. She writes the lyrics, but Suno’s AI does the vocals. Is her music AI-assisted or AI-generated?

AI-generated music: The Velvet Sundown 

The Velvet Sundown streamed music on Spotify without at first acknowledging how they created their music. They later updated their Spotify profile to remark that their music project was “composed, voiced, and visualized with the support of artificial intelligence.” Those keywords point to AI-generated work.

AI-generated novel: Death of an Author by Stephen Marche

In an interview with Joanna Penn, Stephen Marche was transparent about his use of AI, which accounted for 95% of the novel Death of an Author. Though Marche described his role as “curatorial” and himself as the work’s “creator,” the substantial AI use classifies this as AI-generated.

Without a consistent framework, every artist handles AI disclosure differently, if they disclose at all. Audiences often can’t tell what role AI played in the work. Artists don’t know what’s expected of them. That’s the problem this framework solves.

Moving forward

Admittedly, the AACC framework won’t solve every problem with AI in creative work. Like many artists, I have ethical concerns about how most AI models were trained. Major companies (with exceptions like Apertus) used copyrighted works without artist consent or compensation. Their data centers carry environmental costs. These are real issues.

But the hard truth: just as illegal file-sharing through Napster eventually evolved into licensed platforms like Spotify and Pandora, the current wave of lawsuits against AI companies will likely establish proper licensing and payment systems. That’s the long game. In the meantime, artists have an immediate choice: use AI transparently or leave audiences guessing.

For young writers and artists, I echo musician Nick Cave’s warning about leaning too heavily on AI-generation in your art. Don’t sidestep “the inconvenience of the artistic struggle” by going straight to the easy commodity. That struggle is where you develop your voice and your art. There are no shortcuts. AI is a tool, and like any tool, it reveals the skill of the person using it. An inexperienced curator produces inexperienced work.

For those of us using AI as a legitimate tool, transparency shouldn’t be complicated. Label your work. Help audiences understand your process. Use the framework or create your own, but be honest about how you’re working.

The AACC framework is open-source and available at GitHub and at my website. Two simple categories with clear attribution. Whether you’re AI-curious, AI-positive, or still figuring it out, I encourage you to be transparent. It’s that simple.
