One of the most heated debates today revolves around the use of artificial intelligence in the arts. AI has been a major sticking point in the negotiations of striking members of the writers and screen actors guilds in the United States. Lawsuits have been filed around the globe, questions have been raised about copyright law, countries are trying to tighten protection of Indigenous culture, and artists are increasingly concerned about their livelihoods and the protection of their copyrighted work.
Generative software advancements have also sparked controversy over the ethics of AI-generated art and whether this represents a technologically advanced form of plagiarism.
Lawsuits multiply as artists search for legal guardrails
In February, Getty Images filed a lawsuit in the High Court of Justice in London against Stability AI, the British startup behind Stable Diffusion, a text-to-image app that draws on an online database of billions of images created by artists to learn patterns and generate art in those artistic styles. Getty claims that Stability AI infringed its intellectual property rights by illegally copying and processing millions of copyrighted images and their associated metadata.
In Australia, artists have accused the Lensa app, which also uses Stable Diffusion, of stealing their content without permission or compensation.
A group of San Francisco artists, represented by attorney Matthew Butterick and the Joseph Saveri Law Firm, has filed a class-action lawsuit against DreamUp, Midjourney and Stable Diffusion to reclaim copyright and consent rights. They are demanding that AI creators be required to obtain artists’ permission to use their works and to provide compensation.
At this point, there are no legal guardrails for AI-generated art other than existing copyright law. Lawsuit by lawsuit, however, case law is developing new guardrails that may establish ethical and legal boundaries.
How do we define art?
Artists have unique styles and infuse their art with their own personality. Can a software program replicate emotions, personality, or vision? That may be one of the larger issues the world is grappling with. People relate to art at a very human level. They may have different interpretations, but they relate to art emotionally. Many believe that AI-generated art sucks the humanity out of art.
Mexican filmmaker Guillermo del Toro has called animation created by machines “an insult to life itself.” “I consume and love art made by humans,” the “Shape of Water” director told Euronews. “And I am not interested in illustrations made by machine and the extrapolation of information.”
AI-generated Indigenous art
The appropriation of Indigenous art presents another level of complexity – and legal problems. Indigenous art is typically protected by individual countries. In Australia, appropriation of Aboriginal and Torres Strait Islander art has created a firestorm due to its sacred nature and the premise that only Indigenous Australian peoples can create their art. Per the Museum of Contemporary Art Australia, “fake” Aboriginal and Torres Strait Islander art undermines the role of Indigenous communities in sharing cultural knowledge, denies them economic opportunities, deceives buyers and disadvantages businesses that do take the trouble to sell ethically produced Indigenous art.
In 2018, for example, British artist Damien Hirst was accused of plagiarizing paintings by Aboriginal artists from the community of Utopia. Australia is looking at ways to respect and preserve the rich tapestry of its Indigenous culture in the ethical pursuit of technological advancements.
In recent years, communities worldwide have begun to advocate for formal agreements to protect their cultural knowledge, heritage and beliefs on the basis of Indigenous Data Sovereignty. First defined in 2015, the term refers to the “right of Indigenous Peoples to own, control, access and possess data that derive from them, and which pertain to their members, knowledge systems, customs or territories.”
Mexico has also had problems in protecting Indigenous peoples from cultural appropriation. In 2021, the Mexican government passed a law that prohibits and criminalizes the unauthorized use of Indigenous and Afro-Mexican cultural expressions. Unfortunately, the law is vaguely drafted and requires clarification.
The Federal Law for the Protection of the Cultural Heritage of Indigenous and Afro-Mexican People and Communities empowers groups to sue if someone without permission replicates symbols, designs, or other elements of their cultural heritage. The law says that the “community” must give permission. But who in the community? The entire community? The spiritual leader? The political leader?
And how do you define cultural heritage? Establishing the origin of a cultural expression is complicated, as it is passed down from generation to generation and Indigenous cultures overlap at times. There are nearly 17 million Indigenous people in Mexico, at least 68 Indigenous languages and more than 350 variants of those languages.
To answer those questions, the Mexican government says it has created a legal framework with a registry that identifies the different cultural expressions subject to protection and the owners of those rights, and details a process for properly obtaining and documenting authorization.
There are a lot of definitional questions, however, that demonstrate how difficult it can be to legislate against the exploitation of Indigenous cultures. Legal experts have been critical of the law’s vague provisions on ownership and the fact that it doesn’t specify how the compensation for cultural theft will be distributed.
Most Indigenous art reflects the history, culture, traditions, and spiritual beliefs of the Indigenous community. Can AI generative software replicate that? There is also the problem of Indigenous iconography that is so old that it falls into the public domain – a legal back door for any company looking to appropriate Indigenous symbolism.
Indigenous artists look for solutions to AI-generated Indigenous art
Some Indigenous artists believe that involving Indigenous people in the creation and decision-making processes of AI will minimize the risk of appropriation and cultural bias, ensuring that Indigenous art is respected and properly attributed to its creators.
Michael Running Wolf, a Northern Cheyenne man from the United States and former Amazon software engineer, believes part of the solution is to train Indigenous youth in Mexico, the United States and Canada in artificial intelligence and data science.
Running Wolf believes another part of the solution is developing policy frameworks that protect and remunerate art, telling Tech Policy Press that the underlying problem is the exploitation of Indigenous data. “A great deal of energy and effort goes into the creation of art. Stable Diffusion could not [generate art] if they didn’t have the ability to scan the intellectual property of the internet. And that is worth something.”
Can AI-generated art be copyrighted as intellectual property?
A U.S. federal court in Washington, D.C. ruled in August of this year that art created by artificial intelligence without any human input cannot be copyrighted under U.S. law. The ruling stated that human authorship is a “bedrock requirement of copyright” based on “centuries of settled understanding.”
As U.S. District Judge Beryl Howell stated in her ruling, “We are approaching new frontiers in copyright as artists put AI in their toolbox,” which will raise “challenging questions” for copyright law.
If AI-generated art cannot be copyrighted because it lacks human authorship, can it be defined as art?
Sheryl Losser is a former public relations executive, researcher, writer, and editor. She has been writing professionally for 35 years. She moved to Mazatlán in 2021 and works part-time doing freelance research and writing. She can be reached at [email protected]