By now, you’ve probably seen the term AI, either on these streets of the internet or in one of my previous articles. Either way, it’s the new trendy tech term that everyone is talking about, especially with the growing popularity of ChatGPT. This chatbot provides consumers with answers to prompts on a variety of topics, from coding to blogging tips. Microsoft even recently invested billions of dollars in OpenAI, the artificial intelligence company behind it, to help fuel the latest trend. But one thing people don’t talk about is the potential downside of AI technology. Because this technology learns and evolves from human interaction and from information that has already been published, it draws on archives of articles, books, art, and other works that humans have created.
This might not initially seem like a bad thing. After all, who doesn’t want a single piece of software that saves tons of research time by pulling those resources together for you? But one thing AI software fails to do is give proper credit to the sources and people from whom it pulls its information. Since this software is an object and not a human, can we really blame it? Or the people who create it? Either way, thousands of people are having pieces of their work stolen, and they are not going to rest easy.
Midjourney and Stability AI are two companies that have created state-of-the-art AI image generators. Last year, the latter raised $101 million in funding and released version 2.1 of its Stable Diffusion tool in December. This software generates artwork based on the prompt the consumer enters. For this to work, the company has to train the model on many art styles and techniques that have already been published. This may sound like a breakthrough in the tech space, but the art used for training was created by millions of artists who never gave permission or received compensation. Because of this, a group of creators came together and filed a class action lawsuit against the two companies.
The lawsuit alleges that these AI companies violated the intellectual property rights of millions of artists by training their AI software on their artwork. Stability’s response to this lawsuit stated: “Please know that we take these matters seriously. Anyone who thinks this isn’t fair use doesn’t understand the technology and misunderstands the law.”
But they are not the only ones going after these companies. Getty Images is another victim, with its photos used by the art generators without its consent or compensation. The photo company said it had wanted to talk to the AI companies before proceeding with the lawsuit. The response from the tech companies? A Stability AI spokesperson said: “The Stability AI team has not received any information regarding this lawsuit, so we cannot comment.” For now, all of the AI art generator sites remain active and available to consumers.
Lawyers at the Joseph Saveri Law Firm, LLP, which is handling the artists’ class action, have been examining the issue by comparing these cases to those involving the music-sharing platform Napster. For those who don’t remember, Napster was one of the first services to distribute artists’ music online without paying them a penny. It was sued, the site was shut down, and artists are now protected in the music streaming space. Legal experts at the firm believe that if music can be protected from companies making money off it without artists’ consent, then digital art should have the same protection, as mentioned in their recent press release. Laws are shaped by cases like these, so I hope regulations are put in place so that artists are protected from these emerging technologies and can be duly rewarded for their work.
Artists have banded together to make sure their creative works aren’t ripped off or used without their green light. They have even gone so far as to create a site, haveibeentrained.com, where they can upload their art to see whether it was used to train the AI software. Although it is a useful tool for keeping up with unauthorized use of their work, this is something artists should not have to do. There has to be a way for AI to continue to thrive while giving artists proper credit and payment. Can these two worlds really coexist?
Shutterstock may have found the solution. The company, another stock photo platform similar to Getty Images, announced late last year that it was expanding into the world of AI by partnering with OpenAI. The plan involves creating enhanced AI-generated content while setting up a fund through which the artists whose work is involved will be paid. This approach is something all AI companies can learn from when it comes to including and paying artists for their contributions to the art world. Neither the content nor the fund has launched yet, but I hope this can set a precedent.
Some people may be thinking: your art is being used, that’s good exposure for you. Or: you’re an artist, so just create more art. But they don’t consider the long hours and creative exhaustion artists go through to make these masterpieces. They should be credited and rewarded for the use of their work just like any other creator. The whole purpose of Web3 is to give creators ownership and allow them to monetize their work. Technology is supposed to move us forward and solve problems, not cause more of them. My hope is that AI companies will partner with artists and license their art. If they don’t, we’ll start to see more of these lawsuits pop up over the course of the year, resulting in many of these sites shutting down.