A senior executive at Stability AI has resigned over the company's position that it is acceptable to train its products on copyrighted material without permission.
Ed Newton-Rex led the audio team at the firm, which has offices in both the UK and the US.
He told the BBC he considered it "exploitative" for any AI developer to use creative work without consent.
However, many large AI firms, Stability AI among them, argue that using copyrighted material counts as "fair use" - an exemption to copyright rules under which consent from the owners of the original material is not required.
The US Copyright Office is currently examining policy questions raised by generative AI.
Mr Newton-Rex was keen to stress that most AI companies share the "fair use" position.
In a post on X (formerly Twitter), Stability AI founder Emad Mostaque responded to his former employee, saying the company believes fair use supports creative development.
AI programs are trained on vast amounts of data, often gathered from the web without permission through a process known as "scraping".
Generative AI products are used to create content such as images, audio, video and music, and can imitate the style of an individual artist on request.
Mr Newton-Rex, a choral composer, said he would not be happy to hand over his own music to AI developers for free. Asked whether he would agree to supply his work to such a system, he said: "No, I wouldn't consent to that."
He argued that many people create content "often with no financial gain, in the hope that their copyright may be of some value down the line".
Yet, he said, their work was being used without permission to build competitors that might ultimately replace them altogether.
For his former employer he built an AI audio generator, Stable Audio, choosing to license its training data and give rights holders a share of the revenue. He acknowledged, however, that this approach is not a universal fix, saying he did not believe there was a single perfect solution.
"I've been told by people on the rights-holder side that they are excited about this agenda and want to work on it, but they want the terms to be right," he said.
He said he remained optimistic about AI's potential and had no intention of leaving the sector.
Ethically, morally and globally, he said, it is essential to get permission from creators before using their work; anything else would not be acceptable.
The use of copyrighted material to train AI tools has become a well-known source of controversy.
US comedian Sarah Silverman and Game of Thrones writer George RR Martin are two of many creatives who have taken legal action against AI companies, claiming their work was used without permission to teach products to recreate material in their style.
An AI-generated track featuring the voices of Drake and The Weeknd was removed from Spotify earlier this year after it emerged that the artists had not approved its release.
The head of Spotify later said, however, that he would not ban AI from the platform outright, preferring a more nuanced approach.
Earlier this year, Stability AI was sued by Getty Images, which alleged the company had scraped 12 million of its pictures and used them to train its AI image generator, Stable Diffusion.
Some news organisations, including the BBC and The Guardian, have blocked AI companies from using their published online content.