AI chief quits over 'exploitative' copyright row

Image caption: Ed Newton-Rex has worked in AI and music for 13 years

By Zoe Kleinman

Technology editor

A senior executive at the tech firm Stability AI has resigned over the company's view that it is acceptable to use copyrighted work without permission to train its products.

Ed Newton-Rex was head of audio at the firm, which is based in the UK and US.

He told the BBC he thought it was "exploitative" for any AI developer to use creative work without consent.

But many large AI firms, including Stability AI, argue that using copyrighted content in this way is "fair use".

The "fair use" exemption to copyright rules means the permission of the owners of the original content is not required.

The US Copyright Office is currently conducting a study on generative AI and related policy issues.

Mr Newton-Rex stressed that he was talking about all AI firms which share this view - and the majority of them do.

Replying to his former employee in a post on X, formerly known as Twitter, Stability AI founder Emad Mostaque said the firm believed fair use "supports creative development".

AI tools are trained using vast amounts of data, much of which is often taken, or "scraped", from the internet without consent.

Generative AI - products which are used to create content like images, audio, video and music - can then produce similar material or even directly replicate the style of an individual artist if requested.

Mr Newton-Rex, who is also a choral composer, said that he "wouldn't jump" at the chance to offer his own music to AI developers for free.

"I wouldn't think 'yes, I'll definitely give my compositions to a system like this'. I don't think I'd consent," he said.

He added that plenty of people create content "often for literally no money, in the hope that one day that copyright will be worth something".

But, ultimately, without consent their work was being used to create their own competitors - and could even potentially replace them entirely, he said.

He built an AI audio creator called Stable Audio for his former employer, but said he had chosen to license the data it was trained on and to share revenue from it with rights holders. He acknowledged that this model would not work for everybody.

"I don't think there's a silver bullet," he said.

"I know many people on the rightsholder side who are who are excited about the potential agenda today and want to work with it, but they want to do it under the right circumstances."

He said he remained optimistic about the benefits of AI and was not planning to leave the industry.

"I think that ethically, morally, globally, I hope we'll all adopt this approach of saying, 'you need to get permission to do this from the people who wrote it, otherwise, that's not okay'," he said.

The use of copyright material to train AI tools is controversial.

Some creatives, including the US comedian Sarah Silverman and Game of Thrones writer George RR Martin, have initiated legal action against AI firms, arguing that they have taken their work without permission and then used it to train products which can recreate content in their style.

A track featuring AI-generated voices of music artists Drake and The Weeknd was removed from Spotify earlier this year after it was discovered that it had been created without their consent.

Earlier this year, Stability AI faced legal action from the image agency Getty Images, which claimed the firm had scraped 12 million of its pictures and used them to train its AI image generator, Stable Diffusion.

Some news organisations, including the BBC and The Guardian, have blocked AI firms from lifting their material from the internet.