As policy makers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed a push for artist consent would “basically kill” the AI industry.
Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn’t feasible to ask for consent before ingesting their work.
“I think the creative community wants to go a step further,” Clegg said according to The Times. “Quite a lot of voices say, ‘You can only train on my content, [if you] first ask’. And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data.”
“I just don’t know how you go around, asking everyone first. I just don’t see how that would work,” Clegg said. “And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.”
The comments follow a back-and-forth in Parliament over new legislation that aims to give creative industries more insight into how their work is used by AI companies. An amendment to the Data (Use and Access) Bill would require technology companies to disclose what copyrighted works were used to train AI models. Paul McCartney, Dua Lipa, Elton John, and Andrew Lloyd Webber are among the hundreds of musicians, writers, designers, and journalists who signed an open letter in support of the amendment earlier in May.
The amendment — introduced by Beeban Kidron, who is also a film producer and director — has bounced between the houses of Parliament, gaining support along the way. But on Thursday members of Parliament rejected the proposal, with technology secretary Peter Kyle saying that “Britain’s economy needs both [AI and creative] sectors to succeed and to prosper.” Kidron and others have said a transparency requirement would allow copyright law to be enforced, and that AI companies would be less likely to “steal” work in the first place if they are required to disclose what content they used to train models.
In an op-ed in the Guardian, Kidron promised that “the fight isn’t over yet,” as the Data (Use and Access) Bill returns to the House of Lords in early June.
From The Verge
“Following the law would ruin our massively destructive, never-to-be-profitable industry!”
well uhhh…
“Thief complains if theft illegal, he will go out of business”
How many times this thing gonna be reposted
Perhaps one should think twice before founding an entire “industry” on blatant theft?
Yes, and?
Look, I steal cars and sell them for money. If I were forced to ask the car owners for permission before stealing them, it would ruin my industry.
If you see this, please read very carefully.
I hate you.
Training is transformative use.
From-scratch models with public-domain data would still function.
All commercial works before 1995 should be public domain anyway.
Corporations forcing this tech on everyone is not a problem with the tech.
Any measured take on this whole stupid industry starts to feel like “enlightened centrism.” None of it is so morally simple or disconnected that responding with YAY or BOO makes any damn sense. The cost of whiz-bang science-fiction technology is that every science fiction story is about the consequences of that technology. Worrying possibilities are the genre.
And yet: whiz bang. We have programs that can emit video just by describing it - that’s fucking awesome. That’s how computers work when authors don’t know how computers work. We have programs which, just by guessing the next word, are debatably adequate writers, editors, coding partners, translators, et very cetera. They’re not perfect at anything you tell them to… but they’ll do anything you tell them to. Vast swaths of “[blank] requires true intelligence” went right out the window.
Legitimate concerns abound, but the loudest complaints include ‘the robot took books from the library.’ That’s what libraries are for. Be mad that these assholes let it spy on you, not that it also read bestselling novels.
Very “if we pay our workers a living wage, we’ll go out of business!!!” vibes.
Good.
If it can’t afford to, then that’s simply an admission that it’s not a sustainable way to do things. Of course that’d come from a former Meta executive.
Good. do it. please. AI cannot exist without exploitation and theft, fuck 'em