Darius Kazemi

Opposing arguments are like this one, also quoted from the Washington Post article above:

"generative AI typically produces content only in response to prompts or queries from a user; these responses could be seen as simply remixing content from the third-party websites, whose data it was trained on"

If you are taking that broad a definition of "it's just a remix" then buddy, I have news for you about literally all creativity, language, and culture. By that logic, no one is liable for anything.

ali alkhatib

@darius the apparent crisis this seems to be causing people is a little baffling. i'm honestly a little curious what outcome people thought we would be heading toward that would've been at all coherent.

Kevin Karhan

@darius +9001%

This is the same in #Germany, where one cannot deny or disavow #copyright and where one is always responsible for the outputs of one's product unless one can show otherwise [e.g. sabotage by the end user]...

Éibhear 🔭
@darius Does that not also mean that the creator should benefit from its outputs, too? Such as #copyright?
Jesse Baer

@darius Seems to me the bigger issue with this approach will be determining who created a given bot.


@darius It's the magic roundabout of wingnuts.


@darius oooooooh


@darius What exactly is the alternative? If creators are not responsible for the outputs of algorithms, then ... nobody is?
The people who get all worked up about the existential threat of "superintelligent AI" seem awfully blasé about not-so-intelligent systems spouting libel, giving inaccurate medical advice, or running people over with no accountability.

Chip Stewart

@darius I wrote about this & summarized a number of approaches a few years ago in my sci-fi & future media law book, with examples from folks like @marklemley & @gunkel. In short - maybe assign liability to programmers or hosts, but maybe also terminate the bots themselves?

{"p":"","h":{"iv":"ROXSYW+cfvEbFHu5","at":"ocxplSQjdRC3tXEtB/9/wg=="}} {"p":"","h":{"iv":"ROXSYW+cfvEbFHu5","at":"ocxplSQjdRC3tXEtB/9/wg=="}} {"p":"","h":{"iv":"ROXSYW+cfvEbFHu5","at":"ocxplSQjdRC3tXEtB/9/wg=="}}
0
2y
Cadence Larissa Beth Beresford

@darius Who is responsible for what an AI creates? The AI's creator, the AI's user or the AI itself?

I say the answer is yes, all of the above. What it is trained with is important, what it is asked to do is important, and its own capacity for reason is important.

Frankc1450

@darius it's an interesting point. I hope you're not a pompous arrogant bastard like he is.

Anthony DiPierro

@darius Seems the question for 230 protection is whether the information was provided by another information content provider.

Which depends on the information. Was it a hallucination or was it just repeating someone else?

m. libby

@darius What about the direction the copyright office is taking where the output of AI is not eligible for copyright? msn.com/en-us/news/politics/us

Calling these outputs the speech of the person/company who coded the AI would seem to imply that SOMEONE should be able to copyright these outputs.

Personally I prefer the idea that all these outputs are automatically public domain, given that they didn't get permission for most/any of the code/images/text in the training corpus.

But if strict liability hamstrings these LLMs, that works too.

Shreejith

@darius @anildash okay now it’s going to get interesting

jack the nonabrasive

@darius @irwin Yet the output of a generative ML model isn't copyrightable. It seems like, if this becomes precedent, holding a bot creator who uses generative tech liable for the bot's output is a wedge that could be used to push for copyright on generative tech.

Which might not be desirable. Unless it’s a derivative work and the author owes royalties to holders of copyright on the training corpus?

Avi Rappoport (avirr)

@darius Those who make the profit must also take responsibility — @j2bryson has been saying this for years

Jules 🍺

@darius I was going to say that it would mean we are responsible for what our kids say, but then I remembered that these AI bots are not independent entities, unlike our progeny, so yes, the corporations are liable.
