Forums > General Industry > AI erotica immune from Section 230 liability?

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

Many websites, such as this one, have gotten skittish about user-submitted revealing and erotic photos because of SESTA/FOSTA, a 2018 law that makes websites and their owners liable if a person depicted was sex trafficked.

The chilling effect has been clear internet-wide, and it has primarily affected independent women, who are discriminated against on platforms like this one and on other social media by algorithms and overzealous company policies policing any sexual or revealing photos.

But here's the kicker, and it won't help those women who have sexy photos: SESTA/FOSTA's liability can only apply to humans depicted, as models produced by generative AI cannot be sex trafficked, due to being entirely ephemeral (edit: or not existing physically, pick the word for that).

Looks like we have an unforeseen state-sanctioned monopoly on generative AI erotica, as it may crowd out humans merely by being exempt from liability, despite being indiscernible from the real thing if crafted well. So far, consumers do not seem to care, as long as the operator gives it a personality alongside its visual extremes of femininity or masculinity.

Of course platforms may try to limit generative AI, but any enterprising platform will simply attract those accounts, and their followings, itself. Sex sells, and some platforms will sell sex, or sexuality, better in a friendlier regulatory environment.

Will AI crowd out humans, since the operator can appeal flagging more easily?

Will "its AI" be an excuse that everyone uses to avoid Section 230 liability?

What direction will this go?

Thoughts?

Oct 18 23 06:19 pm Link

Photographer

Weldphoto

Posts: 845

Charleston, South Carolina, US

I'm not a lawyer, so I don't know the wording of SESTA/FOSTA. Does it define what it means by sex trafficking, if in fact it uses that term? I don't find it a helpful term, as it really doesn't define what it means; it's too broad and is tossed around far too loosely. I also don't see that nudes or porn are at all curtailed. Twitter or X has more explicit porn than I (I'm 77) ever could have imagined being allowed to be published.

We, as a nation, are somewhat fad-driven, and our laws sometimes seem to stem from the latest outrage by whoever is the most vocal. The moral pendulum swings back and forth, driven by the loudest voices of the era. Hard to keep up with at times!

As for AI - who knows where that will take us. It's quite amazing and has the potential for exciting art, as well as for realistic depictions of the deepest perversity of the human mind, which will hide behind the banner of the First Amendment and the "victimless crime".

Oct 18 23 06:57 pm Link

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

Weldphoto wrote:
I'm not a lawyer, so I don't know the wording of SESTA/FOSTA. Does it define what it means by sex trafficking, if in fact it uses that term? I don't find it a helpful term, as it really doesn't define what it means; it's too broad and is tossed around far too loosely. I also don't see that nudes or porn are at all curtailed. Twitter or X has more explicit porn than I (I'm 77) ever could have imagined being allowed to be published.

We, as a nation, are somewhat fad-driven, and our laws sometimes seem to stem from the latest outrage by whoever is the most vocal. The moral pendulum swings back and forth, driven by the loudest voices of the era. Hard to keep up with at times!

It makes it a federal crime to assist, facilitate, or support sex trafficking. Enforcement against websites used to be blocked by Section 230, which gives immunity to website operators so they don't have to police what their users post; now, though, website operators can be held liable under state and federal law, but only where sex trafficking laws are being enforced.

So because website operators can't tell who is being sex trafficked, and SWERFs already push impossible standards suggesting even independent women are incapable of consenting to broadcast their sexuality in any way, website operators often take skittish views.

It's just random interpretations by whoever the corporation happens to have as a lawyer and as an executive. Twitter/X takes one view because it told the SWERFs in tech to pound sand and had a lot of capital; other platforms already had SWERFs in decision-making positions, were more beholden to advertisers, or otherwise didn't find it worth it to deal with the liability.

But neither direction really represents public opinion, the state of cultural norms, or the law.

Oct 18 23 07:13 pm Link

Photographer

JSouthworth

Posts: 1830

Kingston upon Hull, England, United Kingdom

Roaring 20s wrote:
But here's the kicker, and it won't help those women who have sexy photos: SESTA/FOSTA's liability can only apply to humans depicted, as models produced by generative AI cannot be sex trafficked, due to being entirely ephemeral.

Ephemeral means lasting for a short time, as with an ephemeral plant. AI generated imagery will stay around I think.

Oct 19 23 05:24 am Link

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

JSouthworth wrote:

Ephemeral means lasting for a short time, as with an ephemeral plant. AI generated imagery will stay around I think.

Whoops, I picked a better description.

Oct 19 23 09:34 am Link

Photographer

JSouthworth

Posts: 1830

Kingston upon Hull, England, United Kingdom

Post hidden on Oct 21, 2023 02:43 pm
Reason: not helpful

Oct 20 23 04:12 am Link

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

JSouthworth wrote:

Don't you mean, unrepresentative? non-representational?

Not really. My edit was meant to show that the purpose of language is to convey a shared concept, which you've demonstrated I did successfully.

Oct 21 23 10:25 am Link

Photographer

Patrick Walberg

Posts: 45205

San Juan Bautista, California, US

This is a good question to ask an attorney who is a specialist in Intellectual Property and Internet laws.

Oct 21 23 02:43 pm Link

Photographer

JSouthworth

Posts: 1830

Kingston upon Hull, England, United Kingdom

I think the point is that AI-generated imagery is not subject to liability under the Section 230 carve-out if it does not depict a model who is a victim of sex trafficking. But the potential for image manipulation is demonstrated by the number of faked pictures of celebrities on the web. Ultimately it may not be possible to tell what is real, or derived from a real photograph, and what is AI-generated, except on a circumstantial basis. If you know that person A does not pose for adult pictures, then you know that an adult image of person A must be a fake.

Conversely, if images resembling a person who is regarded as a possible victim of sex trafficking appear on the web, a court may consider that as evidence of sex trafficking.

Oct 22 23 04:33 am Link

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

Patrick Walberg wrote:
This is a good question to ask an attorney who is a specialist in Intellectual Property and Internet laws.

I think it will be a recurring theme and is a topic that could breathe life into this forum. Right now it might be too abstract for the remaining crowd here, but I think it's something worth watching.

If the AI producers have a monopoly on content that humans want to see, because human content creators are barred from paranoid platforms, then it's something people should be aware of and react to: shift how they spend their time, shift what they learn, or simply go do something else.

Oct 24 23 03:08 pm Link

Photographer

Focuspuller

Posts: 2767

Los Angeles, California, US

never mind

Oct 24 23 06:29 pm Link

Artist/Painter

Hunter GWPB

Posts: 8199

King of Prussia, Pennsylvania, US

“Your child was never assaulted, your child was never exploited, but their likeness is being used as if they were,” he said. “We have a concern that our laws may not address the virtual nature of that, though, because your child wasn’t actually exploited — although they’re being defamed and certainly their image is being exploited.”

“The argument [against restrictions] would be, ‘well I’m not harming anyone — in fact, it’s not even a real person,’ but you’re creating demand for the industry that exploits children,” Wilson said.


https://apnews.com/article/ai-child-por … ba9748f38a



"In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country’s southeast.

In some cases, kids are using these tools on each other. At a school in southwestern Spain, police have been investigating teens’ alleged use of a phone app to make their fully dressed schoolmates appear nude in photos.

Sexton said IWF analysts discovered faces of famous children online as well as a “massive demand for the creation of more images of children who’ve already been abused, possibly years ago.”


https://apnews.com/article/ai-artificia … b6177138d2

Oct 28 23 09:20 am Link

Photographer

Focuspuller

Posts: 2767

Los Angeles, California, US

JSouthworth wrote:
Conversely, if images resembling a person who is regarded as a possible victim of sex trafficking appear on the web, a court may consider that as evidence of sex trafficking.

Highly unlikely scenario in the real world. Maybe UK courts are different, but in US courts of law, depictions of questionable authenticity "resembling" a "possible" victim of sex trafficking would not likely be admissible as "evidence" of sex trafficking.

Oct 29 23 09:16 am Link

Photographer

Roaring 20s

Posts: 137

Los Angeles, California, US

Hunter  GWPB wrote:
“Your child was never assaulted, your child was never exploited, but their likeness is being used as if they were,” he said. “We have a concern that our laws may not address the virtual nature of that, though, because your child wasn’t actually exploited — although they’re being defamed and certainly their image is being exploited.”

“The argument [against restrictions] would be, ‘well I’m not harming anyone — in fact, it’s not even a real person,’ but you’re creating demand for the industry that exploits children,” Wilson said.


https://apnews.com/article/ai-child-por … ba9748f38a



"In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country’s southeast.

In some cases, kids are using these tools on each other. At a school in southwestern Spain, police have been investigating teens’ alleged use of a phone app to make their fully dressed schoolmates appear nude in photos.

Sexton said IWF analysts discovered faces of famous children online as well as a “massive demand for the creation of more images of children who’ve already been abused, possibly years ago.”


https://apnews.com/article/ai-artificia … b6177138d2

The article mentions the possibility of people who don't even exist; also, imagine consensual circumstances involving the likeness of an adult.

Even what those 50 states and that Korean court are aiming for would only enshrine that AI-generated adult likenesses are granted a state-sanctioned monopoly that humans are not afforded.

Oct 29 23 12:07 pm Link