AI erotica immune from Section 230 liability?
Many websites, including this one, have grown skittish about user-submitted revealing and erotic photos because of FOSTA/SESTA, a 2018 law that makes websites and their owners liable if a person depicted was sex trafficked. The chilling effect has been clear internet-wide, and it has primarily hit independent women, who are discriminated against on platforms like this one and other social media by algorithms and overzealous company policies that police any sexual or revealing photo. But here's the kicker, and it won't help those women with sexy photos: FOSTA/SESTA liability can only attach to humans depicted, since models produced by generative AI cannot be sex trafficked, being entirely ephemeral (edit: or not existing physically, pick the word for that). It looks like we have an unforeseen, state-sanctioned monopoly on generative AI erotica, which may crowd out humans merely by being exempt from liability, despite being indiscernible from the real thing when crafted well. So far, consumers do not seem to care, provided the operator gives the model a personality alongside its visual extremes of femininity or masculinity. Platforms may of course try to limit generative AI, but any enterprising platform will simply attract those accounts, followings and all. Sex sells, and some platforms will sell sex, or sexuality, better in a friendlier regulatory environment. Will AI crowd out humans because an operator can appeal flagging more easily? Will "it's AI" become the excuse everyone uses to avoid liability? What direction will this go? Thoughts? Oct 18 23 06:19 pm

I'm not a lawyer, so I don't know the wording of FOSTA/SESTA. Does it define what it means by sex trafficking, if in fact it uses that term? I don't find it a helpful term, as it really doesn't define what it covers; it's too broad and is tossed around far too loosely. I also don't see that nude or porn content has been curtailed at all.
Twitter (or X) has more explicit porn than I ever could have imagined being allowed to be published (I'm 77). We as a nation are somewhat fad-driven, and our laws sometimes seem to stem from the latest outrage by whoever is most vocal. The moral pendulum swings back and forth, driven by the loudest voices of the era. Hard to keep up with at times! As for AI, who knows where that will take us. It's quite amazing, with the potential for exciting art as well as realistic depictions of the deepest perversity of the human mind, which will hide behind the banner of the First Amendment and the "victimless crime." Oct 18 23 06:57 pm

Weldphoto wrote:
It makes it a federal crime to assist, facilitate, or support sex trafficking. Enforcement against websites was usually blocked by Section 230, which gives immunity to website operators so they don't have to police what their users post; now website operators are liable under state and federal law, but only for enforcement of sex-trafficking laws. Oct 18 23 07:13 pm

Roaring 20s wrote:
Ephemeral means lasting for a short time, as with an ephemeral plant. AI-generated imagery will stay around, I think. Oct 19 23 05:24 am

JSouthworth wrote:
Whoops; picked a better description. Oct 19 23 09:34 am
Post hidden on Oct 21, 2023 02:43 pm
Reason: not helpful Oct 20 23 04:12 am

JSouthworth wrote:
Not really. My edit was meant to convey that the purpose of language is to communicate a shared concept, which you've demonstrated I did successfully. Oct 21 23 10:25 am

This is a good question to ask an attorney who specializes in intellectual property and internet law. Oct 21 23 02:43 pm

I think the point is that AI-generated imagery is not subject to that liability if it does not depict a model who is a victim of sex trafficking. But the potential for image manipulation is demonstrated by the number of faked pictures of celebrities on the web. Ultimately it may not be possible to tell what is real, or derived from a real photograph, and what is AI-generated, except on a circumstantial basis. If you know that person A does not pose for adult pictures, then you know that an adult image of person A must be a fake. Conversely, if images resembling a person who is regarded as a possible victim of sex trafficking appear on the web, a court may consider that as evidence of sex trafficking. Oct 22 23 04:33 am

Patrick Walberg wrote:
I think it will be a recurring theme, and it's a topic that could breathe life into this forum. Right now it might be too obtuse for the remaining crowd here, but I think it's something worth watching. Oct 24 23 03:08 pm

Never mind. Oct 24 23 06:29 pm

"Your child was never assaulted, your child was never exploited, but their likeness is being used as if they were," he said. "We have a concern that our laws may not address the virtual nature of that, though, because your child wasn't actually exploited — although they're being defamed and certainly their image is being exploited." "The argument [against restrictions] would be, 'well I'm not harming anyone — in fact, it's not even a real person,' but you're creating demand for the industry that exploits children," Wilson said.
https://apnews.com/article/ai-child-por … ba9748f38a

"In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country's southeast. In some cases, kids are using these tools on each other. At a school in southwestern Spain, police have been investigating teens' alleged use of a phone app to make their fully dressed schoolmates appear nude in photos." Sexton said IWF analysts discovered faces of famous children online as well as a "massive demand for the creation of more images of children who've already been abused, possibly years ago."

https://apnews.com/article/ai-artificia … b6177138d2 Oct 28 23 09:20 am

JSouthworth wrote:
Highly unlikely scenario in the real world. Maybe UK courts are different, but in US courts, depictions of questionable authenticity "resembling" a "possible" victim of sex trafficking would not likely be admissible as "evidence" of sex trafficking. Oct 29 23 09:16 am

Hunter GWPB wrote:
The article mentions the possibility of people who don't even exist, and also imagines consensual circumstances using the likeness of an adult. Oct 29 23 12:07 pm