Forum: Author Hangout

Another AI case

Switch Blayde

From the article at: https://www.yahoo.com/news/beverly-hills-school-district-expels-020534338.html

Five Beverly Hills eighth-graders have been expelled for their involvement in the creation and sharing of fake nude pictures of their classmates. They created and shared images that superimposed pictures of real students' faces onto simulated nude bodies generated by artificial intelligence.

Dozens of A.I.-powered apps are available online to "undress" someone in a photo, simulating what a person would look like if they'd been nude when the shot was taken. Other A.I.-based tools allow you to "face swap" a targeted person's face onto another person's nude body.

Versions of these programs have been available for years, but the earlier ones were expensive, harder to use and less realistic. Today, AI tools can clone lifelike images and quickly create fakes; even using a smartphone, it can be accomplished in a matter of seconds.

The Beverly Hills Police Department and the Los Angeles County district attorney's office are still investigating the incident, but no arrests have been made or charges brought. California's laws against possessing child pornography and sharing nonconsensual nude pictures do not specifically apply to AI-generated images, which legal experts say would pose a problem for prosecutors.

DBActive

@Switch Blayde

18 USC 2252A:

(7) knowingly produces with intent to distribute, or distributes, by any means, including a computer, in or affecting interstate or foreign commerce, child pornography that is an adapted or modified depiction of an identifiable minor,
shall be punished as provided in subsection (b).

Replies:   KimLittle
KimLittle

@DBActive

I want to start by saying that it should be treated like revenge porn, because the intent was to harm the victims in the same way they would have been harmed if they had sexted the wrong person.

But I really value the encyclopaedic knowledge of statutes, precedent, and case law across numerous jurisdictions that is readily available from folks on this site.

Replies:   Dominions Son
Dominions Son

@KimLittle

I want to start by saying that it should be treated like revenge porn

Revenge porn is a civil matter; 18 USC 2252A would be a federal criminal charge.

Replies:   caliphornia
caliphornia

@Dominions Son

Revenge porn is a crime in 48 states.

Replies:   Grey Wolf
Grey Wolf

@caliphornia

The definition of 'revenge porn' in many of those states does not equate to these cases, though. Yes, I'm aware you didn't say it does :)

For instance, under Texas law:

(2) the visual material was obtained by the person or created under circumstances in which the depicted person had a reasonable expectation that the visual material would remain private;

is an element of the crime. In the case of 'hybrid' works, the depicted person cannot have had any such reasonable expectation, since they weren't even there.

We're getting there, but we're not there yet.

DBActive

@Switch Blayde

I would think this would be a relevant statute in the CA penal code:
311.3.
(a) A person is guilty of sexual exploitation of a child if he or she knowingly develops, duplicates, prints, or exchanges any representation of information, data, or image, including, but not limited to, any film, filmstrip, photograph, negative, slide, photocopy, videotape, video laser disc, computer hardware, computer software, computer floppy disc, data storage media, CD-ROM, or computer-generated equipment or any other computer-generated image that contains or incorporates in any manner, any film or filmstrip that depicts a person under the age of 18 years engaged in an act of sexual conduct.

(b) As used in this section, "sexual conduct" means any of the following:

Sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex or between humans and animals.
Penetration of the vagina or rectum by any object.
Masturbation for the purpose of sexual stimulation of the viewer.
Sadomasochistic abuse for the purpose of sexual stimulation of the viewer.
Exhibition of the genitals or the pubic or rectal area of any person for the purpose of sexual stimulation of the viewer.
Defecation or urination for the purpose of sexual stimulation of the viewer.

Replies:   Dicrostonyx  Joe Long
Dicrostonyx

@DBActive

I've been giving this a bit of thought:

contains or incorporates in any manner, any film or filmstrip that depicts a person under the age of 18 years engaged in an act of sexual conduct

I'm not a lawyer (or American), but my understanding of CP laws in the US is that they are written to require the involvement of actual children. This is why it's legal to have 3D art, cartoons, and stories with kids.

I'd guess that if any charges come of this they are going to be based on the fact that images of real children were input into the AI to create the final product. So it's not the AI generated body that's the problem.

Looking at the bigger picture, though, I suspect that police agencies are going to start putting a lot more pressure on the companies creating AI to get lists of training materials. The question I'd be asking is "How did the AI know what a naked 13-year old girl looks like?"

Replies:   REP  Dominions Son  LupusDei
REP

@Dicrostonyx

Probably a series of pictures of naked 13-year-old girls.

Dominions Son

@Dicrostonyx

US is that they are written to require the involvement of actual children. This is why it's legal to have 3D art, cartoons, and stories with kids.

They weren't necessarily written that way originally. It was made into a constitutional requirement by the US Supreme Court (SCOTUS).

When a child porn law first came before SCOTUS (New York v. Ferber, 1982), the Court carved out a First Amendment exemption explicitly because of the harm done to the children used to produce child porn.

In 1996, the US Congress tried to explicitly expand the definition of child porn (in the Child Pornography Prevention Act) to include 3D art, cartoons, stories with kids, and real photos/videos made with young-looking adult performers.

That came before SCOTUS in 2002. SCOTUS explicitly rejected it saying they weren't going to expand the first amendment exception beyond material produced using real children.

LupusDei
Updated:

@Dicrostonyx

The question I'd be asking is "How did the AI know what a naked 13-year old girl looks like?"

Images of nude children are readily available online. Not as much as there once was, but under the label of nudism you won't have to search for long.

Flat-chested adult women exist. Adult women with "forever fifteen" small pointy breasts exist (I personally know at least one woman in her fifties who could easily be mistaken for an early teen from the neck down, or with enough blur). Those are rather underrepresented in erotic and porn images, but not nonexistent.

In Japan, there's a whole genre where the core advertising number in a pornstar's bio is her weight (the lower the better), listed next to her height (same). Those legal-age women look like children, and play-act children in the material they produce. Obviously, it's legal in Japan, and it is all online.

Some or all of it was pulled into the training data.

Moreover, the power of AI is that it can generate an image it has never seen by extrapolating elements. It can easily age or de-age a picture, or bridge the gap between a flat-chested adult body and a child's face. Or make a boy a girl (often by accident, as it likes drawing girls much more, apparently).

That all said, in my limited experience the specific, characteristic twelve-year-old budding breasts aren't likely to be generated without extra work. Stable Diffusion (SD) would far more likely generate a flat chest, androgynous features, or more or less adult-looking breasts even when explicitly asked to draw a child of a certain age.

Replies:   Dicrostonyx  tenyari
Dicrostonyx

@LupusDei

Yes, but my point isn't whether legal alternatives are available, but rather that police agencies are likely to start asking companies to "show their work" as it were.

If a given AI has been trained using illegal content, then the person who fed that content into the AI, and possibly their whole company, might be charged under distribution laws.

I can see this logic applying even if the AI was simply hooked up to a spiderbot and instructed to find images. Given that the AIs are not considered people under the law, a person is in charge of them legally. That person is responsible for what the AI consumes.

Similarly, I suspect there are already major cases working their way through the system regarding intentional copyright infringement due to AIs being trained on copyrighted material. Again, it doesn't matter how the AI learned; someone is ultimately responsible for the AI.

Replies:   Grey Wolf
Grey Wolf

@Dicrostonyx

If a given AI has been trained using illegal content, then the person who fed that content into the AI, and possibly their whole company, might be charged under distribution laws.

That depends. Existing distribution laws require actual knowledge of the content, not merely suspicion (and there is no 'they should have known' exception, either).

It's pretty certain that LAION-5B contained some amount of alleged CSAM, but no one is certain what that CSAM might be, and it might not be illegal. All that is known is that the checksum hash values of some of LAION-5B's images match the hash values of images in a CSAM database. No one has the images handy (for fairly obvious reasons, in the case of the database).

Mind you, LAION-5B is a set of links and text descriptions. It 'contains' no images. Those who made it available never viewed the images - the descriptions were created by AI. Those who used it never viewed the images. There were almost six billion images in LAION-5B. How could anyone possibly have audited them all, especially a small nonprofit?
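For anyone curious what that hash matching amounts to in practice, here's a minimal sketch (my own illustration, not the actual audit tooling; the file names are made up): compute a checksum for each image file and flag any whose hash appears in a supplied list of known hashes. Real screening relies on perceptual hashes such as PhotoDNA provided by clearinghouses; a plain MD5 comparison like this only catches exact byte-for-byte copies.

# Minimal sketch, assuming a local folder of images and a text file of known hex digests.
import hashlib
from pathlib import Path

def md5_of_file(path: Path) -> str:
    # Hash the file in chunks so large images don't need to fit in memory.
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(image_dir: str, known_hashes_file: str) -> list[Path]:
    # Return every file whose checksum appears in the known-hash list.
    known = {line.strip().lower() for line in open(known_hashes_file) if line.strip()}
    return [p for p in Path(image_dir).iterdir()
            if p.is_file() and md5_of_file(p) in known]

The point is that the auditors only ever compare digests; nothing in that process requires anyone to view, or even possess, the flagged images.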

I can see this logic applying even if the AI was simply hooked up to a spiderbot and instructed to find images. Given that the AIs are not considered people under the law, a person is in charge of them legally. That person is responsible for what the AI consumes.

Again, the law requires 'knowingly'. No one 'knowingly' did anything.

Similarly, I suspect there are already major cases working their way through the system regarding intentional copyright infringement due to AIs being trained on copyrighted material. Again, it doesn't matter how the AI learned; someone is ultimately responsible for the AI.

There are. The odds are high that it'll be found to be fair use, but if not, no one is discussing criminal liability, only licensing agreements.

tenyari

@LupusDei

In Japan, there's a whole genre where the core advertising number of a pornstar's bio is her weight (the less the better) next to height (same).

Isn't the age of 'consent' also shockingly low in Japan...

AI tools are pretty good at 'realism' filters also - taking a cartoon to a realistic image and back. So if you get something that was a cartoon in Japan, it can figure out what a realistic version of that would look like, and other filters can alter the ethnicity.

That noted, most of the stuff I've seen in AI training galleries is either American or Chinese. The "Japanese stuff" is usually actually Chinese, even the hentai / XXX. And Chinese stuff might sexualize women a LOT, but not children.

Replies:   Dominions Son
Dominions Son

@tenyari

Isn't the age of 'consent' also shockingly low in Japan...

Not any more.

https://indianexpress.com/article/explained/explained-law/japan-age-of-consent-13-to-16-8682662/

Apparently they raised it to 16 last year (it was 13), but even 13 wasn't the lowest in the world.

It's complicated in Mexico. There are basically two ages of consent in each of the Mexican states: unrestricted at 18, and a second, lower age below which sex is always illegal. Between the two, sex may or may not be illegal depending on factors other than age. The lower cutoff in some Mexican states is as low as 12.

https://en.wikipedia.org/wiki/Ages_of_consent_in_North_America

Replies:   Michael Loucks
Michael Loucks

@Dominions Son

Apparently they raised it to 16 last year (it was 13), but even 13 wasn't the lowest in the world.

It was a bit more complicated, as intercourse was technically illegal for anyone under eighteen except in 'legitimate' romantic relationships (not clearly defined, but effectively if neither set of parents objected).

Replies:   Dicrostonyx
Dicrostonyx

@Michael Loucks

not clearly defined, but effectively if neither set of parents objected

To be fair, this is how it worked in Canada in the 90s.

Age of consent was 14 until May 2008 when it was increased to 16. However, there was legal precedent establishing that (a) the parents of the minor could file charges if the child was under 16, and (b) it was illegal to lie to or manipulate anyone 14 - 17 for the purposes of having sex. The second also applies to giving alcohol, for example, which was illegal anyway but was treated as exacerbating a sexual misdemeanour.

Joe Long

@DBActive

The AI knows what 13yo girls look like and it knows what a naked adult female looks like. From there it's just a matter of interpretation.

Grey Wolf

@Switch Blayde

There's yet another one, so I'll add it here:

https://www.theverge.com/2024/3/8/24094633/deepfake-ai-explicit-images-florida-teenagers-arrested

This is a criminal case with third-degree felony charges under Florida law. It's likely to be a very interesting case to follow.

Replies:   Dominions Son
Dominions Son

@Grey Wolf

Somewhat less interesting if you look at the law they were charged under.

From your link:

They were charged with third-degree felonies under a 2022 Florida law that criminalizes the dissemination of deepfake sexually explicit images without the victim's consent.

This has nothing at all to do with virtual child pornography. The fact that the victims were minors is irrelevant to the state law under which the suspects were charged.

The fact that the images were AI generated rather than manual photo chops also appears to be irrelevant to the law under which they were charged.

Replies:   Grey Wolf  DBActive
Grey Wolf

@Dominions Son

This seems very similar to the case that Switch Blayde quoted:

They created and shared images that superimposed pictures of real students' faces onto simulated nude bodies generated by artificial intelligence.

In both cases, it's an AI-abetted deepfake. The original article mentions 'child pornography,' but only in a 'the law doesn't apply' sense.

One is an expulsion, the other is criminal charges, but both are about AI deepfakes of real people, not 'virtual child pornography' in particular.

The California law might actually apply - it's arguable that it would cover an AI-generated image that used an actual person's image as an input. If they just generated a million images at random and stopped when one looked like a classmate, that would be different. In either case, as the original article states, it would be a difficulty for prosecutors.

Replies:   Dominions Son
Dominions Son

@Grey Wolf

In both cases, it's an AI-abetted deepfake. The original article mentions 'child pornography,' but only in a 'the law doesn't apply' sense.

In both cases, it is likely that AI being involved is irrelevant to the law.

AI deepfakes panic.

BS. A purely manual photo chop by a human would likely also violate the law in exactly the same way.

Replies:   Grey Wolf
Grey Wolf

@Dominions Son

Oh, I agree with you there, except that it is considerably easier for the average person to make a photorealistic output using AI than with Photoshop. Despite Adobe's claims, 'old school' Photoshop isn't that easy, and 'newfangled' Photoshop is AI.

Also, a 'photo chop' with 12-year-old heads and 18-year-old bodies is probably a Frankenstein mess. Using real 12-year-old bodies makes it very questionable. Not necessarily 'child pornography', because mere nudity is not always considered CSAM, but it's an entirely different thing from using AI to generate a fake 12-year-old body.

DBActive

@Dominions Son

The fact that the images were AI generated rather than manual photo chops also appears to be irrelevant to the law under which they were charged.

That came before SCOTUS in 2002. SCOTUS explicitly rejected it saying they weren't going to expand the first amendment exception beyond material produced using real children.

Given some of the discussion in this thread, I thought it would be good to clarify that this does not mean that putting a real child's head and face on a "legal" body is protected, however the body is generated. That's child porn under the PROTECT Act.

Replies:   Dominions Son
Dominions Son

@DBActive

That's child porn under the PROTECT Act.

https://www.lee.senate.gov/services/files/6272B879-C153-4DBB-BD5D-11A5073006C4

WHAT THE BILL DOES
First, this law requires websites to verify the age and identity of individuals uploading pornographic images. Failure to comply with this section would result in a $10,000 civil penalty.

Second, this law requires pornography sites to obtain signed consent forms verifying the age and identity of any individual appearing in uploaded content. Any image uploaded without the consent of the individual appearing therein must be removed from the platform within 72 hours. Failure to follow these regulations would result in a civil penalty of up to $10,000 per day per image.

Third, this law criminalizes knowing publication of pornographic images to websites with knowledge or reckless disregard for the lack of consent and the reasonable expectation that the images would not be published. This is commonly referred to as revenge porn. Individuals appearing in these images, or their legally authorized representative, may request that these images be removed from the platforms. If the images are not removed within 72 hours of the request, the information content provider of the image(s) will be subject to a criminal penalty and the victim may seek damages against them.

Overall, this law would require information content providers to reasonably engage in the fight against human trafficking by eliminating the ability of traffickers to monetize victim content on pornographic platforms and provide victims with a mechanism to have nonconsensual content removed.

If you have a better source for what's in the act, I'd be interested in seeing it, but this doesn't indicate that the act does anything to change the definition of child pornography in any way.

What it does do is force websites that host third party pornographic video/photos to do more to verify the ages and identities of performers in material posted to their sites.

Replies:   Grey Wolf
Grey Wolf

@Dominions Son

My suspicion was that DBActive was referring to the PROTECT Act of 2003, not Mike Lee's PROTECT Act of 2024.

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

DBActive may be referring to:

Prohibits computer-generated child pornography when "(B) such visual depiction is a computer image or computer-generated image that is, or appears virtually indistinguishable from that of a minor engaging in sexually explicit conduct"; (as amended by 1466A for Section 2256(8)(B) of title 18, United States Code).

That applies to obscenity (which Miller does not cover). It would not apply to a 'real child face / fake body' case if there was no sexual activity, 'lascivious exhibition of the genitals or pubic area', etc. It wouldn't matter that the child themselves wasn't doing anything sexual - the image as a whole represents a child engaged in sexual activity, which is illegal.

Note that the PROTECT Act of 2003 has been tested in court with respect to traditional animation (so, not photorealistic) of children engaged in sexual activity. Since that made it obscene under Miller (and thus not covered by Ashcroft), the person was convicted. As the Wikipedia entry notes:

Obscenity, including obscene depictions of children, either virtual or real, is unprotected speech

There is some belief that parts of that act remain unconstitutional, and prosecutors have made plea deals to avoid raising those constitutional challenges, but that's hypothetical.

Replies:   Dominions Son  DBActive
Dominions Son
Updated:

@Grey Wolf

Note that the PROTECT Act of 2003 has been tested in court with respect to traditional animation (so, not photorealistic) of children engaged in sexual activity. Since that made it obscene under Miller (and thus not covered by Ashcroft), the person was convicted. As the Wikipedia entry notes:

I'd want to see the actual cases. I would be willing to bet that there was more going on than just children engaged in sexual activities.

I am aware of obscenity prosecutions for stories involving sex with children. However, the cases I am aware of involved violent rape and torture of prepubescent children, not just sex with children.

My understanding is that just sex with minors, especially teens, would not automatically qualify as obscene under the Miller test.

The Miller case (1973) was almost a decade before the court carved out a separate 1A exemption for child pornography.

https://en.wikipedia.org/wiki/Child_pornography_laws_in_the_United_States

Child pornography is also not protected by the First Amendment, but importantly, for different reasons. In 1982 the Supreme Court held in New York v. Ferber that child pornography, even if not obscene, is not protected speech. The court gave a number of justifications why child pornography should not be protected, including that the government has a compelling interest in safeguarding the physical and psychological well-being of minors.

The holding in New York v. Ferber would have been unnecessary if anything depicting sex involving minors was automatically obscene under the Miller test.

Replies:   Grey Wolf
Grey Wolf

@Dominions Son

I'd want to see the actual cases. I would be willing to bet that there was more going on than just children engaged in sexual activities.

From Wikipedia (yes, not the best source, but it covers a fair bit of what you're asking):

The first conviction of a person found to have violated the sections of the act relating to virtual child pornography was Dwight Whorley of Virginia, who used computers at the Virginia Employment Commission to download "Japanese anime style cartoons of children engaged in explicit sexual conduct with adults" alleged to depict "children engaged in explicit sexual conduct with adults". He was charged with 19 counts of "knowingly receiving" child pornography for printing out two cartoons and viewing others. His conviction was upheld in a 2โ€“1 panel decision of the Fourth Circuit Court of Appeals in December 2008. This decision was consistent with the U.S. Supreme Court ruling in Ashcroft v. Free Speech Coalition in which the Supreme Court held that virtual child pornography was protected free speech, provided that the virtual depictions are not obscene. Obscenity, including obscene depictions of children, either virtual or real, is unprotected speech. (Whorley was also previously convicted of offenses in connection with pornographic depictions of real children.)

The case is: https://caselaw.findlaw.com/court/us-4th-circuit/1431669.html

There are some potentially disturbing things in the Fourth Circuit opinion in terms of e.g. SoL:

We also reject his arguments that textual matter cannot be obscene under § 1462 and that cartoons depicting minors in sexually explicit conduct must depict real-life minors to violate § 1466A(a)(1). Finally, we reject his challenges to the district court's procedural rulings and his sentence. Accordingly, we affirm.

Note the part about textual material. Most SoL stories would pass Miller with no problem, but an 'All Sex' might not.

The case clearly involves only textual and cartoon material, but yes, you are correct that there is violence, rape, etc. The point is more that it's clear that visual and textual material which do not involve actual children can be held obscene and can be punished under existing law, at some level of obscenity (yet to be determined).

I agree that not everything depicting sex involving minors is obscene, but the 2003 PROTECT act effectively defines it that way, and (to my knowledge - could be wrong) that has not been tested in court.

Replies:   Dominions Son
Dominions Son

@Grey Wolf

From Wikipedia (yes, not the best source, but it covers a fair bit of what you're asking):

No, it doesn't cover even a tiny bit what I'm asking.

"Japanese anime style cartoons of children engaged in explicit sexual conduct with adults" alleged to depict "children engaged in explicit sexual conduct with adults"

Addressing my question even a little would require a description of the exact nature of the sexual conduct. Just saying 'sexual conduct' does not in any way address my question.

A jury would have seen the actual cartoons in question.

DBActive

@Grey Wolf

Read the provisions of the PROTECT Act of 2003 relating to both pandering and alteration of the image of a child.
https://www.law.cornell.edu/uscode/text/18/2252A
Also read US vs Williams (2008) upholding the pandering provisions of the act.
Finally, read Ashcroft vs Free Speech Coalition and the reference to use of a real child's altered image in porn. The Court did not rule on the constitutional validity of that provision.
After Williams, I believe it would be impossible to distribute any computer-generated or altered image of a child engaged in the prohibited acts without violating the pandering provision.
As to juvenile offenders - federal prosecutors and courts are not really set up to deal with them and, except in the most extreme circumstances, will leave them to the states.

Replies:   Grey Wolf
Grey Wolf

@DBActive

There's an important caveat in US vs Williams:

The Court further stated that 18 U.S.C. ยง 2252A(a)(3)(B) would not be construed to punish the solicitation or offering of "virtual" (computer generated/animated) child pornography, thus comporting with the holding of Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002).

The Court stated that "an offer to provide or request to receive virtual child pornography is not prohibited by the statute. A crime is committed only when the speaker believes or intends the listener to believe that the subject of the proposed transaction depicts real children. It is simply not true that this means 'a protected category of expression [will] inevitably be suppressed,' post, at 13. Simulated child pornography will be as available as ever."

Thus, computer generated/animated CSAM is likely still covered by Ashcroft unless a court finds it to be obscene (in which case Ashcroft would not necessarily apply). Williams explicitly does not change that, and the PROTECT act of 2003 thus does not bar virtual CSAM under the pandering provision, per current precedent.

Replies:   DBActive
DBActive

@Grey Wolf

Maybe, I doubt it, but maybe the person actually creating the image would say, "These are not real children." And maybe someone would believe that. But as the image gets passed down the line, that disclaimer (as if it was ever believable) would be lost.
That leaves aside the fact that these images are traded for other images. What person with real images would trade them for fakes?
That still leaves the issue of altered images of real children. Those are still illegal under the CPPA.

Replies:   Grey Wolf
Grey Wolf
Updated:

@DBActive

Metadata is the answer to your first point. Tag the image with the metadata used to recreate it algorithmically. Done.
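To make that concrete, here's a minimal sketch of what "tag the image with its generation metadata" could look like, using Pillow to write the prompt, seed, and model name into PNG text chunks (Stable Diffusion front ends do something similar with a "parameters" field). The field names and file paths here are just assumptions for the example, not any particular tool's format.

# Minimal sketch: embed and read back the parameters needed to regenerate an image.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def tag_generated_image(src: str, dst: str, prompt: str, seed: int, model: str) -> None:
    # Re-save the image as a PNG carrying its generation parameters as text chunks.
    img = Image.open(src)
    meta = PngInfo()
    meta.add_text("generator", model)
    meta.add_text("prompt", prompt)
    meta.add_text("seed", str(seed))
    img.save(dst, pnginfo=meta)

def read_tags(path: str) -> dict:
    # Read the embedded generation metadata back out of the PNG.
    return dict(Image.open(path).text)

Anyone holding such a file could point at the embedded parameters to show it was machine-generated; of course, metadata can also be stripped or forged, so it's evidence of provenance, not proof.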

For the second point, if one need merely send the prompt and setup, no images need be traded. And if one cannot be prosecuted for virtual images, who would trade for real images? Yes, that's a somewhat hyperbolic statement. There are people who would only be satisfied by real images. But it quite likely creates a major divide in which a large number of people who don't want children to be harmed can opt out of it, while those who do wish children to be harmed essentially wave their hands in the air and acknowledge that. That, in turn, tells people whose concern is the protection of children which body of people to focus on and removes a large source of distraction: people who aren't a threat to children and whose prosecution doesn't make children safer.

On your third point, yes, if it's primarily an image of a real child. 'Child head on adult body' where the body 'looks young' but is provably adult, especially where there's no obscenity, seems very likely to be covered under Ashcroft. It may run afoul of 'deepfake' laws, though.

Part of the text of the PROTECT act of 2003 fits exactly your first point (that everyone will claim it's virtual, and it'll be a bottleneck to prosecutions to force the government to prove that it's not). Nevertheless, the Williams court explicitly, in so many words, stated that the 'pandering' provisions do not apply to virtual child pornography. If they're not going to apply the pandering provisions, why would they apply possession or distribution?

Repeating for emphasis:

A crime is committed only when the speaker believes or intends the listener to believe that the subject of the proposed transaction depicts real children.

That's a pretty strong statement.

Replies:   DBActive
DBActive
Updated:

@Grey Wolf

Part of the text of the PROTECT act of 2003 fits exactly your first point (that everyone will claim it's virtual, and it'll be a bottleneck to prosecutions to force the government to prove that it's not). Nevertheless, the Williams court explicitly, in so many words, stated that the 'pandering' provisions do not apply to virtual child pornography. If they're not going to apply the pandering provisions, why would they apply possession or distribution?

Repeating for emphasis:

A crime is committed only when the speaker believes or intends the listener to believe that the subject of the proposed transaction depicts real children.

That's a pretty strong statement.

But the image (a child's face on a virtual body) does depict a real child.

Ashcroft specifically did not deal with this issue.

Replies:   Grey Wolf
Grey Wolf

@DBActive

But the image (a child's face on a virtual body) does depict a real child.

I disagree. The images depict something which is definitely not a real child. It's part of a real child and part not, which makes it not a real child.

But, even if I grant that point for argument, that's still not responsive to the quoted text. The quoted text says that either 'the speaker believes' or 'intends the listener to believe':

The speaker does not believe the subject depicts real children.
The speaker does not intend the listener to believe the subject depicts real children.

Thus, there is, per the Williams court, no crime.

Replies:   DBActive
DBActive
Updated:

@Grey Wolf

The Williams decision on the pandering provision would not help the image creator, distributor or holder. An actual child is depicted in a sexually explicit manner.

The point of the law is to protect children. Creation and distribution of realistic images of a child in a sexually explicit situation harms the child depicted.

That image is designed to appear as if the child is doing something sexual.

Aside from the fact that the only purpose of the creation is to deceive someone into believing that the child is doing something sexual (Williams), the existence and distribution of the image is inherently harmful to the child.

Based on all the cases I cannot believe the courts would give those images a pass.

Grey Wolf

@DBActive

An actual child is depicted in a sexually explicit manner.

There is no 'actual child' here. That was my entire point. By definition, this is not a depiction of a real child engaged in the behavior shown. Everyone knows that going in. That's a fact.

'The point of the law is to protect children. Creation and distribution of realistic images of a child in a sexually explicit situation harms the child depicted.'

Only if the image is of a specific child and is believed to be that child. Otherwise, no. What you're arguing for here is a 'deepfake' law that doesn't exist and, even if it did, would function differently from nearly all 'deepfake' laws.

I hate to keep banging the same drum, but you're not hearing it. The Williams court was extremely explicit: if the speaker does not believe a real child is depicted (and they clearly do not, since the image is factually not the image of a real child) and the recipient is notified that this is not the image of a real child, there is no crime. Period.

Based on all the cases I cannot believe the courts would give those images a pass.

You are welcome to believe that, just as people for years believed some court, somewhere, would overturn Roe, and eventually one did. But, prior to that, the state of the law was what it was. As of right now, we have a clear statement of the law, and it is that the 'pandering' rule does not apply when neither the distributor nor the recipient believes the image is that of a real child.

Dropping children from this, because they honestly are irrelevant to the analogy: consider the recent Taylor Swift (TS) deepfake mess.

If:
1) The poster says 'These are actual images of TS! Look what she's doing!' that may be actionable under some 'deepfake' laws, and there may (or may not) be grounds to sue under other laws.

2) The poster says nothing, just puts out the images, that is less likely to be actionable under either (but non-zero).

3) The poster says 'These are not images of TS, TS did not do these things, and this is just a parody', very few 'deepfake' laws apply and the other laws also very likely don't apply.

What you seem to be arguing is that, even though the poster says 'this is not X', and the recipient says 'this is not X', and both believe that, and also the image factually is not X, still somehow the image is X. That makes no sense, and I would be very worried if any court were to make such a ruling, because once a court rules that an image no one believes to be real, and that is also factually not real, is nevertheless to be treated legally as if it were real, all bets are off.

awnlee jawking

@DBActive

Creation and distribution of realistic images of a child in a sexually explicit situation harms the child depicted.

The man on the Clapham Omnibus certainly wouldn't want pictures of his kids used in that way. I believe that in the UK, privacy laws mean the child's face cannot be used unpixelated without permission.

AJ

DBActive

@Switch Blayde

You're wrong. The statute was amended in 2022 to specifically deal with computer generated child porn.

Dominions Son

@DBActive

The statute was amended in 2022 to specifically deal with computer generated child porn.

And will likely be overturned when it gets to SCOTUS. They've already rejected extending the first amendment exception for child pornography to cover "virtual child porn".

Replies:   Grey Wolf
Grey Wolf

@Dominions Son

Very different SCOTUS, though. I have no confidence that the current court will respect precedent in general. But then, nor do I think they would necessarily overturn it; I just don't see a good way to predict this one this far out.

Switch Blayde

@DBActive

You're wrong. The statute was amended in 2022

Not me. The L.A. Times. I copied from their article dated March 7, 2024.

Joe Long

@Switch Blayde

Can simple nudity be classified as porn? I would think it needs to portray sex. I can see butts, boobs and even an occasional limp dick on broadcast tv.

AI can also be considered artwork. Would it be illegal for me to use paper and pencil to draw a nude depiction of a real person, using a photo as a model?

Replies:   Switch Blayde  Grey Wolf
Switch Blayde

@Joe Long

Can simple nudity be classified as porn? I would think it needs to portray sex.

I remember years ago a guy was charged with child porn because he had a nude photo of his kid in the bathtub or something like that. I don't know if it went to trial or, if it did, if he was convicted. I doubt it.

But the reason for my post wasn't about the law. It was about using AI.

Grey Wolf

@Joe Long

Can simple nudity be classified as porn?

In the US, 'simple nudity' is generally not 'porn' unless the pose is considered so sexualized as to make the photo one taken for erotic purposes. Photos of people (adults, kids, both) just standing around, playing, eating, whatever: pretty much legal. Someone (regardless of age) lying back on a bed, legs spread suggestively: probably porn.

For the quintessential example involving children, check the work of David Hamilton. His photography of naked children is legal in the US.

It's likely illegal in Canada, on the other hand.

Replies:   Dicrostonyx
Dicrostonyx

@Grey Wolf

As far as I know, there has never been a case of anyone being charged for ownership of Hamilton's books in Canada, though I haven't done a deep search. There have been two cases in the UK, though.

Case 1 (2005) involved a man with over 19,000 digital images of children including images from Hamilton. His defence was that all images were purchased from legitimate sites including WH Smiths and Waterstones, but charges were filed and he pleaded guilty.

Case 2 (2010) involved a man charged with possession of 4 physical books. The case was overturned on appeal, and the judge blasted the Crown Prosecution Service, saying that if they wanted to test the legality of the images they needed to go after publishers, not individuals.

Replies:   awnlee jawking
awnlee jawking

@Dicrostonyx

I know of Brits who purchased Hamilton books from Amazon.co.uk. It would be weird if that were illegal.

AJ

Replies:   DBActive
DBActive

@awnlee jawking

Anyone who has a collection of Playboy (there are a lot of people who collected every issue) or Penthouse has a fairly large number of nude, often sexually provocative, images of underage girls, including by Hamilton. I doubt anyone is interested in prosecuting them. It's interesting that a number of the young models Hamilton used made credible claims of sexual abuse.
As was stated earlier, nude images of children do not violate US law unless lascivious, and they are available on the web, in bookstores, and in libraries. As far as I can tell, any attempts to prosecute the photographers of "artistic" nudes have failed.

Replies:   Grey Wolf
Grey Wolf

@DBActive

There are also libraries which have every issue; no one has gone after them.

The flip side of this is that some attempts have been made to go after people both for sending and for receiving naked selfies of teenagers. By sending, I don't mean resending - there have at least been allegations that a person sending a naked selfie of themself commits a crime in so doing, and that having that picture on their phone is itself a crime even if it was never sent.

I suspect they won't hold up in court, but it's a place where the law is running headlong into 'unexpected consequences'. Redistributing a selfie without consent may be both wrong and possibly criminal, but it's not what CSAM laws were designed to deal with, and having a selfie of oneself is very much something those laws did not foresee.

Replies:   DBActive
DBActive

@Grey Wolf

The cases I have on teen and preteen "porn" homemade by themselves or other teens and preteens all involved more than nudity. All involved lascivious images - focus on genitals, masturbation, oral sex, and so on. All of them have been dealt with as juveniles.
The sharing of nudes alone has led to expulsion from the school or other non-judicial intervention.
That may not be true in other parts of the country.

Replies:   Grey Wolf
Grey Wolf

@DBActive

There have been cases of minors being placed on the sex offender registry for sharing nude selfies of themselves or others, though so far they've all been juvenile offenses. However, in at least one of the cases, the offender will be listed as a sex offender into his 40s.

There's a case where two girls wearing bras swapped selfies and were threatened with prosecution (but not prosecuted).

No dispute that there are legitimate and reasonable consequences. Expulsion seems reasonable in limited circumstances (non-consensual sharing, for instance, which might also warrant criminal and civil penalties).

Trying them for 'child porn', though (even when prosecuted as a juvenile), or putting someone on a sex offender registry for decades, seems excessive.

Replies:   Dominions Son
Dominions Son

@Grey Wolf

There have been cases of minors being placed on the sex offender registry for sharing nude selfies of themselves or others,

There are people on sex offender registries in some US states for urinating in public.

Joe Long

@Switch Blayde

They were charged with third-degree felonies under a 2022 Florida law that criminalizes the dissemination of deepfake sexually explicit images without the victim's consent.

So it's OK as long as you don't share it with anyone?

I've dabbled with one site where you can submit a pic and it very realistically reproduces the face, but it won't generate a body that could be considered under 18. But really, there's very little difference between 14 & 18.

Replies:   Dominions Son  Grey Wolf
Dominions Son

@Joe Long

So it's OK as long as you don't share it with anyone?

It doesn't violate that one specific law if you don't share it. It could theoretically violate other laws.

But really, there's very little difference between 14 & 18.

The age of the victims appears to be irrelevant to the Florida law.

Grey Wolf

@Joe Long

Most 'deepfake' legislation requires the image to be distributed, and quite a bit of it requires some level of misrepresentation of the image as being that of the target.

That's not true of all of the laws, but for most of them, keeping the images to yourself is legally fine. Similarly, distributing the images while saying 'These are definitely not X, Y, or Z, nope, no way, not them at all' would also steer clear of the law.

DBActive

@Switch Blayde

Here's a summary of Williams:

Supreme Court upholds the pandering portion of child porn law against a First Amendment challenge

The Supreme Court reversed the Eleventh Circuit decision and upheld the constitutionality of the law by a 7-2 vote.

Writing for the majority, Justice Antonin Scalia explained that "offers to engage in illegal transactions are categorically excluded from First Amendment protection." He added that "offers to provide or requests to obtain child pornography are [also] categorically excluded from the First Amendment."

To Scalia it did not matter that someone could be criminally charged for bragging that they had child pornography when in actuality they did not possess such material. He analogized situations in which individuals may offer to sell illegal drugs even when they do not possess such contraband: "Offers to deal in illegal products or otherwise engage in illegal activity do not acquire First Amendment protection when the offeror is mistaken about the factual predicate of his offer."

Scalia rejects argument that law was overly broad

Scalia also emphasized that a successful overbreadth challenge requires a litigant to show that the law is substantially overbroad - an effort he said that Williams failed to meet. A litigant must show more than "an endless stream of fanciful hypotheticals." Justice John Paul Stevens - joined by Stephen G. Breyer - wrote a short concurrence, emphasizing that the law should be interpreted to apply to defendants who act with a lascivious purpose.

Justice David H. Souter - joined by Ruth Bader Ginsburg - dissented. He criticized the majority for "undermin[ing] Ferber and Free Speech Coalition."

Souter reasoned that the PROTECT Act criminalized the very type of expression protected in Free Speech Coalition - virtual child pornography that did not involve the actual use of minors or harm to children.

"The tension with existing constitutional law is obvious," Souter wrote. "Free Speech Coalition reaffirmed that nonobscene virtual pornographic images are protected, because they fail to trigger the concern for child safety that disentitles child pornography to First Amendment protection." He also warned that the decision "might have an unsettling significance well beyond the subject of child pornography."

In Free Speech Coalition, the Court had ruled that virtual pornography did not involve the same harms to children as pornography involving real children. This distinguished it from New York v. Ferber (1982), in which the Court had justified a child pornography exception to the First Amendment's broad free speech protections based on actual harm to children.

LupusDei
Updated:

@Switch Blayde

Behind a Secretive Global Network of Non-Consensual Deepfake Pornography

Clothoff, DrawNudes, Nudify, and Undress all allow users to "undress" photos of any woman using AI, generating non-consensual intimate images of them. In doing so, they have racked up tens of millions of website visits in just the past four months. In the dozens of Telegram groups we found set up by these websites, some had as many as 800,000 followers.

...just leaving it here, someone may be intrigued. The link is to Bellingcat. The article is mostly concerned with sneaky ways the payments are concealed.
