A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led the uniform-wearing McCorkle out of the theater in handcuffs.
I mostly agree with you, but a counterpoint:
Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I’ve read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn’t enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.
CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model trained on non-abusive content probably isn't exploitative, but there's a legitimate question as to whether we as a society decide it's associated closely enough with real-world harms that it should be banned.
Not an easy question for sure, and it’s one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.
The problem is that the empirical data cannot be gathered morally or ethically. You can't show a bunch of people porn and then make a statistical observation of whether those shown child porn are more likely to assault children. So we have to go forward without that data.
I will anecdotally observe that anal sex, oral sex, and facials have become more common between partners as their prevalence in porn has gone up. That suggests, but does not prove, a direct statistical harm caused by even "ethically produced CSAM."
Can we look at trends between consenting adults (who are likely watching porn of real people by the way) as an indicator of what pedophiles will do? I’m not so sure. It’s not like step sibling sex is suddenly through the roof now with it being the “trend” in porn.
Looking specifically at fake rape porn and seeing whether it increases rates of rape in the real world might be a better indicator.
That’s fair. I tried to make clear that my interpretation is not in any way scientific or authoritative. Better correlations are probably possible.
ETA on further thought: I wonder if prevalence of incest porn has had an effect on actual incest rates. That might be a much closer correlation due to the similar social taboo. But I’m not sure we have good data on that, either.
Do you think people like Andrew Tate have caused more rapes to occur? Like do you think his rhetoric encourages a rapist mindset in his listeners?
True, it wouldn’t be ethical to conduct an experiment, but we can (and probably do) collect lots of observational data that can provide meaningful insight. People are arrested at all stages of CSAM-related offenses, from mere possession to distribution, solicitation, and active abuse.
While observational data and correlations are inherently weaker than experimental data, they can at least provide some insight. For example, "what percentage of those only in possession of artificially generated CSAM for at least one year go on to solicit minors" vs. those in possession of "real" CSAM.
If it seems that artificial CSAM is associated with a lower rate of solicitation, or if it ends up decreasing overall demand for “real” CSAM, then keeping it legal might provide a real net benefit to society and its most vulnerable even if it’s pretty icky.
That said, I have a nagging suspicion that the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all. There’s also the risk that artificial CSAM reduces the taboo of CSAM and can be an on-ramp to more harmful materials for those with pedophilic tendencies that they otherwise are able to suppress. But it’s still way too early to know either way.
Perhaps. But what about when they can’t tell the difference between real and virtual? It seems like the allure of all pornography is the fantasy, rather than the reality. That is, you may enjoy extreme BDSM pornography, and enjoy seeing a person flogged until they’re bleeding, or seeing needles slowly forced through their penis, but do you really care that it’s a real person who’s going to end the scene, take a shower, and go watch a few episodes of “The Good Place” with their dog before bed? Or is it about the power fantasy that you’re constructing in your head about that scene? How important is the reality of the scene, versus being able to suspend your disbelief long enough to get sexual gratification from it? If the whole scene were done with really good practical effects and CG, would your experience as a user, even if you were aware, be different?
I think it would be ethical for researchers to go onto the boards of these already-existing CP distribution forums and conduct surveys. But then the surveyors would be morally obligated to report that board to the authorities to get it shut down. Which means that no one would ever answer surveyor questions because they knew the board would be shut down soon so they’d just find a new site ugh…
Yeah nvm I don’t see any way around this one
To expound on this: prior to this point, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harms have occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm (that is, children are not being sexually abused in order to create the drawings) then how much different is a computer-generated image that isn’t based off any specific person or event? It seems to me that whether or not a pedophile might eventually decide they want more than AI-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.
Allow me to float a second possibility that will certainly be less popular.
Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile (a person exclusively sexually attracted to pre-pubescent children) does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic, though, and that many child molesters are opportunistic sexual predators rather than pedophiles.) Second, rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would have is: would wide availability of AI-generated CSAM, CSAM that didn’t cause any real, direct harm to children, actually decrease rates of child sexual assault?
With regards to your last paragraph: pedophiles can indeed be straight, gay, or bi. Pedophiles may also never become molesters, and molesters of children may not be pedophilic at all. It seems you understand this. I mentioned ITT that I read a newspaper article many years ago that was commissioned to show that access to CP would increase child abuse; it seemed to show the opposite.
If persons could use AI to generate porn of their own personal fantasies (whatever those might be) and NOT share that content, what then? Canada allows this for text (maybe certain visuals? Audio? IDK). I don’t know about current ‘obscenity’ laws in the USA; however, I do recall reading about an art exhibit in NY which featured an upside-down urinal that was deemed obscene, then later deemed a work of art. I also recall seeing (via an internet image) a sculpture of what seemed to be a circle of children with penises as noses. Porn? Art? Comedy?
My understanding was that ‘pure’ pedophiles (ones that have no interest at all in post-pubescent children or any adults whatsoever) tend to be less concerned with sex/gender, particularly because children don’t have defined secondary sex characteristics. I don’t know if this is actually correct though. I’m not even sure how you could ethically research that kind of thing and end up with valid results.
And honestly, not being able to do solid research with valid results makes it really fuckin’ hard to find solutions that work to prevent as many children from being harmed as possible. In the US at least, research about sex and sexuality in general (much less deviant sexualities) seems to be taboo, and very difficult to get funding for.
Hard to say. I generally agree with what you’ve said though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.
However part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that otherwise it helps people facilitate/organize distribution of real CSAM, which increases demand for it. That’s what I’ve heard at least and it makes sense to me. And I feel like that would apply to AI generated as well.
It’s obvs. very hard to get accounts of what pedophiles are doing; the only ones that you can survey are ones that have been caught, which isn’t necessarily a representative sample. I don’t think that there are any good estimates on the rate of pedophilic tendencies.
From a cursory reading, it looks like possession and distribution are both felonies. Lolicon hentai is pretty widely available online, and prosecutions appear to be very uncommon when compared to the availability. (Low priority for enforcement, probably?)
I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people who aren’t already pedophiles, though. To put it another way, I’m sure that increasing the supply of gay porn would increase consumption of gay porn, but I’m pretty sure it’s not going to make more people gay. And people who aren’t gay (or at least bi) aren’t going to be interested in gay porn, regardless of how hard up (heh) they might be for porn, as long as they have any choices at all. There’s a distinction between fetishes/paraphilias and orientations, and my impression has been that pedophilia is much more similar to an orientation than a paraphilia.
No, but allowing people to organize increases demand because then those who would want CSAM have a place to look for it and ask for it where it’s safe for them to do so, and maybe even pay for it to be created. It’s rather the other way around, the demand increases the supply if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or something like that, at all.
I guess where I come down is that, as long as no real people are being harmed (either directly, or because their likeness is being used) then I’d rather see it out in the open than hidden. At least if it’s open you have a better chance of knowing who is immediately unsafe around children, and can easily use that to exclude people from positions where they’d have ready access to children (teachers, priests, etc.).
Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around them.
Yeah I agree.
But this is like the arguments used to claim that weed is a “gateway drug”: pointing out that people strung out on harder drugs have almost always done weed as well, while ignoring everyone who uses only weed. And this case is even hazier, because we literally have no real idea how many people consume that stuff but don’t ‘escalate’.
I remember reading once, in some research out of Japan, that child molesters consume less porn overall than the average citizen. That seems counter-intuitive, but may not be if you consider the possibility that the material (in this case, they were talking primarily about manga with anime-style drawings of kids in sexual situations) is actually curbing the incidence of the ‘real thing’, since the ones actually touching kids in the real world are reading those mangas less.
I’m also reminded of people talking about sex dolls that look like kids, and if that’s a possible ‘solution’ for pedophiles, or if it would ‘egg on’ actual molestation.
I think I lean on the side of ‘satiation’, from the limited bits of idle research I’ve done here and there. And if that IS in fact the case, then regardless of if it grosses me out, I can’t in good conscience oppose something that actually reduces the number of children who actually get abused, you know?
It’s less that these materials are a “gateway” drug and more that they could be considered akin to advertising. We already have laws about advertising because it’s so effective, including around cigarettes and prescription drugs.
Second, the role that CP plays in most countries is difficult. It is used for blackmail. It is also used to generate money for countries. And it’s used as advertising for actual human trafficking organizations. Similar organizations exist for snuff and gore, btw. And of course animals. And any combination of those three. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?
So it’s important not to allow these advertisers to combine their most famous monkey torture video with enough AI that they can claim it’s AI-generated, when it’s really just an ad for their monkey torture productions. And they do that with CP, rape, gore, etc., too.
People, please don’t just downvote with no comment. Why is this being downvoted? The comparisons to advertisements have validity. And, if you disagree, be productive and tell us why.
Because a huge percentage of Lemmy is sexist and I am openly a woman. You’ll know because this comment will get nuked also.
Fellow female here. I support your right to contribute on Lemmy.
Why should that be a question at all? If it causes harm, ban it. If not, don’t. Being “associated with” should never be grounds for a legal statute.
generally a very good point, however i feel it’s important to point out some context here:
the pedophiles you’re talking about in your comment are almost always members of tight-knit communities that share CSAM, organize distribution, share sources, and most importantly, indulge their fantasies/desires together.
i would think that the correlation that leads to molestation is not primarily driven by the CSAM itself, but rather the community around it.
we clearly see this happening in other similarly structured and similarly isolated communities: nazis, incels, mass shooters, religious fanatics, etc.
the common factor in radicalization and development of extreme views in all these groups is always isolation and the community they end up joining as a result, forming a sort of parallel society with its own rules and ideals, separate from general society. over time, people in these parallel societies get used to seeing the world in a way that aligns with the ideals of the group.
nazis start to see anyone not part of their group as enemies, incels start to see “females” instead of women, religious fanatics see sinners…and pedophiles see objects that exist solely for their gratification instead of kids…
I don’t see why molesters should be any different in this aspect, and would therefore argue that it’s the communal aspect that should probably be the target of the law, i.e.: distribution and organization (forums, chatrooms, etc.)
the harder it is for them to organize, the less likely these groups are to produce predators that cause real harm!
if on top of that there is a legally available outlet where they can indulge themselves in a safe manner without harming anyone, i’d expect rates of child molestation to drop significantly, because, again, there’s precedent from similar situations (overdoses in drug addicts, for example)
i think it is a potentially fatal mistake to think of pedophiles as “special” cases, rather than just another group of outcasts, because in nearly all cases of such pariahs the solutions that prove to work best in the real world are the ones that make these groups feel less like outcasts, which limits avenues of radicalization.
i thought these parallels are something worth pointing out.