Hi Aella. I'm a fan not because of your online activities, but because I appreciate intelligent, open-minded women who don't blindly succumb to societal norms.
I understand where you're coming from and believe your intention is good. But I agree with the critics on this one (a rare occurrence): this is actually a bad idea. And it mostly revolves around this section: People Want To Try Out Things They See In Porn.
Flooding the Internet with it, or even just making it legal and widely available, would in fact normalize it, which would in turn make it more prevalent. Individuals who would never have had an interest in it or watched it before would end up doing so. How many can be debated, but you'd have a very difficult time convincing anybody that there would not be a noticeable quantity. And this effect has been noticed before.
I've been involved in the BDSM community and used to be very active in local groups. As BDSM became more popular in media and porn, more people started showing up. But it was also extremely obvious that more people were engaging in it on their own in private due to the increased popularity and exposure. This was evident because porn and media often depict BDSM very, very wrong and/or poorly. They often show types of play that are considered edge play and are dangerous - things that people actually in the community put great effort into learning how to do safely, if they engage in them at all. But random people who just saw it done in popular media and porn started trying it at home without any knowledge or clue of the dangers. You know what they say: "don't try this at home" - well, it applies to BDSM too if you don't have the skills/experience. This more prevalent engagement by inexperienced people obviously led to problems and injuries, and that in turn led to blowback on the BDSM community, because society does not like things outside the norm and will blame everything it can on anyone it can.
The person you quote also mentions choking and anal play. I can also attest from personal experience that those have indeed become much more normalized and common over the last couple of decades due to their regular inclusion in porn.
Also, about your reference to gay porn: straight guys don't watch gay porn, and they don't try to masturbate to it either. But I'd bet very good money that if 33% of all "straight" porn actually had bi-male (aka gay) content mixed in, and the guys watching it could not skip it for whatever reason, then over a period of a couple of decades you'd end up seeing a whole lot more bisexual or bi-curious guys. Exposure does normalize things, and repeated exposure, particularly during arousal, will form mental connections over long periods of time.
I even have some personal experience with this. There is something I absolutely was not interested in - not repulsed by, just not interested. But I had multiple girlfriends over numerous years who ended up liking similar things, and through repeated exposure I did in fact end up developing an interest in it, to my great surprise.
This is a real effect, it does happen. You can debate the degree if you'd like. But it ABSOLUTELY WOULD lead to some people ending up being interested in CP that would not have been otherwise. There is no avoiding that simple fact. Given the nature of CP in particular even a relatively small quantity is not acceptable.
This is a complicated subject and as mentioned above I do understand what you're trying to accomplish. And it is also true that completely banning things and making things illegal absolutely does not stop them and can sometimes make them more interesting or desirable to some people due to psychology. Both are true.
But flooding the Internet with CP would almost certainly cause more harm than good. I'm a very open minded person but things that harm children are not okay.
This is a very complicated subject because human psychology is very complicated, messy and varies widely. Most humans are reasonably good given the chance, but there are always (sadly) going to be some bad people out there that will do bad things no matter what.
Does having people engage in simulated abuse (simulated rape, choking, hitting, beating, et cetera) with a consenting adult partner make these people more likely to engage in actual abuse?
> This is a real effect, it does happen. You can debate the degree if you'd like. But it ABSOLUTELY WOULD lead to some people ending up being interested in CP that would not have been otherwise. There is no avoiding that simple fact. Given the nature of CP in particular even a relatively small quantity is not acceptable.
We should look at the net effect here. In other words, is the net effect fewer children being abused, no change, or more children being abused? If this results in an extra couple of children being abused but a much greater number of children being saved from abuse, then this would still be a tradeoff worth making, no, since the net effect would still be less child abuse?
You are not going to sleep well at night knowing you indirectly allowed even one child to be harmed by a pedophile. Or do you not even care if that happens? What if it were your child, or a child of your friend? Then you'd care, right?
This would be a decision society, or our government as a representation of society, would have to make. If the benefit vastly outweighed the harm, then maybe - for example, if there were strong statistics showing that something like this caused 1 additional bad outcome per 100 bad outcomes it prevented. But it would be extremely difficult, if not impossible, to ever gather such statistics. And without such statistics, the decision should always be to protect the children.
>And without such statistics the decision should always be to protect the children
But it's begging the question to assume that the status quo is what "protects children" in the absence of statistics.
I legitimately don't understand what you mean by "If we can't prove definitively whether or not AI CP protects or harms kids, we have to default to the thing that protects kids". How do we know what that is?
I think BDSM and child molestation are two incomparable actions. One is not illegal or immoral, and is something you'd conceivably try if you were aware and curious enough. The second comes at a staggering cost to the abuser, both in terms of morality and pure self-interest.
People aren't going to be trying out committing heinous crimes just because they were exposed to it during porn. They might try out consensual choking, anal, spanking, or BDSM, but not this.
Further, even if legalization of AI CSAM increased the prevalence of pedos by 50%, it still seems obviously true that the increase would be offset by the counterpressure of easily accessible AI material heavily suppressing demand for actual children. Conventional wisdom circulating around NoFap is that porn addicts lose the motivation to go after real women because their sexual needs are so saturated by it - not the reverse!
You make a lot of good points, and it seems right that normalizing behaviors can result in an uptick of interest in engaging in them because of the examples you gave.
Then I think about how common it is to see serious crimes like murder in popular movies and TV series. It has probably resulted in more people being interested in committing murder, but I'm not sure how many people have actually committed one as a direct result (I've only heard of the Slenderman case, though I wouldn't be surprised if there are more examples). Murder is still very stigmatized, with severe social and legal repercussions, and basic human morality causes most of us to be repulsed by it. This shows me that fictional depictions of something can likely become prevalent without influencing a concerning number of people to actually perform those actions. I think the harms of content depicting fictional harmful behaviors can be largely offset by enforcing severe consequences for actually doing those behaviors.
Since pedophiles and the sexual abuse of children are as reviled as murder (maybe even more despised, as I've seen some support for murder, e.g. a parent killing their child's abuser, executing criminals, and Brian Thompson), I find it hard to believe that making AI CP prevalent would result in a noticeable increase in actual crimes. Though I'm also not sure it would cause a reduction in crimes.
If only a small minority is driven to harm children from fictional content, it makes sense why it should still be banned or at least restricted. But what about other things like some violent video games, movies, etc. with realistic simulations of dangerous and harmful actions that make them look fun?
I feel viscerally opposed to making AI CP something that can be easily accessed, but I don’t feel the same way about things like violent movies for some reason. I feel logically inconsistent if I reject one but not the other, but maybe there’s a core difference I haven’t noticed.
Basically, I don’t think AI CP would cause a noticeable increase in actual abuse the way the popularization of BDSM in films resulted in more people doing it, because the level of stigmatization and consequences seems way higher for the former. It definitely seems safer to keep AI CP illegal/hard to access, because there’s probably a non-negligible chance it would push some individuals toward committing crimes. I agree that we probably shouldn’t do it and should instead find less risky solutions to combat child abuse.
Have you heard of mass shootings? The number of those drastically increased over the past couple of decades as they became more widely covered in media and pop-culture. The individuals perpetrating those seem very much to be motivated by media coverage of previous mass shootings. So yes, I would in fact say this effect even extends to murder sadly.
Regarding mass shootings, I've long suspected that if anyone perpetrating a mass shooting were completely wiped from public memory - their picture and name never shown or spoken again, interviews discussing them never done, etc., basically the exact OPPOSITE of the coverage they get today - I'd bet substantial money that the quantity of future mass shootings would drop over time. However, this would obviously conflict with the First Amendment, and there are also transparency concerns when it comes to governments and law enforcement.
Does engaging in rape fantasy roleplay and other simulated abuse (choking, hitting, beating, et cetera) make one more likely to commit these acts in real life?
Also, I think that the burden of proof here should be on those who advocate criminalization. After all, we don't know what effect legalization will have on children, but we do know that it would give MAPs more harm-free options, especially when combined with realistic child sex dolls/robots, thus likely ensuring that less MAPs feel compelled to get castrated (and castration is a very significant harm).
Not being a woman, I can only comment on this anecdotally. From what I gather, women say that cases of their male partners choking them during sex have increased in recent years.
I do know first-hand that the number of women wanting to be choked during sex has certainly increased over the last couple of decades - even seemingly normal vanilla women engaging in vanilla sex. Though I would not personally consider choking to be a vanilla activity.
And, no. Your take here is completely backward. We don't help out the MAPs and make their lives better just because we are unsure what effect it will have on the children. FUCK THAT.
When it comes to protecting children I'm going to side with protecting the children. Frankly I'm not concerned about the "MAPs". The goal should always be to minimize harm to the children (presuming that is actually the goal and not surveillance or other agendas wrapped in CP). If a MAP is so worried that they might do something that they are considering castrating themselves - personally I'd suggest they go further... People like that are really not a benefit to society.
I'll be blunt. If people like (actual) psychopaths and such were completely removed from society and future generations, the human race would be vastly better off. These individuals are a blight on everyone. It's likely that the VAST majority of seriously negative things that happen in the world and affect everyone are caused by this tiny percentage of people.
Yeah but like, pedophiles and psychopaths *aren't* going to "remove themselves from society", so it's just wishful thinking to fantasize about pedophiles castrating themselves.
The principled argument is "Criminalizing ANYTHING requires a positive justification". There are various formulations of this, from "Everything which is not forbidden is allowed" to "it neither picks my pocket nor breaks my leg", but in general, if you're going to say "don't do this thing, or else we're going to throw you in a cage/kill you", you better have a very strong justification for "don't do this thing". In the case of murder, or rape, that's pretty easy to justify. All this argument asks is you similarly demonstrate consuming AI CP has a similar justification for criminalization.
Porn websites themselves admit to trying to expand the sexual interests of their user bases. They truly do manipulate the viewer down different pathways. It’s like Aella is somehow separating one type of porn from all the rest, but in reality, there are no such separations on tube sites (where most people watch) and that is INTENTIONAL. There are cross category suggestions planted all over the place.
You've "seen the world of BDSM". Okay. So you're a tourist with a tiny amount of experience. My perspective is that of someone who spent a massive amount of time and effort as an organizer and a leader, running groups and organizing events. I talked to literally thousands of individuals personally on these subjects. My experience started in the 90s, before the Internet had an impact on such things, and went forward from there, so I got to see the effect that publicity and popularization in pop culture had. So I'm sticking with my opinion on the subject, which is derived from a whole lot of first-hand experience spanning a meaningful chunk of time.
Also, your choice of example case is incredibly poor. The Handmaid's Tale? That's an extreme case that involves all of society and isn't an interpersonal kind of thing.
Here's an even more controversial take - Child porn laws are not about protecting children, but protecting social norms. If real children need to continue being abused so collectively we feel like we are in an environment where children are safe, then that's what is going to happen.
Look into what Apple tried to do with their storage scanning software for "CSAM". The government and corporations will always try to take your freedom away in the form of "safety". I highly recommend reading into the bills that get passed that dictate these laws. The goal of corporations is to maximize profit; the goal of governments is to maximize power.
That seems to sum up the way modern society is headed, with every single cause someone claims to care about: if the ship needs to sink so we can feel like we’re saving it, let it sink.
As a private teacher, I ended up in situations where I had to accompany a student to the police in order to report CSAM twice. Both times, I was met with the most indifferent cops on earth who really didn't feel like having their Miller time ruined by how much work it would be to take this seriously.
Meanwhile, I turn on the news and CSA is the worst crime in history of crimes and justifies everything, in fact my government would like to read my WhatsApp messages this very minute.
I have become so blackpilled on this, my firm belief is now that less than 1% of men who claim to care about CSA(M) do actually care, the rest are looking to protect the last bastion of lynch law, where calling for torture and castration is still fine and dandy. In fact you can still effectively boost your social status by being the person "wanting to be the most horrible towards pedos" in many male-dominated circles.
I work with sexual assault victims and offenders. You are 100% right. I hope I didn't misread your comment but I'll do my best here. The government does not care about victims. More offenders = more free jail labor. It is simply a fact. The government imprisoned people for smoking marijuana. Imagine having an ounce of marijuana and going to jail for years because of it. Our government does not care about victims nor the rights of anyone. They haven't even released the entire Epstein files. This tells you everything. They gave Epstein a reduced sentence, a sentence lower than many sentences given to people charged with possession of marijuana.
Interestingly enough, I've realized that the majority of people calling for wood chippers and lynching are further fueling the problem. Another problem is that our government does nothing to rehabilitate these people and prevent children from being abused. They protected Epstein for decades.
I was SA'd when I was a child, and seeing Generation Z on social media say that 20-year-olds who date 16-year-olds should be thrown in a wood chipper just crushes all hope of solving this problem. Many people do not realize what the victims go through. Crying pedophilia and advocating for lynching at every possible opportunity just desensitizes people to the outright horrible crimes committed by pedophiles - the true ones, not just legal adults looking for relationships or love. It's like calling every single person a racist: it defeats the original purpose of the movement.
And as someone who is a sex positive liberal who believes in equal rights, we need to offer solutions instead of letting our emotions get to us. Solutions work, angry comments do not. But as time goes on, I will continue helping victims and rehabilitating offenders. We need to work on our society.
Are you yourself not perpetuating this problem by using words like "pedophiles" to mean "child molesters" while claiming to work in a field where you should be aware of this distinction? Or did you just use the words that way because it's contemporary to do so (despite the fact that you're lamenting the current zeitgeist around the issue)?
Are you comparing teenagers having nudes of each other to actual CSAM? Or do you mean you discovered multiple times that elementary schoolers' parents were recording them being sexually abused and uploading to the internet, and the cops didn't care?
I mean two incidents of being extorted by an adult ex-boyfriend with imagery that was, in one case, recorded under highly questionable circumstances (I also keep hearing that teenagers cannot consent anyway, but interestingly enough, that doesn't keep the police from going all "you silly girl, why did you let him film you, huh?"), and in the other case was recorded under threat of violence.
Police response in the first case was "I mean, you could just block your ex, because clearly all he wants is to get back in contact with you", and in the second case (where the parents threatened to kill my student for "being a slut"), they at least helped her retrieve her things from her parents' home safely so she could move in with another boyfriend (what a great solution), while heavily discouraging her from filing a report against her ex for child pornography, as that would mean "many other people would have to look at these images to deal with them legally, and would you really want that? Just move on with your life". I have never seen or heard from this student ever again, and for all I know, she could be fucking dead.
I also want to mention that in Germany, teachers can lose their jobs for bringing child pornography to the attention of the police, since if they ever happen to use their own devices to secure the evidence (because maybe they gain access to a class group chat where such material is being shared), that already counts as ownership of CSAM and carries a minimum sentence of 12 months - and any sentence above 11 months means losing your civil servant status. Also, child pornography is not a criminal offence prosecuted only upon application by the victim ("Antragsdelikt"), so the police's choice to not act on their own was very much illegal in both cases.
So, is this as bad as "actual" CSAM? No, of course not. Are these cases representative of what is typically understood as "child pornography" in the vast majority of cases? Yes, yes they are. And did the action match all the big talk? You be the judge.
This isn’t just a debate or an opinion; this is about protecting the innocence of children. What’s being discussed here is deeply disturbing, and I feel a moral obligation to speak up.
Reading even the first few lines of this post deeply unsettled me, not just because of the subject matter, but because of the attempt to intellectualize and rationalize something so horrific. The very idea of entertaining or defending the production of AI-generated child pornography, regardless of the argument, is appalling.
This conversation doesn’t just cross a line, it risks normalizing behavior that is psychologically and spiritually devastating to victims. It is not a debate. It is a violation of conscience, and of basic human decency. No technology, no theoretical framework, no “greater good” should ever be used to justify or soften the perception of child abuse in any form.
Children are the most vulnerable, and predators have always taken advantage of that. Watching content that portrays child harm, AI or not, implies desire. A normal, compassionate human being should feel disgust, rage, and the instinct to protect, not entertain nuance in such matters.
I believe content like this doesn’t just alienate responsible readers, it dangerously blurs ethical boundaries society should be reinforcing, not debating.
You’ve lost a subscriber. And frankly, I believe this kind of conversation warrants serious reflection, not just from readers, but from the platform itself. Silence and neutrality are not options when it comes to the protection of children.
Deeper in the article, Aella explains that pedophiles who want access to child porn must abuse children themselves and post photos of that abuse to forums. This situation is unspeakably awful, in that it amplifies the incentive to abuse children, getting you access to a community of people who share your inclination to moral atrocity and are only too happy to provide you with more fuel. Thus far more children are abused. This is an unacceptable state of affairs. If AI child porn would put a stop to that -- I don't believe Aella is crazy for thinking it might -- then I can't oppose it on principle.
This kind of conversation should never exist in the first place. There are some lines that should never be crossed, and this is one of them. The very existence of this topic, and any attempt to intellectualize or justify child sexual content, AI-generated or not, is deeply disturbing.
Children are the most vulnerable members of our society. They deserve our full protection, not to be used in theoretical debates that risk normalizing or enabling predatory behavior in any form.
The idea that creating artificial content to "reduce harm" is somehow acceptable, ignores a fundamental truth: if someone is seeking that kind of material, AI or otherwise, they already pose a danger. And introducing such content doesn’t neutralize the threat, it feeds it.
This isn’t just a moral issue, it’s a human one. There should be zero tolerance, and the law should reflect that with severe, public consequences that deter anyone who even thinks of crossing that line.
Some topics should never be normalized, and this is one of them. Full stop.
Our greatest priority here should be to make it so that pedophiles are not abusing children. There is evidence that pedophiles are abusing more children because abusing children gets them access to vaults of CSAM. We want to make that stop, and to do that, we must discuss ways of stopping it. That requires a discussion on this topic, otherwise things will never change. Banning all discussion on this topic for fear of "normalizing" an intellectual approach to these matters is not productive if we are seeking to change the current state of affairs. And I don't think the current state of affairs is acceptable.
I hear the intention behind seeking solutions, but we must be extremely clear, there is no form of child abuse that should ever be discussed as “strategic” or “functional.” The very suggestion that AI-generated content could reduce harm by catering to the same twisted desires is deeply dangerous. It doesn’t solve the problem, it feeds it.
This isn't about fear of debate. It's about preserving the most sacred boundary we have as a society: the innocence of children. When we even entertain the idea of synthetic child abuse as a solution, we desensitize people. We shift the moral line. And history has shown that once that line shifts, it's nearly impossible to pull it back.
Jesus said it best: "If anyone causes one of these little ones to stumble, better for him to tie a millstone around his neck and be thrown into the sea." That’s how serious this is.
There are topics that should never feel safe to explore. This is one of them. People shouldn’t feel comfortable even thinking about children in that way, let alone debating how to satisfy that urge with AI.
We need moral courage, not moral compromise. This conversation doesn’t lead us forward, it takes us somewhere no human being should want to go.
Did you really just invoke Jesus here lol. Might want to refocus on turn the other cheek, cast the first stone, forgiveness, etc. Also you might want to think long and hard about the word harm and who is doing it, generally it's society revictimizing.
Jesus didn't say "Go put a millstone around your neighbor's neck and kill him because he doesn't walk his kids on a leash and lets them eat ultra-processed food."
Yes, I invoked Jesus not as a call to literal violence, but to highlight the gravity with which true morality treats the protection of children. His words weren’t about junk food or parenting styles. They were about people who prey on the innocent, and how seriously we should take that.
You want to talk about “harm” and “revictimization”? Then recognize that legitimizing the simulation of child abuse even through AI is a form of harm. It validates the desire. It shifts cultural perception. And it erodes the very instincts that make us recoil at evil.
This isn’t about forgiveness. It’s about refusing to let moral confusion masquerade as compassion. I don’t need to “cast the first stone”, I just won’t stand by while people try to sanitize the unspeakable in the name of progress.
Some things are wrong not because they result in harm, but because they are a betrayal of who we are at our core. That’s not religious dogma. That’s human decency.
Well, how is it going? Are people being deterred? Is it a good thing that we're creating a structure where people are highly incentivized to remain private about abnormal desires and never seek help? And what do you care more about? Saving the children, or being the "look at me, how I am pretending to be saving all the children"-guy?
We all know the earthly justice system is flawed, a "kangaroo court," if you will. But under divine law, the truth is absolute. Imagine if this happened to your child; the pain and anger would justify any action you take in defense of them.
We should focus more on that higher moral law than on the injustices of these human-made courts. When it comes to matters of true justice, divine law reigns supreme, and that’s what we ultimately need to be concerned about.
Just a reminder, God has handled it before. Let’s not forget Sodom and Gomorrah, the Passover during the exodus from Egypt, Babylon… just to name a few. These moments are reminders of how God deals with corruption and evil when it festers unchecked.
But it’s not just about waiting on divine justice, it’s also about taking action in the face of wrong. Just like those before us, we have a responsibility to speak up, stand firm, and protect what is good and right. Trusting God doesn’t mean staying silent, it means being willing to act while holding faith in the bigger picture.
Therefore, let’s make more absurd laws that can only be enforced arbitrarily, thus ensuring the justice system can’t be anything but a kangaroo court. It won’t help abused children one iota, but someone will be hurt. That’ll show them how much you care about the children.
Would you rather have a world with less real abuse but more fake AI porn, or a world with more real abuse and less fake AI porn?
According to this line of argument, you are *literally causing* more abuse to happen.
I get that you don't believe allowing fake AI porn would decrease the amount of real abuse, but for the sake of argument, *if* it did, would you allow it?
I understand the point you’re raising, it’s a classic moral dilemma: “Would you allow a lesser evil to prevent a greater one?” But here's the problem with that line of reasoning:
It assumes that AI-generated child abuse content actually reduces real-world abuse, which, to my knowledge, has no solid evidence and is a dangerous assumption to make. Without clear proof, using it as a justification becomes highly irresponsible.
Even if, hypothetically, such content reduced abuse, it comes at the cost of normalizing child exploitation in some form. That normalization could lead to more harm over time by blurring ethical boundaries, desensitizing society, and potentially increasing demand for the real thing.
There are some lines that, as a society, we must never cross, not because it’s convenient, but because they uphold our humanity. Child abuse, even simulated or “fake,” isn’t just a functional issue, it’s a moral one.
Some things aren’t meant to be optimized or compromised; they’re meant to be condemned without exception.
Yes it relies on the assumption that it actually reduces real-world abuse. That's the whole point. To your second point, if it in fact increases real-world abuse then of course it's a bad idea. The entire premise is that it reduces real-world abuse in the short and long term, otherwise it's just a completely dumb idea.
Then you go on to say that even if it does reduce real-world abuse in the short and long term, we still shouldn't do it, to protect society or something. Let me just point out how insane and morally reprehensible this view is: you're willing to condemn more innocent children to unnecessary abuse, for no real reason.
Let’s be clear: your argument hinges entirely on a hypothetical, that AI-generated child abuse content might reduce real-world abuse. But there is no credible evidence for that. None. And until there is, building a moral defense on that fantasy is not only reckless, it's dangerous.
Even entertaining the idea that we should create synthetic versions of one of the most horrific crimes imaginable in order to potentially reduce it is a slippery slope to moral collapse. It’s not just about whether it works, it’s about what we become when we justify evil for the sake of utility.
Because even if, in theory, it “worked,” it would normalize deviant behavior, blur ethical boundaries, and create psychological permission structures that do more damage over time. We’re not just preventing harm, we’re shaping conscience. And when you shift that conscience, you shift culture. And when culture shifts, the unthinkable becomes thinkable.
Some lines are not meant to be debated, they’re meant to be defended, absolutely, unapologetically, and at cost.
Children are not chess pieces. You don’t sacrifice one to maybe save another. That’s not strategy, that’s savagery dressed as reason. The fact that this argument even needs to be refuted is a sign of how far some are willing to drift from decency in the name of intellect. There’s nothing smart about surrendering your soul.
We of course do this, every single day, with the distribution of limited resources. For instance, tuberculosis is a curable disease that over a million people a year (among them almost certainly tens of thousands of children) die from. If we as a society choose to spend money on anything else, even researching childhood cancers, we're sacrificing the TB kids to save the cancer kids. And in point of fact we're sacrificing them for things much less worthy and more stupid, which really drives home how worthless this moral grandstanding is.
"My favorite anecdote along these lines comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn’t put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure."
Turns out, if you refuse to put a number on the value of human life (or a child's innocence), the world sets that value for you, and it usually ends up insultingly cheap.
You're confusing resource allocation with moral absolutes, two completely different conversations.
Yes, societies constantly make difficult decisions about limited resources. But that’s not what's happening here. We're not talking about choosing between curing TB and curing cancer. We’re talking about whether or not to legitimize the simulation of child abuse and trying to dress that up as "strategy" is intellectually dishonest and morally bankrupt.
You don’t "optimize" child exploitation. You don’t play ethical games with innocence. And you sure as hell don’t compare a budgetary decision to the willful decision to allow any form of child sexualization, real or artificial.
Trying to make this about effective altruism or cost-benefit analysis completely ignores the actual harm of normalizing such depravity. Children are not line items in a spreadsheet. You can’t put a price on safeguarding their dignity. You either defend that line or you don't. And if you don't, don't pretend it's because you're being "pragmatic." Just admit you've traded conscience for calculus.
> Let’s be clear: your argument hinges entirely on a hypothetical, that AI-generated child abuse content might reduce real-world abuse.
Yes it hinges on that, in fact it's the premise and even title of the post we're commenting on. If it's not true then of course we shouldn't do it. Who knows if it's actually true, but I think this post lays out compelling evidence that it is in fact true.
And if it turns out that it is true, then not doing it would be morally reprehensible. If it really truly is the case that allowing fake images of abuse to exist would cause less real abuse to happen, then it's obviously stupid and morally wrong to continue to ban the fake ones.
If this is true, you are right now morally responsible for real abuse occurring. I would trade this current world for a world with less real abuse, but more fake abuse. You would not. Who is the monster?
Calling me a monster for refusing to normalize child exploitation, even simulated, is one of the most backwards, morally bankrupt accusations I’ve ever seen.
Let’s be clear, if you think defending a child’s right not to be reduced to fantasy abuse makes someone “the monster,” then you’ve inverted good and evil.
You say you'd trade this world for one with "less real abuse but more fake abuse"? You’re not offering a solution, you're offering a moral poison pill. You're saying: let's feed the sickness so it doesn't spread. But that's not how evil works. You don't curb depravity by catering to it. You don't stop fires by handing out matches.
What you’re defending requires the creation and normalization of a grotesque fiction that simulates the very thing we say we abhor and then dares to call it progress. That’s not moral courage. That’s spiritual cowardice.
Some things are not “lesser evils” they are non-negotiable evils. And when you try to rationalize them in the name of “saving” others, you lose the soul of your argument and the soul of your society.
I’d rather be falsely called a monster for standing against simulated child abuse than be remembered as one who stood by while people tried to dress moral decay up as harm reduction.
Thank you for the new introduction to my seminar "I like moral arguments too, but wherever I look, the functional arguments end up being a whole lot smarter"
Functional arguments might be clever, but morality is why we have society in the first place, otherwise we'd still be in caves throwing rocks at each other.
Function might build tools, but morality builds trust, dignity, and the very fabric of human life.
We’re not throwing rocks at each other because functionally, using rocks for something else clearly leads to better outcomes. Slavery was abolished because it was economically unfeasible. So were child labor and colonialism. The list goes on and on. Unless we’re talking about a few select authors, moral arguments are nothing more than a shortcut to functional arguments, with the added bonus that understanding of the functional reasoning is not mandatory. Sometimes this is an advantage, sometimes it isn’t.
Drawing functional arguments from what were ultimately moral revolutions is a dangerous distortion. Slavery, child labor, and colonialism weren’t ended because they were inefficient, they ended because people had the courage to say, “This is wrong.” That same courage is needed now.
Pornography, at its core, was never about freedom or expression, it was created to exploit and degrade. As the Meese Commission and other studies have shown, it distorts intimacy, fuels addiction, and erodes the very essence of human dignity.
Child sexual abuse is a boundary that must never be crossed. The fact that we’re even entertaining AI-generated abuse under the guise of “debate” or “harm reduction” isn’t progress, it’s devolution. It normalizes the grotesque and chips away at our shared moral compass.
Some things are not meant to be explored, optimized, or rationalized, they are meant to be condemned.
"Some things are not meant to be explored, optimized, or rationalized, they are meant to be condemned."
I firmly believe that you are the one making a dangerous distortion, because you are dismissing the idea of implementing structural measures that can reasonably be expected to decrease the occurrences of these grotesque acts. The innate weakness of moral arguments (because sociologically, they form a moral functional system, which sits opposite the cognitive functional systems) is that they fail to systematically produce outcomes unless they are structurally validated within those cognitive functional systems like science or the economy.
While I am willing to concede that moral arguments are necessary to ultimately attain favorable outcomes, they are never sufficient. While there was a highly effective public courage to stand up against colonialism, that had been the case for decades before decolonisation became a structural reality. And the moral outrage was only met at that point with a structural correlate because colonizers had to let go of their ideology and come to the realization that, at their core, colonies produce a net-negative outcome for the occupying state.
Condemnation does not work well enough. Morality does not have inherent value, only betterment does.
It's a lot easier to get people to agree to get rid of a moral outrage if the economic incentives to keep it aren't strong, but when England ended slavery in its colonies, it was primarily because of the moral arguments, and a large portion of the government's budget went to compensating former slaveowners for the financial value of their "property".
I understand that you're emphasizing structural solutions and functional outcomes, and I agree that real-world systems must support any meaningful change. But here's where we differ: not everything can or should be measured only by outcomes or systems logic, especially when the cost involves normalizing abuse, even artificially.
When it comes to something as deeply violating and morally corrosive as child sexual abuse, in any form, the role of morality is to draw a clear line in the sand. The line exists to say: this should not exist under any circumstances, regardless of theoretical benefits.
We don’t reduce fire hazards by allowing people to play with fire more safely, we create firewalls. We don’t tolerate synthetic versions of abuse in the hope it’ll reduce real abuse. That thinking leads us down a path of desensitization, where the line becomes harder to defend over time.
Condemnation in this case is a structural response. It protects the cultural and psychological boundary that keeps society from slipping into moral numbness. Once that line is blurred, the very foundation that allows us to claim something is “harmful” or “wrong” begins to erode.
So no, not everything should be optimized. Some things should be stopped cold. Not because we lack the tools to test alternatives, but because what’s at stake is too sacred to experiment with.
"Pornography, at its core, was never about freedom or expression, it was created to exploit and degrade. As the Meese Commission and other studies have shown, it distorts intimacy, fuels addiction, and erodes the very essence of human dignity."
What a complete and utter pack of lies, untruths and misconceptions.
Pornography at its core is about people enjoying something that is a natural bodily function which is quite pleasurable and enjoyable. Sex, at its most basic, is a fun and entertaining activity that can be experienced solo (masturbation) or (for more fun) between two or more consenting adults. Pornography is simply the recording of those activities and the sharing of them with other people. It's content that focuses on sex, really not all that much different from mindless TikToks, except everyone is naked and rolling around. The fact that porn was the first media to ever go "viral" shows that it is something people really and truly want and enjoy. People were recording their own home videos and sharing them long before there was OnlyFans. The majority of people making porn are doing it because they want to be doing it - which is not exploitative.
Just about anything can be turned into exploitation or used for alternative purposes, not just porn. There are a whole lot of people who use religion for very nefarious, harmful and even violent purposes. Much more so than pornography in fact. Would you say that religion at its core is exploitative and harmful?
Sex can also of course be part of romantic relationships and used to build bonds and increase intimacy.
What you're describing may sound innocent, but it's part of a deeper moral confusion that's slowly poisoning society. Sex was never meant to be commodified or stripped of its sacredness. It's meant to be intimate, sacred, and a deeply meaningful part of a committed relationship between a man and a woman, ideally within the context of marriage. It's where two souls come together, sharing love, vulnerability, and trust, and it's in that space that the true magic happens - the creation of life. This is what makes it special, it's not just a bodily act, it’s a union that transcends the physical.
Pornography, however, corrupts this truth. It reduces intimacy to a performance, and the deep emotional connection between partners to a series of shallow, mechanical acts. It's a distortion of what sex is meant to be. And we don’t have sex in public for a reason, it’s meant to be cherished behind closed doors, in the privacy of a relationship, not displayed or exploited for anyone to consume.
What you're defending isn’t about "natural bodily functions." It's about an industry that thrives on the degradation of real human connection. It’s not about pleasure, it’s about exploitation. The Meese Commission, a government-backed study, outlined the very damage that pornography causes to relationships and individuals. It showed how it warps our expectations, fuels addiction, and leads to real harm in society. That’s the reality, whether or not it fits the narrative we want to sell.
To reduce sex to just "fun" is to miss the point entirely. It breaks marriages, shatters relationships, and introduces harmful fantasies that have no place in real intimacy. When you think about what’s happening to our society, it’s not just a matter of people consuming media, it’s about the disintegration of values that hold us together. We’ve normalized something that was never meant to be commodified. And that’s why we see the rise of broken marriages, deep loneliness, and the dehumanization of real connection.
This isn’t prudishness or a fear of pleasure. It’s about drawing a line. A line that says: sex is sacred. It’s about intimacy, not exploitation. And when we lose that, we lose ourselves. The "solution" isn't just turning a blind eye and pretending it's harmless entertainment. It's about reminding ourselves of what real love and connection look like, and what happens when we forget it.
This isn’t just a conversation about “pleasure.” It’s a conversation about our values, our integrity, and the kind of society we want to live in. And until we return to that foundational truth, that sex is not a product, it’s a powerful force to be nurtured, we’ll continue to spiral into moral decay.
Oh take pity, often those who cling to moral arguments do so because reasoning and logical thinking are beyond their grasp. Someone once told them a story about something bad that they called a "moral" and now they repeat it even though they don't fully understand the full implications. People have to work within their limitations.
Do I sound confused, or are you just uncomfortable with someone drawing a clear moral line?
If refusing to intellectualize something as inhumane as AI-generated child abuse makes me “confused” in your view, then maybe it’s not my clarity that needs questioning, but the discourse norms you’re choosing to defend.
I see no evidence attempting to engage thoughtfully with such a thought-free commenter would do any good. As to the second point, Aella attracts a lot of religious weirdos who sublimate their erotic fascination with her into tirades against her general sluttiness.
I think it is a deeply... I'm not sure the word, scary? thing when someone turns their brain off to alternative thought.
How can intellectualizing and rationalizing be bad? What's wrong with the greater good?
He doesn't answer this.
Sophic doesn't engage much with Aella's argument, and condemns her pretty damn hard.
He also seems like a thoughtful and reasonable person. He articulated himself, he behaved more respectfully than 90% of the commenters here.
A norm I try to hold is respect, and I think your comment is kind of a pejorative. It mainly is just telling Sophic he doesn't belong here, in a dismissive, looking-down-upon way.
I think generally, one sentence dismissals, nawt good. Even when deserved.
Luke, I appreciate your willingness to hold space for respectful discourse, and your instinct to look for good faith in others. That’s a rare and valuable norm, especially in conversations as difficult as this one.
I want to clarify that my strong moral stance doesn’t come from turning my brain off to alternative thought, it comes from turning my whole self toward the reality of what’s at stake: children. There are some lines in society that aren’t up for intellectual debate, because debating them already risks normalizing the unthinkable.
We can talk about structure, function, and harm reduction all day but at some point, the foundation we’re building on must be rooted in moral clarity. My response wasn’t dismissive of complexity, it was protective of innocence. That doesn’t mean I’m against thoughtfulness, it means I’m for drawing lines where they must be drawn.
As for Aella’s content, I unsubscribed precisely because I believe there’s a point where discourse becomes complicity. I won’t sit silently when children even in hypothetical or artificial form become subject to normalization through tech or detached reasoning.
Thanks for hearing me out. I'm here for thoughtful dialogue, but on matters this serious, there's no room for moral ambiguity. Protecting children isn't a stance I defend it's a principle I refuse to compromise on.
Although I disagree, what you're saying makes sense! I appreciate your insight. It reminds me a bit of the AI box experiment. A superintelligent AI would always convince you to let it out of its box. So it's not even worth talking to it. https://www.yudkowsky.net/singularity/aibox
Maybe it's the same with morals: some moral questions are so bad, and some people so pure, that you can't even discuss it. Some things can't be tainted, and are just wrong no matter how you square it.
I see no evidence of thoughtfulness or reasonableness, and generally think it’s fine to let people have what they deserve. Open discourse norms are easily exploited by people who don’t share them. Rationalist flavored folks are always getting into interminable discussions with people who suck because of their norms of unilateral assumption of good will, to no advantage I can see. But of course you’re free to do that if you want to, just as I’m free to be rude to people who suck.
Why do you think this? I mean, I compare his response to
"Welcome to the world of effective altruism and rationalism. These people are some of the most evil motherfuckers on earth"
"I think this author should climb the tallest building in their city and jump, in place."
and I think on the "thoughtfulness and reasonableness spectrum" he is far beyond these two.
You might say, "this is a baseline, there is no benefit to simply meeting it" to which I would say, baselines are meant to be easy to meet. And this one obviously isn't for this topic.
I don't think of things in terms of good will or not, I just go off general vibes, and try and articulate them.
People can be free to be rude to those who suck, that's valid, but if everyone here was like that, I wouldn't be here.
Aella could have written a blogpost "10'001 reasons why anti-pedophiles should suck on my fat hairy balls" and it would be really righteous and justified and totally useless.
There is something to turning the other cheek. I can't really explain it, but there is.
In regards to Sophical's post, we can imagine two posts trying to convey the same idea, coming from the same feeling: Sophical's post, and one from someone who is less emotionally honest.
Sophical writes
"This isn’t just a debate or an opinion, this is about protecting the innocence of children. What’s being discussed here is deeply disturbing, and I feel a moral obligation to speak up"
He isn't lying or pretending that his ideas are anything other than what they are. He isn't making up facts or allowing himself to become biased; he simply strongly believes these ideas are disturbing, and values the innocence of children above all else.
A less honest person would look at this, think "this is disturbing, Aella doesn't value the innocence of children" and write
"Lmao lol woodchipper go brrr"
or
"Aella is OBVIOUSLY WRONG, she clearly doesnt know ANYTHING about pedo psychology, I DO. The one thing we do, is THROW EM IN THE WOODCHIPPER then all innocent children will understand they are safe, and everyone will agree disturbing pedo's are disturbing. No talk other than LOCK EM UP is allowed, we must all make sure EVERYONE UNDERSTANDS this"
I prefer what Sophical wrote; it's honest, and I can engage with it.
> You keep trying to paint this as simple math, like defending the dignity of children is a cold equation
Yes because it is whether we like it or not.
> even fabricate horror to possibly prevent it
Yes, fabricated horror is clearly better than real horror.
> You don’t protect children by normalizing the abuse of their image
But what if you can? If it doesn't actually lead to more harm? Then you can.
> Because once you let those lines blur, the culture blurs with them. Conscience erodes. And suddenly, the unthinkable becomes thinkable. That’s how real harm begins.
Yeah, if it causes more harm in the long run because of effects like this, I agree! I completely agree. But there's a very real chance it doesn't! And then the argument flips around, and *you* become the monster for causing unnecessary harm.
> Not even if it “works.” Because the cost is too high.
The cost is only too high if it doesn't work. If it actually works, there is no cost. If it actually, really saves children in the long run, then surely we have to do that right?
Your entire argument is that doing this will corrupt us and lead to worse outcomes in the long run, but I'm saying if that doesn't happen, then we have to do it.
You keep saying “but what if it works?” as if that magically erases the cost. It doesn’t. Some things are wrong even if they produce favorable outcomes. That’s the whole point of having a moral backbone.
The moment you accept simulated child abuse as a “strategy,” you’ve already lost the plot. You're not preventing harm, you're legitimizing a tool built on the aesthetic of harm. You're not protecting children, you're just shifting the abuse from flesh to fantasy and pretending that's noble.
If your solution requires us to simulate the worst crimes against innocence to maybe reduce them, then your solution is part of the sickness. You're not solving anything, you're surrendering.
The world doesn’t need more people who can justify evil with a flowchart. It needs people who still remember that some lines exist to be held, not debated.
> Some things are wrong even if they produce favorable outcomes.
I guess this is where we disagree; I think this is just plain wrong. If fewer real people suffer less harm, it's not wrong. (Assuming you count all people and all the ways they are harmed!)
> You're not preventing harm, you're legitimizing a tool built on the aesthetic of harm. You're not protecting children
No we *are* preventing harm and protecting children by the premise of the argument. I agree that if it doesn't do this, we shouldn't do it, but if it does - then we should.
> you're just shifting the abuse from flesh to fantasy and pretending that's noble.
Yes this is what I think we should do. If by some magic miracle every drawing of a child saved a real child, you bet I would be drawing all day. Drawings aren't harmful in and of themselves. I get your argument that this might have second order effects, like normalizing it, which would lead to real harm of real people down the line, but if that doesn't happen (per stipulation of the argument), then there is no harm in the drawing itself.
> The world doesn’t need more people who can justify evil with a flowchart.
I guess I understand the sentiment, but this is how the world works every day in almost all situations. Everything is a tradeoff all the time, and every action and inaction causes some amount of harm. It's our responsibility to make these decisions as best we can in everything we do. There is no inherent evil in an action except for the consequences it has on real people (or sentient animals).
If you genuinely believe that creating or promoting child abuse in any form, even in fantasy, is acceptable if it leads to “less harm,” then you've already crossed a line no decent human being should ever approach. That’s not pragmatism, that’s moral rot dressed up in utilitarian language.
This is not how a healthy world works. It’s how a broken one does, one that’s forgotten what it means to protect the innocent, to draw hard lines, to say some things are wrong, period. The willingness to even entertain the idea that simulating the abuse of children might be “worth it” if it produces better numbers is a symptom of how disconnected we've become from our own humanity.
No, not everything is a tradeoff. Not every evil can be weighed and justified on a scale. Some things are wrong because they desecrate the very fabric of what makes us human. And when you start making exceptions for the unthinkable even hypothetically, you’re not reducing harm, you’re training people to see children as variables in an equation.
That’s not protecting them. That’s predatory logic. And there is no place for that in a sane, moral society.
Some lines exist not to be debated, but to anchor us when everything else is up for grabs. And if that sounds foreign to you, maybe it’s not the world that needs to change, maybe it's your conscience that needs to wake up.
But the reason creating fictional depictions of harm is harmful (even in your ontology) is that it leads to moral collapse or whatever, which then leads to actual harm happening. If it doesn't lead to actual harm, then the fictional depictions aren't inherently harmful!
But even if you won't look at the numbers, in your world more children are harmed than in mine, and that's fucked up (not to mention your fault).
If you think the only reason depictions of harm are wrong is because they might lead to real harm, then you’ve already conceded too much. You’re treating morality like a spreadsheet as if the only thing that makes something wrong is the outcome it produces. But some things are wrong in and of themselves because they degrade our sense of what is sacred, of what must never be treated as permissible under any circumstance.
You say in “your world” fewer children are harmed. But your world is one where abuse is rebranded as simulation, and innocence becomes an input in a moral experiment. That’s not a safer world, that’s a desensitized one. A world where we’re trained to view atrocity through the lens of utility until we forget how to recognize evil when we see it.
You’re not reducing harm. You’re normalizing detachment. You’re laying down moral explosives under the assumption no one will ever step on them.
This isn’t about refusing to look at numbers, it’s about refusing to let numbers erase what matters. Protecting children isn’t a function of efficiency. It’s a line we draw to remind ourselves who we are. And once you cross that line, even in theory, you’ve already begun to lose what you claim to protect.
Don’t try to blame people like me for holding that line. The harm doesn’t come from those who refuse to budge. It comes from those who think everything is negotiable.
You refuse to negotiate, and you pay a hefty price in the form of innocent children being abused unnecessarily (which is why I called you a monster). The price you pay to not negotiate is too high.
If degrading our sense of what is sacred leads to more harm, then yeah, that's bad. But if it doesn't, I'm willing to sacrifice the sense of the sacred for reducing harm.
A world that doesn't recognize evil when it sees it leads to more harm, which I oppose. But if it doesn't lead to more harm, how is it evil? Evil is defined by the harm! If there's no harm, I don't see how it's evil.
Whatever your moral intuition says, it has to cash out in real harm at some point. Reducing harm as much as we can is the point, and that's what I will always support. It's sad that you would oppose that.
You keep framing this as if refusing to negotiate with evil is the problem as if the moral boundary itself is to blame for the suffering, not the systems or mindsets that allow it to exist in the first place. That’s not just wrong; it’s a complete reversal of cause and effect.
You call me a monster for not compromising, but here’s the truth: I’m not the one introducing simulations of child abuse into the moral ledger and asking whether they might be “worth it.” I’m the one saying that some boundaries are sacred, that children are not variables in your thought experiment, and that our humanity begins where our willingness to compromise ends.
You reduce evil to harm and harm to numbers but morality isn’t just a consequence calculator. It’s also a compass. If you throw out the compass in favor of results alone, you may walk efficiently, but you'll never realize you've been heading into darkness the whole time.
Your world may look optimized on paper, but it’s spiritually bankrupt. It demands the sacrifice of our moral intuitions, the very instincts that tell us to protect the vulnerable, to reject what is perverse, to recoil at the grotesque. That isn’t progress. That’s decay.
You say you'd sacrifice the sacred if it "doesn’t lead to more harm." But once you’ve sacrificed the sacred, your definition of harm becomes malleable. And that’s how societies forget what needs protecting in the first place.
You can call that sadness. I call it clarity. I’m not sorry for refusing to negotiate with ideas that dehumanize innocence. That’s not stubbornness, it’s the one thing keeping us from becoming what we fear most.
"This kind of conversation should never exist in the first place"
blah blah blah
Let's not use our brains. Let's certainly not think. Oh my, the "violation of conscience, and of basic human decency"
I have news for you - humans really aren't all that decent. People who hide behind these types of moral outrage often have the most to hide, and often the louder they are, the more it probably reflects on themselves.
Sorry, no. Problems don't magically go away or get solved by sweeping them under the rug and refusing to talk about them because they are uncomfortable or icky. Society and the real world is full of messy nasty things that need good, EFFECTIVE solutions.
If you can't handle that by all means bury your head in the sand and don't participate.
But for this:
"I believe this kind of conversation warrants serious reflection, not just from readers, but from the platform itself. Silence and neutrality are not options when it comes to the protection of children."
This is a veiled threat that anyone daring to discuss how to deal with this problem that does cause harm to children should be silenced. This is NOT how you actually protect children, this is how you protect predators. You should be ashamed. You are empowering pedophiles and basically offering up children for them with an attitude like this. You are part of the problem.
Let me be very clear: abusing a child is moral murder. You don't try to “optimize” murder. You don't debate how to make it "less harmful." You crush it with every legal and moral force a society has.
Suggesting that people like me who refuse to entertain grotesque hypotheticals about synthetic child abuse are “part of the problem,” is one of the most revolting accusations I’ve seen here. It’s cowardly projection. It says far more about you than it ever could about me.
You think moral outrage is a façade? You think people like me are “hiding something” because we draw a hard line against predators? That’s not reason, that’s rot. You’ve chosen to mock moral clarity in the name of intellectual “courage,” but there’s nothing courageous about moral collapse dressed up as logic.
We've lost something vital in society, the instinct to protect, to fight, to draw unbreakable lines. Too many men today are taught to suppress their warrior nature, to be passive in the face of evil, to avoid discomfort rather than confront depravity. And now? Our children aren't safe. Even animals aren't safe. Because the protectors are silent or worse, confused.
We don't need more “tolerance” for moral decay. We need courage. We need spine. We need people willing to make evil afraid again.
Humans made laws. Laws did not make us. And without morality at our foundation, any law becomes a tool of rationalized evil.
The solution is to draw an unbreakable line, enforce it with the full weight of justice, and ensure the consequences are so severe that even the thought of such acts brings fear.
Our conscience, our morality, and our instinct to shield the innocent must come first or we become no better than the evil we claim to fight.
Let’s be honest: you didn’t misread my comments, you read them, and still chose to twist them. Why? Because you’re not looking for solutions. You’re looking for justifications. You’re not disturbed by what we’re discussing, you’re disturbed that someone dared to say no to that discussion.
If that enrages you, ask yourself why. I will never let a conversation like this pretend to be “rational” when it's actually morally bankrupt.
If lives are too valuable to play calculus with, then that means they are also too cheap to be worth optimizing. Perhaps you believe lives are cheap in this way, and are fine with that, but I, for one, am not.
Absolutely. When so-called "rationality" starts justifying the unthinkable, like normalizing simulated child abuse in the name of data, it's no longer reason, it's moral rot.
There’s nothing enlightened about stripping away conscience to optimize cruelty. Some values must remain sacred, or we lose the very humanity we claim to be protecting. I'm glad others see how dark this “logic” can really get.
While I'm not particularly offended by this general line of questioning, I think this post follows a general pathology in public discourse of conflating pedophilia as a clinical condition with abuse of children as a social problem. It may seem counterintuitive, but they're not particularly tied. This is maybe best discussed in terms of this claim in your post:
"Probably most of the pedophiles who risk jail and extreme social ostracization to actually molest a child, are the ones who are really into it."
As it turns out, if we look at who tends to assault children (as opposed to e.g. compulsively collect CSAM), most offenders are not hardcore obligate pedophiles. This is because most crimes (of all types, but especially sex crimes) are not coldly planned days in advance, but rather happen as a spur of the moment response involving cognitive impairment (from drugs, especially alcohol, disability, stress, etc.) and opportunity (access to a vulnerable subject). That is, sex crimes are explained by poor impulse control, not by super-strong core desires.
Of course, serial sexual offenders who target children exist. There are people out there making child porn. But if we're to treat CSA as a social problem, pedophiles shouldn't be our main concern. Based on the numbers, we should be thinking about what affects the behavior of alcoholic stepdads. I'm not certain that AI-generated porn matters to alcoholic stepdads, so maybe we should consider the weak effect it might have on hardcore pedophiles as the dominant concern.
But there's another possibility that has to be considered. If most assault events are explained by poor impulse control, then we have an interest in optimizing for the disgust that people feel towards sex acts involving younger children. Reducing people's ability/likelihood of viewing sexual materials involving children might help avoid reinforcement patterns that go against that disgust response. I think this model is a lot more plausible than a "CSAM will make people obligate pedophiles" model.
While I'm minded to be laissez-faire towards any media that doesn't require harming a child to produce, we (as a society) should probably think hard about this when considering how/when to distribute AI-generated images that sexualize children.
This is an interesting line of thought. I'm unaware of the statistics, but based on demographics it does seem very likely that most people who actually commit such offenses are not pedophiles (who are statistically rare) but rather people with poor impulse control who find themselves in an opportunistic situation to take advantage of someone they should not (a far more statistically likely scenario).
"I'm not certain that AI-generated porn matters to alcoholic stepdads"
Actually... Let's consider your scenario a bit further. We have an alcoholic stepdad. Let's say this stepdad at least occasionally consumes porn. Let's further suppose that Aella's proposal of flooding the internet with CP to suppress the real-world urges of actual pedophiles gained traction and had been enacted years earlier.
So whenever this alcoholic stepdad visited porn sites he would be likely to come across some of this content. Do you think an alcoholic stepdad with poor impulse control might just click on some of that and watch some?
If this stuff literally "floods" the Internet, odds are he's eventually going to trip over something that reminds him of his stepdaughter... Might this alcoholic stepdad with poor impulse control click on that?
Then what happens one day when he and the girl are home alone, he's drunk, and a compromising situation occurs? Do you think the exposure to CP might compound his poor impulse control and make a bad outcome more likely?
This made me think. AI CP would almost certainly destroy the market for CP. But obviously there is a difference between viewing CP and being a person who actively assaults a minor. IMO the best work would be done by disentangling the viewing of CP from the behavior of being an assaulter. They are in reality quite different actions, but they are often treated as the same both in common discourse and in a legal context.
Then why isn’t it called just _abuse_? And why is it so rare for a tall, fit woman to do it to a small, dilapidated, out-of-shape man even if she could easily overpower him?
You probably don’t need me to say this, and I’m not sure I should say it, having read about your murdery-looking visitor (<https://aella.substack.com/p/the-attempted-kidnapping-at-my-house>), but I admire your bravery, once more. You’re definitely much braver than anyone making fun of you about the needle thing.
> People somehow have interpreted this as me supporting child sexual assault, which is confusing to me but makes more sense if I model people as being LLMs that get triggered if you say too many negatively-flavored keywords too close together.
Nice model. Before LLMs became widespread, we had to make do with terms like _Copenhagen interpretation of ethics_.
I've never in my life seen a person defend being able to watch CP so hard. Whether it's real or fake is irrelevant; even if it's AI, if you get caught distributing fake CP you're going to jail. We are talking about the same government that is using every soft target in the country to completely eliminate it from society, and now is the time you decide to drop this? Yeah, up until this post I was cool with the edgy takes, but no one's going to look past a video of a child in a sex act and be like "no wait, it's cool, this kid's got 2 left hands and the right one has 7 fingers." Yeah, I'm unsubscribing from this.
Police, FBI, and CIA are already allowed to have CP libraries for purposes of bait, and they do own and use those libraries. There is no "difference" introduced via AI CP.
Yepp, exactly. It doesn't matter whether it is AI or not. Child pornography is not okay, and I cannot comprehend why anyone would defend AI CP while making it look better compared to actual CP, when both things are equally problematic and can cause worse consequences.
Yeah I think this is the core of the debate here. Aella is coming from the assumption that fake CP is less bad than actually abusing children, others are operating from the assumption that all CP is equally and infinitely bad.
Can't justify that on utilitarian grounds, but it's sacredness or something
This paper goes over the data from Denmark, as well as the other countries mentioned, and the claims contained therein. It found that violent sexual assault and rape actually increased when porn proliferated, while many other sex crimes such as voyeurism, "peeping," incest, and others were simultaneously decriminalized, which made it incorrectly appear that sex crimes dropped overall during that period.
It also studies the effects of Sexually Oriented Businesses (SOBs) in a couple of major cities and found:
♦ Austin, TX (1986): in four study areas with SOBs, sexually related crimes were 177% to 482% higher than the city's average.
♦ Indianapolis, IN (1984-1986): between 1978-1982, crime in study areas was 46% higher than for the city as a whole. Sex-related crimes were four times greater when SOBs were located near residential areas vs. commercial areas.
♦ Garden Grove, CA (1981-1990): on Garden Grove Blvd., seven adult businesses accounted for 36% of all crime in the area. In one case, a bar opened within 500 feet of an SOB, and serious crime within 1000 feet of that business rose 300% during the next year.
♦ Phoenix, AZ (1978): sex offenses, including indecent exposure, were 506% greater in neighborhoods with SOBs. Even excluding indecent exposure, the sex offenses were still 132% greater in those neighborhoods.
♦ Whittier, CA: in comparison studies of two residential areas conducted between 1970-1973 (before SOBs) and 1974-1977 (after SOBs), malicious mischief increased 700%, assault increased 387%, prostitution increased 300%, and all theft increased 120%.
Virtually all SOBs, regardless of the city in which they are located, have similar negative effects upon their surrounding neighborhoods. The Indianapolis study concluded that "even a relatively passive use such as an adult book store [has] a serious negative effect on [its] immediate environs." It is difficult to miss the implication that these harmful secondary effects simply reflect something harmful in the nature of the material.
Further, "high-frequency pornography consumers who were exposed to the nonviolent, dehumanizing pornography (relative to those in the no-exposure condition) were particularly likely to report that they might rape, were more sexually callous, and reported engaging in more acts of sexual aggression. These effects were not apparent for men who reported a very low frequency of habitual pornography consumption. The authors noted that the effects of exposure were strongest and most pervasive in the case of exposure to nonviolent dehumanizing pornography, the type of material that may in fact be most prevalent in mainstream commercial entertainment videos.
The study found that more than twice as many men indicated at least some likelihood of raping after exposure to this material (20.4 percent versus 9.6 percent). Detailed analysis revealed that these effects occurred primarily for high P (psychoticism) subjects, those who are inclined to be rather solitary and hostile, lack empathy, disregard danger and prefer impersonal, non-caring sex (although not meeting clinical criteria as psychotics)."
A lot of data has been parsed more broadly on this subject and it doesn't bode well for the argument in favor of the proliferation of AI child pornography to combat real child exploitation.
The devil is often in the details when it comes to such things. One thing to consider regarding sexually oriented businesses in the United States, particularly for the date ranges mentioned here, is that the US is a rather sexually repressed society. It was far more repressed back then, leaving very few places to engage in anything sexual outside the home, which I would imagine led to more incidents at the few places that did exist.
In one of the cited examples, seven adult businesses were tracked from 1981-1990. By this time, sexual mores had loosened significantly: the free love movement began in the 1960s, and the massive boomer generation had already reached sexual maturity. They grew up in an era where sexuality was far less suppressed and there were significant cultural influences condoning the liberalization of sexuality in society.
Young people are more impulsive and commit most crimes, sexual or otherwise, and by this time they weren't nearly as repressed as the Silent/Greatest generations or earlier were.
Further, 7 locations in Garden Grove (https://en.wikipedia.org/wiki/Garden_Grove,_California), a city of 123-143k at the time, is not exactly an urban metropolis where every sex pest can gather in one spot - it is seven spots dispersed through a relatively small population (~1/17,500 - ~1/20,400 inhabitants).
This was a time period where sex shops were perhaps not as overlooked as they are today, given greater Christian influence, but they were still certainly quite common.
When it comes to sex, it is something primal and aggressively gatekept by men, and as cited in the prior comments, porn exposure led to more dehumanizing values expressed towards women, a greater positive inclination towards rape, and stated intent to commit rape.
Showing intent towards committing crimes is positively correlated with committing crimes: "...attitudes toward crime, subjective norms, and perceived behavior control effectively predict the intention to injure, steal, or use drugs. Criminal intention effectively predicts the occurrence and the frequency of injury to others, theft, and drug use."
It also found that for child sexual abuse, "...An anonymous self-report questionnaire was adopted to collect data from 915 incarcerated males at Taipei, Taichung, and Kaohsiung prisons in Taiwan and 559 male college students from 6 public and private universities in Taipei and Hsinchu. The results show that the intention and practice of child sexual aggression are related. Child sexual aggression intention accounts for 13.7~28.0% and 2.2~8.4% of the variance in practice in male inmates and college students, respectively."
The devil is in the details, and the details tell us that pornography results in dehumanization of the victims and a ~2-fold increase in the stated intent to commit rape against those portrayed in porn, and that this is significant in predicting the incidence of aggravated sex crimes.
Further, in the previous document I cited (https://www.protectkids.com/effects/justharmlessfun.pdf) it was found that after liberalization of pornography, total rapes and sexual assaults increased on a national level, even if the overall incidence on paper declined, due to lesser sex crimes, such as peeping, voyeurism and incest being simultaneously decriminalized.
Very disturbing post, but I don't agree with the commenter who finds that a reason not to have it. In my opinion, if there's a way to reduce the behavior, or the harm of the behavior that occurs, there may be a moral obligation to do it. At the very least to consider it.
I wonder if "nearly legal" porn contributes to the development of soft fetishes.
And I am extremely cautious about any restrictions of first amendment rights (in the US). Every regulation involves a regulator, and in all too many cases these are purchased, controlled, and, if I may, perverted.
Make it only legal to use authorized generators. Make those generators leave an invisible identifier in their images. If the identifier isn't there, the image is considered real and punished accordingly
The problem with this is that the identifier could just be cloned onto a "real" CP image. The best solution I've heard suggested was to prohibit sharing the actual images but to have some method for easily sharing the generation parameters, which could then be generated by the viewer, perhaps even having on-the-fly generation performed seamlessly within the viewer's browser. It would require those people to have better-than-average hardware, but would eliminate the enforcement problem for LE.
This would be solved by providing the seed, model, and +/- keywords used to produce the image. You would be able to produce an exact replica if it's generated.
If you just mean that the image should include the creation metadata (or be required to do so), that wouldn't solve the problem. In order to verify that the metadata actually matches the image, the LE agency would need to download every model and generation application anyone uses, generate each image based on that metadata, and then see if the images are identical. Assuming a rather modest estimate of 500k AI-generated CP images produced per day (based on the number of AI porn images created on one website 2 years ago) - that's over 5 per second - it seems that LE could never practically sort through those images for "real" CP and thus could never prevent proliferation of CP involving real children.
If you meant sharing the image creation data, then, yes, those are the "generation parameters" I was talking about. But unless there is a standardized method (think an AI HTML tag like "img" with an associated standardized generation model and engine within the browser), it would not be any more functional than people just posting generation parameters on a text-only message board - which technically-minded people can do already, likely without running afoul of any law in many countries.
Edit: Also, a big problem with both the "embedded metadata" and "invisible identifier" approaches is that they would require a complete reworking of how the internet functions. Almost every internet service, web browser, email service, messaging service, image board, message board, chat room, etc, resize and reformat images constantly to save data transfer and storage and to improve user experience. That can - and usually does - destroy any digital watermarking and metadata.
So no free-software generators? We’ll have to police all programmers to make sure they don’t secretly develop one. And we’ll have to police everyone else to make sure they don’t secretly learn to program and do the same. And who gets to squeeze the juicy oligopoly of authorized, non-free generators, which, because they are non-free, can do whatever the authorized developers feel like, in addition to generating imagery, and noöne else will know? Of course, those who already have enough political power to get their way.
I am attracted to children, but have never and will never have sexual contact with a child. So let me offer an insider's view on the questions you raise, because... I appreciate your points, but I don't think you're right in all respects, and even if it's against my "interests" I think it's worth clearing some things up. Please pardon my very frank talk in this comment about attraction to kids and child molestation.
First of all, some terminology: I am a pedophile but not a child molester. I have the sexual attraction but would never touch a child inappropriately. I also don't view child pornography. However, I understand the urge to view it very well. I feel jealous of the people who do, although I'm glad that I stay away. I was also clearly born this way; this is very obviously a "deep" fetish for me, and in fact I think of it as a sexual orientation. It became obvious at age 13 and since then, dealing with it responsibly is just something I've always had to do in my life.
The core question you're asking is: What if more sexual outlets were available to people like me? I agree that having sexual outlets is likely to, on balance, reduce offending. I think anyone who thinks clearly about this will see the argument: imagine going through life if all you're allowed to do is masturbate without any visual stimulation. We see, for example, what happens with priests or others who try to blanket deny their sexuality. It goes badly.
Nonetheless, the impact of porn varies by the person. Some people will see AI porn and it will be a useful way to vent their need to get off and then they will go about their lives. Others will get obsessed or spend too much time with it, sinking further into the abyss. It's ultimately up to each person to manage their desires and sexual needs responsibly. The question is if the positive or negative effects of porn access are on average larger, and I think we have some decent data (which you pointed out) to suggest that porn access reduces sexual offenses, but... it's not the strongest data, you know? It's suggestive but far from proof.
OK, that's my overall take. Now for some places where I want to correct some of the things you said since I have that "inside view."
Right now, there is plenty of dubiously-legal stuff out there. There are drawn images and 3D renders of fake kids. Most "responsible" sites (grading on a curve: they are full of drawings of kids getting sexually molested) make sure that anything posted doesn't look too realistic, even if it's made with AI. My personal experience is that having access to such images is a pure good for me: it allows me to have more satisfying orgasms so that I don't become a pent up mess of a person, and I have not seen any rise in my desire to have sex with kids. (I suspect my desires actually go down because it gives a release, but it's awfully hard to tell for sure.) My experience also doesn't generalize to everyone: I'm sure that for some people, sexual imagery involving children "gives them ideas." Again, it comes down to the averages: do you get more positive or negative for most people?
I should also note that this stuff may still be illegal because the first amendment has an exception for obscenity. So drawings and computer renders that don't look realistic may still be illegal. Obviously, people will have different opinions about that; I think it's a mistake to make this stuff illegal, because I think more people are like me (helped by access to it) than those that will be more likely to offend, but I have no way to get objective data. Obviously, my friends online are those who don't offend.
However, my point is, it's not "realistic AI or nothing." We really have a number of policy choices: we can keep the status quo, legalize stuff that doesn't look too real, or legalize anything that provably does not involve real children even if it looks realistic. Which is best? I don't know, but a baby step might be legalizing stuff that looks fake (i.e. hentai involving kids). (I also wonder about the following wild situation: I have a couple of friends whom I met online who were molested as kids, but where it's become a fetish for them and they wish they could show me those images. Should they be allowed to show them to me if it's between two consenting adults, and thus there is no "victim"?)
Another thing that didn't feel right in your write-up is the accessibility of real child pornography. I've talked to people who look at real CP, and the truth is that it's definitely accessible if you do not yourself produce it. My sense is that there's a bunch of (mostly older) stuff floating around that anyone can access, and then there is some stuff (probably the newer, better(?????) stuff) which you can only get in trade, as you indicate. So the real stuff is unfortunately widely accessible if you know where to look, but I think this is still better than the world you summarize: at least it's not encouraging people to abuse kids just to get access!
If I could shape the world myself, I would create a world where sexual contact between adults and children was still looked upon as horrendous and punished harshly. Similarly, production of child pornography. Viewing of child pornography I would have milder punishment with a focus on treatment, because I've seen tons of teenagers who are just realizing their attractions fall into the wrong parts of the internet, get drawn in, and then get addicted. I would legalize stuff that does not involve real children. This is a guess, but I believe it would reduce child victimization.
Anyway, apologies for writing so much but this is obviously very important to me. If anyone is still reading, I wrote on my own post about similar questions, although it was before the generative AI boom so it doesn't address that as directly.
And if anyone has additional questions for me, I am happy to answer them as honestly as I can. I try hard to approach these questions from a perspective of evidence and genuine inquiry, and I am as open as I can be given the need to protect my identity.
> I should also note that this stuff may still be illegal because the first amendment has an exception for obscenity
It should be noted that "obscenity" is defined rather strictly for this purpose: under the https://en.m.wikipedia.org/wiki/Miller_test precedent something is obscene only if "contemporary community standards" would find it offensively sexual (an unclear & variable standard) & if it has no "serious literary, artistic, political, or scientific value"; this is strict enough that even what is conventionally considered pornography often wouldn't qualify (eg. the Utah case cited on that Wikipedia page). If https://en.m.wikipedia.org/wiki/Legal_status_of_fictional_pornography_depicting_minors is correct, even unrealistic simulated child porn that is legally obscene is banned by federal law, but this is enforced rarely & usually against people who have committed other sex offences.
Yes, agreed. Nonetheless, it still feels like there's a bit of ambiguity in the law, but perhaps this is me trying to avoid feeling like I'm doing something "illegal" because I think it is beneficial and harms no one.
Haha, I love how Aella stirs up a hornet's nest; no one could do it better than her. :)
To me, the whole idea of limiting an AI image generator is as absurd as limiting MS Paint. Both tools can be used to turn imagination into creation, but imagination always comes first. Can we criminalize thoughts? I don't believe in crime without a victim. Someone desperate enough could make CSAM pixel by pixel for personal use and no one would know, but is it a life-changer? I doubt that. I think the impact of porn on people is greatly overestimated, to the point some could forget fap is possible without porn too! So my guess is that AI-generated CSAM wouldn't really change anything in a positive or negative way, at least if it's just for personal use. Even now, nothing stops pedos from jerking off to their imagination and non-sexual stuff; it's not just a porn-or-real-children dilemma. If they cross the line of risking jail time right now, it's probably a deeper problem than just lack of fap material.
My only concern is that AI-generated CSAM in public spaces could waste lots of time and resources investigating fake stuff instead of helping real victims. But I guess this problem goes far beyond the mentioned topic and may be part of the whole AI-generated-fake-media threat.
You see, people like limiting things. How far could we fly if we weren’t too busy clipping one another’s wings?
> some could forget fap is possible without porn too!
Well, I’ve read women claiming to be offended when a man in their vicinity has a spontaneous erection, because of what it says, according to them, about his thoughts. Eschewing porn won’t save you.
This is all obviously correct, and will obviously never happen. We could have perfect AI sex bots to fill this role and crater demand for real victims, and the same people with visceral reactions would be crying about normalization.
Pedophiles are sick people. Their brains mistakenly perceive a prepubescent individual as fertile, and this causes harm towards the individual. It is a real mental illness, but as with murderers and thieves, they are human beings as well who need rehabilitation.
I myself work with sexual assault victims. It's very sad to see how our current society accidentally enables such things to happen. The lack of sex education towards the youth leads them to being unaware of all the dangers that happen. Our society rightfully punishes these people but rehabilitation needs to be normalized. We should offer these people rehabilitation and punishment instead of keeping the subject as taboo as possible. Rehabilitate the offenders, heal the victims, and educate the youth on the potential dangers.
We also need to differentiate between true pedophilia and emotional outrage. Many people easily criticize others for finding 16- or 17-year-olds attractive, even though it has been a normal thing for the entirety of human civilization. Criticizing a 20-year-old for finding a 16- or 17-year-old attractive just pushes the issue past its tipping point and downplays the real victims. It is akin to the outrage and homophobia of the late 1900s. There is a difference between natural relationships taking place and predators actively targeting children who don't know any better. The age of consent is 14-18 in many countries for this reason.
The real issue is real pedophiles who damage lives of victims who are too young to even know what puberty brings them.
Obviously evolution favors attraction towards pubescent, fertile individuals. Fertility means that procreation is able to take place, and thus evolution treats it as normal. What is not normal, though, is pedophilia. Evolution does NOT favor procreation in prepubescent individuals. Even if childbirth were to occur, it would easily kill both the child and the mother. This is why pedophilia is not a natural thing. It is a mental illness akin to schizophrenia.
I'm attracted to prepubescent children, but I don't perceive them as being fertile mates. I'm not even interested in having intercourse with them. I just find their physical and personality traits attractive. Now, of course, I don't speak for all pedophiles, but I've never met one who thinks children are fertile.
Agreed. I'm romantically and sexually attracted to children. They're just indescribably beautiful and charming. It's nothing to do with fertility. That just seems ridiculous. And it's absolutely nothing to do with power or control or anything like that. On the contrary, I think it's terrible that children are kept so powerless and dependent. It's a major factor in child abuse, both sexual and otherwise, and it seems downright cruel to spend 18 years of a person's life teaching them that they're inferior and unworthy of respect.
I know of two studies that have looked at why people are pedophiles.
"Taken together, the pedophilic subjects of the present study showed an over-responding to infant animal stimuli in a network of brain regions that contribute to motivating behaviors. This is in accordance with our hypothesis that nurturing stimuli receive additional processing resources in pedophiles. It is of interest that some of the areas of increased response to infant animals are related to the mating domain. The left anterior insula, being a crucial area of nurturing processing, was also frequently found to be activated in sexual brain studies (Stoleru et al., 2012). Furthermore, the left anterior insula (as well as the SMA) is a constituent of the human attachment system, thereby enabling both nurturing and pair-bonding (Feldman, 2016). Based on both observations, (i) the over-responding to nurturing stimuli in various motivational areas and (ii) the functional overlap of nurturing and sexual processing of the involved left anterior insula a tentative and simple model of pedophilia could be as follows: Nurturing stimuli receive additional processing resources by mating-circuits. In case of human infant stimuli this leads to a sexual connotation of infant stimuli."
"Our results support the idea of an overactive nurturing system in pedophilia, which may be influenced by the endogenous testosterone level."
I don't mean to cherry-pick, but every other study I've seen on the matter was based on a sample of convicted child molesters — hardly a representative sample.
I could be mistaken, I'm certainly not an expert on the subject. But I've never heard it being about fertility, it seems to be much more about naivety and control. Children are generally uneducated (particularly about sex), much less sure of themselves and much easier to manipulate/control.
I get the impression it's similar to the reason a whole lot of guys like the idea of virgins. This is a VERY common interest though, and (assuming legal age) not illegal. Personally this also gives me the ick, though.
Predators exist in all shapes and sizes. Child predators are simply a subset of the overall predatory nature of human beings. All predators seek weaker individuals. This doesn't mean just being younger; it also means being poorer, smaller, weaker, shorter, dumber, more naive, etc., and predators also seek out individuals who were already victims of previous abuse. And fun fact: we are all predated upon in everyday society - by our government. Your government is the biggest predator to exist, much worse than the 30-year-old sex offender down the block.
I have dealt with all types of cases involving these predators. These people are mostly individuals who are mentally ill, were severely abused themselves, or are simply dangerous criminals already. However, these people can easily be taken off the street and either punished or, hopefully, rehabilitated. Your government cannot. They will use any means necessary, including the promise of protecting your children/citizens (remember the Patriot Act?), just to gain control of you. And remember when Apple tried to implement CSAM-scanning technology? Thankfully we stood up to that. It's all about control. Our government literally does not care about sexual assault victims. It's sad and frightening.
On the topic of virgins, it is mostly a purity thing that revolves around marriage and long-term relationships. It is less a sexual fetish and more a romance thing overall; sexual fetishes around virginity are popular too, but they stem directly from purity. A man feels he can be more connected emotionally to a girl who has never had sex than to a girl who does sex work and has been with dozens of men.
It does seem like nature and evolution tend toward neoteny. If you dive into the rabbit hole of decades of scientific and biological studies, it could actually be where humans get beauty from. Beauty seems to stem from the combination of maturity and youth, maturity meaning pubescent, fertile, and aware of the entire ordeal of sexual reproduction. Pedophiles do not see this, and they immediately go for the youngest possible. Nature works with relativity, and since pedophiles aren't able to follow the path that evolution intends, it just leads to problems and abuse. On the other hand, this is why humans are most fertile in their late teens to early 20s. But pedophiles are way, way off the line here.
Really? You're going to bring governments and politics like that into this? Shut up, you fool; saying a government is as bad as someone who intentionally harms children frankly makes you a horrible person. Bluntly, my government is responsible for delivering a century of world peace and prosperity. It's also the only country that puts citizens' speech and rights above the government. Up until the current admin, at least. I'm done with you; you're not worth having a conversation with.
Indeed, it's quite absurd to compare someone who rapes a single-digit number of children to someone like Putin, who's brought death and suffering to many orders of magnitude more people.
At least the US government has usually been less evil than many others. :/
Being a Reddit-brained materialist, you have lost the ability to mentally model the people who are into This Stuff, who specifically get off on the fear and pain that children suffer during the creation of these materials.
You assume the AI stuff will act as a "dampener" when any normal person on the street instinctively understands that it will inflame. That's because it's not about seeing naked bodies. It's about sadism, the euphoria of power over a helpless creature, of defiling an innocent soul. A real soul, not a fake one.
You see This Stuff as just another expression of sexuality, but it's not the same thing. It's a different thing.
Look at stories about people who are caught with gigabytes of This Stuff on their hard drives. They require an ever-flowing wellspring of fresh suffering. They risk their lives trying to get more of it, and that's part of the thrill.
If you really want to ruin your day, google "Eleanor Hunton Hoppe," a respected socialite who was recently caught trying to meet up in the middle of the night at a hotel to abuse an 8yo. She had loads of This Stuff on her computer. A line from texts that were revealed during her trial that stuck with me is "that hazy/dreamlike state is perfect to introduce a variety of new things.” If you have kids, you know what state she is talking about. Kids wake up in the middle of the night, loopy, confused, they just want to snuggle and be silly, it's basically the cutest thing ever, special moments you remember for years. Every parent reading this knows what I'm talking about and is viscerally repulsed to their core at the idea of someone exploiting this particular moment of vulnerability to hurt a child.
That's what This Stuff is all about, and what can you call it but evil? Not to sound like a libtard but THE CRUELTY IS THE POINT.
These are not urges that you can simply "get out of your system." If you think AI-generated alternatives are going to satisfy, I'm sorry but you don't understand human nature and no amount of quirkball *erm checks notes* Bill Nye psychobabble will convince us otherwise.
Also, about your reference to gay porn: uh, straight guys don't watch gay porn. They also don't try to masturbate to gay porn. But I'd bet very good money that if 33% of all "straight" porn actually had bi-male (aka gay) porn mixed in, and the guys watching it could not skip it for whatever reason, then over a couple of decades you'd end up seeing a whole lot more bisexual or bi-curious guys. Exposure does normalize things, and repeated exposure, particularly during arousal, will form mental connections over long periods of time.
I even have some personal experience with this. There is something I absolutely was not interested in; not repulsed by it, just not interested. But I had multiple girlfriends over numerous years who ended up liking similar things, and through repeated exposure I did in fact end up growing an interest in it, to my great surprise.
This is a real effect, it does happen. You can debate the degree if you'd like. But it ABSOLUTELY WOULD lead to some people ending up being interested in CP that would not have been otherwise. There is no avoiding that simple fact. Given the nature of CP in particular even a relatively small quantity is not acceptable.
This is a complicated subject and as mentioned above I do understand what you're trying to accomplish. And it is also true that completely banning things and making things illegal absolutely does not stop them and can sometimes make them more interesting or desirable to some people due to psychology. Both are true.
But flooding the Internet with CP would almost certainly cause more harm than good. I'm a very open minded person but things that harm children are not okay.
This is a very complicated subject because human psychology is very complicated, messy and varies widely. Most humans are reasonably good given the chance, but there are always (sadly) going to be some bad people out there that will do bad things no matter what.
I agree with you here and also with Aella's critic on this one, too. I didn't find Aella's arguments particularly convincing this time around.
Does having people engage in simulated abuse (simulated rape, choking, hitting, beating, et cetera) with a consenting adult partner make these people more likely to engage in actual abuse?
> This is a real effect, it does happen. You can debate the degree if you'd like. But it ABSOLUTELY WOULD lead to some people ending up being interested in CP that would not have been otherwise. There is no avoiding that simple fact. Given the nature of CP in particular even a relatively small quantity is not acceptable.
We should look at the net effect here. In other words, is the net effect fewer children being abused, no change, or more children being abused? If this results in an extra couple of children being abused but a much greater number of children being saved from abuse, then this would still be a tradeoff worth making, no, since the net effect would still be less child abuse?
You are not going to sleep well at night knowing you indirectly allowed even one child to be harmed by a pedophile. Or don't you even care if that happens? What if it was your child, or a child of a friend? Then you'd care, right?
This would be a decision society, or our government as its representative, would have to make. If the benefit vastly outweighed the harm, then maybe: say, if there were strong statistics showing that something like this caused 1 additional bad outcome per 100 bad outcomes it prevented. But it would be extremely difficult, if not impossible, to ever gather such statistics. And without such statistics, the decision should always be to protect the children.
>And without such statistics the decision should always be to protect the children
But it's begging the question to assume that the status quo is what "protects children" in the absence of statistics.
I legitimately don't understand what you mean by "If we can't prove definitively whether or not AI CP protects or harms kids, we have to default to the thing that protects kids". How do we know what that is?
I think BDSM and child molestation are two incomparable actions. One is not illegal or immoral, and is something you'd conceivably try if you were aware and curious enough. The second comes at a staggering cost to the abuser, both in terms of morality and pure self-interest.
People aren't going to be trying out committing heinous crimes just because they were exposed to it during porn. They might try out consensual choking, anal, spanking, or BDSM, but not this.
Further, even if legalization of AI CSAM increased the prevalence of pedophiles by 50%, it still seems obviously true that the increase would be offset by the counterpressure of easily accessible AI material heavily suppressing demand for material involving actual children. Conventional wisdom circulating around NoFap is that porn addicts lose the motivation to pursue real women because their sexual needs are so saturated by porn, not the reverse!
You make a lot of good points, and it seems right that normalizing behaviors can result in an uptick of interest in engaging in them because of the examples you gave.
Then I think about how common it is to see serious crimes like murder in popular movies and TV series. It has probably made more people interested in committing murder, but I'm not sure how many people have actually committed one as a direct result (I've only heard of the Slender Man case, though I wouldn't be surprised if there are more examples). Murder is still heavily stigmatized, with severe social and legal repercussions, and basic human morality causes most of us to be repulsed by it. This suggests that fictional depictions of something can become prevalent without influencing a concerning number of people to actually carry out those actions. I think the harms of content depicting fictional harmful behaviors can be largely offset by enforcing severe consequences for actually doing those behaviors.
Since pedophiles and the sexual abuse of children are as reviled as murder (maybe more so; I've seen some support for murder, like when it involves a parent killing their child's abuser, executing criminals, or Brian Thompson), I find it hard to believe that making AI CP prevalent would result in a noticeable increase in actual crimes. Though I'm also not sure it would cause a reduction in crimes.
If only a small minority is driven to harm children from fictional content, it makes sense why it should still be banned or at least restricted. But what about other things like some violent video games, movies, etc. with realistic simulations of dangerous and harmful actions that make them look fun?
I feel viscerally opposed to making AI CP something that can be easily accessed, but I don’t feel the same way about things like violent movies for some reason. I feel logically inconsistent if I reject one but not the other, but maybe there’s a core difference I haven’t noticed.
Basically, I don't think AI CP would cause a noticeable increase in actual abuse the way the popularization of BDSM in films resulted in more people doing it, because the level of stigmatization and consequences seems way higher for the former. It definitely seems safer to keep AI CP illegal/hard to access, because there's probably a non-negligible chance it would push some individuals toward crimes. I agree that we probably shouldn't do it, and should instead find less risky solutions to combat child abuse.
Have you heard of mass shootings? The number of those has drastically increased over the past couple of decades as they became more widely covered in media and pop culture. The individuals perpetrating them seem very much to be motivated by media coverage of previous mass shootings. So yes, I would in fact say this effect sadly extends even to murder.
Regarding mass shootings, I've long suspected that if anyone perpetrating one were completely wiped from society (their picture and name never shown or spoken again, no interviews discussing them, etc., basically the exact OPPOSITE of the coverage they get today), the number of future mass shootings would drop over time. I'd bet substantial money on it. However, this would obviously conflict with the 1st Amendment, and there are also transparency concerns when it comes to governments and law enforcement.
Does engaging in rape fantasy roleplay and other simulated abuse (choking, hitting, beating, et cetera) make one more likely to commit these acts in real life?
Also, I think the burden of proof here should be on those who advocate criminalization. After all, we don't know what effect legalization would have on children, but we do know that it would give MAPs more harm-free options, especially when combined with realistic child sex dolls/robots, thus likely ensuring that fewer MAPs feel compelled to get castrated (and castration is a very significant harm).
Not being a woman, I can only comment on this anecdotally. From what I gather, women say that cases of their male partner choking them during sex have increased in recent years.
I do know firsthand that the number of women wanting to be choked during sex has certainly increased over the last couple of decades. Even seemingly normal vanilla women engaging in vanilla sex, though I would not personally consider choking a vanilla activity.
And, no. Your take here is completely backward. We don't help out the MAPs and make their lives better just because we are unsure what effect it will have on the children. FUCK THAT.
When it comes to protecting children I'm going to side with protecting the children. Frankly I'm not concerned about the "MAPs". The goal should always be to minimize harm to the children (presuming that is actually the goal and not surveillance or other agendas wrapped in CP). If a MAP is so worried that they might do something that they are considering castrating themselves - personally I'd suggest they go further... People like that are really not a benefit to society.
I'll be blunt. If people like (actual) psychopaths and such were completely removed from society and future generations, the human race would be vastly better off. These individuals are a blight on everyone. It's likely that the VAST majority of seriously negative things that happen in the world, and affect everyone, are caused by this tiny percentage of people.
Yeah but like, pedophiles and psychopaths *aren't* going to "remove themselves from society", so it's just wishful thinking to fantasize about pedophiles castrating themselves.
The principled argument is "Criminalizing ANYTHING requires a positive justification". There are various formulations of this, from "Everything which is not forbidden is allowed" to "it neither picks my pocket nor breaks my leg", but in general, if you're going to say "don't do this thing, or else we're going to throw you in a cage/kill you", you better have a very strong justification for "don't do this thing". In the case of murder, or rape, that's pretty easy to justify. All this argument asks is you similarly demonstrate consuming AI CP has a similar justification for criminalization.
> so it's just wishful thinking to fantasize about pedophiles castrating themselves.
The virtuous ones could, possibly, but it's still extremely inhumane not to offer them another, better alternative to this.
Yes
Porn websites themselves admit to trying to expand the sexual interests of their user bases. They truly do manipulate the viewer down different pathways. It’s like Aella is somehow separating one type of porn from all the rest, but in reality, there are no such separations on tube sites (where most people watch) and that is INTENTIONAL. There are cross category suggestions planted all over the place.
You've "seen the world of BDSM". Okay. So you're a tourist with a tiny amount of experience. My perspective is from someone who spent a massive amount of time and effort as an organizer and a leader. Running groups and organizing events. I talked to literally thousands of individuals personally on these subjects. My experience started in the 90s before the Internet had an impact on such things and went forward from there, so I got to see the effect that publicity and popularization in pop-culture had. So I'm sticking with my opinion on the subject which is derived from a whole lot of first hand experience spanning a meaningful chunk of time.
Also, your choice of example case is incredibly poor. The Handmaid's Tale? That's an extreme case that involves all of society and isn't an interpersonal kind of thing.
Here's an even more controversial take - Child porn laws are not about protecting children, but protecting social norms. If real children need to continue being abused so collectively we feel like we are in an environment where children are safe, then that's what is going to happen.
Child porn is already illegal. Any further legislation is about invading your privacy. SESTA/FOSTA was way more than enough in my face.
Look into what Apple tried to do with their storage scanning software for "CSAM". The government and corporations will always try to take your freedom away in the form of "safety". I highly recommend reading into the bills that get passed that dictate these laws. The goal of corporations is to maximize profit; the goal of governments is to maximize power.
Apple isn't government, though it's been acquiring that kind of power.
That seems to sum up the way modern society is headed, with every single cause someone claims to care about: if the ship needs to sink so we can feel like we’re saving it, let it sink.
There are social norms which should be protected.
Which social norms? And for 325 million people?
Adults having sex with minors being understood as being a bad thing is one example. Many things which cause disgust are good examples.
Boom... (my head exploding... in a positive sense :D).
Agree!
As a private teacher, I ended up in situations where I had to accompany a student to the police in order to report CSAM twice. Both times, I was met with the most indifferent cops on earth who really didn't feel like having their Miller time ruined by how much work it would be to take this seriously.
Meanwhile, I turn on the news and CSA is the worst crime in history of crimes and justifies everything, in fact my government would like to read my WhatsApp messages this very minute.
I have become so blackpilled on this, my firm belief is now that less than 1% of men who claim to care about CSA(M) do actually care, the rest are looking to protect the last bastion of lynch law, where calling for torture and castration is still fine and dandy. In fact you can still effectively boost your social status by being the person "wanting to be the most horrible towards pedos" in many male-dominated circles.
I work with sexual assault victims and offenders. You are 100% right. I hope I didn't misread your comment but I'll do my best here. The government does not care about victims. More offenders = more free jail labor. It is simply a fact. The government imprisoned people for smoking marijuana. Imagine having an ounce of marijuana and going to jail for years because of it. Our government does not care about victims nor the rights of anyone. They haven't even released the entire Epstein files. This tells you everything. They gave Epstein a reduced sentence, a sentence lower than many sentences given to people charged with possession of marijuana.
Interestingly enough, I've realized that the majority of people calling for wood chippers and lynching are further fueling the problem. Another problem is that our government does nothing to rehabilitate these people and prevent children from being abused. It protected Epstein for decades.
I was SA'd when I was a child, and seeing Generation Z on social media say that 20-year-olds who date 16-year-olds should be thrown in a wood chipper just crushes all hope of solving this problem. Many people do not realize what victims go through. Crying pedophilia and advocating for lynching at every possible opportunity just desensitizes people to the outright horrible crimes committed by actual pedophiles, as opposed to legal individuals looking for relationships or love. It is like calling every single person a racist: it defeats the original purpose of the movement.
And as someone who is a sex positive liberal who believes in equal rights, we need to offer solutions instead of letting our emotions get to us. Solutions work, angry comments do not. But as time goes on, I will continue helping victims and rehabilitating offenders. We need to work on our society.
Are you yourself not perpetuating this problem by using words like "pedophiles" to mean "child molesters" while claiming working in a field where you should be aware of this distinction, or did you just use the words like you did because it's contemporary to do so (despite the fact you're lamenting the current zeitgeist around the issue?)
Are you comparing teenagers having nudes of each other to actual CSAM? Or do you mean you discovered multiple times that elementary schoolers' parents were recording them being sexually abused and uploading to the internet, and the cops didn't care?
I mean two incidents of being extorted by an adult ex-boyfriend with imagery that was (in one case) recorded under highly questionable circumstances (I also keep hearing that teenagers cannot consent anyway, but interestingly enough, that doesn't keep the police from going all "you silly girl, why did you let him film you, huh?"), and in another case was recorded under threat of violence.
Police response in the first case was "I mean, you could just block your ex, because clearly all he wants is to get back in contact with you", and in the second case (where the parents threatened to kill my student for "being a slut"), they at least helped her retrieve her things from her parents' home safely so she could move in with another boyfriend (what a great solution), while heavily discouraging her from filing a report against her ex for child pornography, as that would mean "many other people would have to look at these images to deal with them legally, and would you really want that? Just move on with your life". I have never seen or heard from this student ever again, and for all I know, she could be fucking dead.
I also want to mention that in Germany, teachers can lose their jobs for bringing child pornography to the attention of the police, since if they ever happen to use their own devices to secure the evidence (because maybe they gain access to a class group chat where such material is being shared), that already counts as ownership of CSAM and carries a minimum sentence of 12 months - and any sentence above 11 months means losing your civil servant status. Also, child pornography is not a criminal offence prosecuted only upon application by the victim ("Antragsdelikt"), so the police's choice to not act on their own was very much illegal in both cases.
So, is this as bad as "actual" CSAM? No, of course not. Are these cases representative of what is typically understood by "child pornography" in the vast majority of cases? Yes, yes they are. And did the action match all the big talk? You be the judge.
Which part of my response is giving you the impression of "not caring'?
This isn't just a debate or an opinion; this is about protecting the innocence of children. What's being discussed here is deeply disturbing, and I feel a moral obligation to speak up.
Reading even the first few lines of this post deeply unsettled me, not just because of the subject matter, but because of the attempt to intellectualize and rationalize something so horrific. The very idea of entertaining or defending the production of AI-generated child pornography, regardless of the argument, is appalling.
This conversation doesn’t just cross a line, it risks normalizing behavior that is psychologically and spiritually devastating to victims. It is not a debate. It is a violation of conscience, and of basic human decency. No technology, no theoretical framework, no “greater good” should ever be used to justify or soften the perception of child abuse in any form.
Children are the most vulnerable, and predators have always taken advantage of that. Watching content that portrays child harm, AI or not, implies desire. A normal, compassionate human being should feel disgust, rage, and the instinct to protect, not entertain nuance in such matters.
I believe content like this doesn’t just alienate responsible readers, it dangerously blurs ethical boundaries society should be reinforcing, not debating.
You’ve lost a Subscriber. And frankly, I believe this kind of conversation warrants serious reflection, not just from readers, but from the platform itself. Silence and neutrality are not options when it comes to the protection of children.
Deeper in the article, Aella explains that pedophiles who want access to child porn must abuse children themselves and post photos of that abuse to forums. This situation is unspeakably awful, in that it amplifies the incentive to abuse children, getting you access to a community of people who share your inclination to moral atrocity and are only too happy to provide you with more fuel. Thus far more children are abused. This is an unacceptable state of affairs. If AI child porn would put a stop to that -- I don't believe Aella is crazy for thinking it might -- then I can't oppose it on principle.
This kind of conversation should never exist in the first place. There are some lines that should never be crossed, and this is one of them. The very existence of this topic, and any attempt to intellectualize or justify child sexual content, AI-generated or not, is deeply disturbing.
Children are the most vulnerable members of our society. They deserve our full protection, not to be used in theoretical debates that risk normalizing or enabling predatory behavior in any form.
The idea that creating artificial content to "reduce harm" is somehow acceptable, ignores a fundamental truth: if someone is seeking that kind of material, AI or otherwise, they already pose a danger. And introducing such content doesn’t neutralize the threat, it feeds it.
This isn’t just a moral issue, it’s a human one. There should be zero tolerance, and the law should reflect that with severe, public consequences that deter anyone who even thinks of crossing that line.
Some topics should never be normalized, and this is one of them. Full stop.
Our greatest priority here should be to make it so that pedophiles are not abusing children. There is evidence that pedophiles are abusing more children because abusing children gets them access to vaults of CSAM. We want to make that stop, and to do that, we must discuss ways of stopping it. That requires a discussion on this topic, otherwise things will never change. Banning all discussion on this topic for fear of "normalizing" an intellectual approach to these matters is not productive if we are seeking to change the current state of affairs. And I don't think the current state of affairs is acceptable.
I hear the intention behind seeking solutions, but we must be extremely clear, there is no form of child abuse that should ever be discussed as “strategic” or “functional.” The very suggestion that AI-generated content could reduce harm by catering to the same twisted desires is deeply dangerous. It doesn’t solve the problem, it feeds it.
This isn't about fear of debate. It's about preserving the most sacred boundary we have as a society: the innocence of children. When we even entertain the idea of synthetic child abuse as a solution, we desensitize people. We shift the moral line. And history has shown that once that line shifts, it's nearly impossible to pull it back.
Jesus said it best: "If anyone causes one of these little ones to stumble, better for him to tie a millstone around his neck and be thrown into the sea." That’s how serious this is.
There are topics that should never feel safe to explore. This is one of them. People shouldn’t feel comfortable even thinking about children in that way, let alone debating how to satisfy that urge with AI.
We need moral courage, not moral compromise. This conversation doesn’t lead us forward, it takes us somewhere no human being should want to go.
Zero tolerance doesn't produce the best results anywhere. Haven't we figured that out yet? I know there are a lot of papers on the topic.
Did you really just invoke Jesus here lol. Might want to refocus on turn the other cheek, cast the first stone, forgiveness, etc. Also you might want to think long and hard about the word harm and who is doing it, generally it's society revictimizing.
Jesus didn't say "Go put a millstone around your neighbor's neck and kill him because he doesn't walk his kids on a leash and lets them eat ultra-processed food".
Yes, I invoked Jesus not as a call to literal violence, but to highlight the gravity with which true morality treats the protection of children. His words weren’t about junk food or parenting styles. They were about people who prey on the innocent, and how seriously we should take that.
You want to talk about “harm” and “revictimization”? Then recognize that legitimizing the simulation of child abuse even through AI is a form of harm. It validates the desire. It shifts cultural perception. And it erodes the very instincts that make us recoil at evil.
This isn’t about forgiveness. It’s about refusing to let moral confusion masquerade as compassion. I don’t need to “cast the first stone”, I just won’t stand by while people try to sanitize the unspeakable in the name of progress.
Some things are wrong not because they result in harm, but because they are a betrayal of who we are at our core. That’s not religious dogma. That’s human decency.
Well, how is it going? Are people being deterred? Is it a good thing that we're creating a structure where people are highly incentivized to remain private about abnormal desires and never seek help? And what do you care more about? Saving the children, or being the "look at me, how I am pretending to be saving all the children"-guy?
What precisely do you want the law to do about “anyone who even thinks of” crossing that line?
We all know the earthly justice system is flawed, a "kangaroo court," if you will. But under divine law, the truth is absolute. Imagine if this happened to your child, the pain and anger would justify any action you take in defense of them.
We should focus more on that higher moral law than on the injustices of these human-made courts. When it comes to matters of true justice, divine law reigns supreme, and that’s what we ultimately need to be concerned about.
Uh ok well, if god is handling it then why are we even talking here
Just a reminder, God has handled it before. Let’s not forget Sodom and Gomorrah, the Passover during the exodus from Egypt, Babylon… just to name a few. These moments are reminders of how God deals with corruption and evil when it festers unchecked.
But it’s not just about waiting on divine justice, it’s also about taking action in the face of wrong. Just like those before us, we have a responsibility to speak up, stand firm, and protect what is good and right. Trusting God doesn’t mean staying silent, it means being willing to act while holding faith in the bigger picture.
This platform really needs an IGNORE feature. Some people just never have anything constructive, useful or practical to add.
Therefore, let’s make more absurd laws that can only be enforced arbitrarily, thus ensuring the justice system can’t be anything but a kangaroo court. It won’t help abused children one iota, but someone will be hurt. That’ll show them how much you care about the children.
Do you agree that AI CSAM is better than CSAM? Or is there some sort of sacredness intuition that prevents you from even comparing them?
God, you're so dumb.
This is such a fascinating reply.
Would you rather have a world with less real abuse but more fake AI porn, or a world with more real abuse and less fake AI porn?
According to this line of argument, you are *literally causing* more abuse to happen.
I get that you don't believe allowing fake AI porn would decrease the amount of real abuse, but for the sake of argument, *if* it did, would you allow it?
I understand the point you’re raising, it’s a classic moral dilemma: “Would you allow a lesser evil to prevent a greater one?” But here's the problem with that line of reasoning:
It assumes that AI-generated child abuse content actually reduces real-world abuse, which, to my knowledge, has no solid evidence and is a dangerous assumption to make. Without clear proof, using it as a justification becomes highly irresponsible.
Even if hypothetically, such content reduced abuse, it comes at the cost of normalizing child exploitation in any form. That normalization could lead to more harm over time by blurring ethical boundaries, desensitizing society, and potentially increasing demand for the real thing.
There are some lines that, as a society, we must never cross, not because it’s convenient, but because they uphold our humanity. Child abuse, even simulated or “fake,” isn’t just a functional issue, it’s a moral one.
Some things aren’t meant to be optimized or compromised; they’re meant to be condemned without exception.
Yes it relies on the assumption that it actually reduces real-world abuse. That's the whole point. To your second point, if it in fact increases real-world abuse then of course it's a bad idea. The entire premise is that it reduces real-world abuse in the short and long term, otherwise it's just a completely dumb idea.
Then you go on to say that even if it does reduce real-world abuse in the short and long term, we still shouldn't do it, to protect society or something. Let me just point out how insane and morally reprehensible this view is: you're willing to condemn more innocent children to unnecessary abuse, for no real reason.
Let’s be clear: your argument hinges entirely on a hypothetical, that AI-generated child abuse content might reduce real-world abuse. But there is no credible evidence for that. None. And until there is, building a moral defense on that fantasy is not only reckless, it's dangerous.
Even entertaining the idea that we should create synthetic versions of one of the most horrific crimes imaginable in order to potentially reduce it is a slippery slope to moral collapse. It’s not just about whether it works, it’s about what we become when we justify evil for the sake of utility.
Because even if, in theory, it “worked,” it would normalize deviant behavior, blur ethical boundaries, and create psychological permission structures that do more damage over time. We’re not just preventing harm, we’re shaping conscience. And when you shift that conscience, you shift culture. And when culture shifts, the unthinkable becomes thinkable.
Some lines are not meant to be debated, they’re meant to be defended, absolutely, unapologetically, and at cost.
Children are not chess pieces. You don’t sacrifice one to maybe save another. That’s not strategy, that’s savagery dressed as reason. The fact that this argument even needs to be refuted is a sign of how far some are willing to drift from decency in the name of intellect. There’s nothing smart about surrendering your soul.
>You don’t sacrifice one to maybe save another
We of course do this, every single day, with the distribution of limited resources. For instance, tuberculosis is a curable disease that over a million people a year (among them almost certainly tens of thousands of children) die from. If we as a society choose to spend money on anything else, even researching childhood cancers, we're sacrificing the TB kids to save the cancer kids. And in point of fact we're sacrificing them for something much less worthy and more stupid, but it really drives home how worthless this moral grandstanding is.
https://www.lesswrong.com/posts/Nx2WxEuPSvNBGuYpo/feeling-moral
"My favorite anecdote along these lines comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn’t put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure."
Turns out, if you refuse to put a number on the value of human life (or a child's innocence), the world sets that value for you, and it usually ends up insultingly cheap.
You're confusing resource allocation with moral absolutes, two completely different conversations.
Yes, societies constantly make difficult decisions about limited resources. But that’s not what's happening here. We're not talking about choosing between curing TB and curing cancer. We’re talking about whether or not to legitimize the simulation of child abuse and trying to dress that up as "strategy" is intellectually dishonest and morally bankrupt.
You don’t "optimize" child exploitation. You don’t play ethical games with innocence. And you sure as hell don’t compare a budgetary decision to the willful decision to allow any form of child sexualization, real or artificial.
Trying to make this about effective altruism or cost-benefit analysis completely ignores the actual harm of normalizing such depravity. Children are not line items in a spreadsheet. You can’t put a price on safeguarding their dignity. You either defend that line or you don't. And if you don't, don't pretend it's because you're being "pragmatic." Just admit you've traded conscience for calculus.
Some lines are sacred. This is one of them.
> Let’s be clear: your argument hinges entirely on a hypothetical, that AI-generated child abuse content might reduce real-world abuse.
Yes it hinges on that, in fact it's the premise and even title of the post we're commenting on. If it's not true then of course we shouldn't do it. Who knows if it's actually true, but I think this post lays out compelling evidence that it is in fact true.
And if it turns out that it is true, then not doing it would be morally reprehensible. If it really truly is the case that allowing fake images of abuse to exist would cause less real abuse to happen, then it's obviously stupid and morally wrong to continue to ban the fake ones.
If this is true, you are right now morally responsible for real abuse occurring. I would trade this current world for a world with less real abuse, but more fake abuse. You would not. Who is the monster?
Calling me a monster for refusing to normalize child exploitation, even simulated, is one of the most backwards, morally bankrupt accusations I’ve ever seen.
Let’s be clear, if you think defending a child’s right not to be reduced to fantasy abuse makes someone “the monster,” then you’ve inverted good and evil.
You say you'd trade this world for one with "less real abuse but more fake abuse"? You’re not offering a solution, you're offering a moral poison pill. You're saying: let's feed the sickness so it doesn't spread. But that's not how evil works. You don't curb depravity by catering to it. You don't stop fires by handing out matches.
What you’re defending requires the creation and normalization of a grotesque fiction that simulates the very thing we say we abhor and then dares to call it progress. That’s not moral courage. That’s spiritual cowardice.
Some things are not “lesser evils” they are non-negotiable evils. And when you try to rationalize them in the name of “saving” others, you lose the soul of your argument and the soul of your society.
I’d rather be falsely called a monster for standing against simulated child abuse than be remembered as one who stood by while people tried to dress moral decay up as harm reduction.
Thank you for the new introduction to my seminar "I like moral arguments too, but wherever I look, the functional arguments end up being a whole lot smarter"
Functional arguments might be clever, but morality is why we have society in the first place, otherwise we'd still be in caves throwing rocks at each other.
Function might build tools, but morality builds trust, dignity, and the very fabric of human life.
We’re not throwing rocks at each other because functionally, using rocks for something else clearly leads to better outcomes. Slavery was abolished because it was economically unfeasible. So was child labor and colonialism. This list goes on and on. Unless we’re talking about a few select authors, moral arguments are nothing more than a shortcut to functional arguments, with the added bonus that understanding of the functional reasoning is not mandatory. Sometimes this is an advantage, sometimes it isn’t.
Drawing functional arguments from what were ultimately moral revolutions is a dangerous distortion. Slavery, child labor, and colonialism weren’t ended because they were inefficient, they ended because people had the courage to say, “This is wrong.” That same courage is needed now.
Pornography, at its core, was never about freedom or expression, it was created to exploit and degrade. As the Meese Commission and other studies have shown, it distorts intimacy, fuels addiction, and erodes the very essence of human dignity.
Child sexual abuse is a boundary that must never be crossed. The fact that we’re even entertaining AI-generated abuse under the guise of “debate” or “harm reduction” isn’t progress, it’s devolution. It normalizes the grotesque and chips away at our shared moral compass.
Some things are not meant to be explored, optimized, or rationalized, they are meant to be condemned.
"Some things are not meant to be explored, optimized, or rationalized, they are meant to be condemned."
I firmly believe that you are the one making a dangerous distortion, because you are dismissing the idea of implementing structural measures that can reasonably be expected to decrease the occurrences of these grotesque acts. The innate weakness of moral arguments (because sociologically, they form a moral functional system, which sits opposite cognitive functional systems) is that they fail to systematically produce outcomes unless they are structurally validated within those cognitive functional systems, like science or the economy.
While I am willing to concede that moral arguments are necessary to ultimately attain favorable outcomes, they are never sufficient. There was highly visible public courage in standing up against colonialism decades before decolonisation became a structural reality. And that moral outrage was only met with a structural correlate once colonizers had let go of their ideology and come to the realization that, at its core, a colony produces a net-negative outcome for the occupying state.
Condemnation does not work well enough. Morality does not have inherent value, only betterment does.
It's a lot easier to get people to agree to get rid of a moral outrage if the economic incentives to keep it aren't strong, but when England ended slavery in its colonies, it was primarily because of the moral arguments, and a large portion of the government's budget went to compensating former slaveowners for the financial value of their "property".
John Woolman may have been one of the greatest people who ever lived. https://thingofthings.substack.com/p/on-john-woolman
Thank you for the detailed response.
I understand that you're emphasizing structural solutions and functional outcomes, and I agree that real-world systems must support any meaningful change. But here's where we differ: not everything can or should be measured only by outcomes or systems logic, especially when the cost involves normalizing abuse, even artificially.
When it comes to something as deeply violating and morally corrosive as child sexual abuse, in any form, the role of morality is to draw a clear line in the sand. The line exists to say: this should not exist under any circumstances, regardless of theoretical benefits.
We don’t reduce fire hazards by allowing people to play with fire more safely, we create firewalls. We don’t tolerate synthetic versions of abuse in the hope it’ll reduce real abuse. That thinking leads us down a path of desensitization, where the line becomes harder to defend over time.
Condemnation in this case is a structural response. It protects the cultural and psychological boundary that keeps society from slipping into moral numbness. Once that line is blurred, the very foundation that allows us to claim something is “harmful” or “wrong” begins to erode.
So no, not everything should be optimized. Some things should be stopped cold. Not because we lack the tools to test alternatives, but because what’s at stake is too sacred to experiment with.
"Pornography, at its core, was never about freedom or expression, it was created to exploit and degrade. As the Meese Commission and other studies have shown, it distorts intimacy, fuels addiction, and erodes the very essence of human dignity."
What a complete and utter pack of lies, untruths and misconceptions.
Pornography at its core is about people enjoying something that is a natural bodily function which is quite pleasurable and enjoyable. Sex, at its most basic, is a fun and entertaining activity that can be experienced solo (masturbation) or (for more fun) between two or more consenting adults. Pornography is simply the recording of those activities and the sharing of them with other people. It's content that focuses on sex, really not all that much different from mindless TikToks except everyone is naked and rolling around. The fact that porn was the first media to ever go "viral" shows that it is something people really and truly want and enjoy. People were recording their own home videos and sharing them long before there was OnlyFans. The majority of people making porn are doing it because they want to be doing it - which is not exploitative.
Just about anything can be turned into exploitation or used for alternative purposes, not just porn. There are a whole lot of people who use religion for very nefarious, harmful and even violent purposes. Much more so than pornography in fact. Would you say that religion at its core is exploitative and harmful?
Sex can also of course be part of romantic relationships and used to build bonds and increase intimacy.
What you're describing may sound innocent, but it's part of a deeper moral confusion that's slowly poisoning society. Sex was never meant to be commodified or stripped of its sacredness. It's meant to be intimate, sacred, and a deeply meaningful part of a committed relationship between a man and a woman, ideally within the context of marriage. It's where two souls come together, sharing love, vulnerability, and trust, and it's in that space that the true magic happens - the creation of life. This is what makes it special, it's not just a bodily act, it’s a union that transcends the physical.
Pornography, however, corrupts this truth. It reduces intimacy to a performance, and the deep emotional connection between partners to a series of shallow, mechanical acts. It's a distortion of what sex is meant to be. And we don’t have sex in public for a reason, it’s meant to be cherished behind closed doors, in the privacy of a relationship, not displayed or exploited for anyone to consume.
What you're defending isn’t about "natural bodily functions." It's about an industry that thrives on the degradation of real human connection. It’s not about pleasure, it’s about exploitation. The Meese Commission, a government-backed study, outlined the very damage that pornography causes to relationships and individuals. It showed how it warps our expectations, fuels addiction, and leads to real harm in society. That’s the reality, whether or not it fits the narrative we want to sell.
To reduce sex to just "fun" is to miss the point entirely. It breaks marriages, shatters relationships, and introduces harmful fantasies that have no place in real intimacy. When you think about what’s happening to our society, it’s not just a matter of people consuming media, it’s about the disintegration of values that hold us together. We’ve normalized something that was never meant to be commodified. And that’s why we see the rise of broken marriages, deep loneliness, and the dehumanization of real connection.
This isn’t prudishness or a fear of pleasure. It’s about drawing a line. A line that says: sex is sacred. It’s about intimacy, not exploitation. And when we lose that, we lose ourselves. The "solution" isn't just turning a blind eye and pretending it's harmless entertainment. It's about reminding ourselves of what real love and connection look like, and what happens when we forget it.
This isn’t just a conversation about “pleasure.” It’s a conversation about our values, our integrity, and the kind of society we want to live in. And until we return to that foundational truth, that sex is not a product, it’s a powerful force to be nurtured, we’ll continue to spiral into moral decay.
Oh, take pity; often those who cling to moral arguments do so because reasoning and logical thinking are beyond their grasp. Someone once told them a story about something bad that they called a "moral," and now they repeat it even though they don't fully understand its implications. People have to work within their limitations.
But, of course, you need purely moral arguments to defend dysfunctionality.
You seem profoundly confused about the discourse norms here.
Do I sound confused, or are you just uncomfortable with someone drawing a clear moral line?
If refusing to intellectualize something as inhumane as AI-generated child abuse makes me “confused” in your view, then maybe it’s not my clarity that needs questioning, but the discourse norms you’re choosing to defend.
No, you just sound confused.
I'm sure you mean well, but I don't think this comment is good discourse either. Wouldn't it be better to engage with his comment more thoroughly?
Also, on an unrelated point, I'm surprised Sophic Sam is a subscriber of Aella, given this kind of response?
I see no evidence attempting to engage thoughtfully with such a thought-free commenter would do any good. As to the second point, Aella attracts a lot of religious weirdos who sublimate their erotic fascination with her into tirades against her general sluttiness.
I think it is a deeply... I'm not sure the word, scary? thing when someone turns their brain off to alternative thought.
How can intellectualizing and rationalizing be bad? What's wrong with the greater good?
He doesn't answer this.
Sophic doesn't engage much with Aella's argument, and condemns her pretty damn hard.
He also seems like a thoughtful and reasonable person. He articulated himself, he behaved more respectfully than 90% of the commenters here.
A norm I try to hold is respect, and I think your comment is kind of pejorative. It mainly is just telling Sophic he doesn't belong here, in a dismissive, looking-down-upon way.
I think generally, one sentence dismissals, nawt good. Even when deserved.
Luke, I appreciate your willingness to hold space for respectful discourse, and your instinct to look for good faith in others. That’s a rare and valuable norm, especially in conversations as difficult as this one.
I want to clarify that my strong moral stance doesn’t come from turning my brain off to alternative thought, it comes from turning my whole self toward the reality of what’s at stake: children. There are some lines in society that aren’t up for intellectual debate, because debating them already risks normalizing the unthinkable.
We can talk about structure, function, and harm reduction all day but at some point, the foundation we’re building on must be rooted in moral clarity. My response wasn’t dismissive of complexity, it was protective of innocence. That doesn’t mean I’m against thoughtfulness, it means I’m for drawing lines where they must be drawn.
As for Aella’s content, I unsubscribed precisely because I believe there’s a point where discourse becomes complicity. I won’t sit silently when children even in hypothetical or artificial form become subject to normalization through tech or detached reasoning.
Thanks for hearing me out. I'm here for thoughtful dialogue, but on matters this serious, there's no room for moral ambiguity. Protecting children isn't a stance I defend; it's a principle I refuse to compromise on.
Although I disagree, what you're saying makes sense! I appreciate your insight. It reminds me a bit of the AI box experiment. A superintelligent AI would always convince you to let it out of its box, so it's not even worth talking to it. https://www.yudkowsky.net/singularity/aibox
Maybe it's the same with morals: some moral questions are so bad and some people so pure, you can't even discuss it. Some things can't be tainted, and are just wrong no matter how you square it.
I see no evidence of thoughtfulness or reasonableness, and generally think it’s fine to let people have what they deserve. Open discourse norms are easily exploited by people who don’t share them. Rationalist flavored folks are always getting into interminable discussions with people who suck because of their norms of unilateral assumption of good will, to no advantage I can see. But of course you’re free to do that if you want to, just as I’m free to be rude to people who suck.
"no evidence of thoughtfulness or reasonableness"
Why do you think this? I mean, I compare his response to
"Welcome to the world of effective altruism and rationalism. These people are some of the most evil motherfuckers on earth"
"I think this author should climb the tallest building in their city and jump, in place."
and I think on the "thoughtfulness and reasonableness spectrum" he is far above these two.
You might say, "this is a baseline, there is no benefit to simply meeting it" to which I would say, baselines are meant to be easy to meet. And this one obviously isn't for this topic.
I don't think of things in terms of good will or not, I just go off general vibes, and try and articulate them.
People can be free to be rude to those who suck, that's valid, but if everyone here was like that, I wouldn't be here.
Aella could have written a blogpost "10'001 reasons why anti-pedophiles should suck on my fat hairy balls" and it would be really righteous and justified and totally useless.
There is something to turning the other cheek. I can't really explain it, but there is.
In regards to Sophical's post, we can imagine two posts trying to convey the same idea, coming from the same feeling: Sophical's post, and one from someone who is less emotionally honest.
Sophical writes
"This isn’t just a debate or an opinion, this is about protecting the innocence of children. What’s being discussed here is deeply disturbing, and I feel a moral obligation to speak up"
He isn't lying or pretending that his ideas are anything other than what they are. He isn't making up facts or allowing himself to become biased; he simply strongly believes these ideas are disturbing, and values the innocence of children above all else.
A less honest person would look at this, think "this is disturbing, Aella doesn't value the innocence of children," and write
"Lmao lol woodchipper go brrr"
or
"Aella is OBVIOUSLY WRONG, she clearly doesnt know ANYTHING about pedo psychology, I DO. The one thing we do, is THROW EM IN THE WOODCHIPPER then all innocent children will understand they are safe, and everyone will agree disturbing pedo's are disturbing. No talk other than LOCK EM UP is allowed, we must all make sure EVERYONE UNDERSTANDS this"
I prefer what Sophical wrote; it's honest, and I can engage with it.
> You keep trying to paint this as simple math, like defending the dignity of children is a cold equation
Yes because it is whether we like it or not.
> even fabricate horror to possibly prevent it
Yes, fabricated horror is clearly better than real horror.
> You don’t protect children by normalizing the abuse of their image
But what if you can? If it doesn't actually lead to more harm? Then you can.
> Because once you let those lines blur, the culture blurs with them. Conscience erodes. And suddenly, the unthinkable becomes thinkable. That’s how real harm begins.
Yeah, if it causes more harm in the long run because of effects like this, I agree! I completely agree. But there's a very real chance it doesn't! And then the argument flips around, and *you* become the monster for causing unnecessary harm.
> Not even if it “works.” Because the cost is too high.
The cost is only too high if it doesn't work. If it actually works, there is no cost. If it actually, really saves children in the long run, then surely we have to do that right?
Your entire argument is that doing this will corrupt us and lead to worse outcomes in the long run, but I'm saying if that doesn't happen, then we have to do it.
You keep saying “but what if it works?” as if that magically erases the cost. It doesn’t. Some things are wrong even if they produce favorable outcomes. That’s the whole point of having a moral backbone.
The moment you accept simulated child abuse as a “strategy,” you’ve already lost the plot. You're not preventing harm, you're legitimizing a tool built on the aesthetic of harm. You're not protecting children, you're just shifting the abuse from flesh to fantasy and pretending that's noble.
If your solution requires us to simulate the worst crimes against innocence to maybe reduce them, then your solution is part of the sickness. You're not solving anything, you're surrendering.
The world doesn’t need more people who can justify evil with a flowchart. It needs people who still remember that some lines exist to be held, not debated.
> Some things are wrong even if they produce favorable outcomes.
I guess this is where we disagree; I think this is just plain wrong. If fewer real people suffer less harm, it's not wrong. (Assuming you count all people and all the ways they are harmed!)
> You're not preventing harm, you're legitimizing a tool built on the aesthetic of harm. You're not protecting children
No we *are* preventing harm and protecting children by the premise of the argument. I agree that if it doesn't do this, we shouldn't do it, but if it does - then we should.
> you're just shifting the abuse from flesh to fantasy and pretending that's noble.
Yes this is what I think we should do. If by some magic miracle every drawing of a child saved a real child, you bet I would be drawing all day. Drawings aren't harmful in and of themselves. I get your argument that this might have second order effects, like normalizing it, which would lead to real harm of real people down the line, but if that doesn't happen (per stipulation of the argument), then there is no harm in the drawing itself.
> The world doesn’t need more people who can justify evil with a flowchart.
I guess I understand the sentiment, but this is how the world works every day in almost all situations. Everything is a tradeoff all the time, and every action and inaction causes some amount of harm. It's our responsibility to make these decisions as best we can in everything we do. There is no inherent evil in an action except for the consequences it has on real people (or sentient animals).
If you genuinely believe that creating or promoting child abuse in any form, even in fantasy, is acceptable if it leads to “less harm,” then you've already crossed a line no decent human being should ever approach. That’s not pragmatism, that’s moral rot dressed up in utilitarian language.
This is not how a healthy world works. It’s how a broken one does, one that’s forgotten what it means to protect the innocent, to draw hard lines, to say some things are wrong, period. The willingness to even entertain the idea that simulating the abuse of children might be “worth it” if it produces better numbers is a symptom of how disconnected we've become from our own humanity.
No, not everything is a tradeoff. Not every evil can be weighed and justified on a scale. Some things are wrong because they desecrate the very fabric of what makes us human. And when you start making exceptions for the unthinkable even hypothetically, you’re not reducing harm, you’re training people to see children as variables in an equation.
That’s not protecting them. That’s predatory logic. And there is no place for that in a sane, moral society.
Some lines exist not to be debated, but to anchor us when everything else is up for grabs. And if that sounds foreign to you, maybe it’s not the world that needs to change, maybe it's your conscience that needs to wake up.
But the reason creating fictional depictions of harm is harmful (even in your ontology) is that it leads to moral collapse or whatever, which then leads to actual harm happening. If it doesn't lead to actual harm, then the fictional depictions aren't inherently harmful!
But even if you won't look at the numbers, in your world more children are harmed than in mine, and that's fucked up (not to mention your fault).
If you think the only reason depictions of harm are wrong is because they might lead to real harm, then you’ve already conceded too much. You’re treating morality like a spreadsheet as if the only thing that makes something wrong is the outcome it produces. But some things are wrong in and of themselves because they degrade our sense of what is sacred, of what must never be treated as permissible under any circumstance.
You say in “your world” fewer children are harmed. But your world is one where abuse is rebranded as simulation, and innocence becomes an input in a moral experiment. That’s not a safer world, that’s a desensitized one. A world where we’re trained to view atrocity through the lens of utility until we forget how to recognize evil when we see it.
You’re not reducing harm. You’re normalizing detachment. You’re laying down moral explosives under the assumption no one will ever step on them.
This isn’t about refusing to look at numbers, it’s about refusing to let numbers erase what matters. Protecting children isn’t a function of efficiency. It’s a line we draw to remind ourselves who we are. And once you cross that line, even in theory, you’ve already begun to lose what you claim to protect.
Don’t try to blame people like me for holding that line. The harm doesn’t come from those who refuse to budge. It comes from those who think everything is negotiable.
You refuse to negotiate, and you pay a hefty price in the form of innocent children being abused unnecessarily (which is why I called you a monster). The price you pay not to negotiate is too high.
If degrading our sense of what is sacred leads to more harm, then yeah, that's bad. But if it doesn't, I'm willing to sacrifice the sense of the sacred to reduce harm.
A world that doesn't recognize evil when it sees it leads to more harm, which I oppose. But if it doesn't lead to more harm, how is it evil? Evil is defined by the harm! If there's no harm, I don't see how it's evil.
Whatever your moral intuition says, it has to cash out in real harm at some point. Reducing harm as much as we can is the point, and that's what I will always support. It's sad that you would oppose that.
You keep framing this as if refusing to negotiate with evil is the problem, as if the moral boundary itself were to blame for the suffering rather than the systems or mindsets that allow it to exist in the first place. That’s not just wrong; it’s a complete reversal of cause and effect.
You call me a monster for not compromising, but here’s the truth: I’m not the one introducing simulations of child abuse into the moral ledger and asking whether they might be “worth it.” I’m the one saying that some boundaries are sacred, that children are not variables in your thought experiment, and that our humanity begins where our willingness to compromise ends.
You reduce evil to harm and harm to numbers but morality isn’t just a consequence calculator. It’s also a compass. If you throw out the compass in favor of results alone, you may walk efficiently, but you'll never realize you've been heading into darkness the whole time.
Your world may look optimized on paper, but it’s spiritually bankrupt. It demands the sacrifice of our moral intuitions, the very instincts that tell us to protect the vulnerable, to reject what is perverse, to recoil at the grotesque. That isn’t progress. That’s decay.
You say you'd sacrifice the sacred if it "doesn’t lead to more harm." But once you’ve sacrificed the sacred, your definition of harm becomes malleable. And that’s how societies forget what needs protecting in the first place.
You can call that sadness. I call it clarity. I’m not sorry for refusing to negotiate with ideas that dehumanize innocence. That’s not stubbornness, it’s the one thing keeping us from becoming what we fear most.
"This kind of conversation should never exist in the first place"
blah blah blah
Let's not use our brains. Let's certainly not think. Oh my, the "violation of conscience, and of basic human decency"
I have news for you - humans really aren't all that decent. People who hide behind these types of moral outrage often have the most to hide, and often the louder they are, the more likely they're projecting.
Sorry, no. Problems don't magically go away or get solved by sweeping them under the rug and refusing to talk about them because they are uncomfortable or icky. Society and the real world are full of messy, nasty things that need good, EFFECTIVE solutions.
If you can't handle that by all means bury your head in the sand and don't participate.
But for this:
"I believe this kind of conversation warrants serious reflection, not just from readers, but from the platform itself. Silence and neutrality are not options when it comes to the protection of children."
This is a veiled threat that anyone daring to discuss how to deal with this problem - which does cause real harm to children - should be silenced. This is NOT how you actually protect children; this is how you protect predators. You should be ashamed. You are empowering pedophiles and basically offering up children to them with an attitude like this. You are part of the problem.
Let me be very clear: abusing a child is moral murder. You don't try to “optimize” murder. You don't debate how to make it "less harmful." You crush it with every legal and moral force a society has.
Suggesting that people like me, who refuse to entertain grotesque hypotheticals about synthetic child abuse, are “part of the problem” is one of the most revolting accusations I’ve seen here. It’s cowardly projection. It says far more about you than it ever could about me.
You think moral outrage is a façade? You think people like me are “hiding something” because we draw a hard line against predators? That’s not reason, that’s rot. You’ve chosen to mock moral clarity in the name of intellectual “courage,” but there’s nothing courageous about moral collapse dressed up as logic.
We've lost something vital in society, the instinct to protect, to fight, to draw unbreakable lines. Too many men today are taught to suppress their warrior nature, to be passive in the face of evil, to avoid discomfort rather than confront depravity. And now? Our children aren't safe. Even animals aren't safe. Because the protectors are silent or worse, confused.
We don't need more “tolerance” for moral decay. We need courage. We need spine. We need people willing to make evil afraid again.
Humans made laws. Laws did not make us. And without morality at our foundation, any law becomes a tool of rationalized evil.
The solution is to draw an unbreakable line, enforce it with the full weight of justice, and ensure the consequences are so severe that even the thought of such acts brings fear.
Our conscience, our morality, and our instinct to shield the innocent must come first or we become no better than the evil we claim to fight.
Let’s be honest: you didn’t misread my comments, you read them, and still chose to twist them. Why? Because you’re not looking for solutions. You’re looking for justifications. You’re not disturbed by what we’re discussing, you’re disturbed that someone dared to say no to that discussion.
If that enrages you, ask yourself why. I will never let a conversation like this pretend to be “rational” when it's actually morally bankrupt.
If lives are too valuable to play calculus with, then that means they are also too cheap to be worth optimizing. Perhaps you believe lives are cheap in this way, and are fine with that, but I, for one, am not.
good point
Yeah, let’s keep protecting the ship by piercing its hull.
Absolutely. When so-called "rationality" starts justifying the unthinkable, like normalizing simulated child abuse in the name of data, it's no longer reason, it's moral rot.
There’s nothing enlightened about stripping away conscience to optimize cruelty. Some values must remain sacred, or we lose the very humanity we claim to be protecting. I'm glad others see how dark this “logic” can really get.
While I'm not particularly offended by this general line of questioning, I think this post follows a general pathology in public discourse of conflating pedophilia as a clinical condition with abuse of children as a social problem. It may seem counterintuitive, but they're not particularly tied. This is maybe best discussed in terms of this claim in your post:
"Probably most of the pedophiles who risk jail and extreme social ostracization to actually molest a child, are the ones who are really into it."
As it turns out, if we look at who tends to assault children (as opposed to e.g. compulsively collect CSAM), most offenders are not hardcore obligate pedophiles. This is because most crimes (of all types, but especially sex crimes) are not coldly planned days in advance, but rather happen as a spur of the moment response involving cognitive impairment (from drugs, especially alcohol, disability, stress, etc.) and opportunity (access to a vulnerable subject). That is, sex crimes are explained by poor impulse control, not by super-strong core desires.
Of course, serial sexual offenders who target children exist. There are people out there making child porn. But if we're to treat CSA as a social problem, pedophiles shouldn't be our main concern. Based on the numbers, we should be thinking about what affects the behavior of alcoholic stepdads. I'm not certain that AI-generated porn matters to alcoholic stepdads, so maybe we should consider the weak effect it might have on hardcore pedophiles as the dominant concern.
But there's another possibility that has to be considered. If most assault events are explained by poor impulse control, then we have an interest in optimizing for the disgust that people feel towards sex acts involving younger children. Reducing people's ability/likelihood of viewing sexual materials involving children might help avoid reinforcement patterns that go against that disgust response. I think this model is a lot more plausible than a "CSAM will make people obligate pedophiles" model.
While I'm minded to be laissez-faire towards any media that doesn't require harming a child to produce, we (as a society) should probably think hard about this when considering how/when to distribute AI-generated images that sexualize children.
This is an interesting line of thought. I'm unaware of the statistics, but based on demographics it does seem very likely that most people who actually commit such offenses are not pedophiles (who are statistically rare) but rather people with poor impulse control who find themselves in an opportunistic situation to take advantage of someone they should not (a more statistically likely scenario).
"I'm not certain that AI-generated porn matters to alcoholic stepdads"
Actually... let's consider your scenario a bit further. We have an alcoholic stepdad. Let's say this stepdad at least occasionally consumes porn. Let's further suppose that Aella's proposal of flooding the internet with CP to suppress the real-world urges of actual pedophiles gained traction and had been enacted years earlier.
So whenever this alcoholic stepdad visited porn sites he would be likely to come across some of this content. Do you think an alcoholic stepdad with poor impulse control might just click on some of that and watch some?
If this stuff literally "floods" the Internet odds are he's going to eventually trip over something that reminds him of his stepdaughter... Might this alcoholic step dad with poor impulse control click on that?
Then what happens one day when he and the girl are home alone, he's drunk, and a compromising situation occurs? Do you think the exposure to CP might compound his poor impulse control and make a bad outcome more likely?
I don't like the odds of that.
This made me think. AI CP would almost certainly destroy the market for CP. But obviously there is a difference between viewing CP and being a person who actively assaults a minor. IMO the best work would be done by disentangling the viewing of CP from the behavior of being an assaulter. They are in reality quite different actions but are often treated as the same, both in common discourse and in a legal context.
Sexual abuse is a crime of power, not sex. That's just a way to hurt people.
Then why isn’t it called just _abuse_? And why is it so rare for a tall, fit woman to do it to a small, dilapidated, out-of-shape man even if she could easily overpower him?
You probably don’t need me to say this, and I’m not sure I should say it, having read about your murdery-looking visitor (<https://aella.substack.com/p/the-attempted-kidnapping-at-my-house>), but I admire your bravery, once more. You’re definitely much braver than anyone making fun of you about the needle thing.
> People somehow have interpreted this as me supporting child sexual assault, which is confusing to me but makes more sense if I model people as being LLMs that get triggered if you say too many negatively-flavored keywords too close together.
Nice model. Before LLMs became widespread, we had to make do with terms like _Copenhagen interpretation of ethics_.
Seconded. Aella, your bravery is super inspirational, and a big part of why I subscribe.
I've never in my life seen a person defend being able to watch CP so hard. Whether it's real or fake is irrelevant; even if it's AI, if you get caught distributing fake CP you're going to jail. We are talking about the same government that is using every soft target in the country to completely eliminate it from society, and now is the time you decide to drop this? Yeah, up until this post I was cool with the edgy outtakes, but no one's going to look past a video of a child in a sex act and be like "no wait, it's cool, this kid's got 2 left hands and the right one has 7 fingers." Yeah, I'm unsubscribing from this.
Police, FBI and CIA are already allowed to have CP libraries for purposes of bait, and do own and use those libraries. There is no "difference" introduced via AI CP
Yepp, exactly. It doesn't matter whether it is AI or not. Child pornography is not okay, and I cannot comprehend why anyone would defend AI CP by making it look better compared to actual CP, when both things are equally problematic and can cause worse consequences.
Equally problematic? The production of genuine CP already is the consequence you're afraid of. CSA has happened in that case.
Yeah I think this is the core of the debate here. Aella is coming from the assumption that fake CP is less bad than actually abusing children, others are operating from the assumption that all CP is equally and infinitely bad.
Can't justify that on utilitarian grounds, but it's sacredness or something
This paper goes over the data from Denmark as well as the other countries mentioned and the claims contained therein. It found that violent sexual assault and rape actually increased when porn proliferated; many other sex crimes, such as voyeurism, "peeping," and incest, were simultaneously decriminalized, which created the false appearance of an overall drop in sex crimes over that period.
It also studies the effects of Sexually Oriented Businesses (SOBs) in a couple of major cities and found:
♦ Austin, TX -- 1986 - In four study areas with SOBs, sexually related crimes were 177% to 482% higher than the city's average.

♦ Indianapolis, IN -- 1984-1986 - Between 1978-1982, crime in study areas was 46% higher than for the city as a whole. Sex-related crimes were four times greater when SOBs were located near residential areas vs. commercial areas.

♦ Garden Grove, CA -- 1981-1990 - On Garden Grove Blvd., seven adult businesses accounted for 36% of all crime in the area. In one case, a bar opened within 500 feet of an SOB and serious crime within 1000 feet of that business rose 300% during the next year.

♦ Phoenix, AZ -- 1978 - Sex offenses, including indecent exposure, were 506% greater in neighborhoods with SOBs. Even excluding indecent exposure, the sex offenses were still 132% greater in those neighborhoods.

♦ Whittier, CA -- In comparison studies of two residential areas conducted between 1970-1973 (before SOBs) and 1974-1977 (after SOBs), malicious mischief increased 700%, assault increased 387%, prostitution increased 300%, and all theft increased 120%.

Virtually all SOBs, regardless of the city in which they are located, have similar negative effects upon their surrounding neighborhoods. The Indianapolis study concluded that "even a relatively passive use such as an adult book store [has] a serious negative effect on [its] immediate environs." It is difficult to miss the implication that these harmful secondary effects simply reflect something harmful in the nature of the material.

https://www.protectkids.com/effects/justharmlessfun.pdf
Further, "high-frequency pornography consumers who were exposed to the nonviolent, dehumanizing pornography (relative to those in the no-exposure condition) were particularly likely to report that they might rape, were more sexually callous, and reported engaging in more acts of sexual aggression. These effects were not apparent for men who reported a very low frequency of habitual pornography consumption. The authors noted that the effects of exposure were strongest and most pervasive in the case of exposure to nonviolent dehumanizing pornography, the type of material that may in fact be most prevalent in mainstream commercial entertainment videos.

The study found that more than twice as many men indicated at least some likelihood of raping after exposure to this material (20.4 percent versus 9.6 percent). Detailed analysis revealed that these effects occurred primarily for high-P (psychoticism) subjects, those who are inclined to be rather solitary and hostile, lack empathy, disregard danger, and prefer impersonal, non-caring sex (although not meeting clinical criteria as psychotics)."
A lot of data has been parsed more broadly on this subject and it doesn't bode well for the argument in favor of the proliferation of AI child pornography to combat real child exploitation.
The devil is often in the details when it comes to such things. One thing to consider regarding sexually oriented businesses in the United States, particularly for the date ranges mentioned here, is that the US is rather sexually repressed as a society. It was far more repressed back then, leaving very few places to engage in anything sexual outside the home, which I would imagine led to more incidents at the few places that did exist.
In one of the cited examples, 7 sex shops were tracked from 1981-1990. By this time sexual mores had loosened significantly: the free love movement began in the 1960s, and the massive boomer generation had already reached sexual maturity. They grew up in an era where sexuality was far less suppressed and where significant cultural influences condoned the liberalization of sexuality in society.
Young people are more impulsive and commit most crimes, sexual or otherwise, and by this time they weren't nearly as repressed as the Silent/Greatest generations or earlier were.
Further, 7 locations in Garden Grove (https://en.wikipedia.org/wiki/Garden_Grove,_California), a city of 123-143k at the time, is not exactly an urban metropolis where every sex pest can gather in one spot - it is seven spots dispersed through a relatively small population (~1/17,500 - ~1/20,400 inhabitants).
This was a time period when sex shops were perhaps not as overlooked as they are today, given the greater Christian influence, but they were still certainly quite common.
When it comes to sex, it is something primal and gatekept aggressively by men, and as cited in the prior comments, porn exposure led to more dehumanizing values expressed toward women and a greater positive inclination toward rape, including stated intent of committing rape.
Showing intent towards committing crimes is positively correlated with committing crimes: "...attitudes toward crime, subjective norms, and perceived behavior control effectively predict the intention to injure, steal, or use drugs. Criminal intention effectively predicts the occurrence and the frequency of injury to others, theft, and drug use."
It also found that for child sexual abuse, "...An anonymous self-report questionnaire was adopted to collect data from 915 incarcerated males at Taipei, Taichung, and Kaohsiung prisons in Taiwan and 559 male college students from 6 public and private universities in Taipei and Hsinchu. The results show that the intention and practice of child sexual aggression are related. Child sexual aggression intention accounts for 13.7~28.0%, 2.2~8.4% variances of practice separately in male inmates and college students."
Source: https://www.researchgate.net/publication/323721776_Relationship_between_child_sexual_aggression_intention_and_practice_and_its_impact_factors
The devil is in the details and the details tell us that pornography results in dehumanization of the victims, a ~2 fold increase in the stated intent of committing rape against those portrayed in porn and that this is significant in predicting the incidence of committing aggravated sex crimes.
Further, in the previous document I cited (https://www.protectkids.com/effects/justharmlessfun.pdf) it was found that after liberalization of pornography, total rapes and sexual assaults increased on a national level, even if the overall incidence on paper declined, due to lesser sex crimes, such as peeping, voyeurism and incest being simultaneously decriminalized.
Very disturbing post, but I don't agree with the commenter who finds that a reason not to have it. In my opinion, if there's a way to reduce the behavior, or the harm of the behavior that occurs, there may be a moral obligation to do it. At the very least to consider it.
I wonder if "nearly legal" porn contributes to the development of soft fetishes.
And I am extremely cautious about any restrictions of first amendment rights (in the US). Every regulation involves a regulator, and in all too many cases these are purchased, controlled, and, if I may, perverted.
Excellent article.
This is crazy, with many counter arguments as to why, but let me offer one.
Pretend we live in a world where this law was passed: it was okay to make and distribute fake CP.
AI fakes got so good that it was near impossible to distinguish them from the real thing.
How then would you police actual CSA?
Make it only legal to use authorized generators. Make those generators leave an invisible identifier in their images. If the identifier isn't there, the image is considered real and punished accordingly
The problem with this is that the identifier could just be cloned onto a "real" CP image. The best solution I've heard suggested was to prohibit sharing the actual images but to have some method for easily sharing the generation parameters, from which the image could then be generated by the viewer, perhaps even with on-the-fly generation performed seamlessly within the viewer's browser. It would require those people to have better-than-average hardware, but it would eliminate the LE enforcement problem.
This would be solved by providing the seed, model, and +/- keywords used to produce the image. You would be able to produce an exact replica if it's generated.
If you are just meaning that the image should include the creation metadata (or be required to do so), that wouldn't solve the problem. In order to verify that the metadata actually matches the image, the LE agency would actually need to download every model and generation application anyone uses, generate each image based on that metadata, and then see if the images are identical. Assuming a rather modest estimate of 500k AI-generated CP images produced per day (based on the number of AI porn images created on one website 2 years ago) - that's over 5 per second - it seems that LE could never practically sort through those images for "real" CP and thus could never prevent proliferation of CP involving real children.
If you meant sharing the image creation data, then, yes, those are the "generation parameters" I was talking about. But, unless there is a method (think an AI html tag like "img" and an associated standardized generation model and engine within the browser), it would not be any more functional than people just posting generation parameters on a text-only message board - which technically-minded people can do already, likely without running afoul of any law in many countries.
Edit: Also, a big problem with both the "embedded metadata" and "invisible identifier" approaches is that they would require a complete reworking of how the internet functions. Almost every internet service, web browser, email service, messaging service, image board, message board, and chat room resizes and reformats images constantly to save data transfer and storage and to improve user experience. That can - and usually does - destroy any digital watermarking and metadata.
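The verification scheme this sub-thread describes (share the seed, model, and prompt; regenerate; compare) can be sketched in a few lines. Everything below is a toy stand-in: `generate_image` is a hypothetical deterministic generator, not a real diffusion model, and `verify_synthetic` is an illustrative name, not an existing API.

```python
import hashlib
import random


def generate_image(model: str, seed: int, prompt: str) -> bytes:
    """Toy stand-in for a deterministic image generator: identical
    parameters always yield identical bytes, as a real model with a
    fixed seed and sampler would (ideally) behave."""
    rng = random.Random(f"{model}|{seed}|{prompt}")
    return bytes(rng.randrange(256) for _ in range(1024))


def verify_synthetic(meta: dict, claimed_sha256: str) -> bool:
    """Re-generate from the shared parameters and compare hashes.
    A match shows the image is pure model output; a mismatch means
    the stated parameters do not account for the image."""
    img = generate_image(meta["model"], meta["seed"], meta["prompt"])
    return hashlib.sha256(img).hexdigest() == claimed_sha256


meta = {"model": "toy-v1", "seed": 42, "prompt": "example scene"}
digest = hashlib.sha256(generate_image(**meta)).hexdigest()

print(verify_synthetic(meta, digest))                  # True
print(verify_synthetic({**meta, "seed": 43}, digest))  # False
```

Note that the thread's own caveats still apply: the checker must run every model anyone uses, the per-image cost scales with volume, and any resize or re-encode changes the bytes and breaks the hash comparison.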
So no free-software generators? We’ll have to police all programmers to make sure they don’t secretly develop one. And we’ll have to police everyone else to make sure they don’t secretly learn to program and do the same. And who gets to squeeze the juicy oligopoly of authorized, non-free generators, which, because they are non-free, can do whatever the authorized developers feel like, in addition to generating imagery, and noöne else will know? Of course, those who already have enough political power to get their way.
Unfreedom only breeds more unfreedom.
I am attracted to children, but have never and will never have sexual contact with a child. So let me offer an insider's view on the questions you raise, because... I appreciate your points, but I don't think you're right in all respects, and even if it's against my "interests" I think it's worth clearing some things up. Please pardon my very frank talk in this comment about attraction to kids and child molestation.
First of all, some terminology: I am a pedophile but not a child molester. I have the sexual attraction but would never touch a child inappropriately. I also don't view child pornography. However, I understand the urge to view it very well. I feel jealous of the people who do, although I'm glad that I stay away. I was also clearly born this way; this is very obviously a "deep" fetish for me, and in fact I think of it as a sexual orientation. It became obvious at age 13 and since then, dealing with it responsibly is just something I've always had to do in my life.
The core question you're asking is: What if more sexual outlets were available to people like me? I agree that having sexual outlets is likely to, on balance, reduce offending. I think anyone who thinks clearly about this will see the argument: imagine going through life if all you're allowed to do is masturbate without any visual stimulation. We see, for example, what happens with priests or others who try to blanket deny their sexuality. It goes badly.
Nonetheless, the impact of porn varies by the person. Some people will see AI porn and it will be a useful way to vent their need to get off and then they will go about their lives. Others will get obsessed or spend too much time with it, sinking further into the abyss. It's ultimately up to each person to manage their desires and sexual needs responsibly. The question is if the positive or negative effects of porn access are on average larger, and I think we have some decent data (which you pointed out) to suggest that porn access reduces sexual offenses, but... it's not the strongest data, you know? It's suggestive but far from proof.
OK, that's my overall take. Now for some places where I want to correct some of the things you said since I have that "inside view."
Right now, there is plenty of dubiously-legal stuff out there. There are drawn images and 3D renders of fake kids. Most "responsible" sites (grading on a curve: they are full of drawings of kids getting sexually molested) make sure that anything posted doesn't look too realistic, even if it's made with AI. My personal experience is that having access to such images is a pure good for me: it allows me to have more satisfying orgasms so that I don't become a pent up mess of a person, and I have not seen any rise in my desire to have sex with kids. (I suspect my desires actually go down because it gives a release, but it's awfully hard to tell for sure.) My experience also doesn't generalize to everyone: I'm sure that for some people, sexual imagery involving children "gives them ideas." Again, it comes down to the averages: do you get more positive or negative for most people?
I should also note that this stuff may still be illegal because the first amendment has an exception for obscenity. So drawings and computer renders that don't look realistic may still be illegal. Obviously, people will have different opinions about that; I think it's a mistake to make this stuff illegal, because I think more people are like me (helped by access to it) than those that will be more likely to offend, but I have no way to get objective data. Obviously, my friends online are those who don't offend.
However, my point is, it's not "realistic AI or nothing." We really have a number of policy choices: we can keep the status quo, legalize stuff that doesn't look too real, or legalize anything that provably does not involve real children even if it looks realistic. Which is best? I don't know, but a baby step might be legalizing stuff that looks fake (i.e. hentai involving kids). (I also wonder about the following wild situation: I have a couple of friends whom I met online who were molested as kids, but where it's become a fetish for them and they wish they could show me those images. Should they be allowed to show them to me if it's between two consenting adults, and thus there is no "victim"?)
Another thing that didn't feel right in your write-up is the accessibility of real child pornography. I've talked to people who look at real CP, and the truth is that it's definitely accessible if you do not yourself produce it. My sense is that there's a bunch of (mostly older) stuff floating around that anyone can access, and then there is some stuff (probably the newer, better(?????) stuff) which you can only get in trade, as you indicate. So the real stuff is unfortunately widely accessible if you know where to look, but I think this is still better than the world you summarize: at least it's not encouraging people to abuse kids just to get access!
If I could shape the world myself, I would create a world where sexual contact between adults and children was still looked upon as horrendous and punished harshly. Similarly, production of child pornography. Viewing of child pornography I would have milder punishment with a focus on treatment, because I've seen tons of teenagers who are just realizing their attractions fall into the wrong parts of the internet, get drawn in, and then get addicted. I would legalize stuff that does not involve real children. This is a guess, but I believe it would reduce child victimization.
Anyway, apologies for writing so much but this is obviously very important to me. If anyone is still reading, I wrote on my own post about similar questions, although it was before the generative AI boom so it doesn't address that as directly.
https://livingwithpedophilia.wordpress.com/2020/11/08/the-data-what-impact-do-sexual-images-of-children-have-on-offending/
And if anyone has additional questions for me, I am happy to answer them as honestly as I can. I try hard to approach these questions from a perspective of evidence and genuine inquiry, and I am as open as I can be given the need to protect my identity.
> I should also note that this stuff may still be illegal because the first amendment has an exception for obscenity
It should be noted that "obscenity" is defined rather strictly for this purpose: under the https://en.m.wikipedia.org/wiki/Miller_test precedent something is obscene only if "contemporary community standards" would find it offensively sexual (an unclear & variable standard) & if it has no "serious literary, artistic, political, or scientific value"; this is strict enough that even what is conventionally considered pornography often wouldn't qualify (eg. the Utah case cited on that Wikipedia page). If https://en.m.wikipedia.org/wiki/Legal_status_of_fictional_pornography_depicting_minors is correct, even unrealistic simulated child porn that is legally obscene is banned by federal law, but this is enforced rarely & usually against people who have committed other sex offences.
Yes, agreed. Nonetheless, the law still feels like there's a bit of ambiguity, but perhaps this is me trying to avoid feeling like I do something "illegal" because I think it is beneficial and harms no one.
Haha, I love how Aella stirs up a hornet's nest, no one could do it better than her. :)
To me, the whole idea of limiting an AI image generator is as absurd as limiting MS Paint. Both tools can be used to turn imagination into creation, but imagination always comes first. Can we criminalize thoughts? I don't believe in crime without a victim. Someone desperate enough could make CSAM pixel by pixel for personal use and no one would know, but is it a life-changer? I doubt that. I think the impact of porn on people is greatly overestimated, to the point some could forget fap is possible without porn too! So my guess is that AI-generated CSAM wouldn't really change anything in a positive or negative way, at least if it's just for personal use. Even now nothing stops pedos from jerking off to their imagination and non-sexual stuff; it's not just a porn-or-real-children dilemma. If they cross the line of risking jail time right now, it's probably a deeper problem than just lack of fap material.
My only concern is that AI-generated CSAM in public spaces could waste lots of time and resources on investigating fake material instead of helping real victims. But I guess this problem goes far beyond the topic at hand and may be part of the whole AI-generated fake media threat.
You see, people like limiting things. How far could we fly if we weren’t too busy clipping one another’s wings?
> some could forget fap is possible without porn too!
Well, I’ve read women claiming to be offended when a man in their vicinity has a spontaneous erection, because of what it says, according to them, about his thoughts. Eschewing porn won’t save you.
This is all obviously correct, and will obviously never happen. We could have perfect AI sex bots to fill this role and crater demand for real victims, and the same people with visceral reactions would be crying about normalization.
This is the most well thought out take on this matter I've ever seen. With arguments based on actual data. Great work!
Pedophiles are sick people. Their brains mistakenly perceive a prepubescent individual as fertile, and this causes harm to the individual. It is a real mental illness, but like murderers and thieves, they are human beings as well who need rehabilitation.
I myself work with sexual assault victims. It's very sad to see how our current society accidentally enables such things to happen. The lack of sex education for the youth leaves them unaware of all the dangers out there. Our society rightfully punishes these people, but rehabilitation needs to be normalized. We should offer these people rehabilitation alongside punishment instead of keeping the subject as taboo as possible. Rehabilitate the offenders, heal the victims, and educate the youth on the potential dangers.
We also need to differentiate between true pedophilia and emotional outrage. Many people easily criticize others for finding 16- or 17-year-olds attractive, even though that has been normal for the entirety of human civilization. Criticizing a 20-year-old for finding a 16- or 17-year-old attractive just pushes the issue past its tipping point and downplays the real victims. It is akin to the outrage and homophobia of the late 1900s. There is a difference between natural relationships taking place and predators actively targeting children who don't know any better. The age of consent is 14-18 in many countries for this reason.
The real issue is real pedophiles who damage the lives of victims too young to even know what puberty will bring them.
Obviously evolution favors attraction toward pubescent, fertile individuals. Fertility means that procreation is able to take place, and thus evolution treats it as normal. What is not normal, though, is pedophilia. Evolution does NOT favor procreation in prepubescent individuals. Even if childbirth were to occur, it could easily kill both the child and the mother. This is why pedophilia is not a natural thing. It is a mental illness akin to schizophrenia.
I'm attracted to prepubescent children, but I don't perceive them as being fertile mates. I'm not even interested in having intercourse with them. I just find their physical and personality traits attractive. Of course I don't speak for all pedophiles, but I've never met one who thinks children are fertile.
Agreed. I'm romantically and sexually attracted to children. They're just indescribably beautiful and charming. It's nothing to do with fertility. That just seems ridiculous. And it's absolutely nothing to do with power or control or anything like that. On the contrary, I think it's terrible that children are kept so powerless and dependent. It's a major factor in child abuse, both sexual and otherwise, and it seems downright cruel to spend 18 years of a person's life teaching them that they're inferior and unworthy of respect.
I know of two studies that have looked at why people are pedophiles.
https://pmc.ncbi.nlm.nih.gov/articles/PMC5778266/
"Taken together, the pedophilic subjects of the present study showed an over-responding to infant animal stimuli in a network of brain regions that contribute to motivating behaviors. This is in accordance with our hypothesis that nurturing stimuli receive additional processing resources in pedophiles. It is of interest that some of the areas of increased response to infant animals are related to the mating domain. The left anterior insula, being a crucial area of nurturing processing, was also frequently found to be activated in sexual brain studies (Stoleru et al., 2012). Furthermore, the left anterior insula (as well as the SMA) is a constituent of the human attachment system, thereby enabling both nurturing and pair-bonding (Feldman, 2016). Based on both observations, (i) the over-responding to nurturing stimuli in various motivational areas and (ii) the functional overlap of nurturing and sexual processing of the involved left anterior insula a tentative and simple model of pedophilia could be as follows: Nurturing stimuli receive additional processing resources by mating-circuits. In case of human infant stimuli this leads to a sexual connotation of infant stimuli."
https://pmc.ncbi.nlm.nih.gov/articles/PMC11252362/
"Our results support the idea of an overactive nurturing system in pedophilia, which may be influenced by the endogenous testosterone level."
I don't mean to cherry-pick, but every other study I've seen on the matter was based on a sample of convicted child molesters — hardly a representative sample.
I could be mistaken; I'm certainly not an expert on the subject. But I've never heard of it being about fertility. It seems to be much more about naivety and control: children are generally uneducated (particularly about sex), much less sure of themselves, and much easier to manipulate and control.
I get the impression it's similar to the reason a whole lot of guys like the idea of virgins. That is a VERY common interest, though, and (assuming legal age) not illegal. Personally it also gives me the ick, though.
Predators exist in all shapes and sizes. Child predators are simply a subset of the overall predatory nature of human beings. All predators seek weaker individuals. That doesn't mean just younger; it also means poorer, smaller, weaker, shorter, dumber, more naive, etc., and predators also seek out individuals who were already victims of previous abuse. And fun fact: we are all predated upon in everyday society, by our government. Your government is the biggest predator to exist, much worse than the 30-year-old sex offender down the block.
I have dealt with all types of cases involving these predators. These people are mostly individuals who are mentally ill, severely abused themselves, or simply already dangerous criminals. However, these people can easily be taken off the street and either punished or, hopefully, rehabilitated. Your government cannot. They will use any means necessary, including the promise of protecting your children / citizens (remember the Patriot Act?), just to gain control of you. And remember when Apple tried to implement CSAM-scanning technology? Thankfully we stood up to that. It's all about control. Our government literally does not care about sexual assault victims. It's sad and frightening.
On the topic of virgins, it is mostly a purity thing that revolves around marriage and long-term relationships. It is less a sexual fetish and more a matter of romance overall; sexual fetishes around virginity are popular, but they stem directly from purity. A man feels he can be more emotionally connected to a girl who has never had sex than to one who does sex work and has met dozens of men.
It does seem like nature and evolution tend toward neoteny. If you dive into the rabbit hole of decades of scientific and biological studies, it could actually be where humans get beauty from. Beauty seems to stem from the combination of maturity and youth: maturity being pubescent, fertile, and aware of the entire ordeal of sexual reproduction. Pedophiles do not see this, and they immediately go for the youngest possible. Nature works with relativity, and since pedophiles aren't able to follow the path evolution intends for them, problems occur and abuse happens. On the other hand, this is why humans are most fertile in their late teens to early 20s. But pedophiles are way, way off the line here.
Really? You're going to bring governments and politics like that into this? Shut up, you fool; saying a government is as bad as someone who intentionally harms children frankly makes you a horrible person. Bluntly, my government is responsible for delivering a century of world peace and prosperity. It's also the only country that puts citizens' speech and rights above the government. Up until the current admin, at least. I'm done with you. You're not worth having a conversation with.
Did you get fired from USAID after making this comment lol?
A century of world peace? In what universe, may I ask? 'Cause it's not this one.
Indeed, it's quite absurd to compare someone who rapes a single-digit number of children to someone like Putin, who has brought death and suffering to many orders of magnitude more people.
At least the US government has usually been less evil than many others. :/
Being a Reddit-brained materialist, you have lost the ability to mentally model the people who are into This Stuff, who specifically get off on the fear and pain that children suffer during the creation of these materials.
You assume the AI stuff will act as a "dampener" when any normal person on the street instinctively understands that it will inflame. That's because it's not about seeing naked bodies. It's about sadism, the euphoria of power over a helpless creature, of defiling an innocent soul. A real soul, not a fake one.
You see This Stuff as just another expression of sexuality, but it's not the same thing. It's a different thing.
Look at stories about people who are caught with gigabytes of This Stuff on their hard drives. They require an ever-flowing wellspring of fresh suffering. They risk their lives trying to get more of it, and that's part of the thrill.
If you really want to ruin your day, google "Eleanor Hunton Hoppe," a respected socialite who was recently caught trying to meet up in the middle of the night at a hotel to abuse an 8yo. She had loads of This Stuff on her computer. A line from texts that were revealed during her trial that stuck with me is "that hazy/dreamlike state is perfect to introduce a variety of new things.” If you have kids, you know what state she is talking about. Kids wake up in the middle of the night, loopy, confused, they just want to snuggle and be silly, it's basically the cutest thing ever, special moments you remember for years. Every parent reading this knows what I'm talking about and is viscerally repulsed to their core at the idea of someone exploiting this particular moment of vulnerability to hurt a child.
That's what This Stuff is all about, and what can you call it but evil? Not to sound like a libtard but THE CRUELTY IS THE POINT.
These are not urges that you can simply "get out of your system." If you think AI-generated alternatives are going to satisfy, I'm sorry but you don't understand human nature and no amount of quirkball *erm checks notes* Bill Nye psychobabble will convince us otherwise.
Any normal person on the street used to instinctively understand that the Earth was flat.