Hacker News | mortarion's comments

In the EU the platform becomes responsible for posted content, the moment someone notifies them that they are hosting something illegal. They have plausible deniability until notified, after which they have a certain time to act, and if they don't they are criminally liable. The user posting the content is also liable, from the moment they made the post.

Are lewd drawings illegal? To my knowledge, unless they are real photographs, they are legally fine, if disgusting.


Compared to the USA, we have incredible privacy in the EU.

Countries in Europe (and most of the world) have positive constitutions, which define what the government "must do" (for its citizens), whilst the USA has a negative constitution that defines what the government "cannot do" (against its citizens).

What constitutes hate speech is carefully defined in the constitutions of EU countries. Politicians can't just amend or extend the definition at will, except in the UK, which has a strange system of laws and not a constitution like you're used to in the USA or in the EU.

In Europe we recognize that Hitler came to power by abusing free speech, which is why using the same rhetoric now can land you in trouble with the law. We also recognize that the pen is mightier than the sword and that unfettered speech can be used to persuade groups of people to use violence against other groups of people.


>In Europe we recognize that Hitler came to power by abusing free speech,

I've heard this again and again - no one mentions that the Nazis had roving bands of men intimidating people like a mob, and that Hitler came to power because of a false flag operation that burned the Reichstag.

But we should forget the physical threats of the Nazis and focus on thin parallels to their ideas, under the guise of 'hate'.

When you do that, you end up with people arbitrarily deciding what's hateful and not, depending on their own values. Chants about English culture threatened by Muslims, hate, chants about Israel and Jews dominating the country, not hate (courtesy of UK hate speech protections).


Hitler was literally banned from public speaking for two years.

The Nazis came to power through widespread normalized political violence, not speech, and banning Hitler from speaking did nothing but further undermine the legitimacy of the government’s mandate to rule.


Joseph Goebbels would have been disappointed to learn that his office was superfluous and irrelevant!

Goebbels and his office represent the direct opposite of freedom of speech, just to counter your Reddit-inspired mic-drop hot take comment.

The point was how they gained absolute power. I would also say that there were multiple factors at work, and I doubt that the GP meant that “abusing free speech” was the only method or reason, but was it not a factor at all? There is often so much “not this but that”; folks should consider “both-and”.

They gained absolute power through violence.

The Nazi party had a private paramilitary wing — https://en.wikipedia.org/wiki/Sturmabteilung — and political violence was both common and integral to their rise.

When the Enabling Act was deliberated and passed, giving Hitler effectively absolute power, Sturmabteilung paramilitary members were positioned both inside and outside the chamber.

That period of history was fraught with political violence enacted by people who claimed a moral imperative to curtail the freedoms of others.


Maybe you should look up child pornography laws in Europe. In Sweden, the mere act of scrolling by an image depicting (real or not) a child in a sexual position, and having it stored in the browser cache, is a crime with up to 2 years of prison time.


How does that negate my initial comment?


Maybe US law makes a distinction, but in Europe there is no difference. Sexual depictions of children (real or not) are considered child pornography and will get you sent to the slammer.


On the contrary, in Europe there is a huge difference. Child porn might get you mere community service, a fine - or even less, as per the landmark court ruling below.

It all depends on the severity of the offence, which itself depends on the category of the material, including whether or not it is CSAM.

The Supreme Court has today delivered its judgment in the case in which the court of appeal and the district court sentenced a person for child pornography offences to 80 day-fines on the grounds that he had downloaded Japanese manga drawings onto his computer. The Supreme Court dismisses the indictment.

The judgment concluded that the cartoons may in and of themselves be considered pornographic, and that they depict children. But these are fantasy figures that cannot be mistaken for real children.

https://bleedingcool.com/comics/swedish-supreme-court-exoner...


> The Supreme Court has today delivered its judgment

For future readers: the [Swedish] supreme court.


This is the cyber crime unit. They will exfiltrate any data they want. They will use employee accounts to pivot into the rest of the X network. They don't just go in and grab a couple of papers, laptops and phones. They hook into the network and begin cracking.


Within the Schengen area, you don't really need an ID to get on a plane either. In fact you can go through security screening in many places without an ID or a valid ticket.


In most of the modern world, it's impossible to go through life without at minimum a bank account (which requires an ID), but not in the USA, where you can live your whole life paying with, and accepting, cash, storing it in your mattress.


CSAM does not have a universal definition. In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14 year old girl (the age of consent is 15) and use Grok to give her a bikini, or make her topless, then you are most definitely producing and possessing CSAM.

No abuse of a real minor is needed.


> CSAM does not have a universal definition.

Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning.

> In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response.

No corroboration found on web. Quite the contrary, in fact:

"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"

https://rm.coe.int/factsheet-sweden-the-protection-of-childr...

> If you take a picture of a 14 year old girl (age of consent is 15) and use Grok to give her bikini, or make her topless, then you are most definately producing and possessing CSAM.

> No abuse of a real minor is needed.

Even the Google "AI" knows better than that. CSAM "is considered a record of a crime, emphasizing that its existence represents the abuse of a child."

Putting a bikini on a photo of a child may be distasteful abuse of a photo, but it is not abuse of a child - in any current law.


In Swedish (translated):

https://www.regeringen.se/contentassets/5f881006d4d346b199ca...

> Even an image in which a child, e.g. through special camera arrangements, is presented in a way designed to appeal to the sexual instinct, without the depicted child being said to have taken part in any sexual conduct when the image was made, can be covered by the provision.

Which means that the child does not have to take part in sexual acts, and indeed undressing a child using AI could be CSAM.

I say "could" because all laws are open to interpretation in Sweden and it depends on the specific image. But it's safe to say that many images produced by Grok are CSAM by Swedish standards.


Thanks, but CSAM includes abuse, and the offence of your quote (via Google Translate) does not.

Your quote's offence looks like child porn. Max. 2 years jail. CSAM goes up to life, at least here in UK. Quite a difference.

> But it's safe to say that many images produces by Grok are CSAM by Swedish standards.

So the Govt/police would have acted against Grok, right? Have they?


" Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning. "

Are you from Sweden? Why do you think the definition was clear across the world and not changed "before AI"? Or is it some USDefaultism where Americans assume their definition was universal?


> Are you from Sweden?

No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk.

> Why do you think the definition was clear across the world and not changed "before AI"?

I didn't say it was clear. I said there was no disagreement.

And I said that because I saw only agreement. CSAM == child sexual abuse material == a record of child sexual abuse.


"No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk."

So you can't speak Swedish, yet you think you grasped the Swedish law's definition?

" I didn't say it was clear. I said there was no disagreement. "

Sorry, there are lots of different judicial definitions of CSAM in different countries, each with different edge cases and ways of handling them. I very much doubt that there is no disagreement.

But my guess about your post is that an American has to learn, once again, that there is a world outside of the US with different rules and different languages.


> So you can't speak Swedish, yet you think you grasped the Swedish law's definition?

I guess you didn't read the doc. It is in English.

I too doubt there's material disagreement between judicial definitions. The dubious definitions I'm referring to are the non-judicial fabrications behind accusations such as the root of this subthread.


" I too doubt there's material disagreement between judicial definitions. "

Sources? Sorry, your gut feeling does not matter. Especially if you are not a lawyer.


I have no gut feeling here. I've seen no disagreeing judicial definitions of CSAM.

Feel free to share any you've seen.


> Even the Google "AI" knows better than that. CSAM "is [...]"

Please don't use the "knowledge" of LLMs as evidence or support for anything. Generative models generate things that have some likelihood of being consistent with their input material, they don't "know" things.

Just last night, I did a Google search related to the cell tower recently constructed next to our local fire house. Above the search results, Gemini stated that the new tower is physically located on the Facebook page of the fire department.

Does this support the idea that "some physical cell towers are located on Facebook pages"? It does not. At best, it supports that the likelihood that the generated text is completely consistent with the model's input is less than 100% and/or that input to the model was factually incorrect.


Thanks. For a moment I slipped and fell for the "AI" con trick :)


This is the actual law (Brottsbalken 16:10a)

https://www.riksdagen.se/sv/dokument-och-lagar/dokument/sven...

A person who

1. depicts a child in a pornographic image,

2. disseminates, transfers, provides, exhibits, or otherwise makes such an image of a child available to another person,

3. acquires or offers such an image of a child,

4. facilitates contacts between buyers and sellers of such images of children or takes any other similar measure intended to promote trade in such images, or

5. possesses such an image of a child or views such an image to which he or she has gained access

shall be sentenced for a child pornography offense to imprisonment for at most two years.

Then there's Proposition 2009/10:70, which is a clarifying document on how the law should be interpreted:

https://www.riksdagen.se/sv/dokument-och-lagar/dokument/prop...

Let me quote (translated):

"To depict a child in a pornographic image entails the production of such an image of a child. An image can be produced in various ways, e.g., by photographing, filming, or drawing a real child. Through various techniques, more or less artificial images can also be created. For criminal liability, it is not required that the image depicts a real child; images of fictitious children are also covered. New productions can also be created by reproducing or manipulating already existing depictions, for example, by editing film sequences together in a different order or by splicing an image of a child’s head onto an image of another child’s body."


Not only that. This law exists in this form because of an EU directive.

https://eur-lex.europa.eu/eli/dir/2011/93/oj/eng

Let me quote again; pay attention to (c)(iv) specifically:

(c) ‘child pornography’ means:

(i) any material that visually depicts a child engaged in real or simulated sexually explicit conduct;

(ii) any depiction of the sexual organs of a child for primarily sexual purposes;

(iii) any material that visually depicts any person appearing to be a child engaged in real or simulated sexually explicit conduct or any depiction of the sexual organs of any person appearing to be a child, for primarily sexual purposes; or

(iv) realistic images of a child engaged in sexually explicit conduct or realistic images of the sexual organs of a child, for primarily sexual purposes;


Thanks. I paid attention but still didn't see how:

realistic images of a child engaged in sexually explicit conduct or realistic images of the sexual organs of a child, for primarily sexual purposes;

covers the example in question:

If you take a picture of a 14 year old girl (the age of consent is 15) and use Grok to give her a bikini, or make her topless, then you are most definitely producing and possessing CSAM.

How about you?


> This is the actual law (Brottsbalken 16:10a)

Thanks, but what makes it "the" actual law? Yours doesn't contain the purported Swedish CSAM definition, or any for that matter. Nor does it even mention abuse.


> a child pornography offense to imprisonment for at most two years.

That underlines the extreme difference w.r.t. CSAM, which can get you life, at least here in UK.


> - in any current law.

It has been since at least 2012 here in Sweden. That case went to our highest court and they decided a manga drawing was CSAM (maybe you are hung up on this term, though; it is obviously not the same in Swedish).

The holder was not convicted but that is beside the point about the material.


> It has been since at least 2012 here in Sweden. That case went to our highest court

This one?

"Swedish Supreme Court Exonerates Manga Translator Of Porn Charges"

https://bleedingcool.com/comics/swedish-supreme-court-exoner...

It has zero bearing on the "Putting a bikini on a photo of a child ... is not abuse of a child" you're challenging.

> and they decided a manga drawing was CSAM

No they did not. They decided "may be considered pornographic". A far lesser offence than CSAM.


You are both arguing semantics. A pornographic image of a child. That's illegal no matter what it's called. I say killing, you say murder, same law though, still illegal.


> I say killing, you say murder, same law though

Not in any European law I know. See suicide and manslaughter.


"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"

Because that is up to the courts to interpret. You can't use your common-law experience to interpret the law in other countries.


> You cant use your common law experience to interpret the law in other countries.

That interpretation wasn't mine. It came from the Council of Europe doc I linked to. Feel free to let them know it's wrong.


So aggressive and rude, and over... CSAM? Weird.


chrisjj has some atrocious takes. But supporting CSAM is the worst he has done by far.


I'm not supporting CSAM. I'm supporting the defence of the term CSAM from attempts at dilution and diminution which downplay the true severity of this appalling crime.


You just seem incredibly argumentative and unreasonable and do not seem to care about "this appalling crime" at all.


The lady doth protest too much, methinks.


That's the problem with CSAM arguments, though. If you disagree with the current law and think it should be loosened, you're a disgusting pedophile. But if you think it should be tightened, you're a saint looking out for the children's wellbeing. And so laws only go one way...


Where do these people come from???


As good as Australia's little boobie laws.



You don't see a huge difference between abusing a child (and recording it) vs drawing/creating an image of a child in a sexual situation? Do you believe they should have the same legal treatment? In Japan for instance the latter is legal.


He made no judgement in his comment; he just observed the fact that the term CSAM - in at least the specified jurisdiction - applies to generated pictures of teenagers, whether real people were subjected to harm or not.

I suspect none of us are lawyers with enough knowledge of French law to know the specifics of this case.


This comment is part of a chain that starts with a very judgemental comment, and is a reply to a response challenging that starting one. You don't need knowledge of French law to want to distinguish real child abuse from imaginary abuse. One can give arguments for why the latter is also bad, but this is not an automatic judgment, it should not depend on the laws of a particular country, and I, for one, am deeply shocked that some could think it is the same crime of the same severity.


Yeah, delivering using Falcon 9.

The Starship stack? Not so much. It's plagued, and will continue to be plagued, by endless problems. BO will beat them with NG.

