
I'm not sure what the solution is, but to steelman it a bit: the alternative is that kids have access to all the adult spaces, where they will be groomed. A website or app serving grooming content to a kid is vanishingly unlikely compared to a kid being groomed as a result of having unrestricted access.

Since I do not see a solution, and you see identifying children as a risk, what do you see as a solution for kids being in the same spaces as adults? Do you see a reasonable implementation to separate them, that doesn't have the "we know which accounts are children" problem? Maybe there's something in between?

Also, I think it's important to understand the life of a modern child, who is in front of a screen 7.5 hours a day on average [1], increasingly on social media, with half of kids having unrestricted access to the internet [2].

I hate government control/nanny state, but I think 5 year olds watching gore websites, watching other children die for fun, is probably not ok (I saw this at the dentist). People are really stupid, and many parents are really shitty. What do you do? Maybe nothing is the answer?

[1] https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Fam...

[2] https://fosi.org/parental-controls-for-online-safety-are-und...




The solution is parental liability.

So say one of the 50% of children that have unrestricted access goes somewhere they shouldn't, or interacts with people they shouldn't. How is it detected so the parents can be held liable? What does the implementation look like to you?

The same way anything illegal is detected: a police report.

You misread my comment.

How is it detected? A police report is for after it's detected.


At the very least, the affected parties would know if a crime has been committed.

Preventing the crime from happening is out of scope of the government, as it should be.


> At the very least, the affected parties would know if a crime has been committed.

The affected parties are unsupervised children, who are accessing adult spaces and content. Are you saying the children will tell on themselves?

Maybe take a moment to re-read this comment chain.


So never.

As the problem is adults trying to groom kids, the answer is robust detection and enforcement of the current anti-grooming laws.

It's ironic that people supposedly care about this when there's also a child rapist/murderer being kept safe as President without being held accountable for his crimes.

I suppose this law could be used as a defense against getting caught grooming minors - "I thought they were adult as surely a kid wouldn't be able to access that chat group"


> robust detection and enforcement

How, exactly, does one accomplish "robust detection of a child"? I assume your answer would include complete surveillance of all internet communication? Could you expand on your idea of the implementation?


Sorry if I wasn't clear - I am proposing that the adults face the robust detection and enforcement of anti-grooming laws. One method is to set up honeypots with law enforcement officers playing the part of an innocent child (while avoiding entrapment), then throwing the full weight of the law behind any adult showing predatory behaviour.

What I propose is that rather than putting all the effort into preventing children from entering dangerous adult spaces, it's better to put the effort into ensuring that sex criminals are prosecuted and into making adult spaces less dangerous.


I think an obvious problem with this method is scaling, partly because grooming is not a local phenomenon. It would require worldwide cooperation, especially from the handful of countries where offenders are statistically concentrated.


