Teens are also children. Apple has no business checking whether they send or receive nude pics, let alone telling their parents. This is very creepy behavior from Apple.
The fact that teens are children means that if, say, a 16-year-old sends a nude selfie to their s.o., they've just committed a felony (distributing child pornography) that can have lifelong consequences. Thanks to hysterical laws about sex offender registries, both kids could end up having to register as sex offenders for the rest of their lives and will be identified as having committed a crime that involved a minor. Few if any of the registries would say more than this, and anyone who looks in the registry will be led to believe that they molested a child, not that they shared a selfie or had one shared with them. The laws may not be just or correct, but they are the current state of the world. Parents need to talk to their kids about this sort of thing, and this seems one of the less intrusive ways for them to discover that there's an issue. If it were automatically shared with law enforcement? That would be a big problem (and a guarantee that my kids don't get access to a device until they're 18), but I'm not ready¹ to be up in arms about this yet.
1. I reserve the right to change my mind as things are revealed/developed.
> they've just committed a felony (distributing child pornography)
In the US, maybe (not sure if this is even true in all states), but not in most other countries in the world, where a 16-year-old is not a child, nudity is not a problem, and "sex offender registries" don't exist.
The US is entitled to make its own (crazy, ridiculous, stupid) laws, but we shouldn't let them impose those on the rest of us.
Yet for the most part this is where we are ending up. Just look at Facebook and Twitter deciding what is right and wrong. I think that's wrong in a lot of ways, but apparently there is very little the EU and others can do about it.
I'd argue that the problem of minors being declared sex offenders over nude pictures has reached a critical mass, and that scares me. At this point, offenders who did truly vile things can hide by saying that they are on a sex offender registry because of underage selfies. And I think most people will believe them.
As I understand it, one of the primary purposes of sex offender registries is to get information out about who is on them. I believe some people are forced to knock on doors in their neighbourhood and disclose it. In other situations they would just be getting ahead of the story.
They wouldn't. But if they were sex offenders, they could claim that their offense was simply sending a nude when they were 16. While their real offense may have been rape.
I don't actually know what the law is with regard to sex offenses. I'm simply explaining what I understood from the previous comment.
> if, say a 16-yo sends a nude selfie to their s.o., they've just committed a felony ... The laws may not be just or correct, but they are the current state of the world.
Hence, strong E2E encryption designed to prevent unjust government oppression, without backdoors.
Parents should talk to their teenagers about sex regardless of whether they get a notification on their phone telling them they missed the boat.
I get your points, but the end result is that the client in an E2EE system can no longer be fully trusted to act on the user's behalf. That seems alarming to me.
> Apple has no business checking if they send or receive nude pics. Let alone tell their parents.
Some people might disagree with you.
There are people out there who are revolted by the "obviously okay" case of 2 fully-consenting teenagers sending each other nude pics, without any coercion, social pressure, etc.
Not to mention all the gray areas and "obviously not okay" combinations of ages, circumstances, number of people involved, etc.
So this gives kids a heads-up that they shouldn't send it, and that if they do, their parent will be notified. In case you're not reading this clearly: that parent is me.
Now yes, if someone is SENDING my child porn from a non-child account, I as the parent will be notified. Great.
If people find this terrifying, that's itself a bit scary! HN is going a bit off the rails these days.
Allow me to make a prediction: users are going to like this. It will INCREASE their trust in Apple as a company trying to keep them and their family safe.
I just looked it up: Apple is supposedly the #1 brand globally in 2021. So in customers' minds, they are doing the right thing so far.
"it will be the other kids’ parent who will decide for your kid. Apple will give your children’s picture to the other kid’s parents."
This is an absolute falsehood. Literally a total flat lie. Have people not read the paper?
Apple does not give Android phone users the child porn. They block it on device. So no, it will not be the other children's parents who decide; it will be me, if my child is the one sending it.
And if someone sends my child a photo of child porn, etc., then Apple will block it and will notify me, and that is also fine. If you are sending my child child porn, then I have a right to know, and I will pursue you aggressively, as almost any parent will, I think.
The idea that Apple is doing something wrong here is ridiculous. The lying about what Apple is doing is also ridiculous.
Presumably the sender won't get a notification that the picture they are about to send will be flagged, but it will be flagged by the recipient's device automatically.
Neither participant will have the opportunity to be warned in advance and avoid getting flagged.
How Apple defines a child differs per legal region; for the US it is set to 13 years or younger. Also, your parents need to have added your account to the iCloud Family for the feature to work.
Wanting to know as a parent, and the way Apple is going about this are two different issues.
The government also wants to know about potential terrorist attacks. Why not scan all our phones for all kinds of data to protect innocent people from being killed by mass shootings?
That's nonsense. I'm saying that as someone deeply locked into Apple's ecosystem. Which is pissing me off.
> Parents can’t inform their kids if they aren’t aware
Why not inform your children of the potential consequences when giving them a phone? Why do you need Apple to notify you of inappropriate behavior before having that conversation? That's like waiting until you find a pregnancy test in the garbage before talking to your children about sex.
> Apple has no business checking if they send or receive nude pics.
Furthermore, if Apple is deliberately, by design, resending them to additional people beyond the addressee, as a feature it markets to the people who will be receiving them, that seems to... raise problematic issues.
> In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later. For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.
This specifically says that it will not notify the parents of teens, contrary to what GGP claims. So GP is right that Apple isn't doing what GGP claimed. However, I still think you might be right that GP didn't read the full article and just got lucky. Lol.
You're conflating the CSAM detection of photos uploaded to iCloud with the explicit detection for child devices. The latter is loosely described here: https://www.apple.com/child-safety/.
> Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
That’s only the first part of what was announced and addressed in the article.
The other part is on-device machine-learning scanning of nude pics a child intends to send, which securely warns the child and then notifies the parents within the family account. The alert the kids get will by itself probably be enough to stop a lot of them from sending the pic in the first place.
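Based on the description quoted upthread, the decision logic presumably boils down to something like the following. This is a minimal TypeScript sketch of my own; every name in it is an assumption, and the isExplicit input stands in for whatever Apple's on-device classifier returns. It is not Apple's actual code.

    // Hypothetical sketch of the age-gated flow quoted upthread; these
    // names are invented and do not come from Apple's implementation.
    type Verdict = "send" | "warn" | "warnAndNotifyParent";

    function outgoingImagePolicy(isExplicit: boolean, ageYears: number): Verdict {
      if (!isExplicit) return "send";                  // nothing flagged
      if (ageYears < 13) return "warnAndNotifyParent"; // under 13: parent told if sent anyway
      if (ageYears < 18) return "warn";                // 13-17: warning only, no parent alert
      return "send";                                   // adult accounts are unaffected
    }

    // A 16-year-old sending a flagged image gets a warning, but per the
    // quoted description their parents are not notified:
    console.log(outgoingImagePolicy(true, 16)); // "warn"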
If you are talking about parents giving their children a device to use as a weapon against them, you are implying some really bad parenting and really bad general human behavior by some other parents.
Not that there aren't some really lousy parents out there.
I suspect/hope the exclusion parenting style (you don't get a phone because I don't trust you with it or the rest of the world with you) and the guidance parenting style (you get a phone but I'm going to make sure to talk to you about responsibility and about lousy people in the world, and I want to know if something happens so we can keep talking) both far outweigh this sort of proposed 'entrapment' parenting style.
Good lord - people have scary parenting styles - "information weapons"???
A phone is a tool. You start slowly. Do you give a 3 year old a chainsaw or machine gun right away? No.
I grew up and learned to shoot. I started with a BB gun, then a .22 LR rifle, etc. You start with stuff that's less dangerous.
A phone potentially exposes children to all sorts of crap.
It's not some horrible thing to start slowly with something a bit locked down (and yes, blocking porn is part of that), then it opens up as they get mature.
This conversation has really made me look at the HN community in a different way. I've worked with kids in lots of contexts, and with parents, etc. The HN feelings about Apple's efforts in this area are going to be extreme outliers relative to the population as a whole, and particularly vis-à-vis parents.
It seems like the "guiding parent" style is acceptable to you.
And yes, a phone, and the web and computing in general, are information weapons. Like a knife or a hammer, they have utility too, which is great and something to be lauded. But the dangers must be recognised.
I won't trust Apple or Google to draw the line in the sand for me. Both the kid and myself will know what the expectations are and be talking about it the whole way through growing up.
I agree with you in principle, but I also know that kids will soon find methods of sharing that defeat any scans. Other apps and ephemeral websites can be used to escape Apple’s squeaky-clean version of the world.
But if I'm picking a phone for my kid, and my choice is this (even if imperfect) and the HN freedomFone - it's going to be Apple. We'll see what other parents decide.
@philistine is right that Apple's solution can only work for the short term. Parents want an easy, automatic, "out of sight, out of mind" solution, but the only solution that will work long-term is talking with kids and educating them instead of outsourcing parenting to a tech company.
It's easy to design a workaround that lets either criminals or kids send nude pictures under the radar, e.g. (see the sketch after this list):
1) Someone will eventually write a web app that uses WebRTC or similar for snapping the nude picture (so that no picture is stored on the device),
2) encrypts those photos on the server,
3) sends a link to the nude image, which will be rendered in an HTML canvas (again, no image is stored on the device), and
4) puts the web app that renders the image behind a captcha so that automated bots cannot scan it.
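For what it's worth, steps 1 and 3 require nothing exotic. Here is a minimal browser-side sketch in TypeScript; the /upload endpoint is hypothetical, and the server-side encryption (step 2) and captcha gate (step 4) are omitted:

    // Step 1: grab a single camera frame straight into a canvas, so no
    // image file is ever written to the device's storage.
    async function captureFrame(): Promise<Blob> {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      const video = document.createElement("video");
      video.srcObject = stream;
      await video.play();

      const canvas = document.createElement("canvas");
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext("2d")!.drawImage(video, 0, 0);
      stream.getTracks().forEach((t) => t.stop()); // release the camera

      return new Promise((resolve) => canvas.toBlob((b) => resolve(b!), "image/png"));
    }

    // Upload the frame to a hypothetical endpoint; the server would
    // encrypt it at rest (step 2) and hand back a shareable link.
    async function upload(frame: Blob): Promise<string> {
      const res = await fetch("/upload", { method: "POST", body: frame });
      return (await res.json()).url;
    }

    // Step 3: the recipient draws the bytes into a canvas instead of an
    // <img> download, so nothing lands in the photo library or on disk.
    async function renderToCanvas(url: string, canvas: HTMLCanvasElement) {
      const bitmap = await createImageBitmap(await (await fetch(url)).blob());
      canvas.width = bitmap.width;
      canvas.height = bitmap.height;
      canvas.getContext("2d")!.drawImage(bitmap, 0, 0);
    }

None of this is sophisticated, which is the point: a scanner that looks at the photo library or message attachments never sees an image that only ever exists as pixels in a canvas.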
Now, do we want to go down the rabbit hole of 'protecting the kids' and make Apple build camera drivers that filter every video stream for nudity?
> the only solution that will work long-term is talking with kid and educating them instead of out-sourcing parenting to tech company.
Sometimes otherwise good, smart kids, with good parents, can be caught up by predation. It isn't always as simple as making sure the kid is educated; they'll still be young, they'll still be immature (< 13!), and it will sometimes be possible to manipulate them. I'm not saying that what Apple is doing is a good answer, it probably isn't, but it's an answer to a genuine felt need.
Under 13 seems like an OK cutoff, but I'd be very concerned if they push it to under 18. Pretty much everyone is sharing adult content responsibly at 17.
Nude pic ID is routine online. Facebook developed this capability over 5 years ago and employs it liberally today, as do many other net service providers.
Parents do have a legal and moral responsibility to check on their children’s behaviour, and that includes teens. It’s somewhat analogous to a teacher telling parents about similar behaviour taking place at school.
I suspect a lot of how people feel about this will come down to whether they have kids or not.
You describe it as if Apple's got people in some room checking each photo. It's some code that notifies their parents in certain situations. ;P
I know several parents in just my extended circle alone that would welcome the feature, so... I just don't think I agree with this statement. These parents already resort to other methods to try and monitor their kids but it's increasingly (or already) impossible to do so.
I suppose we should also take issue with Apple letting parents watch their kids' location...?
Edit: I'm talking about this https://pbs.twimg.com/media/E8DYv9hWUAksPO8?format=jpg&name=...