If your child is searching for adult partners, complete strangers, over the Internet, you as a parent have a lot to answer for.
That being said, none of that justifies what happened to the minors mentioned in the article. Ultimately, there are a lot of bad people out there. It would be great for children to not be exposed to such people.
But I think the necessary change is parental supervision, not government intervention.
I'm going to hazard a guess you don't have kids. Short of bringing up children in a panopticon that gives no privacy, ever, with continual spot checks of their apps and usage, or allowing only a feature phone until 18, what do you expect? Watching over their shoulder the entire time they use a device? Bolt down the home router and rely on network 18+ limiting, and they can still use their friend's or Costa's wifi.
I might prefer to limit younger children's access; I can advise, try to give them a degree of common sense, and warn against blind trust. But I can't, as a parent, prevent them doing some very daft things. It's part of the rite of passage of every teen.
I'd aim to be supportive enough that they can come to me when they fuck up for support and help. I can't in good conscience blame other parents for every lapse.
Her Wikipedia article offers a quick summary:
https://en.wikipedia.org/wiki/Judith_Rich_Harris
In short, kids, especially teenagers, aren't out to be like their parents. They want to be like their peers. And what we often see as the effects of good parental influence or supervision may simply be heritable behavior.
What about the internet suddenly absolves the company from being diligent and responsible?
If this was shops selling alcohol & cigarettes to 14 year olds, would your response be "where the hell are the parents?"
Parents aren't going to monitor their kids 24/7.
> If your child is searching for adult partners, complete strangers, over the Internet, you as a parent have a lot to answer for.
I don't have children, yet I think this is a very uninformed comment. Sexuality and attraction (and even more so perceived attraction) are complex and multifaceted things, so I think jumping to blaming the parents is _very_ simple-minded.
It's not about monitoring your kids 24/7. It's about instilling values in them precisely so they can monitor themselves. This is what differentiates good upbringing and clueless parenting looking for scapegoats.
I think we understand that parents are not in complete control over their children, and in the interest of harm reduction, we don't give parents the entire burden of keeping their children safe but spread parts of it out among the community. This doesn't seem like a particularly burdensome request, here. Some parents are terrible parents, some kids are complete terrors, sometimes there are circumstances outside your control. Parents are not gods.
It clearly states that Tinder and Grindr already monitor for minors. They state that they spend millions of dollars on this effort, and given their volume, I'm guessing that's not an exaggeration.
What lawmakers want is ID required.
This is an extremely burdensome requirement. Having every new user of every online platform verify a photo ID against a webcam is... very onerous. And even then, what's to stop a kid from using his parent's confirmed account?
Alcohol is a deadly substance purchased through a transaction in public. Comparing that to creating an online profile in the privacy of your home is a big stretch. It would be more like... I don't know, maybe buying a can of figs off Amazon.
Can a child buy a can of figs without ID? Not easily, but he could. Could he hurt himself with it? Yes. Should we require ID for this? I'd say not.
The article doesn't seem to mention anything about the age verification the kids forged, or the parents' computers they used. But don't worry: if it were convenient for the powers that be, they would have framed this as "Evil kids get into banking system by HACKING."
Funny how I'm sure they won't be asking Amazon to verify ID every time a purchase is made. I mean, plenty of kids have ordered downright dangerous things this way, probably even caused a death.
The worst part about the authoritarian-inclined who use the "what about the children?" argument is that many times they ironically make things less safe for children, since their solutions aren't particularly well thought through (see the drinking age in the US), all while making it worse for the rest of us.
The thing about the outrage crowd is that they point to a problem while implicitly saying "anyone who doesn't support 'the solution' is in favor of the problem." I'm not in favor of the problem. I'm in favor of fixing the problem IF it can be fixed in a cost-effective way. Many people can't accept that there are no widespread, easy, and simple solutions to giant, complex social problems. Accepting that this behavior is an emergent quality of our individual actions, and pruning one's own actions accordingly, is very hard. Most people shy away from this and look for externalizations.
Not every aspect of every interaction online needs to automate out human interaction.
The other aspect is to actively encourage REPORTING of minors on the platform.
Another possibility is to allow 16+ year olds but wall them off from those over ~21, with a big fat banner on the account that it's a minor.
So to truly keep minors off adult dating apps, you'd have to require everyone to submit photo ID, and also somehow check that they haven't just taken a picture of someone else's ID and used that, and I'm not sure how you'd enforce that.
Smartphones are a playground filled with jagged edges and predators behind an entrance of flashing lights and music. People made it, profit from it, and those same people should be held liable for their actions.
This parent isn't understanding that there are cost-benefit tradeoffs and diminishing returns. We already filter kids from such platforms, and it works 99.999% of the time; so exactly where should the bar be, and at what cost?
This parent is also filled with innuendo, seemingly unable to make points that stand on their own through actual dialectics.
> Last month the BBC reported on the death of a 14-year-old schoolgirl who killed herself in 2017 after being exposed to self-harm imagery on the platform.
Fantastic. Everything attributable to a public pseudonym privately tied to a real person.
This sounds exactly how it should be.
If somebody tries to subvert the law, they can easily be arrested and tried in a court of law.
The reason is simple: even if you trust the current government to not abuse all the tasty meta-data, who says the next government, led by the party you vocally criticized on the Internet, will look away?
Just look at China, or what Turkey is doing, or even what US border checks asking for your social media logins are doing right now.
After I realised the issue, we integrated the Amazon Rekognition API to detect an age range under 18 and block any profile that had a child's picture as the user photo. The issue is that Rekognition isn't perfect, and we lost several users to false positives from this integration, yet we continued to use it for moderation until the end. We were processing >200,000 profile pictures each month for this.
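A minimal sketch of that kind of moderation check, assuming boto3: the Rekognition `detect_faces` call with `Attributes=['ALL']` is the real AWS API (it returns an `AgeRange` per face), but the cutoff policy and helper names below are illustrative, not the commenter's actual code.

```python
# Sketch of an age-range moderation check via Amazon Rekognition.
# The decision rule (block when the whole estimated range is under 18)
# is an assumption for illustration; even so, it yields false positives.

ADULT_AGE = 18

def looks_underage(face_details, cutoff=ADULT_AGE):
    """True if any detected face's estimated AgeRange sits entirely
    below the cutoff. Rekognition reports AgeRange as
    {'Low': int, 'High': int}."""
    return any(face["AgeRange"]["High"] < cutoff for face in face_details)

def profile_picture_flagged(image_bytes):
    # boto3 is imported lazily so the pure decision logic above can be
    # exercised without AWS credentials or the boto3 dependency.
    import boto3
    client = boto3.client("rekognition")
    resp = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # the default attribute set omits AgeRange
    )
    return looks_underage(resp["FaceDetails"])
```

Because Rekognition only ever returns an estimated range, any single cutoff trades false negatives against the false positives the commenter describes.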
For the most part, aside from cases using someone else's credit information, these businesses have information on age, and with facial recognition they should be able to flag most falsified accounts.
To further prevent problems - like kids taking their parent's ID and then changing the name and age - dating sites could bind the age of the site profile to the license provided. So if junior uses his old man's ID, he's going to show as 45, not 18, and he won't be able to change it. That has the additional benefit of stopping a lot of the lies related to age on dating sites.
Lastly, they could also bind the first name only to the profile, hide the last name from other users, and allow a nickname or preferred handle to be displayed. So you can't upload a license for John Doe, age 30, and then say in your profile that your preferred name is Tony while you look 15. That would be a red flag to anyone. But it makes perfect sense for Robert who likes to be called Bob, or William who goes by Bill, and actually looks his age.
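The binding proposed above amounts to data-model discipline: the displayed age is always derived from the verified document and simply has no setter. Everything in this sketch (class names, fields) is a hypothetical illustration, not any real site's schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass(frozen=True)
class VerifiedIdentity:
    """Facts copied from the photo ID at verification time; frozen,
    so nothing here can be edited after signup."""
    first_name: str      # shown to other users
    last_name: str       # stored but never displayed
    birth_date: date

@dataclass
class Profile:
    identity: VerifiedIdentity
    nickname: Optional[str] = None   # "Bob" for Robert, etc.

    @property
    def display_name(self) -> str:
        return self.nickname or self.identity.first_name

    @property
    def age(self) -> int:
        """Recomputed from the verified ID on every read; there is no
        setter, so a kid who borrows dad's license shows dad's age."""
        today = date.today()
        b = self.identity.birth_date
        return today.year - b.year - ((today.month, today.day) < (b.month, b.day))
```

Attempting `profile.age = 18` raises an `AttributeError`, which is exactly the property being proposed: the profile's age can never drift from the document it was verified against.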
1- There are still strong taboos against certain kinds of relationships; being forced to tie one's dating preferences and behaviors to a governmental ID seems dangerous.
2- Should companies get automatic access to one's name and age based solely on the ID number? Seems like a violation of privacy.
3- Mandating that for apps served over Google Play or the App Store is not hard; what about dating webapps served from other jurisdictions?
Finally: will it really work? Looking 15 is already a red flag, because no minor is supposed to be there, yet they are.
2. Not sure I agree this is a violation of privacy since a) you would be providing them this exact same information when building your new profile upon account creation and b) you would agree to its collection as part of the signup process thereby waiving any privacy right you may or may not have in this regard. It would also be considered something a reasonable person would expect the third party to be given when supplying that identification number. For example, I assume that when I provided my information to my insurance company when applying for a personal disability policy that they checked everything on those documents for accuracy (name, age, address, etc.).
3. I was assuming this was legislation and would therefore apply to all services of this nature as long as they operate in the country/state where the legislation is effective.
4. 15 year olds are able to exist on these platforms in part because some 18 year olds look very young while others look much older. So someone can be 18 or 20 and look 15-16.
Am I just jaded or does this seem to be the way things are going?
It's been this way since at least the Mann Act, which was ostensibly about trafficking but was used to crack down on interracial relationships (notably that of Jack Johnson, who recently got a posthumous pardon).
"Remember when some tea company added 'blockchain' to its name and stock prices soared? 'Trafficking' is the equivalent word in the policy / non-profit world." -- Alex Frell Levy [2]
[1] https://amp.theguardian.com/us-news/2018/sep/30/houston-robo...
Nobody in first-world countries, you mean. In third-world countries, a lot of what gets categorized as “human trafficking” is just self-motivated border-crossing by career prostitutes.
You know how you might find a fellow working in America who is from, say, Bangladesh, and is here for the jobs that pay 100x as much as jobs in Bangladesh do, so he can send money home to his family? Well, in the hotbed countries of “human trafficking”, many of the women involved are just people who did exactly the same thing, moving from countries like Cambodia (“human trafficking source countries”) to countries like Thailand (“human trafficking destination countries”) because they know there’s far more of a market of sex tourists there with higher expectations of average prices. And her family back home? Thinks exactly the same of her as the family of the Bangladeshi man working in America does. “She’s providing for us; we’re proud of her.”
The annoying thing is that “human trafficking” was established to go after what is, at its core, a real and horrifying crime—the combination of kidnapping and slavery that mafias tend to consider the highest-margin way to operate brothels, factories, etc. If someone isn’t themselves doing the kidnapping, or operating the brothel, but is just, say, driving the kidnapped people around, we didn’t have a crime to charge them with and had to let them go, until we invented “human trafficking.” It was essentially a gang-busting crime, a way to put pressure on ground-level members to get them to give up their higher-ups. But now, basically everything related to the original intent seems to fall under the aegis of that crime†, and its scope has grown to the point that we’ve forgotten why we invented it.
† Probably because of motivated reasoning of statisticians working for the trafficking-related nonprofits. Sort of like the motivated reasoning of statisticians working for cigarette or sugary-cereal companies.
What do you mean by “another”? When has regulating sex with children had the effect of making consensual adult sex illegal?
In these cases, how are children getting in first? Where are the parents of these children??
Then, it's not the apps that create abuse, but the users on the other side. There could be a set of rules that the apps should make users agree with, and remind from time to time, like be kind, don't get angry if it doesn't work out, report strange behaviors/underage profiles/illegal stuff if you happen to detect one, ...
Is it that complicated to be adults in 2019?
Technology changes what it's possible for humans to do (or, more often, makes things qualitatively more or less feasible). Tool-makers should not take it as axiomatic that they have no responsibility for, or involvement in, the ways their tools impact other people's lives.
The people in question are very literally not adults.
"Guns don't mass murder people, humans do"
In the real world, if I saw a child go on dates with a grown adult, I'd alert the police and have the man arrested, then I'd do my damnedest to have the restaurant shut down for facilitating the grooming of minors.
And most importantly, _in the real world_, we all have visibility of this and society as a whole acts in disgust and would (legal or otherwise) punish the offenders and people who facilitated and accommodated this.
> There could be a set of rules that the apps should make users agree with
A pinky promise maybe...
You lost me there. This is an entirely unreasonable burden.
Yes, that is true, even though you are being snarky. In fact, guns don't mass murder people, humans, put together into larger organizations (typically governments), do.