Apps like Instagram and YouTube should at least be required to offer an option to disable Reels and Shorts.
Don’t forget WhatsApp. Kids are allowed to have WhatsApp for messaging, but they get fed videos there too, and there is no way to really disable them. This should also be a parental-supervision control, not something kids can override.
In before someone says ‘blame the parents’ and not the multi-billion dollar companies who’ve spent decades targeting children for lifelong addiction, ignoring the negative effects on their mental health.
It need not be either-or.
The guy who made the drugs is guilty. The guy who sold the drugs to kids is guilty. But parents who failed to warn kids about drugs and to oversee them properly are also guilty...
Generally in an article about arresting or sentencing a drug dealer, people don't bring up that the drug users are actually to blame.
Now if we're in a discussion around the cartels, plenty of people do bring up (and there's also those that get annoyed by it) that the drug users are actually the ones funding the cartels via their drug use.
Along these lines, I think another fun comparison might be opioid use and Purdue.
So is the legal system that hasn't made this illegal, or doesn't enforce the laws that would prevent people from targeting kids to create early dependence on drugs.
That is a fair point. I didn't attempt to make a complete list, of course, and you're right that there are more layers that could be named, all valid. The point I was making is that parents are also responsible.
E.g., I grew up in a very nasty place. My neighborhood had a few pregnant 13-year-old girls and a lot of drunks and smokers, including kids in their early teens. My parents kept me away from it all, while both holding full-time jobs. They put a lot of work into filtering whom I could be friends with and where I was allowed to be. THAT is the job of a parent.
The thing is, it should be both. Parents often give too few fucks about the long-term welfare of their children, and are often guilty of the same vices. The issue is that these addictions are far more destructive to a young, forming mind than to an adult's. Nobody who has small kids now had Facebook or Instagram access when they were 5, did they?
Maybe you don't do this. Certainly I don't. But looking around, it's much less rosy, and, let's say, in blue-collar families it's too common to drug kids with screens so the parents get some time off. Heck, some are even proud of how modern they are as parents. Any good advice is successfully ignored, and ideas of actually spending proper time with kids are skillfully avoided. People have gotten lazy and generally expect miracles from life without putting in any miracle-worthy effort.
Companies just maximize their profits as far as the law allows (and then some); expecting nice moral behavior by default is dangerously naive and never true.
This just seems ripe for selective enforcement if not codified in law. I agree the algorithm they use can be addictive, but that's because it's simply good at providing content the user wants to consume.

Besides a general "don't be too good," I'm really not sure what companies should do about it. It just seems like it'll lead to some judges allowing rulings against companies they don't like.
Television's goal was always viewer retention as well, they were just never able to target as well as you can on the internet.
I see it as similar to the public health crisis created when protonated nicotine salts, along with flavors, made their way into vapes, allowing 2-10x more nicotine to be delivered: the innovation that made Juul so popular with children.
The subsequent effects - namely being easier to consume and more addictive - eventually resulted in legislation catching up, and restrictions on what Juul could do. It being "too good" of a product parallels what we're seeing in social media seven years later.
Like most (if not all) public health problems, we see individualization of responsibility touted as a solution. If individualization worked, it would have already succeeded. Nothing prevents individualization except its failure of efficacy.
What does work is systems-level thinking and considering it an epidemiological problem rather than a problem of responsibility. Responsibility didn't work with the AIDS crisis, it didn't work on Juul, and it's not going to work on social media.
It is ripe for public health strategies. The biggest impediment to this is people who mistakenly believe that negative effects represent a personal moral failure.
Let's just be honest: if you make enough money, it's legal in America.

Unless you hurt children; then it's mostly legal, with a slap on the wrist.
Nukes are the same as knives, just different in magnitude. Should one have special rules?
Are there any takeaways here for builders of social media applications who are not Facebook or Google? Is this a warning to not make your newsfeed algorithm "too engaging" or is it only really relevant for big companies?
I'm not an authority on this matter. But if you say "I can stop any time", and it is not true, then you have a problem.
IMO, parents share just as much blame here, if not more. Giving your kids independence doesn't mean being oblivious to what they're doing online. Too many parents confuse hands-off parenting with not parenting at all.
Have you met kids? They’re devious, tech-savvy, and scheming, and can find ways around any rule. Plus, no matter how good a parent you are, you’re somewhat at the mercy of their friends’ parents as well. I can block TikTok on my daughter’s phone, but I can’t block her from watching her friend’s phone while she’s out of the house.
Notably a different case from the other one in New Mexico:
Jury finds Meta liable in case over child sexual exploitation on its platforms
https://news.ycombinator.com/item?id=47509984
> During his first-ever appearance before a jury in February, Meta's chairman and chief executive, Mark Zuckerberg, relied on his company's longstanding policy of not allowing users under the age of 13 on any of its platforms.
> When presented with internal research and documents showing that Meta knew young children were in fact using its platforms, Zuckerberg said he "always wished" for faster progress to identify users under 13. He insisted the company had reached the "right place over time".
Soon there will be government IDs required to use social media sites, because parents can't take phones away from their kids.