The libertarian argument, in this case, boils down to the idea that if you have some new, blatantly addictive technology that may well be associated with depression, narcissism, and self-harm, you have to wait for absolute certainty about that association before even thinking about limits on how kids use it, because there was once a moral panic about comic books and look how embarrassing that turned out to be. Maybe I’m giving my 13-year-old self too much weight here, but I’m not convinced.
But if we are ready to think about putting limits on the teen Instagram experience, we probably need something more than a general rage against reckless Silicon Valley nerds. Yes, ideally social media companies would self-regulate in their dealings with teens, and it’s true that following the bad publicity from the Wall Street Journal’s reporting, Facebook is temporarily halting plans to launch a version of Instagram explicitly for kids. But real and sustained self-regulation usually occurs only under threat of external action, or with the establishment of a new consensus about what is acceptable to sell to children. So for people who read the Journal’s reporting and come away furious with Facebook, the question should be: what consensus, exactly, do you want? What standards do you expect Instagram, or any other business, to follow? In light of the data, what rules should they have to obey?
And if your answer is that they should be forced to invent an algorithm that doesn’t fuel depression or anxiety, then I’m not sure I take your anger seriously. That sets us up for a future of endless public promises to fine-tune the algorithm, coupled with constant behind-the-scenes pressure to maximize engagement, mental health effects be damned. (A future very similar to our own present.)
No, if you really want precautionary steps that could genuinely limit the damage done by social media, you need those steps to be much simpler and more straightforward: you have to create a world where social media is understood to be for adults and older teenagers, and where the networks are expected to police their membership and try to keep out children under 16 or 18.
What would be lost in such a world? Arguably, social media provides essential forms of connection and belonging to kids who are isolated and unhappy in their flesh-and-blood environments. (Although if this were really the case, you would expect the past decade to have been an inflection point toward improved adolescent mental health, which it most certainly was not.) Arguably, it provides opportunities for kids to experiment creatively and develop as artists and innovators. (Although the belief that TikTok nurtures aesthetic genius sometimes feels like a philistine illusion, nurtured by an adult establishment that lacks the self-confidence to actually educate its children in the distinction between quality and trash.)
In either case, however, in a world where Instagram couldn’t rely on 15-year-olds to pad its numbers, some of these alleged benefits of social media would still be available through the wider internet, which offered all kinds of communities and all kinds of outlets for creativity before Twitter and Facebook ever arrived.
A key issue with social media, from this perspective, is not just its online character but its scale. As Chris Hayes puts it in a recent essay for The New Yorker, the contemporary internet universalizes “the psychological experience of celebrity” and takes “all the mechanisms of human relationships and puts them to work” in the pursuit of more. But this happens much more intensely on a network like Instagram, with its millions of swarming users, than in a forum or chat room devoted to some specific niche identity or interest.