Instagram is adult entertainment


The rise of big tech and social media presents a series of difficult, if not intractable, problems for Western societies. Our internet behemoths are in fact huge media companies that pretend to be neutral platforms, feasting on the revenues that once supported the old media ecosystem while rejecting normal forms of editorial responsibility. Their key products are agents of decentralized suspicion, generating information overload and fueling both populist paranoia and centrist hysteria. Meanwhile, their leaders run transnational pseudo-governments, wielding traditional political powers – cultural censorship, political banishment, structuring large markets – without clear lines of political accountability.

Figuring out how to deal with these challenges is a generational political project, and there is a reasonably strong possibility that Facebook’s Khanate and Amazon’s Most Serene Republic will defeat the efforts of real-world republics to restrain their power.

Nonetheless, there is a starting point that is relatively free of the political dilemmas that cloud most Internet regulatory plans: We can try to further insulate childhood and adolescence from the reach of social media.

Two weeks ago, the Wall Street Journal reported on what Facebook’s own internal research shows about how Instagram, its photo-based social network, is affecting the mental state of the roughly 22 million teens who log on in the United States every day. The revelations won’t come as a surprise to anyone who has looked at social trends since the dawn of the social media age, or for that matter to anyone who knows someone with teenagers: Internal documents suggest that the app has fueled teenage depression and anxiety, suicidal ideation and body-image problems.

These are not the first results to link the use of social media to young people’s discontent, and whenever information like this enters the public conversation, there are two main reactions.

On the one hand, skeptics who fear a runaway moral panic, and who are inclined to give new technologies the benefit of the doubt, try to pick apart the data, arguing that correlation is not causation (perhaps children who are already prone to unhappiness are simply more likely to spend more time online, and so on). These responses assume that advocating restrictions on a product that people clearly like to use is inherently dangerous or illiberal, and that the burden therefore falls on the would-be restrictionists to establish ironclad proof of the danger they fear.

Alternatively, from people willing to believe the evidence that social media is bad for you, there is a familiar wave of anger at the tech companies themselves, who are accused of caring only about money (“Expanding its base of young users is vital to the company’s more than $100 billion in annual revenue,” the Journal notes of Facebook, “and it doesn’t want to jeopardize their engagement with the platform”) instead of being socially responsible and recognizing that they are a bunch of nerds getting rich by ruining the world.

My personal feeling is that when you are dealing with children, neither of these reactions is quite right. Many of the problems created by Internet companies involve the aggregation of decisions made by, for lack of a better phrase, consenting adults. Amazon has helped hollow out the heart of the United States in part because millions of people love convenience and low prices. Misinformation, rumors and fake news have spread across Facebook in part because there is a strong human predisposition to share things that confirm our own biases and, in this country, First Amendment protections for doing so. And while the common good may require that certain adult decisions be overruled or curtailed, in a free society we are rightly hesitant about making that kind of judgment.

Restricting the decisions of minors, however, is another matter. A 14-year-old has no more constitutional right to use Instagram than she has a constitutional right to buy a fifth of Hennessy, and strong limits on teenage access to various substances and products are a normal feature of a liberal society, opposed mainly by the sort of libertarian who forever identifies with his 13-year-old self.

That libertarian’s argument, in this case, boils down to the idea that if you have some new, blatantly addictive technology that may well be associated with depression, narcissism and self-harm, you have to wait for absolute certainty about that association before you even start thinking about limits on how kids use it, because there was once a moral panic about comic books and that turned out to be embarrassing. Maybe I have buried my 13-year-old self too deeply, but I am not convinced.

But if we are ready to think about putting limits on the teenage Instagram experience, we probably need something more than a general rage against the reckless nerds of Silicon Valley. Yes, ideally social media companies would regulate themselves in their dealings with teens, and it is true that after the Wall Street Journal’s bad publicity, Facebook is temporarily shelving its plans to launch a version of Instagram explicitly aimed at kids. But real and sustained self-regulation usually happens only under the threat of outside action, or once a new consensus has formed around what it is acceptable to sell to children. So for people who read the Journal’s reporting and come away furious with Facebook, the question should be: What exact consensus do you want? What standards do you expect Instagram, or any other business, to follow? Given the data, what rules should they have to obey?

And if your answer is that they should be forced to invent an algorithm that doesn’t fuel depression or anxiety, then I’m not sure I take your anger seriously. You are setting us up for a future of endless public promises to fine-tune the algorithm, coupled with constant behind-the-scenes pressure to hook as many users as possible, mental-health effects be damned. (A future, in other words, very much like our own present.)

No, if you really want precautionary steps that might actually limit the damage done by social media, you need those steps to be much simpler and more straightforward: You have to create a world in which social media is understood to be for adults and older teenagers, and in which the networks are expected to police their memberships and try to keep kids under 16 or 18 out.

What would be lost in such a world? Arguably, social media provides essential forms of connection and belonging to kids who are isolated and unhappy in their flesh-and-blood environments. (Though if that were really the case, you would expect the past decade to have been an inflection point toward improved adolescent mental health, which it most certainly was not.) Arguably it provides opportunities for kids to experiment creatively and develop as artists and innovators. (Though the belief that TikTok nurtures aesthetic genius sometimes feels like a philistine’s delusion, nurtured by an adult establishment that lacks the self-confidence to actually educate its children in the distinction between quality and trash.)

Either way, though, in a world where Instagram couldn’t count on 15-year-olds to pad its numbers, some of these supposed benefits of social media would still be available through the wider Internet, which offered all kinds of communities and all kinds of outlets for creativity before Twitter and Facebook ever arrived.

A key problem with social media, from this perspective, is not just its online character but its scale. As Chris Hayes puts it in a recent essay for The New Yorker, the contemporary Internet universalizes “the psychological experience of fame” and takes “all the mechanisms of human relationships and puts them to work” seeking more of it. But this happens in a much deeper way on a network like Instagram, with its millions of users swarming together, than it would in a forum or chat room devoted to some specific niche identity or interest.

So the goal of keeping teens off the major social media platforms wouldn’t be to achieve perfect compliance (obviously kids would always find ways to slip through), or to prevent some version of teen Facebook or teen TikTok from taking shape on a smaller scale. It would be to allow an experience of adolescence freed from the automatic pressure to congregate on platforms designed to be panopticons, to host performances aimed at tens of millions of viewers and to create addictive pressures that clearly drive fully mature adults a little bit crazy.

Saving those adults may not be possible. But taming the Internet enough to preserve a childhood free of its worst effects – well, if we can’t even manage that, we deserve the bleak future the algorithms have prepared for us.

ROSS DOUTHAT is a columnist for the New York Times.
