• megopie@beehaw.org · 10 days ago

    I mean, there are huge issues with tech, but they’re in no way limited to kids… nor do they seem to affect kids particularly strongly.

  • TachyonTele@lemm.ee · 10 days ago

    Every generation, this stuff gets brought up, and every time it amounts to nothing. Of course the panic is wrong.

    Rock and roll. TV. Videogames. Social media…

    • Midnitte@beehaw.org · 10 days ago

      Indeed, but to riff on the article a bit: the thing that’s different is that social media has demonstrable harm.

      We need to be teaching kids to use it responsibly and regulating tech companies so they provide it responsibly, not just banning it and grabbing screens out of kids’ hands.

      • Rivalarrival@lemmy.today · 9 days ago

        the thing that’s different is that social media has demonstrable harm.

        Is that actually a difference?

        Rock and roll causes harm: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8580930/

        TV causes harm: https://www.health.harvard.edu/mind-and-mood/too-much-tv-might-be-bad-for-your-brain

        Video games cause harm: https://www.apa.org/news/press/releases/2000/04/video-games

        Pretty much everything kids do that their parents didn’t has been “proven” to cause harm. Radio, cinema, comic books, even newspapers were “proven” to harm young people.

        Authoritarianism is a far bigger threat than any of these.

        • Gaywallet (they/it)@beehaw.org · 9 days ago

          I do want to point out that social media use may be one of the first of these ‘evils’ whose harms reach actual statistical significance on a large scale. I’ve seen meta-analyses showing an overall positive association with negative outcomes, as well as criticisms of those analyses and studies finding no correlation, but the sum of them (a meta-analysis of meta-analyses) shows a small positive association with “loneliness, self-esteem, life satisfaction, or self-reported depression, and somewhat stronger links to a thin body ideal and higher social capital.”

          I do think this is largely a public health issue, though, in the same way that TV and video games can be public health problems: moderation and healthy interaction are the important part. If you spend all day playing video games, your physical health might suffer, but that can be offset by playing games that keep you active or by doing other physical activity. I believe the same can be true of social media, though it is a much more complex subject. Managing mental health involves many factors: for some it may simply be about reframing how they interact with the platform; for others it may be about limiting screen time; some may find spending more time with friends off the platform enriching.

          It’s a complicated subject, as all of the other ‘evils’ have been, but it’s an interesting one because it’s one of the first I’ve personally seen where kids themselves recognize the harm social media has done to them. Not only have they invented slang that creates social pressure against being constantly online, they have also started to self-organize and engage with government and local authorities (school boards, etc.) to tackle the problem. That kind of self-awareness, combined with action taken at such a young age and at this scale, is unique to social media; the kids who watched a lot of TV and played video games never organized around its harms. Those harms were a narrative created solely by concerned parents.

  • t3rmit3@beehaw.org · 9 days ago

    This is a tough and complex issue, because tech companies using algorithmic curation and control mechanisms to influence kids and adults is a real, truly dangerous problem. But it’s being torn at from all sides by groups pushing their own agendas.

    Allowing large corporations to control and influence our social interactions is a hugely dangerous precedent. Apple and Google and huge telcos may be involved in delivering your text messages, but they don’t curate or moderate them, nor do they send you texts from other people based on how they want you to feel about an issue, or to sell you products. On social media, companies do.

    But you’ve got right-wingers clamoring to strip companies of liability protections for user-generated content, which does not address the issue and is really about letting the government dictate what content is politically acceptable (because LGBTQ+ content is harmful /s and they want companies to censor it).

    And you’ve got neolibs and some extremely misguided progressives pushing for sites that allow UGC (which is by definition all social media) to verify the ages of their users via ID checks (which of course also treat any adult without an accepted form of ID as a child). That just massively benefits large companies that can afford the security infrastructure to run those checks and store that data, kills small and medium platforms, creates name-and-face tracking of people’s online activities, and legally mandates that we hand over even more personal data to corporations…

    …and still doesn’t address the issue of corporations exerting influence algorithmically.

    tl;dr the US is a corporatist hellscape where 90% of politicians serve corporations, either willfully or because they’re trivially manipulated into doing so.

    PS: KOSA just advanced out of committee.

    • Kissaki@beehaw.org · 9 days ago

      The tl;dr doesn’t match the text. Your elaboration is a lot better than your tl;dr.