Mark Zuckerberg says sorry to families of children who committed suicide — after rejecting suggestion to set up a compensation fund to help the families get counseling

CEOs of Meta, TikTok, Snap, Discord, and X testified at hearing on child safety.

  • angelsomething@lemmy.one
    10 months ago

    So if they look like lizard people, and speak like lizard people, and when they blink their eyelids move horizontally, doesn’t that make them lizard people? Bunch of cunts, the lot of them. Especially Zuck. Poison of this world and they know it. And by the way, by lizard people I mean literal people that are so distanced from reality that they may well be from another planet.

  • stoly@lemmy.world
    10 months ago

Why on earth would someone think that this douchenozzle is capable of empathy for humans? He literally stole Facebook because he felt entitled to it, and he had no problem letting governments use it to coordinate genocides THAT HE WAS AWARE OF. No, if there is a hell, this person will be at the top levels of tortured souls, and he fully deserves to suffer.

  • THCDenton@lemmy.world
    10 months ago

    Not a fan of the reptilian, but this isn’t fb’s fault. This is on the abusers, the kids that killed themselves and the careless parents.

    • 31337@sh.itjust.works
      10 months ago

Meta could’ve done a lot of things to prevent this. Internal documents show Zuckerberg repeatedly rejected suggestions to improve child safety. Meta lobbies Congress to prevent any regulation. Meta controls the algorithms and knows they promote bad behavior such as dog-piling, but this bad behavior increases “engagement” and revenue, so they refuse to change it. (Meta briefly changed its algorithms for a few months during the 2020 election to decrease the promotion of disinformation and hate speech, because it was under more scrutiny, then changed them back after the election.)

    • stoly@lemmy.world
      10 months ago

      So what you’re saying is that victims of bullying are the real problem, not the people being bullies.

  • Lutra@lemmy.world
    9 months ago

headline: “We’re still asking some people what they think should be done about the harm they caused.”

Must be nice to get asked what you think you might want to do about it.

  • AutoTL;DR@lemmings.world
    10 months ago

    This is the best summary I could come up with:


    During a Senate Judiciary Committee hearing weighing child safety solutions on social media, Meta CEO Mark Zuckerberg stopped to apologize to families of children who committed suicide or experienced mental health issues after using Facebook and Instagram.

A senator asked Zuckerberg if he had ever apologized and suggested that the Meta CEO personally set up a compensation fund to help the families get counseling.

    Zuckerberg did not agree to set up any compensation fund, but he turned to address families in the crowded audience, which committee chair Dick Durbin (D-Ill.) described as the “largest” he’d ever seen at a Senate hearing.

    Among these bills is the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM).

    When that bill was introduced, it originally promised to make platforms liable for "the intentional, knowing, or reckless hosting or storing of child pornography or making child pornography available to any person.” Since then, Durbin has amended the bill to omit the word “reckless” to prevent platforms from interpreting the law as banning end-to-end encryption, Recorded Future News reported.

    Durbin noted that X became the first social media company to publicly endorse the STOP CSAM Act when X CEO Linda Yaccarino agreed to support the bill during today’s hearing.


    The original article contains 414 words, the summary contains 208 words. Saved 50%. I’m a bot and I’m open source!