just me

  • 0 Posts
  • 350 Comments
Joined 1 year ago
Cake day: October 3rd, 2023


  • shneancy@lemmy.world to Memes@lemmy.ml · Ducks · 10 points · 1 day ago

    after the goose comes a swan, which, though bigger, tougher, and stronger, has chilled the hell out a bit

    after the swan then comes the Canadian goose, which, even though it appears to be a return to the goose, actually has the might of a swan and the rage of 100 regular geese



  • honestly, this is not a terrible idea

    if you see someone on the verge of a panic attack, that means they're fully in their head, spiraling - you can try to calm them down the normal way, but you can also try to force them out of their own head and ground them by saying something weird, ideally a question, so their mind can latch onto it. It won't always work, but it might shock them just the right amount to ground them!



  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · English · 5 up, 2 down · 8 days ago

    this is not about wanting, it's about companies taking advantage of vulnerable people who should be grieving. This can cause lasting psychological harm.

    you might as well be saying: if someone came to a drug maker, wanted some heroin, provided the ingredients for heroin, and agreed to whatever costs were involved, isn't that entirely their business?



  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · English · 10 up, 2 down · 8 days ago

    wow, so many reasons

    • to create a mimic of a person you must first destroy their privacy
    • after an AI has devoured everything they've ever written or said on video, it will mimic that person very well, but will most likely still be the legal property of the company that made it
    • in a situation like that you'd then have to pay a subscription to interact with the mimic (because god forbid you ever actually get sold something nowadays)

    now imagine having to pay to talk with a ghost of your loved one - a chatbot that sometimes lets you forget that the actual person is gone, and makes every moment where that illusion breaks all the more painful. A chatbot that denies you grief and traps you in a hell where you can talk with the person you lost, but never touch them, never feel them, never see them grow (or you could pay extra for the chatbot to attend new skill classes you could talk about :)).

    It would make grieving impossible and constantly take advantage of those who "just want to say goodbye". Grief is already hard as it is; widespread mimicry of our dead would make it psychological torture.

    for more information, watch a prediction of our future: the fun sci-fi show Black Mirror, specifically the episode titled Be Right Back (the entire series is episodic, so you don't need to watch from the start)


  • shneancy@lemmy.world to 196@lemmy.blahaj.zone · Rule · 38 up, 4 down · 8 days ago

    “was outed as”

    i’m not defending Mr. Beast here, but from what i’ve read his friend, Ava(?), was only accused of that, without any actual evidence, and even the person she allegedly groomed came out in her defence. Seeing as she’s a trans woman and right wingers are trying very hard to paint every trans person as a pedo, i’m skeptical that those accusations hold much water

    it feels like every other celebrity is getting accused of grooming, which means either it’s a plague, or the definition of “grooming” has drifted to include “an adult who has conversations with teenagers”