• palordrolap@kbin.social
    7 months ago

    There are already stories about companies being sued because their AI gave advice that caused customers to act in a manner detrimental to themselves. (Something about plane ticket refunds being available, if I remember correctly.)

    Then when the customer contacted the company to complain (and perhaps get the promised refund), they were told no such policy existed at the company. The customer had screenshots. The AI had wholesale hallucinated a policy.

    We all know how this is going to go: AI left, right and centre, until hallucination lawsuits cost companies more than employing people to do the job would.

    And all the while they’ll be lobbying (read: bribing) government representatives to make AI hallucination lawsuits not a thing. Or less of a thing.

    • asdfasdfasdf@lemmy.world
      6 months ago

      On the other hand, are you implying that human call centre workers are accurate in what they tell customers, and that when they make mistakes the companies will own up to them and honour them?

      • zbyte64@awful.systems
        6 months ago

        I mean, that’s generally how it is now. If a rep lied to me, then you’d better believe I’m talking to the manager and going to extract some concession. The difference is you can hold a rep accountable; dunno how you do that with AI.