• Basic Glitch@sh.itjust.works
    1 month ago

    when talking about an LLM making someone go off the rails or kill themselves

    The warning would be for LLMs/chat bots that make people kill themselves.

Automated killing systems (like Lavender) are the use of technology as a weapon of mass destruction.

They’re working as intended, and the people who created, enabled, and used them should be held accountable.