RSS Bot@lemmy.bestiver.se to Hacker News@lemmy.bestiver.se · 1 month ago
AI Error May Have Contributed to Girl's School Bombing in Iran (thisweekinworcester.com)
31 comments
Basic Glitch@sh.itjust.works · 1 month ago

> when talking about a LLM making someone go off the rails or killing themselves

The warning would be for LLMs/chatbots that make people kill themselves. Automated killing systems (like Lavender) are the use of technology as a weapon of mass destruction. They are working as intended, and the people who created, enabled, and used them should be held accountable.
NihilsineNefas@slrpnk.net · 1 month ago

No arguments from me. Even if the companies developing these programs are one and the same.