I thought the title meant that a chatbot gave bad medical, engineering, and/or safety-critical advice that a human ended up following.

