The idea of adding hallucination to medical advice seems very dangerous.


There’s also a regression-to-the-mean problem, the systems really shouldn’t optimize just for the easier cases. I wonder if that’s a direct tradeoff, I think maybe it is with the kinds of things I see used to tweak out hallucinations.



