I Asked GPT-3 About Covid-19. Its Responses Shocked Me (onezero.medium.com)
4 points by andreyk on Sept 3, 2021 | hide | past | favorite | 3 comments


AI/ML has a need for clean, high-quality data, but I think even an AGI would run into the same problems as human brains.

Which information is accurate? Which claims are blatant lies or unintentional misinformation? What biases might its knowledge have, based on the data it chose to trust?

It's going to be disappointing at best and very dangerous at worst to assume an AGI is infallible and won't run into the same problems human brains run into in the real world.


Too bad it got the same things wrong that the early press coverage did.


Models are a reflection of their data; isn't that absolutely clear at this point? This should not be news to anybody. Applied ML today is, in very large part, collecting high-quality data.
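To make the point concrete: a minimal sketch (not from the thread, all names hypothetical) of the kind of corpus cleaning the comment alludes to, deduplicating exact matches and dropping obviously low-quality records before training.

```python
# Illustrative sketch: simple heuristics for text-corpus cleaning.
# Real pipelines use fuzzier dedup (e.g. MinHash) and richer filters;
# this only shows why "collecting high quality data" is real work.

def clean_corpus(docs, min_words=5):
    """Drop very short documents and exact duplicates (after normalization)."""
    seen = set()
    kept = []
    for doc in docs:
        # Normalize whitespace and case so trivial variants collapse together.
        normalized = " ".join(doc.split()).lower()
        if len(normalized.split()) < min_words:
            continue  # too short to be a useful training example
        if normalized in seen:
            continue  # duplicate of a document we already kept
        seen.add(normalized)
        kept.append(doc)
    return kept

docs = [
    "Covid-19 is caused by the SARS-CoV-2 virus.",
    "covid-19 is caused by the  SARS-CoV-2 virus.",  # duplicate after normalization
    "Bad row",                                       # too short
    "Models trained on noisy data reproduce the noise in their outputs.",
]
print(clean_corpus(docs))  # keeps only the first and last documents
```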



