
>To truly predict the next token you have to understand everything in the universe.

What do "truly" and "understand" mean here, when LLMs are known for making errors and for generating results that are semantically plausible but contextually meaningless? Do they truly understand everything in the universe? Or is the appearance of understanding a consequence of that universe being composed of human-created data, so that relevant and meaningful results are statistically more likely than not?

