Hacker News

After reading this article, my reaction is that yes, transformers are not the optimal model for NLP and reasoning, but they are good enough and scale really well on current GPU hardware. Going forward, I think hardware is the constraint that new architectures will need to accommodate.


