Hacker News

I really hope that BNNS is pronounced "bananas." But more substantially, I'm not convinced that neural nets have a well-defined set of required subroutines yet. Because of backprop, it's not just about what the blocks are, but also about how you combine them. Adding recursion changes the problem. Or a paper like the one on stochastic computation graphs comes along and does the same. I like the idea, but I expect it's still too early for something analogous to BLAS to stick.
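To make the backprop point concrete, here's a minimal sketch (in NumPy, with made-up function names, not any real BNNS or BLAS API): a BLAS-level routine only needs the forward computation, but a neural-net "subroutine" also has to expose a backward pass, and the backward passes of composed blocks must run in reverse order under the chain rule. The interface isn't just the op; it's the op plus how it composes.

```python
import numpy as np

def linear_forward(x, W):
    """Forward: y = x @ W -- this part alone is just a GEMM call."""
    return x @ W

def linear_backward(x, W, grad_y):
    """Backward: gradients w.r.t. input and weights via the chain rule."""
    grad_x = grad_y @ W.T    # dL/dx
    grad_W = x.T @ grad_y    # dL/dW
    return grad_x, grad_W

# Compose two blocks forward, then unwind them in reverse for backprop.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
W1 = rng.standard_normal((3, 5))
W2 = rng.standard_normal((5, 2))

h = linear_forward(x, W1)
y = linear_forward(h, W2)

grad_y = np.ones_like(y)                        # pretend dL/dy = 1
grad_h, grad_W2 = linear_backward(h, W2, grad_y)
grad_x, grad_W1 = linear_backward(x, W1, grad_h)
```

The forward half really is BLAS territory; it's the backward half, and the ordering constraint it imposes on composition, that a fixed subroutine list struggles to pin down.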

