
Evolution is not trying to find one particular DNA string out of that exponentially sized set. There are lots of members of that set that are viable organisms.

In any case this is beside the point. The fact remains that genetic programming is a class of algorithms that has not been successful in the slightest. People should stop talking about it as if it were anything other than a complete failure, since that only leads to more wasted human effort. Gradient descent, on the other hand, has been hugely successful at solving a wide variety of real problems.



I'm not advocating GP > nn+grad. I'm just saying nobody knows why nn+grad performs better on practical problems. Also, nn+grad methods were essentially complete failures for the first 3-4 decades of their existence (invented in the 60s; legit results like LeCun's or Hinton's came in the 90s/00s).
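For concreteness, here's a toy sketch of the two paradigms being compared: gradient descent following an analytic gradient, versus a (1+1) mutation hill-climber, about the simplest GA-flavored search, which uses only loss evaluations. The objective and all parameter values are made up for illustration; a convex quadratic says nothing about which method wins on real, non-convex training losses.

```python
import random

# Toy objective: a smooth convex loss (stand-in for a training loss).
def loss(x):
    return sum((xi - 3.0) ** 2 for xi in x)

def grad(x):
    return [2.0 * (xi - 3.0) for xi in x]

# Gradient descent: follow the analytic gradient downhill.
def gradient_descent(x, lr=0.1, steps=200):
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# (1+1) mutation hill-climber: propose a Gaussian mutation,
# keep it only if the loss improves. No gradient information.
def hill_climb(x, sigma=0.5, steps=2000, seed=0):
    rng = random.Random(seed)
    best, best_loss = x, loss(x)
    for _ in range(steps):
        cand = [xi + rng.gauss(0, sigma) for xi in best]
        cand_loss = loss(cand)
        if cand_loss < best_loss:
            best, best_loss = cand, cand_loss
    return best

x0 = [10.0, -10.0]
gd = gradient_descent(x0)
hc = hill_climb(x0)
print(loss(gd), loss(hc))
```

On this convex toy both converge; the debate in the thread is precisely about why the gradient-based approach keeps winning on hard, high-dimensional, non-convex problems where neither has any such guarantee.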


This is wrong on so many levels (sadly quite characteristic of the GA crowd) that I can only suggest reading up on the background and the math of it. Good keywords would be stochastic gradient descent, spin glasses, and renormalization groups.


Whether nns have an interpretation as spin systems is completely nonresponsive to the hardness of training them. I.e., minimizing "free energy" in spin systems is just as NP-hard as minimizing training error in nns.



