
You are evolving the topology, but using regular gradient descent/backprop for any given network, correct?


No, in NEAT both the weights and topology are evolved. It is totally gradient-free.
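To make that concrete, here's a minimal sketch (not the actual NEAT implementation, just a simplified illustration) of gradient-free mutation: weights are perturbed with Gaussian noise, and topology changes by occasionally adding a connection or splitting one with a new node. The genome encoding and rates here are made up for the example; real NEAT also tracks innovation numbers for crossover.

```python
import random

# Hypothetical simplified genome: a list of connection genes
# [in_node, out_node, weight, enabled]. Both weights and topology
# change via mutation alone -- no gradients anywhere.

def mutate(genome, next_node_id, weight_sigma=0.5,
           add_conn_rate=0.05, add_node_rate=0.03):
    genome = [list(g) for g in genome]  # work on a copy

    # Weight mutation: perturb every weight with Gaussian noise.
    for gene in genome:
        gene[2] += random.gauss(0.0, weight_sigma)

    # Structural mutation: occasionally add a connection between existing nodes.
    if random.random() < add_conn_rate:
        nodes = sorted({n for g in genome for n in (g[0], g[1])})
        a, b = random.sample(nodes, 2)
        genome.append([a, b, random.gauss(0.0, 1.0), True])

    # Structural mutation: occasionally split an enabled connection
    # by inserting a new node on it.
    enabled = [g for g in genome if g[3]]
    if enabled and random.random() < add_node_rate:
        old = random.choice(enabled)
        old[3] = False  # disable the split connection
        genome.append([old[0], next_node_id, 1.0, True])     # in -> new node
        genome.append([next_node_id, old[1], old[2], True])  # new node -> out
        next_node_id += 1

    return genome, next_node_id
```

Selection then just keeps the fitter mutated genomes; no loss surface, no backprop.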


Yeah, topology and weights. It's highly sensitive to initial conditions. You almost need another NEAT network to evolve the initial conditions. I believe it's turtles all the way down.



