Hacker Times | new | past | comments | ask | show | jobs | submit

What's interesting is that I've run it several times now, and about half the time I get the perfect bird in an early generation (<10), but sometimes it just doesn't find one for a while. I'm wondering if the net is right at the threshold: just enough neurons that a random initialization can stumble on a solution by chance.


I was surprised to get a winner on generation 7 the first time I started the trainer. Subsequent trials were much worse.


I was surprised that I got one that got really far, and then in the next generation it failed right at the start.


In a GA approach, you'd have kept that bird AND mutated/bred offspring from it. In a NN, you often get oscillation during training.


This is a GA. I just think it doesn't practice elitism (saving the best solutions found so far). Or the evaluation is nondeterministic.
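For reference, elitism in a GA just means copying the top few genomes into the next generation untouched, so the best score can never regress. A minimal sketch in Python (the function names, genome representation, and parameters here are illustrative, not taken from the linked trainer):

```python
import random

def next_generation(population, fitness, elite_k=2, mutation_rate=0.1):
    """Breed the next generation, keeping the top elite_k genomes unchanged.

    `population` is a list of genomes, each a list of floats (e.g. NN weights).
    All names/parameters are hypothetical, for illustration only.
    """
    ranked = sorted(population, key=fitness, reverse=True)
    elites = ranked[:elite_k]  # elitism: best genomes survive verbatim

    children = []
    while len(children) < len(population) - elite_k:
        # breed from the top half of the ranked population
        a, b = random.sample(ranked[:len(ranked) // 2], 2)
        # uniform crossover: pick each gene from one parent at random
        child = [random.choice(pair) for pair in zip(a, b)]
        # gaussian mutation on a fraction of genes
        child = [g + random.gauss(0, 0.5) if random.random() < mutation_rate else g
                 for g in child]
        children.append(child)
    return elites + children
```

With elitism, a bird that reaches a record score is guaranteed a slot in the next generation, so the "great bird, then instant failure" pattern above can only come from a nondeterministic fitness function, not from losing the genome.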


The problem seems to be the map.

It's different every run, and you can reach 1000 points if you get lucky with the map. But the bird that reached 1000 might fail on a gap that never showed up in the map it was scored on.
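One standard way to reduce that kind of evaluation noise is to score each bird as its average over several randomly seeded maps, so a lucky single run doesn't promote a weak genome. A sketch, assuming a hypothetical `play_one_run(bird, seed)` hook into the game loop (not part of the linked trainer):

```python
import random
from statistics import mean

def robust_fitness(bird, play_one_run, n_maps=5):
    """Score a bird as its mean score over several randomly seeded maps.

    `play_one_run(bird, seed)` is a hypothetical hook that plays one game
    on the map generated from `seed` and returns the score. Averaging over
    n_maps runs trades evaluation time for a less noisy fitness signal.
    """
    return mean(play_one_run(bird, seed=random.randrange(10**6))
                for _ in range(n_maps))
```

The trade-off is that each generation takes `n_maps` times as long to evaluate, but the selection pressure then favors birds that generalize across maps rather than ones that got an easy layout.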



