
CarlosFernandez

1 post · A member registered Sep 11, 2018

Recent community posts

This is actually a very common problem in machine learning, including neural networks. I think what's happening here is that your neural network has reached an optimal (or close to optimal) combination of weights. So in the next generation, when the weight values are slightly "mutated" and recombined, that optimal combination is being "overshot."
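
To make that concrete, here's a minimal Python sketch of the kind of Gaussian weight mutation a neuroevolution setup typically uses (this is a generic illustration, not the code from your game; `rate` and `sigma` are hypothetical parameters):

```python
import random

def mutate(weights, rate=0.1, sigma=0.5):
    """Gaussian mutation: each weight has a `rate` chance of being
    perturbed by noise drawn from N(0, sigma). The larger sigma is,
    the farther a child's weights land from the parent's near-optimal
    ones -- which is exactly how a good solution gets "overshot"."""
    return [w + random.gauss(0.0, sigma) if random.random() < rate else w
            for w in weights]
```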

One way to think of this is to imagine you're trying to find the lowest point in a landscape. At 13.35, you've just about reached the lowest point possible (or at least come very close to it). So each generation after that, you're stepping over, or "overshooting," that minimum point.
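
You can see the overshooting numerically with a toy landscape (purely illustrative numbers, assuming a simple f(x) = x² with its minimum at 0):

```python
import random

def f(x):
    return x * x  # toy fitness landscape, minimum at x = 0

parent = 0.05   # already very close to the minimum
sigma = 0.5     # mutation step much larger than the remaining gap

children = [parent + random.gauss(0.0, sigma) for _ in range(1000)]
improved = sum(f(c) < f(parent) for c in children)
print(f"children better than parent: {improved}/1000")
# Because sigma dwarfs the parent's distance to the minimum, nearly
# every child steps over it, so only a small fraction improve and
# progress stalls.
```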

What you could do, to get even closer to that minimum point (or in this case, optimum fitness), is lower the mutation rate. That'll eventually get you faster speeds, but the downside is that reaching those speeds will typically take more generations.
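
One common way to get the best of both (again, just a sketch with made-up numbers, not something your tool necessarily exposes): start with a large mutation step for broad exploration and shrink it each generation, so late generations make the fine adjustments that settle into the optimum:

```python
def annealed_sigma(generation, sigma0=0.5, decay=0.95):
    """Decay the mutation step size over generations: early search is
    broad, late search is fine-grained, so the population can settle
    into the optimum instead of repeatedly stepping over it."""
    return sigma0 * decay ** generation

# e.g. generation 0 -> 0.5, generation 50 -> ~0.038
```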