Authors: Christenson, Chris; Kaikhah, Khosrow
Date accessioned: 2012-02-24
Date available: 2012-02-24
Date issued: 2006-02
Citation: Christenson, C. & Kaikhah, K. (2006). Incremental Evolution of Trainable Neural Networks that are Backwards Compatible. Proceedings of the 5th IASTED International Conference on Artificial Intelligence and Applications (AIA), pp. 222-227.
URI: https://hdl.handle.net/10877/3809
Abstract: Supervised learning has long been used to train artificial neural networks for classification tasks. However, the standard fully connected, layered design is often inadequate for such tasks. We demonstrate that evolution can be used to design an artificial neural network that learns faster and more accurately. Evolving artificial neural networks within a dynamic environment forces the networks to rely on learning. This strategy, combined with incremental evolution, produces an artificial neural network that outperforms the standard fully connected, layered design. The resulting network can learn to perform an entire domain of tasks, including those of reduced complexity. Evolution alone can be used to create a network that performs a single task; real-world environments, however, are dynamic and thus require the ability to adapt to change.
Format: Text, 6 pages, 1 file (.pdf)
Language: en
Keywords: incremental evolution; neural networks; training; backwards compatible; Computer Science
Title: Incremental Evolution of Trainable Neural Networks that are Backwards Compatible
Type: Article
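
The abstract describes scoring evolved networks by how well they learn, and easing them from simpler to harder tasks (incremental evolution). The sketch below is a minimal illustration of that general shape, not the authors' implementation: it uses Python/NumPy, an assumed fixed one-hidden-layer topology, an illustrative OR-then-XOR task sequence, and made-up hyperparameters. Each candidate is briefly trained before its fitness is measured, and the population is carried over when the task changes.

import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in=2, n_hidden=4, n_out=1):
    # Random weights for a small feed-forward network (illustrative architecture).
    return {"W1": rng.normal(0, 1, (n_in, n_hidden)),
            "W2": rng.normal(0, 1, (n_hidden, n_out))}

def forward(net, X):
    h = np.tanh(X @ net["W1"])
    return 1.0 / (1.0 + np.exp(-(h @ net["W2"])))

def train(net, X, y, steps=50, lr=0.5):
    # A short supervised-learning phase; fitness is measured *after* this learning,
    # so selection favours networks that learn well, not ones born with good weights.
    for _ in range(steps):
        h = np.tanh(X @ net["W1"])
        out = 1.0 / (1.0 + np.exp(-(h @ net["W2"])))
        err = out - y                                   # sigmoid + cross-entropy gradient
        net["W2"] -= lr * h.T @ err / len(X)
        dh = (err @ net["W2"].T) * (1 - h ** 2)         # backprop through tanh layer
        net["W1"] -= lr * X.T @ dh / len(X)
    return net

def fitness(net, X, y):
    trained = train({k: v.copy() for k, v in net.items()}, X, y)
    return -np.mean((forward(trained, X) - y) ** 2)     # higher is better

def mutate(net, sigma=0.1):
    return {k: v + rng.normal(0, sigma, v.shape) for k, v in net.items()}

# Incremental evolution: start on an easier task (OR), then move to a harder one
# (XOR); the evolved population is reused rather than reinitialised.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
tasks = [np.array([[0.], [1.], [1.], [1.]]),   # OR (simpler)
         np.array([[0.], [1.], [1.], [0.]])]   # XOR (harder)

population = [init_net() for _ in range(20)]
for y in tasks:
    for generation in range(30):
        scored = sorted(population, key=lambda n: fitness(n, X, y), reverse=True)
        parents = scored[:5]
        population = parents + [mutate(parents[rng.integers(len(parents))])
                                for _ in range(15)]
    print("best post-training fitness on task:",
          max(fitness(n, X, y) for n in population))

The OR-to-XOR progression here only stands in for the paper's notion of a task domain of varying complexity; the actual tasks, network encoding, and evolutionary operators used by the authors are described in the article itself.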