Supervised Data in the Genetics and Backpropagation Learning Algorithms


References

Amini, A. (2018, February 01). MIT 6.S191: Introduction to deep learning. Retrieved January 6, 2019, from https://www.youtube.com/watch?v=JN6H4rQvwgY

Dietterich, T. (1995). Overfitting and undercomputing in machine learning. ACM Computing Surveys, 27(3), 326-327. doi:10.1145/212094.212114

Gendler, A. (2016, April 25). The Turing Test: Can a computer pass for a human? Retrieved September 11, 2018, from https://www.youtube.com/watch?v=3wLqsRLvV-c

Glorot, X., & Bengio, Y. (2010). Understanding the difficulty of training deep feedforward neural networks. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 249-256.

Goldberg, D. E. (1989). Genetic algorithms in search, optimization and machine learning. Reading, MA: Addison-Wesley.

Gregg, B. (2013). Table 2.2. In Systems performance: Enterprise and the cloud (1st ed.). Prentice Hall.

Grimson, E. (2017, May 19). 11. Introduction to machine learning. Retrieved November 27, 2018, from https://www.youtube.com/watch?v=h0e2HAPTGF4

Larochelle, H. (2016, October 12). The deep end of deep learning. Retrieved December 28, 2018, from https://www.youtube.com/watch?v=dz_jeuWx3j0

LeBlanc, A. (2015, January 12). Artificial intelligence and the future. Retrieved December 18, 2018, from https://www.youtube.com/watch?v=xH_B5xh42xc

Leedy, P. D., & Ormrod, J. E. (2016). Practical research: Planning and design (11th ed.). Pearson Education.

Loy, J. (2018, May 14). How to build your own Neural Network from scratch in Python. Towards Data Science. Retrieved October 2, 2018, from https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6

Mallawaarachchi, V. (2017). Introduction to genetic algorithms: Including example code. Towards Data Science. Retrieved January 19, 2019, from https://towardsdatascience.com/introduction-to-genetic-algorithms-including-example-code-e396e98d8bf3

Mansor, M. A., & Sathasivam, S. (2016). Activation function comparison in neural-symbolic integration. American Institute of Physics. doi:10.1063/1.4954526

McCaffrey, J. D. (2014). Neural Networks Using C# Succinctly. Retrieved January 16, 2019, from https://www.syncfusion.com/ebooks/neuralnetworks/neural-networks

Reptile [Def. 2]. (n.d.). In Merriam-Webster. Retrieved March 3, 2019, from https://www.merriam-webster.com/dictionary/reptile

Mitchell, M. (2018, March 12). How we can build AI to help humans, not hurt us. Retrieved September 12, 2018, from https://www.youtube.com/watch?v=twWkGt33X_k

Ramachandran, P., Zoph, B., & Le, Q. V. (2017). Searching for activation functions. arXiv. Retrieved December 12, 2018, from https://arxiv.org/pdf/1710.05941.pdf

Raval, S. (2017, May 26). Which activation function should I use? Retrieved December 19, 2018, from https://www.youtube.com/watch?v=-7scQpJT7uo

Raval, S. (2017, January 13). Intro to deep learning #1. Retrieved January 2, 2019, from https://www.youtube.com/watch?v=vOppzHpvTiQ

Raval, S. (2017, January 20). Intro to deep learning #2. Retrieved January 3, 2019, from https://www.youtube.com/watch?v=p69khggr1Jo

Raval, S. (2018, July 08). Backpropagation explained. Retrieved January 5, 2019, from https://www.youtube.com/watch?v=FaHHWdsIYQg

Reed, R., & Marks, R. J. (1999). Learning Rate and Momentum. Neural Smithing. doi:10.7551/mitpress/4937.003.0007

Richter, J. (2012). CLR via C# (4th ed.). Microsoft Press.

Rosenblatt, F. (1962). Principles of neurodynamics. New York: Spartan Books. Cf. Rumelhart, D. E., McClelland, J. L., & the PDP Research Group (1986). Parallel distributed processing (Vols. 1-2). Cambridge, MA: MIT Press.

Sanderson, G. (2017, November 03). Backpropagation calculus: Deep learning, chapter 4. Retrieved October 3, 2018, from https://www.youtube.com/watch?v=tIeHLnjs5U8

Sanderson, G. (2017, October 05). But what is a Neural Network? Deep learning, chapter 1. Retrieved September 24, 2018, from https://www.youtube.com/watch?v=aircAruvnKk

Sanderson, G. (2017, October 16). Gradient descent, how Neural Networks learn: Deep learning, chapter 2. Retrieved September 24, 2018, from https://www.youtube.com/watch?v=IHZwWFHWa-w

Sanderson, G. (2017, November 03). What is backpropagation really doing? Deep learning, chapter 3. Retrieved September 25, 2018, from https://www.youtube.com/watch?v=Ilg3gGewQ5U&t=24s&list=WL&index=7

SethBling. (2015, June 13). MarI/O: Machine learning for video games. Retrieved October 31, 2018, from https://www.youtube.com/watch?v=qv6UVOQ0F44

Shah, D. (2017). Activation functions. Towards Data Science. Retrieved January 2, 2019, from https://towardsdatascience.com/activation-functions-in-neural-networks-58115cda9c96

Skeet, J. (2014). C# in depth (3rd ed.). Shelter Island, NY: Manning.

Strauss, A., & Corbin, J. (1994). Grounded theory methodology. Handbook of Qualitative Research, 17, 273-285. Retrieved November 29, 2018, from http://www.depts.ttu.edu/education/our-people/Faculty/additional_pages/duemer/epsy_5382_class_materials/Grounded-theory-methodology.pdf

Tyka, M. (2015, December 07). The art of Neural Networks. Retrieved January 2, 2019, from https://www.youtube.com/watch?v=0qVOUD76JOg

Winston, P. H. (2010, Fall). Lecture 6: Search: Games, minimax, and alpha-beta. Retrieved October 9, 2018, from https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010/lecture-videos/lecture-6-search-games-minimax-and-alpha-beta

Zell, A. (1997). Simulation neuronaler Netze [Simulation of neural networks]. Bonn: Addison-Wesley.

Appendix A

Glossary of Terms and Definitions

Activation Function: A function used in all neurons within a Neural Network to constrain the output of each neuron to a range that is useful to the programmer (McCaffrey, 2014; Raval, “Which activation function should I use?” 2017; Shah, 2017).
Computer Logic Gate: An electronic component that performs a logical function.
Data Structure: A method of organizing and managing data in software that allows for efficient access and modification.
Dendrite: A connection between two neurons within a Neural Network. Each dendrite has a weight, which is used in the calculation of a neuron’s value (McCaffrey, 2014; Sanderson, “What is backpropagation really doing? Deep learning, chapter 3” 2017; Sanderson, “Backpropagation calculus: Deep learning, chapter 4” 2017).
Error: The overall deviation of the Neural Network from its desired outputs. The error is calculated as the average squared difference between the Neural Network’s predicted values and the corresponding correct values (see the first sketch following this glossary).
Epoch/Generation: One complete pass in which a Neural Network revisits each of the given training data points while training with a learning algorithm.
Feed Forward: The process in which a Neural Network passes values through its neurons to compute the overall output of the Neural Network (Zell, 1997; see the first sketch following this glossary).
Fitness Rating: The rating given to a specific Neural Network that measures how well it completed the given task.
Fitness Calculation: A calculation that the Genetics Learning Algorithm performs to find each Neural Network’s fitness rating. In this research project, the fitness rating was calculated as the Neural Network’s error, or deviation from the desired output(s).
Learning Algorithms: Computational formulas used in machine learning to help the technology imitate the human learning process (Grimson, 2017).
Learning/Training Process: The process in which a Neural Network’s dendrites are manipulated by a Learning Algorithm. This process is what “teaches” the Neural Network, allowing it to complete tasks with minimal error.
Learning Rate: The constant that the Backpropagation Learning Algorithm uses to scale its dendrite weight updates, lowering the magnitude of its steps toward a lower error (Sanderson, “Backpropagation calculus: Deep learning, chapter 4” 2017).
Machine Learning: A method of data analysis that uses computers to identify patterns and make decisions with minimal human intervention (Grimson, 2017).
Momentum Rate: The multiplier used to limit the influence of the previous dendrite weight update on the new dendrite weight update (see the second sketch following this glossary).
Neural Network: A data structure that acts as a mathematical function mapping specific inputs to corresponding outputs (Loy, 2018; McCaffrey, 2014; Zell, 1997).
Overfitting: The phenomenon in machine learning where a learning algorithm fits a limited set of data points too closely, hindering it from capturing the underlying pattern and thus from predicting future points in the series (Dietterich, 1995).
Plateau: A period of time in which the error of a learning algorithm stagnates despite continued training.
Training Data: Data that the learning algorithm uses to train the Neural Network.
Supervised Data: Data that is labeled with the correct output that the Neural Network is expected to predict.
Weight Update: A value calculated by the Backpropagation Learning Algorithm that identifies the direction and distance each dendrite’s weight should move to minimize the Neural Network’s error (see the second sketch following this glossary).
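To make the feed forward, error, and fitness terms above concrete, the following is a minimal C# sketch. It is illustrative only, not the project’s actual implementation; the names Sigmoid, FeedForwardNeuron, and MeanSquaredError are assumptions made for this example.

using System;

public static class FeedForwardSketch
{
    // Activation function: constrains a neuron's output to the range (0, 1).
    public static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    // Feed forward for a single neuron: a weighted sum of the inputs
    // (one weight per dendrite) passed through the activation function.
    public static double FeedForwardNeuron(double[] inputs, double[] weights, double bias)
    {
        double sum = bias;
        for (int i = 0; i < inputs.Length; i++)
            sum += inputs[i] * weights[i];
        return Sigmoid(sum);
    }

    // Error: the average squared difference between the network's predicted
    // values and the corresponding correct values. Per the glossary, this same
    // value served as the fitness rating in the Genetics Learning Algorithm.
    public static double MeanSquaredError(double[] predicted, double[] expected)
    {
        double total = 0.0;
        for (int i = 0; i < predicted.Length; i++)
        {
            double difference = predicted[i] - expected[i];
            total += difference * difference;
        }
        return total / predicted.Length;
    }
}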
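Similarly, the learning rate, momentum rate, and weight update definitions combine into a single update rule. The sketch below assumes the gradient of the error with respect to a dendrite’s weight has already been computed by backpropagation; the method name and parameters are illustrative, not taken from the project source.

public static class WeightUpdateSketch
{
    // Computes a new dendrite weight. The learning rate scales the step toward
    // a lower error, and the momentum rate carries over a fraction of the
    // previous weight update.
    public static double UpdateWeight(double weight, double errorGradient,
        double previousDelta, double learningRate, double momentumRate,
        out double newDelta)
    {
        newDelta = -learningRate * errorGradient + momentumRate * previousDelta;
        return weight + newDelta;
    }
}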

Appendix B

XOR Test Results

Backpropagation Learning Algorithm | Genetics Learning Algorithm

Test Number | Epoch Count | Time (Milliseconds) | Test Number | Generation Count | Time (Milliseconds)
1 | 3656 | 249.6213 | 1 | 209 | 305.5349
2 | 5582 | 284.6728 | 2 | 432 | 620.5403
3 | 2819 | 171.0377 | 3 | 102 | 214.5134
4 | 3387 | 251.8245 | 4 | 164 | 318.0572
5 | 9455 | 530.5284 | 5 | 568 | 996.9213
6 | 6509 | 401.9979 | 6 | 766 | 1286.4294
7 | 3153 | 223.0116 | 7 | 118 | 236.9378
8 | 3261 | 163.9362 | 8 | 227 | 340.9875
9 | 6003 | 301.9228 | 9 | 162 | 251.1158
10 | 4557 | 233.7867 | 10 | 118 | 185.4179
11 | 10691 | 549.0612 | 11 | 177 | 279.5639
12 | 6955 | 369.4169 | 12 | 367 | 612.7986
13 | 4015 | 209.9101 | 13 | 134 | 221.8533
14 | 5147 | 280.4362 | 14 | 200 | 293.5432
15 | 7369 | 399.9776 | 15 | 174 | 278.4152
16 | 5228 | 296.4309 | 16 | 207 | 330.7115
17 | 3418 | 181.2077 | 17 | 120 | 177.5634
18 | 12543 | 647.6394 | 18 | 260 | 376.7631
19 | 11302 | 600.0225 | 19 | 134 | 203.6898
20 | 3614 | 191.3684 | 20 | 127 | 189.2223
21 | 9749 | 458.3669 | 21 | 228 | 335.4027
22 | 4957 | 269.0165 | 22 | 117 | 185.0294
23 | 6783 | 353.1238 | 23 | 195 | 292.0773
24 | 7511 | 407.3662 | 24 | 121 | 192.3093
25 | 8032 | 454.6105 | 25 | 126 | 223.1806
26 | 8198 | 456.6348 | 26 | 141 | 227.1523
27 | 5656 | 294.3335 | 27 | 101 | 169.9704
28 | 5165 | 262.7309 | 28 | 1435 | 1863.1769
29 | 5968 | 313.0351 | 29 | 139 | 231.7074
30 | 4831 | 235.9236 | 30 | 106 | 214.5134

Summary Statistic | Epoch Count | Time (Milliseconds) | Generation Count | Time (Milliseconds)
Minimum | 2819 | 163.9362 | 101 | 169.9704
1st Quartile | 4150.5 | 239.348025 | 122.25 | 214.5134
Median | 5619 | 295.3822 | 163 | 264.7655
3rd Quartile | 7475.5 | 406.024125 | 222 | 334.2299
Maximum | 12543 | 647.6394 | 1435 | 1863.1769

Appendix C

Reptile Classification Test Results

Training Time (Milliseconds) | Backpropagation Prediction Percent | Genetics Prediction Percent
0 | 61.54 | 53.85
15 | 46.15 | 61.54
30 | 92.31 | 61.54
45 | 92.31 | 61.54
60 | 92.31 | 61.54
75 | 100 | 61.54
90 | 100 | 61.54
105 | 100 | 61.54
120 | 100 | 61.54
135 | 100 | 61.54
150 | 100 | 69.23
165 | 100 | 69.23
180 | 100 | 61.54
195 | 100 | 61.54
210 | 100 | 61.54
225 | 100 | 92.31
240 | 100 | 92.31
255 | 100 | 92.31
270 | 100 | 100
285 | 100 | 100
300 | 100 | 100
315 | 100 | 92.31
330 | 100 | 92.31
345 | 100 | 100
360 | 100 | 100
375 | 100 | 100
390 | 100 | 100
405 | 100 | 100
420 | 100 | 100

Appendix D

Activation Function Definitions

Sigmoid Function: f(x) = 1 / (1 + e^(-x))

Soft Sign Function: f(x) = x / (1 + |x|)
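For reference, both functions can be expressed in a few lines of C#. This is an illustrative sketch rather than the project’s source code:

using System;

public static class ActivationFunctions
{
    // Sigmoid: maps any real input into the range (0, 1).
    public static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    // Soft sign: maps any real input into the range (-1, 1).
    public static double SoftSign(double x) => x / (1.0 + Math.Abs(x));
}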

Appendix E

Project Source Code

The project source code written to conduct this research has been publicly shared through GitHub for other researchers to access. The following link leads to the code, which was published under Amitai’s GitHub account.

https://github.com/Amitai5/Backprop-VS-Genetics