TY - CONF
A1 - Naulia, P.S.
A1 - Watada, J.
A1 - Aziz, I.B.A.
A1 - Roy, A.
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-85112411080&doi=10.1109%2fICCOINS49721.2021.9497147&partnerID=40&md5=96dcbde660cad3083340b0c72fd84a4b
EP - 356
Y1 - 2021///
PB - Institute of Electrical and Electronics Engineers Inc.
SN - 9781728171517
N1 - cited By 1; Conference of 6th International Conference on Computer and Information Sciences, ICCOINS 2021; Conference Date: 13 July 2021 Through 15 July 2021; Conference Code: 170762
N2 - In recent years, Deep Learning has demonstrated the ability to produce much better results than other Machine Learning techniques. Much of the challenge in Deep Learning lies in optimizing the weights and several hyperparameters, as this requires a lot of computation and time. Gradient descent is currently the most popular technique for weight optimization in back propagation. Most existing implementations of Convolutional Neural Networks / Deep Learning networks play a pivotal role in image processing. However, back propagation with gradient descent converges slowly and is easily trapped in local minima; these are its inherent disadvantages. For this reason, we explore an alternative optimization using Meta-Heuristic Algorithms, such as the Genetic Algorithm, in the Deep Learning algorithm. © 2021 IEEE.
ID - scholars14712
TI - A GA approach to Optimization of Convolution Neural Network
SP - 351
KW - Convolution; Deep learning; Genetic algorithms; Gradient methods; Heuristic algorithms; Image processing; Learning systems; Neural networks
KW - Convolution neural network; Gradient descent; Hyper-parameter; Learning network; Local minimums; Machine learning techniques; Meta heuristic algorithm
KW - Learning algorithms
AV - none
ER -