relation: https://khub.utp.edu.my/scholars/15270/
title: Vision-Based Autonomous Navigation Approach for a Tracked Robot Using Deep Reinforcement Learning
creator: Ejaz, M.M.
creator: Tang, T.B.
creator: Lu, C.-K.
description: Tracked robots must steer safely and autonomously in varied, changing environments. This article proposes a novel end-to-end network architecture through which a tracked robot learns collision-free autonomous navigation via deep reinforcement learning. Specifically, the learning time and exploratory behaviour of the robot were improved by normalizing the input data and injecting parametric noise into the network parameters. Deep convolutional neural networks extracted features from four consecutive depth images, and these features were used to drive the tracked robot. In addition, three Q-variant models were compared in terms of average reward, variance, and dispersion across episodes, and a detailed statistical analysis was performed to measure the reliability of all the models. The proposed model, a layer normalisation dueling double deep Q-network (LND3QN), was superior in all environments and could be transferred directly to a real robot without any fine-tuning after training in a simulation environment. It also demonstrated outstanding performance in several cluttered real-world environments with both static and dynamic obstacles.
publisher: Institute of Electrical and Electronics Engineers Inc.
date: 2021
type: Article
type: PeerReviewed
identifier: Ejaz, M.M. and Tang, T.B. and Lu, C.-K. (2021) Vision-Based Autonomous Navigation Approach for a Tracked Robot Using Deep Reinforcement Learning. IEEE Sensors Journal, 21 (2). pp. 2230-2240. ISSN 1530-437X
relation: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85098142899&doi=10.1109%2fJSEN.2020.3016299&partnerID=40&md5=485df683ab0c1b1d1a6d2bb15a4d1b8a
relation: 10.1109/JSEN.2020.3016299
identifier: 10.1109/JSEN.2020.3016299
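The abstract names two standard DRL building blocks behind LND3QN: layer normalisation of features and the dueling aggregation of a double deep Q-network. The following is a minimal NumPy sketch of those two pieces only; all shapes, weights, and hyperparameters (feature size 64, 5 actions, random weights) are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 64-dim feature vector (as if extracted by a CNN
# from four stacked depth images) and 5 discrete steering actions.
n_actions = 5
features = rng.standard_normal(64)

def layer_norm(x, eps=1e-5):
    # Layer normalisation: rescale the feature vector to zero mean,
    # unit variance (learnable gain/bias omitted for brevity).
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def dueling_q(features, w_v, w_a):
    # Dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)
    h = layer_norm(features)
    v = h @ w_v                 # scalar state value V(s)
    a = h @ w_a                 # advantage A(s, a) per action
    return v + (a - a.mean())

# Illustrative random weights for the online and target networks.
w_v, w_a = rng.standard_normal(64), rng.standard_normal((64, n_actions))
w_v_t, w_a_t = rng.standard_normal(64), rng.standard_normal((64, n_actions))

q_online = dueling_q(features, w_v, w_a)
q_target = dueling_q(features, w_v_t, w_a_t)

# Double-DQN target: the online network selects the greedy action,
# the target network evaluates it (reduces overestimation bias).
best_action = int(np.argmax(q_online))
reward, gamma = 1.0, 0.99
td_target = reward + gamma * q_target[best_action]
```

In a real agent the next-state features would replace `features` when computing `q_target`, and the weights would be trained by minimising the TD error against `td_target`.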