TY - JOUR
TI - Deep Neural Networks meet computation offloading in mobile edge networks: Applications, taxonomy, and open issues
A1 - Mustafa, E.
A1 - Shuja, J.
A1 - Rehman, F.
A1 - Riaz, A.
A1 - Maray, M.
A1 - Bilal, M.
A1 - Khan, M.K.
Y1 - 2024///
JF - Journal of Network and Computer Applications
VL - 226
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-85191587173&doi=10.1016%2fj.jnca.2024.103886&partnerID=40&md5=ef65fd26cc57d5521f5d839c875e2111
AV - none
N2 - Mobile Edge Computing (MEC) is a modern paradigm that involves moving computing and storage resources closer to the network edge, reducing latency and enabling innovative, delay-sensitive applications. Within MEC, computation offloading refers to the process of transferring computationally intensive tasks or processes from mobile devices to edge servers, optimizing the performance of mobile applications. Traditional numerical optimization methods for computation offloading often require numerous iterations to attain optimal solutions. In this paper, we provide a tutorial on how Deep Neural Networks (DNNs) resolve the challenges of computation offloading. The article explores various applications of DNNs in computation offloading, encompassing channel estimation, caching, AR and VR applications, resource allocation, mode selection, unmanned aerial vehicles (UAVs), and vehicle management. We present a comprehensive taxonomy that categorizes these applications and offer an overview of existing schemes, comparing their effectiveness. Additionally, we outline the open research issues that can be addressed through the application of DNNs in MEC offloading. We also highlight specific challenges related to DNN utilization in computation offloading. In conclusion, we affirm that DNNs are widely acknowledged as invaluable tools for optimizing computation offloading in MEC. © 2024 Elsevier Ltd
N1 - cited By 0
KW - Antennas; Computation offloading; Delay-sensitive applications; Mobile edge computing; Numerical methods; Optimization; Reinforcement learning; Taxonomies
KW - Computation offloading; Computing resource; Delay-sensitive applications; EDGE Networks; Edge server; Modern paradigms; Network applications; Network edges; Reinforcement learnings; Storage resources
KW - Deep neural networks
ID - scholars19632
ER -