Optimizing Energy Efficiency in Vehicular Edge-Cloud Networks Through Deep Reinforcement Learning-Based Computation Offloading
The Vehicular Edge-Cloud Computing (VECC) paradigm has emerged as a viable approach to overcoming the inherent resource limitations of vehicles by offloading computationally demanding tasks to remote servers. Despite its potential, existing offloading strategies often result in increased latency and sub-optimal performance due to the concentration of workloads on a limited number of connected Roadside Units (RSUs). Moreover, ensuring data security and optimizing energy usage within this framework remain significant challenges. To address these concerns, this paper proposes a comprehensive framework for VECC systems. A novel load-balancing algorithm is proposed to effectively redistribute vehicles among RSUs, considering factors such as RSU load, computational capacity, and data rate. Additionally, a robust security mechanism is incorporated using the Advanced Encryption Standard (AES) in conjunction with Electrocardiogram (ECG) signals as encryption keys to enhance data protection during transmission. To further improve system efficiency, a novel caching strategy is introduced, enabling edge servers to store completed tasks, which in turn reduces both latency and energy consumption. An optimization model is also proposed to minimize energy expenditure while ensuring that latency constraints are satisfied during computation offloading. Given the complexity of this problem in large-scale vehicular networks, the study formulates an equivalent reinforcement learning model and employs a deep learning algorithm to derive optimal solutions. Simulation results conclusively demonstrate that the proposed model significantly outperforms existing benchmark techniques in terms of energy savings.
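The energy/latency trade-off at the heart of the offloading decision can be illustrated with a toy model. The sketch below is not taken from the paper: it compares the vehicle's estimated energy for executing a task locally against offloading it to an RSU-attached edge server, picks the cheaper option that still meets the latency deadline, and treats a cached result at the edge as free. All names and numbers (`KAPPA`, `F_LOCAL`, `F_EDGE`, `P_TX`, `RATE`, and the `Task` fields) are illustrative assumptions; the article defines its own optimization model and solves it with deep reinforcement learning.

```python
from dataclasses import dataclass

# Illustrative constants (assumptions, not values from the paper).
KAPPA = 1e-27      # effective switched-capacitance coefficient (J per cycle per Hz^2)
F_LOCAL = 1.0e9    # vehicle CPU frequency in Hz
F_EDGE = 10.0e9    # edge-server CPU frequency in Hz
P_TX = 0.5         # vehicle transmit power in watts
RATE = 20e6        # uplink data rate in bits per second

@dataclass
class Task:
    data_bits: float    # input size to upload
    cycles: float       # CPU cycles required
    deadline_s: float   # latency constraint

def local_cost(task: Task):
    """Energy (J) and delay (s) for executing the task on the vehicle."""
    delay = task.cycles / F_LOCAL
    energy = KAPPA * task.cycles * F_LOCAL ** 2   # dynamic CPU energy model
    return energy, delay

def offload_cost(task: Task, cached: bool = False):
    """Energy (J) and delay (s) for offloading; a cache hit skips upload and compute."""
    if cached:
        return 0.0, 0.0          # result already stored at the edge server
    tx_delay = task.data_bits / RATE
    delay = tx_delay + task.cycles / F_EDGE
    energy = P_TX * tx_delay      # vehicle only pays transmission energy
    return energy, delay

def decide(task: Task, cached: bool = False) -> str:
    """Pick the feasible option (deadline met) with the lowest vehicle energy."""
    options = {"local": local_cost(task), "offload": offload_cost(task, cached)}
    feasible = {k: v for k, v in options.items() if v[1] <= task.deadline_s}
    if not feasible:
        return "infeasible"
    return min(feasible, key=lambda k: feasible[k][0])

if __name__ == "__main__":
    task = Task(data_bits=2e6, cycles=5e8, deadline_s=0.5)
    print(decide(task))                 # "offload" with these example numbers
    print(decide(task, cached=True))    # cache hit makes offloading free
```

In the article this choice is made jointly for many vehicles and RSUs under load balancing, which is why the authors formulate an equivalent reinforcement learning model rather than evaluating each task in isolation.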
Saved in:
| Main Authors: | Ibrahim A. Elgendy, Ammar Muthanna, Abdullah Alshahrani, Dina S. M. Hassan, Reem Alkanhel, Mohamed Elkawkagy |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Access |
| Subjects: | Autonomous vehicles; energy efficiency; load balancing; computation offloading; vehicular edge-cloud computing; task caching |
| Online Access: | https://ieeexplore.ieee.org/document/10788691/ |
| _version_ | 1846113899081891840 |
|---|---|
| author | Ibrahim A. Elgendy; Ammar Muthanna; Abdullah Alshahrani; Dina S. M. Hassan; Reem Alkanhel; Mohamed Elkawkagy |
| author_sort | Ibrahim A. Elgendy |
| collection | DOAJ |
| description | The Vehicular Edge-Cloud Computing (VECC) paradigm has emerged as a viable approach to overcoming the inherent resource limitations of vehicles by offloading computationally demanding tasks to remote servers. Despite its potential, existing offloading strategies often result in increased latency and sub-optimal performance due to the concentration of workloads on a limited number of connected Roadside Units (RSUs). Moreover, ensuring data security and optimizing energy usage within this framework remain significant challenges. To address these concerns, this paper proposes a comprehensive framework for VECC systems. A novel load-balancing algorithm is proposed to effectively redistribute vehicles among RSUs, considering factors such as RSU load, computational capacity, and data rate. Additionally, a robust security mechanism is incorporated using the Advanced Encryption Standard (AES) in conjunction with Electrocardiogram (ECG) signals as encryption keys to enhance data protection during transmission. To further improve system efficiency, a novel caching strategy is introduced, enabling edge servers to store completed tasks, which in turn reduces both latency and energy consumption. An optimization model is also proposed to minimize energy expenditure while ensuring that latency constraints are satisfied during computation offloading. Given the complexity of this problem in large-scale vehicular networks, the study formulates an equivalent reinforcement learning model and employs a deep learning algorithm to derive optimal solutions. Simulation results conclusively demonstrate that the proposed model significantly outperforms existing benchmark techniques in terms of energy savings. |
| format | Article |
| id | doaj-art-d2d0413fefeb4e4d89932c1a548c6911 |
| institution | Kabale University |
| issn | 2169-3536 |
| language | English |
| publishDate | 2024-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| doi | 10.1109/ACCESS.2024.3514881 |
| volume | 12 |
| pages | 191537-191550 |
| author_details | Ibrahim A. Elgendy (ORCID: 0000-0001-7154-2307), IRC for Finance and Digital Economy, KFUPM Business School, King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia; Ammar Muthanna, Department of Telecommunication Networks and Data Transmission, Bonch-Bruevich Saint Petersburg State University of Telecommunications, Saint Petersburg, Russia; Abdullah Alshahrani, Department of Computer Science and Artificial Intelligence, College of Computer Science and Engineering, University of Jeddah, Jeddah, Saudi Arabia; Dina S. M. Hassan (ORCID: 0000-0002-2186-6052), Department of Information Technology, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh, Saudi Arabia; Reem Alkanhel (ORCID: 0000-0001-6395-4723), Department of Information Technology, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh, Saudi Arabia; Mohamed Elkawkagy, Department of Computer Science, Faculty of Computers and Information, Menoufia University, Shebeen El-Kom, Egypt |
| title | Optimizing Energy Efficiency in Vehicular Edge-Cloud Networks Through Deep Reinforcement Learning-Based Computation Offloading |
| topic | Autonomous vehicles; energy efficiency; load balancing; computation offloading; vehicular edge-cloud computing; task caching |
| url | https://ieeexplore.ieee.org/document/10788691/ |