A Channel-Aware AUV-Aided Data Collection Scheme Based on Deep Reinforcement Learning
2025
Lizheng Wei | Minghui Sun | Zheng Peng | Jingqian Guo | Jiankuo Cui | Bo Qin | Jun-Hong Cui
Underwater sensor networks (UWSNs) play a crucial role in subsea operations such as marine exploration and environmental monitoring. A major challenge for UWSNs is achieving effective, energy-efficient data collection, particularly in deep-sea mining, where energy limitations and long-term deployment are key concerns. This study introduces a Channel-Aware AUV-Aided Data Collection Scheme (CADC) that uses deep reinforcement learning (DRL) to improve data collection efficiency. CADC features a novel underwater node traversal algorithm that accounts for the distinctive propagation characteristics of underwater acoustic signals, together with a DRL-based path planning approach that mitigates propagation loss and improves the energy efficiency of data collection. CADC achieves a 71.2% increase in energy efficiency over existing clustering methods, a 0.08% improvement over Deep Deterministic Policy Gradient (DDPG), 2.3% faster convergence than Twin Delayed DDPG (TD3), and an energy cost of only 22.2% of that required by a TSP-based baseline. By combining channel-aware traversal with adaptive DRL navigation, CADC effectively optimizes data collection and energy consumption in underwater environments.
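The abstract does not give CADC's reward design or traversal details, so the following is only a minimal sketch of what "channel-aware" could mean for a DRL path planner: an acoustic transmission-loss model (practical spreading plus Thorp's empirical absorption formula) folded into a shaped step reward. The reward shape and the weights `w_e` and `w_tl` are assumptions for illustration, not the paper's method.

```python
import math

def thorp_absorption(f_khz: float) -> float:
    """Thorp's empirical absorption coefficient in dB/km, frequency in kHz."""
    f2 = f_khz ** 2
    return 0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2) + 2.75e-4 * f2 + 0.003

def transmission_loss(r_m: float, f_khz: float, k: float = 1.5) -> float:
    """Acoustic transmission loss in dB: k*10*log10(r) spreading plus absorption.

    k = 1.5 is the common 'practical spreading' factor; r_m is range in meters.
    """
    return k * 10 * math.log10(r_m) + thorp_absorption(f_khz) * r_m / 1000.0

def step_reward(data_bits: float, energy_j: float, r_m: float, f_khz: float,
                w_e: float = 0.1, w_tl: float = 0.05) -> float:
    """Hypothetical shaped reward: credit collected data, penalize propulsion
    energy and the channel loss at the chosen collection range."""
    return data_bits * 1e-6 - w_e * energy_j - w_tl * transmission_loss(r_m, f_khz)
```

Under such a reward, a DDPG- or TD3-style agent is pushed to pick collection points where the acoustic channel is favorable (short range, low absorption) rather than merely geometrically shortest, which is one plausible reading of the channel-aware traversal the abstract describes.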
Bibliographic information
This bibliographic record was provided by the Directory of Open Access Journals.