5G NTN: Low-Earth-Orbit 5G Satellite Networks

The field of 5G Non-Terrestrial Networks (NTN) has been developing rapidly in recent years: Low-Earth-Orbit (LEO) satellites act as base stations providing broadband Internet access. Such systems extend mobile network coverage to hard-to-reach and sparsely populated areas while supporting diverse traffic types, including broadband services and real-time applications with strict Quality of Service (QoS) requirements. Compared to terrestrial 5G systems, however, satellite networks face several specific limitations: large signal propagation delays, high satellite mobility, and limited onboard computational resources. New algorithms for data transmission and radio resource management are therefore needed that account for the specifics of 5G NTN and ensure the required QoS for users.
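To give a sense of scale for the propagation delays mentioned above, the following sketch computes the one-way delay to a LEO satellite from its orbit altitude and the user's elevation angle. The 600 km altitude and elevation angles are illustrative assumptions, not parameters from the studies described here.

```python
import math

R_E = 6371e3   # Earth radius, m
C = 3e8        # speed of light, m/s

def slant_range(altitude_m: float, elevation_deg: float) -> float:
    """Ground-to-satellite distance for a given elevation angle
    (law-of-cosines solution for the slant range)."""
    e = math.radians(elevation_deg)
    return math.sqrt((R_E + altitude_m) ** 2
                     - (R_E * math.cos(e)) ** 2) - R_E * math.sin(e)

def one_way_delay_ms(altitude_m: float, elevation_deg: float) -> float:
    return slant_range(altitude_m, elevation_deg) / C * 1e3

# At 600 km altitude: exactly 2 ms at zenith, noticeably more near the horizon
print(round(one_way_delay_ms(600e3, 90), 2))   # -> 2.0
print(round(one_way_delay_ms(600e3, 10), 2))   # ~6.4 ms at 10 degrees elevation
```

Even at zenith the round trip adds several milliseconds, which is what makes channel measurements age and motivates the delay-aware algorithms discussed below.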

The Wireless Networks Laboratory conducts research in this area under a Russian Science Foundation (RSF) grant. In particular, a system-level simulation platform for 5G NTN networks has been developed in the ns-3 environment, and a number of studies on data transmission and resource management algorithms have been carried out. Research on transmission parameter selection and radio resource scheduling for real-time traffic showed that a delay-aware scheduler can increase network capacity by up to 40%.
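The idea of a delay-aware scheduler can be illustrated with a minimal earliest-deadline-first sketch. This is not the laboratory's actual algorithm: the packet model, the drop rule, and all numbers are assumptions chosen to show how propagation delay enters the scheduling decision.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Packet:
    deadline_ms: float                   # absolute delivery deadline
    user_id: int = field(compare=False)
    size_rb: int = field(compare=False)  # resource blocks needed

def schedule_slot(queue: list, now_ms: float,
                  prop_delay_ms: float, rbs_per_slot: int) -> list:
    """Serve packets in deadline order, dropping those that can no longer
    arrive in time given the satellite propagation delay."""
    heapq.heapify(queue)
    served, free = [], rbs_per_slot
    while queue and free > 0:
        pkt = heapq.heappop(queue)
        if now_ms + prop_delay_ms > pkt.deadline_ms:
            continue                      # would miss its deadline anyway
        if pkt.size_rb <= free:
            free -= pkt.size_rb
            served.append(pkt.user_id)
        # (a full scheduler would re-queue oversized packets; omitted here)
    return served

pkts = [Packet(20.0, 1, 4), Packet(5.0, 2, 4), Packet(12.0, 3, 4)]
print(schedule_slot(pkts, now_ms=0.0, prop_delay_ms=8.0, rbs_per_slot=8))
# -> [3, 1]: user 2's packet is dropped, since the 8 ms propagation delay
#    alone exceeds its 5 ms deadline
```

The key point is that the 8 ms propagation delay, not just queueing time, decides which packets are still worth transmitting.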

In parallel, several effects specific to satellite systems were identified, such as channel measurement aging and the dependence of optimal transmission parameters on propagation delays. A model of a satellite base station’s computational unit was developed, enabling accurate estimation of onboard algorithm execution times and the selection of suitable computing platform configurations. Verification against real hardware measurements confirmed a relative execution time estimation error of less than 1%, demonstrating the model’s suitability for analyzing computational resource requirements. This analysis revealed that executing a radio resource scheduling algorithm for 1,000 users requires tens of processing cores.
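A back-of-the-envelope version of the core-count estimate looks as follows. The per-user processing cost, slot duration, and utilization factor below are assumed figures for illustration, not measurements from the laboratory's model; the point is only that the budget scales linearly with the number of users and must fit within one scheduling slot.

```python
import math

def cores_needed(n_users: int, per_user_us: float, slot_us: float,
                 core_utilization: float = 0.8) -> int:
    """Cores required to finish scheduling all users within one slot."""
    total_us = n_users * per_user_us
    return math.ceil(total_us / (slot_us * core_utilization))

# e.g. 20 us of scheduling work per user, a 1 ms slot, 80% usable core time
print(cores_needed(1000, per_user_us=20.0, slot_us=1000.0))  # -> 25
```

Under these assumed numbers, 1,000 users already require 25 cores, consistent with the "tens of cores" conclusion above.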

In another line of research, a satellite model with a multibeam antenna was developed and used to investigate the impact of inter-beam interference on system throughput. Transmission in adjacent beams was shown to create significant interference, necessitating frequency separation between their users, whereas distant beams allow efficient frequency reuse.
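The reuse trade-off can be seen from a simple SINR computation. The received power levels below are assumptions: an adjacent beam is taken to leak at only a few dB below the serving beam, while a distant beam is tens of dB weaker.

```python
import math

def to_lin(db: float) -> float:
    return 10 ** (db / 10)

def sinr_db(signal_dbm: float, interferers_dbm: list,
            noise_dbm: float = -100.0) -> float:
    """SINR at a user whose serving beam shares the frequency with others."""
    interference_mw = sum(to_lin(p) for p in interferers_dbm)
    return 10 * math.log10(to_lin(signal_dbm)
                           / (interference_mw + to_lin(noise_dbm)))

# Adjacent co-channel beam only ~3 dB below the serving beam: SINR collapses
print(round(sinr_db(-80.0, [-83.0]), 1))
# Distant beam ~30 dB below: SINR stays high, so frequency reuse is viable
print(round(sinr_db(-80.0, [-110.0]), 1))
```

With the assumed levels, the adjacent-beam case yields roughly 3 dB of SINR versus nearly 20 dB for the distant beam, which is why adjacent beams need separate frequencies while distant beams can share them.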

The investigation of channel prediction showed that, under strict delay and reliability constraints, prediction can increase the capacity of 5G NTN networks by up to 40%; this figure establishes an upper bound on the potential gain from such algorithms.
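Why prediction helps can be shown with a toy example: with aged measurements the transmitter acts on a stale channel estimate, while even a simple extrapolation accounts for how the channel evolves over the propagation delay. The linear predictor, the SNR trace, and the 8 ms horizon are all illustrative assumptions.

```python
def linear_predict(samples: list, dt: float, horizon: float) -> float:
    """Extrapolate the channel metric from the last two samples."""
    slope = (samples[-1] - samples[-2]) / dt
    return samples[-1] + slope * horizon

snr_db_history = [12.0, 11.0, 10.0]   # channel degrading by 1 dB per ms
aged_estimate = snr_db_history[-1]    # what an outdated report gives: 10 dB
predicted = linear_predict(snr_db_history, dt=1.0, horizon=8.0)  # 8 ms RTT
print(aged_estimate, predicted)       # -> 10.0 2.0
```

Acting on the aged 10 dB estimate would overshoot the true channel by 8 dB and cause losses; the predicted 2 dB value lets the transmitter pick a rate the channel can actually sustain.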

For TCP traffic in multi-connectivity scenarios, an algorithm was proposed for selecting the optimal moment to connect to a satellite base station. Simulations showed a throughput gain of 20–70% over existing approaches, with performance within 10% of the theoretical upper bound.
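A much-simplified version of the timing problem reduces it to a single switch decision: a serving satellite's rate falls as it sets while a new satellite's rate rises, and connecting too early or too late both cost throughput. The rate profiles, the fixed switch cost, and the exhaustive search below are assumptions for illustration, not the proposed algorithm.

```python
def best_switch_time(old_rates: list, new_rates: list,
                     switch_cost: float) -> int:
    """Slot index t maximizing delivered data: stay on the serving
    satellite for slots [0, t), then use the new one for slots [t, end)."""
    best_t, best_total = 0, float("-inf")
    for t in range(len(old_rates) + 1):
        total = sum(old_rates[:t]) + sum(new_rates[t:]) - switch_cost
        if total > best_total:
            best_t, best_total = t, total
    return best_t

old = [50, 40, 30, 20, 10]   # serving satellite setting: rate falls
new = [5, 15, 35, 45, 55]    # new satellite rising above the horizon
print(best_switch_time(old, new, switch_cost=10))  # -> 2
```

Here the optimum is to switch at slot 2, once the new satellite's rate overtakes the old one's; switching at slot 0 or waiting until the end both deliver noticeably less data.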

The obtained results form a scientific foundation for developing computationally efficient algorithms for data transmission, channel prediction, and radio resource scheduling in 5G NTN networks, with the potential for their further application in real systems.

List of relevant papers: