Radio resource management

Radio resource management (RRM) is the system-level management of co-channel interference, radio resources, and other radio transmission characteristics in wireless communication systems, for example cellular networks, wireless local area networks, wireless sensor systems, and radio broadcasting networks.[1][2] RRM involves strategies and algorithms for controlling parameters such as transmit power, user allocation, beamforming, data rates, handover criteria, modulation scheme and error coding scheme. The objective is to utilize the limited radio-frequency spectrum resources and radio network infrastructure as efficiently as possible.

RRM concerns multi-user and multi-cell network capacity issues, rather than point-to-point channel capacity. Traditional telecommunications research and education often dwell on channel coding and source coding with a single user in mind, but when several users and adjacent base stations share the same frequency channel it may not be possible to achieve the maximum channel capacity. Efficient dynamic RRM schemes may increase the system spectral efficiency by an order of magnitude, often considerably more than what is achievable by introducing advanced channel coding and source coding schemes. RRM is especially important in systems limited by co-channel interference rather than by noise, for example cellular systems and broadcast networks that cover large areas homogeneously, and wireless networks consisting of many adjacent access points that may reuse the same channel frequencies.
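
As a rough illustration of the interference-limited regime, the sketch below evaluates the Shannon bound with co-channel interference treated as noise; the power levels are assumed purely for the example. Scaling every transmitter's power up leaves the rate essentially unchanged, which is why resource management, rather than more transmit power or stronger coding alone, governs capacity in such systems.

```python
import math

# Illustrative numbers only: per-link spectral efficiency from Shannon's bound,
# treating co-channel interference as Gaussian noise.

def spectral_efficiency(signal, interference, noise):
    """Shannon bound in bit/s/Hz with interference treated as noise."""
    return math.log2(1.0 + signal / (interference + noise))

noise = 1e-12            # thermal noise power (W), assumed
signal = 1e-9            # received power of the desired link (W), assumed
interference = 5e-10     # total co-channel interference power (W), assumed

print(spectral_efficiency(signal, interference, noise))            # interference-limited case
print(spectral_efficiency(10 * signal, 10 * interference, noise))  # ~same rate: more power does not help
print(spectral_efficiency(signal, 0.0, noise))                     # noise-limited upper bound
```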

The cost of deploying a wireless network is normally dominated by base station sites (real estate costs, planning, maintenance, distribution network, energy, etc.) and sometimes also by frequency license fees. The objective of radio resource management is therefore typically to maximize the system spectral efficiency in bit/s/Hz per area unit or Erlang/MHz/site, under some kind of user fairness constraint, for example that the grade of service should be above a certain level. The latter involves covering a certain area and avoiding outage due to co-channel interference, noise, attenuation caused by path loss, fading caused by shadowing and multipath, Doppler shift and other forms of distortion. The grade of service is also affected by blocking due to admission control, scheduling starvation, or the inability to guarantee the quality of service requested by the users.
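
A small worked example of the per-site metric, using assumed numbers for carrier bandwidth and busy-hour sector throughput:

```python
# Assumed numbers: a three-sector site on a 20 MHz carrier, with an average
# busy-hour throughput of 30 Mbit/s per sector.

bandwidth_hz = 20e6
sectors_per_site = 3
avg_sector_throughput_bps = 30e6

site_spectral_efficiency = sectors_per_site * avg_sector_throughput_bps / bandwidth_hz
print(site_spectral_efficiency, "bit/s/Hz/site")   # 4.5 bit/s/Hz/site
```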

While classical radio resource management primarily considered the allocation of time and frequency resources (with fixed spatial reuse patterns), recent multi-user MIMO techniques enable adaptive resource management in the spatial domain as well.[3] In cellular networks, this means that the fractional frequency reuse of the GSM standard has been replaced by universal frequency reuse in the LTE standard.
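
The trade-off behind this shift can be sketched with assumed cell-edge SINR values: a tighter reuse gives each cell more bandwidth but a lower signal-to-interference ratio, and which factor dominates depends on the operating point.

```python
import math

# Illustrative comparison of universal reuse (reuse 1) and a reuse-3 plan.
# The cell-edge SINR values are assumptions chosen only to show the trade-off.

bandwidth_hz = 20e6
sinr_reuse1 = 1.0      # ~0 dB cell-edge SINR when every cell uses the whole band (assumed)
sinr_reuse3 = 8.0      # ~9 dB cell-edge SINR when only every third cell shares a sub-band (assumed)

rate_reuse1 = bandwidth_hz * math.log2(1 + sinr_reuse1)          # full bandwidth, low SINR
rate_reuse3 = (bandwidth_hz / 3) * math.log2(1 + sinr_reuse3)    # one third of the bandwidth, higher SINR

print(rate_reuse1 / 1e6, "Mbit/s with reuse 1")
print(rate_reuse3 / 1e6, "Mbit/s with reuse 3")
```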

Static radio resource management

Static RRM involves manual as well as computer-aided fixed cell planning or radio network planning. Examples include the selection of base station sites, the choice of antenna heights, sector directions and fixed transmit power levels, and fixed frequency planning.

Static RRM schemes are used in many traditional wireless systems, for example 1G and 2G cellular systems, in today's wireless local area networks, and in non-cellular systems such as broadcasting systems. Examples of static RRM schemes are circuit-mode communication with a fixed channel allocation per user and fixed frequency reuse patterns decided during network planning.

Dynamic radio resource management

Dynamic RRM schemes adaptively adjust the radio network parameters to the traffic load, user positions, user mobility, quality of service requirements, base station density, etc. Dynamic RRM schemes are considered in the design of wireless systems with the aim of minimizing expensive manual cell planning and achieving "tighter" frequency reuse patterns, resulting in improved system spectral efficiency.

Some schemes are centralized, where several base stations and access points are controlled by a Radio Network Controller (RNC). Others are distributed, relying either on autonomous algorithms in mobile stations, base stations or wireless access points, or on coordination through the exchange of information among these stations.[1]
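
A minimal sketch of one classic distributed scheme, Foschini-Miljanic style transmit power control, in which each link independently scales its power toward a target SINR without any central controller; the link gains, noise power and targets below are illustrative assumptions.

```python
import numpy as np

# Distributed power control: each link updates p_i <- p_i * target_SINR / measured_SINR.
# All numbers below are illustrative assumptions.

G = np.array([[1.0, 0.1, 0.2],    # G[i, j]: gain from transmitter j to receiver i
              [0.2, 0.8, 0.1],
              [0.1, 0.3, 0.9]])
noise = 1e-3                             # receiver noise power (linear scale)
target_sinr = np.array([2.0, 2.0, 2.0])  # per-link SINR targets (~3 dB, linear)
p = np.full(3, 0.01)                     # initial transmit powers

for _ in range(50):
    desired = np.diag(G) * p             # received power of each link's own signal
    interference = G @ p - desired       # co-channel interference at each receiver
    sinr = desired / (interference + noise)
    p = p * target_sinr / sinr           # local update, no central controller needed

desired = np.diag(G) * p
sinr = desired / (G @ p - desired + noise)
print("powers:", p.round(4), "SINRs:", sinr.round(2))
```

If the targets are infeasible the powers diverge, which is one reason practical systems combine such loops with admission control.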

Examples of dynamic RRM schemes are power control, link adaptation through adaptive modulation and coding, dynamic channel allocation (DCA) or dynamic frequency selection (DFS), channel-dependent scheduling, admission control, and cognitive radio.
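
As one concrete example, the following sketch implements channel-dependent scheduling with the proportional-fair rule, under an assumed Rayleigh-fading model and illustrative parameters: in each slot the user with the highest ratio of instantaneous rate to average throughput is served.

```python
import numpy as np

# Proportional-fair channel-dependent scheduling; fading model and parameters are assumptions.

rng = np.random.default_rng(0)
num_users, num_slots = 4, 1000
avg_rate = np.full(num_users, 1e-3)   # smoothed throughput per user (avoid divide-by-zero)
alpha = 0.05                          # averaging window factor

for _ in range(num_slots):
    snr = rng.exponential(scale=4.0, size=num_users)      # Rayleigh-fading SNR per user
    inst_rate = np.log2(1.0 + snr)                        # achievable rate this slot (bit/s/Hz)
    user = np.argmax(inst_rate / avg_rate)                # proportional-fair metric
    served = np.zeros(num_users)
    served[user] = inst_rate[user]
    avg_rate = (1 - alpha) * avg_rate + alpha * served    # update moving averages

print("Long-run average rates (bit/s/Hz):", avg_rate.round(3))
```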

Inter-cell radio resource management

Networks based on the LTE standard (defined by 3GPP) are designed for a frequency reuse of one. In such networks, neighboring cells use the same frequency spectrum. Such standards exploit Space Division Multiple Access (SDMA) and can thus be highly efficient in terms of spectrum, but require close coordination between cells to avoid excessive inter-cell interference. As in most cellular system deployments, the overall system spectral efficiency is not range limited or noise limited, but interference limited.[1] Inter-cell radio resource management coordinates resource allocation between different cell sites, for example by using multi-user MIMO techniques. Various means of inter-cell interference coordination (ICIC) are already defined in the standard.[4] Dynamic single-frequency networks, coordinated scheduling, multi-site MIMO and joint multi-cell precoding are other examples of inter-cell radio resource management.[5]
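
A minimal sketch of joint multi-cell precoding under idealized assumptions: two cooperating sites with full channel knowledge apply a zero-forcing precoder so that each user's beam nulls the interference toward the other user. The channel realization and antenna counts are illustrative assumptions, not a description of any particular standardized scheme.

```python
import numpy as np

# Joint multi-cell zero-forcing precoding toy example: 2 single-antenna users,
# 2 cooperating sites with 2 antennas each (4 transmit antennas in total).

rng = np.random.default_rng(1)
num_users, num_tx = 2, 4
H = rng.standard_normal((num_users, num_tx)) + 1j * rng.standard_normal((num_users, num_tx))

W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # zero-forcing precoder (right pseudo-inverse)
W /= np.linalg.norm(W, axis=0)                   # normalize each user's beam to unit power

effective = H @ W                                # effective channel after precoding
print(np.abs(effective).round(3))                # ~diagonal: inter-user interference is nulled
```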

Notes and References

  1. Guowang Miao, Jens Zander, Ki Won Sung and Ben Slimane, Fundamentals of Mobile Data Networks, Cambridge University Press, 2016, ISBN 978-1107143210.
  2. N. D. Tripathi, J. H. Reed and H. F. Vanlandingham, Radio Resource Management in Cellular Systems, Springer, 2001, ISBN 0-7923-7374-X.
  3. E. Björnson and E. Jorswieck, "Optimal Resource Allocation in Coordinated Multi-Cell Systems", Foundations and Trends in Communications and Information Theory, vol. 9, no. 2–3, pp. 113–381, 2013, doi:10.1561/0100000069.
  4. V. Pauli, J. D. Naranjo and E. Seidel, "Heterogeneous LTE Networks and Inter-Cell Interference Coordination", White Paper, Nomor Research, December 2010.
  5. D. Gesbert, S. Hanly, H. Huang, S. Shamai, O. Simeone and W. Yu, "Multi-cell MIMO cooperative networks: A new look at interference", IEEE Journal on Selected Areas in Communications, vol. 28, no. 9, pp. 1380–1408, December 2010, doi:10.1109/JSAC.2010.101202.