Reference Signal Received Power, abbreviated RSRP, is the linearly averaged signal strength of the Reference Signal. The Reference Signal is carried by Resource Elements (REs).
RSRP is used for cell selection, cell reselection and handover in LTE. It is the equivalent of CPICH RSCP in WCDMA.
The UE measures the reference signals carried by these resource elements and averages the measurements (in the linear domain, not in dB) to obtain an RSRP value.
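To make the "linear average" point concrete, here is a minimal sketch in Python. The per-RE sample values are hypothetical, chosen only for illustration; the point is that the averaging happens in milliwatts and only the final result is converted to dBm:

```python
import math

def rsrp_dbm(re_powers_mw):
    """Linearly average per-RE reference-signal powers (in mW),
    then convert the average to dBm."""
    avg_mw = sum(re_powers_mw) / len(re_powers_mw)
    return 10 * math.log10(avg_mw)

# Hypothetical per-RE measurements in mW:
samples = [0.020, 0.025, 0.018, 0.022]
print(round(rsrp_dbm(samples), 1))
```

Averaging the dBm values directly would give a slightly different (and incorrect) result, which is why the standard specifies averaging in the linear domain.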
In this example we will walk through a theoretical calculation of RSRP:
Frequency Bandwidth: 20 MHz
Number of subcarriers: 1200 (there are 1200 subcarriers in 20 MHz)
RRU power: 20 W = 20,000 mW (milliwatts)
Power in each subcarrier: 20,000 mW / 1200 = 16.67 mW = 12.2 dBm
So, based on this calculation, the transmission power for each Resource Element is 12.2 dBm.
Antenna Gain: 16dBi
Feeder Loss: 3dB
Power from antenna: Power + Antenna Gain – Feeder Loss = 12.2 + 16 – 3 = 25.2dBm
Let’s assume a path loss of 128 dB at the UE location. Then:
UE measured RSRP = Power from antenna – Path Loss = 25.2 – 128 = -102.8 dBm
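The steps above can be collected into a short Python sketch. All values come from the worked example; the variable names are my own:

```python
import math

def mw_to_dbm(p_mw):
    """Convert a power in milliwatts to dBm."""
    return 10 * math.log10(p_mw)

# Inputs from the example above
rru_power_mw = 20_000      # 20 W
subcarriers = 1200         # 20 MHz bandwidth
antenna_gain_dbi = 16
feeder_loss_db = 3
path_loss_db = 128

# Per-RE transmit power: total power split evenly across subcarriers
per_re_dbm = mw_to_dbm(rru_power_mw / subcarriers)          # ~12.2 dBm

# Power radiated from the antenna (dB quantities add/subtract directly)
eirp_dbm = per_re_dbm + antenna_gain_dbi - feeder_loss_db   # ~25.2 dBm

# RSRP seen at the UE after path loss
rsrp_dbm = eirp_dbm - path_loss_db                          # ~-102.8 dBm

print(round(per_re_dbm, 1), round(eirp_dbm, 1), round(rsrp_dbm, 1))
```

Changing `path_loss_db` lets you explore other UE locations; for example, a 150 dB path loss with the same transmit chain gives an RSRP of about -124.8 dBm.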
Thank you for this clear, concise calculation. I wonder if you could help with one aspect that has been confusing me? It is to do with the link budget. Well-accepted calculations (e.g. page 226 of “LTE for UMTS” by Holma and Toskala) show that the maximum path loss for LTE is about 163 dB. In this case, the RSRP measured by the UE would be around -138 dBm. But it is also commonly acknowledged (and UE measurements bear this out) that, if the RSRP is less than -120 dBm, no service is possible. That is a discrepancy of 18 dB. Are you able to shed any light on this? Best wishes, Chris