The radiometric resolution of a sensor refers to its sensitivity, i.e. its ability to detect small differences in signal strength as it records the radiant flux reflected, emitted, or back-scattered from the terrain. Radiometric resolution is specified differently in the optical domain of the electromagnetic spectrum than in the radar domain.

In the optical domain, radiometric resolution is given in bits. The number of bits determines the maximum number of brightness levels that can be distinguished: the larger this number, the higher the radiometric resolution. For example, the optical sensor Sentinel-2 has a radiometric resolution of 12 bits, so a pixel of an image acquired by Sentinel-2 can take one of 2^12 = 4096 grey levels.

In the radar domain, radiometric resolution is usually specified as a backscatter level expressed as a logarithmic value in decibels (dB). For instance, the radiometric resolution of radar scatterometers lies in the range of 0.1 to 0.3 dB, whereas that of SAR sensors is in the range of 1.2 to 2.5 dB. Only differences in radar backscatter larger than these values can be interpreted as actual changes of the backscatter conditions at the Earth's surface; smaller measurement differences could be caused by real changes in backscatter or just as well by instrument noise.
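To make the two conventions concrete, the short Python sketch below (a hypothetical illustration, not tied to any particular sensor or library) computes the number of grey levels for several bit depths and converts the quoted dB thresholds into the corresponding relative changes in linear backscatter power.

```python
# Minimal sketch: how the two conventions for radiometric
# resolution translate into distinguishable signal levels.

# Optical domain: an n-bit sensor can encode 2**n grey levels.
for bits in (8, 10, 12, 16):
    print(f"{bits}-bit quantization: {2 ** bits} grey levels")
# e.g. 12-bit Sentinel-2 -> 4096 grey levels

# Radar domain: resolution is quoted as a backscatter difference in dB.
# A dB difference maps to a ratio of linear backscatter power:
#   delta_dB = 10 * log10(P1 / P0)  <=>  P1 / P0 = 10 ** (delta_dB / 10)
for delta_db in (0.1, 0.3, 1.2, 2.5):
    ratio = 10 ** (delta_db / 10)
    print(f"{delta_db} dB corresponds to a {100 * (ratio - 1):.1f}% "
          f"change in linear backscatter power")
```

Run as written, the sketch shows, for example, that a 0.1 dB threshold corresponds to roughly a 2.3% change in linear backscatter power, while a 2.5 dB threshold corresponds to roughly a 78% change.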
Discuss how radiometric resolution influences the granularity of a land cover classification.