In this lecture, we want to look at a major consideration in terms of the perceived quality of radar images: they often look speckled in appearance and lack the crisp quality of optical images. In the next lecture, we are going to look at the nature of the scattering coefficient for various Earth surface cover types. Before that, though, we need to be aware of this unusual problem of speckle, how it arises, and how it can be treated. That is what we want to do in this lecture, along with an overview of what is to follow.

As just noted, one of the most striking differences in the appearance of radar imagery compared with optical image data is its poor radiometric quality, which is caused by speckle. The figure here shows a portion of an image of a fairly homogeneous region in which speckle is obvious. Speckle is a direct result of the fact that the incident energy is coherent: it is assumed to have a single frequency, and the wavefront arrives at a pixel with a single phase. What does that mean? Incident sunlight, which forms the basis of most of our optical remote sensing, is not coherent. It is a collection of energy at the different wavelengths corresponding to the wavelength range of the imaging band, and there is no relationship between the phases of the different components.

The phase angle of a sinusoid, which represents a propagating electric field, is a relative quantity that describes its position in time. In the diagram on the left, the phase angle is described with respect to the time origin. More often, though, we talk about the difference in phase between two or more sinusoids, as shown in the diagram on the right. A direct consequence of the phase difference between two or more sinusoids is interference. When they add in phase, we get a sinusoid of twice the amplitude, as in the top right-hand diagram. When they add out of phase, that is, with a phase difference of 180 degrees between them, we get zero, as in the bottom right-hand diagram. In between, we get somewhere between those two extremes, depending on the phase difference between the sinusoids, as illustrated.

How does that explain speckle? Typically, a pixel will contain a very large number of incremental scatterers; we saw that when we developed the radar equation. Their returns combine to give the resultant received signal for the pixel. Such a situation is illustrated in the diagram here. The net brightness of the resolution element, or pixel, will be the result of all the incremental reflections interfering with each other. An adjacent pixel will have a different set of interfering scatterers, but will have about the same average brightness value if both pixels are of the same cover type.

If nothing is done to reduce the speckle in a recorded radar image, the speckle level will be too high to allow sensible interpretation of the image. Before radar imagery is released for use, it has usually undergone some form of speckle reduction. There are many ways to reduce the influence of speckle. Some quite sophisticated filters are used, but often just a simple averaging of several images of the same scene is used. Averaging will unfortunately reduce spatial resolution as well, since averaging is smoothing. However, when the radar is designed, it is anticipated that averaging will be needed to reduce speckle, so the designed spatial resolution, usually in azimuth, is higher than needed. After averaging, the azimuth and range resolutions approximately match, as required by the user.
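To make the interference idea concrete, here is a minimal simulation sketch, not part of the lecture material: it models each pixel return as the coherent sum of many unit-amplitude scatterers with uniformly random phases, and then averages several independent looks. The numbers of scatterers, pixels, and looks are illustrative assumptions only.

```python
# Minimal sketch of speckle formation and multi-look averaging (illustrative
# assumptions: 500 scatterers per pixel, 5000 pixels, 4 looks).
import numpy as np

rng = np.random.default_rng(0)

def pixel_return(n_scatterers=500):
    """Coherent sum of many incremental scatterers with random phases."""
    phases = rng.uniform(0.0, 2.0 * np.pi, n_scatterers)
    field = np.sum(np.exp(1j * phases))   # complex (phasor) addition
    return np.abs(field) ** 2             # detected power (pixel brightness)

# Many pixels of the same cover type: same statistics, different values (speckle).
single_look = np.array([pixel_return() for _ in range(5000)])
print("single look:  mean %.1f, std %.1f" % (single_look.mean(), single_look.std()))

# Multi-look averaging: average four independent looks of each pixel.
n_looks = 4
multi_look = np.mean(
    [[pixel_return() for _ in range(n_looks)] for _ in range(5000)], axis=1
)
print("%d-look image: mean %.1f, std %.1f" % (n_looks, multi_look.mean(), multi_look.std()))
# The mean brightness is preserved, while the standard deviation falls by roughly
# the square root of the number of looks.
```

Running this shows the single-look brightness varying strongly from pixel to pixel even though every pixel has identical statistics, while the four-look average keeps the mean and roughly halves the standard deviation.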
In the terminology of radar imaging, the number of signals averaged is called the number of looks. On the next slide, we see an example of speckle reduction by averaging; a short numerical sketch at the end of this lecture reproduces it. This is a simple made-up example, but it illustrates the point that averaging will reduce the noise, or speckle, while retaining the average value of the signal, that is, the scattering coefficient. Four separate images have been generated from a distribution with a mean of 50.7 and a standard deviation of 15.05. When the four looks are averaged, the mean is preserved but the standard deviation has been halved. The speckle variance, which is equivalent to its noise power contribution, has been reduced by a factor of four.

In the next lecture, we start our treatment of radar scattering mechanisms, that is, how different Earth surface elements scatter energy in radar remote sensing, leading to the formation of an image. Here we've summarized the situation so that we have an overview of what is to come. The most common scattering mechanisms that contribute to the formation of images in radar remote sensing are illustrated here. Unlike the case of optical remote sensing, where, because of the very short wavelengths involved, scattering mostly occurs at surfaces, in radar the situation is much more complex, including the possibility of scattering from elements within the surface.

In summary, the actual radar signal is carried by electric and magnetic fields. We concentrate just on the electric fields, since the accompanying magnetic fields are directly related to the electric fields. Electric fields are represented as sine waves, since that is how they propagate. As a result, sets of electric fields can experience constructive and destructive interference. The backscattered signal from a pixel is composed of a large set of reflected fields from the myriad of scatterers that make up the pixel. Those fields interfere with each other, the result of which is that adjacent pixels, even though from the same cover type, may have different brightnesses. That causes the image to have a speckled appearance. Even though adjacent pixels might have quite different brightness values because of speckle, the average brightness, or scattering coefficient, over a set of pixels of the same cover type will be representative of that cover type. Speckle can be reduced by averaging, which does not affect the average value of pixel brightness but reduces the variance, so the image will look less noisy. The last two questions here are included to add to your understanding of look averaging in radar remote sensing.
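As a closing illustration, here is the small numerical sketch referred to earlier, which mimics the slide's look-averaging example. A Gaussian distribution and an arbitrary image size are assumed purely for illustration; only the mean of 50.7 and the standard deviation of 15.05 come from the slide.

```python
# Sketch of the slide's four-look averaging example (Gaussian model and image
# size are assumptions; mean 50.7 and std 15.05 are taken from the slide).
import numpy as np

rng = np.random.default_rng(1)
n_looks, n_pixels = 4, 512 * 512

# Four single-look images of the same homogeneous scene.
looks = rng.normal(loc=50.7, scale=15.05, size=(n_looks, n_pixels))

averaged = looks.mean(axis=0)      # the four-look image

print("single look: mean %.2f, std %.2f" % (looks[0].mean(), looks[0].std()))
print("four looks:  mean %.2f, std %.2f" % (averaged.mean(), averaged.std()))
print("variance reduction factor: %.2f" % (looks[0].var() / averaged.var()))
```

The averaged image keeps the mean, its standard deviation is roughly halved, and the speckle variance drops by about a factor of four, which is exactly the behaviour described for four looks.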