In-Depth Study: ISE Using Optical and SAR Data

Authored by: Hongsheng Zhang, Hui Lin, Yuanzhi Zhang, Qihao Weng

Remote Sensing of Impervious Surfaces

Print publication date: September 2015
Online publication date: September 2015

Print ISBN: 9781482254839
eBook ISBN: 9781482254860

Urban impervious surfaces, such as transport-related land (e.g., roads, streets, and parking lots) and building rooftops (commercial, residential, and industrial areas), are widely recognized as important indicators of urban environments (Arnold and Gibbons 1996; Hurd and Civco 2004; Weng 2001; Weng et al. 2006). Remote sensing has become the major technique for estimating impervious surfaces because of its low cost and convenience for mapping at local to global scales. Numerous methods have been proposed to estimate impervious surfaces from remotely sensed images, including subpixel approaches, such as spectral mixture analysis (SMA) (Wu and Murray 2003), the classification and regression tree model (Yang et al. 2003b), artificial neural networks (ANNs) (Weng and Hu 2008), and support vector machines (SVMs) (Sun et al. 2011), and per-pixel approaches such as conventional classification methods (Weng 2012). Recently, a biophysical composition index (BCI) was proposed to extract urban impervious surfaces following the vegetation–impervious surface–soil (VIS) conceptual model (Deng and Wu 2012). However, most of these approaches rely on optical remote sensing images, and accurate estimation of impervious surfaces remains challenging because of the diversity of urban land covers, which makes it difficult to separate land covers with similar spectral signatures (Weng 2012). For instance, dry soils and sands are often confused with bright impervious surfaces because of their high reflectance, while water and shade tend to be confused with dark impervious surfaces.
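To make the subpixel idea behind SMA concrete, the sketch below unmixes one pixel's spectrum into fractional abundances of three VIS-style endmembers (vegetation, impervious surface, soil) using linear least squares with a sum-to-one constraint. The endmember spectra, band count, and the pixel itself are invented for illustration and are not taken from the chapter or from any real sensor.

```python
import numpy as np

# Hypothetical endmember spectra: rows are spectral bands, columns are the
# VIS endmembers (vegetation, impervious, soil). Values are illustrative only.
endmembers = np.array([
    [0.05, 0.30, 0.20],  # band 1
    [0.08, 0.32, 0.25],  # band 2
    [0.45, 0.35, 0.30],  # band 3
    [0.30, 0.38, 0.40],  # band 4
])

def unmix(pixel, E):
    """Estimate endmember fractions for one pixel by least squares.

    Solves min ||E f - pixel|| with the sum-to-one constraint enforced by
    appending a heavily weighted row; non-negativity is approximated by
    clipping and renormalizing afterwards (a simple, not fully rigorous, fix).
    """
    n_bands, n_end = E.shape
    A = np.vstack([E, 10.0 * np.ones(n_end)])  # append sum(f) = 1 row
    b = np.append(pixel, 10.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

# A synthetic pixel that is 70% impervious surface and 30% vegetation.
mixed = 0.7 * endmembers[:, 1] + 0.3 * endmembers[:, 0]
fractions = unmix(mixed, endmembers)
```

Because the synthetic pixel is an exact linear mixture, the recovered fractions are approximately (0.3, 0.7, 0.0). The spectral-confusion problem the paragraph describes shows up here as near-collinear endmember columns: when two land covers (e.g., dry soil and bright impervious) have similar spectra, the least-squares system becomes ill-conditioned and the fraction estimates unstable.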
