A MEASURE FOR CHANGE DETECTION IN VERY HIGH RESOLUTION REMOTE SENSING IMAGES BASED ON TEXTURE ANALYSIS

Antoine Lefebvre, Thomas Corpetti, Laurence Hubert-Moy
COSTEL, UMR CNRS 6554 LETG / IFR 90 CAREN
Université Rennes 2, Place du recteur Henri Le Moal, CS 24307, 35043 Rennes Cedex, France

ABSTRACT

This paper addresses the problem of change detection in very high resolution remote sensing images. To that end, we define a measure of the observed change based on the distribution of the coefficients issued from a wavelet transform, taking care to be rotation invariant. The dissimilarities are obtained through the Kullback-Leibler distance, and a change features vector is defined from all the distances between the bands of the wavelet decomposition. This measurement is able to classify the nature of the change between two images. We present two applications: the first one uses a decision tree to classify several changes (homogeneous or oriented texture, abrupt or subtle change), whereas the second one detects some particular changes from a pair of images (an aerial and a satellite image). These experiments bring out the efficiency of the proposed technique to correctly discriminate the different textures and to interpret each change.

Index Terms— Remote sensing, change and texture analysis, wavelet transform

1. INTRODUCTION

In the field of earth observation, satellite images are a valuable source of data for detecting natural and anthropogenic changes of land cover and assessing their impacts. These changes are generally characterized by intensities, rhythms and patterns ranging from abrupt changes (over large areas, generated by natural disasters) to subtle changes (which affect small regular surfaces such as agricultural parcels or urban sprawl). At the moment, most of the change detection methods routinely used by experts have been developed to highlight abrupt changes from remote sensing images at low or medium resolution, using almost exclusively the spectral response of the pixels [1]. With the development of Very High Resolution (VHR) sensors such as Quickbird, GeoEye or Worldview, it is now possible to observe more subtle changes that appear inside an "entity" (an agricultural parcel, for instance) between two images. However, in practice the two available observations are usually acquired several years apart, at different seasons, and are likely to be issued from various sensors with different resolutions and spectral properties.

This makes the direct comparison of the two images impossible. It is then necessary to develop alternative approaches that are not based on a spatial localization of the change but that rather consider the differences in terms of texture inside the objects of interest. In addition, we want to define a methodology that is insensitive to the global orientation of the textures inside an entity: for instance, two agricultural parcels containing the same crop but ploughed along different directions should not be reported as a change.

The aim of this study is then to develop an easily implementable, rotation-invariant measure able to identify the nature of the changes between two spatial objects which have different sizes and come from different sensors. We assume that these objects have been segmented beforehand using an external approach (in practice, the segmentation is performed by software dedicated to remote sensing images such as ENVI or eCognition). Among the many existing approaches to characterize textures (see e.g. [2, 3, 4] for reviews), we rely on an analysis of the coefficients issued from a wavelet transform of the images. These approaches have indeed produced successful results [5, 6, 7, 8], in particular for satellite images [9]. To ensure rotational invariance, the dominant orientation of the objects is estimated using a principal component analysis of the Fourier transform spectrum. All entities are then rotated so that their main orientation is aligned with a common reference direction. The proposed method defines a change vector based on combinations of similarity measurements between the distributions of wavelet coefficients in the different bands. From the proposed vector, it is possible to analyze and characterize the changes between two objects.

To illustrate the ability of the proposed change vector to capture the variations between two objects, two applications are presented. The first one classifies the changes between a large set of objects using decision trees based on our measure, whereas the second one characterizes changes between two images of the same scene.

The article is structured as follows: the reorientation of the images is described in section 2, the analysis of the textural characteristics of the objects is presented in section 3 and the proposed change vector in section 4. Finally, applications are shown in section 5.

2. TEXTURE ORIENTATION

This preliminary step aims to estimate the overall orientation of the texture of an object. Most of the available techniques for texture orientation associate a direction with every image location [10, 11]. In our application, as we only aim to determine the global direction, we prefer to rely on an analysis of the Fourier spectrum. Indeed, a particular orientation of a pattern generates a singular behavior of the corresponding frequency in the Fourier spectrum. The identification of this main orientation axis then gives the privileged direction of the object. As a rotation in the Fourier space implies a rotation of the same angle in the image plane, all input objects can be reoriented according to their global orientation.

In practice, the location of the most energetic direction of the Fourier spectrum is obtained using a principal component analysis (PCA). From the ratio of the eigenvalues associated with each direction, one can quantify a degree of anisotropy of the patterns in the image: if the two principal components carry substantially the same amount of information, the texture is homogeneous, whereas a large ratio between the eigenvalues identifies a privileged direction. Figure 1 illustrates this principle.
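To make this step concrete, here is a minimal Python sketch of the orientation estimation and compensation (NumPy/SciPy); the function names, the energy-weighting scheme and the choice of reference axis are our own illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy import ndimage

def estimate_orientation(patch):
    """Dominant texture orientation (degrees) and eigenvalue ratio of the spectrum PCA."""
    # Centred magnitude spectrum; the mean is removed so the DC peak does not bias the PCA.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
    h, w = spec.shape
    v, u = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    coords = np.stack([u.ravel(), v.ravel()])            # frequency coordinates
    weights = spec.ravel()
    cov = (coords * weights) @ coords.T / weights.sum()  # energy-weighted covariance (PCA)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]               # most energetic spectral direction
    angle = np.degrees(np.arctan2(major[1], major[0]))
    anisotropy = eigvals.max() / max(eigvals.min(), 1e-12)
    return angle, anisotropy

def reorient(patch, angle):
    """Rotate the patch so that its dominant direction is mapped onto a fixed reference axis."""
    # A rotation in the Fourier plane corresponds to the same rotation in the image plane;
    # the sign convention may need adjusting depending on the image origin.
    return ndimage.rotate(patch, angle, reshape=False, mode="reflect")
```

The anisotropy ratio returned here is the coarse descriptor used in the text to separate oriented from homogeneous textures.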


Fig. 1. Orientation compensation. (a): an oriented texture image; (b) its Fourier spectrum, where a privileged orientation clearly appears; (c) the orientation-compensated image.

We validated this approach on a set of 60 oriented textures and 80 homogeneous textures. All the oriented textures were correctly reoriented. In addition, the ratio of the eigenvalues enabled us to discriminate the oriented textures from the homogeneous ones. This coarse description can then immediately be used to discriminate changes between homogeneous and oriented textures. Let us now turn to the texture description aspect of our approach.

3. TEXTURE DESCRIPTION

Roughly, common descriptors for texture characterization are issued from statistics, geometry, models or from filtering theory (where the image is decomposed through a bank of filters). The Gabor filter [12] and the decomposition using wavelets [13] are the most accurate techniques issued from filter banks and have successfully been applied to texture analysis [5, 6, 7, 8]. For a complete review on texture classification, the reader can refer to [7].

In this paper, we propose to characterize the texture by the distribution of the coefficients related to each sub-band of the wavelet decomposition of the image. It is indeed reasonable to consider that a texture is characterized by its wavelet coefficients [14, 15]. Noting f_0 the texture image, we have:

f_0 = \sum_n f_{J,n}\,\phi_{J,n} + \sum_{j=-J}^{-1}\sum_n w_{j,n}\,\psi_{j,n}    (1)

for a wavelet decomposition of order J, where ψ is the analysis wavelet and φ the scaling function. The texture can then be represented by the sequence (f_{J,n}, w_{j,n}, -J ≤ j ≤ -1, n ∈ Z). In a given sub-band, S.G. Mallat verified that the Generalized Gaussian Density (GGD) is a reliable PDF approximation of the density of its wavelet coefficients [13]. The GGD reads:

p(x;\alpha,\beta) = \frac{\beta}{2\alpha\Gamma(1/\beta)}\,e^{-(|x|/\alpha)^{\beta}},    (2)

where \Gamma(t) = \int_0^{+\infty} e^{-z} z^{t-1}\,dz is the Gamma function, α is the scale and β the shape parameter. Hence a sub-band can be described by the two parameters (α, β). Several methods can be used to identify these parameters, such as the moments technique or the maximum-likelihood (ML) estimator. In our experiments, we observed that the first method did not give coherent results; we therefore preferred the ML estimator. The log-likelihood function is defined as L(x, α, β) = log ∏_i p(x_i; α, β) for the set of coefficients x = (x_1, ..., x_N). The parameters (α, β) are given by solving the following system [16]:

\frac{\partial L(\cdot)}{\partial\alpha} = -\frac{N}{\alpha} + \sum_{i=1}^{N}\frac{\beta\,|x_i|^{\beta}\,\alpha^{-\beta}}{\alpha} = 0,    (3)

\frac{\partial L(\cdot)}{\partial\beta} = \frac{N}{\beta} + \frac{N\,\Psi(1/\beta)}{\beta^{2}} - \sum_{i=1}^{N}\left(\frac{|x_i|}{\alpha}\right)^{\beta}\log\left(\frac{|x_i|}{\alpha}\right) = 0,    (4)

where L(\cdot) = L(x, \alpha, \beta) and \Psi(t) = \Gamma'(t)/\Gamma(t) is the digamma function. When β is fixed, equation (3) has the unique solution \hat\alpha = \left(\frac{\beta}{N}\sum_{i=1}^{N}|x_i|^{\beta}\right)^{1/\beta}. Substituting this relation into equation (4), \hat\beta is the root of:

1 + \frac{\Psi(1/\hat\beta)}{\hat\beta} - \frac{\sum_{i=1}^{N}|x_i|^{\hat\beta}\log|x_i|}{\sum_{i=1}^{N}|x_i|^{\hat\beta}} + \frac{\log\!\left(\frac{\hat\beta}{N}\sum_{i=1}^{N}|x_i|^{\hat\beta}\right)}{\hat\beta} = 0    (5)

which is solved numerically using the Newton-Raphson procedure, as presented in [17]. To obtain a faster convergence, the initial parameter β_0 of the iterative process is given by the moment technique. Hence, the texture of an object is characterized by the sequence (α_{j,Z}, β_{j,Z}), where -J ≤ j ≤ -1 is the analysis scale and Z = {H, V, D} corresponds to the horizontal, vertical or diagonal band.
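For illustration, here is a minimal Python sketch of this texture signature, assuming the PyWavelets package for the decomposition; equation (5) is solved with a bracketed root finder (Brent's method) rather than Newton-Raphson, and the bracket, wavelet and function names are our own assumptions, not choices stated in the paper.

```python
import numpy as np
import pywt
from scipy.optimize import brentq
from scipy.special import psi  # digamma function

def fit_ggd(coeffs, beta_bracket=(0.05, 5.0)):
    """Maximum-likelihood (alpha, beta) of a GGD fitted to a set of wavelet coefficients."""
    x = np.abs(np.asarray(coeffs, dtype=float).ravel())
    x = x[x > 0]                              # log|x_i| is undefined at zero
    n = x.size

    def eq5(beta):                            # left-hand side of equation (5)
        s = np.sum(x ** beta)
        return (1.0 + psi(1.0 / beta) / beta
                - np.sum(x ** beta * np.log(x)) / s
                + np.log(beta / n * s) / beta)

    beta_hat = brentq(eq5, *beta_bracket)     # assumes a sign change over the bracket
    alpha_hat = (beta_hat / n * np.sum(x ** beta_hat)) ** (1.0 / beta_hat)
    return alpha_hat, beta_hat

def texture_signature(patch, wavelet="db4", level=4):
    """(alpha, beta) for each detail sub-band H, V, D at every decomposition level."""
    coeffs = pywt.wavedec2(patch, wavelet, level=level)
    signature = {}
    # coeffs[0] is the approximation band; detail tuples are listed from coarsest to finest.
    for j, details in enumerate(coeffs[1:], start=1):
        for name, band in zip(("H", "V", "D"), details):
            signature[(j, name)] = fit_ggd(band)
    return signature
```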

4. CHANGE FEATURES VECTOR

To evaluate the differences between two distributions, we rely on the Kullback-Leibler distance. For two GGDs defined by (α_1, β_1) and (α_2, β_2), this similarity measurement reads [17]:

KL(p_1 \| p_2) = \log\!\left(\frac{\beta_1\,\alpha_2\,\Gamma(1/\beta_2)}{\beta_2\,\alpha_1\,\Gamma(1/\beta_1)}\right) + \left(\frac{\alpha_1}{\alpha_2}\right)^{\beta_2}\frac{\Gamma\big((\beta_2+1)/\beta_1\big)}{\Gamma(1/\beta_1)} - \frac{1}{\beta_1}.    (6)

This distance is not symmetric, and the similarity measure retained is KLS(p_1, p_2) = KL(p_1, p_2) + KL(p_2, p_1). Its value is zero for two identical distributions and progressively increases as the distributions differ. Dealing with the distances obtained for each band of the wavelet decomposition, the purpose of this section is to discriminate changes according to the intensities of the KLS errors related to the various scales and orientations of the decomposition. In the field of change analysis for remote sensing data, it is assumed that a change is characterized by its scale (small-scale, large-scale or both) and its intensity, and can in addition be subtle or abrupt. The intensity indicates the average similarity between the two textures; a change is subtle when the similarities in terms of distribution and/or scale remain close, and abrupt otherwise. To identify the nature of the observed change, we then propose to define a change vector D composed of five kinds of indicators:

Fig. 2. Abrupt change between two homogeneous textures: (a) image 1; (b) image 2; (c)-(f) histograms of the diagonal detail band at levels 1 to 4, with KLS = 0.1226, 0.3206, 0.7545 and 1.0916 respectively. The KLS amplitude grows with the analysis scale of a given detail band (diagonal here).

1. the average of the overall KLS: this discriminates an abrupt change from a subtle one;

2. the ratio between the dissimilarity of one coefficient and the sum of the others: this highlights the components that best measure the differences between the textures;

3. the standard deviation of the dissimilarities between the different scale levels for a same kind of detail (horizontal, vertical or diagonal): this detects changes occurring only in some scale levels;


4. the standard deviation of the dissimilarities of a same scale level across the three types of detail: this detects changes that occur only in a particular direction;

5. the degree of isotropy issued from section 2: this discriminates changes between homogeneous and oriented textures (a sketch assembling these five indicators is given below).
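The paper does not give explicit formulas for these indicators, so the following is only a plausible sketch of how they could be assembled from the per-band KLS values; the kls dictionary, keyed by (level, band), and the two anisotropy ratios are hypothetical inputs, not an interface from the paper.

```python
import numpy as np

def change_vector(kls, anisotropy_1, anisotropy_2):
    """kls: {(level, band): KLS value} with bands 'H', 'V', 'D'; anisotropy_* from section 2."""
    levels = sorted({lv for lv, _ in kls})
    bands = ("H", "V", "D")
    values = np.array([kls[(lv, b)] for lv in levels for b in bands])

    d1 = values.mean()                                             # 1. overall change intensity
    d2 = [kls[k] / (values.sum() - kls[k]) for k in sorted(kls)]   # 2. each coefficient vs. the others
    d3 = [np.std([kls[(lv, b)] for lv in levels]) for b in bands]  # 3. spread across scale levels
    d4 = [np.std([kls[(lv, b)] for b in bands]) for lv in levels]  # 4. spread across orientations
    d5 = (anisotropy_1, anisotropy_2)                              # 5. isotropy degrees (section 2)
    return d1, d2, d3, d4, d5
```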

All these components of the change vector make it possible to interpret different configurations of change. For instance, a large-scale change will exhibit dissimilarities in the detail bands that increase as the scale grows, as illustrated for diagonal details in Figure 2. In this particular example, the 3rd component of D is exploited to exhibit the change. To validate the change measure proposed by the vector D, the following section presents two applications that use this metric.

5. APPLICATIONS

In this section, two applications of the change features vector proposed in the previous section are presented. The first application aims to characterize changes on all pairs of images issued from a benchmark of 2400 agricultural parcels extracted from airborne images with a spatial resolution of 50 cm. We identified six types of textures that correspond to different land uses (see Figure 3).

Fig. 3. Examples of the six kinds of textures from the benchmark of the first application: cereals, sprayed corn, corn, sowed meadow, meadow, forest.

In addition, five classes of changes are possible:

1 - large-scale, abrupt and oriented change (e.g. transition from cereals to a meadow, sowed or not);
2 - large-scale, subtle and oriented change (e.g. transition from a meadow to a sowed meadow);
3 - change between two homogeneous textures with the same scale (e.g. transition from corn to meadow);
4 - large-scale and small-scale, abrupt and oriented change (e.g. transition from sprayed corn to meadow);
5 - large-scale change between two homogeneous textures (e.g. transition from a meadow to a forest).

The classification of changes is performed using decision trees. Roughly, the principle is to decompose the classification problem into a series of tests that partition the data space into sub-regions that are homogeneous in terms of classes. The different tests are represented by nodes and the major classes are represented by leaves. For more details on decision trees and their construction, we refer to [18, 19]. In the presented example, a 20-node tree was used, and the metric that compares textures was the vector D.

For each class, 30% of the samples are used to build the tree and 70% for the evaluation. Table 1 reports the rate of good detection for each of the five types of changes previously mentioned. As one can observe, the rate of good detection is very high. In addition, we have reported the most discriminative components of D for each change. This experiment on a real benchmark demonstrates the ability of the proposed change vector to accurately discriminate the textures and to analyze their changes.

Type of change | Rate of good detection | Main components of D
No change      | 94%                    | 1-2-3
1              | 96%                    | 1
2              | 95%                    | 4
3              | 90%                    | 5-3
4              | 91%                    | 5-2
5              | 97%                    | 2-3

Table 1. Rates of good detection and main components of D exploited for the change classification on the agricultural benchmark.
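As a hedged illustration of this protocol, the benchmark experiment could be reproduced along the following lines with scikit-learn; X (flattened change vectors D) and y (change classes) are assumed inputs, and the 20-node constraint of the paper is only approximated here through max_leaf_nodes.

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def classify_changes(X, y, seed=0):
    """Train on 30% of each class (stratified split), evaluate on the remaining 70%."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=0.3, stratify=y, random_state=seed)
    tree = DecisionTreeClassifier(max_leaf_nodes=20, random_state=seed)
    tree.fit(X_train, y_train)
    return tree, tree.score(X_test, y_test)   # overall rate of good detections
```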

The second example is a real situation. The goal is to detect changes from a pair of images. The first one is a scanned aerial photograph dating from 1970; its noise level is high and its spatial resolution quite low (≈ 3 m). The second image is issued from the Quickbird satellite and was acquired in 2003 with a resolution of 80 cm. We are interested in three kinds of particular changes: 1 - "subtle agricultural changes" (changes in the type of crop); 2 - "abrupt changes" (from cropland to forest, farms, orchard, or inversely); 3 - new roads. To identify these changes, the 3rd and 4th components of D (changes only in some scales and oriented changes) were used to isolate class 1. Class 2 was detected with the 2nd and 3rd components (scale changes associated or not with subtle changes) and class 3 with the 2nd and 4th components (scale and oriented changes). Results are shown in Figure 4. One can observe that the different changes have been correctly identified, despite the fact that the input images were very different in terms of quality and resolution.
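These component combinations suggest a simple rule-based labelling; the thresholds and the priority order below are entirely hypothetical, only the mapping between classes and components of D comes from the text.

```python
def label_change(d, t_ratio=0.5, t_scale=0.5, t_orient=0.5):
    """d is a change vector as returned by the change_vector sketch above."""
    d1, d2, d3, d4, d5 = d
    ratio_effect = max(d2) > t_ratio      # component 2: one sub-band dominates
    scale_effect = max(d3) > t_scale      # component 3: change confined to some scales
    orient_effect = max(d4) > t_orient    # component 4: change confined to a direction
    if scale_effect and orient_effect:
        return "subtle agricultural change"   # class 1: components 3 and 4
    if ratio_effect and scale_effect:
        return "abrupt change"                # class 2: components 2 and 3
    if ratio_effect and orient_effect:
        return "new road"                     # class 3: components 2 and 4
    return "no change"
```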

Fig. 4. Change detection. (a)-(b): input images (1970 and 2003); (c): the change map. Yellow: change of crops; blue: abrupt changes; red: new roads.

6. CONCLUSION

In this article we have proposed a new measure for change detection and analysis in pre-segmented VHR images. This measure is based on the comparison of textures, through the differences between the distributions of coefficients issued from a wavelet decomposition of the images. A preliminary step devoted to the reorientation of objects has allowed us to be less sensitive to the global orientation of textures. The vector describing the observed changes is a combination of the dissimilarities between distributions at each level of the decomposition of the images. We have successfully validated our method by classifications based on decision trees and on a real example. As a perspective, we plan to validate this methodology on other textures (especially on urban areas) and to exploit this measurement to jointly segment the image and analyze the changes.


7. REFERENCES

[1] P. Coppin, I. Jonckheere, and E. Lambin, "Digital change detection methods in ecosystem monitoring: A review," International J. of Rem. Sens., vol. 25, pp. 1565-1596, 2004.

[2] R.M. Haralick, "Statistical and structural approaches to texture," Proceedings of the IEEE, vol. 67, no. 5, pp. 786-804, 1979.

[3] T.R. Reed and J.M.H. du Buf, "A review of recent texture segmentation and feature extraction techniques," CVGIP: Image Understanding, vol. 57, pp. 359-372, 1993.

[4] M. Tuceryan and A.K. Jain, The Handbook of Pattern Recognition and Computer Vision (2nd Edition), chapter Texture Analysis, pp. 207-248, C.H. Chen, L.F. Pau, and P.S.P. Wang (Eds.), 1998.

[5] J.F. Aujol, G. Aubert, and L. Blanc-Féraud, "Wavelet-based level set evolution for classification of textured images," IEEE Trans. Im. Proc., vol. 12, no. 12, pp. 1634-1641, 2003.

[6] A. Laine and J. Fan, "Texture classification by wavelet packet signatures," IEEE Trans. Pat. Anal. Mach. Intell., vol. 15, no. 11, pp. 1186-1191, 1993.

[7] T. Randen and J.H. Husoy, "Filtering for texture classification: A comparative study," IEEE Trans. Pat. Anal. Mach. Intell., vol. 21, no. 4, pp. 291-310, Apr. 1999.

[8] P. Scheunders, S. Livens, G. van de Wouwer, P. Vautrot, and D. Van Dyck, "Wavelet-based texture analysis," Journal of Computer Science and Information Management, Special issue on Image Processing, 1998.

[9] B. Luo, J.-F. Aujol, Y. Gousseau, and S. Ladjal, "Indexing of satellite images with different resolutions by wavelet features," IEEE Trans. Im. Proc., vol. 17, no. 8, pp. 1465-1472, 2008.

[10] W.T. Freeman and E.H. Adelson, "The design and use of steerable filters," IEEE Trans. Pat. Anal. Mach. Intell., vol. 13, no. 9, pp. 891-906, 1991.

[11] C. Germain, J.P. Da Costa, O. Lavialle, and P. Baylou, "Multiscale estimation of vector field anisotropy. Application to texture characterization," Sig. Proc., vol. 83, pp. 1487-1503, 2003.

[12] D. Gabor, "Theory of communication," IEE Proceedings, vol. 93, no. 26, 1946.

[13] S.G. Mallat, "Multiresolution approximations and wavelet orthonormal bases of L2(R)," Trans. Amer. Math. Soc., vol. 315, pp. 69-87, 1989.

[14] M. Unser, "Texture classification and segmentation using wavelet frames," IEEE Trans. Im. Proc., vol. 4, no. 11, pp. 1549-1560, 1995.

[15] S.G. Mallat, A Wavelet Tour of Signal Processing, Academic Press, 1998.

[16] M.K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," J. Acoust. Soc. Amer., vol. 86, pp. 1404-1415, 1989.

[17] M. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Trans. Im. Proc., vol. 11, no. 2, pp. 146-158, 2002.

[18] L. Breiman, J. Friedman, R. Olshen, and C. Stone, Classification and Regression Trees, Wadsworth and Brooks, Monterey, CA, 1984.

[19] M.A. Friedl and C.E. Brodley, "Decision tree classification of land cover from remotely sensed data," Rem. Sens. of Env., vol. 61, pp. 399-409, 1997.