Normalization in change detection? Yes or no?

This topic contains 1 reply, has 2 voices, and was last updated by periz 1 year ago.

  • Author
  • #2909


    Normalization constrains the amplitude values of the product to a well-defined range, correct?

    For long-term change analysis, does this result in any loss of information? Does it facilitate comparison between images acquired at different times? Is there any benefit to leaving it off?

    The tutorials available on the site all seem to have normalization activated.

  • #2910


    If you are talking about the normalization option that you can find here: (sorry, I noticed we have not yet included a module description in the manual, but you can find several tutorials on it…) the idea is the following:
    change = filter[(A1 − A2)/(A1 + A2)]
    The normalization allows detecting any change, not only those affecting high amplitude values. What actually drives the detection of a change is spatial correlation (implemented via a Wiener filter).
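    To make the idea concrete, here is a minimal sketch of that two-image scheme in Python with NumPy/SciPy. The simulated Rayleigh amplitudes, image size, and the use of `scipy.signal.wiener` as the spatial filter are my assumptions for illustration, not the actual module's implementation:

    ```python
    import numpy as np
    from scipy.signal import wiener

    rng = np.random.default_rng(0)

    # Two simulated SAR amplitude images (Rayleigh-distributed speckle).
    a1 = rng.rayleigh(scale=1.0, size=(64, 64))
    a2 = rng.rayleigh(scale=1.0, size=(64, 64))
    a2[20:30, 20:30] *= 4.0  # hypothetical change: amplitude jump in one block

    # Normalized difference: bounded in [-1, 1], so changes at low
    # amplitudes are weighted the same as changes at high amplitudes.
    ratio = (a1 - a2) / (a1 + a2)

    # Spatial correlation via a Wiener filter: speckle noise is suppressed,
    # while spatially coherent change patches survive.
    change = wiener(ratio, mysize=5)
    ```

    Without the normalization, the raw difference `a1 - a2` would be dominated by bright scatterers, and subtle changes in dark areas would be missed.
    
    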

    This concept is based on having 2 images. With only 2 images, you can find signals only by looking at spatial correlation.
    If, on the contrary, you have many images, you do not need to rely on spatial statistics: you can extract a single pixel and observe its pattern in time.
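    The single-pixel, many-image case can be sketched like this. The simulated series, the step-change date, and the split-and-compare-means detector are illustrative assumptions, not the module's actual time-series algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated amplitude time series for one pixel: 40 acquisition dates,
    # with a hypothetical step change at date index 25.
    t = 40
    series = rng.rayleigh(scale=1.0, size=t)
    series[25:] += 2.0

    # Naive temporal change detection: split the series at each candidate
    # date and pick the split that maximizes the before/after mean gap.
    diffs = [abs(series[:k].mean() - series[k:].mean()) for k in range(5, t - 5)]
    k_best = int(np.argmax(diffs)) + 5  # estimated change date
    ```

    With a long enough stack, no spatial filtering is needed at all: the temporal pattern of a single pixel is sufficient to localize the change.
    
    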
    The automatic analysis of amplitude time series is implemented here:
    And here you find an example/exercise:
    If you want to visualize a combination of spatial/temporal statistics by selecting pixels one by one, use this:
    We don’t yet have an exercise showing how to use it, but we can show you one day if you’d like.
