where to find the average rate of each year
May 10, 2015 at 1:05 am #856
in the process of sparse geocoding, I selected DBF as the file type. Then I got a .csv file, in which I can see many numbers. But I can't find the average rate of each year in it. I want to know where I can find some information about the average rate of each year, or whether I have to calculate this number myself.
May 10, 2015 at 1:44 am #863
first of all, thanks for the question: it led me to discover an obsolete piece of code.
DBF is in fact an old option that was supposed to be replaced by simple CSV files.
However, I discovered now that the CSV is produced only if you check the “time series” option.
If you geocode the DBF without checking the “time series” option, a DBF file is still produced.
DBF generation works only under Windows: I am definitely going to remove it.
In the DBF file that is still produced, you get a table with 4 columns: lat, lon, height and the parameter you have chosen to geocode.
So, in the DBF you will find the velocity only if you have chosen it as the parameter to geocode.
If, on the contrary, you check the “time series” option, you will get many parameters, and the velocity is included among them.
Be careful when you export time series: when you check that option, the sw will open the sparse point processing window and you have to select the processing options you want to apply to your time series. The sw is not going to estimate anything, but it needs to know e.g. which atmosphere you want to remove, which model to apply for the movement, whether to remove the thermal component, and so on.
To conclude, you talk about an “average rate for each year”: at the moment this is not included in Sarproz. Only a single velocity is estimated. And if you choose a non-parametric model for the time series, no velocity is calculated at all.
We will anyway consider including, as you say, an “average rate for each year” in the future.
For the moment, you can generate a CSV with the time series and calculate it yourself (in Matlab or in Excel)…
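As a sketch of that do-it-yourself calculation (here in Python rather than Matlab/Excel; the date/displacement pairs and column layout are hypothetical, not the exact SARPROZ CSV format), one can group the exported time series by calendar year and fit a least-squares slope per year:

```python
# Minimal sketch: average rate per calendar year from a geocoded time series
# exported as (YYYYMMDD, displacement) pairs. The data layout is an assumption;
# adapt the parsing to the actual columns of your CSV export.
from datetime import date
from collections import defaultdict

def yearly_rates(series):
    """series: list of (YYYYMMDD string, displacement in mm).
    Returns {year: rate in mm/year} from a per-year least-squares line fit."""
    by_year = defaultdict(list)
    for d, disp in series:
        day = date(int(d[:4]), int(d[4:6]), int(d[6:8]))
        # time expressed in (fractional) years so the slope is mm/year
        by_year[day.year].append((day.toordinal() / 365.25, disp))
    rates = {}
    for year, pts in by_year.items():
        if len(pts) < 2:
            continue  # a slope needs at least two samples in that year
        n = len(pts)
        mx = sum(t for t, _ in pts) / n
        my = sum(y for _, y in pts) / n
        num = sum((t - mx) * (y - my) for t, y in pts)
        den = sum((t - mx) ** 2 for t, _ in pts)
        rates[year] = num / den
    return rates

# Toy example: steady subsidence of about -10 mm/year
demo = [("20120101", 0.0), ("20120701", -5.0), ("20130101", -10.0),
        ("20130701", -15.0), ("20140101", -20.0)]
print(yearly_rates(demo))
```

With only one 2014 sample, 2014 is skipped; 2012 and 2013 each come out close to -10 mm/year.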
One last note: we are going to update the sparse geocoding module soon, making it more efficient and adding more functions. You'll get it in the next 1 or 2 months.
May 10, 2015 at 2:21 am #866
the attachment is what I got; it has so many parameters. The values of 5 columns are zero, as you can see in the attachment. You said: “when you check it, the sw will open the sparse point processing window and you have to check the processing options you want to apply on your time series.” I often run into this situation, but I don't know what I should do with the sparse point processing window. Can you give me an example? Thank you very much.
May 10, 2015 at 2:41 am #868
Let’s make an example:
you have processed your dataset, and you got the following outputs (after APS and sparse points processing):
– Integrated Residuals APS
– height and velocity for each PS
So, you go to sparse point geocoding and the sw opens up the “sparse points processing” window asking you to input the processing options. Then you have to:
1. select IR APS
2. check height and velocity in the “read” column
By the way, if you geocoded your results right after “sparse point processing”, the sw would use the same options for generating the time series. If the sw opens up a new “sparse point processing” window, it's because you closed a session and opened a new one, so the sw lost the info on what you did before.
May 10, 2015 at 2:57 am #869
yeah, I exited the software, and the next time I opened it to do the sparse geocoding, the sparse point processing window was empty. What should I do? Do I need to do the sparse point processing again and then export the time series? Thank you.
June 23, 2015 at 8:18 am #1014
I have a doubt regarding the fields YYYYMMDD (created with the time series option), which seem to show displacement values. Do these values refer to a cumulative displacement (i.e. the cumulative displacement at date 20010721) or to the single displacement registered for that acquisition (in that case, with respect to what or when? The master, the initial date?)? What is the relation between these values and the field CUMULDISP?
June 23, 2015 at 8:32 am #1015
The time series in the site processing are referred to the first acquisition.
In the small area processing you can choose whether to refer them to the first acquisition or to the Master one.
However, the Master acquisition may be noisy, so the sw estimates an error and removes it (therefore the value at the master may not be zero).
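A toy illustration of the re-referencing described above (my own sketch, not SARPROZ code): a displacement series can be referred to any acquisition simply by subtracting that acquisition's value from every sample.

```python
# Re-reference a displacement time series to a chosen acquisition.
# ref_index=0 refers the series to the first acquisition; any other
# index (e.g. the master's position) works the same way.
def rereference(series, ref_index=0):
    """Shift a displacement series so that series[ref_index] becomes 0."""
    ref = series[ref_index]
    return [v - ref for v in series]

ts = [2.0, -3.5, -7.0, -12.5]      # mm, arbitrary reference
print(rereference(ts, 0))          # referred to the first acquisition
print(rereference(ts, 2))          # referred to, say, the master acquisition
```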
The cumulative displacement is the difference between the last and the first acquisitions.
As mentioned before, when we revise the time series export module, we will include more options.
Tell me if you find any contradictions.
June 23, 2015 at 9:08 am #1016
Then, shouldn't the cumulative displacement be equal to the displacement for the last acquisition?
I give you a concrete example:
20120803 (First Acquisition): 0.00
20150112 (Last Acquisition): -14.53 ----> Is this the displacement of 20150112 with respect to 20120803?
June 23, 2015 at 10:37 am #1017
sorry, you are right.
The cumulative displacement is calculated using the model you chose for the time series analysis.
This reduces possible noise.
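A sketch of that idea (my reading of the explanation, not the actual SARPROZ internals; a linear model is assumed here just for illustration): fit the chosen model to the noisy series, then take the cumulative displacement from the model endpoints rather than from the raw first/last samples.

```python
# Cumulative displacement from a fitted model instead of raw endpoints.
# Fitting suppresses the noise of the individual first/last acquisitions.
def linear_fit(t, y):
    """Least-squares line fit; returns (slope, intercept)."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    slope = (sum((a - mt) * (b - my) for a, b in zip(t, y))
             / sum((a - mt) ** 2 for a in t))
    return slope, my - slope * mt

def cumulative_displacement(t, y):
    """model(last) - model(first), using the fitted linear model."""
    slope, intercept = linear_fit(t, y)
    return (slope * t[-1] + intercept) - (slope * t[0] + intercept)

# Noisy samples around a roughly -12 mm/year trend over 2 years
t = [0.0, 0.5, 1.0, 1.5, 2.0]           # years since first acquisition
y = [0.3, -6.4, -11.8, -18.2, -23.9]    # mm, with noise
print(cumulative_displacement(t, y))    # close to -24, not simply y[-1] - y[0]
```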
March 2, 2016 at 8:27 am #1532
I'm asking again about the YYYYMMDD fields, but this time in the .csv file created from the Small Area processing.
From what I see, it seems that they contain the single displacement for each date instead of the cumulative one (in the field corresponding to the master image everything is 0, which is why I think this is the case).
In which units are these values given?
And, again, what is the relation between them and the CUMDISP field?
I attach the csv file so that you see what I’m reading.