Amatulli updates

2014-02-28

What I did the past weeks

- albedo parameter: the MODIS Albedo product (http://modis-atmos.gsfc.nasa.gov/ALBEDO/index.html) is incorporated into the model.
- coefbh parameter (cloud): the Monthly Cloud Frequency (Adam product) is incorporated into the model.
- coefdh parameter (haze): the MODIS Aerosol product (http://modis-atmos.gsfc.nasa.gov/MOD08_M3/index.html) is incorporated into the model.

Full repository in root/terrain/procedures/dem_variables/gmted2010_rad/scripts_r_sun@ga/terrain
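As a sketch, the three parameters above map onto r.sun options in GRASS 6.4 roughly as follows. This is only an illustration of the wiring, not the actual command from the repository; all map names (gmted2010, alb_01, kbh_01, kdh_01, rad_01) are placeholders.

```shell
# Hypothetical GRASS 6.4 r.sun call with the three monthly inputs wired in.
# Map names are placeholders; day=15 stands for a mid-month Julian day.
cmd='r.sun elevin=gmted2010 albedo=alb_01 coefbh=kbh_01 coefdh=kdh_01 day=15 glob_rad=rad_01'
echo "$cmd"
```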

What I'm working on now
  • Perform a validation procedure at monthly level using ground solar radiation stations.
What obstacles are blocking progress
  • I'm going to migrate the full computation from Litoria to Yale-HPC and/or NEX.
  • Up to now I'm using r.sun in GRASS 6.4.3, but I'm going to move to r.sun in GRASS 7, so a full or partial GRASS 7 installation needs to be done on the HPC and/or NEX.
What's next
  • Improve the processing chain and calibrate the model based on the validation results.

2013-11-04

What I did the past weeks
  • Incorporated Greenland at the 7.5 arc-sec resolution and recalculated all the topographic variables.
    Improved the tile splitting and merging to avoid border effects around the tiles. Scripts stored at root/terrain/procedures/dem_variables/gmted2010_res_x10@e1aa0df
What I'm working on now
  • Reading articles concerning solar radiation.
What obstacles are blocking progress
  • There are no major technical issues right now.
What's next
  • Try a simple approach to calculating solar radiation.

2013-10-22

What I did the past weeks
  • Created ancillary layers (number of observations for the 30 arc-sec resolution) to improve the quality/accuracy of the derived topographic variables.
    Created a geo-referenced lat-long 30 arc-sec GeoTIFF grid that can be used as a reference for the production of environmental variables (more info at https://projects.nceas.ucsb.edu/nceas/documents/175).
What I'm working on now
  • Antarctica and Greenland are available only at 30 arc-sec resolution. I'm incorporating them into the 7.5 arc-sec dataset via resampling.
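A minimal sketch of such a resampling step with gdalwarp (file names and the bilinear method are assumptions; 7.5 arc-sec is 7.5/3600 ≈ 0.00208333 degrees):

```shell
# Hypothetical gdalwarp call: 30 arc-sec source resampled onto the 7.5 arc-sec grid.
# Input/output names and the resampling method are placeholders.
cmd='gdalwarp -tr 0.00208333 0.00208333 -r bilinear greenland_30s.tif greenland_7_5s.tif'
echo "$cmd"
```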
What obstacles are blocking progress
  • Antarctica and Greenland were not available during the government shutdown. Now the GMTED server is back and I could download the data.
What's next
  • Start to read how to derive solar radiation using GMTED2010 and ancillary layers (suggestions are welcome).

2013-10-08

What I did the past weeks
  • Completed the processing chain of GMTED2010 to process the topographic variables on the High Performance Computing (HPC) clusters.
    Several topographic variables have been produced (description here).
What I'm working on now
  • Checking the results and identifying a nomenclature for file labeling.
What obstacles are blocking progress
  • There are no major technical issues right now.
What's next
  • Start to read how to derive solar radiation using GMTED2010 and ancillary layers (suggestions are welcome).

2013-09-24

What I did the past weeks
  • I was adjusting several scripts to be able to process the topographic variables on the High Performance Computing (HPC) clusters.
    The lack of GIS/RS libraries on the HPC slowed down the job. Now the GDAL and PKTOOLS libraries are installed and I can resume processing the GMTED2010.
What I'm working on now
  • Using the GMTED2010 (Median Statistic, 7.5 arc-seconds), I'm producing areal percent values above specific thresholds (every 100 m).
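As a toy illustration of the percent-above-threshold idea (the five elevation values below are made up; the real computation runs on the GMTED2010 rasters):

```shell
# Five sample elevations (m); share of values at or above a 1000 m threshold.
printf '%s\n' 120 450 980 1500 2200 > elev.txt
pct=$(awk -v t=1000 '$1>=t{c++} END{printf "%d", 100*c/NR}' elev.txt)
echo "$pct"   # 2 of 5 values >= 1000 m -> 40
```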
What obstacles are blocking progress
  • There are no major technical issues right now.
What's next
  • Moving forward on full processing chain of GMTED2010.

2013-08-26

What I did the past weeks
  • Due to some issues with the data storage I temporarily stopped processing topographic variables.
    I concentrated on creating generic scripts that can be used in several processes. These scripts, intersect.py and addattr-area.py, can be used in data flows under a BASH/Python environment. I will "push" them to the git repository as soon as we restore all the backups.
    Moreover, I was simulating case studies to test short scripts for retrieving point/polygon information from raster layers.
    I was using gdallocationinfo, pkextract, and oft-stat, tools written in C/C++ and therefore very fast.
What I'm working on now
  • I'm working on setting up a Linux course for grad students. For this purpose I will create and keep updated a Linux Virtual Machine with all the material and software needed.
What obstacles are blocking progress
  • As soon as the data storage is restored I will resume creating topographic variables.

2013-08-12

What I did the past week
  • The results of the previous week were merged using the script sc3a_dem_variables_merge.sh, which mainly uses gdal_merge.py to merge the tiles produced by sc2a_dem_variables.sh. The script works well and fast.
    Created colored PDFs for fast visualization (the PDFs will soon be stored in the wiki).
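The merge step amounts to one gdal_merge.py invocation per variable. A sketch, with placeholder file names for the per-tile outputs and an assumed compression option:

```shell
# Hypothetical mosaic of per-tile outputs into one global raster.
cmd='gdal_merge.py -o roughness_global.tif -co COMPRESS=DEFLATE roughness_tile_*.tif'
echo "$cmd"
```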
What I'm working on now
  • Trying to get all the libraries and software working on the Omega cluster. Unfortunately the previous installation was not working correctly.
What obstacles are blocking progress
  • We still have problems with the /data2 hard disk. This issue has slowed down the whole processing chain, and the backup and copy transfers to other servers take time.
What's next
  • I consider the topographic variables obtained from the GMTED2010 the best results so far. The consistency of the data produces continuous surfaces with no artifacts.
    An additional layer could be the total solar radiation (r.sun in GRASS).

2013-08-05

What I did the past week
  • Using the Median Statistic and Systematic Subsample layers of the GMTED2010 7.5 arc-seconds, I calculated topographic variables using the script sc2a_dem_variables.sh.
    The script mainly uses the gdaldem command to calculate the variables and pkfilter to aggregate them (median, mean, max, min, stdev) at 1 km resolution (16 pixels of the GMTED2010).
    The script works well and fast.
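The 16-to-1 aggregation can be pictured with a toy example: the 16 values below stand in for one 4x4 block of 7.5 arc-sec pixels collapsing into a single 1 km pixel (the real work is done by pkfilter; the values are invented):

```shell
# Mean and max of one made-up 4x4 block -> one 1 km output pixel.
stats=$(printf '%s\n' 10 12 11 13 9 10 12 14 11 13 10 12 9 11 13 10 |
  awk '{s+=$1; if(NR==1||$1>mx)mx=$1} END{printf "mean=%.2f max=%d", s/NR, mx}')
echo "$stats"   # mean=11.25 max=14
```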
What I'm working on now
  • Analyzing and testing the results, cross-checking them against the topographic variables obtained from the EARTHENV-DEM90. Creating colored PDFs for fast visualization.
What obstacles are blocking progress
  • I'm trying to solve an issue on the Acrobates server. After a reboot, the 3 partitions on the 40 TB disk are not visible from Acrobates; I'm reading forums to find a solution.
What's next
  • Moving forward on other topographic variables derived from GMTED2010.

2013-07-29

What I did the past week
  • Downloaded the GMTED2010 7.5 arc-seconds DEM and tiled it into 50 tiles. Script: sc1_wget_gmted2010.sh
  • Used the Minimum Statistic and Maximum Statistic layers to produce areal percent values above specific thresholds. Obtained the areal percent above the minimum and above the maximum every 100 meters (from -500 to 8700 m). These two groups of layers are useful to obtain minimum and maximum percentages for an altitudinal range. Scripts: sc2_class_treshold_percent.sh and sc3_class_treshold_density_merge.sh. The script sc2_class_treshold_percent.sh mainly uses pkfilter to calculate the percent for each pixel. The script works well and fast, and can be used for other datasets if some parameters are changed (e.g. window size).
    The data are stored as Byte, have a metadata description and a gray color table attached, are located at /mnt/data2/dem_variables/GMTED2010/altitude/percent_class_{mi,mx}/*.tif, and are ready to be used.
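The threshold list itself is easy to reproduce: steps of 100 m from -500 to 8700 give 93 classes, each of which would drive one percent-above run:

```shell
# Generate the 100 m thresholds; one line per threshold class.
seq -- -500 100 8700 > thresholds.txt
head -2 thresholds.txt    # first two thresholds: -500, -400
grep -c '' thresholds.txt # 93 thresholds
```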
What I'm working on now
  • Analyzing and testing which of the layers (Median/Mean Statistic, Systematic Subsample, Breakline Emphasis) can be used to calculate topographic variables. I will calculate the roughness index for each one and compare it with the ones obtained from ASTER and SRTM, of course in zones with no ASTER/SRTM noise.
What obstacles are blocking progress
  • There are no major technical issues right now.
What's next
  • Moving forward on the full processing chain of GMTED2010 and hoping that this dataset will give better results.

2013-07-22

What I did the past week
  • Downloaded DEMs from http://www.viewfinderpanoramas.org/dem3.html and tiled them under the same tiling system as the EARTHENV-DEM90. Script: sc1_wget_panoramas.sh
  • Produced topographic variables in TIF format. Script: sc2_dem_variables.sh
    All the scripts run in grid-processing mode using xargs. The code is very fast and can be applied to other datasets; I will use the same structure for other datasets, slightly adapting it to each dataset and directory layout.
  • Produced topographic variables in PDF format for fast visualization.
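The xargs grid-processing pattern reduces to a few lines. The tile names and the dummy echo job below are invented; the real scripts pass DEM tiles to the processing command:

```shell
# Run one dummy job per "tile", up to 4 processes in parallel.
printf 'tile_%02d\n' 1 2 3 4 5 6 7 8 > tiles.txt
xargs -P4 -I{} sh -c 'echo "{} done"' < tiles.txt > done.txt
grep -c done done.txt   # 8 tiles processed
```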
What I'm working on now
What obstacles are blocking progress
  • There are no major technical issues right now, but the www.viewfinderpanoramas.org data are not suitable for deriving topographic variables (see roughness_median_pan.pdf).
  • ASTER seems better processed, but SRTM presents several stripes (see Africa). Moreover, the ASTER-SRTM merging zone is clearly visible.
What's next
  • Moving forward on the full processing chain of GMTED2010 and hoping that this dataset will give better results.
  • The GMTED2010 has been generated at three separate resolutions: 30 arc-seconds, 15 arc-seconds, and 7.5 arc-seconds. This new product suite provides mean elevation, median elevation, standard deviation of elevation, systematic subsample, and breakline emphasis (more info at http://pubs.usgs.gov/of/2011/1073/pdf/of2011-1073.pdf).
    For our purposes the statistically derived datasets (mean, median, standard deviation, etc.) can be used as a sort of ancillary layer to calculate potential values of the topographic variables. I will investigate this idea further.
  • Therefore I will first produce the topographic variables using the mean and median elevation; then, based on visual interpretation (presence of stripes and artifacts), I will consider how to proceed.
  • The GMTED2010 is derived from different data sources, and the data merging may produce artifacts.