Friday, June 10, 2011

Niwot SIPNET soil moisture issues

For a while we've been noticing that SIPNET, optimized on NEE and ET at Niwot, has been running with very high soil moisture levels. They were so high that we weren't seeing some of the water limitations on GPP that we should expect, given field observations. The slide show below shows how we went about trying to address this issue.

The gist is that by optimizing while holding RdConst and WaterRemovalFrac fixed, SIPNET is able to tightly replicate a decade of Niwot NEE data, while keeping leaf and wood carbon pools fairly stable, other parameters well-behaved, and soilWetness in a realistic range.

In optimizations where these two parameters were allowed to vary, SIPNET usually estimated RdConst ~1400 and WaterRemovalFrac ~0.001. Sensitivity testing and empirical studies suggest they should instead be fixed closer to RdConst ~25 and WaterRemovalFrac ~0.15.
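For anyone wanting to reproduce the setup, the fix is simply to mark the two parameters as not estimable before running 'estimate'. As a rough illustration only - I am assuming here the usual dot.param layout in which each line gives the parameter name, its value, a flag telling the estimation routine whether it may vary the parameter, and then the prior range and sigma; check the column order in your own file - the two lines would look something like:

RdConst            25     0   <min>  <max>  <sigma>
WaterRemovalFrac   0.15   0   <min>  <max>  <sigma>

with the flag set to 0 so the optimizer holds them at the empirical values.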

Hope this is helpful and let me know if you have comments!
(Laura)

Thursday, March 31, 2011

The New SIPNET repository

We are putting together a new release of the SIPNET model - I say we ... but I really mean ... John is doing the hard work of getting the model code merged and up on a new server.

While we are doing this we are looking for extra data files which could be 'SIPNETized' for each of the sites we have been working on. These data could be stored and archived in the new repository. We are imagining a folder called SITES with a subfolder for each site, holding the SIPNET files and any ancillary data which could be used for parameterization.

Proposed folder structure (example for Niwot)

TRUNK
>SIPNET
>>SITES
>>>NIWOT (would contain the driver data for the site, e.g. niwot.dat, niwot.clim, etc.)
>>>>ANCILLIARY_RAW (could contain files used to estimate initial conditions, parameters, etc. - e.g. an Excel file containing sapflow data or a text file listing papers used to estimate the parameters)
>>>>OLDFILES

Thursday, August 12, 2010

Spatial SIPNET - SPACENET?

So a few years ago Dr. Bill Sacks (clever man that he is) incorporated a spatial component into SIPNET. The implementation for the user is pretty straightforward.

Assuming that you can run SIPNET for a single site, all you need to do is create a climate file and, if needed, a parameter file for each location you want to run the model at.

The first field in the climate file is 'loc'. Usually this is set to zero; however, you can tell SIPNET to run at many locations.
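To make that concrete, here is a toy sketch of the top of a spatially-varying climate file covering two locations. Only the 'loc' column is shown; the remaining climate columns (elided as '...') are the same as in your single-site file:

loc  ...
0    ...
0    ...
1    ...
1    ...

In other words, the single-site records are stacked, and the leading 'loc' index tells SIPNET which location each record belongs to.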

I put together a tutorial with Bill Sacks on how to do this for the NCAR ASP conference in 2008.

You can view the tutorial here (Regional Biogeochemical Cycling)

You need to create the spatial inputs either manually, based on some input data you already have, or via some sort of automated procedure based on climate re-analysis data or perhaps MTCLIM.

You also need to edit the "dot.param-spatial" file to tell SIPNET how many locations to expect.

The controls to switch on the spatial version are in sipnet.in

Open the file sipnet.in using your text editor:
* Set RUNTYPE = standard
* Set FILENAME = NEW_Spatial_Files (to use the newly-created spatially-varying climate and parameter files)
* Set LOCATION = -1 (to run at all locations)
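Putting those three settings together, the relevant part of sipnet.in ends up looking something like this (assuming the NAME = value style implied by the list above, and with NEW_Spatial_Files standing in for whatever base name you gave your spatial climate and parameter files):

RUNTYPE = standard
FILENAME = NEW_Spatial_Files
LOCATION = -1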

Below is just the graphic from the regional tutorial - it gives an idea of how the spatial data might look on a very regular grid.

Have fun,
Dave

Parameterization of the SIPNET model for your site

If you have a forested eddy covariance site, it should be possible to parameterize the model using previous studies at the site and by assimilating the data from the site. You will need to assemble the appropriate climate drivers, and you will also need to estimate the parameters used to drive the model.

You can find a description of the model drivers and parameters on this website:
http://www.cgd.ucar.edu/~dmoore1/SIPNET_requirements.htm


The figure shows the layout of a small portion of a parameter file used by SIPNET - click on the image for a more detailed view.

Not all of the parameters in the default 'dot.param' file are used. The parameters which are required depend on the model options that you employ. I recommend that you look carefully at the model options at the start of the code in the file "sipnet.c". We hope to create an improved interface for changing SIPNET options but for now we need to manually change the options in the sipnet.c code (... and remember to recompile!).

The options include switches to vary how respiration or litter is treated, whether growing degree days are calculated, etc., written by Bill Sacks and Rob Braswell (see Braswell et al. 2005; Sacks et al. 2006, 2007). They also include options written by John Zobitz to include various soil pools (see Zobitz et al. 2008).
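To give a flavour of what to look for - this is an illustrative sketch rather than a copy of the real file, and I am assuming the options are compile-time switches defined near the top of sipnet.c, with the same names that appear in the run log - the block you edit looks roughly like:

#define GROWTH_RESP 0  /* growth respiration: 0 = off, 1 = on */
#define LITTER_POOL 1  /* separate litter carbon pool */
#define WATER_PSN   1  /* soil water limitation on photosynthesis */
#define SNOW        1  /* snow pack in the water sub-model */
#define GDD         0  /* growing degree day phenology */

Change the 0/1 values to turn components on or off, then recompile before re-running.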

You might also want to check out the previous blog post (from March).

Please speak to one of the SIPNET team before you use SIPNET for publication so that you have access to the most up-to-date version of the code. The code is hosted at the University of Wisconsin and will be moved during the fall of 2010.

SIPNET Team:
Dave Moore (King’s College London), John Zobitz (Augsburg College), Bill Sacks (NCAR/NEON), Ankur Desai (Univ Wisc)

Friday, March 12, 2010

Optional model components log

I was going through some estimate output recently and noted that:

The progress/log file produced during 'estimate' (the one without an extension) does not automatically log all of the optional model components which are switched on or off.

e.g. information contained within this file

Optional model components
(0 = off, 1 = on):
GROWTH_RESP = 0
LITTER_POOL = 1
LLOYD_TAYLOR = 0
SEASONAL_R_SOIL = 0
WATER_PSN = 1
WATER_HRESP = 1
DAYCENT_WATER_HRESP = 0
MODEL_WATER = 1
COMPLEX_WATER = 1
LITTER_WATER = 0
LITTER_WATER_DRAINAGE = 0
SNOW = 1
GDD = 0
SOIL_PHENOL = 0

Some newer components are not listed here ... notably ROOTS.

I've never looked into how this log file is generated, as I have not spent lots of time switching model components on or off (except ROOTS), and in that case I knew whether or not roots was switched on.

If you are planning on investigating the effect of including a process or two (e.g. MODIS or ROOTS) then for the moment you need to manually keep track of which of these components are switched on or off.
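If you would like ROOTS (or other new switches) recorded automatically, one hypothetical patch - assuming the switches are compile-time constants and that you can find the spot in the code where the 'Optional model components' block is printed (I have not checked where that is) - would be to add one line per missing component, e.g.

fprintf(outFile, "ROOTS = %d\n", ROOTS);

where outFile stands for whatever file pointer that block already uses.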

Saturday, February 20, 2010

Reading in sigma for each data stream at each timestep

At ESA and AGU this year I presented some work I've been doing with Andrew, Dave H and Dan R.

In that work, with a lot of advice from Bill via email, I made some changes to my branch of the repository:
sipnet/branches/moore

These changes require that an extra file FILENAME.sigma be specified which mirrors FILENAME.dat and contains the sigma values for each data point.

/* Difference, version 4 - READS IN SIGMAS (dm), ESTIMATES SIGMA, RETURNS AGGREGATE INFO
 * Run modelF with given parameters at location loc, compare output with measured data
 * using data types given by dataTypeIndices[0..numDataTypes-1].
 * Return a measure of the difference between measured and predicted data
 * (higher = worse).
 *
 * Added read-in of ***sigmas;
 * removed ***sigma from: double difference(double *sigma, ***sigmas, OutputInfo *outputInfo, ...
 */


paramchange.c
21: static double ***sigmas; /* (dm) data uncertainty read in once at start of program
53: /* Difference, version 4 - READS IN SIGMAS (dm) ESTIMATES SIGMA, RETURNS AGGREGATE INFO
70: /*Added read in ***sigmas
71: * removed ***sigma from double difference(double *sigma, ***sigmas, OutputInfo *outputInfo,
100: thisSigma = sigmas[loc][i][dataNum];
103: logLike += log(thisSigma);//(dm) now the individual sigmas for each data point is added to the logLike
522: /* Read measured data (from fileName.dat) and valid fractions (from fileName.valid) into arrays (used to also read sigmas)
597: // (dm) added code to read sigmas for each time step and each data type
598: sigmas = (double ***)malloc(numLocs * sizeof(double **));
600: sigmas[loc] = make2DArray(steps[loc], numDataTypes); // make 2-d array just big enough for known # of time steps in this location
619: readDataLine(in3, oneLine, totNumDataTypes); // read sigmas file
620: // assign sigmas elements appropriately, based on which data types we're using:
622: sigmas[loc][index][i] = oneLine[dataTypeIndices[i]];
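For anyone reading the snippet at line 103 and wondering how the per-point sigma enters the cost function: assuming the usual Gaussian error model (the log(thisSigma) term above is the Gaussian normalisation; I am sketching the standard form here rather than copying the repository code), each data point's contribution looks like:

#include <math.h>

/* Illustrative only: the standard Gaussian negative log-likelihood
   contribution of a single data point, using that point's own sigma.
   Not copied from paramchange.c. */
double pointNegLogLike(double modelled, double measured, double thisSigma)
{
    double resid = modelled - measured;
    return log(thisSigma) + (resid * resid) / (2.0 * thisSigma * thisSigma);
}

so a data stream with larger uncertainty at a given time step is penalised less for the same residual.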

Wednesday, February 10, 2010

Zobitz modifications

Here is a quick rundown of what I have done:
- Added MODIS satellite fAPAR data as a co-constraint with NEE data at Niwot. I am currently investigating if this improves the parameter estimation and what effect it has on the results. Code modifications included a change in the variables to estimate, as well as additional columns in the .data and .valid files.
- Added a new tracker variable (meanDLight), which computes the running average of the fractional light multiplier (Dlight) over 8 days, which I then compare against measured fAPAR data.
- Implemented a "weighted likelihood" cost function, where each assimilated data stream is multiplied by a fraction from 0 to 1 (a rough sketch of the idea is below). I did this because I wanted to determine how much the parameter estimation improves by including (or not including) a particular data stream.
- Concurrent with this approach, I did a tweak on the code to show the likelihood for each particular data stream by running sipnet forward over input data.
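Since the weighted-likelihood idea may be the most reusable bit, here is a minimal sketch of it. The function and variable names (weightedLogLike, streamLogLike, streamWeight) are mine, not the names used in the repository; it simply assumes you already have a per-stream log-likelihood and a weight between 0 and 1 for each assimilated data stream:

/* Illustrative sketch of a weighted aggregate log-likelihood: each
   data stream's contribution is scaled by a weight in [0,1]; a weight
   of 0 drops the stream entirely, a weight of 1 uses it fully. */
double weightedLogLike(const double *streamLogLike,
                       const double *streamWeight, int numDataTypes)
{
    double total = 0.0;
    int i;
    for (i = 0; i < numDataTypes; i++) {
        total += streamWeight[i] * streamLogLike[i];
    }
    return total;
}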

--> Any of these changes worth merging into the main code?