
Guidelines for Using the Sensitivity Analysis and Auto-calibration

Tools for Multi-gage or Multi-step Calibration in SWAT

by

Michael W. Van Liew

Department of Biological Systems Engineering

University of Nebraska-Lincoln

Lincoln, NE

and

Tamie L. Veith

USDA ARS

Pasture Systems and Watershed Management Research Unit

University Park, PA


Table of Contents

I. Overview

II. Sensitivity Analysis Tool

III. Sensitivity Analysis Set-up and Execution

IV. The Shuffled Complex Evolution Algorithm

V. Auto-calibration Set-up and Execution

VI. Auto-calibration Output

VII. Calibration of Water Quality Constituents

VIII. References

IX. Appendix A


I. Overview

The ArcSWAT 2.3 Interface for SWAT 2005 consists of a number of tools that can

be used to assist model users in evaluating parameter sensitivity, aid in model

calibration, and assess parameter uncertainty. These tools were developed by Van

Griensven (2005) and in recent years have been employed increasingly by SWAT

users worldwide. The sensitivity analysis tool is helpful to model users in

identifying parameters that are most influential in governing streamflow or water

quality response. The sensitivity analysis tool in the ArcSWAT Interface allows

model users to conduct two types of analyses. The first analysis may help to

identify parameters that improve a particular process or characteristic of the model,

while the second analysis identifies the parameters that are affected by the

characteristics of the study watershed and those to which the given project is most

sensitive (Veith and Ghebremichael, 2009).

The auto-calibration option provides a powerful, labor-saving tool that can be used

to substantially reduce the frustration and uncertainty that often characterize

manual calibrations (Van Liew et al., 2005). The auto-calibration tool currently

available in the ArcSWAT Interface allows users to determine optimal parameter

values for multiple output variables for a delineated project, based upon observed

data at a single gage during a one-step auto-calibration. In some cases, however,

multi-gage observed data may be available for implementing a distributed

approach to calibration, where observed and simulated outputs are compared at

multiple points on a watershed. Moreover, it may be advantageous to employ a

multi-step approach to auto-calibration, if optimal parameter sets are needed for a

handful of output variables such as streamflow, sediment, and nutrients.

This training module assists model users in implementing the sensitivity analysis

tool outside of the ArcSWAT Interface. The module also provides guidelines for

performing multi-gage and/or multi-step auto-calibrations by running the program

in a project directory instead of the interface. All project files are created in the

interface and then selected files related to auto-calibration are subsequently

modified manually before running the project in MS DOS. An overview of the

necessary steps for building the auto-calibration files for a multi-gage calibration is

presented in Appendix A. Practical guidelines are also included in this module to

help expedite the auto-calibration process. Before proceeding with the instructions


provided in this training module, it is suggested that model users review Section 14

“SWAT Simulation” of the ArcSWAT 2.3 Interface for SWAT 2005 User’s Guide

developed by Winchell et al. (2009). Users may also find it helpful to read “How

to: applying and interpreting the SWAT auto-calibration tools” (Veith and

Ghebremichael, 2009) that defines and explains the main outputs of the tools

available in the auto-calibration interface.

Once a project has been constructed and the warm-up and calibration periods have

been established, an initial simulation can be performed to verify that data have

been correctly input to the project. The input.std and output.std files are

particularly useful in this regard. The latter is helpful in verifying overall output

values for hydrology, crop growth and biomass production. The user should also

copy the project directory that was created in the ArcSWAT interface to a separate

directory. With the inclusion of a SWAT executable in that directory, the project

can then be run in MS DOS mode. Creating a duplicate copy of the project can

also serve to test auto-calibrated and manually adjusted streamflow or water

quality parameter sets while the original auto-calibration run is in progress.

II. Sensitivity Analysis Tool

Model users are often faced with the difficult task of determining which

parameters to calibrate so that the model response mimics the actual field,

subsurface, and channel conditions as closely as possible. When the number of

parameters in a model is substantial as a result of either a large number of sub-processes being considered or because of the model structure itself, the calibration

process becomes complex and computationally extensive (Rosso, 1994;

Sorooshian and Gupta, 1995). In such cases, sensitivity analysis is helpful to

identify and rank parameters that have significant impact on specific model outputs

of interest (Saltelli et al., 2000).

Sensitivity analysis demonstrates the impact that change to an individual input

parameter has on the model response and can be performed using a number of

different methods (Veith and Ghebremichael, 2009). The method in the ArcSWAT

Interface combines the Latin Hypercube (LH) and One-factor-At-a-Time (OAT)

sampling (Van Griensven, 2005). During sensitivity analysis, SWAT runs

(p+1)*m times, where p is the number of parameters being evaluated and m is the


number of LH loops. For each loop, a set of parameter values is selected such that

a unique area of the parameter space is sampled. That set of parameter values is

used to run a baseline simulation for that unique area. Then, using one-at-a-time

(OAT), a parameter is randomly selected, and its value is changed from the

previous simulation by a user-defined percentage. SWAT is run on the new

parameter set, and then a different parameter is randomly selected and varied.

After all the parameters have been varied, the LH algorithm locates a new

sampling area by changing all the parameters (Veith and Ghebremichael, 2009).
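As a rough illustration of the LH-OAT scheme described above, the sketch below performs the (p+1)*m runs: one baseline per Latin Hypercube loop plus one OAT perturbation per parameter. The parameter names, ranges, and the `run_model` callback are placeholders for illustration, not the actual SWAT implementation.

```python
import random

def lh_oat_runs(bounds, m, oat_frac, run_model):
    """LH-OAT sampling sketch.
    bounds    -- dict of {parameter_name: (lower, upper)}
    m         -- number of Latin Hypercube loops (NINTVAL)
    oat_frac  -- OAT change as a fraction of each parameter's range
    run_model -- callable taking {name: value}, returning model output
    Total model runs = (p + 1) * m, where p = len(bounds)."""
    names = list(bounds)
    p = len(names)
    # Latin Hypercube: shuffle the m sub-ranges for each parameter so
    # every sub-range is sampled exactly once across the loops.
    strata = {n: random.sample(range(m), m) for n in names}
    results = []
    for loop in range(m):
        # baseline point drawn from this loop's stratum of each parameter
        point = {}
        for n in names:
            lo, hi = bounds[n]
            k = strata[n][loop]
            point[n] = lo + (hi - lo) * (k + random.random()) / m
        results.append(("baseline", dict(point), run_model(point)))
        # OAT: perturb each parameter once, in random order
        for n in random.sample(names, p):
            lo, hi = bounds[n]
            delta = oat_frac * (hi - lo) * random.choice([-1, 1])
            point[n] = min(hi, max(lo, point[n] + delta))
            results.append((n, dict(point), run_model(point)))
    return results  # len(results) == (p + 1) * m
```

With oat_frac = 0.05 and a 0 to 1.0 range, the OAT step is 0.05 x (1.0 - 0.0) = 0.05, so an initial Alpha_Bf of 0.13 moves to 0.08 or 0.18, as in Figure 2.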

The sensitivity analysis tool in ArcSWAT has the capability of performing two

types of analyses. The first type of analysis uses only modeled data to identify the

impact of adjusting a parameter value on some measure of simulated output, such

as average streamflow. The second type of analysis uses measured data to provide

an overall “goodness of fit” estimation between the modeled and the measured

time series. The first analysis may help to identify parameters that improve a

particular process or characteristic of the model, while the second analysis

identifies the parameters that are affected by the characteristics of the study

watershed and those to which the given project is most sensitive (Veith and

Ghebremichael, 2009).

III. Sensitivity Analysis Set-up and Execution

By entering the ArcSWAT Interface Sensitivity Analysis Window, the user first

specifies the SWAT simulation that will be used for performing the sensitivity

analysis and the location of the subbasin where observed data will be compared

against simulated output. This is illustrated in Figure 1. The user then enters the

desired input settings and observed data file, as shown in Figure 2. NINTVAL, m,

is the number of sub-ranges into which each parameter range is divided. If m = 10,

then one Latin Hypercube loop will sample a parameter of range 0 to 1.0 within the

subrange of 0.0 to 0.1, 0.1 to 0.2 and so on. In the input window, OAT refers to

the percentage of change in the parameter value that will be used in the variations

within the parameter range. If a range is from 0 to 1.0, a 5% parameter change will

vary by 0.05*(1.0 – 0.0) = 0.05 units. Selection of the number of LH loops and the

parameter change percentage should be made jointly, as they are both a function of

the parameter ranges (Veith and Ghebremichael, 2009). The user then enters the

parameters that will be selected for the sensitivity analysis, adjusts lower and upper


parameter bounds if necessary, and then selects the observed data set that will be

compared to the modeled output, if desired (Figure 3).

Figure 1. Input of the SWAT simulation project and subbasin location. In the Sensitivity Input Window, Analysis Location selects from the SWAT simulation list the simulation used for the sensitivity analysis, and Subbasin selects the subbasin within the project where observed data will be compared against simulated output.


Figure 2. Input of the sensitivity analysis parameter settings. For Alpha_Bf (range 0 to 1.0), ten Hypercube intervals give sub-ranges of 0-0.1, 0.1-0.2, ..., 0.9-1.0; a 5% OAT change is 0.05 x (1.0 - 0.0) = 0.05, so an initial value of 0.13 becomes 0.08 or 0.18.

Figure 3. Input of the observed data file name, the parameters selected for the sensitivity analysis, and the associated lower and upper bounds (e.g., a lower bound of 0.0 and an upper bound of 10.0, adjusted if necessary).


In the Output window, the user designates whether or not the sensitivity analysis

will be run on just modeled output, on observed versus simulated output, or both.

For either case, the user selects which parameters will be selected for the

sensitivity analysis and also indicates whether concentrations or loads will be

evaluated for parameters governing water quality (Figure 4). If modeled output is

selected, the user must specify whether the sensitivity analysis will be performed

using average modeled output, such as streamflow, or the percent of time that the

modeled output is less than a user specified threshold value. If observed versus

simulated output is selected, the user must specify the type of objective function

that will be used for the analysis, either the sum of squares or the sum of squares

ranked (Figure 4). Once completed, the user writes files to the project directory.

Figure 4. Selections for output parameter sensitivity and/or observed versus simulated output sensitivity. In the Sensitivity Analysis Output Window, the user chooses the comparison variable(s) under Output Evaluation, selects concentrations or loads for water quality, selects either average modeled output (e.g., streamflow) or the percent of time that output is below a threshold value, chooses the objective function, and writes the input files to the project directory.

The sensitivity analysis is run by double-clicking a SWAT executable that has been added to the project directory. Upon completion of the

sensitivity analysis, a number of files are generated in the output. The sensout.out


file provides a summary of the inputs used for the analysis and a listing of the

ranked parameters (Figures 5 and 6). The parameter producing the highest average

percentage change in the objective function value is ranked as most sensitive

(Veith and Ghebremichael, 2009).
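The ranking rule can be expressed compactly as follows. The percentage-change values below are hypothetical placeholders; the real tool derives them from the LH-OAT runs.

```python
def rank_sensitivity(effects):
    """Rank parameters by the mean absolute percentage change in the
    objective function across all OAT perturbations of that parameter.
    effects -- {parameter_name: [pct_change, pct_change, ...]}"""
    means = {n: sum(abs(e) for e in vals) / len(vals)
             for n, vals in effects.items()}
    # first entry = most sensitive (largest mean effect)
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

# hypothetical percentage changes for three parameters
effects = {"Cn2": [12.0, -8.0, 15.0],
           "Alpha_Bf": [3.0, -2.0, 4.0],
           "Surlag": [0.5, 1.0, -0.8]}
ranking = rank_sensitivity(effects)
```

With these made-up numbers, Cn2 ranks as most sensitive and Surlag as least.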

Figure 5. A listing of the input data, the objective and response functions, and the selected parameters in the sensout.out file.


Figure 6. A listing of the parameter ranking in the sensout.out file.

IV. The Shuffled Complex Evolution Algorithm

ArcSWAT includes a multiobjective, automated calibration procedure that was

developed by Van Griensven and Bauwens (2003). The calibration procedure is

based on a Shuffled Complex Evolution Algorithm (SCE-UA; Duan et al., 1992)

and a single objective function. In a first step, the SCE-UA selects an initial

population of parameters by random sampling throughout the feasible parameter

space for “p” parameters to be optimized, based on given parameter ranges. The

population is partitioned into several communities, each consisting of “2p+1”

points. Each community is made to evolve based on a statistical “reproduction

process” that uses the simplex method, an algorithm that evaluates the objective

function in a systematic way with regard to the progress of the search in previous

iterations (Nelder and Mead, 1965). At periodic stages in the evolution, the entire

population is shuffled and points are reassigned to communities to ensure

information sharing. As the search progresses, the entire population tends to

converge toward the neighborhood of global optimization, provided the initial

population size is sufficiently large (Duan et al. 1992). The SCE-UA has been

widely used in watershed model calibration and other areas of hydrology such as


soil erosion, subsurface hydrology, remote sensing, and land surface modeling, and

has generally been found to be robust, effective, and efficient (Duan 2003).

In the optimization scheme developed for SWAT 2005, parameters in the model

that affect hydrology or water quality can be changed in either a lumped (over the

entire watershed) or distributed (for selected subbasins or HRUs) way. In addition,

the parameters can be modified by replacement, by addition of an absolute change

or by a multiplication of a relative change. Besides weight assignments for output

variables that can be made in multi-objective calibrations (e.g., 50% streamflow,

30% sediment, and 20% nutrients), the user can specify a particular objective

function that is minimized. The objective function is an indicator of the deviation

between a measured and a simulated series (Van Griensven and Bauwens 2003).
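The three parameter-modification methods can be sketched as below. This is a minimal illustration of the concept, not the model's internal code.

```python
def apply_change(value, change, method):
    """Apply a calibration change to a parameter value using the three
    methods available in the SWAT 2005 optimization scheme:
    'replace'  -- substitute the new value outright
    'absolute' -- add an absolute change to the current value
    'relative' -- multiply by (1 + fractional change)"""
    if method == "replace":
        return change
    if method == "absolute":
        return value + change
    if method == "relative":
        return value * (1.0 + change)
    raise ValueError(f"unknown method: {method}")
```

For example, a 10% relative change raises a curve-number value of 70.0 to 77.0, while an absolute change of 2.0 simply adds 2.0 to the current value.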

Available objective function options in the auto-calibration tool include the sum of

squares of residuals and the sum of squares of residuals ranked. The former

represents the classical mean square error method that aims at matching a

simulated time series to a measured series while the latter represents the fitting of

the frequency distributions of the observed and simulated series.
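The two objective functions differ only in whether the series are compared in time order or after sorting; a minimal sketch:

```python
def ssq(obs, sim):
    """Sum of squares of residuals: pair-wise match of the two time
    series, the classical mean-square-error approach."""
    return sum((s - o) ** 2 for o, s in zip(obs, sim))

def ssq_ranked(obs, sim):
    """Sum of squares of residuals after ranking: both series are
    sorted independently, so the fit is between the frequency
    distributions rather than between time-matched values."""
    return sum((s - o) ** 2 for o, s in zip(sorted(obs), sorted(sim)))
```

For obs = [1, 2, 3] and sim = [3, 2, 1], ssq returns 8.0 while ssq_ranked returns 0.0: the distributions match even though the timing does not.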

The auto-calibration tool in SWAT can be run in either the Parasol or the Parasol

with Uncertainty Analysis mode. In the Parasol mode, the tool searches for an

optimal calibration set as described above. In the Parasol with Uncertainty

Analysis mode, each simulation that was performed during the Parasol

optimization process is grouped into a “good” or “not good” category. These

categories are based on whether or not the objective function value of the run falls

within a user-defined confidence interval (CI). This confidence interval is defined

by the user input of “90%”, “95%”, or “97.5%” probability and the corresponding

statistic of either a chi-square or Bayesian distribution (Veith and Ghebremichael,

2009).
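The grouping step can be sketched as below. In ParaSol the threshold itself is derived from the chosen chi-square or Bayesian statistic and the selected probability; here it is simply passed in as a precomputed value.

```python
def classify_runs(runs, threshold):
    """Group ParaSol simulations into 'good' and 'not good' categories
    based on whether each run's objective function value falls within
    the user-defined confidence threshold.
    runs -- list of {"objective": value, ...} dicts (one per simulation)"""
    good = [r for r in runs if r["objective"] <= threshold]
    not_good = [r for r in runs if r["objective"] > threshold]
    return good, not_good
```

The "good" group corresponds to the solutions recorded in goodpar.out.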

V. Auto-calibration Set-up and Execution

By entering the ArcSWAT Interface Auto-Calibration and Uncertainty Input

Window, the user first specifies the SWAT simulation that will be used for

performing the auto-calibration and the location of the subbasin where observed

data will be compared against simulated output. This is illustrated in Figure 7.


The user then enters the desired optimization settings, observed data file, and

method of calibration, as shown in Figure 8.

Figure 7. Input of the SWAT simulation project and subbasin location. In the Auto-calibration Input Window, Analysis Location selects from the SWAT simulation list the simulation used for the calibration, and Subbasin selects the subbasin within the project where observed data will be compared against simulated output.

Figure 8. Input of optimization settings, observed record, and calibration method. MAXN is the maximum number of trials before optimization is terminated; IPROB sets the threshold for ParaSol (1 = 90% CI, 2 = 95% CI, 3 = 97.5% CI). The calibration method is either ParaSol or ParaSol with Uncertainty Analysis, and the observed data file name is also entered here.


It is recommended that the user rely upon the default values listed for the

optimization settings in the Input Window, with the exception of two variables,

MAXN and IPROB. The choice made for selecting MAXN, the maximum number

of trials allowed before optimization is terminated, can be particularly useful in

regulating the duration of the auto-calibration. For example, by noting the project

run time (normally several seconds to a few minutes for the warm-up plus

calibration periods), a user can estimate the time required to complete the

automated calibration, based on MAXN. Depending upon the size and complexity

of the project, a cursory auto-calibration run can be made by setting MAXN equal

to 500 to 1000. A more comprehensive auto-calibration can be made by setting

MAXN to 3000 or more. The user may also choose to vary the threshold value of

the confidence interval, IPROB, which in turn will impact the number of “good”

parameter sets that are generated when running the tool in the Parasol with

Uncertainty Analysis mode. Figure 9 shows the listing of optimization settings

that are created in the parasolin.dat file, including MAXN and IPROB.

Figure 9. A listing of the optimization settings created in the parasolin.dat file
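Following the guidance above, the expected duration of an auto-calibration run can be budgeted with a quick calculation:

```python
def estimate_runtime_hours(seconds_per_run, maxn):
    """Estimate total auto-calibration wall time from the duration of a
    single SWAT run (warm-up plus calibration period) and MAXN, the
    maximum number of trials before optimization is terminated."""
    return seconds_per_run * maxn / 3600.0
```

For example, a project that takes 30 seconds per run with MAXN = 3000 would need roughly 25 hours, while a cursory run with MAXN = 600 at 60 seconds per run would need about 10 hours.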

Because the ArcSWAT Interface does not properly load the observed data file at

this time, the user must add this file directory to the project directory. Figure 10

illustrates values of the input record for observed daily streamflow. A SWAT

project involving a two gage auto-calibration would require two such files. Figure

11 presents values of the input record for observed monthly streamflow and

sediment at a single gage.
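Since the observed data file must be added to the project directory by hand, a small writer in the general layout of Figure 10 (year, Julian day, daily flow in cms) may help. The exact column widths expected by the auto-calibration tool should be checked against a file it accepts, so treat the formatting below as an assumption.

```python
def write_observed_daily(path, records):
    """Write an observed daily streamflow record: one line per day with
    the year, the Julian day, and the discharge in cms, in fixed-width
    columns.  The column widths are illustrative and should be verified
    against a file accepted by the auto-calibration tool."""
    with open(path, "w") as f:
        for year, jday, flow_cms in records:
            f.write(f"{year:5d}{jday:5d}{flow_cms:12.3f}\n")
```

A two-gage auto-calibration would require two such files (e.g., one per observed record named in the fig.fig file).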


Figure 10. Input file for observed daily streamflow. Each record contains the year and Julian day of the observed record and the observed daily streamflow in cms.

Figure 11. Input file for the observed monthly streamflow and sediment record. Each record contains the year and month of the observed record, the observed monthly streamflow (cms), and the observed monthly sediment load (tons/day).


The next step in preparing the auto-calibration run is to select the model

parameters that will be calibrated and their associated lower and upper bounds.

In most cases it is suggested that the user select the default values listed for lower

and upper calibration bounds, unless field data are available to provide more

specific information regarding parameter ranges. The input window for parameter

selection is given in Figure 12. A complete list of streamflow, sediment, and

nutrient parameters that can be calibrated in the tool is presented by Van Griensven

(2005). Suggested streamflow parameters that are governed by responses to

snowmelt and rainfall are presented in Figure 13.

Figure 12. Input window for parameter selection. Parameters are selected for calibration, and the initial lower and upper bounds are adjusted if necessary (note: the minimum lower bound for SURLAG is 0.5).


Figure 13. A suggested listing of parameters for calibrating streamflow

Parameters in SWAT may be grouped by basin, subbasin, or HRU response. For

distributed parameter modeling, basin parameters, such as surlag and sftmp, are not

usually allowed to vary spatially. On the other hand, some or all of the subbasin

and HRU parameters can be designated to vary regionally, depending upon the

location of available measured data. A two gage approach for the auto-calibration

of subbasin and HRU parameters is illustrated in Figure 14.

Figure 14. A two gage approach to auto-calibration of subbasin and HRU

parameters


Since the current version of the ArcSWAT Interface auto-calibration tool only

allows model parameters to be calibrated against measured data at a single gage,

the following procedure is suggested for creating the changepar file so that a

regionally varied model calibration can be achieved. To construct the necessary

information for this file, a multi-step ArcSWAT interface iteration is required in

order to group parameters that vary by HRUs or subbasins with the corresponding

calibration point within the project. This can be accomplished by employing the

“Select HRUs\LU” radio button in the auto-calibration input panel to select all

HRUs or subbasins associated with a given calibration point for each parameter to

be calibrated (Figure 15). Writing the input files from the interface builds that

portion of the changepar.dat file corresponding to the first calibration point. The

newly created changepar.dat file must now be renamed in the project directory to

prevent it from being overwritten. Upon selecting the second calibration point in

the project, “Select HRUs\LU” is again employed to group all HRUs or subbasins

for each parameter that corresponds to that second point. By once again writing

the input files, a second changepar.dat file is created. Manually combining the two

files in the project directory creates the information necessary to perform a

regionally varied auto-calibration, as shown in Figure 16. Model users should

carefully review the newly created changepar.dat file to ensure that all the desired

parameters are included in the file and that there are no duplicate listings.
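The manual combination and duplicate check described above can be scripted. File names here are hypothetical, and a repeated line is only reported rather than rejected, since the user is advised to review the merged file anyway (a repeated line is not always an error, e.g., shared header lines).

```python
def combine_changepar(paths, out_path):
    """Concatenate per-gage changepar files (written one at a time from
    the interface and renamed to avoid being overwritten) into a single
    multi-gage changepar.dat, reporting any repeated non-blank lines so
    the user can review them."""
    seen, duplicates, merged = set(), [], []
    for path in paths:
        with open(path) as f:
            for line in f:
                key = line.strip()
                if key:
                    if key in seen:
                        duplicates.append(key)
                    seen.add(key)
                merged.append(line)
    with open(out_path, "w") as f:
        f.writelines(merged)
    return duplicates  # listings to review by hand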


Figure 15. Using the Select HRUs\LU button to select model parameters by subbasins or HRUs. For parameters that vary by HRU, select all land uses, soils, and slopes for the subbasins relevant to a particular gage; for parameters that vary by subbasin, select all subbasins relevant to that gage.

Figure 16. An example multigage changepar.dat file, created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project, with the subbasin and HRU entries grouped by gage.


Following the construction of the changepar.dat file, the Auto-calibration Output

pane in the ArcSWAT Interface is used to enter information to perform calibration

output evaluations. This includes the parameter or parameters to be calibrated, the

type of objective function (sum or squares or sum of squares ranked), the weight

assignments for output variables that can be made in multi-objective calibrations,

and the selection of either concentration or load calibration for water quality

parameters (Figure 17). In addition to the parasolin.dat and changepar.dat files, a

third file created when the input files are written in the interface is objmet.dat. As

shown in Figure 18, this file defines the variables and methods that will be used for

optimization.

Figure 17. Selection of settings for calibration output evaluations. In the Auto-calibration Output Window, the user selects the parameter to be calibrated under Output Evaluation, chooses concentrations or loads for water quality calibration, selects the objective function, and writes the input files to the project directory.


Figure 18. A description of the optimization control variables and methods in objmet.dat: the code number for the autocal file in the .fig file, the code number for the calibration variable, the concentration or load designation, the objective function method, and the given weight for the objective function.

Two additional files, referred to as file.cio and fig.fig, are modified when the input

files are written to the project directory. File.cio is the overall command file for

the project, while fig.fig is the command file governing the routing of water and

pollutants from the landscape to the stream reaches. Examples of these two files

are presented in Figures 19 and 20, respectively. Variable ICLB in file.cio governs

the type of run that is performed by the model, such as default, parameter

sensitivity, or auto-calibration. Once the fig.fig file is created in the directory, it

must be modified manually to account for multiple calibration gages. Each

calibration location requires two lines: the command code line and the observed

data file name. Figure 20 illustrates a fig.fig file consisting of a two gage

calibration. In the first “autocal” command line located at the bottom of the file,

“16” represents the command code for auto-calibration, “77” is the hydrograph

storage location for the reach being calibrated (corresponding to reach 1 of the

project), “1” represents the number of the auto-calibration file, and “0” represents a

daily time-step registration. Line two of the file, colq1.dat, is the name of the file

containing the observed flow record that corresponds to reach 1. In line three of


the fig file, “68” is the hydrograph storage location for the reach being calibrated

(corresponding to reach 3 of the project) and “2” represents the number of the

auto-calibration file. Line four of the file, colq2.dat, is the observed file name

corresponding to reach 3.
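Putting the values described above together, the tail of a two-gage fig.fig would look roughly like the following. The column spacing is illustrative; compare against a fig.fig written by the interface before editing by hand.

```
autocal      16   77    1    0
colq1.dat
autocal      16   68    2    0
colq2.dat
```

In each autocal command line, 16 is the command code for auto-calibration, the second value is the hydrograph storage location of the reach being calibrated, the third is the number of the auto-calibration file, and 0 designates daily time-step registration; the following line names the corresponding observed data file.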

Figure 19. Designation of auto-calibration options in file.cio. ICLB selects the type of run (0 = default, 1 = sensitivity, 2 = optimization, 3 = optimization with uncertainty, 4 = bestpar), and NYSKIP sets the number of warm-up years.


Figure 20. Location of the autocal command codes and observed data files for two gage locations in the fig.fig file

VI. Auto-calibration Output

Parasolout.out, goodpar.out, bestpar.out, and autocal1.out are output files that are

particularly useful for viewing upon completion of the auto-calibration run.

Multiple gages that are calibrated will result in multiple autocal.out files. Output

from the first three of these files is described by Veith and Ghebremichael (2009).

In brief, parasolout.out lists the maximum and minimum parameter values for all

solutions in the Parameter Uncertainty Analysis. Goodpar.out is a record of all

solutions having an objective function value within the Uncertainty Analysis

Confidence Interval, while bestpar.out represents the parameter set with the

optimal solution among all the goodpar.out solutions. The autocal1.out file

provides a listing of the simulated streamflow and pollutants at the calibration

point for the designated time step used during the run. It is generated during the

last run of the auto-calibration. Sample output files are shown in Figures 21-23.


Figure 21. Example output file for parasolout.out, showing the calibration parameter uncertainty ranges

Figure 22. Example output files for goodpar.out and bestpar.out, showing the calibration parameter listings


Figure 23. Example output file for autocal1.out, showing the monthly streamflow and monthly sediment load at the calibration point

Once the auto-calibration has been completed, the user can rerun the project with

the bestpar.dat data set with ICLB = 4 in file.cio. Simulated streamflow generated

from the autocal.out file can in turn be compared against measured streamflow at

the monthly or daily time step. Users may find that manual fine tuning following

auto-calibration provides moderate improvement in comparison of measured

versus simulated streamflow. Use of a manual approach following auto-calibration

accentuates the tradeoffs that exist in achieving total mass balance, reasonable

hydrograph responses, and adequate representation of the range in flows (Van

Liew et al., 2005). Fine tuning often tends to provide better matches for the lower

flows without a significant sacrifice in maintaining a good representation of the

peak flows.
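One common way to quantify the measured-versus-simulated comparison (not prescribed by this module, but widely used with SWAT) is the Nash-Sutcliffe efficiency:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus the sum of squared residuals
    divided by the sum of squared deviations of the observations from
    their mean.  1.0 is a perfect fit; 0.0 means the model performs no
    better than the mean of the observed record."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```

Computing such a statistic before and after manual fine-tuning makes the trade-offs between low-flow and peak-flow fit discussed above easy to track.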

VII. Calibration of Water Quality Constituents

Two alternatives exist for employing the auto-calibration feature in the model.

One possibility is to use the multi-objective function developed by Van Griensven

(2005) to simultaneously calibrate streamflow and selected water quality variables,

which include sediment, nitrogen, and phosphorus. The other possibility is to


calibrate streamflow and water quality constituents separately in a multi-step

fashion. Cursory testing of the two methods suggests that one method may not

necessarily be better than the other. Cursory testing also suggests that it may be

preferable to calibrate water quality constituents at the monthly instead of daily

time step, due to apparent convergence limitations that seem to exist at the daily

time scale.

If the multi-objective function approach is employed, the following changes in the

input files are needed. First of all, the observed record must include measured

values for both streamflow and the water quality constituent(s), as illustrated in

Figure 11. Sediment or nutrient values are input as loads at the monthly time scale.

Second, the objmet.dat file must be amended to indicate water quality calibration

and a given weight for the objective function must be assigned (Figure 18) (van

Griensven, 2005). Third, the changepar.dat file must also be modified to include

the desired parameters that are to be calibrated.

An alternative to the multi-objective function is to use a multi-step calibration

procedure, described briefly as follows. Following auto-calibration and manual fine-tuning of the parameters governing the streamflow

calibration, all parameter values in bestpar.out need to be rewritten to their

respective input files. This can readily be accomplished by using the Manual

Calibration Helper in the interface to update project HRUs or subbasins with

appropriate parameter values related to streamflow. Most values can then be

verified by rerunning the model and reviewing the input.std file.

The auto-calibration tool can once again be employed to search for optimal values of the parameters that govern sediment loading in SWAT, based upon a specified measured record. Manual adjustments following the auto-calibration may be warranted to achieve a suitable sediment calibration. The input files can then be updated with the Manual Calibration Helper before proceeding with the calibration of nutrient constituents in the model.


References

Duan, Q. D. 2003. "Global optimization for watershed model calibration." In Calibration of Watershed Models, Water Science and Application Series, Vol. 6, Q. Duan et al., eds. AGU, Washington, D.C., 89-104.

Duan, Q., Gupta, V. K., and Sorooshian, S. 1992. "Effective and efficient global optimization for conceptual rainfall-runoff models." Water Resour. Res., 28, 1015-1031.

Nelder, J. A., and Mead, R. 1965. "A simplex method for function minimization." Comput. J., 7, 308-313.

Rosso, R. 1994. "An introduction to spatially distributed modelling of basin response." In Advances in Distributed Hydrology, Rosso, R., Peano, A., Becchi, I., and Bemporad, G. A., eds. Water Resources Publications, Fort Collins, 3-30.

Saltelli, A., Scott, E. M., Chan, K., and Marian, S. 2000. Sensitivity Analysis. John Wiley & Sons, Chichester.

Sorooshian, S., and Gupta, V. K. 1995. "Model calibration." In Computer Models of Watershed Hydrology, Singh, V. P., ed. Water Resources Publications, Highlands Ranch, Colorado, 23-63.

Van Griensven, A., and Bauwens, W. 2003. "Multiobjective autocalibration for semidistributed water quality models." Water Resour. Res., 39(12), 1348.

Van Griensven, A. 2005. Sensitivity, auto-calibration, uncertainty and model evaluation in SWAT2005. Unpublished report.

Van Liew, M. W., Arnold, J. G., and Bosch, D. D. 2005. "Problems and potential of autocalibrating a hydrologic model." Transactions of the ASAE, 48(3), 1025-1040.

Van Liew, M. W., Veith, T. L., Bosch, D. D., and Arnold, J. G. 2007. "Suitability of SWAT for the Conservation Effects Assessment Project: comparison of USDA Agricultural Research Service watersheds." J. of Hydrologic Engineering, 12(2), 173-189.

Veith, T. L., and Ghebremichael, L. T. 2009. "How to: applying and interpreting the SWAT auto-calibration tools." In Proceedings of the Fifth International SWAT Conference, August 5-7, 2009.

Winchell, M., Srinivasan, R., di Luzio, M., and Arnold, J. 2007. ArcSWAT Interface for SWAT 2005: User's Guide. Blackland Research Center, Texas Agricultural Experiment Station, Temple.


Appendix A. Overview of steps for building auto-calibration files for a multi-gage calibration

1. Select a simulation and subbasin.

2. Construct observed data files for the selected time step. (Data files can be built in Excel and saved as a space-delimited file that ends with a ".dat" extension.)
Format for monthly observations: year (1X,I5), month (3X,I2), 3X, measured values (1X,F11.3).
Format for daily observations: year (1X,I5), day (2X,I3), 3X, measured values (1X,F11.3).
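The fixed-width monthly layout above can be sketched in a few lines of Python. The file name "obs1.dat" and the sample streamflow values are hypothetical; the field widths follow the format descriptors given in this step.

```python
# Write a sample observed monthly record in the fixed-width layout given
# above: year (1X,I5), month (3X,I2), 3X, measured value (1X,F11.3).
# The file name "obs1.dat" and the streamflow values are hypothetical.

records = [
    (2001, 1, 12.345),  # (year, month, observed monthly streamflow, cms)
    (2001, 2, 10.210),
    (2001, 3, 15.902),
]

with open("obs1.dat", "w") as f:
    for year, month, value in records:
        # " {:5d}"      -> 1X, I5
        # "   {:2d}"    -> 3X, I2
        # "    {:11.3f}" -> 3X followed by 1X, F11.3
        f.write(f" {year:5d}   {month:2d}    {value:11.3f}\n")
```

Additional measured columns (for example, a sediment load alongside streamflow) can be appended by repeating the 1X,F11.3 field on each line.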

3. Select the optimization settings and calibration method, and input the observed data file.

4. Select the parameters for calibration and adjust the initial boundaries.

[Auto-calibration Input Window screenshots:
- Sample monthly observed record, showing the year and month of the observed record, observed monthly streamflow (cms), and observed monthly sediment load (tons/day).
- Parameter selection: select parameters for calibration and adjust the initial lower and upper bounds, if necessary (note: the minimum lower bound for SURLAG is 0.5).
- Analysis Location: select from the SWAT simulation list a simulation for performing the calibration. Subbasin: select a subbasin within the project where observed data will be compared against simulated output.
- Optimization Settings: MAXN = maximum number of trials before the optimization is terminated; IPROB = threshold for ParaSol (1 = 90% CI, 2 = 95% CI, 3 = 97.5% CI). Calibration Method: ParaSol or ParaSol with Uncertainty Analysis. Observed Data File Name.]


5. Use the HRU/LU button to select model parameters by subbasin or by HRUs associated with gage 1.

6. In the auto-calibration output window, write the files to the project directory. Enter the project directory and rename the changepar.dat file (e.g., changepar1.dat).

7. Repeat step 5, using the HRU/LU button to select model parameters by subbasin or by HRUs associated with gage 2.

8. Select the calibration output evaluations and rewrite the input files to the project directory.

9. Combine the contents of the changepar1.dat file created in step 6 with the newly created changepar.dat file in the project directory. Verify that there are no duplicate parameter listings.
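Step 9 can be sketched with a small helper that concatenates the two files and flags duplicates. This is a minimal sketch, assuming that the parameter name is the first whitespace-delimited token on each changepar line; that assumption about the changepar layout should be checked against your own files.

```python
# Sketch of step 9: append the entries of changepar1.dat (gage 1) to the
# newly created changepar.dat (gage 2) and report duplicate parameter
# listings. ASSUMPTION: the parameter name is the first whitespace-
# delimited token on each line of a changepar file.

def combine_changepar(first, second, out):
    lines, seen, duplicates = [], set(), []
    for path in (first, second):
        with open(path) as f:
            for line in f:
                if not line.strip():
                    continue                    # skip blank lines
                name = line.split()[0]          # assumed parameter name
                if name in seen:
                    duplicates.append(name)     # report, don't silently drop
                seen.add(name)
                lines.append(line.rstrip("\n"))
    with open(out, "w") as f:
        f.write("\n".join(lines) + "\n")
    return duplicates
```

Any names returned in the duplicates list should be resolved by hand before running the auto-calibration.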

[Auto-calibration input: the multi-gage changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project (e.g., subbasins for gage 1, subbasins for gage 2, HRUs for gage 1, HRUs for gage 2).]

[Auto-calibration Output Window: Output Evaluation: select the parameter to be calibrated. Objective Function: select the optimization method. Select concentrations or loads for water quality calibration. Write the input files to the project directory.]

[Selecting parameters for a particular gage: for parameters that vary by HRU, select All Land Uses, Soils, and Slopes for the subbasins that are relevant to a particular gage; for parameters that vary by subbasin, select All Subbasins that are relevant to a particular gage.]



10. Include the observed data file names in the fig.fig file in the project directory. Format for an observed data file name: (10X, up to 13 characters, with the last 4 being ".dat").
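The fig.fig entry format described above (ten leading spaces, then a file name of at most 13 characters ending in ".dat") can be sketched as a small formatting helper. The file name "obsgage1.dat" is hypothetical.

```python
# Build a fig.fig observed-data-file entry in the format noted above:
# 10X, then up to 13 characters whose last 4 are ".dat".
# The example file name "obsgage1.dat" is hypothetical.

def figfig_entry(filename):
    if not filename.endswith(".dat"):
        raise ValueError("observed data file name must end in '.dat'")
    if len(filename) > 13:
        raise ValueError("file name is limited to 13 characters")
    return " " * 10 + filename   # 10X, then the file name

line = figfig_entry("obsgage1.dat")
```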

11. Add a second line for gage 2 in the objmet.dat file. Verify the correct control variables for each line.

12. Designate the appropriate variables in the file.cio file and proceed with program execution.

[Auto-calibration input file fig.fig: autocal command code and observed data files for two gage locations.]

[Auto-calibration input file file.cio: ICLB = auto-calibration option (default = 0, sensitivity = 1, optimization = 2, optimization with uncertainty = 3, bestpar = 4); NYSKIP = number of warm-up years of the simulation.]
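Switching the ICLB code in file.cio (step 12) can be sketched as a text substitution. This is a hedged sketch only: it assumes the ICLB line carries a single integer followed by a "|" and a label containing "ICLB", which is an assumption about the file.cio layout, so verify the result against your own file.cio before running SWAT.

```python
import re

# Sketch of setting ICLB in file.cio text to a given code, per the codes
# listed above (0 = default, 1 = sensitivity, 2 = optimization,
# 3 = optimization with uncertainty, 4 = bestpar).
# ASSUMPTION: the ICLB line looks like "  <integer>  | ICLB : ...".

def set_iclb(text, code):
    return re.sub(r"^(\s*)\d+(\s*\|\s*ICLB.*)$",
                  lambda m: m.group(1) + str(code) + m.group(2),
                  text, flags=re.MULTILINE)
```

For example, set_iclb(cio_text, 2) would request an optimization run on the next model execution.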

[Auto-calibration input file objmet.dat: code number for the autocal file in the .fig file; concentration or load; objective function method; given weight for the objective function; code number for the calibration variable.]