Useful Applications and Other Tips
Hydro Group "Getting Started" Page
Aggregation Scripts
Description and Use: General temporal and spatial aggregation scripts.
Location: /usr/local/hydro/build/tools/vic/output/ (If you add this directory to the PATH in your ~/.cshrc file, you should not need to make a copy.)
Examples:
agg_time.pl:
To aggregate a file of daily records to monthly values, where the 5th field is to be summed instead of averaged:
agg_time.pl -i
To aggregate a file of hourly records to monthly values, where fields 5 and 6 are to be summed instead of averaged:
agg_time.pl -i
To aggregate a file of 3-hourly records to yearly values, where fields 5 and 6 are to be summed instead of averaged:
agg_time.pl -i
agg_space.pl:
To aggregate the VIC fluxes output files listed in
/my_path/result/filelist.txt, where the cell size is half-degree:
agg_space.pl -i /my_path/result/filelist.txt -r 0.5 -o out_file
For the same basin, but summing the 5th and 6th fields instead of
averaging:
agg_space.pl -i /my_path/result/filelist.txt -r 0.5 -o out_file -sum 5,6
For complete usage, just type:
agg_time.pl -h
agg_space.pl -h
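If the aggregation scripts are not on your PATH, the core of what a temporal aggregation does can be approximated with a plain awk one-liner. This is only a sketch of the idea behind agg_time.pl, not its actual code, and it assumes a made-up input layout where field 1 is a YYYY-MM-DD date and field 4 is the value to average:

```shell
# Demo input (invented for illustration): date, two unused fields, a value
cat > daily.txt <<'EOF'
1999-01-01 0 0 2.0
1999-01-02 0 0 4.0
1999-02-01 0 0 6.0
EOF

# Average field 4 by month; agg_time.pl itself also handles sums,
# multiple fields, and other input/output time steps.
awk '{ key = substr($1, 1, 7)            # the YYYY-MM part of the date
       sum[key] += $4; n[key]++ }
     END { for (k in sum) print k, sum[k] / n[k] }' daily.txt | sort
```

This prints one averaged value per month; swapping the division for a plain sum mimics the script's "-sum" behavior for selected fields.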
Contact Ted (tbohn at hydro.washington.edu) with questions.
Animated Gifs
Description and Use: A script that converts image files to gif format and assembles them into an animation. These can be inserted into a powerpoint presentation like any other gif; when powerpoint is in slide mode, the gif will animate.
To view the animation from the command line: gifview -a (filename)
Alternatively: animate (filename)
Script:
gifanim.scr
ArcInfo
Description and Use: This is a Geographical Information System (GIS) package. We use this mainly to
process and visualize spatial data.
How to Run:
Log into one of the PCs.
Select ArcInfo from the Program Menu.
To start the help pages, type: help
Examples:
Using ArcInfo to obtain a DEM (SRTM30) for a basin
Using Hydro1k to obtain a basin delineation and clipping a DEM with it
Useful Links:
Various ArcInfo scripts and datasets
FM 490 Class Notes - very useful for getting to know GRID commands
Warnings:
1. DO NOT use any "unusual" symbols, such as spaces or capital letters, in directory or file names accessed by ArcInfo.
2. Grids, coverages, and other files that have been imported into ArcInfo can only be manipulated within ArcInfo. Useful commands include: kill, rename, copy
AWK
Description and Use: A very useful application that applies operations to each row of a specified text file. AWK can do many things, but it works on only one file at a time.
How to Run: on the command line or in a shell script
Examples:
to print the 8th column of a text file:
awk '{print $8}' $file_name
to skip values:
awk '{if( NR > 100 ) print $1,$2}' $file_name
to control printing format (same as C):
awk '{printf("%.2f\t%.2f\n",$1,$2)}' $file_name
to print a cumulative sum:
awk 'BEGIN{sum=0}{sum += $2; print $1,sum}' $file_name
to set a value determined using awk to a variable within a script:
set value = `awk 'NR==5{print $1}' $file_name`
to use a slash instead of a space as the field separator:
awk 'BEGIN { FS="/" }; {print $1 " " $2 " " $3}' $file_name
to count the number of active cells in an ascii arcinfo-style grid:
awk '{if(NR==1)ncols=$2;if(NR==6)nodata=$2;if(NR>6)for(i=1;i<=ncols;i++)if($i!=nodata)n++}END{print n}' $file_name
A few things to know:
$8 denotes the eighth column
$0 denotes the entire line
NR is the line number
NF is the number of columns
FS is the field separator
the general form is: awk 'BEGIN{...}{...}END{...}' $file_name
to expose an external shell variable within AWK, close and reopen the apostrophes around it, e.g. '$count'
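As a concrete illustration of passing a shell variable into AWK, here is a small sketch (the variable and file names are invented for the example). The quote-splicing form is the '$count' trick described above; awk's -v option does the same thing more readably:

```shell
# Demo data: three rows, two columns
printf 'a 1\nb 2\nc 3\n' > data.txt

row=2   # a shell variable we want AWK to see (example name)

# Quote-splicing: close the single quotes so the shell expands $row,
# then reopen them for the rest of the awk program
awk 'NR=='$row'{print $1}' data.txt            # prints: b

# The -v option passes the variable in explicitly
awk -v n="$row" 'NR==n {print $1}' data.txt    # prints: b
```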
Useful Links:
Bruce Barnett's tutorial
Background Running
Description: You may want to run a process in the background, for example, if you want a process to continue running while you log out of the machine.
There are two ways to do this:
1. Place an ampersand (&) after the script/program when calling it, e.g. "run_getdata.scr &"
2. The following also works:
a. Call the program without the ampersand, e.g. "run_getdata.scr"
b. Temporarily suspend the process using Ctrl-Z (Note: Ctrl-C permanently cancels a process).
c. Type "bg" - your process is now running in the background, picking up where it left off.
d. To bring it back to the foreground, type "fg" in the same window where the process was called, or "fg" followed by the job number (e.g. "fg %1").
Back-Up Strategies
This information is designed to complement Paul's back-up page
Summary: Some drives are backed-up onto tape, which means if the drive fails,
you will be able to access those files from tape.
Drives that are backed-up: Your home directory (on dynamic) and most of the
usr1 drives
Drives that are NOT backed-up: All of the raid drives, the drives on the
windows machines (including SAMBA), and some of the usr1 drives
For example, /nfs/pluto/usr1 is not backed-up. To test whether a usr1 drive is backed-up, log into the system (i.e. ssh pluto) and type "ls /usr1".
If the drive is not backed-up, you will see a message such as "THIS_DIRECTORY_IS_NOT_BACKED_UP_DUE_TO_LACK_OF_LICENSES".
In this case, you must use a different drive to store anything you want backed-up to tape.
You may want to double-check with our system administrator to confirm whether a particular drive is backed-up.
What should be backed-up? The majority of your data should remain unbacked-up on the raid drives.
You should place all scripts, programs, readme files, word documents, powerpoint files, etc. on a drive that is backed-up, plus any other files that took a considerable amount of YOUR effort to create. A good example is a corrected flow direction file.
Data downloaded from the internet, or data that can easily be re-created from your backed-up scripts and programs, do not need to exist on a backed-up drive.
Strategies: There are two strategies that people have used to make sure their programs and scripts remain backed-up.
Strategy 1: All original scripts and programs are created on the usr1 drives and exist only there, and are written so that the file names for the input and output data include the paths to the raid drive where the data exist.
The advantage of this strategy is that you never have to think about performing a back-up procedure, so your most recent scripts are always backed-up.
The disadvantage is that if the usr1 drive goes bad, you have to access the tape to get your files, which can be a nightmare - for stories, talk to Andy!
Strategy 2: All processing is done from the raid drive of your computer, and all original programs and scripts are created there along with the data.
The usr1 drive is used to mirror the file structure of the raid drive, and all programs, scripts, readmes, and other important files are copied into this file structure. This can be done easily using a processing script.
For example, here is an automated script that I use. The script takes a few minutes to run: it re-creates the entire raid file structure on usr1 and copies over key files and scripts.
The advantage of this method is that all of the key files exist on two drives simultaneously, so if one drive goes, you can get the files from the other.
The disadvantage is that you have to remember to run your script from time to time, or you will risk losing your most recent additions.
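The mirror script mentioned above is not reproduced on this page, so here is a minimal sketch of the idea. All paths are placeholders (substitute your own raid and usr1 locations); it re-creates the directory tree and copies only scripts and readmes, leaving bulk data behind:

```shell
#!/bin/sh
# Sketch of a raid -> usr1 mirror for scripts and readmes only.
# SRC and DST are made-up demo paths; substitute your real drives.
SRC=${SRC:-/tmp/raid_demo}
DST=${DST:-/tmp/usr1_demo}

# --- demo setup (stands in for your real raid drive) ---
mkdir -p "$SRC/project/run1"
echo '#!/bin/tcsh' > "$SRC/project/run1/process.scr"
echo notes        > "$SRC/project/run1/README"
dd if=/dev/zero of="$SRC/project/run1/output.bin" bs=1024 count=1 2>/dev/null

# Re-create the directory structure on the backed-up drive
( cd "$SRC" && find . -type d ) | while read -r d; do mkdir -p "$DST/$d"; done

# Copy only the files worth backing up (scripts, readmes), not bulk data
( cd "$SRC" && find . -type f \( -name '*.scr' -o -name '*.pl' -o -name 'README*' \) ) |
while read -r f; do cp "$SRC/$f" "$DST/$f"; done
```

Run from cron or by hand now and then; the bulk data file (output.bin here) is deliberately left behind.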
Cluster Basics
Basic Information can be found in a powerpoint created by Joanna Gaski
(jgaski at hydro.washington.edu),
entitled "cluster_talk.pdf" which can be obtained
from her home directory (i.e. ~jgaski/cluster_talk.pdf).
Parallel Processing: how to compile and run a parallel process on flood:
Debuggers (gdb and ddd)
Description and Use: Can be useful in debugging programs.
DEM (digital elevation model) - SRTM30
Description: We have the global dataset processed into 27 pieces (ArcInfo format)
Directions to obtain DEM data for a single basin:
DHSVM (Distributed Hydrology
Soil Vegetation Model)
Description and Use: DHSVM is the small-scale hydrology research model used in this lab.
Ferret
NOTE: As far as I can tell, Ferret is not currently available on Hydro system.
Talk to Joanna if you need to have it installed.
Description and Use: Visualization package for NetCDF files. Can be used to plot spatial and time series slices of multi-dimensional data. Can also do averaging and other mathematical calculations over all dimensions and produce output postscript files.
FTP
Description and Use: file transfer
GPROF (code profiler)
Description and Use: can be used to tell where the time sinks in your code are.
GMT (Generic Mapping Tool)
Description and Use: This is a mapping tool as well as an analysis
tool. Most of the figures created in this lab were done using GMT.
GRASS (Geographic Resources
Analysis Support System)
Description and Use: A freeware GIS product
HTML Programming
Description and Use: Web pages can be created using a variety of software packages (including Word and SOffice), but can also be written in any text editor (xemacs).
IDL (Interactive Data Language)
Description and Use: Analysis and visualization of spatial data (very powerful for remote sensing analysis)
International Students
Description: This is intended to be specific information to help out our
international students.
LaTeX
Description and Use: Document preparation
Matlab
Description and Use: Mathematics, statistics, data processing, plotting, etc.
Octave
Description and Use: Octave is an open-source math/statistics package with a syntax very similar to Matlab's.
Perl
Description and Use: A powerful and general scripting language:
highly portable, reads binary and ascii data, can work with multiple
files, very flexible, many uses.
Powerpoint to eps for Paper Submission
see Kostas (kostas at hydro.washington.edu) for more information.
1) MPI program:
Set the MPICH (MPI implementation) root directory to
"/usr/local/x86_64/mpich127/pgi-ch-p4"
and add the library and include file paths in your .cshrc
file. For example:
setenv MPICH_ROOT /usr/local/x86_64/mpich127/pgi-ch-p4
setenv INC_MPI ${MPICH_ROOT}/include
setenv LIB_MPI ${MPICH_ROOT}/lib
setenv PATH ${PATH}:${MPICH_ROOT}/bin
Compile your program using either "mpicc" or "pgcc":
mpicc -o your_program your_program.c
pgcc -I$MPICH_ROOT/include -L$MPICH_ROOT/lib -o your_program
your_program.c -lmpich
Run your program with "mpirun":
mpirun -np NUM your_program
where NUM is the number of nodes you are going to use
2) OpenMP program:
Compile your program with "pgcc":
pgcc -mp -o your_program your_program.c
the "-mp" switch enables the OpenMP directives.
Before you run your program you need to set an environment variable for
the number of threads your program will use:
setenv OMP_NUM_THREADS NUM
where NUM is the number of threads (4 processors on each node for FLOOD).
If you use NUM>4, the system will give you a warning message.
Then run your program as usual, no additional arguments necessary.
3) Hybrid MPI/OpenMP program
Compile using "pgcc":
pgcc -mp -I$MPICH_ROOT/include -L$MPICH_ROOT/lib -o your_program
your_program.c -lmpich
and run using "mpirun":
mpirun -np NUM your_program
Don't forget to set the OMP_NUM_THREADS environment variable.
How to Run:
1. Compile the program in debugger mode: gcc -g -o (executable) (program.c)
2. Start the debugger: gdb (executable)
3. Type "run" at the prompt
Useful Links:
Debugging with GDB
Debugging with DDD
The original data can be downloaded from http://www.dgadv.com/srtm30/
step 1. choosing the pieces. Here's how the pieces are
broken up (see map):
parts 1 through 9 are rectangular going from west to east starting at -180
to +180 (i.e. 40 degrees of longitude from west to east) and span from 40
degrees north to 90 north (i.e. 50 degrees of latitude from north to
south). similarly parts 10 through 18 span from 10 south to 40 north
(with the same longitude dimension as parts 1 through 9) and parts 19
through 27 span from 60 south to 10 south (with the same longitude
dimensions as parts 1 through 9).
So, find out what the rough dimensions of your basin are, then you can copy
over just those pieces into your own directory.
For example, if you want part 14, you will do the following command.
First enter into the directory that you want to copy the file, then enter
the following:
cp /nfs/therapy/raid/jenny/global/dem/srtm30/part14_30.arc.gz .
step 2. unzip the files
gzip -d part14_30.arc.gz
step 3. start Arc/Info
you can get some directions for this at the tutorial web-page:
step 4. import the images (for each
part you copied over), e.g.
asciigrid part14_30.arc part14
step 5. merge them together in the GRID module
here are the following commands:
basin_big = merge ( part14 , part15 , part16 )
where "basin_big" is the name of your output file, and part14, part15, and part16 are the parts you want to merge.
If there's only one part, skip this step.
step 6. clip this file to a smaller box around the basin.
you will also do this in the GRID module.
gridclip basin_big basin_small BOX xmin ymin xmax ymax
where you will need to specify xmin, ymin, xmax, and ymax for a box that
encloses your basin.
if you want, you can also clip the file to be exactly that of your basin
delineation. to do this, you will need an arc/info coverage that has the
basin outline, e.g. via this page.
step 7. export from arc/info
quit out of the grid module by typing "quit" or "q"
then type:
gridascii basin_small basin_dem_box.asc
the result is a 30-arc second dem called basin_dem_box.asc. You can use
this for various procedures such as creating a snowbands file, and setting
up your routing network.
Useful Links:
DHSVM Homepage
How to Run:
log into gen: ssh gen
Your gen/default path variable must include: /usr/local/ferret/bin/
Setup:
First-time use: run Finstall to set ferret_paths. Run it by typing: Finstall
The following will appear on your screen:
...
Enter your choice:
(1) Install executables, (2) Customize 'ferret_paths', (3) Exit and do nothing (1,2, or 3) --> SELECT 2
Setup ferret_paths...
The environment variable FER_DIR is currently defined as /usr/local/ferret. This is the directory where the 'fer_environment' tar file was installed.
Is that correct and acceptable (y/n) [y] SELECT YES
The environment variable FER_DSETS is currently defined as /usr/local/ferret. This is the directory where the 'fer_dsets' tar file was installed.
Is that correct and acceptable (y/n) [y] SELECT YES
Enter the complete path of the directory where you want to place the newly created 'ferret_paths' file, for example, '/usr/local'.
desired 'ferret_paths' location --> ENTER A DIRECTORY THAT YOU HAVE WRITE PERMISSION FOR AND THAT IS IN YOUR PERSONAL PATH, e.g. I USE /nfs/dynamic/home/lxb/utilities/
Enter your choice:
(1) Install executables, (2) Customize 'ferret_paths', (3) Exit and do nothing (1,2, or 3) --> SELECT 3
Source your new ferret_paths file:
% source (your input directory from above)/ferret_paths
You are now ready to run ferret and have access to sample datasets, etc.
run: ferret
to get a list of commands, type: show commands
example commands:
to load a file: use (filename).nc
to list variables in currently loaded files: show data
to set dimensions for viewing: set region/i=1:NCOLS/j=1:NROWS/k=1
(i.e. this sets the dimensions to the entire domain, for the first time step)
to plot a spatial plot to the screen: shade (variable name)
to plot a line graph to the screen: plot (variable name)[i=1:NCOLS@ave]
(i.e. this creates a line graph of "variable name" vs. latitude, averaged over all columns of longitude)
to exit: quit
Useful links: Ferret Homepage, in particular the version 5.22 user's guide
Anonymous FTP: ftp
ftp://ftpsite/
Password FTP: ftp ftp://username:password@ftpsite/
then:
bin
prompt
ls, cd (to list or change directories)
get, mget (to download single or multiple files)
put, mput (to upload single or multiple files)
exit or bye
Useful Links:
Paul's Getting Started Page
Our FTP Site, located at: /nfs/dynamo/ftp/pub/.
To use the GPROF code profiler ("gprof" at the command line):
1. Compile the code with the "-pg" option (you can set this in a makefile or on
the command line, e.g., gcc -pg -o code code.c)
2. Run the executable as normal (can be a link to the executable, and
can take arguments),
e.g., vicNl -g globalfile
3. This creates a binary file of profile data, typically called gmon.out (on some systems it may be named after the executable, e.g., vicNl.gmon)
4. Type gprof (executable name only) and redirect the output to a file.
This reads the binary file and translates it to something readable.
For example,
gprof vicNl > log # vicNl is the executable name
the most interesting part is probably after the words "flat profile", which, after giving some definitions, lists the time-sink routines in order.
(contact Andy for more info: AWW-20050809)
If you sit at a Linux machine, you need to add the following GMT paths to your .cshrc file (for csh or tcsh users):
open up your .cshrc file: xemacs ~/.cshrc
paste in the following information:
setenv NETCDFHOME /usr/local/i386/netcdf
setenv GMTHOME /usr/local/gmt
set path=(/usr/local/i386/bin $path)
After making the changes, either source this file (source ~/.cshrc) or open up a new window.
Also, for all users, add the following as a browser bookmark:
/usr/local/gmt/gmt/gmt_services.html
How to Run:
Create a script using a text editor (such as Xemacs): xemacs (file_name)
Make the script executable: chmod +x (file_name)
Enter the name of the script to run it
Use ghostscript or ghostview to view the output postscript file:
gs (file.ps)
ghostview (file.ps)
Examples: Various Plotting Examples related to Hydrology
GMT4 script for plotting time-series (Andy's)
Script for plotting time-series with positive anomalies in red and negative anomalies in blue (Nathalie's)
About K's and O's:
Remember that -K (more output to come) goes in all GMT commands except the last, and -O (overlay) goes in all GMT commands except the first.
Useful Links:
Other Useful GMT scripts
GMT Homepage - this page has an excellent tutorial
Beware: Not sure we have this on the system currently.
How to Run: Type "grass5" at the command prompt - you may need to type "rehash" first.
Useful Links:
GRASS Homepage
Useful Links:
HTML Colors
UW: Creating a Student Web Page
HTML character set
More HTML characters
More HTML info
Note: we do not currently seem to have this on the system
Useful Links:
IDL Homepage
To include more items, I need to hear from you.
Item 1: Regardless of what you hear, IF you have been registered full-time for the previous Autumn, Winter, and Spring quarters, you only need to register for 2 credits during the summer!
How to Run: Type "latex" at the command line.
Useful Links:
LaTeX Homepage
LaTeX Tutorial
How to Run: You can run matlab remotely through the CE network or atmospheric science network if you have an account on either of those systems.
Historically, the
following CE computers have had Matlab: titan, prometheus, phocus, and
pandora. Also, the CE PCs in the grad student lab probably have it.
Example: Hydrology Example - a script Jenny wrote for some research that includes regressions, plotting, statistics, looping, etc.
Useful Links:
Matlab Homepage
Octave can usually run Matlab scripts and functions with little to no modification. Its graphics capabilities are based on GnuPlot, so they don't compare very well with Matlab's.
How to Run: type "octave", then use the usual matlab commands.
For example,
> x=normal_rnd(0,1,1000,1); % generate Gaussian random numbers
> plot(x,'g;Data Series;') % plot the data series with green color and Data Series legend
> gset nokey % get rid of the legend
> c=polyfit(1:length(x),x',2); % fit a 2nd order polynomial
> hold on
> plot(c,'b;Trendline;') % and plot it as a blue line
> clearplot
> closeplot
Links
- Main page
- Documentation
- This page contains many toolboxes not available in the standard Octave package, e.g. Signal Processing, Symbolic Math, etc.
- Octave-Matlab compatibility database
- Octave tutorial
For more info: talk to Kostas (kostas at hydro.washington.edu).
Example Script: Kostas' code for Seasonal Mann-Kendall Trend Test and
Maronna-Yohai Bivariate Test
(see Lettenmaier et al. 1994 for descriptions of both)
Directions:
1. put the archive in a directory
2. untar: tar xzf smk.tar.gz
3. install: ./install.sh
4. start octave
5. usage: if you have a time series vector x, [t,b]=mk(x) and
[t,b]=smk(x) will give you the test statistic and the magnitude of the
trend for the standard and seasonal MK test respectively.
The Maronna-Yohai bivariate test can be used as [t,b]=maronnayohai(x) (results for this are untested)
How to Run:
The script must have the following on the first line: #!/usr/bin/perl
Example: Hydrology Example
Useful Links:
Perl Homepage
CPAN - includes tutorials, scripts, source code, etc.
Description: You can prepare a figure for paper submission entirely in a GMT
script.
On the other hand, you may want to add various additions to the figure using powerpoint (such as text labels, etc.). Alan has found a simple solution for preparing these "hybrid" images for publication, since journals prefer high-quality images such as *.eps.
Here are Alan's comments and directions for this procedure:
I've recently been making many "hybrid" pictures with Powerpoint using
multiple GMT-generated ps images imported as bitmaps, interspersed with
text boxes or other powerpoint generated graphics. These work very well
for presentations, but the downside is that when used for papers it's
difficult to get acceptable quality when converting to other image formats
such as tiff or eps that the journals insist on. The alternative is to
remake all the figures in GMT (or a similar application). This works
well, but is very time consuming if the figure has a large number of
panels and labels.
So anyway I wanted to find a way to get a decent eps image from
powerpoint. I think I have a way that works pretty well:
1) print each powerpoint slide to the tektronix 740 printer redirecting
the output to a file (this produces a postscript file with some extra
bells and whistles in the beginning)
2) open this postscript file in gsview and use PStoEPS (from the file
menu) to convert to an eps file.
The resulting eps file will display in gsview, acrobat distiller will turn
it into a pdf file, and you can print it directly from gsview.
So far I haven't been able to insert the resulting eps file in Word and
have it print (I think the extra printer instructions are causing
problems), but this is not necessary when sending image files to a
journal, of course. For inserting in Word, I usually just copy and paste
(as a picture) directly from Powerpoint to Word.
To make a black and white image, use the xerox 3450 printer driver in step
1 above.
R
Description and Use: a statistics package
How to Run: type "R" on the command line
Example: more information
Useful Links:
R-project
R Primer
Scripting/Programming Interface
Description and Use: Shell scripts can be used to simplify complex
processes by calling individual programs.
How to Run:
Create a script using a text editor (such as Xemacs): xemacs (file_name)
The script must have something like the following on the first line: #!/bin/tcsh
Make the script executable: chmod +x (file_name)
Enter the name of the script to run it
Remember that before calling a program, the program must be compiled. In C, this can be done as follows:
gcc -lm -o (executable_name) (program.c) -Wall
In this example, -lm includes the mathematics library, -o specifies the executable name, and -Wall causes all warnings to be listed.
Fortran: f77 to compile; f95 accesses the NAG f95 compiler
Example: Hydrology Example
SNOTEL data and processing
See /usr/local/hydro/data/snotel/README.txt.
SNOTEL data is archived in this directory, but proceed with caution. Ted reports that the scripts for parsing these data into VIC-style timeseries format are not really accurate, because the original SNOTEL files have no placeholder for missing data points.
You would need to update the script get_snotel_timeseries.pl, which allows you to query for all SNOTEL sites that match your criteria (based on lat/lon range, station name, state, elevation, start/end dates) and produces timeseries for all stations that match.
Contact Ted (tbohn at hydro.washington.edu) if you have questions.
Star Office
Description and Use: a Unix office suite comparable to Microsoft Office
How to Run: type "soffice" on the command line
Note: If you get an error trying to open soffice, try removing the star office
directories, then repeat the initialization.
Further Note: On newer systems, type "ooffice".
Updating Web-Sites for Presentations, Publications, and Group Seminar
Description: Whenever you give a presentation or submit a paper, you need to
update the appropriate web-site.
(for more information, contact Nathalie: nathalie at hydro.washington.edu )
WARNING!!! Never, never open this html document with any software other than a simple text editor such as xemacs, emacs, pico, or vi. Do not use any browsers, word, soffice, etc. to edit this page. If you do so, you stand a very good chance of corrupting the file. You only need to know very simple html programming to edit the file, so there are absolutely no excuses.
For Presentations (for occasions other than our group seminar)
Step 1: log into dynamic
Step 2: go to the directory: "cd
/nfs/dynamic/usr/local/www/data/Lettenmaier/"
Step 3: cp the document to a back-up copy: "cp presentations.php
presentations.php.bak"
Step 4: edit the file presentations.php using a simple text editor
Step 5: make sure that when you are done with the file, people still have permission to edit it (type "ls -l presentations.php"); the permission should still be for all users - if it's not, you need to make it so: "chmod a+w presentations.php"
Step 6: copy the presentation into the appropriate directory:
e.g.
/nfs/dynamic/usr/local/www/data/Lettenmaier/Presentations/2009/.
if the presentation took place in 2009. Follow the naming
standard: lastname_subject_conference_monYY.ppt
for example: wood_wfcst_GEWEX_jun06.ppt
(try to abbreviate where possible to avoid superlong
filenames)
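Steps 2 through 5 above can be captured in a small sketch. The path is replaced by a temporary stand-in directory for illustration; substitute the real Lettenmaier www path when doing this for real:

```shell
# Stand-in for /nfs/dynamic/usr/local/www/data/Lettenmaier (demo only)
WWW=${WWW:-/tmp/www_demo}
mkdir -p "$WWW"
echo '<html></html>' > "$WWW/presentations.php"

cd "$WWW"
cp presentations.php presentations.php.bak   # Step 3: keep a back-up copy
# Step 4: edit presentations.php with a plain text editor (not shown)
chmod a+w presentations.php                  # Step 5: keep it writable for everyone
ls -l presentations.php                      # verify the permissions
```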
For Publications
Step 1: log into dynamic
Step 2: go to the directory: "cd
/nfs/dynamic/usr/local/www/data/Lettenmaier/"
Step 3: cp the document to a back-up copy: "cp publications.php
publications.php.bak"
Step 4: edit the file using a simple text editor
Step 5: make sure that when you are done with the file, people still have permission to edit it (type "ls -l publications.php"); the permission should still be for all users - if it's not, you need to make it so: "chmod a+w publications.php"
Step 6: for submitted manuscripts, copy the pdf-format file into the appropriate directory:
e.g.
/nfs/dynamic/usr/local/www/data/Lettenmaier/Publications/.
Once the paper has been published, you
must move the citation from "Papers in review/press" to "2009", and
remove the pdf from any public (www, public_html) folders.
For Group Seminar
Vacation Email
Description: While you are on vacation, an automated response can reply to incoming messages, stating that you are out of email contact.
VIC (Variable Infiltration
Capacity) Model
Description and Use: VIC is the large-scale hydrology research
model used in this lab.
Various Scripts (added by request)
Internal Data Collection (added by request)
Step 1: log into dynamic
Step 2: go to the directory: "cd
/nfs/dynamic/usr/local/www/data/Lettenmaier/CurrentResearch"
Step 3: cp the document to a back-up copy: "cp HydroSeminar_2008_2009.php
HydroSeminar_2008_2009.php.bak"
Step 4: edit the file HydroSeminar_2008_2009.php using a simple text editor
Step 5: make sure that when you are done with the file, people still have permission to edit it (type "ls -l HydroSeminar_2008_2009.php"); the permission should still be for all users - if it's not, you need to make it so: "chmod a+w HydroSeminar_2008_2009.php"
Step 6: copy the presentation into the appropriate directory:
e.g.
/nfs/dynamic/usr/local/www/data/Lettenmaier/CurrentResearch/HydroSeminar/2009/.
if the presentation took place in 2009. Follow the naming
standard: lastname_subject_hydroseminar_monYY.ppt
for example: adam_arctrend_hydroseminar_may06.ppt
(try to abbreviate where possible to avoid superlong
filenames)
This automated response replies to any incoming messages and states that you are out of email contact.
Step 1: create a file called ".forward"
It should look something like this:
\jenny, "|vacation jenny"
Step 2: create a file called ".vacation.msg"
It should look something like this:
From: jenny@hydro.washington.edu (Jennifer Adam)
Subject: away from my mail
I am away from my email until Monday, April 18. I will respond asap after that date.
Jenny
Step 3: In the command line on dynamic type, "vacation -I"
Step 4: When you return from vacation, disable the automated response.
You can do this with the following command, "mv .forward
.forward.vacation"
To enable it again, you would type, "mv .forward.vacation
.forward"
Useful Links:
VIC Homepage, in particular VIC as a Nine-Step Process
/usr/local/hydro/build/tools/. for tools and /usr/local/hydro/utils/ for smaller utilities
/usr/local/hydro/data/. Currently there are data specific to VIC, DHSVM, PRISM, and SNOTEL.
Liz Clark
Last modified: Fri Jul 24 15:37:45 PDT 2009