Proposal for RC --> NSIDC data transfer methodology: Rev. 1.11

Bruce Raup braup at nsidc.org
Fri Oct 5 12:46:52 MDT 2001


Hi Rick,

Good feedback.

You're right that we need example files, and we can create a mock set.  In
fact, we've done part of the job already.  I realize, too, that part 6 of
the proposal, "PUTTING IT ALL TOGETHER", needs considerable augmentation.

One of the next things on my to-do list is to draft lists of valid values
for the various enumerated data types in the database (e.g.,
segment_left_material, instrument names in the Instrument table, ellipsoid
names, etc.).
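
Just to illustrate the kind of thing I have in mind (the value lists below
are placeholders I made up, not the real ones, and check_attribute is only
an invented helper name), a submission checker could compare attributes
against such lists with a few lines of Python:

    # Sketch only: the attribute names come from the proposal, but every
    # value list below is a placeholder until the real lists are drafted.
    VALID_VALUES = {
        "segment_left_material": {"ice", "rock", "debris", "water"},
        "instrument":            {"ASTER", "Landsat 7 ETM+"},
        "ellipsoid":             {"WGS 84", "GRS 1980"},
    }

    def check_attribute(name, value):
        """Return True if `value` is an allowed value for attribute `name`."""
        allowed = VALID_VALUES.get(name)
        if allowed is None:
            raise KeyError("unknown attribute: %s" % name)
        return value in allowed

    # Example:
    #   check_attribute("segment_left_material", "rock")   -> True
    #   check_attribute("ellipsoid", "Clarke 1866")         -> False

Once the real lists exist, we could distribute them as plain text files and
have a script like this read them in, so nobody has to keep values in sync
by hand.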

> I began to set up an ArcView session for manually digitizing glacier
> outlines, etc., to test out how I might incorporate your proposal ideas
> using a common commercial app.   I quickly realized that we will need a
> template or some sort of avenue script that sets up each new session in a
> quick and standard manner so that each RC will not have to construct each
> submission from scratch.  Do you already have such examples or do you
> want Hugh and me and/or several RCs to submit templates and scripts to
> distribute from our website?

We can provide example files, but it would be great to get help from you
and other RCs to set up tool-specific procedures for generating files in
those formats.  I guess you can cover ArcView; might Frank Paul want to
take on the task for GRASS?

> Also, I am not sure that 8.3 filename format is required (by ESRI) since
> most of my current shapefiles work fine using very long filenames.  It
> seems that the format is more in the ballpark of 32.3.  Perhaps 8.3 is only
> for the C library that you list?  I believe that there is a restriction
> to lower case only for shapefile names.

Maybe people ignore this "restriction" in practice.  Page 2 of the
"ESRI Shapefile Technical Description:  An ESRI White Paper"
(http://www.esri.com/library/whitepapers/pdfs/shapefile.pdf) does say
the names follow the 8.3 convention, but if long names work fine in
reality, I fully support their use.
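
If we do end up recommending the conservative naming, it is easy to check
in a script.  Here is a rough Python sketch (my own illustration, nothing
from the proposal; is_conservative_name is just a made-up helper):

    import os
    import re

    # Conservative check: base name of up to 8 lower-case characters plus
    # one of the three required shapefile extensions.  Relax this if long
    # names turn out to be acceptable everywhere.
    SHAPEFILE_NAME = re.compile(r"^[a-z0-9_]{1,8}\.(shp|shx|dbf)$")

    def is_conservative_name(path):
        """True if the file's base name fits the 8.3, lower-case convention."""
        return SHAPEFILE_NAME.match(os.path.basename(path)) is not None

    for name in ("glac0001.shp", "GlacierOutlines_Alaska_2001.shp"):
        print(name, "->", is_conservative_name(name))
    # glac0001.shp -> True
    # GlacierOutlines_Alaska_2001.shp -> False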

> Finally, we now have a few nice ASTER images with the proper gain
> settings that might be used as standard test images for everyone to take
> their first stab at putting together a database ingest.  Andy Kaeaeb
> suggested that we do this for algorithm development also.  My preference
> would be to simply supply the start date/time of one or two images to
> everyone and let each RC download the data from EDG and process as they
> see fit.  This method would not only give us a chance to work out the
> submission bugs using "standard" data, but would also let us see if
> anyone still has problems ordering or using standard ASTER HDF-EOS data.

I agree that such a multi-pronged end-to-end test would be quite valuable.
We first need to nail down the valid values for attributes and the transfer
format, but it's probably a good time to decide on which ASTER images would
be appropriate for the tests.  It would be good if they covered a wide
range of glacier sizes and types (terminus type, amount of debris cover,
etc.).

> Thanks for keeping the GLIMS gears turning.

Likewise.  They are very big gears, so we all need to have a hand on them.

Bruce

-- 
Bruce Raup
National Snow and Ice Data Center                     Phone:  303-492-8814
University of Colorado, 449 UCB                       Fax:    303-492-2468
Boulder, CO  80309-0449                                    braup at nsidc.org




