Databasing

Luke Copland luke.copland at ualberta.ca
Fri Jun 29 17:37:57 MDT 2001


Hi all,
        Attached below is an email exchange that Graham Cogley and I
have been
having recently about some of our GLIMS databasing questions - we're
interested to hear any comments you may have. The email exchange arose
from a summary document I put together a few months ago (also attached)
for some people here at the University of Alberta who will be involved
with GLIMS. It gives a brief introduction to the project, and discusses
some of the issues that were brought up in Maryland. I haven't updated
it since March, so some of the information may now be out of date (NB:
QEI in the document stands for Queen Elizabeth Islands = Canadian High
Arctic).

Cheers,
Luke
        
-- 
Luke Copland                         Office: Tory 3-20
Dept. Earth & Atmospheric Sciences      Tel: 780 707 5583 
University of Alberta                   Fax: 780 492 7598
Edmonton, Alberta T6G 2E3, Canada     Email: luke.copland at ualberta.ca

http://arctic.eas.ualberta.ca/luke


-----------------
"J. Graham Cogley" wrote:
>
>      Luke - This will be very useful - it was worth doing.  I have just a
> couple of instant thoughts:
>
>      1.  I don't understand your comment (p5) about errors not being
> explicit
> if L/L coordinates are used.  The positional errors could be "migrated"
> from the N/E system with spherical or (yuk) ellipsoidal trigonometry.  Or
> I may be missing something.
...when I say that the errors are not explicit in the L/L system, I mean
that at present (as I understand it) the L/L system will just give you a
list of coordinates and no information about how they were derived. This
is in comparison to the N/E system, where information about the common
reference point will be provided. Your idea of 'migrating' errors to the
L/L system is a good one, and ties in with my thoughts and your comments
below. If we do decide to only go with the L/L system, then we should
make sure that there is an associated table which describes estimated
errors from transformations, georeferencing, etc. We probably won't be
able to do this point by point, but would rather have an error estimate
for each image.
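
As a rough illustration of the 'migration' idea (a sketch only, assuming a
spherical Earth and a hypothetical function name, not anything in the GLIMS
spec), a per-image error in metres could be re-expressed as Lat/Long
uncertainties like this:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean spherical radius; an ellipsoid would be more exact

def metres_to_degrees(error_m, lat_deg):
    """Convert a positional error in metres (N/E) into the roughly equivalent
    uncertainty in degrees of latitude and longitude at a given latitude."""
    dlat = math.degrees(error_m / EARTH_RADIUS_M)
    # one degree of longitude shrinks with cos(latitude)
    dlon = math.degrees(error_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return dlat, dlon

# e.g. a 50 m registration error at 79 N (roughly the QEI)
dlat, dlon = metres_to_degrees(50.0, 79.0)
```

At high latitudes the longitude uncertainty is several times the latitude
uncertainty, which is one reason the error table would need the image latitude
and not just a single number.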

>
>      2.  On your query 3(c), p6: the simplest interim decision might be to
> register all images and vectors to a chosen "best" image (conceivably a map,
> or just the most convenient satellite image or air photo), quite carefully.
> Try to store standard errors of estimate for transformations, etc.  Then
> when better positional information becomes available, such as a better map
> or a GPS survey of GCPs, there is only one further transformation to make.
> It is not clear, though, how this could be accommodated in the database
> structure.
...this is a good idea, but as you say it would ultimately be very hard
to put into practice. There is also the problem that we're working over
large areas that cross many map/scene boundaries. What we've now decided
to do here is to use the published 1:250,000 NTS map sheets as our
'base' images, and then reference the new satellite imagery to these. In
turn, the 1959/60 aerial photos are georeferenced to the corrected
satellite imagery (mainly because it's very hard to compare the detailed
aerial photos to the large scale maps).
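
One attraction of this chain (maps -> satellite -> air photos) is that the
registrations compose: if each step is stored as an affine transform, the
photo-to-map transform is just the product of the two, and independent
registration errors combine roughly in quadrature. A minimal sketch, with
made-up transform values purely for illustration:

```python
import math

def compose(t1, t2):
    """Compose two affine transforms (a, b, c, d, e, f), where
    (x, y) -> (a*x + b*y + c, d*x + e*y + f); t1 is applied first."""
    a1, b1, c1, d1, e1, f1 = t1
    a2, b2, c2, d2, e2, f2 = t2
    return (a2*a1 + b2*d1, a2*b1 + b2*e1, a2*c1 + b2*f1 + c2,
            d2*a1 + e2*d1, d2*b1 + e2*e1, d2*c1 + e2*f1 + f2)

def apply(t, x, y):
    a, b, c, d, e, f = t
    return a*x + b*y + c, d*x + e*y + f

# hypothetical registrations: air photo -> satellite scene, scene -> NTS map
photo_to_scene = (2.0, 0.0, 100.0, 0.0, 2.0, -50.0)
scene_to_map = (1.0, 0.0, 10.0, 0.0, 1.0, 20.0)
photo_to_map = compose(photo_to_scene, scene_to_map)

# independent per-step standard errors (m) add in quadrature
total_error = math.sqrt(3.0**2 + 5.0**2)
```

Storing the per-step transforms and errors (rather than only the end result)
would also make Graham's "one further transformation" upgrade possible later.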

>
>      3.  The GLIMS definition of "N/E" in Appendix 2 (p12) is not very
> clear.  "Local ground coordinate system (abbreviated N/E). Units are meters
> north and meters east of some reference point on the ground that is probably,
> but not necessarily, contained in the image." fails to mention the crucial
> point that you have to *define north* before the system is of any use.
> Later it emerges that what it really means by "N/E" is "any well-defined
> cartographic projection". So you can't or shouldn't just make up any old
> N/E system, in which case I'm not sure what the objection to L/L is.  The
> bit about taking longer to read an L/L position than an N/E is baffling -
> if you use the same number of bits the two operations should take the same
> time, and in any medium-sized project you will have to do lots of
> interconversions between the two systems anyway.
...my thoughts exactly. I don't see that the pros of the N/E system
outweigh the cons, and your point about defining north is very good.
Since we're referencing our imagery to published maps, it makes more
sense just to store everything directly in Lat/Long (or UTM).
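
If we do store in UTM, the zone at least is mechanical - it follows directly
from longitude (a sketch; it ignores the Norway/Svalbard zone exceptions,
which don't affect the QEI, and note that UPS rather than UTM is conventional
above about 84 N):

```python
def utm_zone(lon_deg):
    """Standard 6-degree UTM zone number for a longitude in [-180, 180)."""
    return int((lon_deg + 180.0) // 6) + 1

# e.g. 90 W, in the middle of the Queen Elizabeth Islands
zone = utm_zone(-90.0)
```

The catch for us is that the QEI span several zones, so a single UTM zone per
data set won't work - another small argument for Lat/Long as the master system.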

>
>      4.  It will be nice if GLIMS "upgrades" to a better definition of
> "glacier", but there are some complex questions lurking just below the
> surface.  As far as I can see the database makes no provision at all for
> topology, so the little discussion about whether tributaries "count" or
> not is going on in a vacuum. The Dynamic table needs to have some system
> of pointer codes to represent tributary-mainstream relationships, but
> to make good use of such a system it would have to be many-to-one (because
> everyone would want to start with the mainstream and find the tributaries,
> rather than the other way around).
...agreed - it seems to be easier to define a glacier by starting at the
terminus and working up to find the accumulation area, rather than the
other way around. Hopefully some of the others will have more
comments/thoughts about this.
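
To make the topology point concrete (a sketch with hypothetical glacier IDs,
not a proposal for the actual Dynamic table schema): the table could store one
pointer per tributary naming its mainstream - the many-to-one arrangement
Graham describes - and the mainstream-to-tributaries view is then recovered by
inverting those pointers at query time:

```python
from collections import defaultdict

# hypothetical IDs; each tributary points to the glacier it flows into
flows_into = {"T1": "Main", "T2": "Main", "T1a": "T1"}

# invert the many-to-one pointers so we can walk mainstream -> tributaries
children = defaultdict(list)
for trib, mainstream in flows_into.items():
    children[mainstream].append(trib)

def tributaries_of(glacier):
    """Start at a terminus and work up-glacier, collecting every tributary."""
    found = []
    for t in sorted(children[glacier]):
        found.append(t)
        found.extend(tributaries_of(t))
    return found
```

This matches the terminus-first direction above: one query at "Main" pulls in
the whole tree, whereas going the other way would need a search over all
records.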

>
>      It's a bit like (in fact, exactly like) doing a Horton morphometric
> analysis of a semi-arid drainage basin in which the streams sometimes
> dry up and fail to connect.  I wonder whether anyone still bothers with
> that kind of thing in modern fluvial geomorph?  Even if not, computer
> scientists know everything there is to know about describing networks,
> including trees.
>
>      I suspect that including topology properly could be a big job.
>
> Now that I have written so much I may as well write a bit more:
>
> A. I have made some progress with HDF. The program hdf2bin.exe, on
> Bruce's list in his software survey, does the job of writing binaries
> of the contents of hdf files, so I can now make a start.  The bad news
> is that it generates several hundred files per hdf file, but I can deal
> with that.  I also got Geomatica Freeview, which is great for
> visualization but will not save anything.  And for ETM images, HDF is a
> smokescreen and you can work around it with relative ease.
...I usually just use the screen capture feature in Paint Shop Pro to
export images from Geomatica Freeview - it's a low-tech solution, but
seems to work well for most stuff.

>
>      All the best,
>
>      Graham.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: GLIMS project summary.pdf
Type: application/pdf
Size: 39852 bytes
Desc: not available
URL: <https://nsidc.org/pipermail/glims/attachments/20010629/5be920d2/attachment.pdf>

