How do I use WGET for bulk downloading this data set?

WGET Instructions - for command line in Mac and Unix/Linux


1. Configure your username and password for authentication using a .netrc file
>cd ~
>touch .netrc
>echo "machine urs.earthdata.nasa.gov login <uid> password <password>" >> .netrc
>chmod 0600 .netrc

where <uid> is your URS username and <password> is your URS password. Do not include the brackets <>.
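The step above can be sketched as a single script. MYUSER and MYPASS are placeholder credentials, and a temporary directory stands in for your home directory so the sketch is safe to run as-is:

```shell
# Sketch of step 1, using a temporary directory in place of $HOME so the
# script can be run safely anywhere; MYUSER and MYPASS are placeholders.
HOME_DIR=$(mktemp -d)
NETRC="$HOME_DIR/.netrc"
touch "$NETRC"
echo "machine urs.earthdata.nasa.gov login MYUSER password MYPASS" >> "$NETRC"
chmod 0600 "$NETRC"
# WGET ignores a .netrc that is readable by other users, so confirm the mode:
ls -l "$NETRC" | cut -c1-10    # -rw-------
```

The `chmod 0600` step matters: if the file is group- or world-readable, WGET will refuse to use the credentials stored in it.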

2. Create a cookie file. This will be used to persist sessions across individual cURL/WGET calls, making them more efficient.
>cd ~
>touch .urs_cookies

3. Use a WGET command to download your data. Example WGET command:

>wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies \
--no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off \
"https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0481_MEASURES_greenland_V01/"

 
WGET Instructions - for Windows

1. Create a text file called "mycookies.txt" to store the website cookies returned from the HTTPS server. Store this file in the wget installation directory.

2. Use a WGET command to download your data. Example WGET command:

>wget --http-user=[USERNAME] --http-password=[PASSWORD] --load-cookies mycookies.txt --save-cookies mycookies.txt ^
--keep-session-cookies --no-check-certificate --auth-no-challenge -r --reject "index.html*" -np -e robots=off ^
"https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0481_MEASURES_greenland_V01/"

As composed above, the WGET request will download files from the specified directory into a directory on your machine named after the HTTPS host. For NSIDC data, this will be "daacdata.apps.nsidc.org".

You can modify the name and hierarchy of the directories where the files are written using the WGET flags -nd (or --no-directories), -nH (or --no-host-directories), and --cut-dirs=number, where number is the number of directories to cut. Note that --cut-dirs does not remove the host directory name itself; include -nH as well to remove that.
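As an illustration of the directory flags, the Mac/Unix command from step 3 could be modified as below; -nH drops the host directory and --cut-dirs=2 drops the next two path components (pub/DATASETS), so files land directly under ./nsidc0481_MEASURES_greenland_V01/ rather than ./daacdata.apps.nsidc.org/pub/DATASETS/nsidc0481_MEASURES_greenland_V01/:

```shell
# Same recursive download as step 3, but with a flattened local hierarchy:
# -nH removes the host directory, --cut-dirs=2 removes "pub/DATASETS".
wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies \
     --keep-session-cookies --no-check-certificate --auth-no-challenge=on \
     -r --reject "index.html*" -np -e robots=off -nH --cut-dirs=2 \
     "https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0481_MEASURES_greenland_V01/"
```

Adding -nd instead would place every file in the current directory with no subdirectories at all.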