Knowledge Base

Options Available for Bulk Downloading Data from HTTPS with Earthdata Login

On Wednesday, October 10, 2018, Earthdata removed support for TLS 1.0 (a security protocol used to encrypt HTTPS connections) on all its services, including NSIDC data access. Access to NSIDC data now requires a web client (browser, API client, or command-line tool) that supports TLS 1.1 or higher. This change affects users who programmatically download data with the cURL and WGET command-line tools.
On Thursday, December 6, 2018, Earthdata will also remove support for TLS 1.1 on all its services; we therefore suggest updating to TLS 1.2 to avoid further interruptions.

We recommend that users upgrade their cURL, WGET and OpenSSL libraries.  
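Before upgrading anything, it can help to confirm which versions you currently have installed. As a rough guide (exact minimums depend on how the tools were built), TLS 1.2 support arrived around curl 7.34, wget 1.15, and OpenSSL 1.0.1:

```shell
# Print the installed versions of curl, wget, and OpenSSL.
curl --version | head -n 1
wget --version | head -n 1
openssl version
```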
 
1.  To check your OpenSSL version:
 
openssl ciphers -v ALL | awk '{print $2}' | sort -u
 
If the output includes TLSv1.2, you have an appropriate implementation of OpenSSL. Otherwise, you may need to upgrade your version of OpenSSL.
 
2.  To check your version of TLS using a browser, curl, or wget:
 
curl --tlsv1.2 --silent https://www.howsmyssl.com/a/check
or
wget -q -O - https://www.howsmyssl.com/a/check
or
open https://www.howsmyssl.com/ in your preferred browser

 
If the "Version" section of the web page, or the "tls_version" element of the JSON returned by curl or wget, reports TLS 1.2 or higher, you don't need to do anything. If the version is lower, you may need to upgrade your version of curl, wget, or your browser, or the SSL library that supports those tools.
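To pull out just the reported version from the curl/wget output, a simple grep filter over the JSON works (a sketch; the field name is "tls_version" as noted above):

```shell
# Fetch the JSON report and print only the "tls_version" field.
curl --silent https://www.howsmyssl.com/a/check | grep -o '"tls_version": *"[^"]*"'
```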

If you have questions, please email nsidc@nsidc.org.

There are a few options for bulk downloading, depending on the tools you have available.
•    Using your browser (Firefox only)
•    WGET
•    cURL

Using your browser (Firefox only)
You can use the DownThemAll! extension for Firefox if you are using a Mac or Windows computer: https://addons.mozilla.org/en-US/firefox/addon/downthemall
DownThemAll! is not supported in Firefox Quantum, which is the newest version of the browser.

1. In your Firefox browser, navigate to the directory containing files you want to download.
2. Right click anywhere in the window and select "DownThemAll!"
3. A window will appear that allows you to select which files to download.
4. The process runs in the background so you can go to another directory.

WGET Instructions - for command line in Mac and Unix/Linux
1. Configure your username and password for authentication using a .netrc file

echo "machine urs.earthdata.nasa.gov login <uid> password <password>" >> ~/.netrc
chmod 0600 ~/.netrc

where <uid> is your Earthdata Login username and <password> is your Earthdata Login password. Do not include the brackets <>.
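You can also double-check that the file's permissions are correct; some wget and curl builds will ignore a .netrc that is readable by other users (a quick sanity check, assuming a POSIX `ls`):

```shell
# The permissions column should read -rw------- (owner read/write only).
ls -l ~/.netrc
```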

2. Use a WGET command to download your data. Example WGET command:

wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off <insert complete HTTPS data URL>

WGET Instructions - for Windows
1. Create a text file to store the website cookies returned from the HTTPS server, called "mycookies.txt". Store this in the wget installation directory.

2. Use a WGET command to download your data. Example WGET command:

wget --http-user=[USERNAME] --http-password=[PASSWORD] --load-cookies mycookies.txt --save-cookies mycookies.txt --keep-session-cookies --no-check-certificate --auth-no-challenge -r --reject "index.html*" -np -e robots=off <insert complete HTTPS data URL>

WGET Tips 

Mac and Unix/Linux users can skip storing the username and password in a .netrc file by adding --http-user=[USERNAME] --http-password=[PASSWORD], or --http-user=[USERNAME] --ask-password, to the WGET command.

The WGET examples provided in this article will download files from the specified directory to a directory on your machine. The directory on your machine will be named after the HTTPS host. For NSIDC, this will be either "n5eil01u.ecs.nsidc.org" or "daacdata.apps.nsidc.org".

You can modify how the files are saved using the following WGET flags:
-nd (or --no-directories)
-nH (or --no-host-directories)
--cut-dirs=number (where number is how many leading remote directories to remove from the saved path; this count does not include the host directory name, so combine it with -nH to remove that as well)

The GNU WGET 1.18 Manual provides more details on WGET options.
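As an illustration, assuming a hypothetical recursive download from https://n5eil01u.ecs.nsidc.org/DIR1/DIR2/ (DIR1 and DIR2 are placeholder directory names), the flags above change the saved paths roughly as follows:

```shell
# Remote file: https://n5eil01u.ecs.nsidc.org/DIR1/DIR2/file.dat
# (no flags)         -> n5eil01u.ecs.nsidc.org/DIR1/DIR2/file.dat
# -nH                -> DIR1/DIR2/file.dat
# -nH --cut-dirs=2   -> file.dat
# -nd                -> file.dat (all files saved flat, no directory hierarchy)
```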

cURL Instructions - for command line in Mac and Unix/Linux
Step 1 of the WGET instructions (configuring your .netrc file) also applies here; then use the following cURL command instead of WGET.

curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O <insert complete HTTPS data URL>

The -O option on the cURL command downloads the file to the current working directory on your computer. 
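To bulk-download many files with cURL, one approach is to loop over a plain list of URLs (a sketch; "urls.txt" is a hypothetical file with one complete HTTPS data URL per line):

```shell
# Download each URL listed in urls.txt, reusing the Earthdata cookie jar
# and the .netrc credentials (-n) configured in step 1.
while read -r url; do
    curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O "$url"
done < urls.txt
```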

Last updated: 30 October 2018
