Options Available for Bulk Downloading Data from HTTPS with Earthdata Login

There are a few options depending on the tools that you have available.
•    Using your browser (Firefox only)
•    WGET
•    cURL

Using your browser (Firefox only)
You can use the DownThemAll! add-on for Firefox if you are using a Mac or Windows computer: https://addons.mozilla.org/en-US/firefox/addon/downthemall/
1. In your Firefox browser, navigate to the directory containing files you want to download.
2. Right click anywhere in the window and select "DownThemAll!"
3. A window will appear that allows you to select which files to download.
4. The downloads run in the background, so you can navigate to another directory while they complete.

WGET Instructions - for command line in Mac and Unix/Linux
1. Configure your username and password for authentication using a .netrc file
>cd ~
>touch .netrc
>echo "machine urs.earthdata.nasa.gov login <uid> password <password>" >> .netrc
>chmod 0600 .netrc

where <uid> is your URS username and <password> is your URS password. Do not include the brackets <>.
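When filled in, the .netrc file contains a single line. The username "janedoe" and the password below are placeholders only; substitute your own Earthdata Login credentials:

>cat ~/.netrc
machine urs.earthdata.nasa.gov login janedoe password MyPassword123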

2. Create a cookie file. This file is used to persist sessions across individual WGET/cURL calls, which makes repeated requests more efficient.
>cd ~
>touch .urs_cookies
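
As an optional sanity check, you can confirm that both files exist and that .netrc kept the restricted permissions set in step 1:

>ls -l ~/.netrc ~/.urs_cookies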

3. Use a WGET command to download your data. Example WGET command:

>wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate \
--auth-no-challenge=on -r --reject "index.html*" -np -e robots=off <insert complete HTTPS data URL>
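
If you only need certain file types from the directory tree, you can optionally add WGET's -A (accept) flag to the same command. In the sketch below, the pattern "*.hdf" is only an example; substitute the extension of the files you actually want:

>wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate \
--auth-no-challenge=on -r -A "*.hdf" --reject "index.html*" -np -e robots=off <insert complete HTTPS data URL>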

WGET Instructions - for Windows
1. Create a text file called "mycookies.txt" to store the website cookies returned from the HTTPS server. Save this file in the WGET installation directory.

2. Use a WGET command to download your data. Example WGET command:

>wget --http-user=[USERNAME] --http-password=[PASSWORD] --load-cookies mycookies.txt --save-cookies mycookies.txt --keep-session-cookies --no-check-certificate --auth-no-challenge -r --reject "index.html*" -np -e robots=off <insert complete HTTPS data URL>

As composed above, the WGET request will download files from the specified directory to a directory on your machine that is named after the HTTPS host. For NSIDC data, this will be either "n5eil01u.ecs.nsidc.org" or "daacdata.apps.nsidc.org".

You can control the name and hierarchy of the directories the files are written to with the WGET flags -nd (or --no-directories), -nH (or --no-host-directories), and --cut-dirs=<number>, where <number> is the number of remote directory levels to cut. Note that --cut-dirs does not remove the host directory itself; add -nH to remove that as well.
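
For example, the sketch below is the same recursive download with -nH and --cut-dirs added, so that neither the host name nor the first two levels of the remote path are recreated locally (the value 2 is only illustrative; set it to the number of leading remote directories you want dropped):

>wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate \
--auth-no-challenge=on -r --reject "index.html*" -np -e robots=off -nH --cut-dirs=2 <insert complete HTTPS data URL>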

CURL Instructions - for command line in Mac and Unix/Linux
Steps 1 and 2 of the WGET instructions still apply: cURL uses the same .netrc and .urs_cookies files. Simply use a cURL command instead of WGET:
>curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n <insert complete HTTPS data URL>

Adding -O to the cURL command saves the file to the current working directory on your computer under its remote file name; without it, cURL writes the downloaded data to standard output.
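
cURL downloads one URL at a time, so for bulk downloads you can loop over a list of URLs. A minimal sketch, assuming your HTTPS data URLs are saved one per line in a hypothetical file called url_list.txt:

>while read url; do curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O "$url"; done < url_list.txt

Each file is saved to the current working directory under its remote file name.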