Downloading Data from Earthdata Cloud to Your Local Computer Using the Command Line
This article shows how to download NSIDC DAAC data from NASA Earthdata Cloud to your computer using three tools: wget and curl—popular command-line utilities for downloading files—and Data Downloader, a Python-based command-line tool developed by the Physical Oceanography Distributed Active Archive Center (PO.DAAC).
Prerequisites
- An Earthdata Login (EDL) account—register at the Earthdata Login registration page
- A .netrc file in your home directory with your EDL authentication credentials (a sample entry is shown after this list). For setup instructions, see: https://nsidc.org/data/user-resources/help-center/creating-netrc-file-earthdata-login
- Access to a command line:
- Linux and macOS users: Built-in Terminal
- Windows users: PowerShell, Command Prompt, Windows Terminal, Git Bash, or WSL (Windows Subsystem for Linux)
- Your chosen download utility installed:
- wget: Pre-installed on most Linux and macOS systems. Install it with sudo apt install wget (Debian/Ubuntu) or brew install wget (macOS with Homebrew). Windows users can access it through Git for Windows or WSL.
- curl: Pre-installed on Linux, macOS, and Windows 10+. Verify installation by typing curl --version in your terminal.
- PO.DAAC Data Downloader: See installation and dependency information at https://github.com/podaac/data-subscriber/blob/main/README.md
- HTTPS URLs for the files you want to download from the cloud archive (required for wget or curl workflows). You can save multiple download links as a text file by using NASA Earthdata Search or the Common Metadata Repository (CMR) API. See this help article for additional information: https://nsidc.org/data/user-resources/help-center/creating-text-files-https-and-s3-urls-earthdata-cloud-data-access
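A typical .netrc entry for Earthdata Login has the following form, where your_earthdata_username and your_earthdata_password are placeholders for your own EDL credentials (see the help article linked above for complete setup instructions):
machine urs.earthdata.nasa.gov login your_earthdata_username password your_earthdata_password
On Linux and macOS, it is also good practice to restrict access to this file with chmod 600 ~/.netrc so that other users on the system cannot read your credentials.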
wget
Download a single file
To download a file using wget, use this basic syntax:
wget [URL]
Here's an example command to download one ATLAS/ICESat-2 L2A Global Geolocated Photon Data (ATL03, Version 6) file on Linux or macOS:
wget https://data.nsidc.earthdatacloud.nasa.gov/nsidc-cumulus-prod-protected/ATLAS/ATL03/006/2025/03/02/ATL03_20250302235544_11742607_006_01.h5
This command saves the file ATL03_20250302235544_11742607_006_01.h5 to your current working directory.
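With a valid .netrc file in place, wget authenticates to Earthdata Login for you. If a download stalls at the Earthdata Login redirect, one commonly used variation (optional, and not required for the basic workflow above) is to add explicit cookie handling:
wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies https://data.nsidc.earthdatacloud.nasa.gov/nsidc-cumulus-prod-protected/ATLAS/ATL03/006/2025/03/02/ATL03_20250302235544_11742607_006_01.h5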
Download multiple files
Using a Text File with HTTPS Links
1. Put the text file (named URLs.txt in this example) containing the HTTPS links for your desired data files into the directory where you want to download the files. A sample file is shown after step 2.
2. Use the -i flag (input file) to tell wget to read URLs from a file:
wget -i URLs.txt
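The URLs.txt file is simply a plain-text list with one HTTPS link per line. For example, a minimal file containing only the ATL03 granule from the earlier example would look like this (in practice you would add one line per file you want to download):
https://data.nsidc.earthdatacloud.nasa.gov/nsidc-cumulus-prod-protected/ATLAS/ATL03/006/2025/03/02/ATL03_20250302235544_11742607_006_01.h5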
Here are some useful wget flags:
- -P <directory>: Save files to a specific folder. The example below creates a downloads folder in your working directory and saves files there:
wget -i URLs.txt -P downloads/
- -nc: Skip downloading files that already exist in your given directory (no clobber):
wget -i URLs.txt -P downloads/ -nc
For a complete guide to all available Wget options, see the GNU Wget 1.25 Manual.
Using the CMR Virtual Directory
You can use wget to download files for specific dates from a data collection by using a URL from the CMR Virtual Directory.
For example, this CMR Virtual Directory path lists all ATL03 Version 6 files for August 12, 2024: https://cmr.earthdata.nasa.gov/virtual-directory/collections/C2596864127-NSIDC_CPRD/temporal/2024/08/12. You can download these files with wget using the following bash script:
#!/bin/bash
# ===== USER CONFIGURATION =====
# Set user configurable variables below - modify these values according to your download needs
COLLECTION=C2596864127-NSIDC_CPRD
yyyy=2024
mm=08
dd=12
# ===== CORE SCRIPT - DO NOT MODIFY =====
WGET_OPTS="-r --no-parent --span-hosts -D cmr.earthdata.nasa.gov,data.nsidc.earthdatacloud.nasa.gov -nd -c"
wget ${WGET_OPTS} https://cmr.earthdata.nasa.gov/virtual-directory/collections/${COLLECTION}/temporal/${yyyy}/${mm}/${dd}
This script can be used to download any NSIDC DAAC data collection available in Earthdata Cloud. To find available collections, visit https://cmr.earthdata.nasa.gov/search/site/collections/directory/NSIDC_CPRD/gov.nasa.eosdis. The collections are listed alphabetically. Navigate to your desired dataset and click the "Directory" link in the last column to access the virtual directory. From there, drill down to your target date and copy the path.
- From this path, copy the Earthdata Collection concept ID (located between collections/ and /temporal in the URL, formatted as C####-NSIDC_CPRD). Assign this unique dataset identifier to the COLLECTION variable in the Bash script.
- Update the date variables (yyyy, mm, and dd) in the script to specify your desired year, month, and day.
The WGET_OPTS line sets options that tell wget to download files recursively (-r) while saving them to the current directory without creating subdirectories (-nd). It ensures downloads work across the correct domains (--span-hosts -D) and can resume if interrupted (-c). The final line builds the complete URL from your specified variables and applies the WGET_OPTS settings to control the download process.
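With the example values above, the final line of the script expands to the following single command (shown here only to make the variable substitution explicit):
wget -r --no-parent --span-hosts -D cmr.earthdata.nasa.gov,data.nsidc.earthdatacloud.nasa.gov -nd -c https://cmr.earthdata.nasa.gov/virtual-directory/collections/C2596864127-NSIDC_CPRD/temporal/2024/08/12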
To use this script:
1. Open a terminal window and create the script file. For this example, we'll create myscript.sh using the nano text editor.
nano myscript.sh
Copy and paste the script above, modifying the collection concept ID and date variables to match your target dataset and date.
Save and exit the file (in nano, press Ctrl+O, then Enter, then Ctrl+X).
2. Make the script executable by setting the execution permissions:
chmod +x myscript.sh
3. Run the script.
./myscript.sh
Note: The ./ prefix tells the shell to execute the script from the current working directory.
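Alternatively, if you prefer not to change permissions, you can run the script by passing it to bash directly:
bash myscript.sh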
curl
Download a single file
To use curl to download a single NSIDC DAAC file, use this basic syntax:
curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O [URL]
For example, here's the code for downloading a single file of ATLAS/ICESat-2 L2A Global Geolocated Photon Data, Version 5 (ATL03):
curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O https://data.nsidc.earthdatacloud.nasa.gov/nsidc-cumulus-prod-protected/ATLAS/ATL03/005/2022/06/06/ATL03_20220606155953_11541505_005_02.h5
The command uses these important flags:
- -O saves the file with its original name (ATL03_20220606155953_11541505_005_02.h5)
- -n uses your stored EDL credentials from the .netrc file
- -b and -c maintain session cookies
- -L follows redirect chains
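If you want the file saved somewhere other than your current directory, one option is to replace -O with -o and an explicit output path; adding --create-dirs tells curl to create the folder if it does not already exist. The downloads/ folder name below is just an illustration:
curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n --create-dirs -o downloads/ATL03_20220606155953_11541505_005_02.h5 https://data.nsidc.earthdatacloud.nasa.gov/nsidc-cumulus-prod-protected/ATLAS/ATL03/005/2022/06/06/ATL03_20220606155953_11541505_005_02.h5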
Download multiple files
Using a Text File with HTTPS Links
1. Put the text file (named URLs.txt in this example) containing the HTTPS links for your desired data files into the directory where you want to download the files.
2. Use xargs and curl together to download the files in batch:
xargs -n 1 curl -O -b ~/.urs_cookies -c ~/.urs_cookies -L -n < URLs.txt
The command processes URLs.txt line by line, with xargs -n 1 passing each URL to a separate curl command. The curl flags work as described in the single-file download section above.
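If you have many files to retrieve, GNU and BSD xargs also support a -P flag that runs several curl processes at once. For example, the following variation downloads up to four files in parallel (the value 4 is only an illustration; choose a modest number to avoid overloading your connection or the data servers):
xargs -n 1 -P 4 curl -O -b ~/.urs_cookies -c ~/.urs_cookies -L -n < URLs.txt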
PO.DAAC Data Downloader
PO.DAAC offers two Python-based command-line tools: a data subscriber and a data downloader. Use the Downloader for occasional or on-demand downloads, and the Subscriber for continuous updates from the archive.
To use either tool with NSIDC DAAC cloud data, specify the provider option (-p or --provider) as NSIDC_CPRD.
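Both tools are distributed in the same Python package. Per the project README linked in the prerequisites, installation is typically done with pip (this assumes Python 3 and pip are already available on your system):
pip install podaac-data-subscriber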
Here's an example that downloads ATLAS/ICESat-2 L3A Land and Vegetation Height, Version 6 (ATL08) files within a specific time range (June 7-8, 2022) and region (a bounding box spanning 12.5°E to 13.85°E longitude and 51.05°N to 51.9°N latitude):
podaac-data-downloader -c ATL08 -d ./data_atl08 -p NSIDC_CPRD -b="12.5,51.05,13.85,51.9" -sd=2022-06-07T00:00:00Z -ed=2022-06-08T23:59:00Z
For instructions on using the data subscriber, see https://nsidc.org/data/user-resources/help-center/how-create-data-subscriptions.
Final Thoughts
Here's a simple breakdown of each tool for downloading NSIDC DAAC data from Earthdata Cloud:
- wget: Excellent for downloading single or multiple files. It handles interrupted downloads automatically and includes options for retrieving files from specific dates.
- curl: Already installed on most computers and easy to use for downloading single files. Works well with other commands to download multiple files.
- PO.DAAC Data Downloader: Best when you need to get data from specific times and areas of interest, or want to set up automatic downloads.
Pick the tool that works best for you. If you just need to download a few files now and then, wget or curl is a good choice. If you need to download data from specific areas or set up automatic downloads, use the PO.DAAC Data Downloader.
If you have questions about downloading NSIDC DAAC data using these tools, please email NSIDC User Services at nsidc@nsidc.org.