Programmatic Data Access Guide

All data from the NASA National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC) can be accessed directly from our HTTPS file system, using wget or curl. Basic command line instructions are provided in the article below. 

A large selection of NSIDC DAAC data can also be accessed through our Application Programming Interface (API). Our API offers you the ability to order data using specific temporal and spatial filters, as well as to subset, reformat, and reproject select data sets. API guidance, including a Jupyter Notebook to construct and download your data request, is also provided in this article.

Regardless of how users access data from the NSIDC DAAC, an Earthdata Login account is required. Please visit the Earthdata Login registration page to register for an account before getting started.

For questions about programmatic access to NSIDC DAAC data, please contact NSIDC User Services: nsidc@nsidc.org

HTTPS access: Wget instructions for Mac and Linux

Step 1

Store your Earthdata Login credentials (username <uid> and password <password>) for authentication in a .netrc file in your home directory. 

Example commands showing how to set up a .netrc file for authentication:

echo 'machine urs.earthdata.nasa.gov login <uid> password <password>' >> ~/.netrc
chmod 0600 ~/.netrc

Replace <uid> and <password> with your Earthdata Login username and password (do not include brackets). Windows bash users should also replace the forward slash ( / ) with a backslash ( \ ) in the above commands. 

Step 1 may be skipped by specifying the following options in the Wget command in Step 2 (see the example below):
--http-user=<uid> --http-password=<password> 
or
--http-user=<uid> --ask-password
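
For example, the first Wget command from Step 2 can be run with inline credentials instead of a .netrc file. This is a minimal sketch of that variation; you will be prompted for your password at run time:

wget --http-user=<uid> --ask-password --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off https://n5eil01u.ecs.nsidc.org/SMAP/SPL4CMDL.006/2019.10.07/SMAP_L4_C_mdl_20191007T000000_Vv6042_001.h5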

Step 2

Use a Wget command to download your data. Please note that, depending on the data set, the URL begins with either https://n5eil01u.ecs.nsidc.org or https://daacdata.apps.nsidc.org. To determine which one to use, visit the landing page for the data set you are interested in and select the 'HTTPS File System' card; the URL at the top of the window that opens is the one to use. Below is one example for each URL type:

Example Wget command to download SMAP L4 Global Daily 9 km EASE-Grid Carbon Net Ecosystem Exchange, Version 6 data for 07 October 2019:

wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off https://n5eil01u.ecs.nsidc.org/SMAP/SPL4CMDL.006/2019.10.07/SMAP_L4_C_mdl_20191007T000000_Vv6042_001.h5

Example Wget command to download Canadian Meteorological Center (CMC) Daily Snow Depth Analysis Data, Version 1 data for the year 2020:

wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0447_CMC_snow_depth_v01/Snow_Depth/Snow_Depth_Daily_Values/GeoTIFF/cmc_sdepth_dly_2020_v01.2.tif

Requested files will be downloaded to a directory named after the HTTPS host, "n5eil01u.ecs.nsidc.org" or "daacdata.apps.nsidc.org". You can modify how the files are stored by specifying the following Wget options, as shown in the example below:
-nd (or --no-directories)
-nH (or --no-host-directories)
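
For example, a minimal variation of the first Step 2 command above adds -nd so the file is saved directly in the current directory rather than under the host's directory tree:

wget -nd --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies --no-check-certificate --auth-no-challenge=on -r --reject "index.html*" -np -e robots=off https://n5eil01u.ecs.nsidc.org/SMAP/SPL4CMDL.006/2019.10.07/SMAP_L4_C_mdl_20191007T000000_Vv6042_001.h5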

The GNU Wget 1.20 Manual provides more details on Wget options.


HTTPS access: Wget instructions for Windows

The following instructions apply to Wget executed through Windows Command Prompt. For Windows bash, please see the above Mac and Linux instructions. 

Step 1

Create a text file ("mycookies.txt") to store the website cookies returned from the HTTPS server. Store this in the wget installation directory.

Step 2

Use a Wget command to download your data. In the following example, replace <uid> and <password> with your Earthdata Login username and password (do not include brackets).

Please note that, depending on the data set, the URL begins with either https://n5eil01u.ecs.nsidc.org or https://daacdata.apps.nsidc.org. To determine which one to use, visit the landing page for the data set you are interested in and select the 'HTTPS File System' card; the URL at the top of the window that opens is the one to use. Below is one example for each URL type:

Example Wget command to download SMAP L4 Global Daily 9 km EASE-Grid Carbon Net Ecosystem Exchange, Version 6 data for 07 October 2019:

wget --http-user=<uid> --http-password=<password> --load-cookies mycookies.txt --save-cookies mycookies.txt --keep-session-cookies --no-check-certificate --auth-no-challenge -r --reject "index.html*" -np -e robots=off https://n5eil01u.ecs.nsidc.org/SMAP/SPL4CMDL.006/2019.10.07/SMAP_L4_C_mdl_20191007T000000_Vv6042_001.h5

Example Wget command to download Canadian Meteorological Center (CMC) Daily Snow Depth Analysis Data, Version 1 data for the year 2020:

wget --http-user=<uid> --http-password=<password> --load-cookies mycookies.txt --save-cookies mycookies.txt --keep-session-cookies --no-check-certificate --auth-no-challenge -r --reject "index.html*" -np -e robots=off https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0447_CMC_snow_depth_v01/Snow_Depth/Snow_Depth_Daily_Values/GeoTIFF/cmc_sdepth_dly_2020_v01.2.tif

Requested files will be downloaded to a directory named after the HTTPS host, "n5eil01u.ecs.nsidc.org" or "daacdata.apps.nsidc.org". You can modify how the files are stored by specifying the following Wget options:
-nd (or --no-directories)
-nH (or --no-host-directories)

The GNU Wget 1.20 Manual provides more details on Wget options.


HTTPS access: curl instructions

Step 1 

Follow the Step 1 Wget instructions for Mac and Linux above. All Windows users should replace the forward slash ( / ) with a backslash ( \ ) in the same commands.

Step 2

Use a curl command to download your data. 

Please note that, depending on the data set, the URL begins with either https://n5eil01u.ecs.nsidc.org or https://daacdata.apps.nsidc.org. To determine which one to use, visit the landing page for the data set you are interested in and select the 'HTTPS File System' card; the URL at the top of the window that opens is the one to use. Below is one example for each URL type:

Example curl command to download a single file of SMAP L4 Global Daily 9 km EASE-Grid Carbon Net Ecosystem Exchange, Version 6 data for 07 October 2019:

curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O https://n5eil01u.ecs.nsidc.org/SMAP/SPL4CMDL.006/2019.10.07/SMAP_L4_C_mdl_20191007T000000_Vv6042_001.h5

Example curl command to download Canadian Meteorological Center (CMC) Daily Snow Depth Analysis Data, Version 1 data for the year 2020:

curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O https://daacdata.apps.nsidc.org/pub/DATASETS/nsidc0447_CMC_snow_depth_v01/Snow_Depth/Snow_Depth_Daily_Values/GeoTIFF/cmc_sdepth_dly_2020_v01.2.tif

The -O option in the curl command saves the file under its original name in the current working directory on your computer.
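
If you need to download several files in one session, the same curl command can be reused in a small shell loop. The following is a minimal sketch; urls.txt is a hypothetical text file listing one full HTTPS URL per line:

#!/bin/bash
# Download each URL listed in urls.txt, reusing the cookie-based authentication above
while read -r url; do
	curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O "$url"
done < urls.txt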


NSIDC DAAC's data access and service API

Note: While a large selection of NSIDC data can be accessed using the API (e.g., AMSR-E/AMSR2, ICESat-2/ICESat, MODIS, SMAP, VIIRS), not all NSIDC DAAC data can be accessed this way.

The API provided by the NSIDC DAAC allows you to access data programmatically using specific temporal and spatial filters. The same subsetting, reformatting, and reprojection services available on select data sets through NASA Earthdata Search can also be applied using this API. These options can be requested in a single access command without the need to script against our data directory structure. This API is beneficial for those who frequently download NSIDC DAAC data and are interested in incorporating data access commands into their analysis or visualization code. If you are new to the API, we recommend exploring this service using our Jupyter Notebook. See below for further guidance on the API format and examples of this service.


Data Access Jupyter notebook

The Jupyter notebook available on GitHub allows you to explore the data coverage, size, and customization services available for your data set(s). The notebook will also print the API command that can be used from a command line, a browser, or within the notebook itself. No Python experience is necessary; each code cell will prompt you with the information needed to configure your data request. The README includes several options for running the notebook, including via Binder, Conda, and Docker. The Binder button available in the README allows you to easily explore and run the notebook in a shared cloud computing environment without the need to install dependencies on your local machine. If you are interested in using the notebook to bulk download data, we recommend either running the notebook locally using the Conda or Docker options, or using the print API command option if you are running the notebook via Binder.


Data customization services

All data sets can be accessed and filtered based on time and area of interest. We also offer customization services (i.e., subsetting, reformatting, and reprojection) for certain NSIDC DAAC collections. Please see each data set's documentation for detailed service availability.

How to format the API endpoint

Programmatic API requests are formatted as HTTPS URLs that contain key-value-pairs specifying the service operations:

https://n5eil02u.ecs.nsidc.org/egi/request?<key-value-pairs separated with &> 
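
As an illustration, the short sketch below assembles an endpoint from key-value pairs and submits it with curl, reusing the .netrc authentication setup from the HTTPS access sections above. The keyword values shown are taken from the MOD10A1 filtering example later in this article:

#!/bin/bash
# Join the key-value pairs with '&' and append them to the API endpoint
BASE="https://n5eil02u.ecs.nsidc.org/egi/request"
KVPS="short_name=MOD10A1&version=6&time=2018-05-31T17:03:36,2018-06-01T06:47:53&bounding_box=-24.697,63.281,-13.646,66.717&agent=NO&page_size=100"
# -O -J saves the returned zip under the file name supplied by the server
curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O -J "${BASE}?${KVPS}"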

There are three categories of API key-value-pairs, as described below. Please reference the lookup Table of Key-Value-Pair (KVP) Operands for Subsetting, Reformatting, and Reprojection Services to see all keyword options available for use.


Search keywords

These keywords include the data set short name (e.g. MOD10A1), version number, time range of interest, and area of interest used to filter* the file results.

Customize keywords

These keywords include subsetting by variable, time, and spatial area*, along with reformatting and reprojection services. These subsetting services modify the data outputs by limiting the data based on the values specified.

*See Figure 1 for more information on the difference between spatial filtering and spatial subsetting.

Configuration keywords

These keywords include data delivery method (either asynchronous or synchronous), email address, and the number of files returned per API endpoint. The asynchronous data delivery option will allow concurrent requests to be queued and processed without the need for a continuous connection; orders will be delivered to the specified email address. Synchronous requests will automatically download the data as soon as processing is complete; this is the default if no method is specified. The file limits differ between the two options:

Maximum files per synchronous request = 100
Maximum files per asynchronous request = 2000

If the number of files is not specified, the API will deliver only 10 files by default. Use the page_size keyword to adjust the number of files returned per request (for example, lowering it if you are experiencing performance issues) and the page_num keyword to page through additional results.

To request data by rectangular area of interest, use the bounding_box keyword to specify the search filter bounds. If you also want to subset the data to that area of interest, include the bbox subsetting keyword with coordinates identical to those of the bounding_box search filter keyword (Example 3 below demonstrates this pairing).

Figure 1. Data filtering requests return whole files based on overlapping coverage, as seen on the lower left-hand map. Data subsetting requests crop the data files to the area of interest, resulting in subsetted file outputs, as seen on the lower right-hand map. All data can be spatially filtered using the API, whereas only select data sets offer spatial subsetting. Spatial subsetting requests require the spatial filtering keyword to avoid errors. 


API endpoint examples

Downloading data in its native format (i.e., no customization)

To request data without subsetting, reformatting, or reprojection services (only spatial and/or temporal filtering), you must use the agent=NO parameter, as demonstrated in the two examples below.

Example 1: AMSR-E/AMSR2 Unified L2B Global Swath Ocean Products, Version 1, collected from 2021-12-07 12:00:00 to 2021-12-07 15:00:00 with the maximum file number specified (page_size) and without customization services (agent=NO). A zip file is returned synchronously with 5 science files and associated metadata:

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=AU_Ocean&version=1&time=2021-12-07T12:00:00,2021-12-07T15:00:00&agent=NO&page_size=100

Example 2: MODIS/Terra Snow Cover Daily L3 Global 500m SIN Grid, Version 6, collected from 2018-05-31 17:03:36 to 2018-06-01 06:47:53 over Iceland with the maximum file number specified and without customization services (native data requested). A zip file is returned synchronously with 3 science files and associated metadata:

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=MOD10A1&version=6&time=2018-05-31T17:03:36,2018-06-01T06:47:53&bounding_box=-24.697,63.281,-13.646,66.717&agent=NO&page_size=100

Downloading customized data (i.e., reformatting, spatial and parameter subsetting, and reprojection)

Example 3: SMAP L3 Radiometer Global Daily 36 km EASE-Grid Soil Moisture, Version 7, with GeoTIFF reformatting, spatial subsetting, parameter subsetting, and Geographic reprojection for all data collected from 2018-06-06 to 2018-06-07 over Colorado with the maximum file number specified. A zip file is returned synchronously with 2 reformatted, subsetted, and reprojected files:

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=SPL3SMP&version=007&format=GeoTIFF&time=2018-06-06,2018-06-07&bounding_box=-109,37,-102,41&bbox=-109,37,-102,41&Coverage=/Soil_Moisture_Retrieval_Data_AM/soil_moisture&projection=Geographic&page_size=100

Example 4: MODIS/Terra Snow Cover Monthly L3 Global 0.05Deg CMG, Version 6, with GeoTIFF reformatting and parameter subsetting for all data collected from 2015-01-01 to 2015-10-01 with the maximum file number specified. A zip file is returned synchronously with 10 reformatted and subsetted files:

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=MOD10CM&version=6&format=GeoTIFF&time=2015-01-01,2015-10-01&Coverage=/MOD_CMG_Snow_5km/Snow_Cover_Monthly_CMG&page_size=100

Example 5: MEaSUREs Greenland Ice Velocity: Selected Glacier Site Velocity Maps from InSAR, Version 1, collected from 01 January 2017 to 31 December 2018 over the western Greenland coast, specifying that no customization service agent is used (agent=NO) and that no metadata is requested (INCLUDE_META=N). Because 133 files will be returned, two endpoints are needed: the first captures the first 100 results and the second the remaining 33 (a looped alternative is sketched after the two endpoints).

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=NSIDC-0481&version=1&time=2017-01-01T00:00:00,2018-12-31T00:00:00&bounding_box=-52.5,68.5,-47.5,69.5&agent=NO&INCLUDE_META=N&page_size=100&page_num=1

https://n5eil02u.ecs.nsidc.org/egi/request?short_name=NSIDC-0481&version=1&time=2017-01-01T00:00:00,2018-12-31T00:00:00&bounding_box=-52.5,68.5,-47.5,69.5&agent=NO&INCLUDE_META=N&page_size=100&page_num=2
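
Rather than submitting the two endpoints by hand, you can also fetch both pages in a loop. Below is a minimal curl sketch of that approach, reusing the cookie-based authentication described earlier; each returned zip is saved under the name supplied by the server:

#!/bin/bash
# Page through the 133-file request (100 files on page 1, 33 on page 2)
for PAGE in 1 2; do
	curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O -J "https://n5eil02u.ecs.nsidc.org/egi/request?short_name=NSIDC-0481&version=1&time=2017-01-01T00:00:00,2018-12-31T00:00:00&bounding_box=-52.5,68.5,-47.5,69.5&agent=NO&INCLUDE_META=N&page_size=100&page_num=${PAGE}"
done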


Using curl 

Example 1: The same MODIS request as Example 4 above, submitted with curl:

curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O -J --dump-header response-header.txt "https://n5eil02u.ecs.nsidc.org/egi/request?short_name=MOD10CM&version=6&format=GeoTIFF&time=2015-01-01,2015-10-01&Coverage=/MOD_CMG_Snow_5km/Snow_Cover_Monthly_CMG"

Example 2: ATLAS/ICESat-2 L3A Land Ice Height, Version 2, with variable subsetting for all data collected on 01 August 2019 over Pine Island Glacier using an uploaded Shapefile* of the glacier boundary:

curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O -J -F "shapefile=@glims_download_PIG.zip" "https://n5eil02u.ecs.nsidc.org/egi/request?short_name=ATL06&version=002&time=2019-08-01,2019-08-02&polygon=-86.69021127827942,-74.83792495569011,-90.06285424412475,-73.99476421422878,-94.2786579514314,-73.71371063374167,-96.52708659532829,-74.13529100447232,-100.04025635141716,-73.99476421422878,-102.28868499531404,-74.556871375203,-101.30499746360915,-74.83792495569011,-101.86710462458338,-75.11897853617722,-103.97500647823671,-75.54055890690788,-101.1644706733656,-76.38371964836921,-101.1644706733656,-76.94582680934343,-96.66761338557184,-77.50793397031765,-96.38655980508473,-77.64846076056124,-97.37024733678962,-78.07004113129187,-95.40287227337984,-78.49162150202255,-95.40287227337984,-79.33478224348386,-92.73286325875229,-80.03741619470165,-88.9386399221763,-79.05372866299675,-91.88970251729097,-78.35109471177898,-90.76548819534253,-77.78898755080476,-92.31128288802165,-77.36740718007411,-91.04654177582964,-77.086353599587,-91.74917572704742,-76.80530001909989,-87.3928452294972,-75.96213927763856,-86.83073806852298,-75.68108569715145,-86.69021127827942,-74.83792495569011&coverage=/gt1l/land_ice_segments/h_li,/gt1l/land_ice_segments/latitude,/gt1l/land_ice_segments/longitude,/gt1r/land_ice_segments/h_li,/gt1r/land_ice_segments/latitude,/gt1r/land_ice_segments/longitude,/gt2l/land_ice_segments/h_li,/gt2l/land_ice_segments/latitude,/gt2l/land_ice_segments/longitude,/gt2r/land_ice_segments/h_li,/gt2r/land_ice_segments/latitude,/gt2r/land_ice_segments/longitude,/gt3l/land_ice_segments/h_li,/gt3l/land_ice_segments/latitude,/gt3l/land_ice_segments/longitude,/gt3r/land_ice_segments/h_li,/gt3r/land_ice_segments/latitude,/gt3r/land_ice_segments/longitude,/quality_assessment"

*This example uploads a zipped Shapefile to be used for subsetting, in addition to filtering by simplified polygon coordinates based on the uploaded file. This functionality is currently only available for ICESat-2 data. The Shapefile is available for direct download through the NSIDC Global Land Ice Measurements from Space (GLIMS) database.

Example 3: Large file requests

The following bash script example demonstrates a request using the asynchronous delivery method. This method is managed by our ordering system and does not require a continuous connection to execute. Therefore, we recommend this method for requests that require long processing times or that exceed the 100-file limit of the synchronous delivery method. This script requests the same MEaSUREs data as Example 5 above. It retrieves the order number and order status, and delivers the data programmatically in addition to an email notification.

#!/bin/bash

# Make the asynchronous request
curl -s -b ~/.urs_cookies -c ~/.urs_cookies -L -n -o 'request.xml' 'https://n5eil02u.ecs.nsidc.org/egi/request?request_mode=async&short_name=NSIDC-0481&version=1&time=2017-01-01T00:00:00,2018-12-31T00:00:00&bounding_box=-52.5,68.5,-47.5,69.5&agent=NO&INCLUDE_META=N&page_size=2000&email=yes'

# Parse the response for the order ID
REQUESTID=$(grep orderId request.xml | awk -F '<|>' '{print $3}')

# Poll for order status (every 10 seconds, up to 360 attempts)
for i in {1..360}
do
	curl -s -b ~/.urs_cookies -c ~/.urs_cookies -L -n -o 'response.xml' "https://n5eil02u.ecs.nsidc.org/egi/request/${REQUESTID}"
	RESPONSESTATE=$(grep status response.xml | awk -F '<|>' '{print $3}')
	if [ "$RESPONSESTATE" != "processing" ]
	then
		break
	fi
	sleep 10
done

# Download the zip files listed in the final response
for url in $(grep downloadUrl response.xml | grep -v '\.html<' | awk -F '<|>' '{print $3}'); do
	curl -s -b ~/.urs_cookies -c ~/.urs_cookies -L -n -O "$url"
done
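
To try this script, save it to a file (for example, request_async.sh), make it executable with chmod +x request_async.sh, and run it with ./request_async.sh. As written, the status loop polls every 10 seconds for up to 360 attempts (one hour); increase those values for orders that need longer processing times.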


Last updated: December 2021