The Subscriptions API supports delivery to Amazon S3, Microsoft Azure Blob Storage, Google Cloud Storage, or Oracle Cloud Storage. Activation and processing for direct download are not currently supported.
For any cloud storage delivery option, create a cloud storage account with both write and delete access.
When you create a subscription or order with bucket delivery, Planet checks the bucket permissions linked to your token by first attempting to deliver a file named planetverify.txt and then immediately deleting it. If Planet has the adequate permissions, you will not see this file. If you do see this file in your buckets, we recommend that you review your permissions and make sure that Planet has both write and delete access.
Cloud Credential Security
When you create a subscription, you provide cloud credentials so that Planet data can be delivered to your storage. This can introduce a security risk. To handle cloud service credentials securely in the request, make sure their access is limited to the intended delivery path, with no read or write access to any other storage locations or cloud services.
Delivery schema¶
The schema for Subscriptions API delivery is shown below.
Note
This schema varies slightly from the delivery
schema of the Orders API.
```json
"delivery": {
  "type": "cloud-storage-provider",
  "parameters": {
    "parameter1-name": "p1-value",
    "parameter2-name": "p2-value"
  }
}
```
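As a minimal sketch, a delivery block matching the schema above can be assembled programmatically before being attached to a subscription request; the provider type and parameter names below are the placeholders from the schema, not real values.

```python
# Build a delivery block in the shape the Subscriptions API expects.
# "cloud-storage-provider" and the parameter names are placeholders,
# not actual provider types or parameters.

def make_delivery(provider_type, parameters):
    """Return a delivery block matching the Subscriptions API schema."""
    if not provider_type:
        raise ValueError("a cloud storage provider type is required")
    return {"type": provider_type, "parameters": dict(parameters)}

delivery = make_delivery("cloud-storage-provider", {
    "parameter1-name": "p1-value",
    "parameter2-name": "p2-value",
})
```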
Delivery schedule for Planetary Variables¶
Planetary Variables subscriptions as delivered through the Subscriptions API have key differences from catalog subscriptions:
Soil Water Content¶
- One item per day
- Item represents 01:30 local solar time for AMSR2/AMSRE and 06:00 for SMAP
- Two assets:
    - Raster asset
    - Quality flags asset
Land Surface Temperature¶
- Two items per day
- Items represent 01:30 & 13:30 local solar time
- Two assets:
    - Raster asset
    - Quality flags asset
Biomass Proxy¶
- One item per day
- Item represents 00:00 local solar time
- One asset:
    - Raster asset
Note
The Planetary Variables for soil water content (SWC), land surface temperature (LST), and biomass proxy (BIOMASS-PROXY) include raster assets. These rasters are clipped to the subscription's AOI, and no additional tools are supported for Planetary Variables through the Subscriptions API.
Supported delivery options¶
Delivery to Amazon S3¶
For Amazon S3 delivery, you will need an AWS account with `GetObject`, `PutObject`, and `DeleteObject` permissions.
Parameters
- aws_access_key_id (required): AWS credentials.
- aws_secret_access_key (required): AWS credentials.
- bucket (required): The name of the bucket that will receive the order output.
- aws_region (required): The region where the bucket lives in AWS.
- path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder separator. All other characters are added as a prefix to the file names.
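The path_prefix behavior (a forward slash creates a folder; trailing characters become part of the file name) can be sketched with a hypothetical helper; this simple concatenation is inferred from the parameter description, not from documented API internals.

```python
def apply_path_prefix(path_prefix, filename):
    """Prepend path_prefix to a delivered file name.

    Forward slashes in the prefix act as folder separators; any
    trailing non-slash characters are glued onto the file name.
    """
    return path_prefix + filename

# "folder1/" becomes a folder; "prefix" becomes a file-name prefix.
key = apply_path_prefix("folder1/prefix", "scene.tif")
```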
Example Request
```json
"delivery": {
  "type": "amazon_s3",
  "parameters": {
    "bucket": "foo-bucket",
    "aws_region": "us-east-2",
    "aws_access_key_id": "",
    "aws_secret_access_key": "",
    "path_prefix": "folder1/prefix"
  }
}
```
Delivery to Microsoft Azure¶
For Microsoft Azure delivery, you will need an Azure account with `read`, `write`, `delete`, and `list` permissions.
Parameters
- account (required): Azure account name.
- container (required): The name of the container which will receive the order output.
- sas_token (required): Azure Shared Access Signature token. The token should be specified without a leading '?' (that is, "sv=2017-04-17&si=writer&sr=c&sig=LGqc" rather than "?sv=2017-04-17&si=writer&sr=c&sig=LGqc").
- storage_endpoint_suffix (optional): To deliver your order to a sovereign cloud, set storage_endpoint_suffix appropriately for your cloud. The default is "core.windows.net".
- path_prefix (optional): An optional string that will be prepended to the files delivered to the container. A forward slash (/) is treated as a folder separator. All other characters are added as a prefix to the file names.
Example Request
```json
"delivery": {
  "type": "azure_blob_storage",
  "parameters": {
    "account": "accountname",
    "container": "containername",
    "sas_token": "sv=2017-04-17&si=writer&sr=c&sig=LGqc",
    "storage_endpoint_suffix": "core.windows.net",
    "path_prefix": "myprefix"
  }
}
```
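Because the sas_token must be supplied without a leading '?', a small helper (hypothetical, not part of any SDK) can normalize tokens copied from the Azure portal, which often include one:

```python
def normalize_sas_token(token):
    """Strip a single leading '?' from an Azure SAS token, if present."""
    return token[1:] if token.startswith("?") else token

token = normalize_sas_token("?sv=2017-04-17&si=writer&sr=c&sig=LGqc")
```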
Delivery to Google Cloud Storage¶
For Google Cloud Storage delivery, you will need a GCS account with `write` and `delete` permissions.
Preparing Your Google Cloud Storage Credentials
The Google Cloud Storage delivery option requires a single-line base64-encoded version of your service account credentials for the credentials parameter.
To download your service account credentials in JSON format (not P12) and encode them as a base64 string, you can use a command line operation such as:
```sh
cat my_creds.json | base64 | tr -d '\n'
```
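The same encoding can be done in Python with only the standard library; `my_creds.json` in the usage comment stands in for your downloaded service account key file:

```python
import base64

def encode_credentials(raw_bytes):
    """Base64-encode service account JSON as a single line,
    equivalent to: cat my_creds.json | base64 | tr -d '\n'."""
    return base64.b64encode(raw_bytes).decode("ascii")

# Usage: encode_credentials(open("my_creds.json", "rb").read())
```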
Parameters
- credentials (required): GCS credentials.
- bucket (required): The name of the GCS bucket which will receive the order output.
- path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder separator. All other characters are added as a prefix to the file names.
Example Request
```json
"delivery": {
  "type": "google_cloud_storage",
  "parameters": {
    "bucket": "your-gcs-bucket",
    "credentials": "c29tZWNyZWRzZm9yeW91cmdjc2J1Y2...",
    "path_prefix": "optionalsubfolder1/optionalsubfolder2"
  }
}
```
Delivery to Oracle Cloud Storage¶
For Oracle Cloud Storage delivery, you will need an Oracle account with `read`, `write`, and `delete` permissions. For authentication, you need a Customer Secret Key, which consists of an Access Key/Secret Key pair.
Parameters
- customer_access_key_id (required): Customer Secret Key credentials.
- customer_secret_key (required): Customer Secret Key credentials.
- bucket (required): The name of the bucket that will receive the order output.
- region (required): The region where the bucket lives in Oracle.
- namespace (required): Object Storage namespace name.
- path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder separator. All other characters are added as a prefix to the file names.
Example Request
```json
"delivery": {
  "type": "oracle_cloud_storage",
  "parameters": {
    "bucket": "foo-bucket",
    "namespace": "ORACLE_NAMESPACE",
    "region": "us-sanjose-1",
    "customer_access_key_id": "YOUR_ACCESS_ID",
    "customer_secret_key": "YOUR_SECRET_KEY",
    "path_prefix": "folder1/prefix"
  }
}
```
Delivery layout¶
When data is delivered to your cloud storage solution for your subscription, files are delivered according to the following layout scheme: `<subscription_id>/<item_id>/...`

For example, if we're delivering the file `20170716_144316_1041_3B_AnalyticMS.tif` for item `20170716_144316_1041` as output for subscription `0ee41665-ab3b-4faa-98d1-25738cdd579c`, the file will be delivered to the path `0ee41665-ab3b-4faa-98d1-25738cdd579c/20170716_144316_1041/20170716_144316_1041_3B_AnalyticMS.tif`.
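As a sketch, the layout above can be reproduced with a small helper; the IDs and file name are the ones from the example:

```python
def delivery_path(subscription_id, item_id, filename):
    """Compute the storage key for a delivered file using the
    <subscription_id>/<item_id>/... layout."""
    return "/".join([subscription_id, item_id, filename])

path = delivery_path(
    "0ee41665-ab3b-4faa-98d1-25738cdd579c",
    "20170716_144316_1041",
    "20170716_144316_1041_3B_AnalyticMS.tif",
)
```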