Delivery

last updated: October 24, 2024

The Subscriptions API supports delivery to Amazon S3, Microsoft Azure Blob Storage, Google Cloud Storage, Oracle Cloud Storage, Google Earth Engine, or Sentinel Hub. Activation and processing for direct download are not currently supported.

For any cloud storage delivery option, create a cloud storage account with both write and delete access.

When you create a subscription or order with bucket delivery, Planet checks the bucket permissions linked to your token by first attempting to deliver a file named planetverify.txt and then immediately deleting it. If Planet has adequate permissions, you will not see this file. If you do see this file in your bucket, we recommend that you review your permissions and make sure that Planet has both write and delete access.

Cloud Credential Security

When you create a subscription, you provide cloud credentials to enable delivery of Planet data, which can introduce a potential security risk. To handle cloud service credentials in the request securely, make sure their access is limited to the intended delivery path, with no read or write access to any other storage locations or cloud services.

Delivery schema

The schema for Subscriptions API delivery is shown below.

Note

This schema varies slightly from the delivery schema of the Orders API.

"delivery": {
    "type": "cloud-storage-provider",
    "parameters": {
        "parameter1-name": "p1-value",
        "parameter2-name": "p2-value"
    }
}
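
For context, the delivery block sits at the top level of the subscription creation request, alongside the name and source blocks. The following is a minimal sketch assuming an Amazon S3 delivery and a catalog source; the bucket, region, credential, and source parameter values are placeholders to adapt to your own account.

{
    "name": "example-subscription",
    "source": {
        "type": "catalog",
        "parameters": {
            "item_types": ["PSScene"],
            "asset_types": ["ortho_analytic_4b"],
            "start_time": "2024-01-01T00:00:00Z",
            "geometry": {
                "type": "Polygon",
                "coordinates": [
                    [[2.31, 47.10],
                     [2.31, 47.13],
                     [2.35, 47.13],
                     [2.35, 47.10],
                     [2.31, 47.10]]
                ]
            }
        }
    },
    "delivery": {
        "type": "amazon_s3",
        "parameters": {
            "bucket": "foo-bucket",
            "aws_region": "us-east-2",
            "aws_access_key_id": "",
            "aws_secret_access_key": "",
            "path_prefix": "folder1/prefix"
        }
    }
}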

Supported Delivery Options

Delivery to Amazon S3

For Amazon S3 delivery you will need an AWS account with GetObject, PutObject, and DeleteObject permissions.

Parameters

  • aws_access_key_id (required): AWS credentials.
  • aws_secret_access_key (required): AWS credentials.
  • bucket (required): The name of the bucket that will receive the order output.
  • aws_region (required): The region where the bucket lives in AWS.
  • path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder. All other characters are added as a prefix to the files.

Example Request

"delivery": {
    "type": "amazon_s3",
    "parameters": {
        "bucket": "foo-bucket",
        "aws_region": "us-east-2",
        "aws_access_key_id": "",
        "aws_secret_access_key": "",
        "path_prefix": "folder1/prefix"
    }
}
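
As a point of reference, below is a minimal sketch of an AWS IAM policy that grants these permissions on the delivery bucket. The bucket name is a placeholder; per the credential security guidance above, you may want to restrict the resource further to your delivery path prefix.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::foo-bucket/*"
        }
    ]
}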

Delivery to Microsoft Azure

For Microsoft Azure delivery you will need an Azure account with read, write, delete, and list permissions.

Parameters

  • account (required): Azure account name.
  • container (required): The name of the container which will receive the order output.
  • sas_token (required): Azure Shared Access Signature token. Token should be specified without a leading ?. (I.e. sv=2017-04-17&si=writer&sr=c&sig=LGqc rather than ?sv=2017-04-17&si=writer&sr=c&sig=LGqc)
  • storage_endpoint_suffix (optional): To deliver your order to a sovereign cloud, set storage_endpoint_suffix appropriately for your cloud (see the sovereign cloud example below). The default is core.windows.net.
  • path_prefix (optional): An optional string that will be prepended to the files delivered to the container. A forward slash (/) is treated as a folder. All other characters are added as a prefix to the files.

Example Request

"delivery": {
    "type": "azure_blob_storage",
    "parameters": {
        "account": "accountname",
        "container": "containername",
        "sas_token": "sv=2017-04-17u0026si=writersr=cu0026sig=LGqc",
        "storage_endpoint_suffix": "core.windows.net",
        "path_prefix": "myprefix"
    }
}
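
If you are delivering to a sovereign cloud, the request is the same except for the storage_endpoint_suffix value. For example, the sketch below assumes Azure US Government, which uses the core.usgovcloudapi.net suffix; the account, container, and token values are placeholders.

"delivery": {
    "type": "azure_blob_storage",
    "parameters": {
        "account": "accountname",
        "container": "containername",
        "sas_token": "sv=2017-04-17&si=writer&sr=c&sig=LGqc",
        "storage_endpoint_suffix": "core.usgovcloudapi.net",
        "path_prefix": "myprefix"
    }
}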

Delivery to Google Cloud Storage

For Google Cloud Storage delivery you will need a GCS service account with write and delete permissions.

Preparing Your Google Cloud Storage Credentials

The Google Cloud Storage delivery option requires a single-line base64-encoded version of your service account credentials for use by the credentials parameter.

After downloading your service account credentials in JSON format (not P12), encode them as a base64 string using a command line operation such as:

cat my_creds.json | base64 | tr -d '\n'

Parameters

  • credentials (required): GCS credentials.
  • bucket (required): The name of the GCS bucket which will receive the order output.
  • path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder. All other characters are added as a prefix to the files.

Example Request

"delivery": {
    "type": "google_cloud_storage",
    "parameters": {
        "bucket": "your-gcs-bucket",
        "credentials": "c29tZWNyZWRzZm9yeW91cmdjc2J1Y2...",
        "path_prefix":"optionalsubfolder1/optionalsubfolder2"
    }
}

Delivery to Oracle Cloud Storage

For Oracle Cloud Storage delivery you need an Oracle account with read, write, and delete permissions. For authentication you need a Customer Secret Key which consists of an Access Key/Secret Key pair.

Parameters

  • customer_access_key_id (required): Customer Secret Key credentials.
  • customer_secret_key (required): Customer Secret Key credentials.
  • bucket (required): The name of the bucket that will receive the order output.
  • region (required): The region where the bucket lives in Oracle.
  • namespace (required): Object Storage namespace name.
  • path_prefix (optional): An optional string that will be prepended to the files delivered to the bucket. A forward slash (/) is treated as a folder. All other characters are added as a prefix to the files.

Example Request

"delivery": {
    "type": "oracle_cloud_storage",
    "parameters": {
        "bucket": "foo-bucket",
        "namespace": "ORACLE_NAMESPACE",
        "region": "us-sanjose-1",
        "customer_access_key_id": "YOUR_ACCESS_ID",
        "customer_secret_key": "YOUR_SECRET_KEY",
        "path_prefix": "folder1/prefix"
    }
}

Delivery to Google Earth Engine

For Google Earth Engine (GEE) delivery, follow the steps found on our GEE Setup Guide. Once set up, see Subscriptions Delivery to GEE.

Planet's GEE Delivery Integration simplifies the process of incorporating Planet data into GEE projects by creating a direct connection between Planet’s Subscriptions API and GEE. To use the integration, users must sign up for an Earth Engine account, create a Cloud Project, enable the Earth Engine API, and grant a Google service account access to deliver data to their GEE project.

Delivery to Sentinel Hub Collection

To deliver a Planet subscription to a Sentinel Hub collection, you must first link your Planet user to your Sentinel Hub user. Please follow the steps here.

Once you have linked your Planet and Sentinel Hub accounts, you can create a subscription via the Subscriptions API that delivers to a Sentinel Hub collection.

Parameters

  • type (required): "sentinel_hub"
  • collection_id (optional): Enter your specific collection_id if you have one.

Additional guidelines:

It is recommended to leave the collection_id field blank if you do not have a specific collection in mind. In such cases, a new collection will be created for you. The new collection ID will be provided in the response, and the name of the collection will correspond to the name of your subscription.

If you choose to provide a collection_id, please ensure that the collection specified is fully compatible with the subscription you are creating. A validation check will be performed when you create the subscription to confirm this.

To reuse a collection across multiple subscriptions with the same data type, first omit collection_id in your initial request to auto-create a collection. Then, use the new collection_id for all subsequent requests. This links all subscriptions to the same collection efficiently. Importantly, subscriptions with different data types cannot share a collection. As an example, Soil Water Content and Land Surface Temperature subscriptions cannot share the same collection.

You can browse your collections on the Sentinel Hub Dashboard under My Collections.

To learn more about creating collections check out the Bring Your Own COG API documentation.

Example Request

{
   "name": "PSScene8Band - SentinelHub",
   "source": {
       "type": "catalog",
       "parameters": {
           "item_types": [
               "PSScene"
           ],
           "asset_types": [
               "ortho_analytic_8b",
               "ortho_udm2"
           ],
           "start_time": "2023-08-05T00:00:00Z",
           "end_time": "2023-08-15T00:00:00Z",
           "time_range_type": "acquired",
           "geometry": {
               "type": "Polygon",
               "coordinates": [
                   [[2.31, 47.10],
                    [2.31, 47.13],
                    [2.35, 47.13],
                    [2.35, 47.10],
                    [2.31, 47.10]]
               ]
           }
       }
   },
   "hosting": {
       "type": "sentinel_hub"
   }
}

Example Request to Existing Sentinel Hub Collection

{
   "name": "PSScene8Band - SentinelHub Existing Collection",
   "source": {
       "type": "catalog",
       "parameters": {
           "item_types": [
               "PSScene"
           ],
           "asset_types": [
               "ortho_analytic_8b"
           ],
           "start_time": "2023-08-05T00:00:00Z",
           "end_time": "2023-08-15T00:00:00Z",
           "time_range_type": "acquired",
           "geometry": {
               "type": "Polygon",
               "coordinates": [
                   [[2.31, 47.10],
                    [2.31, 47.13],
                    [2.35, 47.13],
                    [2.35, 47.10],
                    [2.31, 47.10]]
               ]
           }
       }
   },
   "hosting": {
       "type": "sentinel_hub",
       "parameters": {
           "collection_id": "my_collection_id" //optional
       }
   }
}

Please note the following:

  • For imagery subscriptions, the following tools are permitted (see the example after this list):
    • harmonize
    • toar
    • cloud_filter
  • The clip and file_format (COG) tools are automatically added and cannot be removed.
  • No tools are supported for Planetary Variable subscriptions.
  • The hosting block eliminates the need to use the delivery block. Specifying both is not allowed.
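
For example, a permitted tool can be applied by adding a top-level tools block alongside the hosting block, as in the sketch below. The harmonize tool and its target_sensor parameter are shown as an assumed example; check the Subscriptions API tools documentation for the exact parameters supported by your subscription.

"tools": [
    {
        "type": "harmonize",
        "parameters": {
            "target_sensor": "Sentinel-2"
        }
    }
],
"hosting": {
    "type": "sentinel_hub"
}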

Delivery layout

When data for your subscription is delivered to your cloud storage solution, files are delivered in accordance with the following layout scheme: <subscription_id>/<item_id>/...

For example, if we're delivering the file 20170716_144316_1041_3B_AnalyticMS.tif for item 20170716_144316_1041 as output for subscription 0ee41665-ab3b-4faa-98d1-25738cdd579c, the file will be delivered to the path: 0ee41665-ab3b-4faa-98d1-25738cdd579c/20170716_144316_1041/20170716_144316_1041_3B_AnalyticMS.tif.


