datacube.utils.dask.save_blob_to_s3#

datacube.utils.dask.save_blob_to_s3(data, url, profile=None, creds=None, region_name=None, with_deps=None, **kw)[source]#

Dump from memory to S3 as a dask delayed operation.

Parameters
  • data – Data blob to save to file (must fit into memory all at once)

  • url – URL of the form s3://bucket/path/to/file

  • profile – Profile name to look up (only used if a session is not supplied)

  • creds – Override credentials with supplied data

  • region_name – Region name to use, overrides session setting

  • with_deps – Useful for introducing dependencies into the dask graph, for example to save a YAML file only after all TIFF files have been saved.

  • kw – Passed on to s3.put_object(..), useful for things like ContentType or ACL

Returns

  • (url, True) tuple on success

  • (url, False) on any error
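Because errors are reported through the return value rather than raised, callers can inspect the `(url, ok)` tuple after computing the graph. The sketch below illustrates that contract with a hypothetical local stand-in, `save_blob_local`, which writes to a `file://` path instead of S3; the real function uploads via `s3.put_object(..)` and is wrapped in `dask.delayed`.

```python
import os
import tempfile


def save_blob_local(data: bytes, url: str) -> tuple:
    """Hypothetical stand-in mimicking save_blob_to_s3's return contract.

    Writes to a local file:// path instead of uploading to S3, and
    returns (url, True) on success or (url, False) on any error,
    never raising.
    """
    try:
        path = url[len("file://"):] if url.startswith("file://") else url
        with open(path, "wb") as f:
            f.write(data)
        return (url, True)
    except OSError:
        return (url, False)


# Success: a writable destination yields (url, True)
ok_url = "file://" + os.path.join(tempfile.gettempdir(), "blob.bin")
print(save_blob_local(b"hello", ok_url))

# Failure: a missing directory yields (url, False) instead of an exception
print(save_blob_local(b"hello", "file:///no/such/dir/blob.bin"))
```

In real use the delayed result would be materialized with `dask.compute(...)`, and the second element of each tuple checked to detect failed uploads.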