It is possible to connect to multiple Data Cube indexes from within a single Python process.
When initialising a Datacube instance, it will load configuration options from one or more config files. These configuration options define which indexes are available, and any parameters required to connect to them.
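For example, a minimal sketch (assuming the config file defines a staging environment alongside the default one, as in the example config below):

from datacube import Datacube

# Each instance connects to the index described by its environment section.
dc_default = Datacube()               # default environment from the config file
dc_staging = Datacube(env='staging')  # the [staging] section

# Both indexes can now be queried independently within the same process.
print(dc_default.list_products())
print(dc_staging.list_products())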
Types of Indexes
It is possible to implement a custom index driver and hook it into the datacube via a plugin mechanism. This is an experimental feature that was used to support the S3 AIO format. The index driver interface, however, is not well defined, and it is unrealistic to implement a completely new backend. One could, however, extend the existing PostgreSQL backend, and this was the strategy used by the S3 AIO driver before it was decommissioned.
The type of index driver to use is defined by the
index_driver option in
each section of the user config file.
The runtime config specifies configuration options for the current user, such as available Data Cube instances and which to use by default.
It is loaded from a number of locations in order, if they exist, with properties from later files overriding those in earlier ones. For example:
[default]
db_database: datacube

# A blank host will use a local socket. Specify a hostname (such as localhost) to use TCP.
db_hostname:

# Credentials are optional: you might have other Postgres authentication configured.
# The default username is the current user id
# db_username:
# A blank password will fall back to default postgres driver authentication, such as reading your ~/.pgpass file.
# db_password:

index_driver: pg

## Development environment ##
[dev]
# These fields are all the defaults, so they could be omitted, but are here for reference
db_database: datacube

# A blank host will use a local socket. Specify a hostname (such as localhost) to use TCP.
db_hostname:

# Credentials are optional: you might have other Postgres authentication configured.
# The default username is the current user id
# db_username:
# A blank password will fall back to default postgres driver authentication, such as reading your ~/.pgpass file.
# db_password:

## Staging environment ##
[staging]
db_hostname: staging.dea.ga.gov.au
Note that the staging environment only specifies the hostname; all other fields will use default values (database datacube, the current username, and the password loaded from ~/.pgpass).
When using the datacube, it will use your default environment unless you specify one explicitly:

with Datacube(env='staging') as dc:
    ...
or, for CLI commands:
datacube -E staging system check
Configuration via Environment Variables
It is also possible to configure datacube with a single environment variable: DATACUBE_DB_URL. This is often convenient when using datacube applications inside a docker image. The format of the URL is the same as used by SQLAlchemy (postgresql://user:password@host:port/database); only the database name is compulsory. Note that the password is url encoded, so it can contain special characters. For more information you can consult the SQLAlchemy documentation.
Examples:

Connect to local database datacube via UNIX socket.

Connect to database db1 on a remote server db.host.tld on the default port (5432), using the ro_user username with a password.

Same as above, but using a non-default port.
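As an illustration, setting the variable might look like the following shell sketch. The host, user and database names match the example above; the password (s3cr%21t, i.e. "s3cr!t" percent-encoded) and port 5433 are made-up placeholders, not values from this documentation:

# Placeholder credentials and port; substitute your own values.
export DATACUBE_DB_URL="postgresql://ro_user:s3cr%21t@db.host.tld:5433/db1"
datacube system check   # verify that the connection works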