Pipe to export Datastore to Cloud Storage.
Add the following snippet to the script section of your `bitbucket-pipelines.yml` file:
```yaml
- pipe: tohero/google-datastore-export:1.0.0
  variables:
    KEY_FILE: '<string>'
    PROJECT: '<string>'
    DESTINATION: '<string>'
    # DEBUG: '<boolean>' # Optional.
```
| Variable | Usage |
| --- | --- |
| KEY_FILE (*) | Base64-encoded content of the key file for a Google service account. To encode this content, follow the encode private key doc. |
| PROJECT (*) | The project ID of the Google Cloud project that owns the Datastore to export. |
| DESTINATION (*) | The Cloud Storage destination (without the `gs://` prefix). |
| DEBUG | Turn on extra debug information. Default: `false`. |
(*) = required variable.
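The KEY_FILE variable expects the service-account JSON key as a single base64 line (Bitbucket secured variables must not contain newlines). A minimal sketch of the encoding step; the file name and JSON contents here are illustrative only:

```shell
# Illustrative key file path; use your downloaded service-account key.
KEY_PATH=key-file.json
printf '{"type":"service_account","project_id":"my-project"}' > "$KEY_PATH"

# Encode to a single base64 line. GNU base64 wraps output by default,
# so -w 0 disables wrapping; macOS base64 already emits one line.
KEY_FILE=$(base64 -w 0 < "$KEY_PATH")
echo "$KEY_FILE"
```

Store the resulting string as a secured repository variable named `KEY_FILE`.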
With the Google Datastore export pipe you can export your Datastore entities to a Cloud Storage destination. All required gcloud components are installed in the pipe's image.

Before running the pipe, make sure that:

- An IAM service account is configured with sufficient permissions to perform a Datastore export and to write to the Cloud Storage destination.
- You have enabled the APIs and services needed for your application.
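One way to satisfy these prerequisites with the gcloud CLI; the project, bucket, and service-account names below are assumptions for illustration:

```shell
# Enable the Datastore API (assumed project name).
gcloud services enable datastore.googleapis.com --project=my-project

# Grant the service account permission to run managed exports.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:pipe-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/datastore.importExportAdmin"

# Allow the service account to write export output to the bucket.
gsutil iam ch \
  serviceAccount:pipe-sa@my-project.iam.gserviceaccount.com:objectAdmin \
  gs://export.my-project.com
```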
Basic example:

```yaml
script:
  - pipe: tohero/google-datastore-export:1.0.0
    variables:
      KEY_FILE: $KEY_FILE
      PROJECT: 'my-project'
      DESTINATION: 'export.my-project.com/my-export'
```
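Under the hood the pipe boils down to authenticating with the decoded key and starting a managed export; a rough sketch (not the pipe's exact implementation, and the paths are illustrative):

```shell
# Decode the base64 key passed in via KEY_FILE and authenticate.
echo "$KEY_FILE" | base64 -d > /tmp/key-file.json
gcloud auth activate-service-account --key-file=/tmp/key-file.json

# Start a managed Datastore export to the Cloud Storage destination.
gcloud datastore export gs://export.my-project.com/my-export \
  --project=my-project
```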
To build and publish the pipe's Docker image:

```shell
docker build -t tohero/google-datastore-export .
docker push tohero/google-datastore-export
```
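For local testing, the pipe's variables can be passed to the image as environment variables; a sketch, assuming the variable values from the example above:

```shell
# Run the pipe image locally with illustrative variable values.
docker run --rm \
  -e KEY_FILE="$KEY_FILE" \
  -e PROJECT="my-project" \
  -e DESTINATION="export.my-project.com/my-export" \
  tohero/google-datastore-export:1.0.0
```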
Apache 2.0 licensed, see LICENSE.txt file.