feat: Create databricks catalog ext loc modules #614
Conversation
    databricks = {
      source  = "databricks/databricks"
      version = "1.49.1"
    }
Required for the `isolation_mode` attribute on `databricks_external_location`.
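For reference, a minimal sketch of how this provider pin is used; the resource, credential, and bucket names below are hypothetical, not from this PR:

```hcl
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.49.1"
    }
  }
}

# isolation_mode on databricks_external_location needs a recent provider version
resource "databricks_external_location" "example" {
  name            = "example-external-location"    # hypothetical name
  url             = "s3://example-catalog-bucket/" # hypothetical bucket
  credential_name = "example-storage-credential"   # hypothetical credential
  isolation_mode  = "ISOLATION_MODE_ISOLATED"
}
```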
The main concerns here are sharing the same storage root between multiple catalogs, and support for multiple external locations per Databricks workspace.
Other than that, this looks good - let's try it out!
Summary
This creates an S3 bucket for Databricks catalogs to use and sets up the IAM permissions that allow Databricks to read and write to the bucket. To associate a Databricks catalog with an S3 bucket, the catalog needs a storage credential connected to an external location (the S3 bucket).
The `databricks-catalog` module creates a new catalog associated with the bucket and grants the desired permissions.

Test Plan
Tested the modules in shared-infra. They will be removed from shared-infra after merging and releasing.