Added 'databricks bundle plan' command #3530
Conversation
cmd.Flags().StringVarP(&clusterId, "cluster-id", "c", "", "Override cluster in the deployment with the given cluster ID.")
cmd.Flags().MarkDeprecated("compute-id", "use --cluster-id instead")

cmd.PreRunE = func(cmd *cobra.Command, args []string) error {
Quick question: why is it in PreRunE and not RunE?
Just to separate these two and run the check earlier
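For context, here is a minimal, hypothetical sketch of that split (not the actual CLI code; the flag and validation are placeholders): Cobra invokes PreRunE before RunE, so a check placed there fails fast, before any of the real work begins.

```go
package main

import (
	"fmt"

	"github.com/spf13/cobra"
)

// newExampleCommand is a simplified illustration of separating validation
// (PreRunE) from the command's work (RunE).
func newExampleCommand() *cobra.Command {
	var clusterId string

	cmd := &cobra.Command{
		Use: "deploy",
		RunE: func(cmd *cobra.Command, args []string) error {
			// The actual work happens here, only after PreRunE succeeded.
			fmt.Fprintf(cmd.OutOrStdout(), "deploying with cluster %q\n", clusterId)
			return nil
		},
	}

	cmd.Flags().StringVarP(&clusterId, "cluster-id", "c", "", "Override cluster in the deployment with the given cluster ID.")

	// Validation is kept separate from the run logic and executes first.
	cmd.PreRunE = func(cmd *cobra.Command, args []string) error {
		if clusterId == "" {
			return fmt.Errorf("--cluster-id must not be empty") // placeholder check
		}
		return nil
	}

	return cmd
}
```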
func newPlanCommand() *cobra.Command {
	cmd := &cobra.Command{
		Use: "plan",
What is the exact expected behavior of bundle plan?
Running it locally:
$ databricks bundle plan
Building python_artifact...
Uploading dist/my_project-0.0.1-py3-none-any.whl...
create jobs.my_project_job
delete jobs.sample_job
create pipelines.my_project_pipeline
delete pipelines.sample_etl
- It seems odd that it builds artifacts as a side effect (but maybe this is necessary?)
- It seems odd that it uploads artifacts as a side effect (edit: this appears to be fixed in "Fix bundle plan to not create workspace objects or upload the files" #3442)
- Should there not be a summary line, like in Terraform (`Plan: 2 to add, 0 to change, 2 to destroy`)?
- Should we not use the term "destroy" instead of "delete"? Since that is also the term used for `databricks bundle destroy`?
We need to be certain of the interface before making it public
cc @denik
Building an artifact is unfortunately required, because we don't know the final filename of the artifact and we need that filename in the resource configuration.
Uploading the artifact is certainly a bug and we should have an acceptance test plus a fix for it. I think we should fix it before the release.
A summary is nice to have but could be a follow-up.
Regarding delete/destroy, this is not new; this is what DABs have always used in other places:
The following resources will be deleted:
delete app myapp
@lennartkats-db are you sure you're using the latest main? Not uploading artifacts as part of the plan was fixed in #3442.
Cool, yes, I wasn't using the latest main; good to see uploading is fixed.
It still seems to build the artifact though, but I suppose that might be necessary? It's quite unfortunate, but if building is indeed necessary, then at least printing a message about it offers that transparency.
And `delete` is indeed used in other places. I also kind of like that word. But we should treat this moment as the time when we decide to cast that in stone.
The summary I'd really want to have, and it seems easy to add.
One more observation: the output of the command should go to stdout, not stderr.
Should we not use the term "destroy" instead of "delete"? Since that is also the term used for the databricks bundle destroy?
Deletion can happen as a result of "bundle deploy" as well. It's a more specific and recognizable term IMO. All APIs and SDKs call this deletion, not destruction.
Cool. So how about
- We make sure output goes to stdout (this is the most crucial change)
- We add that summary line item
- We maybe do some minor formatting tweaks (to make the summary and log lines stand out + follow the two-space indent of existing warnings)
So output could look like
$ databricks bundle plan
Building python_artifact... ### this would still go to stderr
Plan: 2 to add, 0 to change, 2 to delete
create jobs.my_project_job
delete jobs.sample_job
create pipelines.my_project_pipeline
delete pipelines.sample_etl
(I'd also be up for variations of this exact output, but I'm proposing example output to try to close on this.)
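A rough sketch of what the proposed formatting could look like in code (hypothetical; the `action` type and `printPlan` helper are illustrations, not the CLI's actual structures). It prints the summary line and the per-resource lines to stdout with a two-space indent, while progress messages such as "Building python_artifact..." would remain on stderr.

```go
package main

import (
	"fmt"
	"os"
)

// action is a hypothetical representation of one planned change.
type action struct {
	verb     string // "create", "update", or "delete"
	resource string // e.g. "jobs.my_project_job"
}

// printPlan writes the Terraform-style summary followed by the individual
// actions, all to stdout (the discussion above asks for stdout, not stderr).
func printPlan(actions []action) {
	var add, change, del int
	for _, a := range actions {
		switch a.verb {
		case "create":
			add++
		case "update":
			change++
		case "delete":
			del++
		}
	}
	fmt.Fprintf(os.Stdout, "Plan: %d to add, %d to change, %d to delete\n", add, change, del)
	for _, a := range actions {
		// Two-space indent, following the existing warning formatting.
		fmt.Fprintf(os.Stdout, "  %s %s\n", a.verb, a.resource)
	}
}

func main() {
	printPlan([]action{
		{"create", "jobs.my_project_job"},
		{"delete", "jobs.sample_job"},
		{"create", "pipelines.my_project_pipeline"},
		{"delete", "pipelines.sample_etl"},
	})
}
```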
Here's a PR to improve the plan output: #3546
Co-authored-by: Lennart Kats (databricks) <[email protected]>
## Release v0.270.0

### Notable Changes
* Add 'databricks bundle plan' command. This command shows the deployment plan for the current bundle configuration without making any changes. ([#3530](#3530))

### Bundles
* Add 'databricks bundle plan' command ([#3530](#3530))
* Add new Lakeflow Pipelines support for bundle generate ([#3568](#3568))
* Fix bundle deploy to not update permissions or grants for unbound resources ([#3642](#3642))
* Introduce new bundle variable: `${workspace.current_user.domain_friendly_name}` ([#3623](#3623))
* Improve the output of bundle run when bundle is not deployed ([#3652](#3652))
Changes
Made 'databricks bundle plan' command public (previously hidden)
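For illustration only (this is not necessarily how the PR implements it): Cobra subcommands can be kept out of help output via the command's `Hidden` field, so "making a command public" typically amounts to no longer setting that field.

```go
package main

import "github.com/spf13/cobra"

// Hypothetical sketch: while a command is experimental it can be hidden from
// `--help`; making it public means the Hidden flag is no longer set.
func newPlanCommand() *cobra.Command {
	return &cobra.Command{
		Use:   "plan",
		Short: "Show the deployment plan for the current bundle configuration",
		// Hidden: true, // previously: hide the command from help output
	}
}
```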
Why
This command builds the bundle and displays the actions that will be performed on the deployed resources without making any changes.
It is useful for previewing changes before running `bundle deploy`. The output of the command is a high-level list of actions and the resources that are changed, for example:
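(Sample output, reproduced from the review discussion above:)

$ databricks bundle plan
Building python_artifact...
create jobs.my_project_job
delete jobs.sample_job
create pipelines.my_project_pipeline
delete pipelines.sample_etl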
Tests
Covered by existing tests