Conversation

@alexott (Contributor) commented Nov 14, 2025

Changes

Move the defaults for the `databricks_spark_version` data source from the Go SDK into the Terraform provider. This is the first part of the #5218 work: after the new Go SDK is merged, we'll need to change the Scala default to 2.1.

Related to databricks/databricks-sdk-go#1331
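As a rough illustration of what "moving defaults into the provider" means, here is a minimal sketch in Go. The struct and function names (`sparkVersionRequest`, `applyDefaults`) and the `"2.12"` default value are assumptions for illustration, not the provider's actual schema or code:

```go
package main

import "fmt"

// sparkVersionRequest is a hypothetical, simplified view of the inputs
// to the databricks_spark_version data source.
type sparkVersionRequest struct {
	Scala string
	LTS   bool
}

// applyDefaults fills in defaults on the Terraform provider side instead
// of relying on the Go SDK to do so. The Scala default shown here is an
// assumption; the PR notes it will need to change once the new SDK lands.
func applyDefaults(r *sparkVersionRequest) {
	if r.Scala == "" {
		r.Scala = "2.12" // assumed current default
	}
}

func main() {
	r := &sparkVersionRequest{}
	applyDefaults(r)
	fmt.Println(r.Scala)
}
```

Keeping the defaulting logic in the provider means the SDK can drop its own defaults without silently changing the data source's behavior for existing configurations.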

Tests

  • make test run locally
  • relevant change in docs/ folder
  • covered with integration tests in internal/acceptance
  • using Go SDK
  • using TF Plugin Framework
  • has entry in NEXT_CHANGELOG.md file

@alexott alexott requested review from a team as code owners November 14, 2025 17:52
@alexott alexott requested review from rauchy and removed request for a team November 14, 2025 17:52
@github-actions commented

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/terraform

Inputs:

  • PR number: 5219
  • Commit SHA: 33d69c17e46f3a158720a1f0e1458371080acabd

Checks will be approved automatically on success.

@alexott alexott deployed to test-trigger-is November 14, 2025 17:53 — with GitHub Actions Active