### Describe the issue
We build a wheel with `setup.py`, incrementing the version as we go. We prefer to lock our jobs to specific versions of the wheel to prevent errors down the road.

If we create a job locked to version `0.1.0` and then bump the wheel to `0.2.0`, the deployment fails. Running `databricks bundle deploy` cleans up the `dist` directory, causing the deployment to fail with:

```
Error: no files match pattern: dist/core-0.1.0*.whl
  at resources.jobs.job_locked_version.tasks[1].libraries[0].whl
     in resources\jobs\job_locked_version.yml:21:15
```

Even if the deployment were to go through, running the job would also fail due to the missing wheel in `.internal`.
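For context, a minimal sketch of how such a wheel could be built through the bundle's `artifacts` section (the artifact name `core` and the build command here are assumptions for illustration, not necessarily the exact config):

```yaml
# Hypothetical artifacts section; the name "core" matches the wheel
# pattern used by the job below, and the build command is assumed.
artifacts:
  core:
    type: whl
    build: python setup.py bdist_wheel  # writes dist/core-<version>-*.whl
    path: .
```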
### Configuration
Job locked to a specific version:
```yaml
resources:
  jobs:
    job_locked_version:
      name: job_locked_version
      tasks:
        - task_key: some_task
          job_cluster_key: job_cluster
          notebook_task:
            notebook_path: ../../src/integration/nb_integration_example.ipynb
          libraries:
            - whl: ../../dist/core-0.1.0*.whl
      job_clusters:
        - job_cluster_key: job_cluster
          new_cluster:
            spark_version: 15.4.x-scala2.12
            azure_attributes:
              availability: ON_DEMAND_AZURE
            node_type_id: Standard_D3_v2
            enable_elastic_disk: true
            num_workers: 1
            data_security_mode: SINGLE_USER
```
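Note that the glob in the `whl` path is what locks the version: wheel file names carry Python and platform tags (e.g. `core-0.1.0-py3-none-any.whl`), so the pattern pins `0.1.0` while matching any tag suffix.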
### Steps to reproduce the behavior
- Prep the bundle so it builds a wheel.
- Create a job locked to a specific version of the wheel.
- Deploy to verify success.
- Bump the version in the wheel.
- Redeploy (see the sketch below).
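At this point the bundle builds `dist/core-0.2.0-*.whl`, while the job is still locked to the old version. A sketch of the mismatch, reusing the config above:

```yaml
# `bundle deploy` cleans dist/ and rebuilds it with only the 0.2.0 wheel,
# so this locked pattern no longer matches any file:
libraries:
  - whl: ../../dist/core-0.1.0*.whl
```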
### Expected Behavior
- Ability to disable DABs validation of wheel existence in `dist`.
- Previous wheel versions are retained in `.internal`.
### Actual Behavior
- Deploy fails due to the missing wheel in `dist`.
- Job runs fail due to the missing wheel in `.internal`.
### OS and CLI version
- Databricks CLI v0.252.0
- Windows 11 Enterprise
### Is this a regression?
Yes, for the problem related to `.internal`; that behavior changed in #2526.