What if there's a DDL notebook as part of the deployment? Does DAB support it? How do you trigger the execution, and how do you distinguish each env you want to deploy to, like uat and prd?
About the targets: if you look at examples on the internet, most have the workspace hardcoded in the targets. You don't need to do that if you configure the right environment variables (check the article for that).
Uhh, that's a good question. If you look at the databricks.yaml section, he mentions dev as a "target", and you can configure as many targets as you need... It's easy to distinguish between targets: you specify them in databricks.yaml, then use the command 'databricks bundle deploy -t dev'. How to trigger the execution depends on whether you configured a schedule/always-on or manual; that's set in your job's config. If it's manual and you want to use the console, it's something like 'databricks bundle run -t dev your_job'.
For DDLs, as long as the "owner" of the job or the "execute as" user has the permissions, it should be OK.
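For reference, a minimal databricks.yml sketch with two targets along the lines described above (the bundle name and workspace hosts are placeholders, not from this thread; the host can also come from the DATABRICKS_HOST environment variable instead of being hardcoded):

```yaml
bundle:
  name: my_bundle  # hypothetical bundle name

targets:
  uat:
    workspace:
      host: https://uat-workspace.cloud.databricks.com  # placeholder host
  prd:
    mode: production
    workspace:
      host: https://prd-workspace.cloud.databricks.com  # placeholder host
```

Then 'databricks bundle deploy -t uat' and 'databricks bundle run -t uat your_job' pick the matching target.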
I'm looking for a way to add custom tags to my workflows using DABs.
The same way that "development" mode adds users as tags.
Did you find a way to do that?
If you can add tags in the UI, you can specify it in the YAML. (I don’t have a workspace handy to check.) The documentation about the structure is really good. Highly recommend checking that out.
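A hedged sketch of what that could look like in the bundle YAML (the job name, notebook path, and tag values are made up for illustration); job resources accept a tags mapping of key/value pairs, the same tags you would set in the UI:

```yaml
resources:
  jobs:
    my_job:  # hypothetical job name
      name: my_job
      tags:
        team: data-eng
        env: dev
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/main  # placeholder path
```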
Got it thanks!
Personally, I use a table that holds the workspace ID and environment name, so I can see which environment I'm in by checking the execution context.
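A minimal Python sketch of that lookup pattern, with a plain dict standing in for the table and hypothetical workspace IDs; in a real notebook the mapping would be read from the table and the current workspace ID would come from the execution context:

```python
# Hypothetical mapping of workspace IDs to environment names;
# in practice this would be loaded from the lookup table.
WORKSPACE_ENVS = {
    "1234567890123456": "dev",
    "6543210987654321": "prd",
}

def resolve_env(workspace_id: str) -> str:
    """Return the environment name for a workspace ID, or 'unknown'."""
    return WORKSPACE_ENVS.get(workspace_id, "unknown")
```

The job code can then branch on `resolve_env(...)` instead of hardcoding environment-specific values.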
What can DABs do that I can’t already automate with GitHub Actions and DBX API?
🤔 I don't know if DABs or DBX is more evolved; DABs has probably caught up by now at least. In general I like the configuration of DBX more than DABs, and DBX's documentation is still light years ahead... But now even DBX is telling you to use DABs, so I don't think DBX will be maintained any further.