In Databricks

Problem description (0 votes, 1 answer)

The job fails at the moment the repo files are being copied, and I don't know how to solve it. Any ideas?

These are the job settings:
resources:
  jobs:
    otd:
      name: otd
      email_notifications:
        on_failure:
          - [email protected]
        no_alert_for_skipped_runs: true
      notification_settings:
        no_alert_for_skipped_runs: true
        no_alert_for_canceled_runs: true
      tasks:
        - task_key: otd_dbt
          dbt_task:
            project_directory: ""
            commands:
              - dbt deps
              - dbt build -s +otd_total
            schema: gold
            warehouse_id: xxxxxxxxxxx
            catalog: logistics_prd
            source: GIT
          job_cluster_key: dbt_CLI
          libraries:
            - pypi:
                package: dbt-databricks>=1.0.0,<2.0.0
      job_clusters:
        - job_cluster_key: dbt_CLI
          new_cluster:
            cluster_name: ""
            spark_version: 15.4.x-scala2.12
            spark_conf:
              spark.master: local[*, 4]
              spark.databricks.cluster.profile: singleNode
            azure_attributes:
              first_on_demand: 1
              availability: ON_DEMAND_AZURE
              spot_bid_max_price: -1
            node_type_id: Standard_D4ds_v5
            custom_tags:
              ResourceClass: SingleNode
            spark_env_vars:
              PYSPARK_PYTHON: /databricks/python3/bin/python3
            enable_elastic_disk: true
            data_security_mode: SINGLE_USER
            runtime_engine: PHOTON
            num_workers: 0
      git_source:
        git_url: https://dev.azure.com/copa-energia/Logistics/_git/dbt_logistica
        git_provider: azureDevOpsServices
        git_branch: main
      queue:
        enabled: true
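For context, a job defined under `resources: jobs:` like this is a Databricks Asset Bundle resource and is typically validated, deployed, and run with the Databricks CLI. A minimal sketch, assuming a `databricks.yml` bundle root and a configured workspace profile (the target name `prod` is hypothetical):

```shell
databricks bundle validate          # check the YAML against the bundle schema
databricks bundle deploy -t prod    # push the job definition to the workspace
databricks bundle run otd -t prod   # trigger the "otd" job defined above
```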

Feel free to ask me for more details.

It turned out the solution was simpler than I expected.
Since the files were not required, I could simply delete them from the repository and add them to .gitignore:

.gitignore:

...
venv/
__pycache__/
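Note that if venv/ and __pycache__/ were already committed, adding them to .gitignore alone does not untrack them; `git rm --cached` is also needed. A minimal sketch in a throwaway repo (the directory and file names are made up for the demo):

```shell
# Demo: untracking already-committed directories after adding them
# to .gitignore. Everything here runs in a temporary repository.
set -e
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"

# Simulate accidentally committing the virtualenv and bytecode caches.
mkdir venv __pycache__
touch venv/activate __pycache__/mod.cpython-311.pyc
git add -A
git commit -q -m "accidentally track venv and __pycache__"

# Ignore them and drop them from the index (working tree is kept).
printf 'venv/\n__pycache__/\n' > .gitignore
git rm -r -q --cached venv __pycache__
git add .gitignore
git commit -q -m "stop tracking venv/ and __pycache__"

git ls-files   # only .gitignore remains tracked
```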

Tags: azure-devops databricks dbt
