"Failed to parse file" error when running a custom template on Dataflow

Problem description

I am trying to write a custom template that reads a CSV and writes it out to another CSV, the goal being to select the data I need from it. When I run it from the web interface I get the error shown below.

I reduced the code as much as possible to understand my mistake, but I still don't see it. I worked from the documentation: https://cloud.google.com/dataflow/docs/guides/templates/creating-templates#creating-and-staging-templates

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

class UploadOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_value_provider_argument(
            '--input',
            default='gs://[MYBUCKET]/input.csv',
            help='Path of the file to read from')
        parser.add_value_provider_argument(
            '--output',
            required=True,
            help='Output file to write results to.')

pipeline_options = PipelineOptions(['--output', 'gs://[MYBUCKET]/output'])
p = beam.Pipeline(options=pipeline_options)
upload_options = pipeline_options.view_as(UploadOptions)

(p
    | 'read' >> beam.io.Read(upload_options.input)
    | 'Write' >> beam.io.WriteToText(upload_options.output, file_name_suffix='.csv'))

The current error is the following:

Unable to parse file 'gs://MYBUCKET/template.py'.

In the terminal I get the following error:

ERROR: (gcloud.dataflow.jobs.run) FAILED_PRECONDITION: Unable to parse template file 'gs://[MYBUCKET]/template.py'.
- '@type': type.googleapis.com/google.rpc.PreconditionFailure
  violations:
  - description: "Unexpected end of stream : expected '{'"
    subject: 0:0
    type: JSON

Thanks in advance.

python google-cloud-platform google-cloud-dataflow
1 Answer

I managed to solve my problem. It came from the variable used in the Read step of my pipeline: the custom_options variable must be used in the Read, not the known_args variable.

custom_options = pipeline_options.view_as(CustomPipelineOptions)
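
For illustration, a minimal sketch of the difference (known_args.path is hypothetical here, shown only to contrast the two objects; the full working code follows below):

    # Wrong: known_args comes from argparse.parse_known_args() and holds
    # plain values fixed at template-creation time, so runtime parameters
    # never reach the pipeline.
    # init_data = p | 'Read' >> ReadFromText(known_args.path)

    # Right: the view_as(CustomPipelineOptions) object exposes a runtime
    # ValueProvider for each add_value_provider_argument option.
    init_data = p | 'Read' >> ReadFromText(custom_options.path)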

I made the code generic, and I'm sharing my solution in case anyone needs it.

from __future__ import absolute_import
import argparse

import apache_beam as beam
from apache_beam.io import ReadFromText
from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions, SetupOptions

class CustomPipelineOptions(PipelineOptions):
    """
    Runtime Parameters given during template execution
    path and organization parameters are necessary for execution of pipeline
    campaign is optional for committing to bigquery
    """
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_value_provider_argument(
            '--path',
            type=str,
            help='Path of the file to read from')
        parser.add_value_provider_argument(
            '--output',
            type=str,
            help='Output file if needed')

def run(argv=None):
    parser = argparse.ArgumentParser()
    known_args, pipeline_args = parser.parse_known_args(argv)

    # Make the option views visible to code that runs when the template executes.
    global cloud_options
    global custom_options

    pipeline_options = PipelineOptions(pipeline_args)
    cloud_options = pipeline_options.view_as(GoogleCloudOptions)
    custom_options = pipeline_options.view_as(CustomPipelineOptions)
    # Pickle the main session so module-level globals are available on the workers.
    pipeline_options.view_as(SetupOptions).save_main_session = True

    p = beam.Pipeline(options=pipeline_options)

    # ReadFromText accepts a ValueProvider, so the input path supplied via
    # custom_options is resolved at template execution time.
    init_data = (p
                 | 'Read Input path' >> ReadFromText(custom_options.path))

    result = p.run()
    # result.wait_until_finish()

if __name__ == '__main__':
    run()

Then launch the following command to generate the template on GCP:

python template.py --runner DataflowRunner --project $PROJECT --staging_location gs://$BUCKET/staging --temp_location gs://$BUCKET/temp --template_location gs://$BUCKET/templates/$TemplateName
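
Once the template is staged, it can be executed with the runtime parameters; a sketch, assuming a job name of my-template-job and the path/output parameters defined above (the bucket paths are placeholders):

    gcloud dataflow jobs run my-template-job \
        --gcs-location gs://$BUCKET/templates/$TemplateName \
        --parameters path=gs://$BUCKET/input.csv,output=gs://$BUCKET/output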