Java BlockingDataflowPipelineRunner Class Code Examples


This article collects typical usage examples of the Java class com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner. If you are unsure what BlockingDataflowPipelineRunner does or how to use it, the selected class code examples below should help.



The BlockingDataflowPipelineRunner class belongs to the com.google.cloud.dataflow.sdk.runners package. Nine code examples of the class are shown below, ordered by popularity.
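Before the individual examples, here is a minimal, self-contained sketch of the pattern they all share: configure DataflowPipelineOptions, select BlockingDataflowPipelineRunner so that run() blocks until the Dataflow job finishes, and create a Pipeline from those options. The project id and staging bucket below are placeholders, not values taken from any of the examples.

import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner;

public class BlockingRunnerSketch {
  public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.create()
        .as(DataflowPipelineOptions.class);
    options.setRunner(BlockingDataflowPipelineRunner.class); // block until the job completes
    options.setProject("my-gcp-project");                    // placeholder project id
    options.setStagingLocation("gs://my-bucket/staging");    // placeholder staging location

    Pipeline p = Pipeline.create(options);
    // ... apply transforms here ...
    p.run(); // with the blocking runner, run() returns only after the Dataflow job finishes
  }
}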

Example 1: setupPipeline

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
private Pipeline setupPipeline(final String inputPath, final String outputPath, boolean enableGcs, boolean enableCloudExec) {
    final GATKGCSOptions options = PipelineOptionsFactory.as(GATKGCSOptions.class);
    if (enableCloudExec) {
        options.setStagingLocation(getGCPTestStaging());
        options.setProject(getGCPTestProject());
        options.setRunner(BlockingDataflowPipelineRunner.class);
    } else if (BucketUtils.isHadoopUrl(inputPath) || BucketUtils.isHadoopUrl(outputPath)) {
        options.setRunner(SparkPipelineRunner.class);
    } else {
        options.setRunner(DirectPipelineRunner.class);
    }
    if (enableGcs) {
        options.setApiKey(getGCPTestApiKey());
    }
    final Pipeline p = Pipeline.create(options);
    DataflowUtils.registerGATKCoders(p);
    return p;
}
 
Developer: broadinstitute | Project: gatk-dataflow | Lines: 19 | Source: SmallBamWriterTest.java
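A brief, hypothetical call site for the helper above (the paths and flag values are illustrative only; the real test builds further transforms on the returned Pipeline before running it):

// Hypothetical usage, not part of the original test:
final Pipeline p = setupPipeline("gs://my-bucket/input.bam", "gs://my-bucket/output.bam",
    /* enableGcs */ true, /* enableCloudExec */ false);
// ... apply the transforms under test to p ...
p.run(); // if BlockingDataflowPipelineRunner was selected, this blocks until the job completes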


Example 2: getCloudExecutionOptions

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
private PipelineOptions getCloudExecutionOptions(String stagingLocation) {
  DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
  options.setProject(Constants.PROJECT_ID);
  options.setStagingLocation(stagingLocation);
  options.setRunner(BlockingDataflowPipelineRunner.class);
  return options;
}
 
Developer: GoogleCloudPlatform | Project: policyscanner | Lines: 8 | Source: LiveStateCheckerApp.java
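The returned options are typically handed to Pipeline.create. A minimal, hypothetical caller (the staging path is a placeholder, not taken from the project):

// Hypothetical caller, not part of the original class:
PipelineOptions options = getCloudExecutionOptions("gs://my-bucket/staging");
Pipeline pipeline = Pipeline.create(options);
// ... apply transforms ...
pipeline.run(); // blocks until the Dataflow job completes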


Example 3: getCloudExecutionOptions

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
private PipelineOptions getCloudExecutionOptions(String stagingLocation) {
  DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
  options.setProject(SystemProperty.applicationId.get());
  options.setStagingLocation(stagingLocation);
  options.setRunner(BlockingDataflowPipelineRunner.class);
  return options;
}
 
Developer: GoogleCloudPlatform | Project: policyscanner | Lines: 8 | Source: UserManagedKeysApp.java


Example 4: getCloudExecutionOptions

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
private static PipelineOptions getCloudExecutionOptions(String stagingLocation) {
  DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
  options.setProject(SystemProperty.applicationId.get());
  options.setStagingLocation(stagingLocation);
  options.setRunner(BlockingDataflowPipelineRunner.class);
  return options;
}
 
Developer: GoogleCloudPlatform | Project: policyscanner | Lines: 8 | Source: LiveStateCheckerRunner.java


Example 5: runner

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
private static Class<? extends PipelineRunner<?>> runner(String name) {
  Class<? extends PipelineRunner<?>> c = DirectPipelineRunner.class; // default

  if (DEFAULT_RUNNER.equals(name) || name == null) {
    c = DataflowPipelineRunner.class;
  } else if (BLOCKING_RUNNER.equals(name)) {
    c = BlockingDataflowPipelineRunner.class;
  } else if (DIRECT_RUNNER.equals(name)) {
    c = DirectPipelineRunner.class;
  }
  return c;
}
 
Developer: googlegenomics | Project: dockerflow | Lines: 13 | Source: DataflowFactory.java
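A hypothetical caller of the helper above. The actual values of DEFAULT_RUNNER, BLOCKING_RUNNER, and DIRECT_RUNNER are string constants defined elsewhere in DataflowFactory, so the literal "blocking" below is only an assumption for illustration:

// Hypothetical usage; "blocking" is assumed to equal BLOCKING_RUNNER:
Class<? extends PipelineRunner<?>> blocking = runner("blocking"); // BlockingDataflowPipelineRunner.class
Class<? extends PipelineRunner<?>> byDefault = runner(null);      // DataflowPipelineRunner.class
DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
options.setRunner(blocking);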


Example 6: run

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
public static void run() {
  DataflowPipelineOptions options = PipelineOptionsFactory.create()
      .as(DataflowPipelineOptions.class);
  options.setRunner(BlockingDataflowPipelineRunner.class);
  options.setProject("chrome-oven-144308");
  options.setFilesToStage(
      detectClassPathResourcesToStage(
          DataflowPipelineRunner.class.getClassLoader()
      )
  );
  options.setStagingLocation("gs://dataflow-chrome-oven-144308/stagingForScheduledPipeline");

  Pipeline p = Pipeline.create(options);

  System.out.println("get here 0");
  p.apply(TextIO.Read.from("gs://dataflow-samples/shakespeare/*"))
      .apply(ParDo.named("ExtractWords").of(new DoFn<String, String>() {
        @Override
        public void processElement(ProcessContext c) {
          System.out.println("get here 1");
          for (String word : c.element().split("[^a-zA-Z']+")) {
            if (!word.isEmpty()) {
              c.output(word);
            }
          }
        }
      }))
      .apply(Count.<String>perElement())
      .apply("FormatResults", MapElements.via(new SimpleFunction<KV<String, Long>, String>() {
        @Override
        public String apply(KV<String, Long> input) {
          System.out.println("get here 3");
          return input.getKey() + ": " + input.getValue();
        }
      }))

      .apply(TextIO.Write.to("gs://dataflow-chrome-oven-144308/scheduled"));

  p.run();
}
 
Developer: viktort | Project: appengine-cron-example | Lines: 41 | Source: ScheduledMinimalWordCount.java


Example 7: run

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
public static void run(String[] args) {
  System.out.println("Making GCS->Datastore pipeline");

  Options options = PipelineOptionsFactory.fromArgs(args)
      .withValidation()
      .as(Options.class);

  if (options.getIsBlocking()) {
    options.setRunner(BlockingDataflowPipelineRunner.class);
  } else {
    options.setRunner(DataflowPipelineRunner.class);
    options.setStreaming(false);
  }

  // Create Datastore sink
  DatastoreV1.Write write = DatastoreIO.v1().write()
      .withProjectId(options.getProject());


  // Build our data pipeline
  Pipeline pipeline = Pipeline.create(options);
  pipeline.apply("ReadBackup", TextIO.Read.from(options.getBackupGCSPrefix() + "*"))
  .apply("JsonToEntity", ParDo.of(new JsonToEntity()))
  .apply("EntityToDatastore", write);

  System.out.println("Running pipeline");
  pipeline.run();
}
 
Developer: cobookman | Project: DatastoreToGCS | Lines: 29 | Source: GCSRestore.java
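Examples 7 through 9 all read flags from a project-specific Options interface that is not shown on this page. The following is a minimal sketch of what it might look like, inferred only from the getters used in these examples; the annotations and defaults are assumptions, not the project's actual definition:

import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.Default;
import com.google.cloud.dataflow.sdk.options.Description;

// Hypothetical reconstruction of the Options interface used in Examples 7-9.
public interface Options extends DataflowPipelineOptions {
  @Description("Use BlockingDataflowPipelineRunner and wait for the job to finish")
  @Default.Boolean(false)
  Boolean getIsBlocking();
  void setIsBlocking(Boolean value);

  @Description("GCS prefix for backup files")
  String getBackupGCSPrefix();
  void setBackupGCSPrefix(String value);

  @Description("Datastore entity kind to back up or restore")
  String getDatastoreEntityKind();
  void setDatastoreEntityKind(String value);

  @Description("BigQuery dataset used by the BigQuery backup example")
  String getBQDataset();
  void setBQDataset(String value);
}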


Example 8: run

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
public static void run(String[] args) {
  System.out.println("Making Datastore->GCS pipeline");

  Options options = PipelineOptionsFactory.fromArgs(args)
      .withValidation()
      .as(Options.class);

  if (options.getIsBlocking()) {
    options.setRunner(BlockingDataflowPipelineRunner.class);
  } else {
    options.setRunner(DataflowPipelineRunner.class);
    options.setStreaming(false);
  }


  // Add Timestamp + Entity Kind to backup files
  DateFormat dateFormat = new SimpleDateFormat("yyyy.MM.dd_HH.mm.ss");
  String outputLocation = String.format("%s.%s",
      options.getDatastoreEntityKind() ,
      dateFormat.format(new Date()));

  if (options.getBackupGCSPrefix().endsWith("/")) {
    outputLocation = options.getBackupGCSPrefix() + outputLocation + "/";
  } else {
    outputLocation = options.getBackupGCSPrefix() + "." + outputLocation + "/";
  }

  // Build our Datastore query.
  // Right now we are simply grabbing all Datastore records of a given kind,
  Query.Builder queryBuilder = Query.newBuilder();
  queryBuilder.addKindBuilder().setName(options.getDatastoreEntityKind());
  Query query = queryBuilder.build();

  // Generate the Datastore Read Source
  DatastoreV1.Read read = DatastoreIO.v1().read()
      .withProjectId(options.getProject())
      .withQuery(query);

  // Build our data pipeline
  Pipeline pipeline = Pipeline.create(options);
  pipeline.apply("IngestEntities", read)
          .apply("EntityToJson", ParDo.of(new DatastoreToJson()))
          .apply("WriteJson", TextIO.Write.to(outputLocation)
              .withSuffix(".json"));

  System.out.println("Running pipeline");
  pipeline.run();
}
 
Developer: cobookman | Project: DatastoreToGCS | Lines: 49 | Source: GCSBackup.java


Example 9: run

import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner; // import the required package/class
public static void run(String[] args) throws Exception {
  System.out.println("Making Datastore->GCS pipeline");

  Options options = PipelineOptionsFactory.fromArgs(args)
      .withValidation()
      .as(Options.class);

  if (options.getIsBlocking()) {
    options.setRunner(BlockingDataflowPipelineRunner.class);
  } else {
    options.setRunner(DataflowPipelineRunner.class);
    options.setStreaming(false);
  }

  String date = (new SimpleDateFormat("yyyMMdd")).format(new Date());

  // make string in format: "my-project:dataset.entity_date"
  String tableName = String.format("%s:%s.%s_%s",
      options.getProject(),
      options.getBQDataset(),
      options.getDatastoreEntityKind(),
      date);
  System.out.println("Destination BigQuery Table is: " + tableName);

  // Build our Datastore query.
  // Right now we are simply grabbing all Datastore records of a given kind,
  Query.Builder queryBuilder = Query.newBuilder();
  queryBuilder.addKindBuilder().setName(options.getDatastoreEntityKind());
  Query query = queryBuilder.build();

  // Generate the Datastore Read Source
  DatastoreV1.Read read = DatastoreIO.v1().read()
      .withProjectId(options.getProject())
      .withQuery(query);
  
  // Build our data pipeline
  Pipeline pipeline = Pipeline.create(options);
  pipeline.apply("IngestEntities", read)
          .apply("EntityToTableRows", ParDo.of(new DatastoreToTableRow()))
          .apply("WriteTableRows", BigQueryIO.Write
              .to(tableName)
              .withSchema(new BQSchema().getTableSchema())
              .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
              .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));

  System.out.println("Running pipeline");
  pipeline.run();
}
 
Developer: cobookman | Project: DatastoreToGCS | Lines: 49 | Source: BQBackup.java



Note: The com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar code and documentation platforms. Copyright of the source code remains with the original authors; consult each project's license before redistributing or reusing it. Do not reproduce this article without permission.

