Java Field Class Code Examples


This article collects typical usage examples of the Java class com.google.cloud.bigquery.Field. If you are wondering what the Field class is for and how to use it, the selected examples below should help.



The Field class belongs to the com.google.cloud.bigquery package. Twenty code examples of the class are shown below, ordered roughly by popularity.
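Most of the examples rely on two entry points: the Field.of(name, type) factory for plain columns, and Field.newBuilder(...) when a mode (NULLABLE, REQUIRED, REPEATED) must also be set. The minimal sketch below illustrates both, plus a nested RECORD column. Note that the snippets in this article were written against older google-cloud-bigquery releases in which column types are expressed through the nested Field.Type class; in current releases the equivalent calls take LegacySQLTypeName or StandardSQLTypeName instead. The field names here are made up for illustration.

import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;

public class FieldBasics {

  public static Schema buildSchema() {
    // Plain column: name + type
    Field id = Field.of("id", Field.Type.integer());

    // Builder form when a mode also needs to be set
    Field name = Field.newBuilder("name", Field.Type.string())
                      .setMode(Field.Mode.REQUIRED)
                      .build();

    // Nested RECORD column built from sub-fields
    Field address = Field.of("address", Field.Type.record(
        Field.of("city", Field.Type.string()),
        Field.of("zip", Field.Type.string())));

    return Schema.of(id, name, address);
  }
}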

Example 1: createSchema

import com.google.cloud.bigquery.Field; // import the required package/class
public Schema createSchema(){
 
 // TODO: Need further research on a common way to convert from an Entity to a BigQuery schema directly
 List<Field> fields = new ArrayList<Field>();
 fields.add(Field.of("sales_order_id", Field.Type.integer()));
 fields.add(Field.of("product_id", Field.Type.integer()));
 fields.add(Field.of("product_name", Field.Type.string()));
 fields.add(Field.of("product_price", Field.Type.floatingPoint()));
 fields.add(Field.of("user_id", Field.Type.integer()));
 
 fields.add(Field.of("user_firstname", Field.Type.string()));
 fields.add(Field.of("user_lastname", Field.Type.string()));
 fields.add(Field.of("company_id", Field.Type.integer()));
 fields.add(Field.of("company_name", Field.Type.string()));
 fields.add(Field.of("customer_id", Field.Type.integer()));
 
 fields.add(Field.of("customer_name", Field.Type.string()));
 fields.add(Field.of("status", Field.Type.string()));
 fields.add(Field.of("sales_order_date", Field.Type.timestamp()));
 fields.add(Field.of("sales_order_number", Field.Type.string()));

 Schema schema = Schema.of(fields);
 return schema;
}
 
Developer: michael-hll, Project: BigQueryStudy, Lines: 25, Source: BigQuerySnippets.java
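
As a hypothetical follow-up to Example 1 (not part of the original BigQuerySnippets code), the schema returned by createSchema() could be used to create the target table. The dataset and table names below are placeholders.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;

public void createSalesOrderTable() {
  BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
  TableId tableId = TableId.of("my_dataset", "sales_order"); // placeholder names
  // Wrap the schema from createSchema() in a standard table definition and create the table
  TableInfo tableInfo = TableInfo.of(tableId, StandardTableDefinition.of(createSchema()));
  bigquery.create(tableInfo);
}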


Example 2: convertSchema

import com.google.cloud.bigquery.Field; // import the required package/class
/**
 * Convert the Kafka {@link Schema} to a BigQuery {@link com.google.cloud.bigquery.Schema}, with
 * the addition of an optional field containing extra Kafka data.
 *
 * @param kafkaConnectSchema The schema to convert. Must be of type Struct, in order to translate
 *                           into a row format that requires each field to consist of both a name
 *                           and a value.
 * @return the converted {@link com.google.cloud.bigquery.Schema}, including an extra optional
 *         field for the Kafka topic, partition, and offset.
 */
public com.google.cloud.bigquery.Schema convertSchema(Schema kafkaConnectSchema) {
  com.google.cloud.bigquery.Schema.Builder schemaBuilder =
      super.convertSchema(kafkaConnectSchema).toBuilder();

  Field topicField = Field.of(KAFKA_DATA_TOPIC_FIELD_NAME, Field.Type.string());
  Field partitionField = Field.of(KAFKA_DATA_PARTITION_FIELD_NAME, Field.Type.integer());
  Field offsetField = Field.of(KAFKA_DATA_OFFSET_FIELD_NAME, Field.Type.integer());
  Field.Builder insertTimeBuilder = Field.newBuilder(KAFKA_DATA_INSERT_TIME_FIELD_NAME,
                                                  Field.Type.timestamp())
                                         .setMode(Field.Mode.NULLABLE);
  Field insertTimeField = insertTimeBuilder.build();

  Field.Builder kafkaDataFieldBuilder =
      Field.newBuilder(KAFKA_DATA_FIELD_NAME, Field.Type.record(topicField,
                                                             partitionField,
                                                             offsetField,
                                                             insertTimeField))
           .setMode(Field.Mode.NULLABLE);

  schemaBuilder.addField(kafkaDataFieldBuilder.build());

  return schemaBuilder.build();
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 34, Source: KafkaDataBQSchemaConverter.java


Example 3: testDecimalConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testDecimalConversion() {
  DecimalConverter converter = new DecimalConverter();

  assertEquals(Field.Type.floatingPoint(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.BYTES);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  BigDecimal bigDecimal = new BigDecimal("3.14159");

  BigDecimal convertedDecimal = converter.convert(bigDecimal);

  // expecting no-op
  assertEquals(bigDecimal, convertedDecimal);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 20, Source: KafkaLogicalConvertersTest.java


Example 4: test

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void test() {
  Schema kafkaConnectTestSchema =
      SchemaBuilder.struct().field("base", Schema.STRING_SCHEMA).build();


  Field kafkaDataField = getKafkaDataField();
  Field baseField = Field.newBuilder("base",
                                  Field.Type.string()).setMode(Field.Mode.REQUIRED).build();
  com.google.cloud.bigquery.Schema bigQueryExpectedSchema =
      com.google.cloud.bigquery.Schema.of(baseField, kafkaDataField);

  com.google.cloud.bigquery.Schema bigQueryActualSchema =
      new KafkaDataBQSchemaConverter().convertSchema(kafkaConnectTestSchema);
  assertEquals(bigQueryExpectedSchema, bigQueryActualSchema);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 17, Source: KafkaDataBQSchemaConverterTest.java


Example 5: getKafkaDataField

import com.google.cloud.bigquery.Field; // import the required package/class
private Field getKafkaDataField() {
  Field topicField = Field.of("topic", Field.Type.string());
  Field partitionField = Field.of("partition", Field.Type.integer());
  Field offsetField = Field.of("offset", Field.Type.integer());
  Field insertTimeField = Field.newBuilder("insertTime", Field.Type.timestamp())
                               .setMode(Field.Mode.NULLABLE)
                               .build();

  return Field.newBuilder("kafkaData",
                          Field.Type.record(topicField,
                                            partitionField,
                                            offsetField,
                                            insertTimeField))
              .setMode(Field.Mode.NULLABLE)
              .build();
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 17, Source: KafkaDataBQSchemaConverterTest.java


Example 6: fieldsToMap

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void fieldsToMap() throws Exception {
  Schema schema = createTestSchema();
  List<FieldValue> fieldValues = createTestValues();

  BigQueryDelegate delegate = new BigQueryDelegate(mockBigquery, useLegacySql);
  LinkedHashMap<String, com.streamsets.pipeline.api.Field> map = delegate.fieldsToMap(schema.getFields(), fieldValues);
  assertTrue(map.containsKey("a"));
  assertEquals("a string", map.get("a").getValueAsString());
  assertArrayEquals("bytes".getBytes(), map.get("b").getValueAsByteArray());
  List<com.streamsets.pipeline.api.Field> c = map.get("c").getValueAsList();
  assertEquals(1L, c.get(0).getValueAsLong());
  assertEquals(2L, c.get(1).getValueAsLong());
  assertEquals(3L, c.get(2).getValueAsLong());
  assertEquals(2.0d, map.get("d").getValueAsDouble(), 1e-15);
  assertEquals(true, map.get("e").getValueAsBoolean());
  assertEquals(new Date(1351700038292L), map.get("f").getValueAsDatetime());
  assertEquals(new Date(1351700038292L), map.get("g").getValueAsDatetime());
  assertEquals(new Date(1351700038292L), map.get("h").getValueAsDatetime());
  assertEquals(new Date(1351700038292L), map.get("i").getValueAsDate());
  Map<String, com.streamsets.pipeline.api.Field> j = map.get("j").getValueAsListMap();
  assertEquals("nested string", j.get("x").getValueAsString());
  Map<String, com.streamsets.pipeline.api.Field> y = j.get("y").getValueAsListMap();
  assertEquals("z", y.get("z").getValueAsString());
}
 
Developer: streamsets, Project: datacollector, Lines: 26, Source: TestBigQueryDelegate.java


Example 7: createTestSchema

import com.google.cloud.bigquery.Field; // import the required package/class
public static Schema createTestSchema() {
  return Schema.newBuilder()
      .addField(Field.of("a", Field.Type.string()))
      .addField(Field.of("b", Field.Type.bytes()))
      .addField(Field.newBuilder("c", Field.Type.integer()).setMode(Field.Mode.REPEATED).build())
      .addField(Field.of("d", Field.Type.floatingPoint()))
      .addField(Field.of("e", Field.Type.bool()))
      .addField(Field.of("f", Field.Type.timestamp()))
      .addField(Field.of("g", Field.Type.time()))
      .addField(Field.of("h", Field.Type.datetime()))
      .addField(Field.of("i", Field.Type.date()))
      .addField(Field.of("j",
          Field.Type.record(
              Field.of("x", Field.Type.string()),
              Field.of("y", Field.Type.record(Field.of("z", Field.Type.string())))
          )
      )).build();
}
 
Developer: streamsets, Project: datacollector, Lines: 19, Source: TestBigQueryDelegate.java


Example 8: guessBigQuerySchema

import com.google.cloud.bigquery.Field; // import the required package/class
public TableSchema guessBigQuerySchema(org.apache.avro.Schema schema) {
    List<org.apache.avro.Schema.Field> fields = schema.getFields();
    if (fields.size() == 0) {
        return null;
    }
    List<TableFieldSchema> bqFields = new ArrayList<>();
    for (org.apache.avro.Schema.Field field : fields) {
        bqFields.add(tryArrayFieldSchema(field));
    }
    return new TableSchema().setFields(bqFields);
}
 
Developer: Talend, Project: components, Lines: 12, Source: BigQueryAvroRegistry.java


Example 9: inferSchemaField

import com.google.cloud.bigquery.Field; // import the required package/class
private org.apache.avro.Schema inferSchemaField(Field field) {
    String name = field.getName();
    Field.Type sqlType = field.getType();
    Field.Mode mode = field.getMode();

    // Get the "basic" type of the field.
    org.apache.avro.Schema fieldSchema = inferSchemaFieldWithoutMode(field);

    // BigQuery fields are NULLABLE by default.
    if (Field.Mode.NULLABLE == mode || mode == null) {
        fieldSchema = AvroUtils.wrapAsNullable(fieldSchema);
    } else if (Field.Mode.REPEATED == mode) {
        // Determine if the field is an array.
        // https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#array-type
        fieldSchema = SchemaBuilder.array().items(fieldSchema);
    }
    return fieldSchema;
}
 
Developer: Talend, Project: components, Lines: 19, Source: BigQueryAvroRegistry.java
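
The wrapAsNullable call above comes from Talend's AvroUtils and is not shown in this article. Purely for illustration, and assuming it produces the usual ["null", <type>] union, the two mode branches of Example 9 could be written with the plain Avro API roughly as follows.

import java.util.Arrays;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

// Sketch only -- not Talend code.
private static Schema sketchModeWrapping(Schema base, com.google.cloud.bigquery.Field.Mode mode) {
  if (mode == null || mode == com.google.cloud.bigquery.Field.Mode.NULLABLE) {
    // NULLABLE (the BigQuery default): union of null and the base type
    return Schema.createUnion(Arrays.asList(Schema.create(Schema.Type.NULL), base));
  } else if (mode == com.google.cloud.bigquery.Field.Mode.REPEATED) {
    // REPEATED: Avro array of the base type
    return SchemaBuilder.array().items(base);
  }
  return base; // REQUIRED: keep the base type as-is
}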


Example 10: createOneSalesOrderRow

import com.google.cloud.bigquery.Field; // import the required package/class
public static JSONObject createOneSalesOrderRow(
		String sales_order_id,
		String product_id,
		String product_name,
		String product_price,
		String user_id,
		
		String user_firstname,
		String user_lastname,
		String company_id,
		String company_name,
		String customer_id,
		
		String customer_name,
		String status,
		String sales_order_date,
		String sales_order_number
		){
	JSONObject json = new JSONObject();
	Field.Mode.REQUIRED.toString(); // note: this call has no effect here
   	json.put("sales_order_id", sales_order_id);
   	json.put("product_id", product_id);
   	json.put("product_name", product_name);
   	json.put("product_price", product_price);
   	json.put("user_id", user_id);
   	
   	json.put("user_firstname", user_firstname);
   	json.put("user_lastname", user_lastname);
   	json.put("company_id", company_id);
   	json.put("company_name", company_name);
   	json.put("customer_id", customer_id);
   	
   	json.put("customer_name", customer_name);
   	json.put("status", status);
   	json.put("sales_order_date", sales_order_date);
   	json.put("sales_order_number", sales_order_number);
   	
   	return json;
}
 
Developer: michael-hll, Project: BigQueryStudy, Lines: 40, Source: SalesOrderHelper.java
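
A hypothetical next step (not part of SalesOrderHelper) is streaming such a row into BigQuery. InsertAllRequest.addRow expects a Map of column name to value; how the JSONObject above is converted to that Map depends on the JSON library the project uses, so a plain Map parameter is shown here instead.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.InsertAllRequest;
import com.google.cloud.bigquery.InsertAllResponse;
import com.google.cloud.bigquery.TableId;
import java.util.Map;

public static InsertAllResponse insertOneRow(BigQuery bigquery, TableId tableId, Map<String, Object> row) {
  InsertAllRequest request = InsertAllRequest.newBuilder(tableId)
                                             .addRow(row)
                                             .build();
  InsertAllResponse response = bigquery.insertAll(request);
  if (response.hasErrors()) {
    // Streaming inserts report per-row errors in the response rather than throwing
    throw new RuntimeException("Insert failed: " + response.getInsertErrors());
  }
  return response;
}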


Example 11: LogicalTypeConverter

import com.google.cloud.bigquery.Field; // import the required package/class
/**
 * Create a new LogicalTypeConverter.
 *
 * @param logicalName The name of the logical type.
 * @param encodingType The encoding type of the logical type.
 * @param bqSchemaType The corresponding BigQuery Schema type of the logical type.
 */
public LogicalTypeConverter(String logicalName,
                            Schema.Type encodingType,
                            Field.Type bqSchemaType) {
  this.logicalName = logicalName;
  this.encodingType = encodingType;
  this.bqSchemaType = bqSchemaType;
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 15, Source: LogicalTypeConverter.java


Example 12: convertField

import com.google.cloud.bigquery.Field; // import the required package/class
private Object convertField(Field fieldSchema, FieldValue field) {
  if (field.isNull()) {
    return null;
  }
  switch (field.getAttribute()) {
    case PRIMITIVE:
      switch (fieldSchema.getType().getValue()) {
        case BOOLEAN:
          return field.getBooleanValue();
        case BYTES:
          // Do this in order for assertEquals() to work when this is an element of two compared
          // lists
          return boxByteArray(field.getBytesValue());
        case FLOAT:
          return field.getDoubleValue();
        case INTEGER:
          return field.getLongValue();
        case STRING:
          return field.getStringValue();
        case TIMESTAMP:
          return field.getTimestampValue();
        default:
          throw new RuntimeException("Cannot convert primitive field type "
                                     + fieldSchema.getType());
      }
    case REPEATED:
      List<Object> result = new ArrayList<>();
      for (FieldValue arrayField : field.getRepeatedValue()) {
        result.add(convertField(fieldSchema, arrayField));
      }
      return result;
    case RECORD:
      List<Field> recordSchemas = fieldSchema.getFields();
      List<FieldValue> recordFields = field.getRecordValue();
      return convertRow(recordSchemas, recordFields);
    default:
      throw new RuntimeException("Unknown field attribute: " + field.getAttribute());
  }
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 40, Source: BigQueryConnectorIntegrationTest.java
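
The boxByteArray helper referenced in the BYTES branch is not included in the snippet above. A plausible implementation, assuming it only boxes the primitive array so that assertEquals() can compare list elements, would be:

import java.util.ArrayList;
import java.util.List;

// Assumed shape of the helper -- not the original source.
private static List<Byte> boxByteArray(byte[] bytes) {
  List<Byte> boxed = new ArrayList<>(bytes.length);
  for (byte b : bytes) {
    boxed.add(b);
  }
  return boxed;
}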


Example 13: convertRow

import com.google.cloud.bigquery.Field; // import the required package/class
private List<Object> convertRow(List<Field> rowSchema, List<FieldValue> row) {
  List<Object> result = new ArrayList<>();
  assert (rowSchema.size() == row.size());

  for (int i = 0; i < rowSchema.size(); i++) {
    result.add(convertField(rowSchema.get(i), row.get(i)));
  }

  return result;
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 11, Source: BigQueryConnectorIntegrationTest.java


Example 14: testDateConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testDateConversion() {
  DateConverter converter = new DateConverter();

  assertEquals(Field.Type.date(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.INT32);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }
  
  String formattedDate = converter.convert(DAYS_TIMESTAMP);
  assertEquals("2017-03-01", formattedDate);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java


Example 15: testMicroTimeConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testMicroTimeConversion() {
  MicroTimeConverter converter = new MicroTimeConverter();

  assertEquals(Field.Type.time(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.INT64);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  String formattedMicroTime = converter.convert(MICRO_TIMESTAMP);
  assertEquals("22:20:38.808123", formattedMicroTime);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java


Example 16: testMicroTimestampConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testMicroTimestampConversion() {
  MicroTimestampConverter converter = new MicroTimestampConverter();

  assertEquals(Field.Type.timestamp(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.INT64);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  String formattedMicroTimestamp = converter.convert(MICRO_TIMESTAMP);
  assertEquals("2017-03-01 22:20:38.808123", formattedMicroTimestamp);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java


Example 17: testTimeConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testTimeConversion() {
  TimeConverter converter = new TimeConverter();

  assertEquals(Field.Type.time(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.INT32);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  String formattedTime = converter.convert(MILLI_TIMESTAMP);
  assertEquals("22:20:38.808", formattedTime);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java


Example 18: testTimestampConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testTimestampConversion() {
  TimestampConverter converter = new TimestampConverter();

  assertEquals(Field.Type.timestamp(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.INT64);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  String formattedTimestamp = converter.convert(MILLI_TIMESTAMP);
  assertEquals("2017-03-01 22:20:38.808", formattedTimestamp);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java
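
The DAYS_TIMESTAMP, MICRO_TIMESTAMP and MILLI_TIMESTAMP constants used in Examples 14-18 are declared elsewhere in DebeziumLogicalConvertersTest and are not reproduced here. Values consistent with the expected strings ("2017-03-01", "22:20:38.808" and "22:20:38.808123") would be, for example:

// Assumed values only -- the real constants are defined elsewhere in the test class.
private static final int  DAYS_TIMESTAMP  = 17226;             // days since epoch -> 2017-03-01
private static final long MILLI_TIMESTAMP = 1488406838808L;    // ms since epoch   -> 2017-03-01 22:20:38.808 UTC
private static final long MICRO_TIMESTAMP = 1488406838808123L; // us since epoch   -> 2017-03-01 22:20:38.808123 UTC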


Example 19: testZonedTimestampConversion

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testZonedTimestampConversion() {
  ZonedTimestampConverter converter = new ZonedTimestampConverter();

  assertEquals(Field.Type.timestamp(), converter.getBQSchemaType());

  try {
    converter.checkEncodingType(Schema.Type.STRING);
  } catch (Exception ex) {
    fail("Expected encoding type check to succeed.");
  }

  String formattedTimestamp = converter.convert("2017-03-01T14:20:38.808-08:00");
  assertEquals("2017-03-01 14:20:38.808-08:00", formattedTimestamp);
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: DebeziumLogicalConvertersTest.java


Example 20: testBQTableDescription

import com.google.cloud.bigquery.Field; // import the required package/class
@Test
public void testBQTableDescription() {
  final String testTableName = "testTable";
  final String testDatasetName = "testDataset";
  final String testDoc = "test doc";
  final TableId tableId = TableId.of(testDatasetName, testTableName);

  SchemaRetriever mockSchemaRetriever = mock(SchemaRetriever.class);
  @SuppressWarnings("unchecked")
  SchemaConverter<com.google.cloud.bigquery.Schema> mockSchemaConverter =
      (SchemaConverter<com.google.cloud.bigquery.Schema>) mock(SchemaConverter.class);
  BigQuery mockBigQuery = mock(BigQuery.class);

  SchemaManager schemaManager = new SchemaManager(mockSchemaRetriever,
                                                  mockSchemaConverter,
                                                  mockBigQuery);

  Schema mockKafkaSchema = mock(Schema.class);
  // we would prefer to mock this class, but it is final.
  com.google.cloud.bigquery.Schema fakeBigQuerySchema =
      com.google.cloud.bigquery.Schema.of(Field.of("mock field", Field.Type.string()));

  when(mockSchemaConverter.convertSchema(mockKafkaSchema)).thenReturn(fakeBigQuerySchema);
  when(mockKafkaSchema.doc()).thenReturn(testDoc);

  TableInfo tableInfo = schemaManager.constructTableInfo(tableId, mockKafkaSchema);

  Assert.assertEquals("Kafka doc does not match BigQuery table description",
                      testDoc, tableInfo.getDescription());
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 31, Source: SchemaManagerTest.java
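
SchemaManager.constructTableInfo itself is not shown in this article. Judging only from the assertion above, a sketch of what it presumably does (convert the Kafka schema, wrap it in a table definition, and copy the Kafka schema's doc string into the table description) could look like this; schemaConverter stands in for the converter field held by SchemaManager and is an assumption here.

import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;

// Illustrative sketch, not the actual kafka-connect-bigquery implementation.
private TableInfo constructTableInfoSketch(TableId table, org.apache.kafka.connect.data.Schema kafkaSchema) {
  com.google.cloud.bigquery.Schema bqSchema = schemaConverter.convertSchema(kafkaSchema);
  return TableInfo.newBuilder(table, StandardTableDefinition.of(bqSchema))
                  .setDescription(kafkaSchema.doc())
                  .build();
}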



Note: The com.google.cloud.bigquery.Field examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source/documentation platforms. The code snippets remain the copyright of their original authors; consult each project's license before reusing or redistributing them. Do not repost without permission.

