
Java GroupType Class Code Examples


This article collects typical usage examples of the Java class parquet.schema.GroupType. If you are wondering what GroupType is for, or how to use it in practice, the curated class examples below may help.



The GroupType class belongs to the parquet.schema package. The following 20 code examples, drawn from open-source projects, illustrate its use; they are sorted by popularity by default.

Example 1: getType

import parquet.schema.GroupType; // import the required package/class
private parquet.schema.Type getType(MaterializedField field) {
  MinorType minorType = field.getType().getMinorType();
  DataMode dataMode = field.getType().getMode();
  switch(minorType) {
    case MAP:
      List<parquet.schema.Type> types = Lists.newArrayList();
      for (MaterializedField childField : field.getChildren()) {
        types.add(getType(childField));
      }
      return new GroupType(dataMode == DataMode.REPEATED ? Repetition.REPEATED : Repetition.OPTIONAL, field.getLastName(), types);
    case LIST:
      throw new UnsupportedOperationException("Unsupported type " + minorType);
    default:
      return getPrimitiveType(field);
  }
}
 
Developer: skhalifa | Project: QDrill | Lines: 17 | Source: ParquetRecordWriter.java


Example 2: writeTuple

import parquet.schema.GroupType; // import the required package/class
private void writeTuple(Tuple tuple, GroupType type) {
    for (int index = 0; index < type.getFieldCount(); index++) {
        Type fieldType = type.getType(index);
        String fieldName = fieldType.getName();
        // empty fields have to be omitted
        if (tuple.isNull(index))
            continue;
        recordConsumer.startField(fieldName, index);
        if (fieldType.isPrimitive()) {
            tuple.writePrimitiveValue(recordConsumer, index, (PrimitiveType)fieldType);
        }
        else {
            recordConsumer.startGroup();
            writeTuple(tuple.getTuple(index), fieldType.asGroupType());
            recordConsumer.endGroup();
        }
        recordConsumer.endField(fieldName, index);
    }
}
 
Developer: EXASOL | Project: hadoop-etl-udfs | Lines: 20 | Source: TupleWriter.java


Example 3: groupToCells

import parquet.schema.GroupType; // import the required package/class
/**
 * Transforms the data in a group into a list of cells
 * (List&lt;Cell&gt;, as used by {@link org.apache.hadoop.hbase.client.Result}).
 * @param group the Parquet group to convert
 * @return the list of HBase cells
 */
public static List<Cell> groupToCells(Group group){

    List<Cell> cells = new LinkedList<>();
    if(group != null){
        GroupType groupType = group.getType();
        List<Type> types = groupType.getFields();
        byte [] rowKey = group.getBinary(HConstants.ROW_KEY, 0).getBytes();

        long timestamp = group.getLong(HConstants.TIME_STAMP, 0);

        for(Type t : types){
            if(! t.getName().equals(HConstants.ROW_KEY) && ! t.getName().equals(HConstants.TIME_STAMP)){
                String name = t.getName();
                String [] names = name.split(":");
                if(names.length == 2) {
                    byte[] value = group.getBinary(name, 0).getBytes();
                    Cell cell = new KeyValue(rowKey, names[0].getBytes(), names[1].getBytes(), timestamp, value);
                    cells.add(cell);
                }
            }
        }
    }
    return cells;
}
 
Developer: grokcoder | Project: pbase | Lines: 31 | Source: PFileReader.java


Example 4: ParquetStructConverter

import parquet.schema.GroupType; // import the required package/class
public ParquetStructConverter(Type prestoType, String columnName, GroupType entryType, int fieldIndex)
{
    checkArgument(ROW.equals(prestoType.getTypeSignature().getBase()));
    List<Type> prestoTypeParameters = prestoType.getTypeParameters();
    List<parquet.schema.Type> fieldTypes = entryType.getFields();
    checkArgument(prestoTypeParameters.size() == fieldTypes.size());

    this.rowType = prestoType;
    this.fieldIndex = fieldIndex;

    ImmutableList.Builder<BlockConverter> converters = ImmutableList.builder();
    for (int i = 0; i < prestoTypeParameters.size(); i++) {
        parquet.schema.Type fieldType = fieldTypes.get(i);
        converters.add(createConverter(prestoTypeParameters.get(i), columnName + "." + fieldType.getName(), fieldType, i));
    }
    this.converters = converters.build();
}
 
Developer: y-lan | Project: presto | Lines: 18 | Source: ParquetHiveRecordCursor.java


Example 5: writeRecordFields

import parquet.schema.GroupType; // import the required package/class
private void writeRecordFields(GroupType schema, Schema tajoSchema,
                               Tuple tuple) {
  List<Type> fields = schema.getFields();
  // Parquet ignores Tajo NULL_TYPE columns, so the index may differ.
  int index = 0;
  for (int tajoIndex = 0; tajoIndex < tajoSchema.size(); ++tajoIndex) {
    Column column = tajoSchema.getColumn(tajoIndex);
    if (column.getDataType().getType() == TajoDataTypes.Type.NULL_TYPE) {
      continue;
    }
    Datum datum = tuple.get(tajoIndex);
    Type fieldType = fields.get(index);
    if (!tuple.isNull(tajoIndex)) {
      recordConsumer.startField(fieldType.getName(), index);
      writeValue(fieldType, column, datum);
      recordConsumer.endField(fieldType.getName(), index);
    } else if (fieldType.isRepetition(Type.Repetition.REQUIRED)) {
      throw new RuntimeException("Null-value for required field: " +
          column.getSimpleName());
    }
    ++index;
  }
}
 
Developer: gruter | Project: tajo-cdh | Lines: 24 | Source: TajoWriteSupport.java
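Examples 5 and 9 share the same dual-index pattern: the source schema may contain columns that the Parquet schema omits (Tajo NULL_TYPE columns, Avro nulls), so the Parquet field index advances only for columns that are actually written. The following is a minimal, self-contained sketch of that index-skipping logic in plain Java; the class and method names are hypothetical and do not use the Parquet API.

```java
import java.util.Arrays;
import java.util.List;

public class DualIndexWalk {
    /**
     * Maps each source column to its Parquet field index, or -1 if the
     * column is skipped (i.e., has no counterpart in the Parquet schema).
     */
    static int[] mapIndices(List<String> sourceColumns, List<String> skipped) {
        int[] mapping = new int[sourceColumns.size()];
        int parquetIndex = 0;
        for (int i = 0; i < sourceColumns.size(); i++) {
            if (skipped.contains(sourceColumns.get(i))) {
                mapping[i] = -1;             // column has no Parquet counterpart
            } else {
                mapping[i] = parquetIndex++; // advance only for written columns
            }
        }
        return mapping;
    }

    public static void main(String[] args) {
        int[] m = mapIndices(Arrays.asList("a", "nullCol", "b"),
                             Arrays.asList("nullCol"));
        System.out.println(Arrays.toString(m)); // [0, -1, 1]
    }
}
```

The same invariant appears in both examples: after the loop, `parquetIndex` equals the Parquet schema's field count, not the source schema's column count.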


Example 6: convertField

import parquet.schema.GroupType; // import the required package/class
@Override
Object convertField(JsonElement value) {
  ParquetGroup r1 = new ParquetGroup((GroupType) schema());
  JsonObject inputRecord = value.getAsJsonObject();
  for (Map.Entry<String, JsonElement> entry : inputRecord.entrySet()) {
    String key = entry.getKey();
    JsonElementConverter converter = this.converters.get(key);
    Object convertedValue = converter.convert(entry.getValue());
    boolean valueIsNull = convertedValue == null;
    Type.Repetition repetition = converter.jsonSchema.optionalOrRequired();
    if (valueIsNull && repetition.equals(OPTIONAL)) {
      continue;
    }
    r1.add(key, convertedValue);
  }
  return r1;
}
 
Developer: apache | Project: incubator-gobblin | Lines: 18 | Source: JsonElementConversionFactory.java


Example 7: buildSchema

import parquet.schema.GroupType; // import the required package/class
private Type buildSchema() {
  JsonArray inputSchema = this.jsonSchema.getDataTypeValues();
  List<Type> parquetTypes = new ArrayList<>();
  for (JsonElement element : inputSchema) {
    JsonObject map = (JsonObject) element;
    JsonSchema elementSchema = new JsonSchema(map);
    String columnName = elementSchema.getColumnName();
    JsonElementConverter converter = JsonElementConversionFactory.getConverter(elementSchema, false);
    Type schemaType = converter.schema();
    this.converters.put(columnName, converter);
    parquetTypes.add(schemaType);
  }
  String docName = this.jsonSchema.getColumnName();
  switch (recordType) {
    case ROOT:
      return new MessageType(docName, parquetTypes);
    case CHILD:
      return new GroupType(this.jsonSchema.optionalOrRequired(), docName, parquetTypes);
    default:
      throw new RuntimeException("Unsupported Record type");
  }
}
 
Developer: apache | Project: incubator-gobblin | Lines: 23 | Source: JsonElementConversionFactory.java


Example 8: AvroUnionConverter

import parquet.schema.GroupType; // import the required package/class
public AvroUnionConverter(ParentValueContainer parent, Type parquetSchema,
                          Schema avroSchema) {
  this.parent = parent;
  GroupType parquetGroup = parquetSchema.asGroupType();
  this.memberConverters = new Converter[ parquetGroup.getFieldCount()];

  int parquetIndex = 0;
  for (int index = 0; index < avroSchema.getTypes().size(); index++) {
    Schema memberSchema = avroSchema.getTypes().get(index);
    if (!memberSchema.getType().equals(Schema.Type.NULL)) {
      Type memberType = parquetGroup.getType(parquetIndex);
      memberConverters[parquetIndex] = newConverter(memberSchema, memberType, new ParentValueContainer() {
        @Override
        void add(Object value) {
          Preconditions.checkArgument(memberValue==null, "Union is resolving to more than one type");
          memberValue = value;
        }
      });
      parquetIndex++; // note: for NULL members the parquetIndex is not incremented
    }
  }
}
 
Developer: Datasio | Project: cascalog-avro-parquet | Lines: 23 | Source: HMAvroConverter.java


Example 9: writeRecordFields

import parquet.schema.GroupType; // import the required package/class
private void writeRecordFields(GroupType schema, Schema avroSchema,
                               Map record) {
  List<Type> fields = schema.getFields();
  List<Schema.Field> avroFields = avroSchema.getFields();
  int index = 0; // parquet ignores Avro nulls, so index may differ
  for (int avroIndex = 0; avroIndex < avroFields.size(); avroIndex++) {
    Schema.Field avroField = avroFields.get(avroIndex);
    if (avroField.schema().getType().equals(Schema.Type.NULL)) {
      continue;
    }
    Type fieldType = fields.get(index);
    Object value = record.get(fieldKeyword.invoke(avroField));
    if (value != null) {
      recordConsumer.startField(fieldType.getName(), index);
      writeValue(fieldType, avroField.schema(), value);
      recordConsumer.endField(fieldType.getName(), index);
    } else if (fieldType.isRepetition(Type.Repetition.REQUIRED)) {
      throw new RuntimeException("Null-value for required field: " + avroField.name());
    }
    index++;
  }
}
 
Developer: Datasio | Project: cascalog-avro-parquet | Lines: 23 | Source: HMAvroWriteSupport.java


Example 10: getType

import parquet.schema.GroupType; // import the required package/class
private static Type getType(String[] pathSegments, int depth, MessageType schema) {
  Type type = schema.getType(Arrays.copyOfRange(pathSegments, 0, depth + 1));
  if (depth + 1 == pathSegments.length) {
    return type;
  } else {
    Preconditions.checkState(!type.isPrimitive());
    return new GroupType(type.getRepetition(), type.getName(), getType(pathSegments, depth + 1, schema));
  }
}
 
Developer: skhalifa | Project: QDrill | Lines: 10 | Source: DrillParquetReader.java


Example 11: getOriginalType

import parquet.schema.GroupType; // import the required package/class
private OriginalType getOriginalType(Type type, String[] path, int depth) {
  if (type.isPrimitive()) {
    return type.getOriginalType();
  }
  Type t = ((GroupType) type).getType(path[depth]);
  return getOriginalType(t, path, depth + 1);
}
 
Developer: skhalifa | Project: QDrill | Lines: 8 | Source: Metadata.java


Example 12: contains

import parquet.schema.GroupType; // import the required package/class
private boolean contains(GroupType group, String[] path, int index) {
    if (index == path.length) {
        return false;
    }
    if (group.containsField(path[index])) {
        Type type = group.getType(path[index]);
        if (type.isPrimitive()) {
            return index + 1 == path.length;
        } else {
            return contains(type.asGroupType(), path, index + 1);
        }
    }
    return false;
}
 
Developer: grokcoder | Project: pbase | Lines: 15 | Source: InternalParquetRecordReader.java
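The recursion in `contains` encodes a subtle rule: a path matches only if it terminates exactly on a primitive field, so both a path that ends early (on a group) and a path that runs past a primitive return false. Here is a self-contained sketch of the same logic, assuming nested Maps as a stand-in for GroupType and any non-Map value as a stand-in for PrimitiveType; it is an illustration, not the Parquet API.

```java
import java.util.Map;

public class PathContains {
    /**
     * Returns true only when the path resolves, level by level, to a leaf
     * (non-Map) value and the leaf is the final path segment.
     */
    static boolean contains(Map<String, Object> group, String[] path, int index) {
        if (index == path.length) {
            return false; // path ended on a group, not a primitive
        }
        Object child = group.get(path[index]);
        if (child == null) {
            return false; // field not present at this level
        }
        if (!(child instanceof Map)) {
            return index + 1 == path.length; // primitive must be the last segment
        }
        @SuppressWarnings("unchecked")
        Map<String, Object> childGroup = (Map<String, Object>) child;
        return contains(childGroup, path, index + 1);
    }

    public static void main(String[] args) {
        Map<String, Object> schema = Map.of("a", Map.of("b", 1));
        System.out.println(contains(schema, new String[]{"a", "b"}, 0)); // true
        System.out.println(contains(schema, new String[]{"a"}, 0));      // false
    }
}
```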


Example 13: createGroupConverter

import parquet.schema.GroupType; // import the required package/class
private static GroupedConverter createGroupConverter(Type prestoType, String columnName, parquet.schema.Type parquetType, int fieldIndex)
{
    GroupType groupType = parquetType.asGroupType();
    switch (prestoType.getTypeSignature().getBase()) {
        case ARRAY:
            return new ParquetListConverter(prestoType, columnName, groupType, fieldIndex);
        case MAP:
            return new ParquetMapConverter(prestoType, columnName, groupType, fieldIndex);
        case ROW:
            return new ParquetStructConverter(prestoType, columnName, groupType, fieldIndex);
        default:
            throw new IllegalArgumentException("Column " + columnName + " type " + parquetType.getOriginalType() + " not supported");
    }
}
 
Developer: y-lan | Project: presto | Lines: 15 | Source: ParquetHiveRecordCursor.java


Example 14: ParquetListConverter

import parquet.schema.GroupType; // import the required package/class
public ParquetListConverter(Type prestoType, String columnName, GroupType listType, int fieldIndex)
{
    checkArgument(listType.getFieldCount() == 1,
            "Expected LIST column '%s' to only have one field, but has %s fields",
            columnName,
            listType.getFieldCount());
    checkArgument(ARRAY.equals(prestoType.getTypeSignature().getBase()));

    this.arrayType = prestoType;
    this.fieldIndex = fieldIndex;

    // The Parquet specification requires that the element value of a
    // LIST type be wrapped in an inner repeated group, like so:
    //
    // optional group listField (LIST) {
    //   repeated group list {
    //     optional int32 element
    //   }
    // }
    //
    // However, some parquet libraries don't follow this spec. The
    // compatibility rules used here are specified in the Parquet
    // documentation at http://git.io/vOpNz.
    parquet.schema.Type elementType = listType.getType(0);
    if (isElementType(elementType, listType.getName())) {
        elementConverter = createConverter(prestoType.getTypeParameters().get(0), columnName + ".element", elementType, 0);
    }
    else {
        elementConverter = new ParquetListEntryConverter(prestoType.getTypeParameters().get(0), columnName, elementType.asGroupType());
    }
}
 
Developer: y-lan | Project: presto | Lines: 32 | Source: ParquetHiveRecordCursor.java
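The `isElementType` check referenced above decides whether the repeated child under a LIST annotation is the element itself (legacy writers) or the spec-mandated wrapper group. The Parquet backward-compatibility rules say the repeated child is the element when it is a primitive, a group with more than one field, or carries one of the legacy names. Below is a hedged, plain-Java sketch of that decision, with the structural facts passed in as parameters rather than read from a real GroupType:

```java
public class ListElementCheck {
    /**
     * Decides whether the repeated field nested under a LIST is itself the
     * element (legacy layout) rather than the standard wrapper group.
     */
    static boolean isElementType(boolean isPrimitive, int groupFieldCount,
                                 String repeatedName, String listName) {
        return isPrimitive                               // repeated primitive is the element
            || groupFieldCount > 1                       // multi-field group is a struct element
            || repeatedName.equals("array")              // legacy writers named the element "array"
            || repeatedName.equals(listName + "_tuple"); // legacy Thrift naming convention
    }
}
```

Only when none of these cases apply does the code above fall through to `ParquetListEntryConverter`, which unwraps the single-field repeated group.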


Example 15: ParquetListEntryConverter

import parquet.schema.GroupType; // import the required package/class
public ParquetListEntryConverter(Type prestoType, String columnName, GroupType elementType)
{
    checkArgument(elementType.getOriginalType() == null,
            "Expected LIST column '%s' field to be type STRUCT, but is %s",
            columnName,
            elementType);

    checkArgument(elementType.getFieldCount() == 1,
            "Expected LIST column '%s' element to have one field, but has %s fields",
            columnName,
            elementType.getFieldCount());

    elementConverter = createConverter(prestoType, columnName + ".element", elementType.getType(0), 0);
}
 
Developer: y-lan | Project: presto | Lines: 15 | Source: ParquetHiveRecordCursor.java


Example 16: ParquetMapConverter

import parquet.schema.GroupType; // import the required package/class
public ParquetMapConverter(Type type, String columnName, GroupType mapType, int fieldIndex)
{
    checkArgument(mapType.getFieldCount() == 1,
            "Expected MAP column '%s' to only have one field, but has %s fields",
            mapType.getName(),
            mapType.getFieldCount());

    this.mapType = type;
    this.fieldIndex = fieldIndex;

    parquet.schema.Type entryType = mapType.getFields().get(0);

    entryConverter = new ParquetMapEntryConverter(type, columnName + ".entry", entryType.asGroupType());
}
 
Developer: y-lan | Project: presto | Lines: 15 | Source: ParquetHiveRecordCursor.java


Example 17: ParquetMapEntryConverter

import parquet.schema.GroupType; // import the required package/class
public ParquetMapEntryConverter(Type prestoType, String columnName, GroupType entryType)
{
    checkArgument(MAP.equals(prestoType.getTypeSignature().getBase()));
    // original version of parquet used null for entry due to a bug
    if (entryType.getOriginalType() != null) {
        checkArgument(entryType.getOriginalType() == MAP_KEY_VALUE,
                "Expected MAP column '%s' field to be type %s, but is %s",
                columnName,
                MAP_KEY_VALUE,
                entryType);
    }

    GroupType entryGroupType = entryType.asGroupType();
    checkArgument(entryGroupType.getFieldCount() == 2,
            "Expected MAP column '%s' entry to have two fields, but has %s fields",
            columnName,
            entryGroupType.getFieldCount());
    checkArgument(entryGroupType.getFieldName(0).equals("key"),
            "Expected MAP column '%s' entry field 0 to be named 'key', but is named %s",
            columnName,
            entryGroupType.getFieldName(0));
    checkArgument(entryGroupType.getFieldName(1).equals("value"),
            "Expected MAP column '%s' entry field 1 to be named 'value', but is named %s",
            columnName,
            entryGroupType.getFieldName(1));
    checkArgument(entryGroupType.getType(0).isPrimitive(),
            "Expected MAP column '%s' entry field 0 to be primitive, but is %s",
            columnName,
            entryGroupType.getType(0));

    keyConverter = createConverter(prestoType.getTypeParameters().get(0), columnName + ".key", entryGroupType.getFields().get(0), 0);
    valueConverter = createConverter(prestoType.getTypeParameters().get(1), columnName + ".value", entryGroupType.getFields().get(1), 1);
}
 
Developer: y-lan | Project: presto | Lines: 34 | Source: ParquetHiveRecordCursor.java


Example 18: TajoRecordConverter

import parquet.schema.GroupType; // import the required package/class
/**
 * Creates a new TajoRecordConverter.
 *
 * @param parquetSchema The Parquet schema of the projection.
 * @param tajoReadSchema The Tajo schema of the table.
 * @param projectionMap An array mapping the projection column to the column
 *                      index in the table.
 */
public TajoRecordConverter(GroupType parquetSchema, Schema tajoReadSchema,
                           int[] projectionMap) {
  this.parquetSchema = parquetSchema;
  this.tajoReadSchema = tajoReadSchema;
  this.projectionMap = projectionMap;
  this.tupleSize = tajoReadSchema.size();

  // The projectionMap.length does not match parquetSchema.getFieldCount()
  // when the projection contains NULL_TYPE columns. We will skip over the
  // NULL_TYPE columns when we construct the converters and populate the
  // NULL_TYPE columns with NullDatums in start().
  int index = 0;
  this.converters = new Converter[parquetSchema.getFieldCount()];
  for (int i = 0; i < projectionMap.length; ++i) {
    final int projectionIndex = projectionMap[i];
    Column column = tajoReadSchema.getColumn(projectionIndex);
    if (column.getDataType().getType() == TajoDataTypes.Type.NULL_TYPE) {
      continue;
    }
    Type type = parquetSchema.getType(index);
    converters[index] = newConverter(column, type, new ParentValueContainer() {
      @Override
      void add(Object value) {
        TajoRecordConverter.this.set(projectionIndex, value);
      }
    });
    ++index;
  }
}
 
Developer: gruter | Project: tajo-cdh | Lines: 38 | Source: TajoRecordConverter.java


Example 19: ParquetGroup

import parquet.schema.GroupType; // import the required package/class
public ParquetGroup(GroupType schema) {
  this.schema = schema;
  this.data = new List[schema.getFields().size()];

  for (int i = 0; i < schema.getFieldCount(); ++i) {
    this.data[i] = new ArrayList();
  }
}
 
Developer: apache | Project: incubator-gobblin | Lines: 9 | Source: ParquetGroup.java


Example 20: writeArray

import parquet.schema.GroupType; // import the required package/class
private void writeArray(GroupType schema, Schema avroSchema,
                            List array) {
  recordConsumer.startGroup(); // group wrapper (original type LIST)
  if (array.iterator().hasNext()) {
    recordConsumer.startField("array", 0);
    for (Object elt : array) {
      writeValue(schema.getType(0), avroSchema.getElementType(), elt);
    }
    recordConsumer.endField("array", 0);
  }
  recordConsumer.endGroup();
}
 
Developer: Datasio | Project: cascalog-avro-parquet | Lines: 13 | Source: HMAvroWriteSupport.java



Note: the parquet.schema.GroupType examples in this article were collected from GitHub, MSDocs, and other source-code and documentation platforms. The snippets are taken from open-source projects, and copyright remains with the original authors. Consult each project's license before using or redistributing the code; do not republish without permission.

