
Java AvroParquetReader Class Code Examples


This article collects typical usage examples of the Java class org.apache.parquet.avro.AvroParquetReader. If you are wondering what the AvroParquetReader class does, how to use it, or what real-world code that uses it looks like, the curated examples below should help.



The AvroParquetReader class belongs to the org.apache.parquet.avro package. Ten code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
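Before the examples, here is a minimal read-loop sketch using the builder API (a sketch only, not taken from the projects below; the class name and the path /tmp/example.parquet are placeholders, and a default Hadoop configuration is assumed):

import java.io.IOException;

import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.hadoop.ParquetReader;

public class AvroParquetReadSketch {
  public static void main(String[] args) throws IOException {
    // Placeholder input path; point this at a real Parquet file.
    Path parquetPath = new Path("/tmp/example.parquet");

    // The builder materializes each Parquet row as an Avro GenericRecord.
    try (ParquetReader<GenericRecord> reader =
             AvroParquetReader.<GenericRecord>builder(parquetPath).build()) {
      GenericRecord record;
      // read() returns null once all records have been consumed.
      while ((record = reader.read()) != null) {
        System.out.println(record);
      }
    }
  }
}

ParquetReader implements Closeable, so the try-with-resources statement releases the underlying file handle automatically.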

Example 1: getSchema

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
@Override
public DatasetJsonRecord getSchema(Path targetFilePath)
  throws IOException {
  System.out.println("parquet file path : " + targetFilePath.toUri().getPath());

  SeekableInput sin = new FsInput(targetFilePath, fs.getConf());
  ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(targetFilePath).build();

  String schemaString = reader.read().getSchema().toString();
  String storage = STORAGE_TYPE;
  String abstractPath = targetFilePath.toUri().getPath();

  FileStatus fstat = fs.getFileStatus(targetFilePath);
  // TODO set codec
  DatasetJsonRecord datasetJsonRecord =
    new DatasetJsonRecord(schemaString, abstractPath, fstat.getModificationTime(), fstat.getOwner(), fstat.getGroup(),
      fstat.getPermission().toString(), null, storage, "");
  reader.close();
  sin.close();
  return datasetJsonRecord;
}
 
Developer: linkedin, Project: WhereHows, Lines: 22, Source: ParquetFileAnalyzer.java


Example 2: getSampleData

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
@Override
public SampleDataRecord getSampleData(Path targetFilePath)
  throws IOException {
  ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(targetFilePath).build();

  Iterator<GenericRecord> iter = Collections.singletonList(reader.read()).iterator();
  int count = 0;
  List<Object> list = new ArrayList<Object>();
  //JSONArray list = new JSONArray();
  while (iter.hasNext() && count < 10) {
    // TODO handle out of memory error
    list.add(iter.next().toString().replaceAll("[\\n\\r\\p{C}]", "").replaceAll("\"", "\\\""));
    count++;
  }
  SampleDataRecord sampleDataRecord = new SampleDataRecord(targetFilePath.toUri().getPath(), list);

  return sampleDataRecord;
}
 
Developer: linkedin, Project: WhereHows, Lines: 19, Source: ParquetFileAnalyzer.java


Example 3: validateParquetFile

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
public void validateParquetFile(Path parquetFile, long recordCount) throws IOException {
  ParquetReader reader = AvroParquetReader.builder(parquetFile)
    .build();

  for(long i = 0; i < recordCount; i++) {
    GenericData.Record actualRow = (GenericData.Record) reader.read();
    Assert.assertNotNull("Can't read row " + i, actualRow);

    Assert.assertEquals("Value different in row " + i + " for key b", actualRow.get("b"), i % 2 == 0);
    Assert.assertEquals("Value different in row " + i + " for key s", actualRow.get("s"), new Utf8(String.valueOf(i)));
    Assert.assertEquals("Value different in row " + i + " for key l", actualRow.get("l"), i);
    Assert.assertEquals("Value different in row " + i + " for key l100", actualRow.get("l100"), i%100);
    Assert.assertEquals("Value different in row " + i + " for key s100", actualRow.get("s100"), new Utf8(String.valueOf(i % 100)));
  }

  Assert.assertNull("Parquet file contains more than expected rows", reader.read());
}
 
Developer: streamsets, Project: datacollector, Lines: 18, Source: LargeInputFileIT.java


Example 4: validateParquetFile

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
public void validateParquetFile(Path parquetFile, List<Map<String, Object>> data) throws IOException {
  ParquetReader reader = AvroParquetReader.builder(parquetFile)
    .build();

  int position = 0;
  for(Map<String, Object> expectedRow : data) {
    GenericData.Record actualRow = (GenericData.Record) reader.read();
    Assert.assertNotNull("Can't read row " + position, actualRow);

    for(Map.Entry<String, Object> entry : expectedRow.entrySet()) {
      Object value = actualRow.get(entry.getKey());
      Assert.assertEquals("Different value on row " + position + " for key " + entry.getKey(), entry.getValue(), value);
    }
  }

  Assert.assertNull("Parquet file contains more than expected rows", reader.read());
}
 
Developer: streamsets, Project: datacollector, Lines: 18, Source: BaseAvroParquetConvertIT.java


Example 5: initReader

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
private ParquetReader<GenericRecord> initReader() throws IOException {
    Configuration configuration = getFs().getConf();
    if (this.schema != null) {
        AvroReadSupport.setAvroReadSchema(configuration, this.schema);
    }
    if (this.projection != null) {
        AvroReadSupport.setRequestedProjection(configuration, this.projection);
    }
    ParquetReader reader = AvroParquetReader.<GenericRecord>builder(getFilePath())
            .withConf(configuration).build();
    return reader;
}
 
Developer: mmolimar, Project: kafka-connect-fs, Lines: 13, Source: ParquetFileReader.java


Example 6: read

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
/**
 * Read the contents of a parquet file.
 *
 * @param parquetPath path of the parquet file to read
 */
public void read(String parquetPath) {
    AvroParquetReader<GenericRecord> reader = null;
    try {
        reader = new AvroParquetReader<GenericRecord>(new Path(parquetPath));
        GenericRecord result = reader.read();
        System.out.println(result.getSchema());
        while ((result = reader.read()) != null) {
            System.out.println(result);
        }
        reader.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
 
Developer: mumuhadoop, Project: mumu-parquet, Lines: 20, Source: AvroParquetOperation.java


Example 7: assertReadParquetFile

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
/**
 * Tests that a file (or directory) on the HDFS cluster contains the given parquet records.
 *
 * @param fs the HDFS file system to read from
 * @param path the name of the file on the HDFS cluster
 * @param expected the expected avro records in the file
 * @param part whether this call checks a partial (per-part) file inside a directory, in which case the completeness check is deferred to the caller
 */
public static void assertReadParquetFile(FileSystem fs, String path, Set<IndexedRecord> expected, boolean part) throws IOException {
    Path p = new Path(path);
    if (fs.isFile(p)) {
        try (AvroParquetReader<GenericRecord> reader = new AvroParquetReader<GenericRecord>(fs.getConf(), new Path(path))) {
            IndexedRecord record = null;
            while (null != (record = reader.read())){
                IndexedRecord eqRecord = null;
                for (IndexedRecord indexedRecord : expected) {
                    if(indexedRecord.equals(record)){
                        eqRecord = indexedRecord;
                        break;
                    }
                }
                expected.remove(eqRecord);
            }
        }
        // Check before asserting for the message.
        if (!part && expected.size() != 0)
            assertThat("Not all avro records found: " + expected.iterator().next(), expected, hasSize(0));
    } else if (fs.isDirectory(p)) {
        for (FileStatus fstatus : FileSystemUtil.listSubFiles(fs, p)) {
            assertReadParquetFile(fs, fstatus.getPath().toString(), expected, true);
        }
        // Check before asserting for the message.
        if (expected.size() != 0)
            assertThat("Not all avro records found: " + expected.iterator().next(), expected, hasSize(0));
    } else {
        fail("No such path: " + path);
    }
}
 
Developer: Talend, Project: components, Lines: 37, Source: MiniDfsResource.java


Example 8: AvroParquetFileReader

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
public AvroParquetFileReader(LogFilePath logFilePath, CompressionCodec codec) throws IOException {
    Path path = new Path(logFilePath.getLogFilePath());
    String topic = logFilePath.getTopic();
    Schema schema = schemaRegistryClient.getSchema(topic);
    reader = AvroParquetReader.<GenericRecord>builder(path).build();
    writer = new SpecificDatumWriter(schema);
    offset = logFilePath.getOffset();
}
 
Developer: pinterest, Project: secor, Lines: 9, Source: AvroParquetFileReaderWriterFactory.java


Example 9: serializeToByteBuffer

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
/**
 * Serialize Avro data to an in-memory ByteBuffer.
 * @return A ByteBuffer that contains avro data.
 * @throws IOException if the parquet file couldn't be parsed correctly.
 */
public ByteBuffer serializeToByteBuffer() throws IOException {
  final ByteArrayOutputStream stream = new ByteArrayOutputStream();
  final Encoder encoder = EncoderFactory.get().binaryEncoder(stream, null);
  final DatumWriter writer = new GenericDatumWriter<GenericRecord>();
  writer.setSchema(createAvroSchema());
  final AvroParquetReader<GenericRecord> reader = createAvroReader();

  GenericRecord record = reader.read();
  while (record != null) {
    writer.write(record, encoder);
    record = reader.read();
  }

  try {
    reader.close();
  } catch (IOException ex){
    LOG.log(Level.SEVERE, ex.getMessage());
    throw ex;
  }

  encoder.flush();
  final ByteBuffer buf = ByteBuffer.wrap(stream.toByteArray());
  buf.order(ByteOrder.LITTLE_ENDIAN);
  return buf;
}
 
Developer: apache, Project: reef, Lines: 31, Source: ParquetReader.java


Example 10: createAvroReader

import org.apache.parquet.avro.AvroParquetReader; // import the required package/class
/**
 * Construct an avro reader from parquet file.
 * @return avro reader based on the provided parquet file.
 * @throws IOException if the parquet file couldn't be parsed correctly.
 */
private AvroParquetReader<GenericRecord> createAvroReader() throws IOException {
  return new AvroParquetReader<GenericRecord>(parquetFilePath);
}
 
Developer: apache, Project: reef, Lines: 9, Source: ParquetReader.java
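Note: in recent parquet-avro releases the AvroParquetReader constructors used in Examples 6, 7 and 10 are marked deprecated in favor of the builder API. A rough builder-based equivalent of Example 10's createAvroReader (a sketch only, assuming the same parquetFilePath field; the return type becomes ParquetReader<GenericRecord>, since that is what the builder produces) would be:

private ParquetReader<GenericRecord> createAvroReader() throws IOException {
  // Builder-based replacement for the deprecated constructor call above.
  return AvroParquetReader.<GenericRecord>builder(parquetFilePath).build();
}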



Note: the org.apache.parquet.avro.AvroParquetReader examples in this article were collected from source-code and documentation platforms such as GitHub/MSDocs. The snippets are taken from open-source projects contributed by their respective developers, and copyright remains with the original authors. Please refer to each project's license before redistributing or using the code; do not republish without permission.

