Java Option Class Code Examples

This article collects typical usage examples of org.apache.hadoop.io.SequenceFile.Reader.Option in Java. If you are unsure what the Option class does, how to use it, or what working examples look like, the selected code examples below should help.

The Option type is nested inside org.apache.hadoop.io.SequenceFile.Reader. Seven code examples of the Option class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
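As a quick orientation before the individual examples, here is a minimal sketch of the pattern they all follow: obtain a Reader.Option from one of the static factory methods on SequenceFile.Reader (file(Path) for a file path, or stream(FSDataInputStream) for an already-open stream, as in Example 1) and pass it to the SequenceFile.Reader constructor. The path /tmp/example.seq and the Text key/value types are illustrative assumptions, not taken from the examples that follow.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.SequenceFile.Reader.Option;
import org.apache.hadoop.io.Text;

public class ReaderOptionSketch {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();

    // Reader.Option values come from static factory methods on SequenceFile.Reader.
    // file(Path) names the file to read; stream(FSDataInputStream) instead wraps an
    // already-open stream (see Example 1 below).
    Option fileOpt = SequenceFile.Reader.file(new Path("/tmp/example.seq")); // hypothetical path

    SequenceFile.Reader reader = new SequenceFile.Reader(conf, fileOpt);
    try {
      // Assumes the file was written with Text keys and Text values.
      Text key = new Text();
      Text value = new Text();
      while (reader.next(key, value)) {
        System.out.println(key + "\t" + value);
      }
    } finally {
      reader.close();
    }
  }
}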

Example 1: doProcess

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
@Override
protected boolean doProcess(Record inputRecord, InputStream in) throws IOException {
  FSDataInputStream fsInputStream = new FSDataInputStream(new ForwardOnlySeekable(in));
  Option opt = SequenceFile.Reader.stream(fsInputStream);
  SequenceFile.Metadata sequenceFileMetaData = null;
  SequenceFile.Reader reader = null;
  try {
    reader = new SequenceFile.Reader(conf, opt);   
    if (includeMetaData) {
      sequenceFileMetaData = reader.getMetadata();
    }
    Class keyClass = reader.getKeyClass();
    Class valueClass = reader.getValueClass();
    Record template = inputRecord.copy();
    removeAttachments(template);
    
    while (true) {
      Writable key = (Writable)ReflectionUtils.newInstance(keyClass, conf);
      Writable val = (Writable)ReflectionUtils.newInstance(valueClass, conf);
      try {
        if (!reader.next(key, val)) {
          break;
        }
      } catch (EOFException ex) {
        // SequenceFile.Reader will throw an EOFException after reading
        // all the data if it doesn't know the length. Since we are
        // passing in an InputStream, we hit this case.
        LOG.trace("Received expected EOFException", ex);
        break;
      }
      incrementNumRecords();
      Record outputRecord = template.copy();
      outputRecord.put(keyField, key);
      outputRecord.put(valueField, val);
      outputRecord.put(Fields.ATTACHMENT_MIME_TYPE, OUTPUT_MEDIA_TYPE);
      if (includeMetaData && sequenceFileMetaData != null) {
        outputRecord.put(SEQUENCE_FILE_META_DATA, sequenceFileMetaData);
      }
      
      // pass record to next command in chain:
      if (!getChild().process(outputRecord)) {
        return false;
      }
    }
  } finally {
    Closeables.closeQuietly(reader);
  }
  return true;
}
 
Developer ID: cloudera, Project: cdk, Lines of code: 50, Source: ReadSequenceFileBuilder.java


Example 2: readCrawldb

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
private List<String> readCrawldb() throws IOException {
  Path dbfile = new Path(crawldbPath, CrawlDb.CURRENT_NAME
      + "/part-00000/data");
  System.out.println("reading:" + dbfile);
  Option rFile = SequenceFile.Reader.file(dbfile);
  @SuppressWarnings("resource")
  SequenceFile.Reader reader = new SequenceFile.Reader(conf, rFile);
  ArrayList<String> read = new ArrayList<String>();

  READ: do {
    Text key = new Text();
    CrawlDatum value = new CrawlDatum();
    if (!reader.next(key, value))
      break READ;
    read.add(key.toString());
  } while (true);

  return read;
}
 
Developer ID: jorcox, Project: GeoCrawler, Lines of code: 20, Source: TestInjector.java


Example 3: readCrawldbRecords

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
private HashMap<String, CrawlDatum> readCrawldbRecords() throws IOException {
  Path dbfile = new Path(crawldbPath, CrawlDb.CURRENT_NAME
      + "/part-00000/data");
  System.out.println("reading:" + dbfile);
  Option rFile = SequenceFile.Reader.file(dbfile);
  @SuppressWarnings("resource")
  SequenceFile.Reader reader = new SequenceFile.Reader(conf, rFile);
  HashMap<String, CrawlDatum> read = new HashMap<String, CrawlDatum>();

  READ: do {
    Text key = new Text();
    CrawlDatum value = new CrawlDatum();
    if (!reader.next(key, value))
      break READ;
    read.put(key.toString(), value);
  } while (true);

  return read;
}
 
Developer ID: jorcox, Project: GeoCrawler, Lines of code: 20, Source: TestInjector.java


Example 4: readContents

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
/**
 * Read contents of fetchlist.
 * 
 * @param fetchlist
 *          path to Generated fetchlist
 * @return Generated {@link URLCrawlDatum} objects
 * @throws IOException
 */
private ArrayList<URLCrawlDatum> readContents(Path fetchlist)
    throws IOException {
  // verify results
  Option fFile = SequenceFile.Reader.file(fetchlist);
  SequenceFile.Reader reader = new SequenceFile.Reader(conf, fFile);

  ArrayList<URLCrawlDatum> l = new ArrayList<URLCrawlDatum>();

  READ: do {
    Text key = new Text();
    CrawlDatum value = new CrawlDatum();
    if (!reader.next(key, value)) {
      break READ;
    }
    l.add(new URLCrawlDatum(key, value));
  } while (true);

  reader.close();
  return l;
}
 
Developer ID: jorcox, Project: GeoCrawler, Lines of code: 29, Source: TestCrawlDbFilter.java


Example 5: readContents

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
/**
 * Read contents of fetchlist.
 * 
 * @param fetchlist
 *          path to Generated fetchlist
 * @return Generated {@link URLCrawlDatum} objects
 * @throws IOException
 */
private ArrayList<URLCrawlDatum> readContents(Path fetchlist)
    throws IOException {
  // verify results
  Option rFile = SequenceFile.Reader.file(fetchlist);
  SequenceFile.Reader reader = new SequenceFile.Reader(conf, rFile);

  ArrayList<URLCrawlDatum> l = new ArrayList<URLCrawlDatum>();

  READ: do {
    Text key = new Text();
    CrawlDatum value = new CrawlDatum();
    if (!reader.next(key, value)) {
      break READ;
    }
    l.add(new URLCrawlDatum(key, value));
  } while (true);

  reader.close();
  return l;
}
 
Developer ID: jorcox, Project: GeoCrawler, Lines of code: 29, Source: TestGenerator.java


Example 6: CubeStatsResult

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
public CubeStatsResult(Path path, int precision) throws IOException {
    Configuration hadoopConf = HadoopUtil.getCurrentConfiguration();
    Option seqInput = SequenceFile.Reader.file(path);
    try (Reader reader = new SequenceFile.Reader(hadoopConf, seqInput)) {
        LongWritable key = (LongWritable) ReflectionUtils.newInstance(reader.getKeyClass(), hadoopConf);
        BytesWritable value = (BytesWritable) ReflectionUtils.newInstance(reader.getValueClass(), hadoopConf);
        while (reader.next(key, value)) {
            if (key.get() == 0L) {
                percentage = Bytes.toInt(value.getBytes());
            } else if (key.get() == -1) {
                mapperOverlapRatio = Bytes.toDouble(value.getBytes());
            } else if (key.get() == -2) {
                mapperNumber = Bytes.toInt(value.getBytes());
            } else if (key.get() > 0) {
                HLLCounter hll = new HLLCounter(precision);
                ByteArray byteArray = new ByteArray(value.getBytes());
                hll.readRegisters(byteArray.asBuffer());
                counterMap.put(key.get(), hll);
            }
        }
    }
}
 
Developer ID: apache, Project: kylin, Lines of code: 23, Source: CubeStatsReader.java


Example 7: initialize

import org.apache.hadoop.io.SequenceFile.Reader.Option; // import the required package/class
@Override
public void initialize(InputSplit inputSplit, TaskAttemptContext context) throws IOException, InterruptedException {
	FileSplit split = (FileSplit) inputSplit;
	Configuration conf = context.getConfiguration();
	final Path path = split.getPath();

	Option optPath = SequenceFile.Reader.file(path);
	in = new SequenceFile.Reader(conf, optPath);

	this.end = split.getStart() + inputSplit.getLength();
	if (split.getStart() > in.getPosition()) {
		in.sync(split.getStart());
	}
	start = in.getPosition();
	done = start >= end;
}
 
Developer ID: norvigaward, Project: warcutils, Lines of code: 17, Source: WarcSequenceFileRecordReader.java



Note: The org.apache.hadoop.io.SequenceFile.Reader.Option examples in this article were collected from source code and documentation hosted on platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective developers, and copyright remains with the original authors. Please consult each project's license before using or redistributing the code; do not reproduce this article without permission.

