This article collects typical usage examples of the Java class org.apache.hadoop.tools.util.ThrottledInputStream. If you are unsure what ThrottledInputStream does or how to use it, the curated class code examples below may help.
The ThrottledInputStream class belongs to the org.apache.hadoop.tools.util package. Seven code examples of the class are presented below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
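All of the examples below follow the same basic pattern: open a raw input stream, wrap it in a ThrottledInputStream with a maximum bytes-per-second rate, and then read from the wrapper like any other InputStream. Before the extracted DistCp examples, here is a minimal, self-contained sketch of that pattern; the file paths, buffer size, and 10 MB/s rate are illustrative assumptions only, whereas the real examples derive the rate from DistCpConstants.CONF_LABEL_BANDWIDTH_MB (see Examples 5-7).

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import org.apache.hadoop.tools.util.ThrottledInputStream;

public class ThrottledCopyExample {
  public static void main(String[] args) throws IOException {
    // Illustrative value only: cap reads at roughly 10 MB/s.
    long maxBytesPerSec = 10L * 1024 * 1024;

    // Hypothetical local paths, used here instead of an HDFS Path.
    try (ThrottledInputStream in = new ThrottledInputStream(
             new BufferedInputStream(new FileInputStream("/tmp/source.dat")),
             maxBytesPerSec);
         OutputStream out = new FileOutputStream("/tmp/target.dat")) {
      byte[] buf = new byte[8192];
      int bytesRead;
      while ((bytesRead = in.read(buf)) >= 0) {
        // ThrottledInputStream sleeps inside read() when the configured
        // bandwidth would otherwise be exceeded.
        out.write(buf, 0, bytesRead);
      }
    }
  }
}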
Example 1: copyBytes
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
private long copyBytes(FileStatus sourceFileStatus, OutputStream outStream,
    int bufferSize, Mapper.Context context)
    throws IOException {
  Path source = sourceFileStatus.getPath();
  byte buf[] = new byte[bufferSize];
  ThrottledInputStream inStream = null;
  long totalBytesRead = 0;
  try {
    // Open a bandwidth-throttled stream for the source file.
    inStream = getInputStream(source, context.getConfiguration());
    int bytesRead = readBytes(inStream, buf);
    while (bytesRead >= 0) {
      totalBytesRead += bytesRead;
      outStream.write(buf, 0, bytesRead);
      updateContextStatus(totalBytesRead, context, sourceFileStatus);
      bytesRead = inStream.read(buf);
    }
  } finally {
    IOUtils.cleanup(LOG, outStream, inStream);
  }
  return totalBytesRead;
}
Author: ict-carch, Project: hadoop-plus, Lines of code: 24, Source file: RetriableFileCopyCommand.java
Example 2: copyBytes
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
@VisibleForTesting
long copyBytes(FileStatus sourceFileStatus, OutputStream outStream,
    int bufferSize, Mapper.Context context)
    throws IOException {
  Path source = sourceFileStatus.getPath();
  byte buf[] = new byte[bufferSize];
  ThrottledInputStream inStream = null;
  long totalBytesRead = 0;
  try {
    inStream = getInputStream(source, context.getConfiguration());
    int bytesRead = readBytes(inStream, buf);
    while (bytesRead >= 0) {
      totalBytesRead += bytesRead;
      outStream.write(buf, 0, bytesRead);
      updateContextStatus(totalBytesRead, context, sourceFileStatus);
      bytesRead = inStream.read(buf);
    }
    // Close the output stream here so a close failure surfaces as an exception,
    // then null it so the finally block does not close it a second time.
    outStream.close();
    outStream = null;
  } finally {
    IOUtils.cleanup(LOG, outStream, inStream);
  }
  return totalBytesRead;
}
Author: Seagate, Project: hadoop-on-lustre2, Lines of code: 27, Source file: RetriableFileCopyCommand.java
Example 3: copyBytes
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
@VisibleForTesting
long copyBytes(FileStatus sourceFileStatus, long sourceOffset,
    OutputStream outStream, int bufferSize, Mapper.Context context)
    throws IOException {
  Path source = sourceFileStatus.getPath();
  byte buf[] = new byte[bufferSize];
  ThrottledInputStream inStream = null;
  long totalBytesRead = 0;
  try {
    inStream = getInputStream(source, context.getConfiguration());
    int bytesRead = readBytes(inStream, buf, sourceOffset);
    while (bytesRead >= 0) {
      totalBytesRead += bytesRead;
      if (action == FileAction.APPEND) {
        // When appending, advance the offset so each positional read resumes
        // where the previous one left off.
        sourceOffset += bytesRead;
      }
      outStream.write(buf, 0, bytesRead);
      updateContextStatus(totalBytesRead, context, sourceFileStatus);
      bytesRead = readBytes(inStream, buf, sourceOffset);
    }
    outStream.close();
    outStream = null;
  } finally {
    IOUtils.cleanup(LOG, outStream, inStream);
  }
  return totalBytesRead;
}
Author: naver, Project: hadoop, Lines of code: 29, Source file: RetriableFileCopyCommand.java
Example 4: readBytes
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
private static int readBytes(ThrottledInputStream inStream, byte buf[],
    long position) throws IOException {
  try {
    if (position == 0) {
      return inStream.read(buf);
    } else {
      // Positional read, used when appending to an existing target file.
      return inStream.read(position, buf, 0, buf.length);
    }
  } catch (IOException e) {
    throw new CopyReadException(e);
  }
}
Author: naver, Project: hadoop, Lines of code: 13, Source file: RetriableFileCopyCommand.java
Example 5: getInputStream
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
private static ThrottledInputStream getInputStream(Path path,
    Configuration conf) throws IOException {
  try {
    FileSystem fs = path.getFileSystem(conf);
    long bandwidthMB = conf.getInt(DistCpConstants.CONF_LABEL_BANDWIDTH_MB,
        DistCpConstants.DEFAULT_BANDWIDTH_MB);
    FSDataInputStream in = fs.open(path);
    // Convert the configured bandwidth from MB/s to bytes/s for the throttle.
    return new ThrottledInputStream(in, bandwidthMB * 1024 * 1024);
  } catch (IOException e) {
    throw new CopyReadException(e);
  }
}
Author: naver, Project: hadoop, Lines of code: 14, Source file: RetriableFileCopyCommand.java
Example 6: getInputStream
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
private static ThrottledInputStream getInputStream(Path path,
    Configuration conf) throws IOException {
  try {
    FileSystem fs = path.getFileSystem(conf);
    // This variant reads the bandwidth as a float, allowing fractional MB/s.
    float bandwidthMB = conf.getFloat(DistCpConstants.CONF_LABEL_BANDWIDTH_MB,
        DistCpConstants.DEFAULT_BANDWIDTH_MB);
    FSDataInputStream in = fs.open(path);
    return new ThrottledInputStream(in, bandwidthMB * 1024 * 1024);
  } catch (IOException e) {
    throw new CopyReadException(e);
  }
}
Author: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines of code: 14, Source file: RetriableFileCopyCommand.java
Example 7: getInputStream
import org.apache.hadoop.tools.util.ThrottledInputStream; // import the required package/class
private static ThrottledInputStream getInputStream(Path path, Configuration conf)
    throws IOException {
  try {
    FileSystem fs = path.getFileSystem(conf);
    long bandwidthMB = conf.getInt(DistCpConstants.CONF_LABEL_BANDWIDTH_MB,
        DistCpConstants.DEFAULT_BANDWIDTH_MB);
    // Here the file stream is additionally buffered before being throttled.
    return new ThrottledInputStream(new BufferedInputStream(fs.open(path)),
        bandwidthMB * 1024 * 1024);
  } catch (IOException e) {
    throw new CopyReadException(e);
  }
}
Author: ict-carch, Project: hadoop-plus, Lines of code: 14, Source file: RetriableFileCopyCommand.java
Note: The org.apache.hadoop.tools.util.ThrottledInputStream examples in this article were collected from source-code and documentation platforms such as GitHub/MSDocs. The snippets were selected from open-source projects contributed by their respective developers; copyright remains with the original authors, and any distribution or use should follow the corresponding project's License. Please do not reproduce without permission.