
Java TimelineClientImpl Class Code Examples


This article collects typical usage examples of the Java class org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl. If you are wondering what the TimelineClientImpl class is for, how to use it, or are looking for usage examples, the selected code samples below may help.



The TimelineClientImpl class belongs to the org.apache.hadoop.yarn.client.api.impl package. Five code examples of the class are shown below, sorted by popularity by default.
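
Before looking at the individual examples, here is a minimal sketch of the typical lifecycle of a timeline client: create it via the TimelineClient factory (which returns a TimelineClientImpl, see Examples 4 and 5), initialize and start it as a YARN service, write an entity, and stop it. The entity id and type below are placeholders chosen for illustration, not values from the examples.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.timeline.TimelineEntity;
import org.apache.hadoop.yarn.client.api.TimelineClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class TimelineClientQuickStart {
  public static void main(String[] args) throws Exception {
    Configuration conf = new YarnConfiguration();
    conf.setBoolean(YarnConfiguration.TIMELINE_SERVICE_ENABLED, true);

    // The factory returns a TimelineClientImpl instance.
    TimelineClient client = TimelineClient.createTimelineClient();
    client.init(conf);   // TimelineClient is a YARN service: init, then start
    client.start();
    try {
      TimelineEntity entity = new TimelineEntity();
      entity.setEntityId("demo_entity_1");   // placeholder id, for illustration only
      entity.setEntityType("DEMO_TYPE");     // placeholder type, for illustration only
      client.putEntities(entity);            // write the entity to the timeline server
    } finally {
      client.stop();
    }
  }
}

Note that the client must be initialized and started before putEntities is called; the mapper snippets in Examples 2 and 3 omit that setup.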

Example 1: testTimelineClientInDSAppMaster

import org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl; // import the required package/class
@Test
public void testTimelineClientInDSAppMaster() throws Exception {
  ApplicationMaster appMaster = new ApplicationMaster();
  appMaster.appSubmitterUgi =
      UserGroupInformation.createUserForTesting("foo", new String[]{"bar"});
  Configuration conf = new YarnConfiguration();
  conf.setBoolean(YarnConfiguration.TIMELINE_SERVICE_ENABLED, true);
  appMaster.startTimelineClient(conf);
  Assert.assertEquals(appMaster.appSubmitterUgi,
      ((TimelineClientImpl)appMaster.timelineClient).getUgi());
}
 
Developer: naver, Project: hadoop, Lines of code: 12, Source file: TestDSAppMaster.java


Example 2: map

import org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl; // import the required package/class
public void map(IntWritable key, IntWritable val, Context context) throws IOException {
  TimelineClient tlc = new TimelineClientImpl();
  Configuration conf = context.getConfiguration();

  final int kbs = conf.getInt(KBS_SENT, KBS_SENT_DEFAULT);

  long totalTime = 0;
  final int testtimes = conf.getInt(TEST_TIMES, TEST_TIMES_DEFAULT);
  final Random rand = new Random();
  final TaskAttemptID taskAttemptId = context.getTaskAttemptID();
  final char[] payLoad = new char[kbs * 1024];

  for (int i = 0; i < testtimes; i++) {
    // Generate a fixed length random payload
    for (int xx = 0; xx < kbs * 1024; xx++) {
      int alphaNumIdx =
          rand.nextInt(ALPHA_NUMS.length);
      payLoad[xx] = ALPHA_NUMS[alphaNumIdx];
    }
    String entId = taskAttemptId + "_" + Integer.toString(i);
    final TimelineEntity entity = new TimelineEntity();
    entity.setEntityId(entId);
    entity.setEntityType("FOO_ATTEMPT");
    entity.addOtherInfo("PERF_TEST", payLoad);
    // add an event
    TimelineEvent event = new TimelineEvent();
    event.setTimestamp(System.currentTimeMillis());
    event.setEventType("foo_event");
    entity.addEvent(event);

    // use the current user for this purpose
    UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
    long startWrite = System.nanoTime();
    try {
      tlc.putEntities(entity);
    } catch (Exception e) {
      context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_FAILURES).
          increment(1);
      LOG.error("writing to the timeline service failed", e);
    }
    long endWrite = System.nanoTime();
    totalTime += TimeUnit.NANOSECONDS.toMillis(endWrite-startWrite);
  }
  LOG.info("wrote " + testtimes + " entities (" + kbs*testtimes +
      " kB) in " + totalTime + " ms");
  context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_TIME).
      increment(totalTime);
  context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_COUNTER).
      increment(testtimes);
  context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_KBS).
      increment(kbs*testtimes);
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines of code: 53, Source file: SimpleEntityWriterV1.java
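
The map() method above references several members defined elsewhere in SimpleEntityWriterV1 (KBS_SENT, TEST_TIMES, ALPHA_NUMS, and the PerfCounters enum). A rough sketch of what those declarations could look like follows; the names are taken from the snippet itself, but the default values are assumptions and may differ from the actual source.

// Assumed declarations for the fields referenced in the map() method above.
// The configuration keys and default values are illustrative guesses.
public static final String KBS_SENT = "kbs sent";
public static final int KBS_SENT_DEFAULT = 1;
public static final String TEST_TIMES = "testtimes";
public static final int TEST_TIMES_DEFAULT = 100;
private static final char[] ALPHA_NUMS = new char[] {
    'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm',
    'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z',
    '0', '1', '2', '3', '4', '5', '6', '7', '8', '9' };

// The counters incremented in the snippet suggest an enum along these lines.
enum PerfCounters {
  TIMELINE_SERVICE_WRITE_TIME, TIMELINE_SERVICE_WRITE_COUNTER,
  TIMELINE_SERVICE_WRITE_KBS, TIMELINE_SERVICE_WRITE_FAILURES
}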


Example 3: map

import org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl; // import the required package/class
public void map(IntWritable key, IntWritable val, Context context) throws IOException {
  // collect the apps it needs to process
  TimelineClient tlc = new TimelineClientImpl();
  TimelineEntityConverterV1 converter = new TimelineEntityConverterV1();
  JobHistoryFileReplayHelper helper = new JobHistoryFileReplayHelper(context);
  int replayMode = helper.getReplayMode();
  Collection<JobFiles> jobs =
      helper.getJobFiles();
  JobHistoryFileParser parser = helper.getParser();

  if (jobs.isEmpty()) {
    LOG.info(context.getTaskAttemptID().getTaskID() +
        " will process no jobs");
  } else {
    LOG.info(context.getTaskAttemptID().getTaskID() + " will process " +
        jobs.size() + " jobs");
  }
  for (JobFiles job: jobs) {
    // process each job
    String jobIdStr = job.getJobId();
    LOG.info("processing " + jobIdStr + "...");
    JobId jobId = TypeConverter.toYarn(JobID.forName(jobIdStr));
    ApplicationId appId = jobId.getAppId();

    try {
      // parse the job info and configuration
      Path historyFilePath = job.getJobHistoryFilePath();
      Path confFilePath = job.getJobConfFilePath();
      if ((historyFilePath == null) || (confFilePath == null)) {
        continue;
      }
      JobInfo jobInfo =
          parser.parseHistoryFile(historyFilePath);
      Configuration jobConf =
          parser.parseConfiguration(confFilePath);
      LOG.info("parsed the job history file and the configuration file for job "
          + jobIdStr);

      // create entities from job history and write them
      long totalTime = 0;
      Set<TimelineEntity> entitySet =
          converter.createTimelineEntities(jobInfo, jobConf);
      LOG.info("converted them into timeline entities for job " + jobIdStr);
      // use the current user for this purpose
      UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
      long startWrite = System.nanoTime();
      try {
        switch (replayMode) {
        case JobHistoryFileReplayHelper.WRITE_ALL_AT_ONCE:
          writeAllEntities(tlc, entitySet, ugi);
          break;
        case JobHistoryFileReplayHelper.WRITE_PER_ENTITY:
          writePerEntity(tlc, entitySet, ugi);
          break;
        default:
          break;
        }
      } catch (Exception e) {
        context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_FAILURES).
            increment(1);
        LOG.error("writing to the timeline service failed", e);
      }
      long endWrite = System.nanoTime();
      totalTime += TimeUnit.NANOSECONDS.toMillis(endWrite-startWrite);
      int numEntities = entitySet.size();
      LOG.info("wrote " + numEntities + " entities in " + totalTime + " ms");

      context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_TIME).
          increment(totalTime);
      context.getCounter(PerfCounters.TIMELINE_SERVICE_WRITE_COUNTER).
          increment(numEntities);
    } finally {
      context.progress(); // move it along
    }
  }
}
 
Developer: aliyun-beta, Project: aliyun-oss-hadoop-fs, Lines of code: 77, Source file: JobHistoryFileReplayMapperV1.java
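
The helpers writeAllEntities and writePerEntity called in the replay-mode switch are not shown in the snippet. A plausible sketch is given below, assuming the v1 putEntities varargs API; the actual implementations in JobHistoryFileReplayMapperV1 may differ in details such as logging, and the imports for Set, IOException, UserGroupInformation, and YarnException come from the enclosing class.

// Hedged sketch of the two write helpers invoked above.
private void writeAllEntities(TimelineClient tlc, Set<TimelineEntity> entitySet,
    UserGroupInformation ugi) throws IOException, YarnException {
  // a single putEntities call carrying the whole set
  tlc.putEntities(entitySet.toArray(new TimelineEntity[entitySet.size()]));
}

private void writePerEntity(TimelineClient tlc, Set<TimelineEntity> entitySet,
    UserGroupInformation ugi) throws IOException, YarnException {
  // one putEntities call per entity
  for (TimelineEntity entity : entitySet) {
    tlc.putEntities(entity);
  }
}

The difference between the two modes is simply how many round trips to the timeline server are made per job's entity set.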


Example 4: createTimelineClient

import org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl; // import the required package/class
@Public
public static TimelineClient createTimelineClient() {
  TimelineClient client = new TimelineClientImpl();
  return client;
}
 
Developer: Nextzero, Project: hadoop-2.6.0-cdh5.4.3, Lines of code: 6, Source file: TimelineClient.java


Example 5: createTimelineClient

import org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl; // import the required package/class
/**
 * Create a timeline client. The current UGI when the user initialize the
 * client will be used to do the put and the delegation token operations. The
 * current user may use {@link UserGroupInformation#doAs} another user to
 * construct and initialize a timeline client if the following operations are
 * supposed to be conducted by that user.
 *
 * @return a timeline client
 */
@Public
public static TimelineClient createTimelineClient() {
  TimelineClient client = new TimelineClientImpl();
  return client;
}
 
Developer: naver, Project: hadoop, Lines of code: 15, Source file: TimelineClient.java
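
As the Javadoc in Example 5 notes, the UGI that is current when the client is initialized is the one used for puts and delegation-token operations, so a caller who wants the writes attributed to a different user can construct the client inside UserGroupInformation#doAs. A sketch, assuming java.security.PrivilegedExceptionAction is imported and the enclosing method declares the checked exceptions; the user name "some-user" is purely illustrative:

// Illustrative only: construct and start the client as another user, so that
// subsequent putEntities and delegation-token calls run on behalf of that user.
final Configuration conf = new YarnConfiguration();
UserGroupInformation otherUser = UserGroupInformation.createRemoteUser("some-user");
TimelineClient client = otherUser.doAs(
    new PrivilegedExceptionAction<TimelineClient>() {
      @Override
      public TimelineClient run() throws Exception {
        TimelineClient c = TimelineClient.createTimelineClient();
        c.init(conf);   // init and start inside doAs so the client captures this UGI
        c.start();
        return c;
      }
    });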



Note: The org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl examples in this article were compiled from source code and documentation platforms such as GitHub and MSDocs, and the snippets were selected from open-source projects contributed by their respective developers. Copyright of the source code remains with the original authors; refer to the License of the corresponding project before distributing or using it, and do not reproduce without permission.

