
Java HCatUtil Class Code Examples


This article collects typical usage examples of the Java class org.apache.hcatalog.common.HCatUtil. If you have been wondering what HCatUtil is for or how to use it, the selected code examples below may help.



The HCatUtil class belongs to the org.apache.hcatalog.common package. The sections below present 13 HCatUtil code examples, ordered by popularity by default.
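For orientation before the examples: several snippets below call HCatUtil.getDbAndTableName to resolve a possibly-qualified table name into a database/table pair. The following is a minimal, self-contained sketch of that behavior in plain Java; the nested `Pair` class and the fallback to a `"default"` database are stand-in assumptions for illustration, not HCatalog's actual implementation.

```java
public class TableNameSketch {
    // Hypothetical stand-in for the Pair type returned by HCatUtil.getDbAndTableName.
    static final class Pair<A, B> {
        final A first;
        final B second;
        Pair(A first, B second) { this.first = first; this.second = second; }
    }

    // Splits "db.table" into (db, table); an unqualified name falls back to "default".
    static Pair<String, String> getDbAndTableName(String name) {
        String[] parts = name.split("\\.");
        if (parts.length == 2) {
            return new Pair<>(parts[0], parts[1]);
        } else if (parts.length == 1) {
            return new Pair<>("default", parts[0]);
        }
        throw new IllegalArgumentException("Invalid table name: " + name);
    }

    public static void main(String[] args) {
        Pair<String, String> p = getDbAndTableName("sales.orders");
        System.out.println(p.first + " / " + p.second); // sales / orders
    }
}
```

Examples 1 and 4 below wrap this lookup in a try/catch and surface failures as CatalogException.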

Example 1: deleteTable

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@Override
public final void deleteTable(final String name) throws CatalogException {
  String dbName = null, tableName = null;
  Pair<String, String> tablePair = null;
  HCatalogStoreClientPool.HCatalogStoreClient client = null;

  // get db name and table name.
  try {
    tablePair = HCatUtil.getDbAndTableName(name);
    dbName = tablePair.first;
    tableName = tablePair.second;
  } catch (Exception ioe) {
    throw new CatalogException("Table name is wrong.", ioe);
  }

  try {
    client = clientPool.getClient();
    client.getHiveClient().dropTable(dbName, tableName, false, false);
  } catch (NoSuchObjectException nsoe) {
    // Table does not exist; nothing to delete.
  } catch (Exception e) {
    throw new CatalogException(e);
  } finally {
    if (client != null) {
      client.release();
    }
  }
}
 
Developer: apache, Project: incubator-tajo, Lines: 26, Source: HCatalogStore.java


Example 2: createRecordReader

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Create an {@link org.apache.hcatalog.mapreduce.HCatRecordReader}.
 *
 * @param split Input split
 * @param schema Table schema
 * @param taskContext Context
 * @return Record reader
 * @throws IOException
 * @throws InterruptedException
 */
private RecordReader<WritableComparable, HCatRecord>
createRecordReader(InputSplit split,
                   HCatSchema schema,
                   TaskAttemptContext taskContext)
  throws IOException, InterruptedException {
  HCatSplit hcatSplit = HCatUtils.castToHCatSplit(split);
  PartInfo partitionInfo = hcatSplit.getPartitionInfo();
  JobContext jobContext = taskContext;
  Configuration conf = jobContext.getConfiguration();

  HCatStorageHandler storageHandler = HCatUtil.getStorageHandler(
      conf, partitionInfo);

  JobConf jobConf = HCatUtil.getJobConfFromContext(jobContext);
  Map<String, String> jobProperties = partitionInfo.getJobProperties();
  HCatUtil.copyJobPropertiesToJobConf(jobProperties, jobConf);

  Map<String, String> valuesNotInDataCols = getColValsNotInDataColumns(
      schema, partitionInfo);

  return HCatUtils.newHCatReader(storageHandler, valuesNotInDataCols);
}
 
Developer: renato2099, Project: giraph-gora, Lines: 33, Source: GiraphHCatInputFormat.java


Example 3: extractPartInfo

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Extract partition info.
 *
 * @param schema Table schema
 * @param sd Storage descriptor
 * @param parameters Parameters
 * @param conf Configuration
 * @param inputJobInfo Input job info
 * @return Partition info
 * @throws IOException
 */
private static PartInfo extractPartInfo(
    HCatSchema schema, StorageDescriptor sd, Map<String, String> parameters,
    Configuration conf, InputJobInfo inputJobInfo) throws IOException {
  StorerInfo storerInfo = InternalUtil.extractStorerInfo(sd, parameters);

  Properties hcatProperties = new Properties();
  HCatStorageHandler storageHandler = HCatUtil.getStorageHandler(conf,
      storerInfo);

  // Copy the properties from storageHandler to jobProperties
  Map<String, String> jobProperties =
      HCatUtil.getInputJobProperties(storageHandler, inputJobInfo);

  for (Map.Entry<String, String> param : parameters.entrySet()) {
    hcatProperties.put(param.getKey(), param.getValue());
  }

  return new PartInfo(schema, storageHandler, sd.getLocation(),
      hcatProperties, jobProperties, inputJobInfo.getTableInfo());
}
 
Developer: renato2099, Project: giraph-gora, Lines: 32, Source: HCatUtils.java


Example 4: existTable

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@Override
public boolean existTable(final String name) throws CatalogException {
  boolean exist = false;

  String dbName = null, tableName = null;
  Pair<String, String> tablePair = null;
  org.apache.hadoop.hive.ql.metadata.Table table = null;
  HCatalogStoreClientPool.HCatalogStoreClient client = null;
  // get db name and table name.
  try {
    tablePair = HCatUtil.getDbAndTableName(name);
    dbName = tablePair.first;
    tableName = tablePair.second;
  } catch (Exception ioe) {
    throw new CatalogException("Table name is wrong.", ioe);
  }

  // get table
  try {
    try {
      client = clientPool.getClient();
      table = HCatUtil.getTable(client.getHiveClient(), dbName, tableName);
      if (table != null) {
        exist = true;
      }
    } catch (NoSuchObjectException nsoe) {
      exist = false;
    } catch (Exception e) {
      throw new CatalogException(e);
    }
  } finally {
    if (client != null) {
      client.release();
    }
  }

  return exist;
}
 
Developer: apache, Project: incubator-tajo, Lines: 37, Source: HCatalogStore.java


Example 5: existTable

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@Override
public boolean existTable(final String databaseName, final String tableName) throws CatalogException {
  boolean exist = false;
  org.apache.hadoop.hive.ql.metadata.Table table = null;
  HCatalogStoreClientPool.HCatalogStoreClient client = null;

  // get table
  try {
    try {
      client = clientPool.getClient();
      table = HCatUtil.getTable(client.getHiveClient(), databaseName, tableName);
      if (table != null) {
        exist = true;
      }
    } catch (NoSuchObjectException nsoe) {
      exist = false;
    } catch (Exception e) {
      throw new CatalogException(e);
    }
  } finally {
    if (client != null) {
      client.release();
    }
  }

  return exist;
}
 
Developer: gruter, Project: tajo-cdh, Lines: 28, Source: HCatalogStore.java


Example 6: HCatalog

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
public HCatalog(Configuration conf) {
  if (conf.get(Loader.HIVE_METASTORE_URI_PROP) == null) {
    LOG.warn("Using a local Hive MetaStore (for testing only)");
  }
  try {
    hiveConf = new HiveConf(conf, HiveConf.class);
    client = HCatUtil.getHiveClient(hiveConf);
  } catch (Exception e) {
    throw new RuntimeException("Hive metastore exception", e);
  }
}
 
Developer: cloudera, Project: cdk, Lines: 12, Source: HCatalog.java


Example 7: getTable

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@SuppressWarnings("deprecation")
public Table getTable(String dbName, String tableName) {
  Table table;
  try {
    table = HCatUtil.getTable(client, dbName, tableName);
  } catch (Exception e) {
    throw new com.cloudera.cdk.data.NoSuchDatasetException("Hive table lookup exception", e);
  }
  
  if (table == null) {
    throw new com.cloudera.cdk.data.NoSuchDatasetException("Could not find info for table: " + tableName);
  }
  return table;
}
 
Developer: cloudera, Project: cdk, Lines: 15, Source: HCatalog.java


Example 8: setVertexInput

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Set vertex {@link InputJobInfo}.
 *
 * @param job The job
 * @param inputJobInfo Vertex input job info
 * @throws IOException
 */
public static void setVertexInput(Job job,
                                  InputJobInfo inputJobInfo)
  throws IOException {
  InputJobInfo vertexInputJobInfo = InputJobInfo.create(
      inputJobInfo.getDatabaseName(),
      inputJobInfo.getTableName(),
      inputJobInfo.getFilter());
  vertexInputJobInfo.getProperties().putAll(inputJobInfo.getProperties());
  Configuration conf = job.getConfiguration();
  conf.set(VERTEX_INPUT_JOB_INFO, HCatUtil.serialize(
      HCatUtils.getInputJobInfo(conf, vertexInputJobInfo)));
}
 
Developer: renato2099, Project: giraph-gora, Lines: 20, Source: GiraphHCatInputFormat.java


Example 9: setEdgeInput

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Set edge {@link InputJobInfo}.
 *
 * @param job The job
 * @param inputJobInfo Edge input job info
 * @throws IOException
 */
public static void setEdgeInput(Job job,
                                InputJobInfo inputJobInfo)
  throws IOException {
  InputJobInfo edgeInputJobInfo = InputJobInfo.create(
      inputJobInfo.getDatabaseName(),
      inputJobInfo.getTableName(),
      inputJobInfo.getFilter());
  edgeInputJobInfo.getProperties().putAll(inputJobInfo.getProperties());
  Configuration conf = job.getConfiguration();
  conf.set(EDGE_INPUT_JOB_INFO, HCatUtil.serialize(
      HCatUtils.getInputJobInfo(conf, edgeInputJobInfo)));
}
 
Developer: renato2099, Project: giraph-gora, Lines: 20, Source: GiraphHCatInputFormat.java


Example 10: getVertexJobInfo

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Get vertex {@link InputJobInfo}.
 *
 * @param conf Configuration
 * @return Vertex input job info
 * @throws IOException
 */
private static InputJobInfo getVertexJobInfo(Configuration conf)
  throws IOException {
  String jobString = conf.get(VERTEX_INPUT_JOB_INFO);
  if (jobString == null) {
    throw new IOException("Vertex job information not found in JobContext." +
        " GiraphHCatInputFormat.setVertexInput() not called?");
  }
  return (InputJobInfo) HCatUtil.deserialize(jobString);
}
 
Developer: renato2099, Project: giraph-gora, Lines: 17, Source: GiraphHCatInputFormat.java


Example 11: getEdgeJobInfo

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
/**
 * Get edge {@link InputJobInfo}.
 *
 * @param conf Configuration
 * @return Edge input job info
 * @throws IOException
 */
private static InputJobInfo getEdgeJobInfo(Configuration conf)
  throws IOException {
  String jobString = conf.get(EDGE_INPUT_JOB_INFO);
  if (jobString == null) {
    throw new IOException("Edge job information not found in JobContext." +
        " GiraphHCatInputFormat.setEdgeInput() not called?");
  }
  return (InputJobInfo) HCatUtil.deserialize(jobString);
}
 
Developer: renato2099, Project: giraph-gora, Lines: 17, Source: GiraphHCatInputFormat.java
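Examples 8 through 11 rely on HCatUtil.serialize/deserialize to shuttle an InputJobInfo through the job Configuration as a string. Conceptually this is Java object serialization plus a text-safe encoding; the following is a hedged, self-contained sketch of that round trip, not HCatalog's actual implementation (its exact encoding may differ).

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

public class SerdeSketch {
    // Serialize any Serializable object to a Base64 string (illustrative only).
    static String serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return Base64.getEncoder().encodeToString(bos.toByteArray());
    }

    // Reverse: decode the string and reconstruct the object.
    static Object deserialize(String s) throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getDecoder().decode(s);
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Round-trip a value the way setVertexInput/getVertexJobInfo do with InputJobInfo.
        String round = (String) deserialize(serialize("vertex-input-job-info"));
        System.out.println(round);
    }
}
```

The point of the string form is that it can be stored under a plain Configuration key (VERTEX_INPUT_JOB_INFO / EDGE_INPUT_JOB_INFO above) and recovered on the task side.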


Example 12: setup

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@Override
protected void setup(Context context)
  throws IOException, InterruptedException {
  Configuration conf = context.getConfiguration();
  String inputJobInfoStr = conf.get(HCatConstants.HCAT_KEY_JOB_INFO);
  jobInfo =
    (InputJobInfo) HCatUtil.deserialize(inputJobInfoStr);
  dataColsSchema = jobInfo.getTableInfo().getDataColumns();
  partitionSchema =
    jobInfo.getTableInfo().getPartitionColumns();
  StringBuilder storerInfoStr = new StringBuilder(1024);
  StorerInfo storerInfo = jobInfo.getTableInfo().getStorerInfo();
  storerInfoStr.append("HCatalog Storer Info : ")
    .append("\n\tHandler = ").append(storerInfo.getStorageHandlerClass())
    .append("\n\tInput format class = ").append(storerInfo.getIfClass())
    .append("\n\tOutput format class = ").append(storerInfo.getOfClass())
    .append("\n\tSerde class = ").append(storerInfo.getSerdeClass());
  Properties storerProperties = storerInfo.getProperties();
  if (!storerProperties.isEmpty()) {
    storerInfoStr.append("\nStorer properties ");
    for (Map.Entry<Object, Object> entry : storerProperties.entrySet()) {
      String key = (String) entry.getKey();
      Object val = entry.getValue();
      storerInfoStr.append("\n\t").append(key).append('=').append(val);
    }
  }
  storerInfoStr.append("\n");
  LOG.info(storerInfoStr);

  hCatFullTableSchema = new HCatSchema(dataColsSchema.getFields());
  for (HCatFieldSchema hfs : partitionSchema.getFields()) {
    hCatFullTableSchema.append(hfs);
  }
  fieldCount = hCatFullTableSchema.size();
  lobLoader = new LargeObjectLoader(conf,
    new Path(jobInfo.getTableInfo().getTableLocation()));
  bigDecimalFormatString = conf.getBoolean(
    ImportJobBase.PROPERTY_BIGDECIMAL_FORMAT,
    ImportJobBase.PROPERTY_BIGDECIMAL_FORMAT_DEFAULT);
  debugHCatImportMapper = conf.getBoolean(
    SqoopHCatUtilities.DEBUG_HCAT_IMPORT_MAPPER_PROP, false);
  IntWritable[] delimChars = DefaultStringifier.loadArray(conf,
      SqoopHCatUtilities.HIVE_DELIMITERS_TO_REPLACE_PROP, IntWritable.class);
  hiveDelimiters = new DelimiterSet(
    (char) delimChars[0].get(), (char) delimChars[1].get(),
    (char) delimChars[2].get(), (char) delimChars[3].get(),
    delimChars[4].get() == 1);
  hiveDelimsReplacement =
    conf.get(SqoopHCatUtilities.HIVE_DELIMITERS_REPLACEMENT_PROP);
  if (hiveDelimsReplacement == null) {
    hiveDelimsReplacement = "";
  }
  doHiveDelimsReplacement = Boolean.valueOf(conf.get(
    SqoopHCatUtilities.HIVE_DELIMITERS_REPLACEMENT_ENABLED_PROP));

  IntWritable[] fPos = DefaultStringifier.loadArray(conf,
      SqoopHCatUtilities.HCAT_FIELD_POSITIONS_PROP, IntWritable.class);
  hCatFieldPositions = new int[fPos.length];
  for (int i = 0; i < fPos.length; ++i) {
    hCatFieldPositions[i] = fPos[i].get();
  }

  LOG.debug("Hive delims replacement enabled : " + doHiveDelimsReplacement);
  LOG.debug("Hive Delimiters : " + hiveDelimiters.toString());
  LOG.debug("Hive delimiters replacement : " + hiveDelimsReplacement);
  staticPartitionKey =
    conf.get(SqoopHCatUtilities.HCAT_STATIC_PARTITION_KEY_PROP);
  LOG.debug("Static partition key used : " + staticPartitionKey);
}
 
Developer: unicredit, Project: zSqoop, Lines: 72, Source: SqoopHCatImportMapper.java


Example 13: setup

import org.apache.hcatalog.common.HCatUtil; // import the required package/class
@Override
protected void setup(Context context)
  throws IOException, InterruptedException {
  super.setup(context);

  Configuration conf = context.getConfiguration();

  colTypesJava = DefaultStringifier.load(conf,
    SqoopHCatUtilities.HCAT_DB_OUTPUT_COLTYPES_JAVA, MapWritable.class);
  colTypesSql = DefaultStringifier.load(conf,
    SqoopHCatUtilities.HCAT_DB_OUTPUT_COLTYPES_SQL, MapWritable.class);
  // Instantiate a copy of the user's class to hold and parse the record.

  String recordClassName = conf.get(
    ExportJobBase.SQOOP_EXPORT_TABLE_CLASS_KEY);
  if (null == recordClassName) {
    throw new IOException("Export table class name ("
      + ExportJobBase.SQOOP_EXPORT_TABLE_CLASS_KEY
      + ") is not set!");
  }
  debugHCatExportMapper = conf.getBoolean(
    SqoopHCatUtilities.DEBUG_HCAT_EXPORT_MAPPER_PROP, false);
  try {
    Class cls = Class.forName(recordClassName, true,
      Thread.currentThread().getContextClassLoader());
    sqoopRecord = (SqoopRecord) ReflectionUtils.newInstance(cls, conf);
  } catch (ClassNotFoundException cnfe) {
    throw new IOException(cnfe);
  }

  if (null == sqoopRecord) {
    throw new IOException("Could not instantiate object of type "
      + recordClassName);
  }

  String inputJobInfoStr = conf.get(HCatConstants.HCAT_KEY_JOB_INFO);
  jobInfo =
    (InputJobInfo) HCatUtil.deserialize(inputJobInfoStr);
  HCatSchema tableSchema = jobInfo.getTableInfo().getDataColumns();
  HCatSchema partitionSchema =
    jobInfo.getTableInfo().getPartitionColumns();
  hCatFullTableSchema = new HCatSchema(tableSchema.getFields());
  for (HCatFieldSchema hfs : partitionSchema.getFields()) {
    hCatFullTableSchema.append(hfs);
  }
  hCatSchemaFields = hCatFullTableSchema.getFields();
}
 
Developer: unicredit, Project: zSqoop, Lines: 49, Source: SqoopHCatExportMapper.java



Note: the org.apache.hcatalog.common.HCatUtil examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar platforms. Copyright in each snippet remains with its original authors; consult the corresponding project's license before redistributing or reusing the code, and do not reproduce without permission.

