
Java SerDeUtils Class Code Examples


This article collects typical usage examples of the Java class org.apache.hadoop.hive.serde2.SerDeUtils. If you are wondering what SerDeUtils is for or how to use it, the curated examples below may help.



The SerDeUtils class belongs to the org.apache.hadoop.hive.serde2 package. Six code examples of the class are shown below, sorted by popularity by default.

Example 1: serializeToBytes

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
/**
 * Serialize an object into bytes.
 *
 * @param foi object inspector
 * @param doi declared output object inspector
 * @param obj object to be serialized
 * @param useJsonSerialize true to use json serialization
 * @return object in serialized bytes
 * @throws IOException when error happens
 */
protected byte[] serializeToBytes(ObjectInspector foi, ObjectInspector doi, Object obj, boolean useJsonSerialize) throws IOException {
    serializeStream.reset();
    boolean isNotNull;
    if (!foi.getCategory().equals(Category.PRIMITIVE)
            && useJsonSerialize) {
        isNotNull = serialize(SerDeUtils.getJSONString(obj, foi),
                PrimitiveObjectInspectorFactory.javaStringObjectInspector, doi, 1);
    } else {
        isNotNull = serialize(obj, foi, doi, 1);
    }
    if (!isNotNull) {
        return null;
    }
    byte[] key = new byte[serializeStream.getCount()];
    System.arraycopy(serializeStream.getData(), 0, key, 0, serializeStream.getCount());

    return key;
}
 
Developer: 2013Commons | Project: hive-cassandra | Lines: 28 | Source: TableMapping.java
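The buffer-handling idiom above (reset, serialize, then copy exactly getCount() bytes out of a larger backing array) can be sketched in plain Java without Hive on the classpath. TinyStream below is a hypothetical stand-in for Hive's ByteStream.Output, which exposes its backing array via getData() and the number of valid bytes via getCount():

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class ExactCopySketch {
    // Hypothetical stand-in for org.apache.hadoop.hive.serde2.ByteStream.Output.
    static class TinyStream {
        private byte[] data = new byte[64]; // capacity, not content length
        private int count;

        void reset() { count = 0; }
        void write(byte[] src) {
            if (count + src.length > data.length) {
                data = Arrays.copyOf(data, Math.max(data.length * 2, count + src.length));
            }
            System.arraycopy(src, 0, data, count, src.length);
            count += src.length;
        }
        byte[] getData() { return data; }
        int getCount() { return count; }
    }

    private final TinyStream serializeStream = new TinyStream();

    public byte[] serializeToBytes(String value) {
        serializeStream.reset(); // reuse the buffer per row, as the example does
        serializeStream.write(value.getBytes(StandardCharsets.UTF_8));
        // Same idiom as the example: copy exactly getCount() bytes into a right-sized array.
        byte[] key = new byte[serializeStream.getCount()];
        System.arraycopy(serializeStream.getData(), 0, key, 0, serializeStream.getCount());
        return key;
    }

    public static void main(String[] args) {
        ExactCopySketch s = new ExactCopySketch();
        byte[] key = s.serializeToBytes("row-key");
        System.out.println(key.length + " " + new String(key, StandardCharsets.UTF_8)); // 7 row-key
    }
}
```

The explicit copy matters because the backing array over-allocates: returning getData() directly would leak stale trailing bytes into the serialized key.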


Example 2: serializeToBytes

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
/**
 * Serialize an object into bytes.
 * @param foi object inspector
 * @param doi declared output object inspector
 * @param obj object to be serialized
 * @param useJsonSerialize true to use json serialization
 * @return object in serialized bytes
 * @throws IOException when error happens
 */
protected byte[] serializeToBytes(ObjectInspector foi, ObjectInspector doi, Object obj, boolean useJsonSerialize) throws IOException {
  serializeStream.reset();
  boolean isNotNull;
  if (!foi.getCategory().equals(Category.PRIMITIVE)
              && useJsonSerialize) {
    isNotNull = serialize(SerDeUtils.getJSONString(obj, foi),
                PrimitiveObjectInspectorFactory.javaStringObjectInspector, doi, 1);
  } else {
    isNotNull = serialize(obj, foi, doi, 1);
  }
  if (!isNotNull) {
    return null;
  }
  byte[] key = new byte[serializeStream.getCount()];
  System.arraycopy(serializeStream.getData(), 0, key, 0, serializeStream.getCount());

  return key;
}
 
Developer: dvasilen | Project: Hive-Cassandra | Lines: 28 | Source: TableMapping.java


Example 3: serialize

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
/**
 * Given an object and object inspector pair, traverse the object
 * and generate a Text representation of the object.
 */
@Override
public Writable serialize(Object obj, ObjectInspector objInspector)
  throws SerDeException {
  StringBuilder sb = new StringBuilder();
  try {

    StructObjectInspector soi = (StructObjectInspector) objInspector;
    List<? extends StructField> structFields = soi.getAllStructFieldRefs();
    assert (columnNames.size() == structFields.size());
    if (obj == null) {
      sb.append("null");
    } else {
      sb.append(SerDeUtils.LBRACE);
      for (int i = 0; i < structFields.size(); i++) {
        if (i > 0) {
          sb.append(SerDeUtils.COMMA);
        }
        appendWithQuotes(sb, columnNames.get(i));
        sb.append(SerDeUtils.COLON);
        buildJSONString(sb, soi.getStructFieldData(obj, structFields.get(i)),
          structFields.get(i).getFieldObjectInspector());
      }
      sb.append(SerDeUtils.RBRACE);
    }

  } catch (IOException e) {
    LOG.warn("Error generating json text from object.", e);
    throw new SerDeException(e);
  }
  return new Text(sb.toString());
}
 
Developer: prestodb | Project: presto-hive-apache | Lines: 36 | Source: JsonSerDe.java
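The brace/comma/colon assembly loop in serialize can be sketched with plain strings standing in for the SerDeUtils constants (LBRACE, COMMA, COLON, RBRACE, QUOTE), with field names and values as plain strings rather than ObjectInspector-driven data; the constant values used here are an assumption about what Hive defines:

```java
import java.util.List;

public class JsonAssemblySketch {
    // Hypothetical stand-ins for SerDeUtils.LBRACE, RBRACE, COMMA, COLON, QUOTE.
    static final String LBRACE = "{", RBRACE = "}", COMMA = ",", COLON = ":", QUOTE = "\"";

    public static String toJson(List<String> names, List<String> values) {
        StringBuilder sb = new StringBuilder();
        sb.append(LBRACE);
        for (int i = 0; i < names.size(); i++) {
            if (i > 0) {
                sb.append(COMMA); // separator before every field but the first
            }
            // Same role as appendWithQuotes in the JsonSerDe example.
            sb.append(QUOTE).append(names.get(i)).append(QUOTE);
            sb.append(COLON);
            sb.append(QUOTE).append(values.get(i)).append(QUOTE);
        }
        sb.append(RBRACE);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toJson(List.of("id", "name"), List.of("1", "alice")));
        // {"id":"1","name":"alice"}
    }
}
```

The real serializer delegates each field value to buildJSONString, which recurses by ObjectInspector category; this sketch only shows the flat string case.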


Example 4: createDummyFileForEmptyPartition

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
@SuppressWarnings("rawtypes")
private static Path createDummyFileForEmptyPartition(Path path, JobConf job, MapWork work,
    Path hiveScratchDir, String alias, int sequenceNumber)
        throws Exception {

  String strPath = path.toString();

  // The input file does not exist; replace it with an empty file
  PartitionDesc partDesc = work.getPathToPartitionInfo().get(strPath);
  if (partDesc.getTableDesc().isNonNative()) {
    // if this isn't a hive table we can't create an empty file for it.
    return path;
  }

  Properties props = SerDeUtils.createOverlayedProperties(
      partDesc.getTableDesc().getProperties(), partDesc.getProperties());
  HiveOutputFormat outFileFormat = HiveFileFormatUtils.getHiveOutputFormat(job, partDesc);

  boolean oneRow = partDesc.getInputFileFormatClass() == OneNullRowInputFormat.class;

  Path newPath = createEmptyFile(hiveScratchDir, outFileFormat, job,
      sequenceNumber, props, oneRow);

  if (LOG.isInfoEnabled()) {
    LOG.info("Changed input file " + strPath + " to empty file " + newPath);
  }

  // update the work
  String strNewPath = newPath.toString();

  LinkedHashMap<String, ArrayList<String>> pathToAliases = work.getPathToAliases();
  pathToAliases.put(strNewPath, pathToAliases.get(strPath));
  pathToAliases.remove(strPath);

  work.setPathToAliases(pathToAliases);

  LinkedHashMap<String, PartitionDesc> pathToPartitionInfo = work.getPathToPartitionInfo();
  pathToPartitionInfo.put(strNewPath, pathToPartitionInfo.get(strPath));
  pathToPartitionInfo.remove(strPath);
  work.setPathToPartitionInfo(pathToPartitionInfo);

  return newPath;
}
 
Developer: mini666 | Project: hive-phoenix-handler | Lines: 44 | Source: Utilities.java
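The core idea above, replacing a missing input path with a zero-length placeholder file in a scratch directory, can be sketched with java.nio.file in place of Hive's HiveOutputFormat and scratch-dir machinery; the file-name scheme below is hypothetical:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class EmptyFileSketch {
    // Analogue of createEmptyFile: make a zero-length placeholder the job can
    // read instead of the missing partition input.
    public static Path createEmptyFile(Path scratchDir, int sequenceNumber) throws IOException {
        Path newPath = scratchDir.resolve("emptyFile-" + sequenceNumber);
        Files.createFile(newPath);
        return newPath;
    }

    public static void main(String[] args) throws IOException {
        Path scratch = Files.createTempDirectory("hive-scratch");
        Path empty = createEmptyFile(scratch, 0);
        System.out.println(Files.size(empty)); // 0
    }
}
```

The remaining work in the real method, rewiring pathToAliases and pathToPartitionInfo from the old path to the new one, is plain map bookkeeping and is omitted here.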


Example 5: appendWithQuotes

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
private static StringBuilder appendWithQuotes(StringBuilder sb, String value) {
  return sb == null ? null : sb.append(SerDeUtils.QUOTE).append(value).append(SerDeUtils.QUOTE);
}
 
Developer: prestodb | Project: presto-hive-apache | Lines: 4 | Source: JsonSerDe.java


Example 6: getDeserializer

import org.apache.hadoop.hive.serde2.SerDeUtils; // import the required package/class
/**
 * getDeserializer
 *
 * Get the Deserializer for a table given its name and properties.
 *
 * @param conf
 *          hadoop config
 * @param schema
 *          the properties to use to instantiate the deserializer
 * @return
 *   Returns instantiated deserializer by looking up class name of deserializer stored in passed
 *   in properties. Also, initializes the deserializer with schema stored in passed in properties.
 * @exception MetaException
 *              if any problem occurs while instantiating the Deserializer
 *
 *              todo - this should move somewhere into serde.jar
 *
 */
static public Deserializer getDeserializer(Configuration conf,
    Properties schema) throws MetaException {
  String lib = schema
      .getProperty(org.apache.hadoop.hive.serde.serdeConstants.SERIALIZATION_LIB);
  try {
    Deserializer deserializer = SerDeUtils.lookupDeserializer(lib);
    (deserializer).initialize(conf, schema);
    return deserializer;
  } catch (Exception e) {
    log.error(e, "error in initSerDe: %s %s", e.getClass().getName(), e.getMessage());
    MetaStoreUtils.printStackTrace(e);
    throw new MetaException(e.getClass().getName() + " " + e.getMessage());
  }
}
 
Developer: facebookarchive | Project: swift-hive-metastore | Lines: 33 | Source: MetaStoreUtils.java
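The lookup-by-class-name pattern behind getDeserializer can be sketched with plain reflection. The property key "serialization.lib" mirrors serdeConstants.SERIALIZATION_LIB, and java.util.ArrayList serves as a stand-in class since Hive is not on the classpath here; SerDeUtils.lookupDeserializer does essentially this plus caching and initialization:

```java
import java.util.Properties;

public class LookupSketch {
    // Read a class name from the table properties and instantiate it reflectively,
    // as getDeserializer does via SerDeUtils.lookupDeserializer.
    public static Object lookup(Properties schema) throws Exception {
        String lib = schema.getProperty("serialization.lib");
        if (lib == null) {
            throw new IllegalArgumentException("serialization.lib not set");
        }
        return Class.forName(lib).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Properties schema = new Properties();
        schema.setProperty("serialization.lib", "java.util.ArrayList");
        Object inst = lookup(schema);
        System.out.println(inst.getClass().getName()); // java.util.ArrayList
    }
}
```

Wrapping every failure into MetaException, as the example does, keeps the metastore API surface narrow while preserving the original class name and message for diagnosis.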



Note: the org.apache.hadoop.hive.serde2.SerDeUtils examples in this article are collected from open-source projects hosted on GitHub and similar platforms. Copyright of each snippet remains with its original author; consult the corresponding project's license before reusing or redistributing the code. Do not republish without permission.

