Java HCatSchema Class Code Examples


This article collects typical code examples of the Java class org.apache.hive.hcatalog.data.schema.HCatSchema. If you are looking for how HCatSchema is used in practice, the curated examples below should help.



The HCatSchema class belongs to the org.apache.hive.hcatalog.data.schema package. The following sections show 19 code examples of the class, ordered by popularity.
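
Before the examples, a minimal sketch of constructing an HCatSchema by hand may help with orientation; the column names and comments are illustrative, not taken from any project below:

import java.util.ArrayList;
import java.util.List;

import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

public class HCatSchemaDemo {
  public static void main(String[] args) throws HCatException {
    // Assemble a two-column schema: (id INT, msg STRING).
    List<HCatFieldSchema> fields = new ArrayList<HCatFieldSchema>();
    fields.add(new HCatFieldSchema("id", HCatFieldSchema.Type.INT, "row id"));
    fields.add(new HCatFieldSchema("msg", HCatFieldSchema.Type.STRING, "payload"));

    HCatSchema schema = new HCatSchema(fields);
    System.out.println(schema.getFieldNames());      // [id, msg]
    System.out.println(schema.get("msg").getType()); // STRING
  }
}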

Example 1: getGeneralSchemaFromHCatFieldSchema

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public static IField getGeneralSchemaFromHCatFieldSchema( final HCatFieldSchema hCatFieldSchema ) throws IOException{
  if( hCatFieldSchema.getCategory() == HCatFieldSchema.Category.ARRAY ){
    HCatSchema arrayElementSchema = hCatFieldSchema.getArrayElementSchema();
    return new ArrayContainerField( hCatFieldSchema.getName() , getGeneralSchemaFromHCatFieldSchema( arrayElementSchema.get(0) ) );
  }
  else if( hCatFieldSchema.getCategory() == HCatFieldSchema.Category.MAP ){
    HCatSchema mapValueElementSchema = hCatFieldSchema.getMapValueSchema();
    return new MapContainerField( hCatFieldSchema.getName() , getGeneralSchemaFromHCatFieldSchema( mapValueElementSchema.get(0) ) );
  }
  else if( hCatFieldSchema.getCategory() == HCatFieldSchema.Category.STRUCT ){
    HCatSchema structSchema = hCatFieldSchema.getStructSubSchema();
    StructContainerField field = new StructContainerField( hCatFieldSchema.getName() );
    for( int i = 0 ; i < structSchema.size() ; i++ ){
      field.set( getGeneralSchemaFromHCatFieldSchema( structSchema.get(i) ) );
    }
    return field;
  }
  else if( hCatFieldSchema.getCategory() == HCatFieldSchema.Category.PRIMITIVE ){
    TypeInfo typeInfo = hCatFieldSchema.getTypeInfo();
    return HiveSchemaFactory.getGeneralSchema( hCatFieldSchema.getName() , typeInfo );
  }
  else{
    throw new IOException( "Unknown HCatalog field type : " + hCatFieldSchema.toString() );
  }
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 26, Source: HCatalogSchemaFactory.java


Example 2: generateHCatRecords

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
private List<HCatRecord> generateHCatRecords(int numRecords,
  HCatSchema hCatTblSchema, ColumnGenerator... extraCols) throws Exception {
  List<HCatRecord> records = new ArrayList<HCatRecord>();
  List<HCatFieldSchema> hCatTblCols = hCatTblSchema.getFields();
  int size = hCatTblCols.size();
  for (int i = 0; i < numRecords; ++i) {
    DefaultHCatRecord record = new DefaultHCatRecord(size);
    record.set(hCatTblCols.get(0).getName(), hCatTblSchema, i);
    record.set(hCatTblCols.get(1).getName(), hCatTblSchema, "textfield" + i);
    int idx = 0;
    for (int j = 0; j < extraCols.length; ++j) {
      if (extraCols[j].getKeyType() == KeyType.STATIC_KEY) {
        continue;
      }
      record.set(hCatTblCols.get(idx + 2).getName(), hCatTblSchema,
        extraCols[j].getHCatValue(i));
      ++idx;
    }

    records.add(record);
  }
  return records;
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 24, Source: HCatalogTestUtils.java
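
The record.set(name, schema, value) calls above resolve each field's position by name through the schema. A self-contained sketch of that pattern (schema and values invented for illustration, not part of the test utils above):

import java.util.Arrays;

import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.DefaultHCatRecord;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

public class RecordDemo {
  public static void main(String[] args) throws HCatException {
    HCatSchema schema = new HCatSchema(Arrays.asList(
        new HCatFieldSchema("id", HCatFieldSchema.Type.INT, null),
        new HCatFieldSchema("msg", HCatFieldSchema.Type.STRING, null)));

    // Size the record to the schema; set() looks up positions by field name.
    DefaultHCatRecord record = new DefaultHCatRecord(schema.size());
    record.set("id", schema, 0);
    record.set("msg", schema, "textfield0");
    System.out.println(record);
  }
}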


Example 3: readObject

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
@SuppressWarnings("unchecked")
private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
	this.fieldNames = new String[in.readInt()];
	for (int i = 0; i < this.fieldNames.length; i++) {
		this.fieldNames[i] = in.readUTF();
	}

	Configuration configuration = new Configuration();
	configuration.readFields(in);

	if (this.configuration == null) {
		this.configuration = configuration;
	}

	this.hCatInputFormat = new org.apache.hive.hcatalog.mapreduce.HCatInputFormat();
	this.outputSchema = (HCatSchema) HCatUtil.deserialize(this.configuration.get("mapreduce.lib.hcat.output.schema"));
}
 
Developer: axbaretto, Project: flink, Lines: 18, Source: HCatInputFormatBase.java
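
The string read from the Configuration in readObject was produced by HCatUtil.serialize. A minimal round-trip sketch, assuming only the HCatalog core classes on the classpath:

import java.io.IOException;
import java.util.Arrays;

import org.apache.hive.hcatalog.common.HCatUtil;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

public class SerializeDemo {
  public static void main(String[] args) throws IOException {
    HCatSchema schema = new HCatSchema(Arrays.asList(
        new HCatFieldSchema("id", HCatFieldSchema.Type.INT, null)));

    // HCatUtil encodes the schema as a String, which fits in a job Configuration.
    String encoded = HCatUtil.serialize(schema);
    HCatSchema decoded = (HCatSchema) HCatUtil.deserialize(encoded);
    System.out.println(decoded.getFieldNames()); // [id]
  }
}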


Example 4: extractPartInfo

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
private static PartInfo extractPartInfo(HCatSchema schema, StorageDescriptor sd,
                    Map<String, String> parameters, Configuration conf,
                    InputJobInfo inputJobInfo) throws IOException {

  StorerInfo storerInfo = InternalUtil.extractStorerInfo(sd, parameters);

  Properties hcatProperties = new Properties();
  HiveStorageHandler storageHandler = HCatUtil.getStorageHandler(conf, storerInfo);

  // copy the properties from storageHandler to jobProperties
  Map<String, String> jobProperties =
      HCatRSUtil.getInputJobProperties(storageHandler, inputJobInfo);

  for (String key : parameters.keySet()) {
    hcatProperties.put(key, parameters.get(key));
  }
  // FIXME
  // Bloating partinfo with inputJobInfo is not good
  return new PartInfo(schema, storageHandler, sd.getLocation(),
    hcatProperties, jobProperties, inputJobInfo.getTableInfo());
}
 
Developer: cloudera, Project: RecordServiceClient, Lines: 22, Source: InitializeInput.java


Example 5: HCatTableInfo

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
/**
 * Initializes a new HCatTableInfo instance to be used with
 * {@link org.apache.hive.hcatalog.mapreduce.HCatInputFormat} for reading data from
 * a table. When working with Hadoop security, pass the Kerberos principal name
 * of the server, else null. The principal name should be of the form:
 * <servicename>/_HOST@<realm>, e.g. "hcat/_HOST@myrealm.com".
 * The special string _HOST will be replaced automatically with the correct host name
 * @param databaseName the db name
 * @param tableName the table name
 * @param dataColumns schema of columns which contain data
 * @param partitionColumns schema of partition columns
 * @param storerInfo information about storage descriptor
 * @param table hive metastore table class
 */
HCatTableInfo(
  String databaseName,
  String tableName,
  HCatSchema dataColumns,
  HCatSchema partitionColumns,
  StorerInfo storerInfo,
  Table table) {
  this.databaseName = (databaseName == null) ?
      MetaStoreUtils.DEFAULT_DATABASE_NAME : databaseName;
  this.tableName = tableName;
  this.dataColumns = dataColumns;
  this.table = table;
  this.storerInfo = storerInfo;
  this.partitionColumns = partitionColumns;
}
 
Developer: cloudera, Project: RecordServiceClient, Lines: 30, Source: HCatTableInfo.java


Example 6: getHCatSchema

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
HCatSchema getHCatSchema(List<RequiredField> fields, String signature,
                         Class<?> classForUDFCLookup) throws IOException {
  if (fields == null) {
    return null;
  }

  Properties props = UDFContext.getUDFContext().getUDFProperties(
    classForUDFCLookup, new String[]{signature});
  HCatSchema hcatTableSchema = (HCatSchema) props.get(HCatConstants.HCAT_TABLE_SCHEMA);

  ArrayList<HCatFieldSchema> fcols = new ArrayList<HCatFieldSchema>();
  for (RequiredField rf : fields) {
    fcols.add(hcatTableSchema.getFields().get(rf.getIndex()));
  }
  return new HCatSchema(fcols);
}
 
Developer: cloudera, Project: RecordServiceClient, Lines: 17, Source: PigHCatUtil.java


Example 7: getSchema

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
@Override
public ResourceSchema getSchema(String location, Job job) throws IOException {
  HCatContext.INSTANCE.setConf(job.getConfiguration()).getConf().get()
    .setBoolean(HCatConstants.HCAT_DATA_TINY_SMALL_INT_PROMOTION, true);
  Table table = phutil.getTable(location,
    hcatServerUri != null ? hcatServerUri : PigHCatUtil.getHCatServerUri(job),
    PigHCatUtil.getHCatServerPrincipal(job),

    // Pass job to initialize metastore conf overrides for embedded metastore case
    // (hive.metastore.uris = "").
    job);
  HCatSchema hcatTableSchema = HCatUtil.getTableSchemaWithPtnCols(table);
  try {
    PigHCatUtil.validateHCatTableSchemaFollowsPigRules(hcatTableSchema);
  } catch (IOException e) {
    throw new PigException(
      "Table schema incompatible for reading through HCatLoader :" + e.getMessage()
        + ";[Table schema was " + hcatTableSchema.toString() + "]"
      , PigHCatUtil.PIG_EXCEPTION_CODE, e);
  }
  storeInUDFContext(signature, HCatConstants.HCAT_TABLE_SCHEMA, hcatTableSchema);
  outputSchema = hcatTableSchema;
  return PigHCatUtil.getResourceSchema(hcatTableSchema);
}
 
Developer: cloudera, Project: RecordServiceClient, Lines: 25, Source: HCatRSLoader.java


Example 8: mapper_emits_values_by_reading_schema_from_hcatalog

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
@Test
public void mapper_emits_values_by_reading_schema_from_hcatalog() throws Exception {
    HCatSchema hCatSchema = buildTestSchema();
    provider.setSchema(hCatSchema);

    // setup hcatrecords. We care most about 15 and 48
    List<DefaultHCatRecord> hCatRecords = new ArrayList<DefaultHCatRecord>();
    for (String[] rec : readFile("src/test/resources/sample.txt")) {
        DefaultHCatRecord hcr = new DefaultHCatRecord(rec.length);
        for (int i = 0; i < rec.length; i++) {
            hcr.set(i, rec[i]);
        }
        hCatRecords.add(hcr);
        mapDriver.withInput(new LongWritable(), hcr);
    }

    mapDriver.withOutput(new Text("CA"), new Text("69"));
    mapDriver.withOutput(new Text("TX"), new Text("14.38"));
    mapDriver.withOutput(new Text("MI"), new Text("13.53"));
    mapDriver.withOutput(new Text("CA"), new Text("18.3"));
    mapDriver.withOutput(new Text("IL"), new Text("16.06"));
    mapDriver.runTest();
}
 
Developer: mmiklavc, Project: hadoop-testing, Lines: 24, Source: CMSTopStateTest.java
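
The test drives the mapper through MRUnit's MapDriver: each withInput record is fed to the mapper and the withOutput pairs are asserted in order. A stripped-down sketch of that harness with a trivial stand-in mapper (the real test's mapper and column positions are not shown in the snippet above):

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;

public class MapDriverSketch {
  // Stand-in mapper: emits the input line with a constant value.
  static class EchoMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      context.write(value, new Text("1"));
    }
  }

  public static void main(String[] args) throws IOException {
    MapDriver<LongWritable, Text, Text, Text> mapDriver =
        MapDriver.newMapDriver(new EchoMapper());
    mapDriver.withInput(new LongWritable(0), new Text("CA"));
    mapDriver.withOutput(new Text("CA"), new Text("1"));
    mapDriver.runTest(); // fails if the emitted pairs differ from expectations
  }
}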


Example 9: parse

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
@Override
public List<Object[]> parse(File file, HCatSchema schema, List<String> names) {
  try {
    List<String> lines = Files.readAllLines(file.toPath(), charset);

    if (this.hasHeader) {
      lines = lines.subList(1, lines.size());
    }

    List<Object[]> records = new ArrayList<>(lines.size());
    for (String line : lines) {
      records.add(parseRow(line, names.size()));
    }
    return records;
  } catch (IOException e) {
    throw new RuntimeException("Error while reading file", e);
  }
}
 
Developer: klarna, Project: HiveRunner, Lines: 19, Source: TsvFileParser.java
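
parseRow is a private helper not included in the snippet; from the call site one can infer it splits a line on the delimiter and pads the row to names.size() columns. A hypothetical plain-Java sketch of that contract (the actual HiveRunner implementation may differ):

import java.util.Arrays;

public class ParseRowSketch {
  // Split a TSV line into exactly `columns` values, padding missing cells with null.
  static Object[] parseRow(String line, int columns) {
    String[] parts = line.split("\t", -1); // -1 keeps trailing empty cells
    Object[] row = new Object[columns];
    for (int i = 0; i < columns && i < parts.length; i++) {
      row[i] = parts[i];
    }
    return row;
  }

  public static void main(String[] args) {
    System.out.println(Arrays.toString(parseRow("a\tb", 3))); // [a, b, null]
  }
}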


Example 10: HCatalogMapParser

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public HCatalogMapParser( final Map<Object,Object> record , final HCatSchema schema ) throws IOException{
  this.schema = schema;
  this.record = record;

  childSchema = schema.get(0);
  childConverter = HCatalogPrimitiveConverterFactory.get( childSchema );
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 8, Source: HCatalogMapParser.java


Example 11: HCatalogStructParser

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public HCatalogStructParser( final List<Object> record , final HCatSchema schema ) throws IOException{
  this.record = record;

  fieldIndexMap = new HashMap<String,Integer>();
  converterList = new ArrayList<IHCatalogPrimitiveConverter>();
  schemaList = new ArrayList<HCatFieldSchema>();

  for( int i = 0 ; i < schema.size() ; i++ ){
    HCatFieldSchema fieldSchema = schema.get(i);
    fieldIndexMap.put( fieldSchema.getName() , Integer.valueOf(i) );
    converterList.add( HCatalogPrimitiveConverterFactory.get( fieldSchema ) );
    schemaList.add( schema.get(i) );
  }
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 15, Source: HCatalogStructParser.java


Example 12: HCatalogRootParser

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public HCatalogRootParser( final HCatRecord record , final HCatSchema schema ) throws IOException{
  this.record = record;

  fieldIndexMap = new HashMap<String,Integer>();
  converterList = new ArrayList<IHCatalogPrimitiveConverter>();
  schemaList = new ArrayList<HCatFieldSchema>();

  for( int i = 0 ; i < schema.size() ; i++ ){
    HCatFieldSchema fieldSchema = schema.get(i);
    fieldIndexMap.put( fieldSchema.getName() , Integer.valueOf(i) );
    converterList.add( HCatalogPrimitiveConverterFactory.get( fieldSchema ) );
    schemaList.add( schema.get(i) );
  }
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 15, Source: HCatalogRootParser.java


Example 13: HCatalogArrayParser

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public HCatalogArrayParser( final List<Object> record , final HCatSchema schema ) throws IOException{
  this.schema = schema;
  this.record = record;

  childSchema = schema.get(0);
  childConverter = HCatalogPrimitiveConverterFactory.get( childSchema );
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 8, Source: HCatalogArrayParser.java


Example 14: getGeneralSchema

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public static IField getGeneralSchema( final HCatSchema hCatSchema ) throws IOException{
  StructContainerField schema = new StructContainerField( "hcat_schema" );
  for( int i = 0 ; i < hCatSchema.size() ; i++ ){
    schema.set( getGeneralSchemaFromHCatFieldSchema( hCatSchema.get( i ) ) );
  }
  return schema;
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 8, Source: HCatalogSchemaFactory.java


Example 15: getHCatSchema

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public static HCatSchema getHCatSchema( final IField schema ) throws IOException{
  if( !( schema instanceof StructContainerField ) ){
    throw new IOException( "Root schema is struct only." );
  }
  StructContainerField structSchema = (StructContainerField)schema;
  
  List<HCatFieldSchema> hCatSchemaList = new ArrayList<HCatFieldSchema>();
  for( String key : structSchema.getKeys() ){
    hCatSchemaList.add( getHCatFieldSchema( structSchema.get( key ) ) );
  }

  return new HCatSchema( hCatSchemaList );
}
 
Developer: yahoojapan, Project: dataplatform-schema-lib, Lines: 14, Source: HCatalogSchemaFactory.java


Example 16: runHCatImport

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
protected void runHCatImport(List<String> addlArgsArray,
  int totalRecords, String table, ColumnGenerator[] cols,
  String[] cNames, boolean dontCreate, boolean isQuery) throws Exception {
  CreateMode mode = CreateMode.CREATE;
  if (dontCreate) {
    mode = CreateMode.NO_CREATION;
  }
  HCatSchema tblSchema =
    utils.createHCatTable(mode, totalRecords, table, cols);
  utils.createSqlTable(getConnection(), false, totalRecords, table, cols);
  addlArgsArray.add("-m");
  addlArgsArray.add("1");
  addlArgsArray.add("--hcatalog-table");
  addlArgsArray.add(table);
  String[] colNames = null;
  if (cNames != null) {
    colNames = cNames;
  } else {
    colNames = new String[2 + cols.length];
    colNames[0] = "ID";
    colNames[1] = "MSG";
    for (int i = 0; i < cols.length; ++i) {
      colNames[2 + i] =  cols[i].getName().toUpperCase();
    }
  }
  String[] importArgs;
  if (isQuery) {
    importArgs = getQueryArgv(true, colNames, new Configuration());
  } else {
    importArgs = getArgv(true, colNames, new Configuration());
  }
  LOG.debug("Import args = " + Arrays.toString(importArgs));
  SqoopHCatUtilities.instance().setConfigured(false);
  runImport(new ImportTool(), importArgs);
  List<HCatRecord> recs = utils.readHCatRecords(null, table, null);
  LOG.debug("HCat records ");
  LOG.debug(utils.hCatRecordDump(recs, tblSchema));
  validateHCatRecords(recs, tblSchema, 10, cols);
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 40, Source: HCatalogImportTest.java


Example 17: loadHCatTable

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
private void loadHCatTable(HCatSchema hCatSchema, String table,
  int count, ColumnGenerator... extraCols)
  throws Exception {
  Map<String, String> staticKeyMap = new HashMap<String, String>();
  for (ColumnGenerator col : extraCols) {
    if (col.getKeyType() == KeyType.STATIC_KEY) {
      staticKeyMap.put(col.getName(), (String) col.getHCatValue(0));
    }
  }
  loadHCatTable(null, table, staticKeyMap,
    hCatSchema, generateHCatRecords(count, hCatSchema, extraCols));
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 13, Source: HCatalogTestUtils.java


Example 18: hCatRecordDump

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
public String hCatRecordDump(List<HCatRecord> recs,
  HCatSchema schema) throws Exception {
  List<String> fields = schema.getFieldNames();
  int count = 0;
  StringBuilder sb = new StringBuilder(1024);
  for (HCatRecord rec : recs) {
    sb.append("HCat Record : " + ++count).append('\n');
    for (String field : fields) {
      sb.append('\t').append(field).append('=');
      sb.append(rec.get(field, schema)).append('\n');
    }
    // Separate records (not individual fields) with a blank line.
    sb.append("\n\n");
  }
  return sb.toString();
}
 
Developer: aliyun, Project: aliyun-maxcompute-data-collectors, Lines: 16, Source: HCatalogTestUtils.java


Example 19: getFields

import org.apache.hive.hcatalog.data.schema.HCatSchema; // import the required package/class
/**
 * Specifies the fields which are returned by the InputFormat and their order.
 *
 * @param fields The fields and their order which are returned by the InputFormat.
 * @return This InputFormat with specified return fields.
 * @throws java.io.IOException
 */
public HCatInputFormatBase<T> getFields(String... fields) throws IOException {

	// build output schema
	ArrayList<HCatFieldSchema> fieldSchemas = new ArrayList<HCatFieldSchema>(fields.length);
	for (String field : fields) {
		fieldSchemas.add(this.outputSchema.get(field));
	}
	this.outputSchema = new HCatSchema(fieldSchemas);

	// update output schema configuration
	configuration.set("mapreduce.lib.hcat.output.schema", HCatUtil.serialize(outputSchema));

	return this;
}
 
Developer: axbaretto, Project: flink, Lines: 22, Source: HCatInputFormatBase.java
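
For context, a hedged usage sketch of this projection API, assuming Flink's concrete org.apache.flink.hcatalog.java.HCatInputFormat subclass and an existing table mydb.mytable (database, table, and column names are assumptions; the constructor contacts the Hive metastore, so actually running this needs a live metastore):

import java.io.IOException;

import org.apache.flink.hcatalog.HCatInputFormatBase;
import org.apache.flink.hcatalog.java.HCatInputFormat;
import org.apache.hive.hcatalog.data.HCatRecord;

public class ProjectionSketch {
  public static void main(String[] args) throws IOException {
    // Restrict the read to two columns; getFields() rebuilds outputSchema and
    // re-serializes it into the job configuration, as shown above.
    HCatInputFormatBase<HCatRecord> format =
        new HCatInputFormat<HCatRecord>("mydb", "mytable")
            .getFields("id", "msg");
  }
}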



Note: The org.apache.hive.hcatalog.data.schema.HCatSchema examples above were collected from open-source projects hosted on platforms such as GitHub. The code snippets remain the copyright of their original authors; consult each project's license before redistributing or reusing them.

