
Java LazyString Class Code Examples


This article collects typical usage examples of the Java class org.apache.hadoop.hive.serde2.lazy.LazyString. If you have been wondering what exactly the LazyString class does, how to use it, or where to find examples of it in practice, the curated class code examples below should help.



The LazyString class belongs to the org.apache.hadoop.hive.serde2.lazy package. Nine code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
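For context before the examples: Hive's "lazy" SerDe objects wrap a region of a raw byte buffer and defer decoding until the value is actually accessed, which is what the examples below exercise via init(ByteArrayRef, start, length). The following self-contained sketch illustrates that deferred-decode pattern in plain Java; the LazyDecodedString class is hypothetical, written purely for illustration, and is not part of Hive's API:

```java
import java.nio.charset.StandardCharsets;

// Minimal sketch of the deferred-decode pattern behind Hive's LazyString.
// LazyDecodedString is a hypothetical illustration class, not Hive's API.
public class LazyDecodedString {
    private byte[] data;
    private int start;
    private int length;
    private String decoded; // cached result of the first decode

    // Point at a region of a shared row buffer without decoding it yet.
    public void init(byte[] data, int start, int length) {
        this.data = data;
        this.start = start;
        this.length = length;
        this.decoded = null; // drop any value cached from a previous row
    }

    // Decode the bytes as UTF-8 on first access, then serve the cached String.
    public String get() {
        if (decoded == null) {
            decoded = new String(data, start, length, StandardCharsets.UTF_8);
        }
        return decoded;
    }

    public static void main(String[] args) {
        // A row buffer whose fields are separated by the 0x01 byte,
        // as in Hive's default text format.
        byte[] row = "a\u0001hello\u0001b".getBytes(StandardCharsets.UTF_8);
        LazyDecodedString field = new LazyDecodedString();
        field.init(row, 2, 5); // the "hello" region: offset 2, length 5
        System.out.println(field.get());
    }
}
```

Hive's real LazyString adds escape-character handling and a Text-backed writable on top of this, but the buffer/offset/length init followed by on-demand decoding is the core idea the examples below rely on.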

Example 1: lazyString

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Nonnull
public static LazyString lazyString(@Nonnull final String str,
        @Nonnull final LazyStringObjectInspector oi) {
    LazyString lazy = new LazyString(oi);
    ByteArrayRef ref = new ByteArrayRef();
    byte[] data = str.getBytes(StandardCharsets.UTF_8);
    ref.setData(data);
    lazy.init(ref, 0, data.length);
    return lazy;
}
 
Developer: apache | Project: incubator-hivemall | Lines: 11 | Source: HiveUtils.java


Example 2: testLazyStringFeature

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void testLazyStringFeature() throws Exception {
    LazyStringObjectInspector oi = LazyPrimitiveObjectInspectorFactory.getLazyStringObjectInspector(
        false, (byte) 0);
    List<LazyString> x = Arrays.asList(lazyString("テスト:-2", oi), lazyString("漢字:-333.0", oi),
        lazyString("test:-1", oi));
    testFeature(x, oi, LazyString.class, String.class);
}
 
Developer: apache | Project: incubator-hivemall | Lines: 9 | Source: GeneralClassifierUDTFTest.java


Example 3: extractPigTypeFromHiveType

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
/**
 * Converts a Hive lazy type to the corresponding Pig type.
 *
 * @param value Object of a Hive lazy type
 * @return Object of the corresponding Pig type
 */
public static Object extractPigTypeFromHiveType(Object value) {
    if (value instanceof org.apache.hadoop.hive.serde2.lazy.LazyArray) {
        value = parseLazyArrayToPigArray((org.apache.hadoop.hive.serde2.lazy.LazyArray) value);
    } else if (value instanceof org.apache.hadoop.hive.serde2.lazy.LazyMap) {
        value = parseLazyMapToPigMap((org.apache.hadoop.hive.serde2.lazy.LazyMap) value);
    } else if (value instanceof LazyString) {
        value = ((LazyString) value).getWritableObject().toString();
    } else if (value instanceof LazyInteger) {
        value = ((LazyInteger) value).getWritableObject().get();
    } else if (value instanceof LazyLong) {
        value = ((LazyLong) value).getWritableObject().get();
    } else if (value instanceof LazyFloat) {
        value = ((LazyFloat) value).getWritableObject().get();
    } else if (value instanceof LazyDouble) {
        value = ((LazyDouble) value).getWritableObject().get();
    } else if (value instanceof LazyBoolean) {
        // represent the boolean as 1/0 on the Pig side
        boolean boolValue = ((LazyBoolean) value).getWritableObject().get();
        value = boolValue ? 1 : 0;
    } else if (value instanceof LazyByte) {
        value = (int) ((LazyByte) value).getWritableObject().get();
    } else if (value instanceof LazyShort) {
        value = ((LazyShort) value).getWritableObject().get();
    }

    return value;
}
 
Developer: sigmoidanalytics | Project: spork-streaming | Lines: 40 | Source: HiveRCSchemaUtil.java


Example 4: testShouldStoreRowInHiveFormat

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void testShouldStoreRowInHiveFormat() throws IOException, InterruptedException, SerDeException {
    String loadString = "org.apache.pig.piggybank.storage.HiveColumnarLoader('f1 string,f2 string,f3 string')";
    String storeString = "org.apache.pig.piggybank.storage.HiveColumnarStorage()";

    String singlePartitionedFile = simpleDataFile.getAbsolutePath();
    File outputFile = new File("testhiveColumnarStore");

    PigServer server = new PigServer(ExecType.LOCAL);
    server.setBatchOn();
    server.registerQuery("a = LOAD '" + Util.encodeEscape(singlePartitionedFile) + "' using " + loadString
            + ";");

    //when
    server.store("a", outputFile.getAbsolutePath(), storeString);

    //then
    Path outputPath = new Path(outputFile.getAbsolutePath()+"/part-m-00000.rc");

    ColumnarStruct struct = readRow(outputFile, outputPath, "f1 string,f2 string,f3 string");

    assertEquals(3, struct.getFieldsAsList().size());
    Object o = struct.getField(0);
    assertEquals(LazyString.class, o.getClass());
    o = struct.getField(1);
    assertEquals(LazyString.class, o.getClass());
    o = struct.getField(2);
    assertEquals(LazyString.class, o.getClass());

}
 
Developer: sigmoidanalytics | Project: spork-streaming | Lines: 31 | Source: TestHiveColumnarStorage.java


Example 5: testShouldStoreTupleAsHiveArray

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void testShouldStoreTupleAsHiveArray() throws IOException, InterruptedException, SerDeException {
    String loadString = "org.apache.pig.piggybank.storage.HiveColumnarLoader('f1 string,f2 string,f3 string')";
    String storeString = "org.apache.pig.piggybank.storage.HiveColumnarStorage()";

    String singlePartitionedFile = simpleDataFile.getAbsolutePath();
    File outputFile = new File("testhiveColumnarStore");

    PigServer server = new PigServer(ExecType.LOCAL);
    server.setBatchOn();
    server.registerQuery("a = LOAD '" + Util.encodeEscape(singlePartitionedFile) + "' using " + loadString
            + ";");
    server.registerQuery("b = FOREACH a GENERATE f1, TOTUPLE(f2,f3);");

    //when
    server.store("b", outputFile.getAbsolutePath(), storeString);

    //then
    Path outputPath = new Path(outputFile.getAbsolutePath()+"/part-m-00000.rc");

    ColumnarStruct struct = readRow(outputFile, outputPath, "f1 string,f2 array<string>");

    assertEquals(2, struct.getFieldsAsList().size());
    Object o = struct.getField(0);
    assertEquals(LazyString.class, o.getClass());
    o = struct.getField(1);
    assertEquals(LazyArray.class, o.getClass());

    LazyArray arr = (LazyArray) o;
    List<Object> values = arr.getList();
    for (Object value : values) {
        assertEquals(LazyString.class, value.getClass());
        String valueStr = ((LazyString) value).getWritableObject().toString();
        assertEquals("Sample value", valueStr);
    }

}
 
Developer: sigmoidanalytics | Project: spork-streaming | Lines: 38 | Source: TestHiveColumnarStorage.java


Example 6: testShouldStoreBagAsHiveArray

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void testShouldStoreBagAsHiveArray() throws IOException, InterruptedException, SerDeException {
    String loadString = "org.apache.pig.piggybank.storage.HiveColumnarLoader('f1 string,f2 string,f3 string')";
    String storeString = "org.apache.pig.piggybank.storage.HiveColumnarStorage()";

    String singlePartitionedFile = simpleDataFile.getAbsolutePath();
    File outputFile = new File("testhiveColumnarStore");

    PigServer server = new PigServer(ExecType.LOCAL);
    server.setBatchOn();
    server.registerQuery("a = LOAD '" + Util.encodeEscape(singlePartitionedFile) + "' using " + loadString
            + ";");
    server.registerQuery("b = FOREACH a GENERATE f1, TOBAG(f2,f3);");

    //when
    server.store("b", outputFile.getAbsolutePath(), storeString);

    //then
    Path outputPath = new Path(outputFile.getAbsolutePath()+"/part-m-00000.rc");

    ColumnarStruct struct = readRow(outputFile, outputPath, "f1 string,f2 array<string>");

    assertEquals(2, struct.getFieldsAsList().size());
    Object o = struct.getField(0);
    assertEquals(LazyString.class, o.getClass());
    o = struct.getField(1);
    assertEquals(LazyArray.class, o.getClass());

    LazyArray arr = (LazyArray) o;
    List<Object> values = arr.getList();
    for (Object value : values) {
        assertEquals(LazyString.class, value.getClass());
        String valueStr = ((LazyString) value).getWritableObject().toString();
        assertEquals("Sample value", valueStr);
    }

}
 
Developer: sigmoidanalytics | Project: spork-streaming | Lines: 38 | Source: TestHiveColumnarStorage.java


Example 7: createLazyPrimitiveClass

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
/**
 * Creates a lazy primitive object for the given primitive object inspector. For LONG and INT we use
 * CassandraLazyLong and CassandraLazyInteger instead of Hive's LazyObject implementations.
 */
public static LazyObject createLazyPrimitiveClass(
    PrimitiveObjectInspector oi) {
  PrimitiveCategory p = oi.getPrimitiveCategory();

  switch (p) {
    case BOOLEAN:
      return new CassandraLazyBoolean((LazyBooleanObjectInspector) oi);
    case BYTE:
      return new LazyByte((LazyByteObjectInspector) oi);
    case SHORT:
      return new LazyShort((LazyShortObjectInspector) oi);
    case INT:
      return new CassandraLazyInteger((LazyIntObjectInspector) oi);
    case LONG:
      return new CassandraLazyLong((LazyLongObjectInspector) oi);
    case FLOAT:
      return new CassandraLazyFloat((LazyFloatObjectInspector) oi);
    case DOUBLE:
      return new CassandraLazyDouble((LazyDoubleObjectInspector) oi);
    case STRING:
      return new LazyString((LazyStringObjectInspector) oi);
    case BINARY:
      return new CassandraLazyBinary((LazyBinaryObjectInspector) oi);
    case TIMESTAMP:
      return new CassandraLazyTimestamp((LazyTimestampObjectInspector) oi);
    default:
      throw new RuntimeException("Internal error: no LazyObject for " + p);
  }
}
 
Developer: dvasilen | Project: Hive-Cassandra | Lines: 34 | Source: CassandraLazyFactory.java


Example 8: testShouldStoreMapAsHiveMap

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void testShouldStoreMapAsHiveMap() throws IOException, InterruptedException, SerDeException {
    String loadString = "org.apache.pig.piggybank.storage.HiveColumnarLoader('f1 string,f2 string,f3 string')";
    String storeString = "org.apache.pig.piggybank.storage.HiveColumnarStorage()";

    String singlePartitionedFile = simpleDataFile.getAbsolutePath();
    File outputFile = new File("testhiveColumnarStore");

    PigServer server = new PigServer(ExecType.LOCAL);
    server.setBatchOn();
    server.registerQuery("a = LOAD '" + Util.encodeEscape(singlePartitionedFile) + "' using " + loadString
            + ";");
    server.registerQuery("b = FOREACH a GENERATE f1, TOMAP(f2,f3);");

    //when
    server.store("b", outputFile.getAbsolutePath(), storeString);

    //then
    Path outputPath = new Path(outputFile.getAbsolutePath()+"/part-m-00000.rc");

    ColumnarStruct struct = readRow(outputFile, outputPath, "f1 string,f2 map<string,string>");

    assertEquals(2, struct.getFieldsAsList().size());
    Object o = struct.getField(0);
    assertEquals(LazyString.class, o.getClass());
    o = struct.getField(1);
    assertEquals(LazyMap.class, o.getClass());

    LazyMap map = (LazyMap) o;
    Map<Object, Object> values = map.getMap();
    for (Entry<Object, Object> entry : values.entrySet()) {
        assertEquals(LazyString.class, entry.getKey().getClass());
        assertEquals(LazyString.class, entry.getValue().getClass());

        String keyStr = ((LazyString) entry.getKey()).getWritableObject().toString();
        assertEquals("Sample value", keyStr);
        String valueStr = ((LazyString) entry.getValue()).getWritableObject().toString();
        assertEquals("Sample value", valueStr);
    }

}
 
Developer: sigmoidanalytics | Project: spork-streaming | Lines: 42 | Source: TestHiveColumnarStorage.java


Example 9: TestColumnTypes

import org.apache.hadoop.hive.serde2.lazy.LazyString; // import the required package/class
@Test
public void TestColumnTypes() throws Exception {
    ArrayList<Object> stuff = new ArrayList<Object>();
    Properties proptab = new Properties();
    proptab.setProperty(HiveShims.serdeConstants.LIST_COLUMNS, "flag,num1,num2,text");
    proptab.setProperty(HiveShims.serdeConstants.LIST_COLUMN_TYPES, "boolean,tinyint,smallint,string");
    AbstractSerDe jserde = mkSerDe(proptab);
    StructObjectInspector rowOI = (StructObjectInspector) jserde.getObjectInspector();

    // {"attributes":{"flag":false,"num":"5","text":"Point(15.0 5.0)"}}
    addWritable(stuff, false);
    addWritable(stuff, (byte) 2);
    addWritable(stuff, (short) 5);
    addWritable(stuff, "Point(15.0 5.0)");
    Object row = runSerDe(stuff, jserde, rowOI);
    Object fieldData = getField("flag", row, rowOI);
    Assert.assertEquals(false, ((BooleanWritable) fieldData).get());
    fieldData = getField("num1", row, rowOI);
    Assert.assertEquals((byte) 2, ((ByteWritable) fieldData).get());
    fieldData = getField("num2", row, rowOI);
    Assert.assertEquals((short) 5, ((ShortWritable) fieldData).get());
    fieldData = getField("text", row, rowOI);
    Assert.assertEquals("Point(15.0 5.0)", ((Text) fieldData).toString());

    stuff.set(0, new BooleanWritable(true));
    stuff.set(1, new ByteWritable((byte) 4));
    stuff.set(2, new ShortWritable((short) 4));
    //stuff.set(3, new Text("other"));
    LazyStringObjectInspector loi = LazyPrimitiveObjectInspectorFactory
        .getLazyStringObjectInspector(false, (byte) '\0');
    LazyString lstr = new LazyString(loi);
    ByteArrayRef bar = new ByteArrayRef();
    bar.setData("other".getBytes());
    lstr.init(bar, 0, 5);
    stuff.set(3, lstr);
    row = runSerDe(stuff, jserde, rowOI);
    fieldData = getField("flag", row, rowOI);
    Assert.assertEquals(true, ((BooleanWritable) fieldData).get());
    fieldData = getField("num1", row, rowOI);
    Assert.assertEquals((byte) 4, ((ByteWritable) fieldData).get());
    fieldData = getField("num2", row, rowOI);
    Assert.assertEquals((short) 4, ((ShortWritable) fieldData).get());
    fieldData = getField("text", row, rowOI);
    Assert.assertEquals("other", ((Text) fieldData).toString());
}
 
Developer: Esri | Project: spatial-framework-for-hadoop | Lines: 46 | Source: TestEsriJsonSerDe.java



Note: The org.apache.hadoop.hive.serde2.lazy.LazyString class examples in this article were compiled from open-source projects hosted on GitHub, MSDocs, and other source-code and documentation platforms. Copyright of the code snippets remains with their original authors; consult each project's license before distributing or using them. Please do not reproduce this article without permission.

