
Java JsonConverter Class Code Examples


This article collects typical usage examples of the Java class org.apache.kafka.connect.json.JsonConverter. If you are wondering what JsonConverter does or how to use it, the curated examples below may help.



The JsonConverter class belongs to the org.apache.kafka.connect.json package. Twelve code examples are shown below, sorted by popularity by default. You can upvote the examples you find useful; your feedback helps improve the recommendations.
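Nearly all of the examples below follow the same pattern: instantiate `JsonConverter`, disable schema envelopes with `schemas.enable=false`, and call `configure(...)` with the `isKey` flag. A minimal round-trip sketch of that pattern (the topic name `"demo"` and the payload are illustrative; assumes Kafka's connect-json artifact is on the classpath):

```java
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.json.JsonConverter;

public class JsonConverterRoundTrip {
    public static void main(String[] args) {
        JsonConverter converter = new JsonConverter();
        // isKey == false: this converter handles record values, not keys
        converter.configure(Collections.singletonMap("schemas.enable", "false"), false);

        // Serialize a schemaless value to raw JSON bytes (no schema envelope)
        byte[] json = converter.fromConnectData("demo", null, "hello");
        System.out.println(new String(json, StandardCharsets.UTF_8)); // prints "hello" (a JSON string)

        // Deserialize back to a Connect SchemaAndValue (schema is null when schemas are disabled)
        SchemaAndValue restored = converter.toConnectData("demo", json);
        System.out.println(restored.value());
    }
}
```

With `schemas.enable=true` (the default) the same call would instead emit a `{"schema": ..., "payload": ...}` envelope, which is why the sink/source connectors below turn it off.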

Example 1: JsonMessageBuilder

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public JsonMessageBuilder() {
    log.info("Building messages using com.ibm.mq.kafkaconnect.builders.JsonMessageBuilder");
    converter = new JsonConverter();
    
    // We just want the payload, not the schema in the output message
    HashMap<String, String> m = new HashMap<>();
    m.put("schemas.enable", "false");

    // Convert the value, not the key (isKey == false)
    converter.configure(m, false);
}
 
Developer: ibm-messaging | Project: kafka-connect-mq-sink | Lines: 12 | Source: JsonMessageBuilder.java


Example 2: JsonRecordBuilder

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public JsonRecordBuilder() {
    log.info("Building records using com.ibm.mq.kafkaconnect.builders.JsonRecordBuilder");
    converter = new JsonConverter();
    
    // We just want the payload, not the schema in the output message
    HashMap<String, String> m = new HashMap<>();
    m.put("schemas.enable", "false");

    // Convert the value, not the key (isKey == false)
    converter.configure(m, false);
}
 
Developer: ibm-messaging | Project: kafka-connect-mq-source | Lines: 12 | Source: JsonRecordBuilder.java


Example 3: expectConverters

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
private void expectConverters(Class<? extends Converter> converterClass) {
    // connector default
    Converter keyConverter = PowerMock.createMock(converterClass);
    Converter valueConverter = PowerMock.createMock(converterClass);
    //internal
    Converter internalKeyConverter = PowerMock.createMock(converterClass);
    Converter internalValueConverter = PowerMock.createMock(converterClass);

    // Instantiate and configure default
    EasyMock.expect(plugins.newConverter(JsonConverter.class.getName(), config))
            .andReturn(keyConverter);
    keyConverter.configure(
            EasyMock.<Map<String, ?>>anyObject(),
            EasyMock.anyBoolean()
    );
    EasyMock.expectLastCall();
    EasyMock.expect(plugins.newConverter(JsonConverter.class.getName(), config))
            .andReturn(valueConverter);
    valueConverter.configure(
            EasyMock.<Map<String, ?>>anyObject(),
            EasyMock.anyBoolean()
    );
    EasyMock.expectLastCall();

    // Instantiate and configure internal
    EasyMock.expect(plugins.newConverter(JsonConverter.class.getName(), config))
            .andReturn(internalKeyConverter);
    internalKeyConverter.configure(
            EasyMock.<Map<String, ?>>anyObject(),
            EasyMock.anyBoolean()
    );
    EasyMock.expectLastCall();
    EasyMock.expect(plugins.newConverter(JsonConverter.class.getName(), config))
            .andReturn(internalValueConverter);
    internalValueConverter.configure(
            EasyMock.<Map<String, ?>>anyObject(),
            EasyMock.anyBoolean()
    );
    EasyMock.expectLastCall();
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 41 | Source: WorkerTest.java


Example 4: JsonEventParser

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
/**
 * Default constructor.
 */
public JsonEventParser() {
    this.keyConverter = new JsonConverter();
    this.valueConverter = new JsonConverter();

    Map<String, String> props = new HashMap<>(1);
    props.put("schemas.enable", Boolean.FALSE.toString());

    this.keyConverter.configure(props, true);
    this.valueConverter.configure(props, false);

}
 
Developer: mravi | Project: kafka-connect-hbase | Lines: 15 | Source: JsonEventParser.java
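Example 4 configures two converters from the same properties map, one for record keys (`isKey == true`) and one for values (`isKey == false`). A hypothetical sketch of feeding raw Kafka bytes through such a pair (the topic name and payloads are illustrative, not from the original project):

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.json.JsonConverter;

public class KeyValueParseSketch {
    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("schemas.enable", Boolean.FALSE.toString());

        JsonConverter keyConverter = new JsonConverter();
        JsonConverter valueConverter = new JsonConverter();
        keyConverter.configure(props, true);    // isKey == true
        valueConverter.configure(props, false); // isKey == false

        // Raw JSON bytes as they would arrive in a Kafka record
        byte[] rawKey = "\"row-1\"".getBytes(StandardCharsets.UTF_8);
        byte[] rawValue = "{\"col\":42}".getBytes(StandardCharsets.UTF_8);

        // With schemas disabled, the parsed JSON comes back as a schemaless SchemaAndValue
        SchemaAndValue key = keyConverter.toConnectData("events", rawKey);
        SchemaAndValue value = valueConverter.toConnectData("events", rawValue);
        System.out.println(key.value() + " -> " + value.value());
    }
}
```

A JSON object deserializes to a `Map`, so downstream code (such as the HBase sink above) can pull out individual fields by name.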


Example 5: JsonFormat

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public JsonFormat(S3Storage storage) {
  this.storage = storage;
  this.converter = new JsonConverter();
  Map<String, Object> converterConfig = new HashMap<>();
  converterConfig.put("schemas.enable", "false");
  converterConfig.put(
      "schemas.cache.size",
      String.valueOf(storage.conf().get(S3SinkConnectorConfig.SCHEMA_CACHE_SIZE_CONFIG))
  );
  this.converter.configure(converterConfig, false);
}
 
Developer: confluentinc | Project: kafka-connect-storage-cloud | Lines: 12 | Source: JsonFormat.java


Example 6: setUp

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public void setUp() throws Exception {
  super.setUp();
  converter = new JsonConverter();
  converter.configure(Collections.singletonMap("schemas.enable", "false"), false);

  s3 = newS3Client(connectorConfig);
  storage = new S3Storage(connectorConfig, url, S3_TEST_BUCKET_NAME, s3);

  partitioner = new DefaultPartitioner<>();
  partitioner.configure(parsedConfig);
  format = new JsonFormat(storage);
  s3.createBucket(S3_TEST_BUCKET_NAME);
  assertTrue(s3.doesBucketExist(S3_TEST_BUCKET_NAME));
}
 
Developer: confluentinc | Project: kafka-connect-storage-cloud | Lines: 15 | Source: DataWriterJsonTest.java


Example 7: testAddRemoveTask

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
@Test
public void testAddRemoveTask() throws Exception {
    expectConverters();
    expectStartStorage();

    // Create
    TestSourceTask task = PowerMock.createMock(TestSourceTask.class);
    WorkerSourceTask workerTask = PowerMock.createMock(WorkerSourceTask.class);
    EasyMock.expect(workerTask.id()).andStubReturn(TASK_ID);

    EasyMock.expect(plugins.currentThreadLoader()).andReturn(delegatingLoader).times(2);
    PowerMock.expectNew(
            WorkerSourceTask.class, EasyMock.eq(TASK_ID),
            EasyMock.eq(task),
            EasyMock.anyObject(TaskStatus.Listener.class),
            EasyMock.eq(TargetState.STARTED),
            EasyMock.anyObject(JsonConverter.class),
            EasyMock.anyObject(JsonConverter.class),
            EasyMock.eq(TransformationChain.<SourceRecord>noOp()),
            EasyMock.anyObject(KafkaProducer.class),
            EasyMock.anyObject(OffsetStorageReader.class),
            EasyMock.anyObject(OffsetStorageWriter.class),
            EasyMock.eq(config),
            EasyMock.anyObject(ClassLoader.class),
            EasyMock.anyObject(Time.class))
            .andReturn(workerTask);
    Map<String, String> origProps = new HashMap<>();
    origProps.put(TaskConfig.TASK_CLASS_CONFIG, TestSourceTask.class.getName());

    TaskConfig taskConfig = new TaskConfig(origProps);
    // We should expect this call, but the pluginLoader being swapped in is only mocked.
    // EasyMock.expect(pluginLoader.loadClass(TestSourceTask.class.getName()))
    //        .andReturn((Class) TestSourceTask.class);
    EasyMock.expect(plugins.newTask(TestSourceTask.class)).andReturn(task);
    EasyMock.expect(task.version()).andReturn("1.0");

    workerTask.initialize(taskConfig);
    EasyMock.expectLastCall();
    // We should expect this call, but the pluginLoader being swapped in is only mocked.
    // Serializers for the Producer that the task generates. These are loaded while the PluginClassLoader is active
    // and then delegated to the system classloader. This is only called once due to caching
    // EasyMock.expect(pluginLoader.loadClass(ByteArraySerializer.class.getName()))
    //        .andReturn((Class) ByteArraySerializer.class);

    workerTask.run();
    EasyMock.expectLastCall();

    EasyMock.expect(plugins.delegatingLoader()).andReturn(delegatingLoader);
    EasyMock.expect(delegatingLoader.connectorLoader(WorkerTestConnector.class.getName()))
            .andReturn(pluginLoader);

    EasyMock.expect(Plugins.compareAndSwapLoaders(pluginLoader)).andReturn(delegatingLoader)
            .times(2);

    EasyMock.expect(workerTask.loader()).andReturn(pluginLoader);

    EasyMock.expect(Plugins.compareAndSwapLoaders(delegatingLoader)).andReturn(pluginLoader)
            .times(2);

    // Remove
    workerTask.stop();
    EasyMock.expectLastCall();
    EasyMock.expect(workerTask.awaitStop(EasyMock.anyLong())).andStubReturn(true);
    EasyMock.expectLastCall();

    expectStopStorage();

    PowerMock.replayAll();

    worker = new Worker(WORKER_ID, new MockTime(), plugins, config, offsetBackingStore);
    worker.start();
    assertEquals(Collections.emptySet(), worker.taskIds());
    worker.startTask(TASK_ID, anyConnectorConfigMap(), origProps, taskStatusListener, TargetState.STARTED);
    assertEquals(new HashSet<>(Arrays.asList(TASK_ID)), worker.taskIds());
    worker.stopAndAwaitTask(TASK_ID);
    assertEquals(Collections.emptySet(), worker.taskIds());
    // Nothing should be left, so this should effectively be a nop
    worker.stop();

    PowerMock.verifyAll();
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 82 | Source: WorkerTest.java


Example 8: testCleanupTasksOnStop

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
@Test
public void testCleanupTasksOnStop() throws Exception {
    expectConverters();
    expectStartStorage();

    // Create
    TestSourceTask task = PowerMock.createMock(TestSourceTask.class);
    WorkerSourceTask workerTask = PowerMock.createMock(WorkerSourceTask.class);
    EasyMock.expect(workerTask.id()).andStubReturn(TASK_ID);

    EasyMock.expect(plugins.currentThreadLoader()).andReturn(delegatingLoader).times(2);
    PowerMock.expectNew(
            WorkerSourceTask.class, EasyMock.eq(TASK_ID),
            EasyMock.eq(task),
            EasyMock.anyObject(TaskStatus.Listener.class),
            EasyMock.eq(TargetState.STARTED),
            EasyMock.anyObject(JsonConverter.class),
            EasyMock.anyObject(JsonConverter.class),
            EasyMock.eq(TransformationChain.<SourceRecord>noOp()),
            EasyMock.anyObject(KafkaProducer.class),
            EasyMock.anyObject(OffsetStorageReader.class),
            EasyMock.anyObject(OffsetStorageWriter.class),
            EasyMock.anyObject(WorkerConfig.class),
            EasyMock.eq(pluginLoader),
            EasyMock.anyObject(Time.class))
            .andReturn(workerTask);
    Map<String, String> origProps = new HashMap<>();
    origProps.put(TaskConfig.TASK_CLASS_CONFIG, TestSourceTask.class.getName());

    TaskConfig taskConfig = new TaskConfig(origProps);
    // We should expect this call, but the pluginLoader being swapped in is only mocked.
    // EasyMock.expect(pluginLoader.loadClass(TestSourceTask.class.getName()))
    //        .andReturn((Class) TestSourceTask.class);
    EasyMock.expect(plugins.newTask(TestSourceTask.class)).andReturn(task);
    EasyMock.expect(task.version()).andReturn("1.0");

    workerTask.initialize(taskConfig);
    EasyMock.expectLastCall();
    // We should expect this call, but the pluginLoader being swapped in is only mocked.
    // Serializers for the Producer that the task generates. These are loaded while the PluginClassLoader is active
    // and then delegated to the system classloader. This is only called once due to caching
    // EasyMock.expect(pluginLoader.loadClass(ByteArraySerializer.class.getName()))
    //        .andReturn((Class) ByteArraySerializer.class);

    workerTask.run();
    EasyMock.expectLastCall();

    EasyMock.expect(plugins.delegatingLoader()).andReturn(delegatingLoader);
    EasyMock.expect(delegatingLoader.connectorLoader(WorkerTestConnector.class.getName()))
            .andReturn(pluginLoader);

    EasyMock.expect(Plugins.compareAndSwapLoaders(pluginLoader)).andReturn(delegatingLoader)
            .times(2);

    EasyMock.expect(workerTask.loader()).andReturn(pluginLoader);

    EasyMock.expect(Plugins.compareAndSwapLoaders(delegatingLoader)).andReturn(pluginLoader)
            .times(2);

    // Remove on Worker.stop()
    workerTask.stop();
    EasyMock.expectLastCall();

    EasyMock.expect(workerTask.awaitStop(EasyMock.anyLong())).andReturn(true);
    // Note that in this case we *do not* commit offsets since it's an unclean shutdown
    EasyMock.expectLastCall();

    expectStopStorage();

    PowerMock.replayAll();

    worker = new Worker(WORKER_ID, new MockTime(), plugins, config, offsetBackingStore);
    worker.start();
    worker.startTask(TASK_ID, anyConnectorConfigMap(), origProps, taskStatusListener, TargetState.STARTED);
    worker.stop();

    PowerMock.verifyAll();
}
 
Developer: YMCoding | Project: kafka-0.11.0.0-src-with-comment | Lines: 79 | Source: WorkerTest.java


Example 9: JsonRecordWriterProvider

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
JsonRecordWriterProvider(S3Storage storage, JsonConverter converter) {
  this.storage = storage;
  this.mapper = new ObjectMapper();
  this.converter = converter;
}
 
Developer: confluentinc | Project: kafka-connect-storage-cloud | Lines: 6 | Source: JsonRecordWriterProvider.java


Example 10: createNewConverter

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
private JsonConverter createNewConverter() {
  JsonConverter result = new JsonConverter();
  result.configure(configs, false);
  return result;
}
 
Developer: confluentinc | Project: ksql | Lines: 6 | Source: SchemaMapper.java


Example 11: SchemaJsonSerializer

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public SchemaJsonSerializer(JsonConverter jsonConverter) {
  this.jsonConverter = jsonConverter;
}
 
Developer: confluentinc | Project: ksql | Lines: 4 | Source: SchemaMapper.java


Example 12: SchemaJsonDeserializer

import org.apache.kafka.connect.json.JsonConverter; // import the required package/class
public SchemaJsonDeserializer(JsonConverter jsonConverter) {
  this.jsonConverter = jsonConverter;
}
 
Developer: confluentinc | Project: ksql | Lines: 4 | Source: SchemaMapper.java



Note: The org.apache.kafka.connect.json.JsonConverter examples in this article were collected from source code and documentation platforms such as GitHub and MSDocs. The snippets come from open-source projects and copyright remains with their original authors; consult each project's license before redistributing or reusing the code. Please do not reproduce this article without permission.

