This article collects typical usage examples of the Java class org.apache.flink.streaming.api.datastream.IterativeStream. If you are wondering what IterativeStream is for, how to use it, or what real code that uses it looks like, the curated examples below should help.
IterativeStream belongs to the org.apache.flink.streaming.api.datastream package. Eleven code examples of the class are shown below, sorted by popularity.
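Before the collected examples, here is a minimal, self-contained sketch of the basic iterate()/closeWith() pattern. The decrement-and-filter loop body is purely illustrative and is not taken from any of the examples below.

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MinimalIterationSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<Long> someIntegers = env.generateSequence(0, 1000);

        // the iteration head: records entering here come either from the source
        // or from the feedback stream passed to closeWith()
        IterativeStream<Long> iteration = someIntegers.iterate();

        // the loop body: decrement every value
        DataStream<Long> minusOne = iteration.map(new MapFunction<Long, Long>() {
            @Override
            public Long map(Long value) {
                return value - 1;
            }
        });

        // values still greater than zero are routed back into the loop
        DataStream<Long> stillGreaterThanZero = minusOne.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value > 0;
            }
        });
        iteration.closeWith(stillGreaterThanZero);

        // values that reached zero (or below) leave the loop
        minusOne.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value <= 0;
            }
        }).print();

        env.execute("Minimal IterativeStream sketch");
    }
}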
Example 1: testImmutabilityWithCoiteration

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test
public void testImmutabilityWithCoiteration() {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap); // for rebalance

    IterativeStream<Integer> iter1 = source.iterate();
    // Calling withFeedbackType should create a new iteration
    ConnectedIterativeStreams<Integer, String> iter2 = iter1.withFeedbackType(String.class);

    iter1.closeWith(iter1.map(noOpIntMap)).print();
    iter2.closeWith(iter2.map(noOpCoMap)).print();

    StreamGraph graph = env.getStreamGraph();

    assertEquals(2, graph.getIterationSourceSinkPairs().size());

    for (Tuple2<StreamNode, StreamNode> sourceSinkPair : graph.getIterationSourceSinkPairs()) {
        assertEquals(sourceSinkPair.f0.getOutEdges().get(0).getTargetVertex(),
                sourceSinkPair.f1.getInEdges().get(0).getSourceVertex());
    }
}

Developer: axbaretto, Project: flink, Lines: 22, Source: IterateITCase.java
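The noOpCoMap helper used to close iter2 above is defined elsewhere in IterateITCase and is not shown on this page. A plausible stand-in, assuming the Integer/String types dictated by ConnectedIterativeStreams<Integer, String>, could look like this (a hypothetical reconstruction, not the original definition):

import org.apache.flink.streaming.api.functions.co.CoMapFunction;

// Hypothetical stand-in for noOpCoMap:
// map1 handles records from the original Integer stream, map2 handles the String feedback.
private static final CoMapFunction<Integer, String, String> noOpCoMap =
        new CoMapFunction<Integer, String, String>() {
            @Override
            public String map1(Integer value) {
                return value.toString();
            }

            @Override
            public String map2(String value) {
                return value;
            }
        };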
Example 2: main

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

public static void main(String[] args) throws Exception {
    // Set up the environment
    if (!parseParameters(args)) {
        return;
    }

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    DataStream<Tuple2<Long, Long>> edges = getEdgesDataSet(env);

    IterativeStream<Tuple2<Long, Long>> iteration = edges.iterate();
    DataStream<Tuple2<Long, Long>> result = iteration.closeWith(
            iteration.keyBy(0).flatMap(new AssignComponents()));

    // Emit the results
    result.print();

    env.execute("Streaming Connected Components");
}

Developer: vasia, Project: gelly-streaming, Lines: 21, Source: IterativeConnectedComponents.java
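The AssignComponents operator is defined elsewhere in IterativeConnectedComponents.java and is not reproduced on this page. As a rough illustration only (not the gelly-streaming implementation), a flatMap that keeps a local vertex-to-component map and propagates the smaller component ID along each edge might look like this:

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

// Hypothetical, simplified sketch of an AssignComponents-style operator.
// It stores component assignments in operator-local memory (no fault tolerance)
// and re-emits both endpoints of an edge with the merged (smaller) component ID.
public static class AssignComponents
        extends RichFlatMapFunction<Tuple2<Long, Long>, Tuple2<Long, Long>> {

    private final Map<Long, Long> components = new HashMap<>();

    @Override
    public void flatMap(Tuple2<Long, Long> edge, Collector<Tuple2<Long, Long>> out) {
        long src = edge.f0;
        long trg = edge.f1;
        long srcComponent = components.getOrDefault(src, src);
        long trgComponent = components.getOrDefault(trg, trg);
        long merged = Math.min(srcComponent, trgComponent);

        components.put(src, merged);
        components.put(trg, merged);

        // emit both endpoints with the updated assignment so it keeps propagating through the loop
        out.collect(new Tuple2<>(src, merged));
        out.collect(new Tuple2<>(trg, merged));
    }
}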
Example 3: initializeCycle

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

private void initializeCycle(int cycleID) {
    // get the head and tail of the cycle
    FlinkProcessingItem tail = cycles.get(cycleID).get(0);
    FlinkProcessingItem head = cycles.get(cycleID).get(cycles.get(cycleID).size() - 1);

    // initialise the source stream of the iteration, so it can serve as the iteration starting point
    if (!head.isInitialised()) {
        head.setOnIteration(true);
        head.initialise();
        head.initialiseStreams();
    }

    // initialise all nodes after the head
    for (int node = cycles.get(cycleID).size() - 2; node >= 0; node--) {
        FlinkProcessingItem processingItem = cycles.get(cycleID).get(node);
        processingItem.initialise();
        processingItem.initialiseStreams();
    }

    // the back edge must run at the same parallelism as the iteration head before the loop is closed
    SingleOutputStreamOperator backedge =
            (SingleOutputStreamOperator) head.getInputStreamBySourceID(tail.getComponentId()).getOutStream();
    backedge.setParallelism(head.getParallelism());
    ((IterativeStream) head.getDataStream()).closeWith(backedge);
}

Developer: apache, Project: incubator-samoa, Lines: 24, Source: FlinkTopology.java
Example 4: testIncorrectParallelism

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test(expected = UnsupportedOperationException.class)
public void testIncorrectParallelism() throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // the source runs with parallelism 1, while the feedback map runs with the
    // environment's default parallelism, so closing the iteration must fail
    DataStream<Integer> source = env.fromElements(1, 10);
    IterativeStream<Integer> iter1 = source.iterate();

    SingleOutputStreamOperator<Integer> map1 = iter1.map(noOpIntMap);
    iter1.closeWith(map1).print();
}

Developer: axbaretto, Project: flink, Lines: 12, Source: IterateITCase.java
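As the other tests on this page do, the mismatch can be avoided by rebalancing the source to the default parallelism before starting the iteration. A short sketch of that fix, reusing the noOpIntMap helper these tests already rely on:

// The dummy map lifts the source from parallelism 1 to the environment's default parallelism,
// so the feedback map created inside the loop matches the iteration head and closeWith succeeds.
DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);
IterativeStream<Integer> iter1 = source.iterate();
iter1.closeWith(iter1.map(noOpIntMap)).print();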
Example 5: testSimpleIteration

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@SuppressWarnings("rawtypes")
@Test
public void testSimpleIteration() throws Exception {
    int numRetries = 5;
    int timeoutScale = 1;

    for (int numRetry = 0; numRetry < numRetries; numRetry++) {
        try {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            iterated = new boolean[parallelism];

            DataStream<Boolean> source = env.fromCollection(Collections.nCopies(parallelism * 2, false))
                    .map(noOpBoolMap).name("ParallelizeMap");

            // iterate with a feedback timeout that grows on every retry
            IterativeStream<Boolean> iteration = source.iterate(3000 * timeoutScale);

            DataStream<Boolean> increment = iteration.flatMap(new IterationHead()).map(noOpBoolMap);

            iteration.map(noOpBoolMap).addSink(new ReceiveCheckNoOpSink());

            iteration.closeWith(increment).addSink(new ReceiveCheckNoOpSink());

            env.execute();

            for (boolean iter : iterated) {
                assertTrue(iter);
            }

            break; // success
        } catch (Throwable t) {
            LOG.info("Run " + (numRetry + 1) + "/" + numRetries + " failed", t);

            if (numRetry >= numRetries - 1) {
                throw t;
            } else {
                timeoutScale *= 2;
            }
        }
    }
}

Developer: axbaretto, Project: flink, Lines: 41, Source: IterateITCase.java
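The IterationHead flat map is defined elsewhere in IterateITCase and is not shown here. Judging from the assertions above (every parallel subtask must set its slot in the iterated array), a plausible sketch, assuming the test's shared boolean[] iterated field, could be:

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.util.Collector;

// Hypothetical reconstruction, not the original test class:
// the initial 'false' records are turned into 'true' and fed back through the loop;
// once a 'true' record comes back around, the subtask marks its flag and stops emitting.
public static final class IterationHead extends RichFlatMapFunction<Boolean, Boolean> {
    @Override
    public void flatMap(Boolean value, Collector<Boolean> out) {
        int subtask = getRuntimeContext().getIndexOfThisSubtask();
        if (value) {
            iterated[subtask] = true; // assumes the test's shared boolean[] iterated field
        } else {
            out.collect(true);
        }
    }
}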
Example 6: testClosingFromOutOfLoop

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test(expected = UnsupportedOperationException.class)
public void testClosingFromOutOfLoop() throws Exception {

    // this test verifies that we cannot close an iteration with a DataStream that does not
    // have the iteration in its predecessors
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // introduce dummy mapper to get to correct parallelism
    DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

    IterativeStream<Integer> iter1 = source.iterate();
    IterativeStream<Integer> iter2 = source.iterate();

    // the feedback stream originates from iter1, not iter2, so closing iter2 with it must fail
    iter2.closeWith(iter1.map(noOpIntMap));
}

Developer: axbaretto, Project: flink, Lines: 18, Source: IterateITCase.java
Example 7: testCoIterClosingFromOutOfLoop

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test(expected = UnsupportedOperationException.class)
public void testCoIterClosingFromOutOfLoop() throws Exception {

    // this test verifies that we cannot close an iteration with a DataStream that does not
    // have the iteration in its predecessors
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // introduce dummy mapper to get to correct parallelism
    DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

    IterativeStream<Integer> iter1 = source.iterate();
    ConnectedIterativeStreams<Integer, Integer> coIter = source.iterate().withFeedbackType(
            Integer.class);

    // the feedback stream comes from iter1, not from coIter, so this must fail as well
    coIter.closeWith(iter1.map(noOpIntMap));
}

Developer: axbaretto, Project: flink, Lines: 19, Source: IterateITCase.java
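For contrast with Examples 6 and 7, the valid pattern is to close each iteration with a stream derived from that same iteration, as Example 1 above already demonstrates. A minimal sketch:

// Valid: the feedback stream originates from the iteration it closes
// (compare with Example 1, which closes both a plain and a connected iteration this way).
IterativeStream<Integer> iter = source.iterate();
iter.closeWith(iter.map(noOpIntMap)).print();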
Example 8: main

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

public static void main(String[] args) throws Exception {

    // Checking input parameters
    final ParameterTool params = ParameterTool.fromArgs(args);

    // set up input for the stream of integer pairs

    // obtain execution environment and set setBufferTimeout to 1 to enable
    // continuous flushing of the output buffers (lowest latency)
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment()
            .setBufferTimeout(1);

    // make parameters available in the web interface
    env.getConfig().setGlobalJobParameters(params);

    // create input stream of integer pairs
    DataStream<Tuple2<Integer, Integer>> inputStream;
    if (params.has("input")) {
        inputStream = env.readTextFile(params.get("input")).map(new FibonacciInputMap());
    } else {
        System.out.println("Executing Iterate example with default input data set.");
        System.out.println("Use --input to specify file input.");
        inputStream = env.addSource(new RandomFibonacciSource());
    }

    // create an iterative data stream from the input with 5 second timeout
    IterativeStream<Tuple5<Integer, Integer, Integer, Integer, Integer>> it = inputStream.map(new InputMap())
            .iterate(5000);

    // apply the step function to get the next Fibonacci number
    // increment the counter and split the output with the output selector
    SplitStream<Tuple5<Integer, Integer, Integer, Integer, Integer>> step = it.map(new Step())
            .split(new MySelector());

    // close the iteration by selecting the tuples that were directed to the
    // 'iterate' channel in the output selector
    it.closeWith(step.select("iterate"));

    // to produce the final output select the tuples directed to the
    // 'output' channel then get the input pairs that have the greatest iteration counter
    // on a 1 second sliding window
    DataStream<Tuple2<Tuple2<Integer, Integer>, Integer>> numbers = step.select("output")
            .map(new OutputMap());

    // emit results
    if (params.has("output")) {
        numbers.writeAsText(params.get("output"));
    } else {
        System.out.println("Printing result to stdout. Use --output to specify output path.");
        numbers.print();
    }

    // execute the program
    env.execute("Streaming Iteration Example");
}

Developer: axbaretto, Project: flink, Lines: 56, Source: IterateExample.java
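The split/select calls above rely on MySelector, an OutputSelector that is part of IterateExample.java but not reproduced here. A plausible sketch, assuming a hypothetical BOUND constant that caps the Fibonacci values, could look like this:

import java.util.ArrayList;
import java.util.List;
import org.apache.flink.api.java.tuple.Tuple5;
import org.apache.flink.streaming.api.collector.selector.OutputSelector;

// Hypothetical reconstruction of the output selector: tuples whose Fibonacci values are still
// below BOUND keep iterating, all others are routed to the 'output' channel.
public static class MySelector implements OutputSelector<Tuple5<Integer, Integer, Integer, Integer, Integer>> {
    private static final int BOUND = 100; // assumed threshold, not taken from the original example

    @Override
    public Iterable<String> select(Tuple5<Integer, Integer, Integer, Integer, Integer> value) {
        List<String> output = new ArrayList<>();
        if (value.f2 < BOUND && value.f3 < BOUND) {
            output.add("iterate");
        } else {
            output.add("output");
        }
        return output;
    }
}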
Example 9: testDoubleClosing

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test
public void testDoubleClosing() throws Exception {

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // introduce dummy mapper to get to correct parallelism
    DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

    IterativeStream<Integer> iter1 = source.iterate();

    // closing the same iteration twice must not throw
    iter1.closeWith(iter1.map(noOpIntMap));
    iter1.closeWith(iter1.map(noOpIntMap));
}

Developer: axbaretto, Project: flink, Lines: 14, Source: IterateITCase.java
Example 10: testDifferingParallelism

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test(expected = UnsupportedOperationException.class)
public void testDifferingParallelism() throws Exception {

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // introduce dummy mapper to get to correct parallelism
    DataStream<Integer> source = env.fromElements(1, 10)
            .map(noOpIntMap);

    IterativeStream<Integer> iter1 = source.iterate();

    // the feedback stream runs at half the head's parallelism, so closing the loop must fail
    iter1.closeWith(iter1.map(noOpIntMap).setParallelism(parallelism / 2));
}

Developer: axbaretto, Project: flink, Lines: 15, Source: IterateITCase.java
Example 11: testExecutionWithEmptyIteration

import org.apache.flink.streaming.api.datastream.IterativeStream; // the class discussed in this article

@Test(expected = IllegalStateException.class)
public void testExecutionWithEmptyIteration() throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    DataStream<Integer> source = env.fromElements(1, 10).map(noOpIntMap);

    IterativeStream<Integer> iter1 = source.iterate();

    // the iteration is never closed with closeWith(), so executing the job must fail
    iter1.map(noOpIntMap).print();

    env.execute();
}

Developer: axbaretto, Project: flink, Lines: 14, Source: IterateITCase.java
Note: the org.apache.flink.streaming.api.datastream.IterativeStream examples in this article were collected from GitHub, MSDocs, and other source-code and documentation platforms. The snippets are taken from open-source projects contributed by their respective authors; copyright remains with those authors, and redistribution or reuse should follow the license of each project. Do not republish without permission.