This article collects typical usage examples of the scala.util.Either class in Java. If you have been wondering what Either is for, how to use it from Java, or what real-world Either code looks like, the curated examples below should help.
The Either class lives in the scala.util package. Twenty code examples are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
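Before the examples, here is a minimal, self-contained sketch (written for this article, not taken from any of the projects below) of the pattern they all share: a method returns scala.util.Either, the producer builds results with scala.util.Left and scala.util.Right, and the consumer branches on isLeft()/isRight() and extracts values through the left()/right() projections. The parseIntEither helper, its class name, and its error message are purely illustrative.

import scala.util.Either;
import scala.util.Left;
import scala.util.Right;

public class EitherBasics {
    // Illustrative helper (not from any quoted project): parse an integer,
    // reporting the failure message on the Left side and the value on the Right side.
    static Either<String, Integer> parseIntEither(String s) {
        try {
            return new Right<>(Integer.parseInt(s));
        } catch (NumberFormatException e) {
            return new Left<>("not a number: " + s);
        }
    }

    public static void main(String[] args) {
        // Consuming the result: check which side is populated, then extract it.
        Either<String, Integer> parsed = parseIntEither("42");
        if (parsed.isRight()) {
            System.out.println("value = " + parsed.right().get());
        } else {
            System.out.println("error = " + parsed.left().get());
        }
    }
}

Calling get() on the wrong projection throws, which is why most of the examples below test isLeft() or isRight() before extracting a value.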
Example 1: executeSql
import scala.util.Either; // import the required package/class
private InterpreterResult executeSql(String sql) {
    try {
        Either<String, Either<Joiner4All, Mapper4All>> result = dDriver.query(sql, null);
        if (result.isLeft()) {
            return new InterpreterResult(Code.ERROR, result.left().get().toString());
        }
        Either<Joiner4All, Mapper4All> goodResult =
                (Either<Joiner4All, Mapper4All>) result.right().get();
        if (goodResult.isLeft()) {
            return new InterpreterResult(Code.SUCCESS, goodResult.left().get().toString());
        } else {
            return new InterpreterResult(Code.SUCCESS,
                    mapper4All2Zeppelin((Mapper4All) goodResult.right().get()));
        }
    } catch (Exception e) {
        return new InterpreterResult(Code.ERROR, e.getMessage());
    }
}
Developer: lorthos, Project: incubator-zeppelin-druid, Lines: 21, Source: DruidSqlInterpreter.java
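Example 1 dispatches on the nested Either with explicit isLeft() checks. As a hypothetical alternative (not part of the quoted interpreter), the same branch could be written with Either.fold, which takes one callback per side; the sketch below reuses the Joiner4All, Mapper4All, Code, and mapper4All2Zeppelin names from the example above and builds the callbacks with scala.runtime.AbstractFunction1, the same adapter that appears in Example 6.

import scala.runtime.AbstractFunction1;

// Hypothetical rewrite of the isLeft()/else branch in executeSql above:
// fold() applies the first function to a Left value and the second to a Right value.
InterpreterResult interpreted = goodResult.fold(
        new AbstractFunction1<Joiner4All, InterpreterResult>() {
            @Override
            public InterpreterResult apply(Joiner4All joiner) {
                return new InterpreterResult(Code.SUCCESS, joiner.toString());
            }
        },
        new AbstractFunction1<Mapper4All, InterpreterResult>() {
            @Override
            public InterpreterResult apply(Mapper4All mapper) {
                return new InterpreterResult(Code.SUCCESS, mapper4All2Zeppelin(mapper));
            }
        });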
Example 2: getEnumCellEditor
import scala.util.Either; // import the required package/class
private CellEditor getEnumCellEditor(TreeNode node) {
    final ValueData vd = getVD(node);
    final EnumLike enumLike = (EnumLike)vd.element().thype();
    final Object[] values = Interop.toArray(enumLike.values(node));
    for(int i = 0; i < values.length; ++i) {
        Either<String, Object> e = enumLike.interpOut(node, values[i]);
        if(e.isRight()) {
            values[i] = Interop.literal(e.right().get());
        }
        else {
            values[i] = e.left().get();
        }
    }
    return getSelectCellEditor(true, values);
}
Developer: insweat, Project: hssd, Lines: 18, Source: EntryEditorEditingSupport.java
Example 3: toGremlinQuery
import scala.util.Either; // import the required package/class
private GremlinQuery toGremlinQuery(String query, int limit, int offset) throws AtlasBaseException {
    QueryParams params = validateSearchParams(limit, offset);
    Either<NoSuccess, Expression> either = QueryParser.apply(query, params);
    if (either.isLeft()) {
        throw new AtlasBaseException(DISCOVERY_QUERY_FAILED, query);
    }
    Expression expression = either.right().get();
    Expression validExpression = QueryProcessor.validate(expression);
    GremlinQuery gremlinQuery = new GremlinTranslator(validExpression, graphPersistenceStrategy).translate();
    if (LOG.isDebugEnabled()) {
        LOG.debug("Translated Gremlin Query: {}", gremlinQuery.queryStr());
    }
    return gremlinQuery;
}
Developer: apache, Project: incubator-atlas, Lines: 19, Source: EntityDiscoveryService.java
Example 4: findTypeLookupsDuringQueryParsing
import scala.util.Either; // import the required package/class
private ValidatingTypeCache findTypeLookupsDuringQueryParsing(String query) throws AtlasException {
    TypeSystem typeSystem = TypeSystem.getInstance();
    ValidatingTypeCache result = new ValidatingTypeCache();
    typeSystem.setTypeCache(result);
    typeSystem.reset();
    HierarchicalTypeDefinition<ClassType> hiveTypeDef = createClassTypeDef("hive_db", "", ImmutableSet.<String>of(),
            createRequiredAttrDef("name", DataTypes.STRING_TYPE),
            createRequiredAttrDef("tableCount", DataTypes.INT_TYPE)
    );
    typeSystem.defineClassType(hiveTypeDef);
    Either<Parsers.NoSuccess, Expressions.Expression> either = QueryParser.apply(query, null);
    Expressions.Expression expression = either.right().get();
    QueryProcessor.validate(expression);
    return result;
}
Developer: apache, Project: incubator-atlas, Lines: 19, Source: QueryProcessorTest.java
Example 5: handle
import scala.util.Either; // import the required package/class
public Object handle(Request request, Response response) throws Exception {
    KoauthRequest koauthRequest = requestMapper.map(request);
    Either<KoauthResponse, String> authentication = provider.oauthenticate(koauthRequest);
    if (authentication.isLeft()) {
        KoauthResponse left = authentication.left().get();
        if (left.getClass().equals(ResponseUnauthorized.class)) {
            response.status(401);
            return "You are treated as a guest.\n" + ((ResponseUnauthorized) left).body();
        } else {
            response.status(400);
            return "You are treated as a guest.\n" + ((ResponseBadRequest) left).body();
        }
    } else {
        String username = authentication.right().get();
        return "You are " + username + ".";
    }
}
Developer: kovacshuni, Project: koauth-samples, Lines: 18, Source: OauthServlet.java
Example 6: getVersion
import scala.util.Either; // import the required package/class
@NotNull
private static Either<ExecError, VersionTriple> getVersion(HaskellToolsConsole toolConsole, String workingDirectory, String hlintPath) {
    return EitherUtil.rightFlatMap(
            runHlint(toolConsole, workingDirectory, hlintPath, "--version"),
            new AbstractFunction1<String, Either<ExecError, VersionTriple>>() {
                @Override
                public Either<ExecError, VersionTriple> apply(String version) {
                    Matcher m = HLINT_VERSION_REGEX.matcher(version);
                    if (!m.find()) {
                        return new ExecError(
                                "Could not parse version from hlint: '" + version + "'",
                                null
                        ).toLeft();
                    }
                    return EitherUtil.right(new VersionTriple(
                            Integer.parseInt(m.group(1)),
                            Integer.parseInt(m.group(2)),
                            Integer.parseInt(m.group(3))
                    ));
                }
            }
    );
}
Developer: carymrobbins, Project: intellij-haskforce, Lines: 24, Source: HLint.java
Example 7: runHlint
import scala.util.Either; // import the required package/class
/**
 * Runs hlintProg with parameters if hlintProg can be executed.
 */
@NotNull
private static Either<ExecError, String> runHlint(HaskellToolsConsole toolConsole,
                                                  @NotNull String workingDirectory,
                                                  @NotNull String hlintProg,
                                                  @NotNull String hlintFlags,
                                                  @NotNull String... params) {
    GeneralCommandLine commandLine = new GeneralCommandLine();
    commandLine.setWorkDirectory(workingDirectory);
    commandLine.setExePath(hlintProg);
    ParametersList parametersList = commandLine.getParametersList();
    // Required so that hlint won't report a non-zero exit status for lint issues.
    // Otherwise, ExecUtil.readCommandLine will return an error.
    parametersList.add("--no-exit-code");
    parametersList.addParametersString(hlintFlags);
    parametersList.addAll(params);
    toolConsole.writeInput(ToolKey.HLINT_KEY, "Using working directory: " + workingDirectory);
    toolConsole.writeInput(ToolKey.HLINT_KEY, commandLine.getCommandLineString());
    return ExecUtil.readCommandLine(commandLine);
}
Developer: carymrobbins, Project: intellij-haskforce, Lines: 23, Source: HLint.java
Example 8: requireCabalVersionMinimum
import scala.util.Either; // import the required package/class
protected void requireCabalVersionMinimum(double minimumVersion, @NotNull String errorMessage) throws RuntimeConfigurationException {
    final HaskellBuildSettings buildSettings = HaskellBuildSettings.getInstance(getProject());
    final String cabalPath = buildSettings.getCabalPath();
    if (cabalPath.isEmpty()) {
        throw new RuntimeConfigurationError("Path to cabal is not set.");
    }
    GeneralCommandLine cabalCmdLine = new GeneralCommandLine(cabalPath, "--numeric-version");
    Either<ExecUtil.ExecError, String> result = ExecUtil.readCommandLine(cabalCmdLine);
    if (result.isLeft()) {
        //noinspection ThrowableResultOfMethodCallIgnored
        ExecUtil.ExecError e = EitherUtil.unsafeGetLeft(result);
        NotificationUtil.displaySimpleNotification(
                NotificationType.ERROR, getProject(), "cabal", e.getMessage()
        );
        throw new RuntimeConfigurationError("Failed executing cabal to check its version: " + e.getMessage());
    }
    final String out = EitherUtil.unsafeGetRight(result);
    final Matcher m = EXTRACT_CABAL_VERSION_REGEX.matcher(out);
    if (!m.find()) {
        throw new RuntimeConfigurationError("Could not parse cabal version: '" + out + "'");
    }
    final Double actualVersion = Double.parseDouble(m.group(1));
    if (actualVersion < minimumVersion) {
        throw new RuntimeConfigurationError(errorMessage);
    }
}
Developer: carymrobbins, Project: intellij-haskforce, Lines: 27, Source: HaskellRunConfigurationBase.java
Example 9: fireCommand
import scala.util.Either; // import the required package/class
/**
 * All commands. If data is null then GET else POST.
 *
 * @return
 */
private Either<String, Either<JSONArray, JSONObject>> fireCommand(String endPoint, String optData, Map<String, String> reqHeaders) {
    CloseableHttpResponse resp = null;
    String respStr;
    String url = format(coordinatorUrl + endPoint, coordinatorHost, coordinatorPort);
    try {
        if (optData != null) {// POST
            resp = postJson(url, optData, reqHeaders);
        } else {// GET
            resp = get(url, reqHeaders);
        }
        respStr = IOUtils.toString(resp.getEntity().getContent());
    } catch (IOException ex) {
        return new Left<>(format("Http %s, faced exception %s\n", resp, ex));
    } finally {
        returnClient(resp);
    }
    try {
        return new Right<>(Util.asJsonType(respStr));
    } catch (JSONException je) {
        return new Left<>(format("Received data %s not in json format, faced exception %s\n", respStr, je));
    }
}
Developer: srikalyc, Project: Sql4D, Lines: 28, Source: CoordinatorAccessor.java
Example 10: aboutDataSource
import scala.util.Either; // import the required package/class
/**
 * Left is error Right is Tuple <dimensions, metrics>
 *
 * @param name
 * @param reqHeaders
 * @return
 */
public Either<String, Tuple2<List<String>, List<String>>> aboutDataSource(String name, Map<String, String> reqHeaders) {
    Either<String, Either<JSONArray, JSONObject>> resp = fireCommand("druid/coordinator/v1/metadata/datasources/" + name, null, reqHeaders);
    if (resp.isLeft()) {
        return new Left<>(resp.left().get());
    }
    Either<JSONArray, JSONObject> goodResp = resp.right().get();
    if (goodResp.isRight()) {
        JSONObject data = goodResp.right().get();
        if (data.has("segments")) {
            JSONArray segmentsArray = data.getJSONArray("segments");
            if (segmentsArray.length() == 0) {
                return new Left<>("No segments received..");
            }
            JSONObject firstItem = segmentsArray.getJSONObject(0);
            String dims = firstItem.getString("dimensions");
            String metrics = firstItem.getString("metrics");
            return new Right<>(new Tuple2<>(Arrays.asList(dims.split(",")), Arrays.asList(metrics.split(","))));
        } else {
            return new Left<>("No segments key in the response..");
        }
    }
    return new Left<>("Unexpected response " + goodResp.left().get().toString());
}
Developer: srikalyc, Project: Sql4D, Lines: 31, Source: CoordinatorAccessor.java
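A hypothetical caller of aboutDataSource might unpack the Tuple2 payload as sketched below; the coordinator variable and the "wikipedia" datasource name are made up for illustration, and the imports mirror the types used in the example above.

import java.util.Collections;
import java.util.List;
import scala.Tuple2;
import scala.util.Either;

// Assumed: `coordinator` is a CoordinatorAccessor instance; the datasource name is illustrative.
Either<String, Tuple2<List<String>, List<String>>> about =
        coordinator.aboutDataSource("wikipedia", Collections.<String, String>emptyMap());
if (about.isLeft()) {
    System.err.println("lookup failed: " + about.left().get());
} else {
    List<String> dimensions = about.right().get()._1();
    List<String> metrics = about.right().get()._2();
    System.out.println(dimensions.size() + " dimensions, " + metrics.size() + " metrics");
}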
Example 11: fireQuery
import scala.util.Either; // import the required package/class
/**
 * For firing simple queries(i.e non join queries).
 *
 * @param jsonQuery
 * @param reqHeaders
 * @param requiresMapping
 * @return
 */
public Either<String, Either<Mapper4All, JSONArray>> fireQuery(String jsonQuery, Map<String, String> reqHeaders, boolean requiresMapping) {
    CloseableHttpResponse resp = null;
    String respStr;
    String url = format(brokerUrl, brokerHost, brokerPort);
    try {
        resp = postJson(url, jsonQuery, reqHeaders);
        respStr = IOUtils.toString(resp.getEntity().getContent());
    } catch (IOException ex) {
        return new Left<>(format("Http %s, faced exception %s\n", resp, ex));
    } finally {
        returnClient(resp);
    }
    JSONArray possibleResArray = null;
    try {
        possibleResArray = new JSONArray(respStr);
    } catch (JSONException je) {
        return new Left<>(format("Received data %s not in json format. \n", respStr));
    }
    if (requiresMapping) {
        return new Right<String, Either<Mapper4All, JSONArray>>(new Left<Mapper4All, JSONArray>(new Mapper4All(possibleResArray)));
    }
    return new Right<String, Either<Mapper4All, JSONArray>>(new Right<Mapper4All, JSONArray>(possibleResArray));
}
Developer: srikalyc, Project: Sql4D, Lines: 32, Source: BrokerAccessor.java
Example 12: getTimeBoundary
import scala.util.Either; // import the required package/class
/**
 * Get timeboundary.
 *
 * @param dataSource
 * @param reqHeaders
 * @return
 * @throws java.lang.IllegalAccessException
 */
public Interval getTimeBoundary(String dataSource, Map<String, String> reqHeaders) throws IllegalAccessException {
    Program<BaseStatementMeta> pgm = DCompiler.compileSql(format("SELECT FROM %s", dataSource));
    Either<String,Either<Mapper4All,JSONArray>> res = fireQuery(pgm.nthStmnt(0).toString(), reqHeaders, true);
    if (res.isLeft()) {
        throw new IllegalAccessException(format("DataSource %s does not exist(or) check if druid is accessible, faced exception %s", dataSource, res.left().get()));
    }
    Mapper4All finalRes = res.right().get().left().get();// That's because we know Time boundary cannot be a Join result!!
    int min = finalRes.baseFieldNames.indexOf("minTime");
    int max = finalRes.baseFieldNames.indexOf("maxTime");
    if (finalRes.baseAllRows.isEmpty()) {// Possible when table does not exist.
        throw new IllegalAccessException("Either table does not exist(or) druid is not accessible");
    }
    List<Object> row = finalRes.baseAllRows.get(0);// Only 1 element is returned in Timeboundary.
    return new Interval(row.get(min).toString(), row.get(max).toString());
}
Developer: srikalyc, Project: Sql4D, Lines: 24, Source: BrokerAccessor.java
Example 13: getCompiledAST
import scala.util.Either; // import the required package/class
/**
 * Get an in memory representation of broken SQL query. This may require
 * contacting druid for resolving dimensions Vs metrics for SELECT queries
 * hence it also optionally accepts HTTP request headers to be sent out.
 *
 * @param sqlQuery
 * @param namedParams
 * @param reqHeaders
 * @return
 * @throws java.lang.Exception
 */
public Program<BaseStatementMeta> getCompiledAST(String sqlQuery, NamedParameters namedParams, Map<String, String> reqHeaders) throws Exception {
    Program<BaseStatementMeta> pgm = DCompiler.compileSql(preprocessSqlQuery(sqlQuery, namedParams));
    for (BaseStatementMeta stmnt : pgm.getAllStmnts()) {
        if (stmnt instanceof QueryMeta) {
            QueryMeta query = (QueryMeta) stmnt;
            if (query.queryType == RequestType.SELECT) {//classifyColumnsToDimAndMetrics
                Either<String, Tuple2<List<String>, List<String>>> dataSourceDescRes = coordinator.aboutDataSource(stmnt.dataSource, reqHeaders);
                if (dataSourceDescRes.isLeft()) {
                    throw new Exception("Datasource info either not available (or)could not be loaded ." + dataSourceDescRes.left().get());
                } else {
                    ((SelectQueryMeta) query).postProcess(dataSourceDescRes.right().get());
                }
            }
        } else if (stmnt instanceof InsertMeta) {//TODO: Handle this.
        } else if (stmnt instanceof DeleteMeta) {//TODO: Handle this.
        } else if (stmnt instanceof DropMeta) {//TODO: Handle this.
        }
    }
    //TODO: Do something if pgm is invalid !!!
    pgm.isValid();
    return pgm;
}
Developer: srikalyc, Project: Sql4D, Lines: 37, Source: DDataSource.java
Example 14: write
import scala.util.Either; // import the required package/class
@Override
public void write(RowLocation rowLocation, Either<Exception, ExecRow> value) throws IOException, InterruptedException {
    if (value.isLeft()) {
        Exception e = value.left().get();
        // failure
        failure = true;
        SpliceLogUtils.error(LOG,"Error Reading",e);
        throw new IOException(e);
    }
    assert value.isRight();
    ExecRow execRow = value.right().get();
    try {
        if (!initialized) {
            initialized = true;
            tableWriter.open();
        }
        tableWriter.write(execRow);
    } catch (Exception se) {
        SpliceLogUtils.error(LOG,"Error Writing",se);
        failure = true;
        throw new IOException(se);
    }
}
Developer: splicemachine, Project: spliceengine, Lines: 24, Source: SMRecordWriter.java
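Examples 14 and 16 consume Either<Exception, ...> values produced elsewhere in the pipeline. As a rough sketch of the producing side (purely illustrative, not Splice Machine code), a generic helper can capture a computation's outcome in the same Left-is-failure / Right-is-success shape:

import java.util.concurrent.Callable;
import scala.util.Either;
import scala.util.Left;
import scala.util.Right;

public final class Eithers {
    // Illustrative helper (not from any quoted project): run the computation and
    // report its outcome as an Either, with the thrown exception on the Left and
    // the computed value on the Right.
    public static <T> Either<Exception, T> attempt(Callable<T> work) {
        try {
            return new Right<>(work.call());
        } catch (Exception e) {
            return new Left<>(e);
        }
    }
}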
Example 15: getRecordWriter
import scala.util.Either; // import the required package/class
@Override
public RecordWriter<byte[],Either<Exception, KVPair>> getRecordWriter(TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
    try {
        assert taskAttemptContext != null && taskAttemptContext.getConfiguration() != null:"configuration passed in is null";
        if (outputCommitter == null)
            getOutputCommitter(taskAttemptContext);
        DataSetWriterBuilder tableWriter =TableWriterUtils.deserializeTableWriter(taskAttemptContext.getConfiguration());
        TxnView childTxn = outputCommitter.getChildTransaction(taskAttemptContext.getTaskAttemptID());
        if (childTxn == null)
            throw new IOException("child transaction lookup failed");
        tableWriter.txn(childTxn);
        return new HTableRecordWriter(tableWriter.buildTableWriter(), outputCommitter);
    } catch (Exception e) {
        throw new IOException(e);
    }
}
Developer: splicemachine, Project: spliceengine, Lines: 17, Source: HTableOutputFormat.java
Example 16: write
import scala.util.Either; // import the required package/class
@Override
public void write(byte[] rowKey, Either<Exception, KVPair> value) throws IOException, InterruptedException {
    if (value.isLeft()) {
        Exception e = value.left().get();
        // failure
        failure = true;
        SpliceLogUtils.error(LOG,"Error Reading",e);
        throw new IOException(e);
    }
    assert value.isRight();
    KVPair kvPair = value.right().get();
    try {
        if (!initialized) {
            initialized = true;
            tableWriter.open();
        }
        tableWriter.write(kvPair);
    } catch (Exception se) {
        SpliceLogUtils.error(LOG,"Error Writing",se);
        failure = true;
        throw new IOException(se);
    }
}
Developer: splicemachine, Project: spliceengine, Lines: 24, Source: HTableRecordWriter.java
Example 17: SparkUpdateDataSetWriter
import scala.util.Either; // import the required package/class
public SparkUpdateDataSetWriter(JavaPairRDD<K, Either<Exception, V>> rdd,
                                OperationContext operationContext,
                                Configuration conf,
                                long heapConglom,
                                int[] formatIds,
                                int[] columnOrdering,
                                int[] pkCols,
                                FormatableBitSet pkColumns,
                                String tableVersion,
                                ExecRow execRowDefinition,
                                FormatableBitSet heapList){
    this.rdd=rdd;
    this.operationContext=operationContext;
    this.conf=conf;
    this.heapConglom=heapConglom;
    this.formatIds=formatIds;
    this.columnOrdering=columnOrdering;
    this.pkCols=pkCols;
    this.pkColumns=pkColumns;
    this.tableVersion=tableVersion;
    this.execRowDefinition=execRowDefinition;
    this.heapList=heapList;
}
Developer: splicemachine, Project: spliceengine, Lines: 24, Source: SparkUpdateDataSetWriter.java
Example 18: readExceptionsCauseAbort
import scala.util.Either; // import the required package/class
@Test
public void readExceptionsCauseAbort() throws StandardException, IOException {
    SparkPairDataSet<ExecRow, ExecRow> dataset = new SparkPairDataSet<>(SpliceSpark.getContextUnsafe().parallelizePairs(tenRows).mapToPair(new FailFunction()));
    JavaPairRDD<ExecRow, Either<Exception, ExecRow>> rdd = dataset.wrapExceptions();
    final Configuration conf=new Configuration(HConfiguration.unwrapDelegate());
    TableWriterUtils.serializeInsertTableWriterBuilder(conf, new FakeTableWriterBuilder(false));
    conf.setClass(JobContext.OUTPUT_FORMAT_CLASS_ATTR,FakeOutputFormat.class,FakeOutputFormat.class);
    // workaround for SPARK-21549 on spark-2.2.0
    conf.set("mapreduce.output.fileoutputformat.outputdir","/tmp");
    File file = File.createTempFile(SMOutputFormatTest.class.getName(), "exception");
    file.delete();
    file.mkdir();
    conf.set("abort.directory", file.getAbsolutePath());
    try {
        rdd.saveAsNewAPIHadoopDataset(conf);
        Assert.fail("Expected exception");
    } catch (Exception se) {
        Assert.assertTrue("Unexpected exception", se instanceof SparkException);
    }
    File[] files = file.listFiles();
    Assert.assertTrue("Abort() not called", files.length > 0);
}
Developer: splicemachine, Project: spliceengine, Lines: 24, Source: SMOutputFormatTest.java
Example 19: writeExceptionsCauseAbort
import scala.util.Either; // import the required package/class
@Test
public void writeExceptionsCauseAbort() throws StandardException, IOException {
    SparkPairDataSet<RowLocation, ExecRow> dataset = new SparkPairDataSet<>(SpliceSpark.getContextUnsafe().parallelizePairs(tenRows).mapToPair(new ToRowLocationFunction()));
    JavaPairRDD<RowLocation, Either<Exception, ExecRow>> rdd = dataset.wrapExceptions();
    final Configuration conf=new Configuration(HConfiguration.unwrapDelegate());
    TableWriterUtils.serializeInsertTableWriterBuilder(conf, new FakeTableWriterBuilder(true));
    conf.setClass(JobContext.OUTPUT_FORMAT_CLASS_ATTR, FakeOutputFormat.class, FakeOutputFormat.class);
    // workaround for SPARK-21549 on spark-2.2.0
    conf.set("mapreduce.output.fileoutputformat.outputdir","/tmp");
    File file = File.createTempFile(SMOutputFormatTest.class.getName(), "exception");
    file.delete();
    file.mkdir();
    conf.set("abort.directory", file.getAbsolutePath());
    try {
        rdd.saveAsNewAPIHadoopDataset(conf);
        Assert.fail("Expected exception");
    } catch (Exception se) {
        Assert.assertTrue("Unexpected exception", se instanceof SparkException);
    }
    File[] files = file.listFiles();
    Assert.assertTrue("Abort() not called", files.length > 0);
}
Developer: splicemachine, Project: spliceengine, Lines: 24, Source: SMOutputFormatTest.java
Example 20: abortNotCalled
import scala.util.Either; // import the required package/class
@Test
public void abortNotCalled() throws StandardException, IOException {
    SparkPairDataSet<RowLocation, ExecRow> dataset = new SparkPairDataSet<>(SpliceSpark.getContextUnsafe().parallelizePairs(tenRows).mapToPair(new ToRowLocationFunction()));
    JavaPairRDD<RowLocation, Either<Exception, ExecRow>> rdd = dataset.wrapExceptions();
    final Configuration conf=new Configuration(HConfiguration.unwrapDelegate());
    TableWriterUtils.serializeInsertTableWriterBuilder(conf, new FakeTableWriterBuilder(false));
    conf.setClass(JobContext.OUTPUT_FORMAT_CLASS_ATTR,FakeOutputFormat.class,FakeOutputFormat.class);
    // workaround for SPARK-21549 on spark-2.2.0
    conf.set("mapreduce.output.fileoutputformat.outputdir","/tmp");
    File file = File.createTempFile(SMOutputFormatTest.class.getName(), "noException");
    file.delete();
    file.mkdir();
    conf.set("abort.directory", file.getAbsolutePath());
    rdd.saveAsNewAPIHadoopDataset(conf);
    File[] files = file.listFiles();
    Assert.assertEquals("Abort() was called", 0, files.length);
}
Developer: splicemachine, Project: spliceengine, Lines: 19, Source: SMOutputFormatTest.java
Note: The scala.util.Either class examples in this article were collected from GitHub, MSDocs, and other source-code and documentation platforms. The snippets were selected from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors. For distribution and use, please refer to each project's License. Do not republish without permission.