
Java SimpleMGraph Class Code Examples


This article collects typical usage examples of the Java class org.apache.clerezza.rdf.core.impl.SimpleMGraph. If you are wondering what SimpleMGraph is for or how to use it, the curated code examples below may help.



The SimpleMGraph class belongs to the package org.apache.clerezza.rdf.core.impl. Below are 18 code examples of the class, sorted by popularity by default. You can upvote the examples you find useful; your ratings help the system recommend better Java code examples.

Example 1: init

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
@BeforeClass
public static void init(){
    LiteralFactory lf = LiteralFactory.getInstance();
    UriRef pers1 = new UriRef("http://www.example.org/test#pers1");
    UriRef pers2 = new UriRef("http://www.example.org/test#pers2");
    MGraph data = new SimpleMGraph();
    //NOTE: This tests a plain literal with and without a language tag as
    //      well as an xsd:string typed literal, to verify correct handling
    //      of RDF 1.1
    data.add(new TripleImpl(pers1, RDF.type, FOAF.Person));
    data.add(new TripleImpl(pers1, FOAF.name, new PlainLiteralImpl("Rupert Westenthaler",
            new Language("de"))));
    data.add(new TripleImpl(pers1, FOAF.nick, new PlainLiteralImpl("westei")));
    data.add(new TripleImpl(pers1, FOAF.mbox, lf.createTypedLiteral("[email protected]")));
    data.add(new TripleImpl(pers1, FOAF.age, lf.createTypedLiteral(38)));
    data.add(new TripleImpl(pers1, FOAF.knows, pers2));
    data.add(new TripleImpl(pers2, FOAF.name, new PlainLiteralImpl("Reto Bachmann-Gmür")));
    rdfData = data.getGraph();
}
 
Developer: jsonld-java, Project: jsonld-java-clerezza, Lines: 20, Source: ClerezzaJsonLdParserSerializerTest.java


Example 2: serializerTest

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
@Test
public void serializerTest(){
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    serializer.serialize(out, rdfData, "application/ld+json");
    byte[] data = out.toByteArray();
    log.info("Serialized Graph: \n {}",new String(data,UTF8));
   
    //Now we reparse the graph to validate it was serialized correctly
    SimpleMGraph reparsed = new SimpleMGraph();
    parser.parse(reparsed, new ByteArrayInputStream(data), "application/ld+json");
    Assert.assertEquals(7, reparsed.size());
    for(Triple t : rdfData){
        Assert.assertTrue(reparsed.contains(t));
    }
    
}
 
Developer: jsonld-java, Project: jsonld-java-clerezza, Lines: 17, Source: ClerezzaJsonLdParserSerializerTest.java


Example 3: transform

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Performs the actual transformation mapping the data extracted from OSM XML data to a Clerezza graph.
 * @return
 */
public TripleCollection transform(){
    TripleCollection resultGraph = new SimpleMGraph();
    processXmlBinary();
    for(String wayId:  osmWayNodeMap.keySet()) {
        OsmWay wayObj = osmWayNodeMap.get(wayId);
        UriRef wayUri = new UriRef("http://fusepoolp3.eu/osm/way/" + wayId);
        resultGraph.add(new TripleImpl(wayUri, RDF.type, new UriRef("http://schema.org/PostalAddress")));
        resultGraph.add(new TripleImpl(wayUri, new UriRef("http://schema.org/streetAddress"), new PlainLiteralImpl(wayObj.getTagName())));
        UriRef geometryUri = new UriRef("http://fusepoolp3.eu/osm/geometry/" + wayId);
        resultGraph.add(new TripleImpl(wayUri, new UriRef("http://www.opengis.net/ont/geosparql#geometry"), geometryUri));
        String linestring = getWktLineString(wayObj.getNodeReferenceList());
        resultGraph.add(new TripleImpl(geometryUri, new UriRef("http://www.opengis.net/ont/geosparql#asWKT"), new PlainLiteralImpl(linestring)));            
    }
    
    return resultGraph;
}
 
Developer: fusepoolP3, Project: p3-osm-transformer, Lines: 21, Source: OsmXmlParser.java
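The getWktLineString helper invoked in transform() is not shown in this example. Below is a minimal sketch of what such a helper might look like, assuming the way's node references have already been resolved to (lon, lat) coordinate pairs; the class name WktUtil and the method signature are hypothetical, not part of the original project.

```java
// Hypothetical sketch: build a WKT LINESTRING literal from (lon, lat) pairs,
// mirroring what getWktLineString presumably produces for an OSM way.
public class WktUtil {
    static String wktLineString(double[][] lonLatPairs) {
        StringBuilder sb = new StringBuilder("LINESTRING (");
        for (int i = 0; i < lonLatPairs.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(lonLatPairs[i][0]).append(' ').append(lonLatPairs[i][1]);
        }
        return sb.append(')').toString();
    }

    public static void main(String[] args) {
        // Two nodes of a way near Trento
        System.out.println(wktLineString(
                new double[][]{{11.0357087, 46.3634673}, {11.036, 46.364}}));
    }
}
```

The resulting string is exactly the lexical form stored under ogc:asWKT in the example above.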


Example 4: generateRdf

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Get SIOC content from the RDF as text and return it.
 *
 * @param entity
 * @return
 * @throws IOException
 */
@Override
protected TripleCollection generateRdf(HttpRequestEntity entity) throws IOException {
    String text = "";
    Graph graph = Parser.getInstance().parse(entity.getData(), "text/turtle");
    Iterator<Triple> triples = graph.filter(null, SIOC.content, null);
    if (triples.hasNext()) {
        Literal literal = (Literal) triples.next().getObject();
        text = literal.getLexicalForm();
    }

    final TripleCollection result = new SimpleMGraph();
    final Resource resource = entity.getContentLocation() == null
            ? new BNode()
            : new UriRef(entity.getContentLocation().toString());
    final GraphNode node = new GraphNode(resource, result);
    node.addProperty(RDF.type, TEXUAL_CONTENT);
    node.addPropertyValue(SIOC.content, text);
    node.addPropertyValue(new UriRef("http://example.org/ontology#textLength"), text.length());
    return result;
}
 
Developer: fusepoolP3, Project: p3-pipeline-transformer, Lines: 28, Source: SimpleRdfConsumingTransformer.java


Example 5: uploadRdf

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Load RDF data sent by HTTP POST. Use the Dataset custom header
 * to address the dataset in which to store the rdf data.
 * Use this service with the following curl command:
 *  curl -X POST -u admin: -H "Content-Type: application/rdf+xml" 
 *  	-H "Dataset: mydataset" -T <rdf_file> http://localhost:8080/dlcupload/rdf 
 */
@POST
@Path("rdf")
@Produces("text/plain")
public String uploadRdf(@Context final UriInfo uriInfo,  
		@HeaderParam("Content-Type") String mediaType,
		@HeaderParam("Dataset") String dataset,
        final InputStream stream) throws Exception {
	
    AccessController.checkPermission(new AllPermission());
    final MGraph graph = new SimpleMGraph();
    
    String message = "";
   
    if(mediaType.equals(SupportedFormat.RDF_XML)) {
    	parser.parse(graph, stream, SupportedFormat.RDF_XML);
    }
    else {
    	message = "Add header Content-Type: application/rdf+xml ";
    }
    
    return message + "Added " + graph.size() + " triples  to dataset " + dataset + "\n";
}
 
Developer: fusepool, Project: datalifecycle, Lines: 30, Source: DlcUploader.java


Example 6: addTriplesCommand

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 *
 * Add triples to graph
 */
private MGraph addTriplesCommand(LockableMGraph targetGraph, URL dataUrl) throws IOException {
    AccessController.checkPermission(new AllPermission());

    URLConnection connection = dataUrl.openConnection();
    connection.addRequestProperty("Accept", "application/rdf+xml; q=.9, text/turtle;q=1");

    // create a temporary graph to store the data        
    SimpleMGraph tempGraph = new SimpleMGraph();
    String mediaType = connection.getHeaderField("Content-type");
    if ((mediaType == null) || mediaType.equals("application/octet-stream")) {
        mediaType = guessContentTypeFromUri(dataUrl);
    }
    InputStream data = connection.getInputStream();
    if (data != null) {
        parser.parse(tempGraph, data, mediaType);
        targetGraph.addAll(tempGraph);
    }
    return tempGraph;
}
 
Developer: fusepool, Project: datalifecycle, Lines: 24, Source: SourcingAdmin.java


Example 7: parserTest

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
@Test
public void parserTest() {
    final InputStream in = getClass().getClassLoader().getResourceAsStream(
            "testfiles/product.jsonld");
    SimpleMGraph graph = new SimpleMGraph();
    parser.parse(graph, in, "application/ld+json");
    Assert.assertEquals(13, graph.size());
}
 
Developer: jsonld-java, Project: jsonld-java-clerezza, Lines: 9, Source: ClerezzaJsonLdParserSerializerTest.java


Example 8: TransformationContext

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
TransformationContext(ContentItem ci, MGraph target){
	if(ci == null){
		throw new IllegalArgumentException("The parsed ContentItem MUST NOT be NULL!");
	}
	this.ci = ci;
	this.src = ci.getMetadata();
	this.tar = target == null ? new SimpleMGraph() : target;
}
 
Developer: fusepoolP3, Project: p3-stanbol-engine-fam, Lines: 9, Source: TransformationContext.java


Example 9: read

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
public Map<String,Collection<Object>> read(Exchange exchange) throws IOException, LDPathParseException, InvalidPayloadException {
    final String contentType = exchange.getIn().getHeader(Exchange.CONTENT_TYPE, String.class);
    final String subject = exchange.getIn().getHeader(LDPATH_CONTEXT, String.class);
    final ParsingProvider parser = new JenaParserProvider();
    final MGraph graph = new SimpleMGraph();

    if (isBlank(contentType)) {
        throw new IOException("Empty or missing Content-Type header identifying the RDF format");
    }

    if (isBlank(subject)) {
        throw new IOException("Empty or missing LDPATH_CONTEXT header, identifying the context for the LDPath program");
    }

    parser.parse(graph, getBodyAsInputStream(exchange),
            "application/n-triples".equals(contentType) ? "text/rdf+nt" : contentType, null);

    final UriRef context = new UriRef(subject);
    final ClerezzaBackend backend = new ClerezzaBackend(graph);
    final LDPath<Resource> ldpath = new LDPath<Resource>(backend);
    final Map<String, Collection<?>> results = ldpath.programQuery(context, query);
    final Map<String, Collection<Object>> transformed = transformLdpathOutput(results);

    Map<String, Collection<Object>> ldTransform = new HashMap<>();
    for (Map.Entry<String, Collection<Object>> entry : transformed.entrySet()) {
        if (entry.getValue().size() > 0) {
            ldTransform.put(entry.getKey(), entry.getValue());
        }
    }

    if (ldTransform.size() > 0) {
        return ldTransform;
    } else {
        return null;
    }
}
 
Developer: acoburn, Project: camel-ldpath, Lines: 37, Source: LDPathEngine.java


Example 10: generateRdf

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
@Override
protected TripleCollection generateRdf(HttpRequestEntity entity) throws IOException {
    final String text = IOUtils.toString(entity.getData(), "UTF-8");
    final TripleCollection result = new SimpleMGraph();
    final Resource resource = entity.getContentLocation() == null
            ? new BNode()
            : new UriRef(entity.getContentLocation().toString());
    final GraphNode node = new GraphNode(resource, result);
    node.addProperty(RDF.type, TEXUAL_CONTENT);
    node.addPropertyValue(SIOC.content, text);
    node.addPropertyValue(new UriRef("http://example.org/ontology#textLength"), text.length());
    return result;
}
 
Developer: fusepoolP3, Project: p3-pipeline-transformer, Lines: 14, Source: SimpleRdfProducingTransformer.java


Example 11: testEnhanceEmptyClientGraph

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
@Test
public void testEnhanceEmptyClientGraph() {
	String errorMessage = "";
	try {
	   TripleCollection resultGraph = jenas.enhance(TEST_DATASET_URI, new SimpleMGraph());
   
	}
	catch(IllegalArgumentException iae){
		errorMessage = iae.getMessage();
		
	}
	
	Assert.assertEquals("An empty graph cannot be enhanced", errorMessage);
	
}
 
Developer: fusepoolP3, Project: p3-geo-enriching-transformer, Lines: 16, Source: JenaSpatialTest.java


Example 12: generateRdf

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
  * Takes an address in RDF from the client and returns its geographical
  * coordinates. The input looks like:
  *  
  * <> <http://schema.org/streetAddress> "Via Roma 1" ;
  *                           <http://schema.org/addressLocality> "Trento" ;
  *                           <http://schema.org/addressCountry> "IT" .
  * 
  * The address format should follow the conventions used by the national postal
  * service of the country. The country must be given as its two-letter ISO code
  * (e.g. "IT" for Italy). The URL of the OSM/XML data set to search can be
  * supplied via the optional 'xml' parameter. The application caches the RDF
  * data; if no URL is provided, it looks in the cache.
  * Returns the original RDF data enriched with geographical coordinates:
  *    
  * <http://example.org/res1> <http://schema.org/streetAddress> "Via Roma 1" ;
  *                           <http://schema.org/addressLocality> "Trento" ;
  *                           <http://schema.org/addressCountry> "IT" ;
  *                           <http://www.w3.org/2003/01/geo/wgs84_pos#lat> "46.3634673" ;
  *                           <http://www.w3.org/2003/01/geo/wgs84_pos#long> "11.0357087" .
  */
 @Override
 public TripleCollection generateRdf(HttpRequestEntity entity) throws IOException {    	
     TripleCollection resultGraph = new SimpleMGraph(); // graph to be sent back to the client
     Model dataGraph = ModelFactory.createDefaultModel(); // graph to store the data after the transformation
     String mediaType = entity.getType().toString();   
     String contentLocation = null;
     if ( entity.getContentLocation() != null ) {
         contentLocation = entity.getContentLocation().toString();
     }
             
     TripleCollection inputGraph = Parser.getInstance().parse( entity.getData(), mediaType);        
     
     Address address = getAddress( inputGraph );
     
     String mimeType = entity.getType().toString();        
     
     // Fetch the OSM data from the url and transforms it into RDF via XSL.
     Dataset dataset = null;
     log.info("Data Url : " + xmlUri);
     if( xmlUri != null){
     	try {
     	  InputStream xslt = getClass().getResourceAsStream( XSLT_PATH );
		  InputStream osmRdfIn = processor.processXml(xslt, getOsmData(xmlUri), contentLocation);
		  RDFDataMgr.read(dataGraph, osmRdfIn, null, Lang.TURTLE);
		  dataset = store(dataGraph);
		}
		catch(TransformerConfigurationException tce){
			throw new RuntimeException(tce.getMessage());
		} 
 		catch (TransformerException te) {
 			throw new RuntimeException(te.getMessage());
 		}
         
     }
     else {
         dataset = osmDataset;
     }
     
     // Geocoding: search for the street with the name sent by the client 
     // and return the geographic coordinates
     if(address != null && ! "".equals(address.getStreetAddress()))
         resultGraph = geocodeAddress(dataset, address);
     
     return resultGraph;
     
 }
 
Developer: fusepoolP3, Project: p3-osm-transformer, Lines: 68, Source: OsmRdfTransformer.java


Example 13: geocodeAddress

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Search for an address (a node in OSM).
 * @param graph The input graph contains a schema:streetAddress with the name of the street, the locality and the country code.
 * @return Returns the geocoordinates of the street that has been found. 
 */
private TripleCollection geocodeAddress(Dataset ds, Address address){
    TripleCollection geoCodeRdf = new SimpleMGraph();
    
    String pre = StrUtils.strjoinNL( 
        "PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>" ,
        "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>" ,
        "PREFIX schema: <http://schema.org/>" ,
        "PREFIX text: <http://jena.apache.org/text#>" ,
        "PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>" ,
        "PREFIX ogc: <http://www.opengis.net/ont/geosparql#>") ;
    
    String qs = StrUtils.strjoinNL( "SELECT ?s ?street ?lat ?lon" ,
                                " { ?s text:query (schema:streetAddress '" + address.getStreetAddress() + "') ;" ,
                                "      schema:streetAddress ?street ;" ,
                                "      schema:addressLocality \"" + address.getLocality() + "\" ;" ,
                                "      schema:addressCountry \"" + address.getCountryCode() + "\" ;" ,
                                "      geo:lat ?lat ;" ,
                                "      geo:long ?lon ." ,                                                                       
                                " }") ;
    
    log.info(pre + "\n" + qs);
    
    ds.begin(ReadWrite.READ) ;
    try {
        Query q = QueryFactory.create(pre + "\n" + qs) ;
        QueryExecution qexec = QueryExecutionFactory.create(q , ds) ;
        //QueryExecUtils.executeQuery(q, qexec) ;
        ResultSet results = qexec.execSelect();   
        int numberOfAddresses = 0;
        for( ; results.hasNext(); ){
            QuerySolution sol = results.nextSolution();
            String streetUriName = sol.getResource("s").getURI();
            String streetName = sol.getLiteral("?street").getString();  
            String latitude = sol.getLiteral("?lat").getLexicalForm();
            String longitude = sol.getLiteral("?lon").getLexicalForm();
            UriRef addressRef = new UriRef(streetUriName);                
            geoCodeRdf.add(new TripleImpl(addressRef, schema_streetAddress, new PlainLiteralImpl(streetName)));
            geoCodeRdf.add(new TripleImpl(addressRef, schema_addressLocality, new PlainLiteralImpl( address.getLocality())) );
            geoCodeRdf.add(new TripleImpl(addressRef, schema_addressCountry, new PlainLiteralImpl( address.getCountryCode())) );
            geoCodeRdf.add(new TripleImpl(addressRef, geo_lat, new PlainLiteralImpl( latitude )) );
            geoCodeRdf.add(new TripleImpl(addressRef, geo_lon, new PlainLiteralImpl( longitude )) );
            numberOfAddresses++;
        }
        log.info("Number of addresses like " + address.getStreetAddress() + " found: " + numberOfAddresses);
    } 
    finally { 
        ds.end() ; 
    }
    
    return geoCodeRdf;
}
 
Developer: fusepoolP3, Project: p3-osm-transformer, Lines: 57, Source: OsmRdfTransformer.java
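Note that geocodeAddress splices address.getStreetAddress() directly between quotes in the query string, so a street name containing an apostrophe (common in Italian toponyms, e.g. "Sant'Anna") would terminate the SPARQL literal early. A sketch of an escaping helper one could apply before concatenation; the class and method names here are hypothetical, not part of the original project.

```java
// Hypothetical helper: escape characters that would terminate a quoted
// SPARQL literal when user-supplied text is concatenated into a query string.
public class SparqlText {
    static String escapeLiteral(String s) {
        return s.replace("\\", "\\\\")  // escape backslashes first
                .replace("'", "\\'")     // for single-quoted literals
                .replace("\"", "\\\"");  // for double-quoted literals
    }

    public static void main(String[] args) {
        System.out.println(escapeLiteral("Sant'Anna"));
    }
}
```

With this, the pattern line would read schema:streetAddress '" + escapeLiteral(address.getStreetAddress()) + "' instead of splicing the raw value. Using Jena's ParameterizedSparqlString would be a more robust alternative.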


Example 14: geocodeStreet

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Search for a street (way in OSM) 
 * @param graph The input graph contains a schema:streetAddress with the name of the street.
 * @return Returns the geometry of the street that has been found with the coordinates serialized as WKT. 
 */
private TripleCollection geocodeStreet(Dataset ds, Address address){
    TripleCollection geoCodeRdf = new SimpleMGraph();
    
    String pre = StrUtils.strjoinNL( 
        "PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>" ,
        "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>" ,
        "PREFIX schema: <http://schema.org/>" ,
        "PREFIX text: <http://jena.apache.org/text#>" ,
        "PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>" ,
        "PREFIX ogc: <http://www.opengis.net/ont/geosparql#>") ;
    
    String qs = StrUtils.strjoinNL( "SELECT ?s ?street ?geometry ?wkt " ,
                                " { ?s text:query (schema:streetAddress '" + address.getStreetAddress() + "') ;" ,
                                "      schema:streetAddress ?street ;" ,
                                "      schema:addressLocality \"" + address.getLocality() + "\" ;" ,
                                "      schema:addressCountry \"" + address.getCountryCode() + "\" ;" ,
                                "      ogc:geometry ?geometry ." ,
                                "   ?geometry ogc:asWKT ?wkt ." ,
                                " }") ;
    
    System.out.println(pre + "\n" + qs);
    
    ds.begin(ReadWrite.READ) ;
    try {
        Query q = QueryFactory.create(pre + "\n" + qs) ;
        QueryExecution qexec = QueryExecutionFactory.create(q , ds) ;
        //QueryExecUtils.executeQuery(q, qexec) ;
        ResultSet results = qexec.execSelect();   
        int numberOfToponyms = 0;
        for( ; results.hasNext(); ){
            QuerySolution sol = results.nextSolution();
            String streetUriName = sol.getResource("s").getURI();
            String streetName = sol.getLiteral("?street").getString();
            Resource geo = sol.getResource("?geometry");
            String geoUri = geo.getURI();
            String wkt = sol.getLiteral("?wkt").getString();
            UriRef streetRef = new UriRef(streetUriName);
            UriRef geometryRef = new UriRef(geoUri);
            geoCodeRdf.add(new TripleImpl(streetRef, schema_streetAddress, new PlainLiteralImpl(streetName) ));
            geoCodeRdf.add(new TripleImpl(streetRef, schema_addressLocality, new PlainLiteralImpl( address.getLocality())) );
            geoCodeRdf.add(new TripleImpl(streetRef, schema_addressCountry, new PlainLiteralImpl( address.getCountryCode())) );
            geoCodeRdf.add(new TripleImpl(streetRef, new UriRef("http://www.opengis.net/ont/geosparql#geometry"), geometryRef));
            geoCodeRdf.add(new TripleImpl(geometryRef, new UriRef("http://www.opengis.net/ont/geosparql#asWKT"), new PlainLiteralImpl(wkt)));
            numberOfToponyms++;
        }
        log.info("Number of toponyms like " + address.getStreetAddress() + " found: " + numberOfToponyms);
    } 
    finally { 
        ds.end() ; 
    }
    
    return geoCodeRdf;
}
 
Developer: fusepoolP3, Project: p3-osm-transformer, Lines: 59, Source: OsmRdfTransformer.java


Example 15: findSameEntities

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * The client RDF data is always used as the source data source, of type file, for the comparisons with a target data source. 
 * The target data source can be of type file or a SPARQL endpoint. If the target data source in the Silk config file
 * is set to be of type file then the same client data will be used and the task is a deduplication task (Silk works only with local files). 
 * The updated configuration file, the input RDF data, and the output files are stored in the /tmp/ folder.
 * @param inputRdf
 * @return
 * @throws IOException
 */
protected TripleCollection findSameEntities(InputStream inputRdf, String rdfFormat, InputStream configIn) throws IOException {    	
	// Default silk config file
	File configFile = null;
	if(configIn != null){
		configFile = FileUtil.inputStreamToFile(configIn, "silk-config-", ".xml");
	}
	else {
		configFile = FileUtil.inputStreamToFile(getClass().getResourceAsStream("silk-config-file.xml"), "silk-config-", ".xml");
	}
	
    // file with original data serialized in N-TRIPLE format
    File ntFile = File.createTempFile("input-rdf", ".nt");
    // file containing the equivalences
    File outFile = File.createTempFile("output-", ".nt");
    
    // update the config file with the paths of the source datasource and output files and the format
    // if the type of target datasource is "file" update the path (deduplication) 
    SilkConfigFileParser silkParser = new SilkConfigFileParser(configFile.getAbsolutePath());
    silkParser.updateOutputFile(outFile.getAbsolutePath());
    silkParser.updateSourceDataSourceFile(ntFile.getAbsolutePath(), "N-TRIPLE");
    if (silkParser.getTargetDataSourcetype().equals("file")) {
        silkParser.updateTargetDataSourceFile(ntFile.getAbsolutePath(), "N-TRIPLE"); //deduplication
    }
    silkParser.saveChanges();
    
    // change the format into N-TRIPLE
    Parser parser = Parser.getInstance();
    TripleCollection origGraph =  parser.parse(inputRdf, rdfFormat);
    Serializer serializer = Serializer.getInstance();
    serializer.serialize(new FileOutputStream(ntFile), origGraph, SupportedFormat.N_TRIPLE);

    // interlink entities
    Silk.executeFile(configFile, null, 1, true);
    log.info("Interlinking task completed."); 
    TripleCollection equivalences = parseResult(outFile); 
    
    // add the equivalence set to the input rdf data to be sent back to the client
    TripleCollection resultGraph = new SimpleMGraph();
    resultGraph.addAll(origGraph);
    resultGraph.addAll(equivalences);
    
    // remove all temporary files
    configFile.delete();        
    ntFile.delete();
    outFile.delete();

    // returns the result to the client
    return resultGraph;
}
 
Developer: fusepoolP3, Project: p3-silkdedup, Lines: 59, Source: DuplicatesTransformer.java


Example 16: queryNearby

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Searches for points of interest within a circle of a given radius. 
 * The data used is stored in a named graph.
 * @param point
 * @param uri
 * @param radius
 * @return
 */
public TripleCollection queryNearby(WGS84Point point, String graphName, double radius){
    TripleCollection resultGraph = new SimpleMGraph();
    log.info("queryNearby()");
    long startTime = System.nanoTime();
    String pre = StrUtils.strjoinNL("PREFIX spatial: <http://jena.apache.org/spatial#>",
            "PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>",
            "PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>",
            "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>");
    
    String qs = StrUtils.strjoinNL("SELECT * ",
            "FROM NAMED <" + graphName + ">",
            "WHERE { ",
            "GRAPH <" + graphName + "> ",
            " { ?s spatial:nearby (" + point.getLat() + " " + point.getLong() + " " + radius + " 'm') ;",
            "      rdf:type ?type ; ",
            "      geo:lat ?lat ;" ,
            "      geo:long ?lon ; ",
            "      rdfs:label ?label .", " }",
            "}");

    log.info(pre + "\n" + qs);
    spatialDataset.begin(ReadWrite.READ);
    int poiCounter = 0;
    try {
        Query q = QueryFactory.create(pre + "\n" + qs);
        QueryExecution qexec = QueryExecutionFactory.create(q, spatialDataset);
        ResultSet results = qexec.execSelect() ;
        for ( ; results.hasNext() ; ) {
            QuerySolution solution = results.nextSolution() ;
            String poiUri = solution.getResource("s").getURI();
            String poiName = checkUriName(poiUri);
            String poiType = checkUriName(solution.getResource("type").getURI());
            String poiLabel = solution.getLiteral("label").getString();
            String poiLatitude = solution.getLiteral("lat").getString();
            String poiLongitude = solution.getLiteral("lon").getString();
            log.info("poi name: " + poiName + " label = " + poiLabel);
            UriRef poiRef = new UriRef(poiName);
            String positionUri = checkUriName(point.getUriName());
            resultGraph.add( new TripleImpl(poiRef, schema_containedIn, new UriRef(positionUri)) );               
            resultGraph.add( new TripleImpl(poiRef, RDFS.label, new PlainLiteralImpl(poiLabel)) );
            resultGraph.add( new TripleImpl(poiRef, RDF.type, new UriRef(poiType)));
            resultGraph.add( new TripleImpl(poiRef, geo_lat, new TypedLiteralImpl(poiLatitude, XSD.float_)) );
            resultGraph.add( new TripleImpl(poiRef, geo_long, new TypedLiteralImpl(poiLongitude, XSD.float_)) );  
            poiCounter++;
            
        }
      
    } 
    finally {
        spatialDataset.end();
    }
    long finishTime = System.nanoTime();
    double time = (finishTime - startTime) / 1.0e6;
    log.info(String.format("FINISH - %.2fms", time));
    log.info(String.format("Found " + poiCounter + " points of interest."));
    return resultGraph;

}
 
Developer: fusepoolP3, Project: p3-geo-enriching-transformer, Lines: 67, Source: SpatialDataEnhancer.java
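The spatial:nearby property function above delegates the radius filter to Jena's spatial index, but conceptually it is a great-circle distance test. A self-contained sketch of that distance check using the haversine formula; the class name and the mean Earth radius constant are assumptions of this illustration, not part of the original project.

```java
// Illustration of the distance test behind a "spatial:nearby (lat lon radius 'm')"
// query: haversine great-circle distance in meters between two WGS84 points.
public class GeoDistance {
    static double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
        final double R = 6_371_000; // mean Earth radius in meters (assumed)
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * R * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    public static void main(String[] args) {
        // Is the candidate point within a 500 m radius of the query point?
        double d = haversineMeters(46.0664, 11.1242, 46.0700, 11.1242);
        System.out.println(d <= 500);
    }
}
```

A spatial index avoids computing this distance against every point, which is why the example routes the query through the named graph's index rather than filtering in Java.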


Example 17: perform

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * Smush the union of the source, digest and enhancements graphs using the
 * interlinking graph. More precisely collates URIs coming from different
 * equivalent resources in a single one chosen among them. All the triples
 * in the union graph are copied in the smush graph that is then smushed
 * using the interlinking graph. URIs are canonicalized to http://
 *
 * @param graphToSmushRef
 * @return
 */
void perform() {
    messageWriter.println("Smushing task.");


    final SameAsSmusher smusher = new SameAsSmusher() {

        @Override
        protected UriRef getPreferedIri(Set<UriRef> uriRefs
        ) {
            Set<UriRef> httpUri = new HashSet<UriRef>();
            for (UriRef uriRef : uriRefs) {
                if (uriRef.getUnicodeString().startsWith("http")) {
                    httpUri.add(uriRef);
                }
            }
            if (httpUri.size() == 1) {
                return httpUri.iterator().next();
            }
            // There is no http URI in the set of equivalent resources: the entity was
            // unknown, so a new representation with an http URI will be created.
            if (httpUri.size() == 0) {
                return generateNewHttpUri(dataSet, uriRefs);
            }
            if (httpUri.size() > 1) {
                return chooseBest(httpUri);
            }
            throw new Error("Negative size set.");
        }

    };

    if (dataSet.getSmushGraph().size() > 0) {
        dataSet.getSmushGraph().clear();
    }

    dataSet.getSmushGraph().addAll(dataSet.getDigestGraph());
    dataSet.getSmushGraph().addAll(dataSet.getEnhancementsGraph());
    log.info("All triples from the union of digest and enhancements graph are now in the smush graph.");
    log.info("Starting smushing.");
    smusher.smush(dataSet.getSmushGraph(), dataSet.getInterlinksGraph(), true);
    log.info("Smush task completed.");

    // Remove from smush graph equivalences between temporary uri (urn:x-temp) and http uri that are added by the clerezza smusher.
    // These equivalences must be removed as only equivalences between known entities (http uri) must be maintained and then published
    MGraph equivToRemove = new SimpleMGraph();
    Lock srl = dataSet.getSmushGraph().getLock().readLock();
    srl.lock();
    try {
        Iterator<Triple> isameas = dataSet.getSmushGraph().filter(null, OWL.sameAs, null);
        while (isameas.hasNext()) {
            Triple sameas = isameas.next();
            NonLiteral subject = sameas.getSubject();
            Resource object = sameas.getObject();
            if (subject.toString().startsWith("<" + URN_SCHEME) || object.toString().startsWith("<" + URN_SCHEME)) {
                equivToRemove.add(sameas);
            }
        }
    } finally {
        srl.unlock();
    }

    dataSet.getSmushGraph().removeAll(equivToRemove);

    messageWriter.println("Smushing of " + dataSet.getUri()
            + " completed. Smushed graph size = " + dataSet.getSmushGraph().size());
    canonicalizeResources();

}
 
Developer: fusepool, Project: datalifecycle, Lines: 79, Source: SmushingJob.java
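The getPreferedIri logic above reduces to: keep only the http URIs, mint a new one if none exist, and break ties when several exist. A simplified stand-alone sketch of that selection, using plain strings instead of UriRef; returning null stands in for generateNewHttpUri, and picking the lexicographically smallest URI stands in for chooseBest (neither helper is shown in the original).

```java
import java.util.Set;
import java.util.TreeSet;

// Simplified illustration of the smusher's preferred-IRI choice:
// prefer an http URI; null signals that a new http URI must be generated.
public class IriChoice {
    static String preferredIri(Set<String> uris) {
        TreeSet<String> httpUris = new TreeSet<>();
        for (String u : uris) {
            if (u.startsWith("http")) httpUris.add(u);
        }
        // No http URI among the equivalents: caller must mint one
        // (generateNewHttpUri in the original code).
        if (httpUris.isEmpty()) return null;
        // One or more http URIs: return the first in lexical order
        // (a stand-in for the original chooseBest).
        return httpUris.first();
    }

    public static void main(String[] args) {
        Set<String> equiv = Set.of("urn:x-temp:42", "http://example.org/res1");
        System.out.println(preferredIri(equiv));
    }
}
```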


Example 18: canonicalizeResources

import org.apache.clerezza.rdf.core.impl.SimpleMGraph; // import the required package/class
/**
 * All the resources in the smush graph must be http-dereferenceable when
 * published. All the triples in the smush graph are copied into a temporary
 * graph. For each triple, any subject or object with a non-http URI is
 * replaced by an http URI, and an equivalence link is added to the
 * interlinking graph for each resource (subject or object) that was
 * changed.
 */
private void canonicalizeResources() {
    LockableMGraph graph = dataSet.getSmushGraph();
    MGraph graphCopy = new SimpleMGraph();
    // graph containing the same triple with the http URI for each subject and object
    MGraph canonicGraph = new SimpleMGraph();
    Lock rl = graph.getLock().readLock();
    rl.lock();
    try {
        graphCopy.addAll(graph);
    } finally {
        rl.unlock();
    }

    Iterator<Triple> ismushTriples = graphCopy.iterator();
    while (ismushTriples.hasNext()) {
        Triple triple = ismushTriples.next();
        UriRef subject = (UriRef) triple.getSubject();
        Resource object = triple.getObject();
        // generate an http URI for both subject and object and add an equivalence link into the interlinking graph
        if (subject.getUnicodeString().startsWith(URN_SCHEME)) {
            subject = generateNewHttpUri(dataSet, Collections.singleton(subject));
        }
        if (object.toString().startsWith("<" + URN_SCHEME)) {
            object = generateNewHttpUri(dataSet, Collections.singleton((UriRef) object));
        }

        // add the triple with the http uris to the canonic graph
        canonicGraph.add(new TripleImpl(subject, triple.getPredicate(), object));
    }

    Lock wl = graph.getLock().writeLock();
    wl.lock();
    try {
        graph.clear();
        graph.addAll(canonicGraph);
    } finally {
        wl.unlock();
    }

}
 
Developer: fusepool, Project: datalifecycle, Lines: 49, Source: SmushingJob.java



Note: the org.apache.clerezza.rdf.core.impl.SimpleMGraph class examples in this article were collected from GitHub, MSDocs, and other source-code and documentation hosting platforms; the snippets were selected from open-source projects contributed by their developers. Copyright of the source code remains with the original authors; for redistribution and use, refer to the corresponding project's license. Do not reproduce without permission.

