Sunday, December 21, 2014

A Solr RDF Store and SPARQL endpoint in just 2 minutes

How do you store and query RDF data in Solr? Here is a quick guide: just 2 minutes and 5 steps and you will get there ;)

1. All you need  

  • A shell  (in case you are on the dark side of the moon, all steps can be easily done in Eclipse or whatever IDE) 
  • Java (7)
  • Apache Maven (3.x)
  • git 
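A quick sanity check before starting (a small sketch; it just looks each tool up on the PATH):

```shell
# Report which of the required tools are on the PATH;
# anything marked MISSING needs to be installed first
for tool in java mvn git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: MISSING"
  fi
done
```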

2. Checkout SolRDF code

Open a shell and type the following:

# cd /tmp
# git clone https://github.com/agazzarini/SolRDF.git solrdf-download

3. Build and Run SolRDF

# cd solrdf-download/solrdf
# mvn clean install
# cd solrdf-integration-tests
# mvn clean package cargo:run

The very first time you run this command, a lot of things will be downloaded, Solr included. At the end you should see something like this:

[INFO] Jetty 7.6.15.v20140411 Embedded started on port [8080]
[INFO] Press Ctrl-C to stop the container...

SolRDF is up and running! 

4. Add some data

Open another shell and type the following:

# curl -v "http://localhost:8080/solr/store/update/bulk?commit=true" \
  -H "Content-Type: application/n-triples" \
  --data-binary @/tmp/solrdf-download/solrdf/src/test/resources/sample_data/bsbm-generated-dataset.nt 

Wait a moment... ok! You just added about 5,000 triples!
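By the way, the bulk handler accepts any N-Triples payload, so you can load your own data the same way. N-Triples is just one triple per line: subject, predicate, object, terminated by a dot. A minimal hand-made file (the URIs below are made up for illustration):

```shell
# Write a tiny hand-made N-Triples file: one <s> <p> <o> . statement per line
cat > /tmp/mini.nt <<'EOF'
<http://example.org/book/1> <http://purl.org/dc/elements/1.1/title> "A Sample Book" .
<http://example.org/book/1> <http://purl.org/dc/elements/1.1/creator> "Jane Doe" .
EOF

# Sanity check: every statement must end with ' .'
grep -c ' \.$' /tmp/mini.nt

# Load it with the same bulk update command used above:
# curl -v "http://localhost:8080/solr/store/update/bulk?commit=true" \
#   -H "Content-Type: application/n-triples" \
#   --data-binary @/tmp/mini.nt
```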

5. Execute some queries

Open another shell and type the following:

# curl "http://localhost:8080/solr/store/sparql" \
  --data-urlencode "q=SELECT * WHERE { ?s ?p ?o } LIMIT 10" \
  -H "Accept: application/sparql-results+json"

# curl "http://localhost:8080/solr/store/sparql" \
  --data-urlencode "q=SELECT * WHERE { ?s ?p ?o } LIMIT 10" \
  -H "Accept: application/sparql-results+xml"
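The JSON flavour follows the standard SPARQL 1.1 Query Results JSON format: a head section listing the variables, and a results.bindings array with one entry per row. Here is a hand-made sample (made up for illustration) and a quick-and-dirty way to pull the bound values out:

```shell
# A hand-made sample shaped like a SPARQL 1.1 JSON result
# for SELECT * WHERE { ?s ?p ?o }
cat > /tmp/result.json <<'EOF'
{
  "head": { "vars": [ "s", "p", "o" ] },
  "results": {
    "bindings": [
      { "s": { "type": "uri", "value": "http://example.org/book/1" },
        "p": { "type": "uri", "value": "http://purl.org/dc/elements/1.1/title" },
        "o": { "type": "literal", "value": "A Sample Book" } }
    ]
  }
}
EOF

# Extract just the bound values (a rough text scrape, not a real JSON parser)
grep -o '"value": "[^"]*"' /tmp/result.json | sed 's/"value": "\(.*\)"/\1/'
```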

Et voilà! Enjoy! I'm still working on this, so any suggestion about the idea is warmly welcome... and if you meet some annoying bug, feel free to give me a shout ;)
