(continued from part III)
Over the last year I have hardly had time to advance my Perl client for the AllegroGraph tuple server, which is a shame, as it is fun to take the existing REST interface and offer it with a perlish mindset.
So when the Perl hackathon (report) took place in Vienna these days, with the RDF Perl hackathon coincidentally following in Geilo next week, I thought I should use the opportunity to rub shoulders with the Perl luminaries, at least for half a day. But when I arrived, everyone was already in deep hacking mode, so I had no excuse but to do some programming myself.
At the start of December I attended the ESTC 2009 (European Semantic Technologies Conference) here in Vienna.
It was my first attendance, and it was interesting to see what the non-academic faction of the semantic web crowd looked like. I had seen the academic portion at the ESWC 2009, where I presented a paper on semantic time series in one of the workshops.
And while I wore my Topic Maps T-shirt publicly in Heraklion, at the ESTC it was kept well hidden under my pullover all the time.
(continued from Part II)
One thing Formula 3 (F3) was designed to help with is the quantitative transformation of time series. If you had to compute the mean values over a 1-hour interval of, say, the ticket sales, then the following TSP operator takes care of it:
every 10 minutes
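The TSP expression itself is only hinted at above; to make the intent concrete, here is a plain-Python sketch (not TSP) of the kind of windowed aggregation such an operator performs. The function name, the sample data, and the window width are all illustrative assumptions, not part of F3.

```python
from datetime import datetime, timedelta

def windowed_mean(series, width):
    """Aggregate an ascending list of (timestamp, value) pairs
    into one mean value per fixed-width time window."""
    if not series:
        return []
    out = []
    window_start, acc = series[0][0], []
    for ts, v in series:
        # close (possibly empty) windows until ts falls into the current one
        while ts >= window_start + width:
            if acc:
                out.append((window_start, sum(acc) / len(acc)))
            window_start += width
            acc = []
        acc.append(v)
    if acc:
        out.append((window_start, sum(acc) / len(acc)))
    return out

# toy ticket-sales series, one sample every 20 minutes
t0 = datetime(2009, 12, 1, 9, 0)
sales = [(t0 + timedelta(minutes=20 * i), v)
         for i, v in enumerate([4, 6, 8, 10, 12, 14])]
print(windowed_mean(sales, timedelta(hours=1)))
```

With hourly windows, the first three samples average to 6.0 and the next three to 12.0; shrink `width` to `timedelta(minutes=10)` and each sample becomes its own window.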
(continued from Part I)
Last time I implicitly proposed to think of some parts of a (geosemantic) application in terms of time series. This is not so far-fetched; consider for instance a semantically enabled tourism application for, say, Vienna.
Sure, there are a number of very static things you would store into the semantic network:
- the sights, churches, the cathedral, churches, and even more churches,
- the museums, galleries, museums, and even more museums,
- the tourism ontology, containing buildings, museums, and yes, the churches.
But even if this is Vienna, not everything is static: there are (insane) traffic conditions, (predominantly Italian and Spanish) tourists roaming through the city, and concert tickets sold at the weirdest places. All these are perfect candidates to be packed into time series.
When people design semantic systems, a typical architecture looks something like this:
- an RDF tuple (or Topic Maps) store for the odd and irregular data,
- some relational DB, either imported into the semantic store, or wrapped, or linked via a message bus (MQ, events, ...),
- some more or less sophisticated integration, and
- the user interface on top of it.
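Very schematically, that division of labour can be sketched as follows; every name, table, and value here is a toy assumption, with a dict standing in for the semantic store and an in-memory SQLite table for the relational side.

```python
import sqlite3

# Toy stand-in for the semantic store: subject -> [(predicate, object)].
semantic_store = {
    "vienna:Stephansdom": [("rdf:type", "tourism:Church"),
                           ("tourism:locatedIn", "vienna:Innere_Stadt")],
}

# Toy stand-in for the relational side: dynamic ticket-sales figures.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (site TEXT, sold INTEGER)")
db.execute("INSERT INTO sales VALUES ('vienna:Stephansdom', 42)")

def portal_view(site):
    """The 'integration' step: merge static facts with dynamic figures
    into one structure the user interface could render."""
    facts = semantic_store.get(site, [])
    sold = db.execute("SELECT sum(sold) FROM sales WHERE site = ?",
                      (site,)).fetchone()[0]
    return {"site": site, "facts": facts, "tickets_sold": sold}

print(portal_view("vienna:Stephansdom"))
```

The point of the sketch is merely where the seam sits: the static facts live in one store, the dynamic numbers in another, and the integration layer stitches them together per request.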
Now this is all well and good for your middle-of-the-road semantic portal, but the class of applications I have in mind has one thing in common:
Data dynamics, with temporal and geospatial aspects. And all that with physical units.
CatBert was in a very aggressive mood. He had interviewed me all day on subject identification and how it worked for the Topic Maps and RDF stacks.
But the longer he investigated, the more pointed his remarks became. This evening he even said:
Robert, I am getting tired of the lack of imagination in the semantic web community.
This weekend I have commenced work on RDF::SKOS.
As most of you know (many have observed this before, and larsbot has blogged about it in the past), SKOS is strikingly similar to Topic Maps, although in detail its semantics is much more limited. Interesting from a standards-political point of view.
There is no Perl coverage on CPAN yet. That has to change.
A while back I experimented with the Java client API of AllegroGraph to talk to a triple store.
The latest release (V3.1.1) also sports a Python client, which immediately aroused my interest, and that for several reasons:
- It uses a new HTTP protocol to talk to the AllegroGraph server, one based on JSON.
- Its API follows that of Sesame.
The Python client shipped with the distribution uses this protocol, so I wondered how it would pan out in Perl. Pan out on CPAN, so to say. Last weekend I did some tinkering and concocted a first, naive Perl client.
The following simply goes through the basic motions, as also described by the Python tutorial.
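To show what such a Sesame-shaped JSON-over-HTTP client boils down to, here is a minimal Python sketch (the Perl version would use LWP::UserAgent and a JSON module analogously). The host, port, and repository name are invented placeholders, and the `/repositories/<id>?query=...` path merely follows the Sesame REST convention mentioned above, not any verified AllegroGraph endpoint.

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:10035"   # hypothetical server location
REPO = "scratch"                  # hypothetical repository name

def query_url(sparql):
    """Build a Sesame-style query URL: GET /repositories/<id>?query=..."""
    return (f"{BASE}/repositories/{REPO}?"
            + urllib.parse.urlencode({"query": sparql}))

def run_query(sparql):
    """Fire the query against the server and decode the JSON result set."""
    req = urllib.request.Request(query_url(sparql),
                                 headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# no server needed to see the shape of the request:
print(query_url("SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10"))
```

Wrapping exactly this kind of URL construction and JSON decoding behind tied hashes or method calls is where the "perlish mindset" comes in.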
(by Lara Spendier, adaptations by rho)
RDF::Redland is the Perl wrapper around the RDF Redland libraries. At the Austrian Research Centers Seibersdorf we (actually Lara Spendier) created a first version of RDF::Redland::DIG, which provides a DIG reasoner backend for Redland.
While Redland supports SPARQL to a large degree, there is no support for OWL or any other kind of reasoning. This is where RDF::Redland::DIG kicks in: it is an extension to exchange information with a DIG reasoner.
DIG is an XML-based protocol applications can use to talk to a reasoner process. In the first steps you will have
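To give a taste of what travels over the wire, here is a Python sketch that assembles a DIG-1.1-style request with the standard library. The knowledge-base URI is a made-up placeholder (a real one is handed out by the reasoner when a KB is created), and the message shape follows the DIG 1.1 convention as I understand it, not RDF::Redland::DIG's actual API.

```python
import xml.etree.ElementTree as ET

# Namespace as published in the DIG 1.1 interface description.
DIG = "http://dl.kr.org/dig/2003/02/lang"
KB_URI = "urn:example:kb1"   # hypothetical knowledge-base handle

def asks_all_concepts(kb_uri):
    """Build a DIG <asks> request querying all concept names in a KB."""
    asks = ET.Element(f"{{{DIG}}}asks", {"uri": kb_uri})
    ET.SubElement(asks, f"{{{DIG}}}allConceptNames", {"id": "q1"})
    return ET.tostring(asks, encoding="unicode")

# this XML document would be POSTed to the reasoner process:
print(asks_all_concepts(KB_URI))
```

The reasoner answers with a matching XML response document, keyed by the `id` of each query, which the Redland side then has to fold back into its model.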
(This is a reworked and updated version of an earlier version)
Update: You may also be interested in how to use OWL and reasoning with RDF::Redland.