Mapwarper Tutorial & Spatial Humanities Workshop by Lincoln Mullen

Lincoln Mullen has written a great series of workshops on the spatial/digital humanities, spread over five days. Day 3 focuses on georectification with Mapwarper.net and is a very good tutorial.
[Screenshot: Mapwarper tutorial preview]
The full workshop contents are here, and I recommend you check them out:
Day 1: Introduction and Setup | Map Literacy | Narrative Maps
Day 2: Data Maps | QGIS
Day 3: Georectification | Working with Spatial Data
Day 4: Deep Maps | From Manuscripts to Maps
Day 5: Programmatic Maps

Lincoln is an assistant professor in the Department of History and Art History at George Mason University, working on the history of American religions as a digital historian.

The link to the Georectification Workshop is here: http://lincolnmullen.com/projects/spatial-workshop/georectification.html



Mapwarper featured in A Digital Humanities Primer for English Students

Jenna Herdman has written an excellent free e-book about digital humanities for English students, which has an entire chapter, "Digital Mapping Tool Tutorial", featuring the Mapwarper. It's been published using GitBook and is available in PDF, HTML and EPUB formats.

The tutorial covers adding a map to mapwarper.net to chart the movements of David in Charles Dickens’s David Copperfield.


The map is then loaded into Palladio, which is a new tool for me; it "is a web-based platform for the visualization of complex, multi-dimensional data".


Do check out this great resource. The book has seven chapters in total and all of them are interesting and worthwhile to read! https://www.gitbook.com/book/jennaherdman/a-digital-humanities-primer-for-english-students/details


Devise Omniauth OAuth Strategy for MediaWiki (Wikipedia, WikiMedia Commons)

Authentication of MediaWiki users with a Rails Application using Devise and Omniauth

Wikimaps is a Wikimedia Commons project to georeference/georectify historical maps; read the Wikimaps blog here. It uses a customised version of the Mapwarper open source map georectification software, as seen on http://mapwarper.net, to talk to the Commons infrastructure, and it runs on the Wikimedia Foundation's Labs servers. We needed a way to allow Commons users to log in easily, so I developed the omniauth-mediawiki strategy gem so that your Ruby applications can authenticate against Wikimedia wikis such as Wikipedia.org and Wikimedia Commons.

[Screenshot: login options on Wikimaps Warper]

The Wikimaps Warper application uses Devise, which works very nicely with OmniAuth. The above image shows traditional login with username and password alongside OmniAuth logins via Wikimedia Commons, GitHub and OpenStreetMap.

After clicking the Wikimedia Commons button, the user is presented with this OAuth authorization dialog:

[Screenshot: Wikimedia OAuth authorization dialog]

It may not be that pretty, but once the user grants access they are redirected back to our app and logged in.

This library used the omniauth-osm library as an initial framework to build upon.

The code is on GitHub here: https://github.com/timwaters/omniauth-mediawiki

The gem on RubyGems is here: https://rubygems.org/gems/omniauth-mediawiki

And you can install it by including it in your Gemfile or by doing:

gem install omniauth-mediawiki

Create new registration

The mediawiki.org registration page is where you create an OAuth consumer registration for your application. You can specify all Wikimedia wikis or a specific one to work with. Registration creates a key and secret which will work for your own user straight away, so you can start developing immediately, although currently a wiki admin has to approve the registration before other wiki users can use it. Hopefully this will change as more applications move away from HTTP Basic to more secure authentication and authorization strategies in the future!

[Screenshot: OAuth consumer registration form on mediawiki.org]

Usage

Usage is as per any other OmniAuth 1.0 strategy. If you're using Rails, you need to add the strategy to your `Gemfile` alongside omniauth:

gem 'omniauth'
gem 'omniauth-mediawiki'

Once these are in, you need to add the following to your `config/initializers/omniauth.rb`:

Rails.application.config.middleware.use OmniAuth::Builder do
  provider :mediawiki, "consumer_key", "consumer_secret"
end

If you are using Devise, this is how it looks in your `config/initializers/devise.rb`:

config.omniauth :mediawiki, "consumer_key", "consumer_secret", 
    {:client_options => {:site => 'http://commons.wikimedia.org' }}

If you would like to use this plugin against a specific wiki, you can set the environment variable WIKI_AUTH_SITE to the server to connect to. Alternatively, you can pass the site as a client_option to the omniauth config, as seen above. If no site is specified, the http://www.mediawiki.org wiki will be used.

Notes

In general, see the pages around https://www.mediawiki.org/wiki/OAuth/For_Developers for more information.

When registering a new OAuth consumer you need to specify the callback URL properly, e.g. for development:

http://localhost:3000/u/auth/mediawiki/callback
http://localhost:3000/users/auth/mediawiki/callback

This is different from many other OAuth authentication providers, which allow the consumer applications to specify what the callback should be. Here we have to define the URL when we register the application; it's not possible to alter the URL after the registration has been made.
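For reference, the second callback URL above is what a standard Devise route setup produces – something like the following (the custom controller name is the usual Devise convention, not something this gem requires):

devise_for :users, :controllers => { :omniauth_callbacks => "users/omniauth_callbacks" }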

Internally the strategy library has to use `/w/index.php?title=` paths in a few places, like so:

:authorize_path => '/wiki/Special:Oauth/authorize',
:access_token_path => '/w/index.php?title=Special:OAuth/token',
:request_token_path => '/w/index.php?title=Special:OAuth/initiate',

This could be due to a bug in the OAuth extension, or due to how the wiki redirects from /wiki/Special pages to /w/index.php pages. I suspect this may change in the future.

Another thing to note is that the MediaWiki OAuth implementation uses a cool but non-standard way of identifying the user. OmniAuth and Devise need a way to get the identity of the user. Calling '/w/index.php?title=Special:OAuth/identify' returns a JSON Web Token (JWT). The JWT is signed using the OAuth secret, so the library decodes it and gets the user information.
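To illustrate, here is a rough sketch of that identify/decode step, assuming the jwt gem and an OAuth::AccessToken like the one built later in this post – the variable names are illustrative, and the real strategy code does more checking:

require 'jwt'

# fetch the identify endpoint using the OAuth-signed access token
resp = access_token.get('/w/index.php?title=Special:OAuth/identify')

# the JWT is HMAC-signed with the OAuth consumer secret, so verify and
# decode it with that secret to recover the user info claims
payload, _header = JWT.decode(resp.body, consumer_secret, true, :algorithm => 'HS256')
payload['username'] # => the wiki user name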

Calling the MediaWiki API

OmniAuth is mainly about authentication – it's not really about using OAuth to do things on the user's behalf – but it's relatively easy to do so if you want to. The recommendation is to use it in conjunction with other libraries; for example, if you are using omniauth-twitter, you should use the Twitter gem with the OAuth credentials to post tweets. There is no such gem for MediaWiki which uses OAuth. Existing Ruby libraries such as MediaWiki Gateway and MediaWiki Ruby API currently only use usernames and passwords – but they should be looked at for help in crafting the necessary requests.

So we will have to use the OAuth library and call the MediaWiki API directly:

In this example we'll call the Wikimedia Commons API.

Within a Devise/Omniauth setup, in the callback method, you can directly get an OAuth::AccessToken via request.env["omniauth.auth"]["extra"]["access_token"], or you can get the token and secret from request.env["omniauth.auth"]["credentials"]["token"] and request.env["omniauth.auth"]["credentials"]["secret"].
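For example, a minimal Devise callback controller might look like this – the auth_token and auth_secret fields on the User model are my own naming for this sketch, not anything mandated by the gem:

class Users::OmniauthCallbacksController < Devise::OmniauthCallbacksController
  def mediawiki
    auth = request.env["omniauth.auth"]
    user = User.find_or_create_by(:provider => auth["provider"], :uid => auth["uid"])
    # store the token and secret so we can call the API on the user's behalf later
    user.update(:auth_token => auth["credentials"]["token"],
                :auth_secret => auth["credentials"]["secret"])
    sign_in_and_redirect user, :event => :authentication
  end
end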

Assuming the authentication token and secret are stored in the user model, the following could be used to query the MediaWiki API at a later date.

@consumer = OAuth::Consumer.new "consumer_key", "consumer_secret",
            {:site=>"https://commons.wikimedia.org"}
@access_token = OAuth::AccessToken.new(@consumer, user.auth_token, user.auth_secret)
uri = 'https://commons.wikimedia.org/w/api.php?action=query&meta=userinfo&uiprop=rights|editcount&format=json'
resp = @access_token.get(URI.encode(uri))
logger.debug resp.body.inspect
# {"query":{"userinfo":{"id":12345,"name":"WikiUser",
# "rights":["read","writeapi","purge","autoconfirmed","editsemiprotected","skipcaptcha"],
# "editcount":2323}}}

Here we called the query action for userinfo, asking for rights and editcount information.

Rain Prediction for the immediate future using Met Office DataPoint – rain graph

This is part of a series of posts covering some small projects I did whilst not being able to work. They range from the role of familiar strangers on the internet and anti-social networks, through meteorological hacks and funny memes, to Twitter bots. This post is about a meteorological hack.

The Met Office in the UK have this last year published an API for their range of services – all part of the open data movement, I think.

DataPoint is a way of accessing freely available Met Office data feeds in a format that is suitable for application developers. It is aimed at professionals, the scientific community and student or amateur developers, in fact anyone looking to re-use Met Office data within their own innovative applications.

The year before, in Denver, USA, a couple of awesome mapping and weather geeks showed me a mobile app that showed when it was going to rain – and, more importantly, when it wasn't going to rain – at very high temporal resolution. You can use the app to know whether to get a swift half and then leave to catch the bus, or whether to stay in for an hour until the showers end. It was very detailed and highly useful. This app, Dark Sky, was freaking awesome. And I wanted it here in the UK, so when the Met Office announced their API I was interested.

You cannot do what Dark Sky does with the Met Office DataPoint API, though – what you can do is some interpolation. The API for precipitation forecasts only gives access to a three-hourly map tile.

http://www.metoffice.gov.uk/datapoint/product/precipitation-forecast-map-layer

Further poking around shows that they do have an undocumented one-hourly image, though.

[Screenshot: the undocumented hourly precipitation layer]

These map tiles could then be used. http://rain-graph.herokuapp.com is the resulting application, with the code here: https://github.com/timwaters/rain_graph

It's a Ruby Sinatra application which, for a given location, grabs the precipitation tile for each hour from now, looks at the pixel value at that location, and determines the predicted amount of rain. It shows when the heaviest rain is predicted and when it should stop. Interpolation is done by the graph engine itself – no fancy meteorological modelling is done (at this stage). It uses chunky_png to get the pixel values.
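The core idea boils down to something like this sketch – the tile URL is a placeholder rather than the real DataPoint endpoint, and the tile maths is just the standard slippy-map scheme:

require 'chunky_png'
require 'open-uri'

# convert lat/lon to tile x/y plus the pixel position within that 256px tile
def tile_and_pixel(lat, lon, zoom)
  n = 2 ** zoom
  x = (lon + 180.0) / 360.0 * n
  lat_rad = lat * Math::PI / 180.0
  y = (1 - Math.log(Math.tan(lat_rad) + 1 / Math.cos(lat_rad)) / Math::PI) / 2 * n
  [x.floor, y.floor, ((x % 1) * 256).floor, ((y % 1) * 256).floor]
end

tx, ty, px, py = tile_and_pixel(53.80, -1.55, 7) # roughly Leeds

# placeholder URL -- the real app requests the DataPoint precipitation layer
blob = URI.open("https://datapoint.example/precip/#{tx}/#{ty}.png").read
png  = ChunkyPNG::Image.from_blob(blob)
pixel = png[px, py] # packed RGBA value; map its colour to a rain amount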

[Screenshot: the resulting rain graph for a location]

All requests are cached, to avoid hitting the Met Office API and because an image won't change for an hour. Additionally it uses another API method to get a human-readable upcoming forecast text for that location, and displays it under the graph. Contrary to popular global belief it's not always raining in the UK, so most of the time the graph won't show anything!
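The caching can be as simple as memoising tiles by position and hour – a sketch (the real app may use a proper cache store, and the URL is again a placeholder):

require 'open-uri'

TILE_CACHE = {}

# fetch each tile at most once per position/hour combination
def cached_tile(tx, ty, hour)
  TILE_CACHE[[tx, ty, hour]] ||=
    URI.open("https://datapoint.example/precip/#{tx}/#{ty}.png?hour=#{hour}").read
end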

Considerations:

Pixels to Lat Lon:
Since a lat/lon location is quite specific, it maps to one pixel in a tile, and that pixel could have a lower or higher value than the ones surrounding it. I could use a kernel average – do a 6×6 pass around the pixel and take the average value. But since there are tiles at lower zoom levels, zooming out means the spatial extent of one pixel covers that larger area – it would do the work for us.
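A sketch of that kernel average, assuming a ChunkyPNG image like the one in the earlier sketch (the window size and the choice of the red channel are arbitrary here):

# average the red channel over a size x size window centred on (px, py),
# skipping coordinates that fall off the edge of the tile
def window_average(png, px, py, size = 6)
  half = size / 2
  values = []
  (py - half).upto(py + half - 1) do |y|
    (px - half).upto(px + half - 1) do |x|
      next if x < 0 || y < 0 || x >= png.width || y >= png.height
      values << ChunkyPNG::Color.r(png[x, y])
    end
  end
  values.empty? ? 0 : values.inject(0, :+) / values.size.to_f
end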

Interpolation between forecasts:
It wasn't clear whether the forecast images showed the predicted situation over the whole hour, or the situation at that moment. Should a user look at an animation to see how a rain cloud moves across from A->B and guess that in between there would be rain, or should they assume that there would be no rain if none is shown?

User Interface:
It looks a bit bland – we should show the image tiles underneath, perhaps when hovering over a point.

Accuracy:
I haven’t tested the accuracy of this.

Location hard coding:
The text forecasts are hardcoded to a set number of regions, but we could do a closest-point lookup and get the correct forecast for the given lat and lon.
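A sketch of that closest-point lookup – the region list and coordinates below are invented for illustration:

# pick the forecast region whose centre is nearest to the given lat/lon
REGIONS = {
  "Yorkshire"  => [53.96, -1.08],
  "North West" => [53.48, -2.24],
  "London"     => [51.51, -0.13]
}

def closest_region(lat, lon)
  REGIONS.min_by do |_name, (rlat, rlon)|
    # equirectangular approximation -- plenty accurate at UK scale
    dx = (rlon - lon) * Math.cos((rlat + lat) / 2 * Math::PI / 180)
    dy = rlat - lat
    dx * dx + dy * dy
  end.first
end

closest_region(53.80, -1.55) # => "Yorkshire"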

Use Yr.No API

Yr.no has a detailed hour-by-hour forecast API for a place, giving the amount of precipitation.
http://www.yr.no/place/United_Kingdom/England/Leeds/hour_by_hour_detailed.html

<time from="2013-12-06T19:00:00" to="2013-12-06T20:00:00">
<!-- Valid from 2013-12-06T19:00:00 to 2013-12-06T20:00:00 -->
<symbol number="3" name="Partly cloudy" var="mf/03n.11" />
<precipitation value="0" /><!-- Valid at 2013-12-06T19:00:00 -->
<windDirection deg="294.2" code="WNW" name="West-northwest" />
<windSpeed mps="4.3" name="Gentle breeze" />
<temperature unit="celsius" value="1" />
<pressure unit="hPa" value="1004.9" />
</time>
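Pulling the hourly precipitation values out of that feed is straightforward – a sketch using nokogiri, where the XML URL is illustrative (Yr.no serves the same data as the page linked above):

require 'nokogiri'
require 'open-uri'

url = 'http://www.yr.no/place/United_Kingdom/England/Leeds/forecast_hour_by_hour.xml'
doc = Nokogiri::XML(URI.open(url))

# each <time> block carries a <precipitation value="..."/> element
doc.xpath('//time').each do |t|
  puts "#{t['from']}: #{t.at_xpath('precipitation')['value']} mm"
end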

Leeds Data Thing, Maps and Hackdays

Leeds Data Thing is a new group started in Leeds (not to be confused with Leeds Ruby Thing!).

I spoke at the first event (read the write-up from Rebecca) about geospatial visualisations and OpenStreetMap. Here are the slides:

Since then there have been a few other events as part of Big Data Week – including a load of great short talks.

This weekend there was a data hackday at the UK’s NHS Information Centre for Health and Social Care in the centre of Leeds.


There's a wealth of data on their website, but it was given to us as a MySQL database which we were able to access remotely. On the first day I poked around the data and had a thought.

Hackdays

I often spend the first part of any hackday wondering what to do and twiddling my thumbs. Hackdays become for me a type of busman's holiday – and this one was particularly geographical in nature. Most of the entries had some kind of data-on-a-map component. I think these types of analyses, whilst very smart and interesting – and perhaps exactly what the judges are looking for – may not exactly stretch the unexpected, or "the hack", in the data.

Fortunately there was plenty of latitude for exploring things laterally. The most interesting dataset listed the chemicals and drugs each practice spent money on, but I couldn't find much to do with it. What caught my eye was the dataset listing the names of the doctors' surgeries, practices and medical centres. Walking around my neighbourhood I can pass about half a dozen doctors in a very small area. Leeds is well covered (or perhaps just my area is!). I was reminded of James Joyce's quote about being unable to cross Dublin without passing a pub. Perhaps the same can be said for Leeds and doctors! The names of the surgeries were also interesting. Names such as:

Chapeloak Surgery
The Avenue Surgery
Dr Ca Hicks’ Practice
The Dekeyser Group Practice
The Highfield Medical Centre
Chapeltown Family Surgery

I wondered whether the more "leafy" the name, the more "leafy" the neighbourhood it was in. Perhaps the more grandiose-sounding practices had more patients? Perhaps the smaller-sounding ones had better patient satisfaction reviews?

At the venue it appeared that I was the only one using Linux on the desktop, and so the wifi did not work for me – leaving me a bit over one hour to put something together. I decided to go with the concept of "Leeds is covered" and wanted something showing the labels of the practices over the areas where they were – filling out the map, so to speak. The hack was called "Tim's One Hour Data Challenge" and here is the end result:

Leeds is covered

GeoCommons GeoJSON in OpenLayers

So, with GeoCommons you can export all the features from a dataset in GeoJSON format, which is very useful. Can it then be displayed in OpenLayers? Why yes it can!

I use the following strategy, wrapping the response with a little bit extra so that the GeoJSON format can read it properly.

var url = "http://geocommons.com/overlays/128725/features.json?limit=100";

OpenLayers.loadURL(url, {}, null, function (response) {
  var gformat = new OpenLayers.Format.GeoJSON();
  // GeoCommons returns a bare array of features, so wrap it in a
  // FeatureCollection object that the GeoJSON format can parse
  var gg = '{"type":"FeatureCollection", "features":' +
           response.responseText + '}';
  var feats = gformat.read(gg);

  vector_layer.addFeatures(feats);
});

Have a look at the live demo of this example, loading points from a GeoCommons dataset straight into an OpenLayers map.
(Note that I am using an OpenLayers.ProxyHost proxy to make it work in Firefox.)

It is of course very basic in terms of styling, but it’s a start!

Incidentally, the points are from "CARMA, India Power Plant Emissions, India, 2000/2007/Future" (carbon monitoring).

WhereCampEU 2011 recap & Berlin psychogeography

Last week in Berlin I was lucky enough to go to WhereCampEU – thanks to Gary and Chris for organising this wonderful unconference. The conference was held in a trendy, hipster-ish part of the city – which also had, I heard, the highest number of young families and births – in the former eastern part of the city. It gave the area a nice appeal, overall.

photo by Chris Fleming

I did a couple of sessions: one a preview of GeoCommons 2.0, talked about in a previous post, and the other a psychogeography session. For the psychogeography session I sent four teams out to explore the environs around the campus.

One team followed people around. They said "I'm amazed by how slowly some people moved" and "Well, often we followed someone and then they would wander into a book shop" – revealing the nature of the people and the type of area: bohemian-style cafes and shops, people ambling lazily.

Another group was sent to ask people to point to where the centre of Berlin was. I asked some people where they thought the centre was, and most of them scratched their chins and pointed to the Mitte area of the city, usually on the map, or waved southwards – partly a consequence of being a formerly split city, really. The western bit, someone said, "looks and feels more like a CBD" – that is, big shops, tall towers etc. I did venture to the former western CBD, and came across a mile-long car show. This area was where the money was.

The other group was sent to walk around the area according to the Game of Life algorithm, "left left right": walk and take the first left, then the next left, then the next right, and so on. It's impossible to predict where you will end up. I joined this group. We had a good explore over quite a small area, really, but encountered a lot of different environments: shared (private) gardens and courtyards in the middle of apartment blocks, churches, cafes, and shops.

The fourth team were given a secret mission, and so I cannot reveal to you what they did. However, they are all in good health, and saw the city in a new light.

Photo by Chris Fleming

Back to the unconference, and some of the highlights were:

* Playing the Skobbler game, treasure hunting for addresses in the neighbourhood.

* Seeing offmaps evolve over the year. I’ve not got an iPhone, but that app looked very nice.

* Spatial databases, and in particular CouchDB and its spatial bits

* CASA did a few talks – I'm getting more and more fond of their work – if anything they really seem to love the stuff they are doing, and they share the same vision as me of giving GI tools and benefits to as many people as possible.

* Peter Batty wore an iPad t-shirt – and gave a great presentation about essentially putting utilities information onto a Gmaps-like interface and mobile map.

* Gary Gale gave a compelling reason for standardizing place. And it makes sense.

* Meeting the NomadLabs guys for the first time, and being able to say “Thank You” for their work on Ruby on Rails GIS Hacks that I found very useful 4 years ago!

* Corridor talk, beer and food

Some stuff I've been working on with the new GeoCommons 2.0

Last weekend, if you were at WhereCampEU in Berlin (blog post to follow), you may have caught my sneak peek at the new GeoCommons 2.0, which was revealed just the other day. Here are some of the highlights of the new GeoCommons:

  • The flash map has been overhauled and re-written, mainly by Andrei – it can handle hundreds of thousands of points quite happily
  • The analytics library is complete, but not currently accessible to normal users of GeoCommons – hopefully it will be soon, if people want it
  • Behind the scenes, the system uses a number of distributed workers to offload processing-intensive or long-running tasks
  • Datasets and Maps get given nice overview images, and the attributes of datasets have histograms generated for them
  • Data can be edited in the system, and filtered, and saved either to replace itself or as a new dataset
  • Animation of temporal data is much nicer now
  • Polymaps for HTML5 non-flash map support
  • Filters can be applied to the map, so that attributes can be filtered out.
  • Thematic maps can be made with categories now
  • Acetate is used as standard
  • Custom markers can be added to a map, and even animated ones work too!
The GeoIQ developer blog has a developer-oriented review of what's new, and there is a good overview of GeoCommons on the main GeoIQ blog too.
Keep your eyes peeled on the GeoIQ Developer Blog over the next few days as the team adds some more posts about some of the technology behind it.

WherecampUK 2010 Recap

Last week I journeyed down on the train to Nottingham for WhereCampUK – an unconference for all things "geo" – only a few months after the similarly named WhereCampEU (which I never actually wrote up) down in London. Before I share some of the best bits, here are some of the similarities and differences.

* Fewer international folks

* Fewer big geo personalities and keynotes

* More OSM

* No T-Shirts

* More beer – we drank a large pub dry, literally. The next day, the landlord swore at me for pissing off their regulars.

* More cake

* Cheaper and quicker to run, set up and organise.

For more pointers on how to run an unconference, check out Steve Coast's latest post about what he did for WhereCamp in Denver: How I ran a successful unconference in 6 hours and you can too.

Overall, the event was great.

I ran two sessions. The main one was “What is Psychogeography“. The best part of this was sending all participants out with directions in twos and threes, for 10 mins before lunch. They had directions such as “left left right”, “follow someone”, “ask where the centre is, follow that direction, ask again”, “find hidden portals”, “find fairies”, “hear something, take a photo”.

I also quickly slotted in the NYPL Warper presentation, and included this slide. You get 20 points if you know what this refers to!

I also mentioned the word "neogeography" for the first time in the conference, and that was at 3:30pm – which says quite a bit about the use of the term.

Talks I liked were:

* Vernacular Geography & Informal Placenames

* Geo Games

* Education and mobile maps

* Augmented Reality roundup

* How streets get names

* Peoples Collection Wales

* Haptic Navigation

* OSM Talks including – Potlatch 2

* Gregory Mahler’s – I’m a Psycho Mapper!

* OSGEO



AGI GeoCommunity – NYPL & Mapwarper

Yesterday I presented at the AGI annual conference on the NYPL Map Rectifier. I was also able to launch MapWarper.net – showcasing the user-submitted version of the code base, which you can get on GitHub, should you like! I'll put the slides up on SlideShare asap.

I also showed the video of the newest Stamen Design work on the redesign of the interface. This interface will be applied to the application, and will be featured at the British Library Growing Knowledge exhibition in a couple of weeks.

The whole conference was okay – well organised, and well attended by fellow AGI Northern Group members. The soapbox – an evening event over beers, with short 20:20s and an emphasis on ranting and comedy – proved very popular and was very amusing. Keep an eye out for the videos when they come out!

The free W3G unconference was held the day before the main conference and, as Ed writes, had the more interesting talks and energy. W3G had a number of fixed keynotes and speakers, which I think was a good idea – it allowed people to attend knowing that certain people would be speaking. Alas, I don't believe it was promoted that well beforehand.

The main conference had a number of interesting talks. The guys from CASA really do seem to be enjoying themselves. Survey Mapper was highlighted – essentially an online survey application, but with, yes, geography! It would be great to be able to export the results from a survey and use them as a data layer in GeoCommons, to compare against any number of other datasets. They also showcased their exploratory work with Twitter. One thing with pretty much all Twitter map experiments is that people are so overwhelmed with the dataset that the analysis – the task of generating information from the data – tends to be overlooked. Most of the mapping done tends to show, for the very first time, that people tend to live and do things in urban areas.

ESRI were keynoting, but with a somewhat downbeat "why doesn't anyone understand and use GIS" cry and a call to promote the usefulness of GIS to decision makers. Nothing new here, but in these economic times it seemed to imply that GIS budgets were being hit hard.

Nigel Shadbolt's closing talk convinced me of the importance of linked data. However, the talk was more about the benefits and importance of open data as a whole, and he was speaking at a much higher level than most of the other talks: they were talking about how to share data, whilst open data at a national level operates at a much more fundamental level.

The Ordnance Survey have now escaped much of the fighting and shouting, what with OS Open Data, and today have announced better, clearer, more generous terms for derived data (see here and here for more info on that).