Attended “Digital Geography in a Web 2.0 World” in Manchester on Monday, hosted by the National Centre for e-Social Science (NCeSS) with support from the ESRC. Primarily it showcased relevant projects from the Centre’s “Nodes” at Leeds, Nottingham, Bristol and Leicester, although most of the talks were from University College London (UCL). NCeSS expressed an interest in working with OpenStreetMap and other data sources, and seems to be a good initiative – somewhat designed to get geographers to adopt some new “emerging” practices.
Talks included “The Names project: profiling for the public and by the public”, “GMapsCreator and MapTube”, “SimCity for Real”, “3D Visualisations”, “Intelligent Agents” and “Web 2.0, Neogeography and the Virtual World”. Interesting stuff overall, even if it showed the time lag between academic research and the current state of the art.
Now for some observations:
Andrew Hudson-Smith, in a talk entitled “Web 2.0, Neogeography and the Virtual World”, somewhat missed the point about OpenStreetMap – talking about his iPhone, and how cool it was to have GPS on it, he described OSM as something that “tracks you around automatically” and where “GPS traces are automatically uploaded, where it makes maps”. He did showcase some of UCL’s work in Second Life, particularly visualisations of spatial modelling concepts such as Schelling’s segregation model and Conway’s Game of Life, which were interesting.
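For the curious, Schelling’s segregation model is simple enough to sketch in a few lines of Python. This is a generic toy re-implementation of my own (nothing to do with UCL’s Second Life version): agents of two types relocate to random empty cells while the share of like neighbours around them falls below a tolerance threshold.

```python
import random

SIZE, P_EMPTY, TOLERANCE, STEPS = 30, 0.2, 0.4, 30

def make_grid():
    # None = empty cell; 1 / 2 = the two agent types
    return [[None if random.random() < P_EMPTY else random.randint(1, 2)
             for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(grid, r, c):
    """An agent is unhappy if under TOLERANCE of its neighbours match it."""
    me, same, other = grid[r][c], 0, 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nb = grid[(r + dr) % SIZE][(c + dc) % SIZE]  # wrap-around edges
            if nb is not None:
                same += nb == me
                other += nb != me
    total = same + other
    return total > 0 and same / total < TOLERANCE

grid = make_grid()
for _ in range(STEPS):
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] is not None and unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if grid[r][c] is None]
    random.shuffle(empties)
    for (r, c), (er, ec) in zip(movers, empties):
        grid[er][ec], grid[r][c] = grid[r][c], None
    print(len(movers), "unhappy agents moved")
```

Even with quite tolerant agents, clusters of like types emerge within a few steps – which is the whole point of the model.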
Richard Milton talked about GMapsCreator, a nice bit of kit that “takes a shapefile containing geographic areas linked with attributes and automatically generates a working Google Maps website from the data” – written with GeoTools, too. He also introduced http://www.maptube.org/. MapTube is “a free resource for viewing, sharing, mixing and mashing maps online”. It has a nice interface: you can layer and reorder KML and Google tile layers on top of each other – although in practice it seems to exist as a showcase for UCL’s own maps, for example the one they made for Radio 4’s “Mapping the Credit Crunch”. You can only link to a map – no embedding, no GeoRSS, no sharing. It would be nice to see some kind of discovery of data sources, and the catalogue published so search engines can find it. The issue of copyright of some layers seems to have been skirted somewhat.
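As a rough illustration of the kind of pipeline GMapsCreator automates, here is a minimal sketch using geopandas and folium – so Leaflet and OpenStreetMap tiles rather than GMapsCreator’s actual GeoTools/Google Maps stack. The file name `wards.shp` and the `population` attribute are made up for the example.

```python
# Sketch only: shapefile of areas + attributes -> working web map.
# Requires geopandas, folium, mapclassify and matplotlib installed.
import geopandas as gpd

gdf = gpd.read_file("wards.shp")          # hypothetical polygons + attribute table
m = gdf.explore(column="population",      # shade each area by its attribute value
                legend=True,
                tiles="OpenStreetMap")
m.save("map.html")                        # self-contained slippy-map HTML page
```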
Martin Clarke, “SimCity for Real”: urban and regional modelling. There was a bit of discussion at the end about the intention to increase the user base of these models. The response was “oh, the models are so complex, we can’t have non-experts using them”. This about non-expert academics and policy makers, let alone members of the public! “If people made dodgy analyses about where to place a new shop, it would be disastrous!” The same arguments were made about opening up spatial analysis, until Google changed things, and about opening up public data (“people would misinterpret it!”). A shame. If Web 2.0 is about democratising data, Web 3.0 should be about democratising the analysis of data.
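For context, the “where to place a new shop” question rests on spatial interaction (“gravity”) models of the kind Clarke’s group builds: the probability that consumers in zone i shop at store j rises with the store’s attractiveness and falls with distance. A toy Huff-style version, with entirely made-up numbers, fits in a dozen lines – hardly the full complexity of their models, but it shows the shape of the analysis.

```python
import numpy as np

attractiveness = np.array([5.0, 2.0, 8.0])   # e.g. floorspace of 3 candidate stores
distance = np.array([[1.0, 4.0, 6.0],        # distance[i, j]: zone i -> store j
                     [3.0, 1.0, 5.0],
                     [7.0, 6.0, 2.0]])
demand = np.array([100.0, 150.0, 80.0])      # spending power per zone
beta = 2.0                                   # distance-decay exponent

utility = attractiveness * distance ** -beta
prob = utility / utility.sum(axis=1, keepdims=True)  # Huff choice probabilities
flows = demand[:, None] * prob                       # money flowing zone -> store
print("revenue per store:", flows.sum(axis=0).round(1))
```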
When asked whether these models are used for public policy, the answer was no – councils and the like, whilst buying models and bespoke modelling software, do not use them to make decisions. He then explained that their private customers (they run a consultancy alongside the academic work) do use them, as it can be proven that money is saved. So there is a monetary incentive not to release this analysis as a service. It is interesting that this publicly funded research body is indirectly helping private consultancies. Perhaps in future, grants should be awarded on condition that the research and tools be open to citizens?
The most interesting presentation was from Nottingham University.
The Locata game tests spatial awareness – you see a map or relief model and, given an image of a viewshed, you have to work out where that “photo” could have been taken. It requires Shockwave to play, so I didn’t test it. Another application they developed, which I also didn’t try but is worth a look, is “geoCode” – “Simulating QR Code Location-based Services”.
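For anyone unfamiliar with the term, a viewshed is the set of locations visible from a given point. A minimal sketch of the computation on a gridded elevation model – my own illustration, nothing to do with Locata’s implementation – samples the line of sight between the observer and every other cell:

```python
import numpy as np

def line_of_sight(dem, observer, target, eye_height=1.7):
    """True if `target` cell is visible from `observer` cell on grid `dem`."""
    (r0, c0), (r1, c1) = observer, target
    eye = dem[r0, c0] + eye_height
    n = max(abs(r1 - r0), abs(c1 - c0))
    if n == 0:
        return True
    for i in range(1, n):
        t = i / n
        r, c = round(r0 + t * (r1 - r0)), round(c0 + t * (c1 - c0))
        # Elevation of the straight sight line at fraction t of the way:
        sight = eye + t * (dem[r1, c1] - eye)
        if dem[r, c] > sight:        # terrain pokes above the sight line
            return False
    return True

def viewshed(dem, observer):
    """Boolean grid of cells visible from `observer`."""
    vis = np.zeros(dem.shape, dtype=bool)
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            vis[r, c] = line_of_sight(dem, observer, (r, c))
    return vis

# Toy 50x50 terrain: a single Gaussian hill in the middle
y, x = np.mgrid[0:50, 0:50]
dem = 100 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 200.0)
print(viewshed(dem, (5, 5)).sum(), "cells visible")
```

Locata runs this in reverse, of course: given the visible cells, find the observer.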
Second Life got mentions in several of the presentations – we are now seeing the first results of the wave of research projects that started using the software 1-2 years ago, when the hype was at its peak. Furries were mentioned as being a hindrance to publishing and talking about the research! But there were interesting things, especially the potential to use it as a virtual learning environment – and for demonstrating 3D agent-based models.
Hi,
Thanks for the blog Tim.
I work with Martin and have been developing Modelling and Simulation for e-Social Science (MoSeS) resources for NCeSS at the University of Leeds. I have a slightly different take from Martin. Like you, I want to make the tools and data publicly available. I would like people and organisations to be able to explore possible alternatives and get involved in the modelling and analysis of data. Access to data and models is a big issue, and I’m working to make all our results and tools as openly available as possible.
My major concern currently is that the level of sophistication of the models and the errors in the data only allow us to explore alternative potential scenarios. We are some way off being able to use the tools to evaluate different policy scenarios; evidence-based policy is perhaps some way off. I am concerned about over-hyping and raising expectations too high.
I’m also not arguing for a job, although there clearly is one. I’ll be working with Andy and others at CASA and NCeSS as we take things a stage further in our second-phase research node, Generative e-Social Science (GENESIS), which starts tomorrow.
Best wishes,
Andy
Pingback: New Map, New Danger « thinkwhere