Jul 23 2013

To generate tiles for the map stack used by FieldTrip GB we are using 4 Mapserver instances deployed to an OpenStack private cloud. This means we can get all our tiles generated relatively quickly using inexpensive commodity hardware. A problem we have is that the resulting PNG tile images look beautiful but are far too big for users to download to their mobile devices in any quantity. So we looked at using Mapserver's built-in JPEG format, but our cartographers were not happy with the results. One of my colleagues came up with the bright idea of using ImageMagick to compress the PNG to JPEG instead, and the result (using a JPEG quality of 75) was much better. We can use the ImageMagick command line with the following script:


#!/bin/bash
for var in "$@"
do
    echo "converting $var to jpg"
    # note: tr translates character-for-character, so `tr '.png' '.jpg'` would
    # also mangle any p/n/g elsewhere in the path; parameter expansion is safer
    convert "$var" -quality 75 "${var%.png}.jpg"
    # rm "$var"
done

and pipe paths into this script using xargs to traverse an existing cache of generated PNG tiles.

find . -name '*.png' -print0 |  xargs -0 -P4 ../convert_png_to_jpeg_delete_png.sh

So the cartographers finally relented and we now have much smaller files to download to devices. The only problem is that the script to run the ImageMagick convert takes forever to run (well, all right: 2 days). It's not because ImageMagick is slow at compression; it's super fast. It's just that the I/O overhead involved is huge, as we are iterating over 16 million inodes. So our plan of scaling up commodity hardware (4-CPU virtual machines) is failing. A solution is to do the JPEG conversion at the same time as the tile caching. This way you are only dealing with one tile at the point you are writing to the cache, so there is much less overhead.

So it's time to hack some of the Mapcache code and get ImageMagick to add the above compression just after it writes the PNG to the cache.

This just involves editing a single source file found in the lib directory of the Mapcache source distribution (mapcache-master/lib/cache_disk.c). I'm assuming below that you have already downloaded and compiled Mapcache, and have also downloaded the ImageMagick packages, including the devel package.

First of all, include the ImageMagick header file:

#include <wand/magick_wand.h>

Then locate the method _mapcache_cache_disk_set. This is the method where Mapcache actually writes the image tile to disk.

First we add some variables and an exception macro at the top of the method:

MagickWand *m_wand = NULL;
MagickBooleanType status;

/* error-reporting macro, as in the standard MagickWand examples */
#define ThrowWandException(wand) \
{ \
  char *description; \
  ExceptionType severity; \
  description = MagickGetException(wand, &severity); \
  (void) fprintf(stderr, "%s %s %lu %s\n", GetMagickModule(), description); \
  description = (char *) MagickRelinquishMemory(description); \
}

Then, right at the end of the method, we add the MagickWand equivalent of the convert command line shown above:

if(ret != APR_SUCCESS) {
    ctx->set_error(ctx, 500, "failed to close file %s:%s", filename, apr_strerror(ret,errmsg,120));
    return; /* we could not create the file */
}

/* ******* ImageMagick code here ******* */

ctx->log(ctx, MAPCACHE_INFO, "filename for tile: %s", filename);
MagickWandGenesis();
m_wand = NewMagickWand();
/* read back the PNG tile we just wrote to the cache */
status = MagickReadImage(m_wand, filename);
if (status == MagickFalse)
    ThrowWandException(m_wand);
/* MagickSetImageFormat(m_wand, "JPG"); */
char newfilename[200];
strcpy(newfilename, filename);
int blen = strlen(newfilename);
if (blen > 3) {
    /* swap the .png extension for .jpg */
    newfilename[blen-3] = 'j';
    newfilename[blen-2] = 'p';
    newfilename[blen-1] = 'g';
    MagickSetImageCompression(m_wand, JPEGCompression);
    MagickSetCompressionQuality(m_wand, 75);
    ctx->log(ctx, MAPCACHE_INFO, "filename for new image: %s", newfilename);
    MagickWriteImage(m_wand, newfilename);
}
/* Clean up */
if (m_wand) m_wand = DestroyMagickWand(m_wand);

And that's it. Now it's just the simple matter of working out how to compile and link it.

After a lot of hmm'ing and ah-ha'ing (and reinstalling ImageMagick to a more recent version using excellent advice from here), it meant making the following changes to the Makefile.inc in the Mapcache src root dir.
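The kind of change needed is adding ImageMagick's compile and link flags to the build; a minimal sketch, assuming the MagickWand-config helper that ships with ImageMagick is on your PATH (the exact variable names in Makefile.inc may differ):

```makefile
# pull in ImageMagick's header and library flags
CFLAGS += $(shell MagickWand-config --cflags)
LIBS   += $(shell MagickWand-config --libs)
```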


Then run make as usual to compile Mapcache and you're done! The listing below shows the output and the difference in compression:

ls -l MyCache/00/000/000/000/000/000/
total 176
-rw-r--r--. 1 root root  4794 Jul 23 13:56 000.jpg
-rw-r--r--. 1 root root 21740 Jul 23 13:56 000.png
-rw-r--r--. 1 root root  2396 Jul 23 13:56 001.jpg
-rw-r--r--. 1 root root  9134 Jul 23 13:56 001.png
-rw-r--r--. 1 root root  8822 Jul 23 13:56 002.jpg
-rw-r--r--. 1 root root 46637 Jul 23 13:56 002.png
-rw-r--r--. 1 root root  8284 Jul 23 13:56 003.jpg
-rw-r--r--. 1 root root 45852 Jul 23 13:56 003.png
-rw-r--r--. 1 root root   755 Jul 23 13:55 004.jpg
-rw-r--r--. 1 root root  2652 Jul 23 13:55 004.png

original PNG tile

converted to JPEG at 75% compression

Annotating a picture in Fieldtrip GB

 annotate, FtGB, Guide, help, map, Skitch
Jul 05 2013

The Fieldtrip GB team have been out and about talking to users of the app and we have had some useful feedback on what users would like to see in the app. One of the comments that appeared more than once was that it would be good to be able to annotate photos.  We agree, but there are several other things that we have scheduled as higher priorities.

This got me thinking. There are apps out there that allow users to annotate photos and screen captures, so which one is best and how could FtGB users integrate it into their workflow? I was lucky enough to see a presentation by Derek France from the University of Chester and one of the apps that he demo’d was Skitch.  Skitch is an app made by the same people that do Evernote. Skitch allows users to sketch something new, mark-up maps, screen captures, or even a photo. So you would be able to:

  • make a new sketch
  • screen capture a map then add notes/annotations to it
  • annotate or add notes to a photograph.


So how would you integrate this into the FtGB workflow? Well, because Skitch is not "part" of FtGB you would have to launch it and run FtGB as a background program. Let's walk through it in steps as if we were in the field.

  1. We are at a site we want to survey.
  2. Start Skitch.
  3. Take a photo.
  4. Add sketch/notes.
  5. Save.
  6. Back to FtGB.
  7. Create point.
  8. Attach photo from Gallery.
  9. Navigate to the Skitch folder (in Android you should ensure that Skitch is in your My Gallery).
  10. Save.

Photo Annotated in Skitch

Pretty simple, as long as you have the Skitch folder checked so that it appears in your My Gallery.  

What if you wanted to annotate a map from Fieldtrip GB? This is possible; it just requires you to screen capture the map. There is usually a way to do this by pressing a couple of buttons at the same time, much like a special move on a Nintendo (editor's note: my phone won't do screen capture in FtGB).
  1. Take a screengrab with FtGB running (process varies between handsets)
  2. Open saved screengrab in Skitch
  3. Annotate the screen grab
  4. Save annotation
  5. Back to FtGB
  6. Attach annotated screen capture to a point

Screen Map Capture annotated in Skitch

Hopefully the steps described above will allow you to add annotated pictures, maps and sketches to Fieldtrip GB. There are undoubtedly other annotation apps out there that would integrate in a similar way; we just chose Skitch as it was free, easy to use and seemed to do exactly what we wanted.

 Posted by at 9:46 am

Mbtiles and Openlayers

 html5, mapbox, mbtiles, mobile, openlayers, phonegap
Jun 07 2013

I was testing the feasibility of adding an overlay to an OpenLayers map displayed on a mobile/tablet device.

The overlay is going to be in the mbtiles format made popular by MapBox.

The mbtiles db will be accessed locally on the device, which is useful when bandwidth is poor or for non-3G tablets.

The mbtiles format is described here: http://www.mapbox.com/developers/mbtiles/

It is basically a SQLite database that holds a collection of z/x/y-indexed tiles.
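The schema at the heart of the format is tiny; a sketch of the two core tables as the MapBox spec describes them (the unique index is what makes z/x/y lookups fast):

```sql
CREATE TABLE metadata (name TEXT, value TEXT);
CREATE TABLE tiles (zoom_level INTEGER, tile_column INTEGER,
                    tile_row INTEGER, tile_data BLOB);
CREATE UNIQUE INDEX tile_index ON tiles (zoom_level, tile_column, tile_row);
```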

WebKit-based browsers, including mobile versions, support Web SQL (openDatabase), although it is not actually part of the HTML5 spec.

The main issue with using mbtiles locally is actually getting the database into the right location.

Another is the speed at which the device can render the images, given the overhead of extracting blob images into base64-encoded images.

There are a couple of ways this can be done, however.

Getting Mbtiles on Device/Browser

With Phonegap

You can use the FileTransfer object in PhoneGap to copy the database locally from a server. It will be downloaded to the Documents folder on the iPhone by default.


Example code to download an mbtiles db:

var fail = function (error) {
    console.log("download error, code: " + error.code);
};

window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function(fileSystem) {
    fileSystem.root.getFile('testDB2.db', {create: true, exclusive: false}, function(fileEntry) {
        var localPath = fileEntry.fullPath;
        // strip the file:// scheme, which FileTransfer on Android does not expect
        if (device.platform === "Android" && localPath.indexOf("file://") === 0) {
            localPath = localPath.substring(7);
        }
        console.log("LOCAL PATH " + localPath);
        var ft = new FileTransfer();
        // dbUrl is the http(s) URL of the mbtiles db on your server
        ft.download(dbUrl, localPath, function(entry) {
            console.log("successful download");
        }, fail);
    }, fail);
}, fail);

Use the PhoneGap Web SQL plugin https://github.com/pgsqlite/PG-SQLitePlugin-iOS.git and open the database through it.


The benefit of using a PhoneGap sqlite plugin is that it allows flexibility over where you download the mbtiles db to, and removes the device-dependent limits on database size.

Also, if a browser drops native Web SQL support, then it doesn't matter.


Rather than download a remote database, you could copy over a local database at startup.

There is a simple way to add a prepopulated SQLite DB in PhoneGap, described in this blog.


If you want to keep it an entirely non-native, web-app-based solution, or to target a desktop browser (WebKit-based: Chrome, Safari), you might be able to use a tool to preload the database.


There are more suggestions on Stack Overflow here, but I have not tried them.


Syncing, by creating an empty local mbtiles database and then populating it with inserts of data from the server, is going to adversely affect performance. I have not tried this, so I don't know how well it would work.

OpenLayers integration

The first thing to do is to subclass the OpenLayers TMS class.

/**
 * Map with local storage caching.
 * @params options:
 *     serviceVersion - TMS service version
 *     layerName      - TMS layer name
 *     type           - layer type
 *     isBaseLayer    - is this the base layer?
 *     name           - map name
 *     url            - TMS URL
 *     opacity        - overlay transparency
 */
var MapWithLocalStorage = OpenLayers.Class(OpenLayers.Layer.TMS, {
    initialize: function(options) {
        this.serviceVersion = options.serviceVersion;
        this.layername = options.layerName;
        this.type = options.type;

        this.async = true;

        this.isBaseLayer = options.isBaseLayer;
        this.opacity = options.opacity;

        OpenLayers.Layer.TMS.prototype.initialize.apply(this, [options.name,
            options.url, options]);
    },
    getURLasync: function(bounds, callback, scope) {
        var urlData = this.getUrlWithXYZ(bounds);
        webdb.getCachedTilePath(callback, scope, urlData.x, urlData.y, urlData.z, urlData.url);
    },
    getUrlWithXYZ: function(bounds) {
        bounds = this.adjustBounds(bounds);
        var res = this.map.getResolution();
        var x = Math.round((bounds.left - this.tileOrigin.lon) / (res * this.tileSize.w));
        var y = Math.round((bounds.bottom - this.tileOrigin.lat) / (res * this.tileSize.h));
        var z = this.serverResolutions != null ?
            OpenLayers.Util.indexOf(this.serverResolutions, res) :
            this.map.getZoom() + this.zoomOffset;

        // invert y for openstreetmap rather than google style TMS
        var ymax = 1 << z;
        y = ymax - y - 1;
        var path = this.serviceVersion + "/" + this.layername + "/" + z + "/" + x + "/" + y + "." + this.type;

        var url = this.url;
        if (OpenLayers.Util.isArray(url)) {
            url = this.selectUrl(path, url);
        }
        return { url: url + path, x: x, y: y, z: z };
    },
    getURL: function(bounds) {
        return OpenLayers.Layer.XYZ.prototype.getURL.apply(this, [bounds]);
    }
});

We set

this.async = true;

as the layer will have to receive images from the local sqlite database asynchronously, since Web SQL has an asynchronous, callback-style API.

The lines

       var ymax = 1 << z;

       y = ymax - y - 1;

simply invert the y tile index to handle the OpenStreetMap scheme; this is not required for Google-style TMS.
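To make the flip concrete, here it is as a standalone helper (the function name is mine, not from the original code): at zoom z there are 2^z tile rows, and the two numbering schemes count rows from opposite edges of the map.

```javascript
// Convert a tile row index between the top-origin and bottom-origin
// tile numbering schemes. The flip is its own inverse.
function flipY(y, z) {
  var ymax = 1 << z;   // number of tile rows at zoom level z
  return ymax - y - 1;
}

console.log(flipY(0, 2)); // 3 (zoom 2 has 4 rows, so row 0 becomes row 3)
```

Because applying it twice returns the original index, the same helper works in whichever direction the conversion is needed.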

There is a good site that describes the various types of TMS around.


The Database Setup

"use strict";
var webdb = {};

function getWebDatabase() {
    // openDatabase is only present in browsers with Web SQL support
    if (typeof(openDatabase) === 'undefined') {
        webdb = undefined;
    }
    return webdb;
}

webdb.open = function() {
    var dbSize = 50 * 1024 * 1024; // 50MB
    webdb.db = openDatabase("testDB2", "1.0", "Cached Tiles", dbSize);
};

webdb.onError = function(tx, e) {
    console.warn("There has been an error: " + e.message);
};

webdb.onSuccess = function(tx, r) {
    console.log("Successful Database tx");
};

webdb.createTablesIfRequired = function() {
    console.log("Creating DataBase Tables");
    var db = webdb.db;
    db.transaction(function(tx) {
        tx.executeSql("CREATE TABLE IF NOT EXISTS " +
                      "tiles(zoom_level INTEGER, tile_column INTEGER, tile_row INTEGER, tile_data TEXT, mapName TEXT)",
                      [], webdb.onSuccess, webdb.onError);
        tx.executeSql("CREATE UNIQUE INDEX IF NOT EXISTS " +
                      " tile_index on tiles(zoom_level, tile_column, tile_row, mapName)",
                      [], webdb.onSuccess, webdb.onError);
    });
};

function hexToBase64(str) {
    // prefix each hex byte pair with "0x": "4849" -> "0x48 0x49 "
    var hexString = str.replace(/([\da-fA-F]{2}) ?/g, "0x$1 ");
    var hexArray = hexString.trim().split(" ");
    var len = hexArray.length;
    var binary = '';
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(hexArray[i]);
    }
    // getting a stack error on large images with:
    // var binary = String.fromCharCode.apply(null, hexArray);
    return window.btoa(binary);
}

webdb.getCachedTilePath = function(callback, scope, x, y, z, url) {
    var db = webdb.db;
    var resultsCallback = function(tx, rs) {
        console.log('rs.rows.length ' + rs.rows.length);

        if (callback) {
            if (rs.rows.length > 0) {
                var rowOutput = rs.rows.item(0);
                var tile_data = rowOutput['tile_data'];
                // strip off the X' prefix and trailing quote that quote() adds
                tile_data = tile_data.substring(2, tile_data.length - 1);
                // hand the tile back as a base64 data URI built from the hex blob
                callback.call(scope, "data:image/png;base64," + hexToBase64(tile_data));
            } else {
                // not in the cache: fall back to the remote tile URL
                callback.call(scope, url);
            }
        }
    };
    db.transaction(function(tx) {
        tx.executeSql("SELECT quote(tile_data) as tile_data FROM tiles WHERE zoom_level=? AND tile_column=? AND tile_row=?",
                      [z, x, y], resultsCallback, webdb.onError);
    });
};

When you have larger blobs in the database you can't use the overloaded array version of String.fromCharCode, as I was getting stack memory issues on the device (iPhone).

So you have to loop through and build it manually.
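If you want to keep most of the speed of the apply version, a common middle ground is to apply it in fixed-size chunks; a sketch (the function name and chunk size are my own, not from the original code):

```javascript
// Convert an array of byte values to a binary string in chunks, avoiding
// the stack error that fromCharCode.apply hits on large arrays (every
// element becomes an argument pushed onto the call stack).
function bytesToBinaryString(bytes, chunkSize) {
  chunkSize = chunkSize || 8192;  // 8K arguments per call is safe on most engines
  var binary = '';
  for (var i = 0; i < bytes.length; i += chunkSize) {
    binary += String.fromCharCode.apply(null, bytes.slice(i, i + chunkSize));
  }
  return binary;
}

console.log(bytesToBinaryString([72, 73])); // "HI"
```

Each apply call only ever sees chunkSize arguments, so large tiles no longer blow the stack limit.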

You have to use the quote function on the tile_data blob to turn it into a hex string:

SELECT quote(tile_data) as tile_data

Then trim the X' prefix (and the trailing quote) off the hex string before base64-encoding it.
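Putting those two steps together, a small standalone sketch (the function name is mine; Node's Buffer stands in for the browser's window.btoa when running outside a browser):

```javascript
// quote() on a blob yields a string like "X'FFD8FFE0...'"; strip the
// X' prefix and the trailing quote, then base64-encode the raw bytes.
function quotedBlobToBase64(quoted) {
  var hex = quoted.substring(2, quoted.length - 1);
  return Buffer.from(hex, 'hex').toString('base64');
}

console.log(quotedBlobToBase64("X'4849'")); // "SEk=" (the bytes "HI")
```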

For testing: if you just want to try the JavaScript/HTML5 side with mbtiles, you can copy your mbtiles database to the correct folder:

/Users/murrayking/Library/Application Support/iPhone Simulator/6.1/Applications/667F70EF-D002-425D-86C9-5027C965C518/Library/WebKit/LocalStorage/file__0/0000000000000001.db on a mac

or, for Chrome on a Mac:

/Users/murrayking/Library/Application Support/Google/Chrome/Default/databases/http_localhost_8080/13


This approach is a bit convoluted, especially the conversion of the blob to base64, and performance is a bit poor on older devices. But on newer devices it is acceptable, and as devices become more powerful it will become less of an issue, as with all HTML5/JavaScript things.

I have not tried it yet on Android, but it should work; it worked in the Chrome browser on the Linux box.

It does allow you to use the rich OpenLayers framework cross-platform without having to invest in native versions.

Also, you can debug and test using a desktop browser, which is fast, before doing proper testing on the actual device.

Example screenshot working on an iPhone 3G using PhoneGap and mbtiles.

Development version based on our Fieldtrip GB app http://fieldtripgb.blogs.edina.ac.uk/ available on Android and iPhone.

Overlay is historic map in mbtiles format from the National Library of Scotland.


Debugging on Chrome non-native

working on chrome

Collecting data as a group using Fieldtrip GB

 Uncategorized
Apr 22 2013

Mobilise the Crowd

One use of Fieldtrip GB is to get groups of students collecting data over a wide area and then collate this into a master dataset. It is kind of like crowd-sourcing, but with an "informed crowd" collecting specific information. But how would you best go about doing this using the current app?

At the moment the best thing to do is to create a master form and get the group to collect data against that. The process you would need to follow is set out below:

  1. Set up a “group” Dropbox account (just a regular account, but one that you are happy to share the username/password of with the group)
  2. Log into the Authoring Tool with the group Dropbox account, create your data collection form and save it. It will be saved to the group Dropbox account.
  3. Get the group to log in to the group Dropbox account from their phones and sync to retrieve the form
  4. Get the group to collect data against the form
  5. When they have finished collecting their data, get the group to log in to the group Dropbox account from their phones and perform another sync. This will upload the data they have just collected to the group Dropbox account
  6. The leader/tutor can then log in to group Dropbox account through the Authoring Tool.
  7. Use the filter menu to display all the data collected against the form that the group used.
  8. Export the data – this will create 1 file that contains all the data collected by the group (Note – we are working to improve the KML export and hope to add CSV as an output option)

So, that process is not too onerous. Fieldtrip GB is a great way to get multiple people collecting consistent information and helps you to mobilise a crowd. Why not give it a go with your students or organisation and let us know how you get on.


 Posted by at 2:26 pm

Fieldtrip GB hits the iTunes App Store

 iPhone, New Release
Apr 09 2013

Good news, fruit-based phone fans: Fieldtrip GB is now available for iPhone and iPad. You can download the app for free from the iTunes App Store. It contains the same functionality as the Android version and we hope you like it. We are currently working on a few tweaks and would love to hear what you think about the app. To provide feedback, just drop us an email at edina@ed.ac.uk; put "Fieldtrip GB" in the subject field and it will get to the development team.

 Posted by at 7:49 pm
Mar 25 2013

First of all, apologies for this blog going quiet for so long. Due to resource issues it's been hard to keep up with documenting our activities. All the same, we have been quietly busy continuing work on geo mobile activity, and I'm pleased to announce that we have now released our Fieldtrip GB app in the Google Play Store.


We expect the iOS version to go through the Apple App Store  in a few weeks.

Over the next few weeks I'll be posting to the blog with details of how we implemented this app and why we chose certain technologies and solutions.

Hopefully this will prove a useful resource to the community out there trying to do similar things.

A brief summary: the app uses PhoneGap and OpenLayers, so it is largely built on HTML5 web technologies but wrapped up in a native framework. The unique mapping uses OS Open Data, including Strategi, Vector Map District and Land-Form PANORAMA, mashed together with path and cycleway data from OpenStreetMap and Natural England.


Fieldtrip GB is live in Play Store

 Android, Fieldtrip GB, New Release, Release
Mar 15 2013

Fieldtrip GB is now available to download from the Google Play Store.  This is hugely exciting.  Fear not iPhone users, we haven’t forgotten about you.  We are just about ready to submit the app to the iTunes App Store and will be waiting for it to be approved. This could take up to a couple of weeks. We will let you know when it is approved and ready to download.

Fieldtrip GB is now available at Google Play

 Posted by at 2:38 pm

Fourth International Augmented Reality Standards Meeting

 AR standards, arml, augmented reality, augmented reality browsers, html5, karml, kml, Layar, mobile tech
Oct 28 2011

I'm just back from the Fourth International AR Standards Meeting that took place in Basel, Switzerland, and trying hard to collect my thoughts after two days of intense and stimulating discussion. Apart from anything else, it was a great opportunity to finally meet some people I've known from email and discussion boards on "the left hand side of the reality-virtuality continuum".

Christine Perry, the driving spirit, inspiration and editor at large of the AR Standards Group, has done a fantastic job bringing so many stakeholders together: standards organisations such as the OGC, Khronos, Web3D Consortium, W3C, OMA and WHATWG; browser and SDK vendors such as Wikitude, Layar, Opera, ARGON and Qualcomm AR; hardware manufacturers (Canon, Sony Ericsson, NVIDIA); as well as several solution providers such as MOB Labs and mCrumbs – oh, and a light sprinkling of academics (Georgia Tech, Fraunhofer IGD).

I knew I'd be impressed and slightly awestruck by these highly accomplished people, but what did surprise me was the lack of any serious turf fighting. Instead, there was a real sense of pioneering spirit in the room. Of course everyone had their own story to tell (which just happened to be a story that fitted nicely into their organizational interests), but it really was more about people trying to make some sense of a confusing landscape of technologies and thinking in good faith about what we can do to make it easier. In particular, it seemed clear that the standards organizations felt they could separate the problem space fairly cleanly between their specialist areas of interest (geospatial, 3D, hardware/firmware, AR content, web, etc.). The only area where these groups had significant overlap was on sensor APIs, and some actions were taken to link in with the various working groups working on sensors to reduce redundancies.

It seemed to me that there was some agreement about how things will look for AR content providers and developers (eventually). Most people appeared to favour the idea of a declarative content mark-up language working in combination with a scripting language (JavaScript), similar to the geolocation API model. Some were keen on the idea of this all being embedded into a standard web browser's Document Object Model. Indeed, Rob Manson from MobLabs has already achieved a prototype AR experience using various existing (pseudo) standards for web sensor and processing APIs. The two existing markup content proposals, ARML and KARML, are both based on the OGC's KML, but even here the idea would be to eventually integrate a KML content and styling model into a generic HTML model, perhaps following the HTML/CSS paradigm.

This shared ambition to converge AR standards with generic web browser standards is a recognition that the convergence of hardware, sensors, 3D, computer vision and geolocation is a bigger phenomenon than AR browsers or augmented reality. AR is just the first manifestation of this convergence and of "anywhere, anytime" access to the virtual world, as discussed by Rob Manson on his blog.

To a certain extent, the work we have been discussing here on the geo mobile blog, using HTML5 to create web-based mapping applications, is a precursor to a much broader sensor-enabled web that uses devices such as the camera, GPS and compass not just to enable 2D mapping content, but all kinds of applications that can exploit the sudden happenstance of millions of people carrying around dozens of sensors, cameras and powerful compute/graphics processors in their pockets.

Coming back from this meeting, I’m feeling pretty upbeat about the prospects for AR and emerging sensor augmented web. Let’s hope we are able to keep the momentum going for the next meeting in Austin.

Jul 07 2011

Earlier this week I attended the Open Source Junction: Context-Aware Mobile Technologies event organized by OSS Watch. Due to a prior engagement I missed the second day and had to leave early to catch a train. It was a pity, as the programme was excellent and there were some terrific networking opportunities, although it sounds like I was fortunate to miss the geocaching activity, which the Twitter feed suggested was very wet and involved an encounter with some bovine aggression.

During the first two sessions I did attend, there were quite a few people, including myself, talking about the mobile web approach to app development. I made the comment that the whole mobile web vs. native debate was fascinating and current, and that mobile web was losing. But everyone seemed to agree that apps are a pretty bad deal for developers and that making any money from them is about as likely as winning the lottery. This got me thinking, on the train to Edinburgh, about the "app ecosystem" and what that actually means. A very brief Google search did not enlighten me much, so I sketched my own app food chain, shown below.

It is no surprise that the user is right at the bottom, as all the energy that flows through this ecosystem comes from the guy with the electronic wallet.

But I think it's going to be a bit of a surprise for app developers (content providers) to see themselves at the top of this food chain (along with Apple and Google), as it doesn't feel like you are king of the jungle when the app retail cut is so high and the prices paid by users are so low.

It will be interesting to see if Google, who are not happy with the number of paid apps in the Google Marketplace, cut developers a better deal, or if the Microsoft apps built on top of Nokia try to gain market penetration by attracting more high-quality content. My guess is: not yet. The problem for developers is that the app retailers can grow at the moment just through the sheer number of new people buying smartphones. This is keeping prices artificially low and means app retailers are not competing all that much for content. But smartphone ownership is in fact growing so fast that pretty soon (approx. 2 years?) everyone who wants or can afford a smartphone is going to have one. How do app retailers grow then? They are going to have to get users to part with more money for apps and content, either by charging more or by attracting advertising revenue. Even though there are a lot of app developers out there, apps users will pay for are scarce, and retailers are going to have to either pay more to attract the best developers and content to their platform, or make life easier for content providers by adopting open standards. So maybe the mobile web might emerge triumphant after all.