Mar 15 2016
 

The mobile app team at EDINA recently developed an app for the University of Edinburgh Main Library “Open Doors” event. This was our first attempt to use Apple’s iBeacon technology in anger, in a real environment. We had done some evaluation of iBeacons previously, so had some idea of what to expect, and what not to expect, from the technology. Nevertheless, the environment we deployed beacons to, a large open lobby area, was very challenging. We had to create a bespoke detection heuristic to create a reasonable user experience. In this post, I’ll demonstrate the problem, explain how our algorithm works, and discuss its performance and potential for improvement or alternatives.

The user experience we were after should, in theory, have been a fairly simple one (you might think).

  • We divide the floorplan into non-contiguous zones, ensuring a fair amount of distance (> 5m) between zones.
  • As a user enters a zone, we pan to the area on the floorplan viewer and some content (in this case a video) is highlighted.

 


Screenshot from the Library Tour app showing zones in the open lobby space

Therefore all we needed to know was which beacon was closest. The exact distance was not that important, so we could ignore inaccuracies in actual distance, so long as we could determine which of the beacons in range was the closest. In practice, however, several problems got in the way:

  1. Where more than one beacon is deployed in the same open floor space, it is very difficult to find positions and range settings where the beacon readings do not collide with one another.
  2. Setting a beacon range to a small value (<1 meter) requires the user to position themselves very close to the beacon and it is likely the user will walk through the zone without anything happening.
  3. If we set the range to > 1 meter, responsiveness is better, but the signal strength readings become increasingly unreliable.
  4. For Android devices, the measured distance varies greatly across different devices, making it hard to set a range value that will create a good user experience for all users, as documented in this excellent ThoughtWorks blog.

 

A brief look at some tracking data should help to visualise the problem.

 

tracking data for Nexus 5 device showing measured distance of 3 beacons against time, highlighting where ENTER events were activated.


The figure above shows the output from a test we ran in our open plan office space (not the library – we didn’t have time to capture the data when we were deploying the app in the Library). This data, collected on a Nexus 5, is close to what we were hoping for. It shows a user following a route from zone 1 (blue line representing beacon 64404), then entering zone 2 (orange line segment), then entering zone 3 (green), before turning around and returning to zone 2 and finally back to zone 1. We were using Estimote beacons, but rather than using the proprietary Estimote SDK, we used the AltBeacon library instead. Listening to the beacons in ranging mode, we receive a batch of beacon readings every second or so, where each beacon in range reports its signal strength, from which an estimate of the distance is derived. The data above is a pretty good scenario for our use case, as for the most part only one beacon is detected in range at any one time. There is a period of 7 seconds, between 14:13:04 and 14:13:11, where the ranging data batch includes readings for both beacon 30295 and 64404.

 

zoomed in detail of tracking data for Nexus 5 device showing measured distance of 3 beacons against time, highlighting where ENTER events were activated.


As we might expect, the readings for the orange beacon gradually increase as we walk away from zone 2 and approach zone 1, while the values for the blue beacon decrease as we approach zone 1. Even though both beacons are in range during this period, we don’t want both beacons to trigger events at the same time. We want the algorithm to decide which zone the user is currently located in, even if more than one beacon is in range. Two simple solutions present themselves:

  1. Choose the nearest beacon in the batch. In the case above, with the Nexus 5 readings, this would work perfectly. The zone 1 (blue) ENTER event would actually have occurred a second before the one recorded above, so this simple heuristic would be more responsive than our implementation in this case. You’ll see shortly why we can’t rely on it all the time, though.
  2. Require a minimum distance before a beacon can trigger an application event. This would not work over the full period of the Nexus 5 track above (figure 1). If we choose a threshold of < 1 meter, the first time the user enters zone 2 (the first orange line segment) would not trigger a zone ENTER event as it should. If we raise the threshold to 1.5 meters, then the entry into zone 2 is detected, but during the 7-second period shown in figure 2, the zone 1 readings shown in blue would also activate zone ENTER events, colliding with simultaneous ENTER events for zone 2.
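The first heuristic is simple enough to sketch in code. Here is a minimal version, assuming a hypothetical Reading record standing in for the AltBeacon library’s Beacon class (just an id and an estimated distance – the real readings arrive via the ranging callback):

```java
import java.util.*;

public class NearestInBatch {
    // Hypothetical stand-in for the AltBeacon Beacon class: an identifier
    // plus the distance estimated from the signal strength.
    record Reading(String beaconId, double distanceMetres) {}

    // Return the id of the closest beacon in the batch, or empty if no
    // beacons were detected in this scan.
    static Optional<String> nearest(Collection<Reading> batch) {
        return batch.stream()
                .min(Comparator.comparingDouble(Reading::distanceMetres))
                .map(Reading::beaconId);
    }

    public static void main(String[] args) {
        // A batch like the 7-second overlap period described above.
        List<Reading> batch = List.of(
                new Reading("64404", 0.9),   // zone 1 (blue)
                new Reading("30295", 1.6));  // zone 2 (orange)
        System.out.println(nearest(batch).orElse("none")); // prints 64404
    }
}
```

The empty-batch case matters: as the Moto4G trace below shows, whole scans can come back with no readings at all, so the caller has to tolerate an empty result.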

 

So at first sight, the simple solution of choosing the closest reading in the batch looks good. But let’s take a look at tracking data for a Moto4G device instead.

tracking data for Moto4G device showing measured distance of 3 beacons against time, highlighting where ENTER events were activated.


The first thing to notice is that the distance range is larger than in the previous example, ranging from 1.3m to 3.5m for the Motorola device, compared to 0.37m to 1.95m for the Nexus 5. This difference between platforms is another reason why setting a minimum threshold for activation is tricky to get right. As you would expect, we found consistency across iOS devices is much better. The next thing to notice is how patchy the data can be in places. For some reason, this device recorded hardly any beacon readings at all during a 30-second period between 9:34:37 and 9:35:05, a period that includes the sole reading for zone 3 (green). We are not quite sure why this happens (some feature of the underlying Bluetooth implementation for these devices perhaps, or maybe a quirk in the AltBeacon library?). What is clear is that patchy scanning data can cause the “choose the nearest beacon” solution to come undone. Take a look at the highlighted data point below.

tracking data from Moto4G device highlighted data point

For the highlighted batch, the orange beacon was the only beacon detected in range, so the “closest in batch” heuristic would trigger an ENTER zone event at this point. But the subsequent 3 batches of readings (spanning 3 seconds) have only blue beacons recorded, so “closest in batch” would immediately trigger a blue ENTER zone event. This is typical behaviour on zone boundaries, where readings are patchy and flip between 2 or more beacons in range. It takes 7 seconds before we see both beacons in the same batch of readings and can pick the closest (orange) without a subsequent flip back to blue. This data point is highlighted in the chart below. Note that our algorithm, which I’ll explain below, did not trigger the ENTER zone event at this point, but instead had to wait 4 seconds for the next batch of readings. So in this case, the algorithm pays a high price to avoid flipping between zones.

tracking data from Moto4G device, highlighting the data point where both beacons appear in the same batch of readings

It might be possible to mitigate the effect of patchy data at zone boundaries by examining the values of beacons over the two or three previous batches of ranging scans, instead of relying on just one batch of readings for comparison. There is a danger, though, that averaging (even when weighting the most recent batch) could slow down the responsiveness of the application. In the case where a single data point is critical, such as the (green) zone 3 ENTER event in the chart above, it’s not clear whether we should average the green data point against zero values for previous batches, or just take the current value as the weighted average for the beacon. It looks like the latter technique would have worked quite well in the case above, but I have not had time to explore this alternative solution properly.
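For completeness, here is a minimal sketch of that averaging alternative (not the algorithm we actually shipped). It keeps an exponentially weighted distance per beacon; a beacon missing from the current batch keeps its previous average, and a beacon’s first reading is taken at face value rather than averaged against zeros – the “latter technique” mentioned above. The class name and the alpha parameter are illustrative assumptions:

```java
import java.util.*;

// Sketch of the batch-averaging alternative discussed in the post.
public class SmoothedRanging {
    private final double alpha;                        // weight given to the newest reading
    private final Map<String, Double> smoothed = new HashMap<>();

    public SmoothedRanging(double alpha) { this.alpha = alpha; }

    // Feed one batch of (beaconId -> estimated distance) readings.
    // A first reading is stored as-is; later readings are blended with
    // the running average. Beacons absent from the batch are untouched.
    public void update(Map<String, Double> batch) {
        batch.forEach((id, d) -> smoothed.merge(id, d,
                (old, fresh) -> alpha * fresh + (1 - alpha) * old));
    }

    // Closest beacon according to the smoothed distances, if any.
    public Optional<String> closest() {
        return smoothed.entrySet().stream()
                .min(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey);
    }
}
```

Letting absent beacons keep their stale average is exactly the trade-off discussed above: it survives a single missed scan, but a beacon the user has walked away from can linger as “closest” for a few batches.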

Hopefully this section of the post has helped to explain some of the nuances of deploying beacons as a way of representing zones in an open space. Generally, the “choose the closest in batch” heuristic seems to work quite well, but is not immune to flipping behaviour in places where ranging data is patchy. In the rest of this post, I’ll present the solution we used.

Our solution:

Our solution for dealing with the kind of issues described above is based on the State pattern. Each beacon is associated with a geofence zone around the beacon, with the beacon registering either an INSIDE (zone) or OUTSIDE (zone) state. The class representing each zone is called BeaconGeoFence and performs two functions. The first is to maintain a BeaconGeofenceState, which subclasses into GeoFenceInsideState and GeoFenceOutsideState; each BeaconGeoFence can only reference one of these states at a time. The second function the BeaconGeoFence class performs is to implement a BroadcastReceiver that listens for events broadcast by the other BeaconGeoFence zones. The BeaconGeofenceState class implements a single method (evaluateGeofence), which determines whether an instance of BeaconGeoFence should change its state, and then broadcasts the result of this evaluation to all other BeaconGeoFence instances. So the general idea is to create a model where beacons (geofence zones) can broadcast messages to one another and potentially change each other’s state, based on an evaluation of their own state.

To work through how this works in a bit more detail: initially, all BeaconGeoFence instances are initialised to the “outside” state with a default radius (6m) defining the geofence zone. When the main application class FloorPlanApplication is initialised, the geofence ranging process is started with a call to beaconManager.startRangingBeaconsInRegion, which kicks off the scanning process where the didRangeBeaconsInRegion(Collection<Beacon> beacons...) method is called every second or so. The collection of beacons represents the current batch of beacons in range. As explained above, to avoid flipping between beacons within range, we sort the batch by estimated distance and only consider the closest in the batch for evaluation. The corresponding GeofenceBeacon is the only one that has a chance to evaluate and change its state. What happens next depends on the distance of the selected GeofenceBeacon and on its current state.

If the current state is OUTSIDE, one of two things can happen:

  1. if the distance is less than the current radius value for the beacon, the BeaconGeofence changes its state to the GeofenceInsideState and broadcasts an ENTER event to all the other beacons.
  2. if the distance is greater than the current radius value for the beacon, the BeaconGeofence does not change its state and broadcasts a STAY_OUTSIDE event to the other BeaconGeofence instances.

In either case, the other BeaconGeofence instances must work out what to do in response to the broadcast event.

  1. If an ENTER event is received, the receiving BeaconGeofence must immediately change its state to the GeoFenceOutsideState. This action is meant to prevent more than one beacon being in an INSIDE state at the same time. The receiving BeaconGeofence also changes its radius threshold value to the value passed by the ENTER event. This ensures that only a beacon that has a distance closer than the one that triggered the ENTER event can produce a subsequent ENTER event.
  2. If a STAY_OUTSIDE event is received, the receiving BeaconGeofence instances do not need to change their state, as the closest beacon in the previous batch of readings was not near enough to trigger an ENTER event. But all the beacons increase their radius threshold, to make it easier next time for this or another “outside” BeaconGeoFence to push out the current “inside” beacon.

If, on the other hand, the closest in batch has an INSIDE state, one of the two things below will happen:

  1. if the distance is greater than the current radius setting for the beacon, the BeaconGeofence changes its state to OUTSIDE and broadcasts an EXIT event.
  2. if the distance is less than the current radius setting, the BeaconGeoFence does not change its state and broadcasts a STAY_INSIDE event to the other BeaconGeoFence instances.

Again, the other BeaconGeofences have to decide how to respond to each broadcast event.

  1. For an EXIT event, other beacons do not need to change anything. The purpose of this event is to capture the situation where the user walks out of a zone. As the state has now changed to OUTSIDE, the device will be able to trigger a new ENTER event if the user turns back and enters the zone again.
  2. For a STAY_INSIDE event, all other beacons change their radius to the latest reading from the INSIDE beacon. The INSIDE beacon itself keeps its original ENTER distance radius.
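Pulling the event flow above together, here is a condensed, platform-free sketch of the algorithm. The real implementation uses Android broadcasts and the AltBeacon ranging callback; here the broadcasts are simulated with direct method calls, and details such as the radius step size are simplified assumptions rather than our production code:

```java
import java.util.*;

// Condensed sketch of the broadcast/state algorithm described above.
public class BeaconGeofenceDemo {
    enum State { INSIDE, OUTSIDE }

    static class BeaconGeofence {
        final String id;
        State state = State.OUTSIDE;
        double radius;                       // current geofence threshold in metres

        BeaconGeofence(String id, double defaultRadius) {
            this.id = id;
            this.radius = defaultRadius;     // e.g. the 6m default
        }

        // Called only for the closest beacon in the current batch.
        void evaluate(double distance, List<BeaconGeofence> others) {
            if (state == State.OUTSIDE) {
                if (distance < radius) {
                    state = State.INSIDE;                       // ENTER
                    others.forEach(o -> o.onEnter(distance));
                } else {
                    others.forEach(BeaconGeofence::onStayOutside); // STAY_OUTSIDE
                }
            } else {
                if (distance > radius) {
                    state = State.OUTSIDE;   // EXIT: others need do nothing
                } else {
                    others.forEach(o -> o.onStayInside(distance)); // STAY_INSIDE
                }
            }
        }

        void onEnter(double enterDistance) {
            state = State.OUTSIDE;           // only one beacon may be INSIDE
            radius = enterDistance;          // must beat the entering beacon's distance
        }

        void onStayOutside() {
            radius += 0.5;                   // relax threshold (step size is an assumption)
        }

        void onStayInside(double insideDistance) {
            radius = insideDistance;         // must beat the inside beacon's last reading
        }
    }

    // One ranging batch: pick the closest reading and let only that geofence evaluate.
    static void processBatch(Map<String, Double> batch, Map<String, BeaconGeofence> fences) {
        batch.entrySet().stream()
                .min(Map.Entry.comparingByValue())
                .ifPresent(closest -> {
                    BeaconGeofence target = fences.get(closest.getKey());
                    List<BeaconGeofence> others = new ArrayList<>(fences.values());
                    others.remove(target);
                    target.evaluate(closest.getValue(), others);
                });
    }
}
```

The important property is that a batch containing a single “outside” beacon still cannot trigger an ENTER event unless that beacon beats the last broadcast distance of the current “inside” beacon, which is what suppresses the flipping behaviour seen in the Moto4G trace.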

The above algorithm solves two problems we encountered using iBeacons on Android devices. First, the difference in range distances calculated for different devices. In this algorithm, the optimal range (radius) for each beacon can change from the initial default value, allowing the device to self calibrate. The first ENTER event sets a benchmark range for all the beacons to beat, and subsequent STAY_INSIDE broadcast events reinforce the distance used to evaluate whether a beacon should change its state. The algorithm also handles a situation where a single “outside” beacon is the only beacon in range, and so by definition is the closest in the batch. When this BeaconGeofence is evaluated, to produce an ENTER event it has to beat (have a lower value than) the last distance produced by a previous STAY_INSIDE broadcast – that is, it must beat the last known distance of the current “inside” beacon. However, unlike a floating average, this algorithm does not preclude a single reading from generating an ENTER event.

Overall we found this solution worked reasonably well, mostly preventing annoying flipping between zones, even when data was patchy. There was an impact on responsiveness, but not too severe. Looking at a range of log traces, we found the algorithm typically required an extra scan (about 1 second per batch) of delay beyond the optimal point for triggering a state transition. So generally, we found the cost was a second or so. If your zones are reasonably large, this is probably an acceptable level of delay, as the user will take a few seconds to walk through the zone, which provides enough time to trigger an ENTER event. For smaller zones, it is still possible for the user to walk straight through the zone without generating an event. There is clearly still a lot of work to do in tweaking the algorithm, or trying out some alternative techniques, but we did feel we made some progress deploying iBeacons as a way of detecting a device as it moves through non-contiguous zones. It will also be useful to check whether the new Eddystone protocol produces more consistent behaviour across Android devices.

 

Sep 25 2015
 

In conjunction with our Main Library colleagues, the EDINA Mobile Internet project has developed a self-guided tour app using beacons for the main library.

The app has been specifically designed for the Doors Open Days on the 26th and 27th of September 2015. It will allow users to explore the University of Edinburgh’s Main Library through an interactive tour enabling you to learn about the history of the building, discover the exhibitions space and find out more about the library’s varied services and world-class collections.

The app uses beacons to pop up a series of videos as you explore the building (see screenshots below). It is available for download in iOS and Android versions from the Google Play store and the Apple App Store:

Android : https://play.google.com/store/apps/details?id=fplan.edina.ac.uk.fplan&hl=en_GB

iOS: https://itunes.apple.com/gb/app/main-library-tour/id1040515101?mt=8

Or visit the stores and search for “Main Library Tour”

Screenshots from the Main Library Tour app

Sep 02 2015
 

People are spending more time on their mobiles… social media accounts for more than 20% of time spent, compared to under 10% on desktop. Currently consumers spend 45% of their internet time on computers, 40% on mobiles and 15% on tablets.

Mobile internet time is more heavily skewed towards social networking and games, whilst desktop is more loaded towards email and entertainment such as film and multimedia.

Read the Guardian article online.

All that’s based on an IAB report with more related data and survey findings showing an increase in advertising spending for mobiles too.

Aug 06 2015
 

The UK is now a “smartphone society”

Smartphones have overtaken laptops as the most popular device for getting online, Ofcom research has revealed, with record ownership and use transforming the way we communicate.

Two thirds of people now own a smartphone, using it for nearly two hours every day to browse the internet, access social media, bank and shop online.

Ofcom’s 2015 Communications Market Report finds that a third (33%) of internet users see their smartphone as the most important device for going online, compared to 30% who are still sticking with their laptop.

The full, downloadable report is on their website.

Jun 08 2015
 

An interesting article in ‘Computing’ asserts that investment in mobile application development will stand organisations in good stead when it comes to dealing with the Internet of things. It forecasts that there will still be a focus on development for smartphones and tablets, however, more organisations will start to experiment with sensor enabled and wearable connected and embedded devices.

It doesn’t paint an entirely rosy picture: it states that organisations are still dealing with a mobile app backlog despite the adoption of agile approaches by developers and DevOps. The article ends by asking the question – ‘how will organisations keep abreast of the demands of Mobility and the Internet of Things?’

The full article can be found here.

Mar 13 2015
 

Prompted by seeing two events listed, I’ve just had a quick scour of the web for other mobile-related conferences this year.

Perhaps surprisingly, when you think of it, there’s still a lucrative circuit of charged, seated events round the country and globe – rather than, or perhaps as well as, a network of virtual spaces where ideas are exchanged. We’ll continue to monitor them and attend the odd one or two. For reference, and to save a few precious minutes for anyone else with a similar curiosity, here’s a short summary of today’s quick search:

The Third Annual Future of Education and Technology Conference 2015: Transforming Education through Digital Technology – Friday 13th March 2015, University of Salford, Manchester  (sic).

A couple of other more general ones:  Digital Media Strategies 2015, just passed this week in London and The Guardian Changing Media Summit there as well, next week. Another one to read the write-up is the Mobile World Congress 2015, this year in Barcelona a few days ago (conveniently reviewed in the Guardian already). Looking ahead, this weekend sees the 11th International Conference on Mobile Learning 2015 in Madeira, Portugal, then the UK Mobile Government  Summit in London on Tuesday, St Patrick’s Day;  next month, there’s a university-based one-day’er – the grandly titled Future of Mobile and Technology Enhanced Learning in Higher and Further Education Conference 2015 in Salford again.

Skipping ahead to the early Summer, although not Mobile per se, there’s an interesting-looking gathering planned under the heading Enabling Transformational Change and Innovation in Higher Education via Technology, in London in June. Followed by at least two dedicated, seriously Mobile conferences in August, including the International Conference on Mobile Computing & Networking in Birmingham.

Then, one for the Fall, a November-scheduled ForumOxford: Mobile Apps and Technologies Conference 2015 already advertised.

And lastly the one that set me off on this, a free webinar next Wednesday lunchtime which I’ve just registered for: “Mobile learning in practice: special educational needs and essential skills: Phase two of the [Jisc] mobile learning guide”. I’m looking forward to that, after a slightly gruelling trek to Birmingham for Jisc’s Digifest at the beginning of this week (which had at least one relevant session on this theme, Mobile Learning in Practice).

With this data deluge, as only the tip of iceberg, there’s no shortage of themes and insights being offered for us to digest then inform our work. Watch this space, along with plenty of others, to see what we make of it.

Jan 23 2015
 

The technology strategy document (linked below) is intended to complement the work of EDINA’s Mobile Internet project. That project is, in part, designed to help generate a ‘mobile strategy’ for EDINA through a process of research, development, reflection and synthesis. This document will therefore undergo substantial revision as the Mobile Internet project progresses, and will constitute an expression of the deepening understanding within EDINA of ‘mobile’ as a strategic technology issue. The aim is to ensure that EDINA:
• can and does respond proactively to the challenges presented by the growing significance of ‘mobile’
• is equipped and well-positioned to exploit opportunities indicated by this new paradigm
• is positively recognised by significant stakeholders (Jisc, University of Edinburgh) as a centre of expertise in the development and delivery of mobile services

– EDINA mobile development strategy_revision_3

Jan 06 2015
 

Hands free, voice activated and designed to be as unobtrusive as possible, Google Glass is one example of how wearable technology is changing how we think about mobile technology.

The University, in partnership with Google, has embarked on a project investigating how this technology can be used to support learning, teaching and the general day to day experience at the University.

To this end, the University is encouraging small groups (Max 4) to submit ideas of how this technology can be used; the key words are innovation and creativity.

There is a website (http://glass.ed.ac.uk/) that has further details regarding the project.

Nov 13 2014
 

Sensewhere is a company specialising in indoor positioning solutions. Its main product is purely software based, with no requirement for additional hardware.

On Nov 4th they launched their Software Development Kit which is licence free to mobile device manufacturers and platform providers.

The brief for the Sensewhere developers kit can be found here.

 

Nov 04 2014
 

“Isolated apps are out; “micro-moments” are in”

In a recent report, technology analysts Forrester Research are predicting that big changes are coming in mobile development.

They are predicting 8 significant changes to the mobile development ‘market’ in 2015:

  1. Standalone apps will lose their luster
  2. Hardware-driven innovation will enable new opportunities
  3. Mobile competition will shift to accessories and ecosystems
  4. Composition will dominate front-end mobile experiences
  5. The merger of physical and digital worlds accelerates
  6. Mobile context becomes high-def
  7. Service virtualization and API design tools will appear in every development toolbox
  8. Low-code platforms will move into the aggregation tier, but struggle to go mainstream

They see the driver for these changes as users moving

“… away from apps, and toward more contextually relevant micro-moments, delivered across families of devices, that are personalized to anticipate unique customer needs.”

There is a summary article on ‘readwrite’ here and the full report is available here.