Measure The Effectiveness of Your Mobile Campaign in Driving Footfall

Need to better understand how well your mobile campaign drove people to visit the store?

On Device Research’s Mobile Store Effect will tell you...

  • Visitation Impact: How many people visited the store as a result of the campaign?
  • Trip Impact: How often did they visit the store?
  • Dwell Time: How long did they stay?

On Device Research uses mobile surveys to measure marketing effectiveness in both developed and emerging markets.

Using proprietary technology, access to consumers and best practice, On Device Research helps some of the world’s leading brands, ad agencies and media owners understand the effectiveness of their online and offline advertising.

So far On Device Research has delivered over 35 million surveys across 92 countries.

How We Do It

The key building blocks for our solution are our tracking technology, proprietary cleansing algorithm, survey platform technology and our fully opted in device ID panel.

1. We embed our tracking URL into the ad creative. We verify that we are collecting Device IDs correctly.
2. Campaign goes live and we capture handsets that have been exposed to the creative.
3. The campaign goes live and we match exposed IDs against Geo Wave’s panel of continuously "always on" consumers, which allows us to track users' store visits.
4. We create exposed & control groups of handset owners matched on location behaviours, app usage & visitation of the advertiser’s retail brand, and determine the total impact of the campaign in driving store visitation.
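The final step, comparing store visitation between the exposed and control groups, can be sketched as a simple visit-rate uplift calculation. This is an illustrative sketch only: the group sizes and visit counts below are made up, and the real methodology uses matched groups as described above.

```python
# Sketch: campaign impact as the visit-rate uplift of the exposed group
# over a matched control group. All figures below are illustrative.

def visit_rate(visitors: int, group_size: int) -> float:
    """Share of a group that visited the store at least once."""
    return visitors / group_size

def footfall_uplift(exposed_visitors: int, exposed_size: int,
                    control_visitors: int, control_size: int) -> float:
    """Relative uplift of the exposed group's visit rate vs. control."""
    exposed = visit_rate(exposed_visitors, exposed_size)
    control = visit_rate(control_visitors, control_size)
    return (exposed - control) / control

# Example: 12% of exposed handsets visited vs. 10% of control -> +20% uplift
print(f"{footfall_uplift(360, 3000, 300, 3000):+.1%}")  # +20.0%
```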

What We Measure

We measure GPS & WIFI positions on a continuous 'always on' basis.

Depending on the frequency of store visits, we can measure:

  • Total Store Effect: Pre vs Campaign + 2 weeks post
  • Short Term Store Effect: Pre vs Campaign
  • Sustained Store Effect: Pre vs 2 weeks post
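The three effects above compare visit rates across different measurement windows. A minimal sketch of how a store visit might be assigned to a window, assuming a 4-week campaign with purely illustrative dates:

```python
from datetime import date

# Illustrative 4-week campaign; dates are assumptions, not real figures.
campaign_start = date(2024, 3, 4)
campaign_end = date(2024, 3, 31)
post_end = date(2024, 4, 14)  # campaign end + 2 weeks

def window(visit: date) -> str:
    """Classify a store visit into the pre / campaign / post window."""
    if visit < campaign_start:
        return "pre"
    if visit <= campaign_end:
        return "campaign"
    if visit <= post_end:
        return "post"
    return "outside"
```

Total Store Effect then compares "pre" against "campaign" plus "post" visits, Short Term Store Effect compares "pre" against "campaign", and Sustained Store Effect compares "pre" against "post".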

Our Robust Matching Criteria

Test (exposed) & control groups are matched on:

  • City/region of living and working
  • Weekday/weekend geo behaviour
  • Average store visitation patterns
  • App usage levels

This enables the true impact of mobile advertising to be isolated.
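One simple way to realise this kind of matching is nearest-neighbour pairing: for each exposed handset, pick the unexposed handset with the same city/region and the most similar behaviour. The feature names, weights and figures below are illustrative assumptions, not the actual matching algorithm.

```python
# Sketch: pair each exposed device with its most similar unexposed device.
# Features and distance weights are illustrative assumptions.

def match_control(exposed: dict, candidates: list[dict]) -> dict:
    """Pick the unexposed device most similar to an exposed one."""
    # Hard match on city/region, then minimise behavioural distance.
    same_region = [c for c in candidates if c["region"] == exposed["region"]]
    def distance(c: dict) -> float:
        return (abs(c["store_visits"] - exposed["store_visits"])
                + abs(c["app_hours"] - exposed["app_hours"])
                + abs(c["weekend_share"] - exposed["weekend_share"]))
    return min(same_region, key=distance)

exposed = {"id": "E1", "region": "London",
           "store_visits": 3, "app_hours": 2.0, "weekend_share": 0.4}
pool = [
    {"id": "C1", "region": "London",
     "store_visits": 3, "app_hours": 2.1, "weekend_share": 0.38},
    {"id": "C2", "region": "London",
     "store_visits": 9, "app_hours": 6.0, "weekend_share": 0.9},
    {"id": "C3", "region": "Leeds",
     "store_visits": 3, "app_hours": 2.0, "weekend_share": 0.4},
]
print(match_control(exposed, pool)["id"])  # C1
```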


Grocery Store Case Study

Campaign Objectives:

To promote value for money for fresh fruit & vegetables

Total Impact:

% who visited the Grocery Store during the 4-week campaign + 2 weeks post-campaign


Location Accuracy FAQs

How is our location data collected?

  • 1st party SDK location data that comes directly from the user’s operating system (GPS).

How accurate is the GPS data vs. lat/long?

  • GPS is accurate to within a metre. The GPS supplies the lat/long coordinates, so they are as accurate as the GPS data itself.

What makes our data collection unique vs. other providers?

  • 1st party data: we collect background location data, as opposed to bid stream data, which only captures app-open events.

How accurate is our location data? What level does it go down to?

  • Accurate up to a metre.

What is the minimum number of locations we would recommend for a study to be accurate/robust?

  • We recommend a minimum of 3,000 devices per study to create an accurate and robust study.

Is it only footfall in a brand’s own locations we are able to report on, or can we also look at competitor footfall in comparison and see if there is a shift across to theirs?

  • Yes, we can look at all competitors, as long as the lat/longs are provided.

How far pre/post can we look at footfall for a campaign?

  • We hold data since 2014 and can look back 18 months or more.

What verticals do we find the most successful with and why?

  • Any vertical whose locations cover a large enough area and are distinct, i.e. not surrounded by other POIs such as shopping centres.

Our key USPs

  • 1st party background data. We are the only company to include dwell time in these reports; for a considered purchase, only our dwell time reports will show campaign success where visits should actually decrease at the end of a considered purchase funnel.
  • Representativity: our SDK is placed in 15 of the top 50 news and entertainment apps, local business apps and navigation apps.
  • No bidstream data.

What if the 5 minute refresh means we miss someone when they enter a store? How does this get attributed?

  • The refresh is ten minutes, so if you enter a store and stay for less than five minutes, we may not capture the visit. However, because we use background data, our sample size is bigger and more accurate.

At what point is someone picked up as ‘in the store’? Is it at the entrance or at a central point in the middle?

  • It depends on the radius set: once the user enters the store’s radius, we are able to pick them up.
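A radius-based visit check like this can be sketched as a simple geofence test: compute the distance from a GPS fix to the store's lat/long and compare it against the radius. The store coordinates and 50-metre radius below are illustrative assumptions, not the product's actual settings.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/long points, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_store(fix_lat: float, fix_lon: float,
             store_lat: float, store_lon: float,
             radius_m: float = 50.0) -> bool:
    """True if a GPS fix falls within the store's geofence radius."""
    return haversine_m(fix_lat, fix_lon, store_lat, store_lon) <= radius_m
```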

For a campaign targeting, for example, 500,000 people: if On Device only has 5% of those people in its 2.5m pool, won’t this provide an inaccurate reflection of true footfall uplift?

  • No, as long as the sample is representative of the population, the measured uplift remains an accurate reflection.

Is it possible that the report would show a negative footfall uplift? How can this be explained to our client (agency client/ brand)?

  • It is possible for a campaign to show a negative footfall uplift; it all depends on the campaign. For larger purchases (e.g. a car or sofa) a negative result is to be expected, because the longer purchase cycle means the measurement window misses the purchase.

How long does a report take to produce once tracking post campaign has finished?

  • 5 working days.

Which apps is our data collected from, how does the consumer give permission, and what is that permission for?

  • 15 of the top 50 news and entertainment apps, local business apps and navigation apps.
  • The consumer explicitly gives their permission for the publisher & relevant 3rd parties (i.e. our location technology partner) to share their location data. They have complete control to turn this off at any point.

Platforms we work with