Predictive Analytics: Witchcraft or just a data problem?


[Image: Museum of Witchcraft]

Somewhere along the way I picked up an observation that went: "Some people think they have a 'Big Data problem' when actually they have a big 'data problem'". This supports a previous blog about getting your house in order first; it also got me thinking about the barriers to adoption of insight analytics, and which are the likely candidates for stopping organisations embracing analytical tools in support of their daily operations.

I have encountered a number of challenges to date, summarised as:

  • Cultural: scepticism (witchcraft) that any value could be gleaned, or "we've been doing analytics for years" (reporting analytics does not count; these are, after all, retrospective views of only 15% of your data!)
  • Costs: unknown or unquantifiable costs of setting up a service without a solid business case or ROI
  • Privacy: data integrity and the moral hurdle of using the data in a new way (please read Alistair Croll's (Twitter: @acroll) blog on this being our 'Civil Rights Issue' – I like this, by the way)
  • Technical – I've tried to split these challenges out further:
    • Technical: data consistency/collation
    • Technical: infrastructure/resources
    • Technical: access to skills

Clearly a willingness to even consider a project is perhaps the first barrier, but not necessarily the largest – at least not in my view. You can navigate these with simple examples. A US-based credit card company conducted some risk analysis to identify the groups of customers who represented a higher risk of late payment, as a mechanism to mitigate risk and allow accurate income forecasts from the customer base. Once the data was analysed, they found that customers who used their cards in bars were likely to be consistent late payers, as were those who bought non-brand motor oils; conversely, those who bought felt pads for the bottoms of chair legs and customers who used their cards at the dentist were good payers.

This rather simplistic behavioural profiling of their data offers them a more robust risk model and a mechanism to intervene; granted, it also elicits the behaviour of reducing your credit limit because of it (I refer to Alistair's blog again).
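For the sceptics, the mechanics really are that mundane. Below is a toy sketch of this kind of category-based profiling in Python; the categories, risk weights and customer data are my own made-up assumptions for illustration, not the card company's actual model.

```python
# Toy sketch of category-based late-payment profiling.
# Categories, risk weights and customer data are all illustrative
# assumptions, NOT the card company's actual model.
from collections import Counter

# Hypothetical per-category late-payment rates, as if learned from history
category_risk = {
    "bar": 0.35,
    "non_brand_motor_oil": 0.30,
    "felt_pads": 0.05,
    "dentist": 0.07,
}

def late_payment_risk(transactions):
    """Average the risk weights of the categories a customer buys in."""
    counts = Counter(t["category"] for t in transactions)
    known = {cat: n for cat, n in counts.items() if cat in category_risk}
    if not known:
        return None  # no behavioural signal for this customer
    total = sum(known.values())
    return sum(category_risk[cat] * n for cat, n in known.items()) / total

customer = [
    {"category": "bar", "amount": 42.00},
    {"category": "dentist", "amount": 120.00},
]
print(late_payment_risk(customer))  # 0.21 on this made-up data
```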

However, the cultural challenges will, I believe, be overcome by a growing number of success stories and a consciousness of our responsibilities for what we do with the data. Whilst I may be immersed in it and it seems like a real no-brainer to me, I understand that education takes time and that corporate conscience needs to be paramount.

With my patience preserved, then, I'd like to focus on a specific challenge: data. The data component is, after all, the fuel for the engine of any prediction tool, and I have heard it cited many times as a primary reason for not adopting – whether it be preparing the data ready for analysis (cleansing, collation and quality):

  • “I have to create a copy of operational data – separation is required”
  • “The data then needs to be structured into a data mart for initial review”
  • “I don’t even know what data set to start with”

Or, simply the infrastructure & tools required to start the process.

This, to me, is such an 'Old Skool' approach to today's technology problem. Granted, if you're led by traditional BI thinking you will naturally approach this with a 'sandbox strategy'. The reality is that there are tools and services available today that will allow you to keep data in place without operational impact, with innovative delivery processes that allow you to adopt services with a low cost of entry (SaaS), and where all the necessary skills are of course available.

This approach allows you to explore your data and its viability for insight. Whilst I am not saying you have a gold mine at your feet, you need to approach this from the perspective that you want to find better ways of making decisions so you can improve your outcomes, or the outcomes of your interest groups. After all, if the credit card company can manage its risk better from buying habits, which is a pretty one-dimensional approach, surely some value exists in even the most mundane of data sets.

A case in point from my own stable was building access data: a CFO was able to determine that a number of buildings across his estate could be disposed of, simply because his new view of traffic from swipe cards showed building usage. Similarly, it would allow you to determine whether heating was required, or when it was required, from the traffic pattern – ergo, operational cost savings. A simple long-tail data set that was previously useless has now delivered new, actionable insight – with a £Million outcome.
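To show how little is involved, here is a minimal sketch of turning swipe-card logs into building-usage figures with pandas; the column names, sample records and 'underused' threshold are assumptions of mine for illustration, not the actual estate data.

```python
# Minimal sketch of deriving building usage from swipe-card logs.
# Column names, sample records and the threshold are illustrative
# assumptions, not the real estate data.
import pandas as pd

swipes = pd.DataFrame({
    "building": ["HQ-North", "HQ-North", "Annex-3", "Annex-3", "Annex-3"],
    "timestamp": pd.to_datetime([
        "2013-03-04 08:55", "2013-03-04 17:40",
        "2013-03-11 09:10", "2013-03-18 12:30", "2013-03-25 16:05",
    ]),
})

# Average swipes per active day shows which buildings are actually used
daily = (
    swipes.groupby(["building", swipes["timestamp"].dt.date])
          .size()
          .groupby("building")
          .mean()
)

UNDERUSED_THRESHOLD = 5  # swipes/day; an arbitrary cut-off for illustration
print(daily[daily < UNDERUSED_THRESHOLD])  # candidates for disposal or reduced heating
```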

I have seen many PhDs promoting the adoption approach as complex, and whilst I'm not advocating a fools-rush-in approach, I am saying that I have customers who, from a standing start, have had actionable insight in weeks, without impacting any current business activities or incurring any real expense – and which supported the development of a business case for a return of £Millions by eliminating churn.

This isn't witchcraft after all – it's just a perspective, and we all gain from different ones.

Perhaps you have a perspective that you'd like to share by casting a vote?
