
For data to be the “new oil” it needs to pass through the refinery

20 August 2020

By Will Tingle

In 2006, mathematician Clive Humby coined the phrase “Data is the new oil”. Ironically, the pandemic saw oil prices plummet into negative territory for the first time in history, whilst data continues to be fundamental in our response to the crisis.

Big Data is no longer new, but organisations still struggle to collect data that is truly fit for purpose. As with oil, the first stage of the data journey is extraction: mining enough raw material to make refining viable and useful.

This initially proved difficult for Public Health England, which in mid-April was performing around 15,000 COVID-19 tests a day. This made it extremely difficult to comprehend the true scale of the pandemic at its peak. Whilst the official number of reported cases in the UK stands at about 310,000, the Office for National Statistics estimates that the actual figure may be closer to 4.2 million, based on a sample of antibody tests.

The figure for daily tests now stands at more than 100,000, meaning the raw data from which to draw insights and conclusions is much improved. This should stand us in good stead to avoid a second nationwide lockdown, since better data should allow any future lockdown to be confined to a highly targeted area.

Data abundance is one thing; ensuring the data you have is accurate and serviceable is just as fundamental to effective decision making. Guidance from Oxford University Hospitals states that approximately 70% of COVID-19 tests are likely to be accurate. This poses a great risk: someone who is actually COVID-19 positive could receive a negative result and thereafter go about their day-to-day life as usual, infecting others.
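To see why a 70% accuracy rate matters at scale, here is an illustrative back-of-the-envelope calculation. The test volume is the figure cited above and the accuracy figure is the one from Oxford University Hospitals; the positivity rate is purely an assumption for the sake of the example:

```python
# Illustrative back-of-the-envelope calculation: how many infected people
# might receive a false "all clear" if roughly 70% of tests are accurate.
# The positivity rate below is an assumption, not an official statistic.

daily_tests = 100_000        # tests performed per day (figure cited above)
test_accuracy = 0.70         # share of results assumed to be correct
positivity_rate = 0.05       # assumed share of tested people who are infected

infected_tested = daily_tests * positivity_rate
false_negatives = infected_tested * (1 - test_accuracy)

print(f"Of {infected_tested:.0f} infected people tested each day, "
      f"roughly {false_negatives:.0f} could be wrongly told they are negative.")
# Of 5000 infected people tested each day, roughly 1500 could be wrongly told they are negative.
```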

Oil fields of data not fit for purpose

Data accuracy is also a key issue for organisations and marketers. Basic steps to ensure the legitimacy of underlying data can save significant time and money for marketers simply by preventing the pursuit of customers who lack the intent or ability to make a purchase.  

Let me illustrate this with a guilty confession. Sometimes I have entered my details in capture forms using the email address [email protected] rather than [email protected]. And it wasn't even by accident – the cheek of it! Why did I do this? Probably because I'm really not interested in being bombarded with emails about a particular product; I most likely submitted the details just to view an article. I therefore lacked the intent to make a purchase in this instance.

The result of my reckless actions is that companies are sitting on oil fields of data that are not fit for purpose. One study has shown that as much as 30-40% of all data captured in online forms is unusable.

Big data and finding value

This explains why, as with oil, data must also pass through the refinery – a stage that is all too often neglected. It is the missing link between harvesting data and making it useful.

Many companies are incorporating data into their strategy, but they prioritise the collection of data over its application and fail to leverage it in the right way to achieve successful outcomes. True value lies in software that can examine this input data and discard or clean the 30-40% that is unusable, enabling marketers to concentrate their efforts on genuine leads.
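As a minimal sketch of what such refining software might do (the patterns and domain lists below are hypothetical illustrations, not any particular product's rules), consider a simple filter that discards form submissions whose email addresses look fake or disposable:

```python
import re

# Minimal sketch of a lead-refining step: flag form submissions whose
# email addresses look fake or throwaway, so marketers chase genuine leads.
# The pattern, domains and local parts below are illustrative assumptions.

EMAIL_PATTERN = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")
THROWAWAY_DOMAINS = {"mailinator.com", "example.com", "test.com"}
SUSPICIOUS_LOCAL_PARTS = {"test", "asdf", "noemail", "none"}

def is_usable_lead(email: str) -> bool:
    """Return True if the email passes basic legitimacy checks."""
    email = email.strip().lower()
    if not EMAIL_PATTERN.match(email):
        return False                      # malformed address
    local, _, domain = email.partition("@")
    if domain in THROWAWAY_DOMAINS:
        return False                      # known disposable domain
    if local in SUSPICIOUS_LOCAL_PARTS:
        return False                      # obviously fake local part
    return True

leads = ["jane.doe@company.co.uk", "test@test.com", "not-an-email"]
usable = [e for e in leads if is_usable_lead(e)]
print(usable)                             # ['jane.doe@company.co.uk']
```

A real refinery would layer on deliverability checks, intent signals and deduplication, but even a crude filter like this begins to separate genuine leads from the 30-40% of noise.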

Companies capture hoards of data with our every click of a button. However, marketers can be blinded by this overabundance and need tools to examine the veracity of the underlying information, so they can concentrate their resources on a more select data set.

Solutions that transform vast quantities of data into accurate and usable information are badly needed. They can achieve marked results in sales conversion and deliver immediate cost and efficiency savings.

So just remember: like oil in its crude form, data is often too messy to be useful. A few steps to refine the information can go a long way. Businesses that can combine abundance and accuracy of data have a competitive edge and can extract and deliver huge value.

If you are a private equity firm or strategic investor and would like to find out more about opportunities in the field of Big Data, please contact Will Tingle, M&A Manager or Damian Ryan, M&A Partner.

Please also contact Will or Damian if you are a business owner working in the field of Big Data seeking to grow or exit your business.

Damian Ryan, Media and Entertainment Partner, M&A, BDO:

“The BDO Martech report highlighted that the manipulation and interrogation of data is a key challenge for marketers. Clever software solutions can help to automate these processes, simultaneously reducing media spend and increasing sales conversion.”

Robin Caller, Founder and CEO, Overmore Group:

“Most firms don’t measure the total cost of incorrect or incomplete data. To do this, the “total cost of attempting to sell” needs to be known on a per lead basis. As most leads fail to convert, and most firms spend more trying to convert a lead than generating a lead, there is always more waste in the “lead to revenue” funnel than anywhere else. Helping real people get their enquiry data correct is one key measure. Keeping the bad data out of the cycle is the other.”