Building Data Castles in the Air?

In Data Governance, Data Management, Data Quality, IT Strategy & Management by IRM UK


In the halcyon days of my youth one of my favourite songs was Don McLean’s ‘Castles in the Air’. It highlighted McLean’s emotional dislocation between his desire to lead a tranquil, contented life and the pressures of the city in which he lived. The phrase ‘castles in the air’ refers to plans which have little or no chance of succeeding. I guess he realised his ambition was little more than a forlorn hope at the time he wrote the song. A perusal of the media in recent weeks showed me how much of data exploitation and management also appears focused on building castles in the air.

Undoubtedly, data’s potential to revolutionise society and all our lives is an exciting and enticing prospect. I recently watched an excellent BBC documentary on how data can do this. Entitled ‘The Joy of Data’ and presented by Dr Hannah Fry, a mathematician at University College London, it highlighted how data acts as a connecting bridge between the domain of concepts, numbers and abstractions and the real world. Data provides us with the facts and evidence to reduce uncertainty in our lives and society, and can improve our personal and organisational decision making.


Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Nigel will be presenting the full day workshop Making Enterprise Data Quality a Reality at IRM UK’s Enterprise Data & BI Conference Europe 2016, 7-10 November, London.


In the programme she went on to provide examples of how Big Data in particular is going to do this, exemplified in a use case of the city of Bristol in the UK. Bristol has bold and ambitious plans to become Britain’s first truly digital city. It is starting to collect, store and analyse large quantities of data gathered from sensors and other Internet of Things (IoT) devices to transform how the city cares for its elderly, ill and vulnerable, controls traffic congestion, and generally to make the city a more pleasant place to live. Maybe Don McLean should move there.

But this optimistic, utopian view of data and its potential power has contrasted sharply with several data horror stories that have recently appeared in the media. A study by the ECRI Institute ([1]) found that many US health care organisations treat patients incorrectly because they confuse them with other patients with the same or similar names. This happened no fewer than 7,613 times in the two-year period of the research. Needless to say, the impact of this poor management of data quality can be catastrophic. Closer to home, in the UK the National Health Service (NHS) in England reported that up to 5% of patient data held in General Practitioners’ lists is inaccurate, obsolete or duplicated. ([2]) This also increases the chances of treatment errors, and has financial implications too, as the NHS pays GPs £136 for each registered patient, money it can ill afford to waste while it is under severe financial pressure.

And it’s not only in health that problems with basic data quality have been exposed. The UK energy regulator OFGEM found that some energy companies routinely overcharge customers because of billing errors caused by a muddle over imperial and metric gas meter readings, which measure gas consumption in different units. Some people have been overcharged by as much as 130% for their gas over many years. This is bad news for consumers, but also for the suppliers concerned, as OFGEM has ordered them to sort out the mess and refund affected customers, a complex and costly undertaking. ([3])
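To see how easily such a units muddle inflates a bill, here is a minimal sketch in Python. The conversion and calorific figures below are typical published UK values, used here as illustrative assumptions, not OFGEM’s exact methodology or any supplier’s actual billing code.

```python
# Imperial gas meters record in hundreds of cubic feet; metric meters in
# cubic metres. UK gas bills convert the volume to kilowatt-hours.
# Figures below are typical illustrative values (assumptions).

CUBIC_FEET_100_TO_M3 = 2.83   # one imperial unit (100 cubic feet) ~ 2.83 m3
VOLUME_CORRECTION = 1.02264   # standard volume correction factor
CALORIFIC_VALUE = 39.5        # megajoules per m3 (varies by region and day)

def kwh_from_metric(units_m3: float) -> float:
    """Convert a metric meter reading (m3) to kilowatt-hours."""
    return units_m3 * VOLUME_CORRECTION * CALORIFIC_VALUE / 3.6

def kwh_from_imperial(units_100ft3: float) -> float:
    """Convert an imperial reading (hundreds of cubic feet) to kWh."""
    return kwh_from_metric(units_100ft3 * CUBIC_FEET_100_TO_M3)

reading = 100  # one quarter's consumption, in the meter's own units

correct = kwh_from_metric(reading)    # metric meter, handled correctly
muddled = kwh_from_imperial(reading)  # same reading billed as if imperial

print(f"Correct bill basis: {correct:.0f} kWh")
print(f"Muddled bill basis: {muddled:.0f} kWh")
print(f"Overcharge: {100 * (muddled / correct - 1):.0f}%")
```

A single wrong flag on a customer record silently multiplies every bill by the conversion factor; the exact overcharge in real cases depends on the tariff and how the error was made.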

But no article about the perils of poor data quality would be complete without another retail blooper. On its UK website, Hewlett Packard (HP) inadvertently reduced the price of a high-spec workstation from £2,278.30 to the bargain price of £1.58. ([4]) Needless to say, lots of sharp-eyed bargain hunters ordered one before HP spotted the error and corrected it. Ultimately HP refused to sell the machines at that price, citing a ‘processing error’, but was left red-faced when many would-be buyers contacted the UK press to complain. Lots of bad publicity ensued.

So what does all this positive and negative media exposure tell us about the data challenge? To reiterate, there is no doubt that data has the power to make a real difference to our lives and the wider world in which we live. But the exciting possibilities of Big Data, IoT, self-service Business Intelligence, Data Science, Analytics and so on can only be truly realised if they are anchored in proven, root and branch data management principles and practices. This may not be the most thrilling aspect of data work, any more than laying a castle’s foundations is as thrilling as building its soaring walls and turrets, but without it castles crumble and fall. It’s hard to deliver the vision of the digital world outlined by Hannah Fry unless you get the basics right, and it’s equally clear that many organisations are still failing to do so.

Focusing on the foundations is key to data success. There are a number of universal but essential practices. First, understand what data is really important to the successful operation of your company, whether it relates to your customers, products, suppliers, sales transactions, invoices and so on. You cannot rigorously control all the data you use, so put your focus on the things that really matter. Then make people formally responsible for this data through Data Governance, set data standards, analyse the data’s quality against those standards, and put business and IT processes in place to address the quality problems found. Also develop and implement policies to ensure that everyone in the organisation adheres to good data practices. Always make these foundational activities an integral part of any Big Data or Analytics project.
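Setting data standards and analysing quality against them can be sketched very simply. The field names and rules below are illustrative assumptions for the sake of example, not any specific organisation’s standards or tool:

```python
# A minimal sketch of rule-based data quality checks: each critical field
# has an agreed standard, and a batch of records is scored against them.
import re

# Hypothetical standards for critical customer fields (assumptions)
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "postcode": lambda v: bool(v and v.strip()),  # simple presence check
}

def assess(records):
    """Return the number of rule failures per field across a batch."""
    failures = {field: 0 for field in RULES}
    for record in records:
        for field, rule in RULES.items():
            if not rule(record.get(field)):
                failures[field] += 1
    return failures

customers = [
    {"customer_id": "C123456", "email": "jo@example.com", "postcode": "BS1 4ST"},
    {"customer_id": "123456", "email": "not-an-email", "postcode": ""},
]
print(assess(customers))  # {'customer_id': 1, 'email': 1, 'postcode': 1}
```

Reports like this give the people made responsible through Data Governance something concrete to act on: which critical fields are failing their standards, and by how much.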

Sound, trustworthy data provides the basis on which the digital world and its many artefacts can and must be built. If you don’t want to build castles in the air, but data edifices that endure, get the basics right. Don McLean wanted to get back to basics; organisations should look to do the same with their data.

([1]) Wall Street Journal, Europe edition, 25 September 2016
([2]) The Times (UK), editorial comment, 21 July 2016
([3]) BBC News website, 15 August 2016
([4]) BBC Technology website, 1 August 2016

Nigel Turner is Principal Information Management Consultant EMEA at Global Data Strategy.  He specialises in information strategy, data governance, data quality & master data management. During his consultancy career he has worked with over 150 clients, including British Gas, AIMIA/Nectar, HSBC, EDF Energy, Telefonica O2, the Chartered Institute for Personnel and Development (CIPD) and Intel US.  With more than 20 years’ experience in the Information Management industry, Nigel started his career working to improve data quality, data governance & CRM within British Telecommunications (BT), and has since used this experience to help many other organisations do the same.  Whilst at BT he also ran a successful Information Management and CRM practice of 200+ people providing consultancy and solutions to many of BT’s corporate customers.   He is also an elected member of the UK’s Data Management Association (DAMA) management committee.  In 2015 he was given DAMA International’s Community Award for setting up a mentoring scheme for data management professionals in the UK.  In 2007 fellow data professionals voted him runner up in Data Strategy magazine’s UK Data Professional of the Year awards.  Nigel is a well-known thought leader in data management and has published several white papers & articles and is a regular invited speaker at Information Management & CRM events. 

Copyright Nigel Turner, Global Data Strategy



  1. Good insights. There are certainly lots of exciting opportunities with IoT, Big Data, etc., but it’s important to have a solid foundation to build from. It’s often the intersection of new insights from, for example, social media sentiment analysis with the core customer data from a traditional data warehouse that provides the most value. I’ve seen several large organizations do this successfully. It is, as you point out, not an ‘either / or’, but a matter of using technologies successfully in concert.
