Data compiled by the Met Office for 2014 concludes that January was the wettest January on record for Southern England since records of such events began. At the same time, data from the UK Statistics Authority confirmed that funding for dealing with flooding and other natural disasters had been cut, in real terms, by almost £247m.

The great debate over what the data says, and how it is interpreted, continues: proponents claim funding is adequate, while opponents point to data showing the opposite. As recently as February 25th, however, Sir Andrew Dilnot, the head of the United Kingdom’s statistical watchdog, rebutted claims that there had been no funding cuts, calling instead for the government to publish official spending data “in the public interest”.

The point of our blog here is not to support or refute either side in this raging debate. The devastation caused by the flood waters is real, and real lives have been affected as a result. While this event has disrupted the lives and livelihoods of many, it has also provided a flood (no pun intended!) of data points that, if collected, collated and analysed properly, can help prevent (or at least mitigate) similar occurrences in future.

The key word, however, is “properly”! There’s no doubt that data plays an important part in almost every aspect of our society, especially when it comes to disaster readiness planning. Proper data collection, cleansing and dissemination are therefore vital.

The Environment Agency captures relevant data using automatic field devices and shares that data via telemetry with a broad array of stakeholders and systems, both internal and external. Much of this data will be available as “OpenData” until May 15th to support activities related to further prevention of winter flooding. Interested parties, such as The Royal Borough of Windsor and Maidenhead, The Met Office, and other councils and activist groups, among scores of others, rely on this data. The key question is: as the data comes from disparate sources and systems, and is freely shared with other upstream and downstream agencies and systems, can its reliability and integrity be assured for critical applications?

To ensure that level of data integrity, the data needs to be cleaned and “sanitised” before being integrated into subsequent data processing systems. And that’s where WinPure excels! It delivers a comprehensive suite of data cleansing tools used by clients across a broad spectrum of industries and agencies, in both government and the private sector. By taking raw data and putting it through its paces, these tools ensure that the “letter” and the “spirit” of the data remain uncompromised.
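To give a concrete flavour of what “cleaning” can involve, here is a minimal sketch in Python. This is purely illustrative (the field names and rules are our own hypothetical choices, not WinPure’s actual pipeline): it collapses stray whitespace and standardises a UK postcode’s layout before a record is handed downstream.

```python
def clean_record(record):
    """Normalise a raw contact record before downstream integration.
    Field names and rules are hypothetical, for illustration only."""
    cleaned = {}
    for key, value in record.items():
        # Collapse runs of whitespace and trim the ends
        cleaned[key] = " ".join(value.split())
    # Standardise a UK postcode: uppercase, one space before the final 3 chars
    pc = cleaned.get("postcode", "").upper().replace(" ", "")
    if len(pc) > 3:
        cleaned["postcode"] = pc[:-3] + " " + pc[-3:]
    return cleaned

raw = {"name": "  alice   smith ", "postcode": "sl41aa"}
print(clean_record(raw))  # {'name': 'alice smith', 'postcode': 'SL4 1AA'}
```

Even trivial rules like these matter at scale: two records for the same address with postcodes “sl41aa” and “SL4 1AA” would otherwise never match.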

The use of “fuzzy logic” and other advanced, intelligent data cleansing techniques ensures that the data is reliable for critical decisions, whether used for public safety or commercial interests. And the fact that these tools are widely supported on popular platforms, such as Windows XP SP2 / 2003 / Vista / Windows 7 and Windows 8, means that they are readily available as cost-effective solutions to deal with a flood of data cleansing challenges.
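For readers curious what fuzzy matching means in practice, here is a minimal sketch using Python’s standard-library `SequenceMatcher`. The 0.85 threshold is an illustrative assumption on our part, not a value WinPure prescribes; real products tune such thresholds per field and data set.

```python
from difflib import SequenceMatcher

def is_fuzzy_duplicate(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag two strings as likely duplicates when their similarity
    ratio meets the threshold. Threshold is an illustrative choice."""
    ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return ratio >= threshold

print(is_fuzzy_duplicate("Environment Agency", "Enviroment Agancy"))  # True
print(is_fuzzy_duplicate("Environment Agency", "Met Office"))         # False
```

The point is that exact string comparison would treat the misspelled “Enviroment Agancy” as a completely different organisation; a similarity ratio catches it as a probable duplicate while still rejecting genuinely different names.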

By Mike Hughes | March 4th, 2014 | Posted in Data Cleansing

About Mike Hughes

Mike has over 4 years of full-time experience in blog and content writing. He enjoys writing on a wide range of topics, all related to data management.
