The ICO (Information Commissioner's Office) has recently updated its guidance on big data. The document looks at the implications that big data, artificial intelligence and machine learning have for data protection.
The paper supports the view that big data and analytics have an important role to play for both organisations and individuals, but the ICO stresses that privacy should not be sacrificed in the name of big data and analytics. Privacy offers an element of security against online criminals aiming to steal an individual's identity, and subsequently their money. It also provides a shield against the rise of hate crime, in which vulnerable individuals can be victimised, bullied and vilified.
The ICO recognises that data is an asset, and for a while now companies have regarded data very much as a new currency. Like currency, it needs to be protected, guarded and curated in a sensible manner. Most of the general public remains sceptical about the use of big data by companies. This was highlighted recently by the case of the NHS handing over patient records to Google, the idea being that Google would use its DeepMind artificial intelligence tool to help predict patient need. The good that could come from the project has been overshadowed by the fear that big organisations are holding and using our data.
The ICO paper addresses this point: “So a key question for organisations using personal data for big data analytics is whether the processing is fair. Fairness involves several elements. Transparency – what information people have about the processing – is essential. But assessing fairness also involves looking at the effects of the processing on individuals and their expectations as to how their data will be used.”
In my view there has to be a step change in the way individuals see their personal data and in how companies use it. The Boston Consulting Group found that 75% of consumers in most countries rate the privacy of personal data as a top issue. With that in mind, companies are going to have to earn the trust of consumers, and lawmakers are going to have to construct sensible legislation that is enforced. Ultimately it comes down to companies being open and honest about data: holding it securely, processing it sensibly for an identifiable benefit, and not selling it wholesale to the highest bidder.
Blog post by Richard Skeggs (Senior Data Manager at the Business and Local Government Data Research Centre). Please get in touch if you have any questions about the contents of this post.
Published 30 June 2017