“The oil of the 21st century”, “the fuel of the digital economy”, the “data gold rush”. There is no doubt that data plays an ever-more important role in both global society and the economy.
The former Article 29 Working Party, succeeded by the European Data Protection Board on 25 May 2018 (the date the GDPR became directly applicable in all member states of the European Union), defined big data as
“the exponential growth both in the availability and in the automated use of information: it refers to gigantic digital datasets held by corporations, governments and other large organisations, which are then extensively analysed using computer algorithms”.
Following this definition, big data encompasses both the process of collecting information and the subsequent step of analysing it. By now we all know that simply storing and owning massive amounts of data is meaningless. Value is instead generated by identifying, locating and extracting knowledge from large datasets: insights that can subsequently inform decision making and process automation, and that we often refer to as Smart Data.
Smart Data, in fact, provides us with information that determines decision-making processes for on-demand business, science and engineering applications.
In contemplating this, I have begun to look for a sound legal methodology for approaching Smart Data: one that allows for the lawful generation of Smart Data from big data, permitting us to harness the value in massive datasets through an optimal application of data protection by design.
Over the next two weeks I will therefore publish a series of blog posts exploring this concept.