Organisations across the public and private sector are increasingly using big data to make important decisions and predict demand for products and services.
As an in-house lawyer, you may find yourself involved in how your organisation uses big data and transforms its decision-making. This is especially likely if you're responsible for IP or data protection advice and compliance, or if you review the contracts that will commit your employer to receiving or providing services of this kind.
How big data informs big decisions
More and more areas of our lives are overseen and governed by algorithms based on very large data sets.
Big data can now support everything from approving loan applications to identifying children at risk of abuse. And there's so much of this data that, by some estimates, 90% of the world's electronic information didn't exist two years ago.
Thanks to the exponential and continual increase in computing and network power, cloud computing and the Internet of Things (IoT), we're generating massive amounts of data every day.
Furthermore, advances in artificial intelligence and machine learning have revolutionised the field of analytical statistics, enabling millions of potential correlations to be analysed in just a few hours.
This is the big data and analytics revolution. Our challenge is to transform this raw data into useful and actionable information.
Defining big data
Big data is a term applied to data sets too large and complex to process with traditional tools, like spreadsheets. It also implies the use of advanced predictive analytics tools, algorithms and methods from nonlinear systems to reveal relationships and dependencies or predict outcomes and behaviours.
The main components of a big data ecosystem are:
- Analysis techniques, such as A/B testing, machine learning and natural language processing;
- Management of business intelligence via cloud computing and databases; and
- Visualisation tools such as charts and graphs to display the results of analysis.
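To make the first of these techniques concrete, A/B testing at its simplest compares conversion rates between two variants using a two-proportion z-test. The sketch below uses only Python's standard library; all figures are invented for illustration:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B convert differently from variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                                # z-score

# Hypothetical trial: variant A converts 200 of 5,000 visits; variant B, 260 of 5,000.
z = ab_test_z(200, 5000, 260, 5000)
# |z| > 1.96 means the difference is significant at the 5% level.
```

On these invented numbers the z-score comes out at roughly 2.86, so the difference between the variants would be treated as statistically significant.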
A real-world example
In 2016, the multinational conglomerate GE spent over $1bn analysing data from sensors on gas turbines, jet engines, oil pipelines and other machines. To make sense of its findings, the company is building a cloud-based platform called Predix, which combines its own information flows with customer data. Analytics software then helps the business reduce costs and increase uptime for its clients through vastly improved predictive maintenance.
As part of the project, GE is hiring several thousand software engineers and data scientists and retraining tens of thousands of salespeople and support staff. This represents a major shift in GE's business model from product sales and service licences to outcomes-based subscription pricing.
With this investment in big data and analytics, GE aims to triple sales of software products to around $15bn by 2020.
Challenges facing big data
While rich in opportunity, big data also presents big challenges. As well as dealing with issues arising from cloud computing and the IoT, businesses looking to harness its power will have to grapple with:
- Data quality. As with all automated systems, the quality of any analysis is only as good as the quality of the underlying data. For example, when analysing the performance of a production line, it’s critical to consider invisible factors such as machine degradation, component wear and quality of raw materials as well as volume and quality of output;
- War for talent. Big data is a relatively new area and as yet, qualified specialists in the field are in short supply. Nevertheless, organisations will need large numbers of technical specialists who can conduct data analysis and make insights actionable;
- Organisational capabilities. Senior leaders in organisations will need to understand the outputs from big data analytics. More importantly, they’ll need to be willing and able to transform organisation-wide decision-making processes;
- Decision-making discretion. Big data evangelists believe it can provide the answers to tough social questions, such as which criminals to give the longest sentences or which teachers to fire. The underlying belief is that the decisions are better because, being based solely on data, they’re devoid of human bias or misjudgement. However, the data is only as useful as the statistical quality of the model we apply it to. In addition, supposedly objective models can exacerbate inequalities by mathematically codifying pre-existing biases and opinions; and
- Overfitting to the past. Most big data models use historic data to make predictions about future outcomes. For this reason, it’s critical to adapt the model when conditions change. One of the reasons many polling models failed in the Brexit referendum and the 2016 US Presidential Election was that pollsters had failed to capture, and therefore factor in, changes in registered voter demographics. Being too finely tuned to past data makes predictive models vulnerable to even small changes in prevailing conditions.
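The polling example in the last point can be illustrated in a few lines of code. The sketch below fits a simple trend line to hypothetical "historic" turnout figures, then shows how a shift in conditions leaves the extrapolated prediction well off the mark (all numbers are invented):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical historic turnout (%) over five past elections: a steady 1-point rise.
years = [0, 1, 2, 3, 4]
turnout = [60, 61, 62, 63, 64]

a, b = fit_line(years, turnout)
predicted = a * 5 + b    # the model extrapolates the old trend: 65%
actual = 72              # demographics shifted; the model misses by 7 points
error = actual - predicted
```

The model is perfectly tuned to the past data, yet the moment the underlying conditions change, its prediction fails, which is exactly the vulnerability described above.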
Big data and analytics are here to stay. Businesses, like the public sector, are generating data at an exponential rate. They can use this data to improve their products and services, predict customer demand and protect their organisations through better decisions. However, like all automated processes, computer analytics are only as good as the underlying data they use.