We’re only halfway through 2018 and it’s already clear that how businesses approach big data may need to change before the year is out. News story after controversy after scandal has highlighted just how far some businesses – and their data collection and data science practices – have strayed from consumer expectations. From Cambridge Analytica abusing Facebook user data to the Commonwealth Bank losing the financial statements of almost 20 million accounts, it has not been a good year for the protection and transparent, sensitive use of personal data.

While big data can refer to more than just personal data, the latter is where businesses usually extract the greatest value. So any new regulations or technical requirements targeting a business’ ability to capture, store, process and use this data can have severe consequences for its business strategies – even its entire business model.

But let’s not get carried away: when handled correctly, big data and data science can still deliver real benefits for businesses in Australia. It’s just that the constantly changing technical, legal and ethical landscape means some strategies and practices may need revisiting – and that places new pressure on businesses to stay aware, understand the changes and implement them.
Protecting big data in Australia
One of the biggest challenges for big data in Australia is keeping so much personal or sensitive data safe and secure. Cyber security remains one of the biggest threats to businesses of all sizes in Australia today. According to the Telstra Security Report 2018, 25 per cent of Australian businesses experience an email security breach at least once a month, with another 25 per cent experiencing a phishing attack. More concerning, a staggering 76 per cent of Australian businesses were targeted with ransomware in 2017 – the highest rate of ransomware attacks in the world. Of those that fell victim to such an attack, 47 per cent paid the ransom to retrieve their data (86 per cent successfully).

With the scale and number of potential cyber threats increasing every year, the regulations around data security continue to evolve. On 22 February 2018, the Notifiable Data Breaches scheme came into force in Australia, requiring organisations to notify affected individuals (as well as the Office of the Australian Information Commissioner, or OAIC) as soon as possible following a data breach of personal information. Within six weeks of the scheme’s launch, the OAIC had received 63 breach notifications. Notably, 51 per cent of these breaches were the result of human error, while only 44 per cent were due to malicious or criminal attack.

So, while most businesses are aware of the external threats that could lead to a data breach, just as significant is the internal risk from inadequate data practices or a lack of the necessary skills. Hackers and ransomware attacks may take the headlines, but good data security begins at home.
Along with new Australian legislation such as the Notifiable Data Breaches scheme, international legal changes can also affect local data practices. On 25 May 2018, the European Union’s General Data Protection Regulation (GDPR) came into force – which is why the preceding weeks saw inboxes around the world filling up with notifications of updated privacy policies and invitations to reconfirm email subscriptions. The 260-page regulation is far more than a few tweaks to existing laws: it includes a plethora of new requirements governing everything from how individuals give consent to the capture and use of their personal data to how that data is managed, shared and more. Much, much more.

If you’re thinking “Sucks to be a business in Europe” as you spread your Vegemite, consider that it isn’t the location of the business that determines whether the GDPR applies, but the location of the individuals whose personal data is captured. In short, if your business holds the personal data of anyone living in the EU, or could capture it in the future, it may need to comply with the GDPR. That makes this European regulation effectively global. Australian businesses should at least consult an expert to understand their possible exposure and make any necessary technical or procedural changes.

In the future, we’ll likely see other localised regulations with a potentially global impact. That’s why the future of big data in Australia won’t only depend on access to experienced data scientists, but also on access to legal and technical expertise specialising in data privacy and jurisdictional issues.
Don’t abuse customer trust
Even if your business can overcome the technical and legal challenges, a practice or strategy for the use of big data in Australia may still be the wrong thing to do. According to MEF’s 4th annual Consumer Trust Study, 53 per cent of consumers believe they are not in control of how their data is used, while 39 per cent agreed with the statement: “I know that by agreeing to the terms and conditions I am giving permission, but I don’t feel I have a choice”. So it might be seen as disingenuous for a business to claim a user gave clear permission for the capture and use of their data just because they once ticked a lengthy and obscure set of terms and conditions when first signing up for a product or service.

Take the storm that consumed digital startup Unroll.me in 2017. Launched in 2011, Unroll.me promised to free users from cluttered email inboxes. Once given access to your Gmail, Google Apps or Yahoo account, the free service made it simpler than ever to unsubscribe from the growing flood of unwanted email newsletters, while combining those you did want into a single, convenient digest email. No one questioned that it needed access to everything in your inbox – how else could it do its job? The very nature of the service necessitated certain levels of access.

Yet what wasn’t so clear was that Unroll.me was owned by analytics company Slice Intelligence, which had other reasons for wanting access to your inbox: selling data to third parties. Unroll.me got into strife when its users discovered – via a New York Times article – that the startup was collecting information from emailed Lyft receipts and selling the aggregated data to Lyft’s biggest rival, Uber.
Unroll.me CEO Jojo Hedaya further angered users by publishing a blog post that tried to hide behind the less-than-transparent terms and conditions: “it was heartbreaking to see that some of our users were upset to learn about how we monetise our free service.” Even though the startup’s practice was technically and legally possible, it still did great damage to the brand’s reputation by failing to align with user expectations of what was reasonable. (Today, the Unroll.me website is far more transparent about its use of customer data.) So, when planning future big data strategies, after answering the question “Can we do this?”, businesses should go on to ask “Should we do this?”.

In short, the challenges facing big data in Australia fall into three categories:
- Technical: Data security requires specialised internal skills and processes as well as protection against external cyber threats.
- Legal: New and changing regulations – both locally and abroad – are placing fresh demands on businesses to invest in greater compliance expertise.
- Ethical: Future big data strategies need to be more transparent and aligned with community expectations, regardless of what may be technically or legally possible.