3 Big Problems with Big Data and How to Solve Them

According to Yael Eisenstat, former head of Facebook's elections integrity operations, cognitive bias training is key, along with time, better Data Science, and bigger, cleaner input data.

Implementing algorithms and analytics without accounting for enough variables is what causes biased results in the first place.
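A classic illustration of how leaving out a variable skews conclusions is Simpson's paradox. The sketch below uses made-up counts (the numbers and labels are purely illustrative, not from the article): each subgroup favors treatment A, but an analysis that ignores the grouping variable favors B.

```python
# Illustrative (made-up) counts: (successes, total) for two treatments,
# split by a confounding variable ("case severity") that the naive
# aggregated analysis below ignores.
groups = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},
    "severe": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Per-group comparison: treatment A wins in BOTH severity groups.
for name, g in groups.items():
    assert rate(*g["A"]) > rate(*g["B"])

# Aggregated comparison that drops the severity variable:
# B now looks better overall -- the conclusion flips (Simpson's paradox).
agg = {t: (sum(groups[g][t][0] for g in groups),
           sum(groups[g][t][1] for g in groups)) for t in ("A", "B")}
print(f"overall A: {rate(*agg['A']):.2f}, overall B: {rate(*agg['B']):.2f}")
assert rate(*agg["B"]) > rate(*agg["A"])
```

The flip happens because treatment A is applied disproportionately to the harder cases; dropping that variable from the analysis produces a confident but wrong conclusion.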

So ongoing critical evaluation, quality control and course correction (plus not over-relying on technology) go a long way in keeping Big Data on the right track here.

Aside from the "garbage in, garbage out" issues leading to biased and incomplete conclusions, another weakness, and another priority, is Big Data security.

A slew of big online leaks and wide-scale breaches in recent years (the US Citizen Records leak and the Yahoo! email breach, to name a few) demonstrate that when dealing with Big Data, the usual methods of security protection aren't enough.

Big Data and the algorithms that operate on it are very complex, and the more complex a system, the more potential weak spots it has that can be exploited.

There's still a long way to go before there are clear regulations on data collection and on proceedings in the aftermath of a breach.

Meanwhile, the experts from a1qa recommend that organizations relentlessly test all software that is built on or works with Big Data.

Such testing should be aimed at finding weak spots and verifying whether open-source software like Hadoop is breach-proof; it should go hand in hand with employing attribute-based encryption and other measures of digital protection.
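Attribute-based encryption requires specialized libraries, but a simpler, standard-library-only protective measure in the same spirit is keyed pseudonymization of personal fields before records enter a pipeline. This is a minimal sketch, not the article's recommended implementation; the key name, storage assumption, and record fields are all hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret; in practice it would live in a secrets manager,
# not in source code. Name and value are assumptions for this sketch.
SECRET_KEY = b"rotate-me-and-store-me-securely"

def pseudonymize(value: str) -> str:
    """Replace a personal identifier with a keyed, irreversible token.

    HMAC-SHA256 keeps tokens stable, so joins and deduplication still
    work downstream, while the raw value never reaches bulk storage.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchase_total": 42.50}
safe_record = {**record, "email": pseudonymize(record["email"])}

assert safe_record["email"] != record["email"]                    # raw PII removed
assert pseudonymize("jane@example.com") == safe_record["email"]   # token is stable
```

The design choice here is determinism: unlike random tokenization, a keyed hash lets separate datasets still be linked on the pseudonym, at the cost that anyone holding the key can recompute tokens for known inputs.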

Altogether, this should be par for the course in ensuring the cybersecurity of an enterprise's Big Data.

One step away from potential attacks and breaches of security, the question of Big Data privacy—personal data privacy in particular—has never been more relevant.

Since nobody reads terms of service, billions of users voluntarily provide app makers, device manufacturers, social networks, and various businesses around the globe with unending and uncontrolled streams of personal information.

While extremely beneficial for marketing and research purposes, this also creates opportunities for intrusive, unethical use of Big Data by those who collect and process it.

This is a serious concern for governments and the general public alike.

So for the past couple of years, new regulations and data privacy laws have been emerging around the globe.

The trouble, however, is that these laws and initiatives are far from settled, which leads to an avalanche of legal and ethical questions.

Where should we draw the line between user data privacy and corporate use? Will region-specific data regulations fracture and change the Internet as we know it? Will that lead to the unprecedented dominance of region-locked content, software, and hardware? Will smaller international businesses be able to keep up? These and other concerns are at the forefront of today's Big Data legal discussions.

At large, the emerging data privacy laws aim to provide a more ethical environment and much-needed transparency to user data processing.

But it’s important to understand that they are a work in progress.

There's a lengthy path of trial and error ahead where businesses are concerned, and legislators are likely to miss the mark on their first few tries.

Those who collect and process Big Data should have boundaries, obviously.

But these boundaries shouldn't be overly restrictive, nor should they leave businesses (especially SMEs) with no choice but to quit a particular regional market altogether.

Creation of a global standard with internationally agreed-upon core principles would also be an immense boon.

Such a standard should take existing conditions and nuances into account, come with clear-cut guidelines, and allow sufficient transition periods for applying the needed changes.

Final Thoughts

When it comes to Big Data, it's easy to get overwhelmed by its endless exciting possibilities.

Nevertheless, critical assessment, the understanding of shortcomings and vulnerabilities (technological, ethical and legal), as well as strategies to address them should be at the core of any Big Data implementation.

Bio: Elena Yakimova is the Head of Web Testing Department at software testing company a1qa.

She started her career in QA in 2008.

Now Elena’s in-house QA team consists of 115 skilled engineers who have successfully completed more than 250 projects in telecom, retail, e-commerce, and other verticals.

