
Data Is A Two-Edged Sword: How Startups Can Balance Privacy And The Need To Know


How much do you reveal about yourself online? And how much do you want everyone (the world) to know?

These questions have been part of the zeitgeist recently with the revelation of Facebook’s emotion study: in 2012, the company experimented on nearly 700,000 users without their knowledge, adjusting the items in their news feeds to test the effect on their emotions (happy and sad).

And one of the biggest stories of the last decade, which still looms, is the National Security Agency story, in which Edward Snowden revealed that the U.S. government was monitoring data collected by big tech companies.

When these breaches happen, the work that comes to mind for many people is Dave Eggers’ novel “The Circle,” which debuted in 2013.

It’s a sort of science-fiction tale, but really more the author’s imagining of what our collective near future could look like, with a giant data-hungry, always-knowing, always-watching, wants-to-be-friends-with-all corporation as the main setting, where forced sharing of all information is the norm.

Though it is dystopian fiction, the book is at times so close to being real that it reveals some truths about how we treat each other in a digital age -- how we treat privacy, data, the right to know, and the right to opt out.

The book, and recent news events, raise important questions: Who owns data, and should it be protected?

They also raise questions about where we are headed, as apps, smartphones, laptops, and all of our other devices constantly collect and store our personal data.

Is our data safe? Will the results of that information be made public? In the digital age of wanting to know everything -- is knowing everything a good thing? What’s the balance between privacy and free platforms?

When we opt in, are we opting in forever?

And, honestly, will the majority of users gladly give away all privacy in order to keep free platforms free (i.e., here’s all my data, please do whatever you want with it, but keep Facebook free)?

In the future, will there be an “off-the-grid” or unplugged time? Or will we always be “on?” More grandly, what do human rights look like in a digital age?

The future, and our present, is fraught with all of these questions. And those who handle data have ethical choices to make all the time.

As so many new startups form in the big data space, it will be interesting to see how a multitude of very young companies decide to handle other people’s data as they collect it.

How Big Data Works

As for how “big data” works, check out the short video below, which explains the concept. Big data is just data. A lot of it. Massive amounts of digital footprints we leave behind while we’re on the Internet.

We produce more than 2.5 quintillion bytes of data per day -- about 2.5 exabytes, a number that’s hardly imaginable.

That data can be (and is) collected and stored, and it can be analyzed to make predictions, build new products, and so forth.
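
To make that pattern concrete, here is a minimal sketch in Python of the basic loop behind most data-driven products: log raw events, aggregate them, and use the aggregate to make a simple prediction. The event fields and the “prediction” rule are purely illustrative and aren’t drawn from any product mentioned in this article.

```python
from collections import Counter
from datetime import datetime

# A tiny, made-up sample of the kind of digital-footprint events an app might log.
events = [
    {"user": "u1", "page": "running-shoes", "ts": datetime(2014, 7, 1, 9, 15)},
    {"user": "u1", "page": "running-shoes", "ts": datetime(2014, 7, 2, 8, 50)},
    {"user": "u1", "page": "marathon-training", "ts": datetime(2014, 7, 3, 7, 30)},
    {"user": "u2", "page": "mortgage-rates", "ts": datetime(2014, 7, 1, 20, 5)},
]

def top_interest(user_id, event_log):
    """Aggregate one user's page views and return the topic they view most."""
    pages = Counter(e["page"] for e in event_log if e["user"] == user_id)
    if not pages:
        return None
    topic, _count = pages.most_common(1)[0]
    return topic

# A crude "prediction": recommend more of whatever the user looks at most often.
print(top_interest("u1", events))  # -> running-shoes
```

Real systems do this at vastly larger scale, but the underlying idea -- turning a stream of footprints into an inference about a person -- is the same.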

The upside to big data is that the inferences and knowledge that can be gained from it are tremendous.

Collecting this data could help us change the future of healthcare and how we treat diseases, education and how we teach people, and how we make the world of finance work. The data could bring about revolutions in several other big industries, improving our ways of life and business.

Responsible Data Collection And Use 

Los Angeles-based Susa Ventures, a new firm co-founded by venture capitalist Eva Ho in 2013, recently raised a $25 million seed fund to specifically invest in data-centric founders and businesses.

Previously serving as Vice President of Marketing and Operations at Factual, a company focused on real-time data, location, and mobile personalization, Ho is now seeking out the best data-driven startups to give them their beginning.

“Susa typically looks for technical founders, ideally a product manager plus a data scientist, who have experience building data stack solutions that produce radical vs. incremental change and solve meaningful societal problems,” she said, regarding what the firm is seeking in teams specifically.

As for safe data collection and usage where startups are concerned, she emphasized transparency above all else.

“Startups need to be highly responsible for protecting consumer privacy. They can achieve that by being very transparent about what data they are collecting, how often, how they plan to use it, and how it will be handled after the user is off the app or service.”
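
One way a startup might put that kind of transparency into practice is a machine-readable summary of what it collects, why, and for how long, which it can both publish and enforce in code. The sketch below is a hypothetical example -- the field names, purposes, and retention windows are invented for illustration and don’t come from Susa Ventures, Factual, or any other company mentioned here.

```python
# A hypothetical data-collection disclosure: what is gathered, how often,
# why, and how long it is retained after the user leaves the service.
DATA_POLICY = {
    "location":   {"frequency": "on app open", "purpose": "local recommendations", "retention_days": 30},
    "page_views": {"frequency": "per event",   "purpose": "product analytics",     "retention_days": 90},
    "email":      {"frequency": "at signup",   "purpose": "account login",         "retention_days": None},  # kept while the account exists
}

def retention_expired(field, age_days):
    """Return True if a stored value of `field` is older than its declared retention window."""
    limit = DATA_POLICY[field]["retention_days"]
    return limit is not None and age_days > limit

print(retention_expired("location", 45))    # True: past the 30-day window, should be deleted
print(retention_expired("page_views", 10))  # False: still within policy
```

Publishing the policy is the transparency piece; wiring checks like this into the data pipeline is what keeps the policy honest.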

Regarding privacy as a whole, she concurred that it’s a delicate matter.

“Privacy is a very complex issue that is a fine balance between control and accessibility. We live in a world where to enjoy some of the most brilliant and valuable Internet services, we have to share some of our personal data,” she explained. “With so many data capturing tools around us, from our phones to ever ubiquitous sensors, it is inevitable that data is being collected about us actively and passively.”

“As a consumer, it’s important to understand your own tolerance threshold and assess the trade-offs between data hoarding and data sharing. I am a firm believer of sharing more versus less because I think there is a great net societal positive to be more open with your data. But everyone has to decide their own boundaries.”