It’s Complicated: Our Love/Hate Relationship with Big Data

Posted Feb 19, 2019 by Gabrielle Kondracki

It’s all fun and games until someone’s data gets collected without their knowledge, or they’re left wondering where their information is going and how it will be used. That tension has only grown more important as more and more companies gain access to our personal activity through the applications we download on our phones, tablets and laptops.

So it’s no surprise that an eager audience packed the room for MIT Enterprise Forum Cambridge’s event last Wednesday, Data Ethics: Exploring Vice and Virtue in Big Data. A panel of top industry experts led an interactive session debating the flaws we are seeing in data-driven algorithms and the misuse of the data behind them. The theme of the night centered on one question: what is ethical in the world of artificial intelligence (AI) and big data? One thing is for sure: it’s not as black and white as we may have thought, and there’s a lot of work to be done.

The lively and well-prepared moderator Karen Hao, AI and social impact reporter for MIT Tech Review, guided the discussion with a diverse panel of experts, and it quickly became apparent to the audience just how slippery a slope AI and data usage can be. The enthusiastic panel included:

- John Loughnane of law firm Nutter
- Irene Chen, the only student on the panel
- Greg Woolf of Coalesce.ai
- Dan Stowell of Canopy

John Loughnane of law firm Nutter put it candidly: “with finance and technology moving at the speed of light, legal guidelines are constantly trying to catch up.” All the panelists agreed it is important to consider the implications of that gap. Irene Chen, the only student on the panel, couldn’t stress enough that while the laws lag behind, it’s up to individuals to stay vigilant about what data is being collected on them.

This led to an interesting question: just how much do we pay attention to our privacy? How often do users actually read the terms and conditions before downloading an app or checking the “I accept” box? It certainly makes one ask, “am I truly a victim if I don’t take the time to read the fine print?” Greg Woolf of Coalesce.ai explained that “privacy is a fundamental right, but the United States has been a bit more relaxed about that right,” and noted there is certainly a cultural aspect to the trend. Europe, for instance, is far more stringent about what personally identifiable information (PII) companies can collect and how they can use it, while people in China tend to be more flexible, accepting fewer safeguards around their privacy. So while it is up to users to stay alert about their privacy, it is also critical that companies collecting data provide simple, explicit terms and conditions, and that there be a level of legal oversight.

Dan Stowell explained how his company Canopy avoids compromising users’ data when they download its app. Instead of shipping personal data to a server, the app delivers personalized recommendations while the user’s personal data stays stored on their phone. As Stowell put it, “there really isn’t a lot of data a company needs on a person to know what advertisements or recommendations to make for a user.”
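To make that idea concrete, here is a minimal sketch of how on-device personalization can work. This is not Canopy’s actual code; the item, profile, and scoring details are hypothetical, but the pattern is the same: the server ships only generic candidate items, and the ranking against the user’s interests happens entirely on the phone, so the preference profile never has to leave the device.

```python
# Hypothetical sketch of on-device personalization (not Canopy's implementation).
# The server supplies candidate items with generic metadata only; the interest
# profile is built and stored locally, and ranking happens on the device.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Item:
    item_id: str
    tags: List[str]  # generic metadata shipped by the server


@dataclass
class LocalProfile:
    # interest weights kept only on the device
    weights: Dict[str, float] = field(default_factory=dict)

    def record_interaction(self, item: Item) -> None:
        # update interests locally; nothing is sent back to a server
        for tag in item.tags:
            self.weights[tag] = self.weights.get(tag, 0.0) + 1.0

    def score(self, item: Item) -> float:
        return sum(self.weights.get(tag, 0.0) for tag in item.tags)


def recommend(candidates: List[Item], profile: LocalProfile, k: int = 3) -> List[Item]:
    # rank server-supplied candidates entirely on the device
    return sorted(candidates, key=profile.score, reverse=True)[:k]


if __name__ == "__main__":
    profile = LocalProfile()
    profile.record_interaction(Item("read-1", ["privacy", "policy"]))

    catalog = [
        Item("rec-1", ["privacy", "ai"]),
        Item("rec-2", ["sports"]),
        Item("rec-3", ["policy", "ai"]),
    ]
    print([item.item_id for item in recommend(catalog, profile, k=2)])
```

In a pattern like this, the only thing that would ever need to travel back to a server is, at most, the final selection a user makes, never the profile that produced it.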

The moderator’s closing question to the panel was, “what can users do to push for data privacy?” The overall tone seemed optimistic. Since there isn’t much a user can do to reverse all the data that has already been collected over the years, attention needs to shift to the algorithms. As AI and machine learning continue to grow rapidly and increasingly make decisions for users, people need to understand how impactful those decisions can be. The panel couldn’t emphasize enough that users cannot assume these data issues will get solved without them; staying educated and aware of what data you are sharing is the first step toward change. As we learn more and become more tech-savvy, we may be able to reclaim some of our data and take back our information.

If there is anything we learned from this data ethics event, it’s that data privacy isn’t yet a concrete concept, and until it becomes one, it’s crucial to be more aware of what we’re allowing the public to see. Data collection starts with users allowing their data to be collected, and all the panelists agreed: it’s up to us to control what we share.