How your personal data can create injustice
A purchase price based on your spending habits, your neighborhood, or the amount of credit available to you? Unbelievable, right? But it may soon be reality. Price discrimination is possible because of the data points about you that are available to, and used by, many companies, sometimes in unexpected ways.
I was recently in Berlin meeting with other attorneys who work on data protection issues. During a casual conversation, one of these colleagues, let’s call him Jan, mentioned that he had been booking a family vacation online. Despite several attempts, Jan was unable to complete the booking. He then logged on to his bank’s credit card account, simply increased his spending limit, and was able to book the trip. The interesting detail: the price was now lower, significantly lower, than before, because Jan had a higher spending limit. A crazy, yet telling, example of profiling.
In general, profiling means using someone’s personal data in an automated process to analyze or predict certain things about that person, including his or her economic situation. The concept is not new. In the United States, the practice of “redlining” began in the 1930s, when banks refused housing loans to individuals based on their neighborhood rather than their actual ability to repay the loan. Those affected were mostly poor and black. US law began to address this injustice only decades later.
The problem is obvious: pricing that rewards those with more income or credit creates economic injustice. While it may seem counter-intuitive, the practice is not uncommon in the US, and Jan’s travel operator apparently acted similarly. His case is a lighthearted one with a “happy” outcome: a lower price for his family vacation because he had a higher credit limit. But the vast amount of personal data readily available increases the possibility of profiling, and the consequences can be far-reaching, unexpected and not favorable to everyone.
Although the internet was born from the idea of openly shared technology, it has quickly become a profitable marketplace. Technology creates marvelous conveniences that we often use “free” of charge. But nothing is really free: the price of these innovative conveniences is often our personal data. Most of us do not think twice before clicking “accept” or “agree” to the long, complicated terms, vague, broad and rarely read, that stand between us and the cool new app or website feature. But we should.
The Right to Choose
When data privacy or data protection comes up, people often say “I have nothing to hide” or “I am not so interesting that anyone cares what I do” or “there is no such thing as privacy, it’s all out there anyway.” All of those statements may be true. However, many people who believe this may not have considered how something as basic as their location can have a significant impact on another realm of life.
Consider one situation in the US where geolocation was used. Geolocation, determining where you are based on your cellphone usage, can be used to offer discounts at nearby shops or to give you directions to your next appointment. That same location information, however, may also be exploited in other ways.
According to various reports, the American advertising company Copley Advertising LLC used a sophisticated form of geolocation to send advertisements to cellphones that came near 140 health clinics in several US cities. The advertisements urged women to consider alternatives to abortion; entities opposed to the procedure paid for the ads. In April 2017, the Massachusetts Attorney General reached a settlement agreement with Copley prohibiting the company from using this form of geo-targeting (more precisely, geo-fencing) near health facilities in Massachusetts. The topic of abortion and the appropriate uses of geolocation are beyond the scope of this article. The point is that our cellphones and online activities generate incredible amounts of data, which companies and governments have the capacity to use in ways we may neither expect nor agree with. Many technological innovations do not consider an individual’s right to choose what information to share, with whom, and under what circumstances.
Technology is ahead of the law
Citizens and residents of the European Union will gain protections against profiling and certain geolocation uses in May 2018, when the General Data Protection Regulation (GDPR, Art. 4(1) & (4)) takes effect. Under the GDPR, decisions with legal consequences for individuals may not be based solely on profiling (Art. 22(1)), and users must be informed when profiling is applied (Art. 13(2), 14(2), 15(1), 21(1) & (2)). Location data is included in the GDPR’s definition of personal data, which means that where an individual is located will have the same protections as other types of personal data. Interested entities on both sides of the Atlantic are currently struggling with how to implement the GDPR appropriately. But while regulation and laws can help prevent or correct data-based injustices, they alone will not stop them entirely.
Technology, innovation and progress will continue at an intense pace, particularly in times of globalization and digital transformation. Technology will keep pushing us forward; in turn, we must be aware and knowledgeable. So before you click “accept”, stop for a minute. Read the terms or, at the very least, be aware of what personal information you are about to provide and what other information the new app or feature can derive from it. That awareness can help you make a choice, hopefully an informed one. Otherwise, you may pay a price you would not consider fair.