Data Discrimination: The Dark Side of Big Data
There’s no question that Big Data creates Big Opportunities for marketers. Deep knowledge about prospects and customers helps marketing teams create a wide variety of segmented and personalized marketing campaigns. But there is a dark side to all this knowledge: data discrimination.
The challenge facing today’s marketers isn’t using Big Data to better understand their customers and prospects; it’s knowing if, when, and where that knowledge crosses the line and becomes harmful discrimination.
Big Data Can Lead to Data Discrimination
Altimeter analysts recently discussed the top digital trends for 2016, and among them is a focus on customizing consumer experiences. Consumers generally understand that their data is being collected and that they’re being monitored at nearly every step of their purchasing journey. Whether or not they are entirely comfortable with that fact, most of them expect some payout in return, and that payout is marketing tailored to their particular needs, wants, and interests.
Big Data allows for that customized marketing approach, which is why any team that wants to make a place for itself in the marketing field is relying on Big Data more and more. The collection and analysis of Big Data are valuable to your marketing team in numerous ways, including the following:
- Big Data can guide the development of new products and services
- Big Data allows you to predict individual preferences to tailor services and offers
- Big Data supports individualized marketing
- Big Data, when properly used, can translate into significant revenue growth
With all the benefits that Big Data offers, along with the fact that it is becoming more affordable and easier than ever to collect and store more varied categories of data, it doesn’t make sense to market without the use of data analytics. It seems obvious that, with a proper approach, Big Data can benefit the marketing field on both sides – that of the marketer and the consumer.
However, the collection and analysis of Big Data are not without risks and scrutiny. As marketers rely more on Big Data for marketing, the Federal Trade Commission (FTC) is looking into the pros and cons of this practice, and recently released a report on the matter, “Big Data: A Tool for Inclusion or Exclusion?”. In this report, the FTC brings up the issues of data discrimination and disparate impact – two issues that you may not be familiar with, but which you should be if you (or your company) leverage Big Data for marketing purposes. If you don’t familiarize yourself with these topics, you could find that the consequences are dire.
Understanding Data Discrimination
Many laws and regulations are in place to protect the consumer when it comes to the collection, sharing, and selling of personal data. As the trend toward data analysis has grown, many marketers have made it a priority to educate their employees on these regulations. And, according to another of Altimeter’s reported trends – the growing prominence of brand issues around data use and privacy – that will need to remain a priority. This is a field that is ever-changing: there is currently a pending amendment to the California data privacy law that would extend privacy protection to geophysical location and biometric data. Suffice it to say, whether you manage a marketing department or own your own company, it is your responsibility to ensure full compliance with all applicable FTC regulations regarding marketing and consumer data use.
While most marketers are paying attention to these shifting privacy laws, many are less familiar with the regulations designed to prevent discrimination against and exclusion of particular groups of consumers. While the following laws may not initially seem pertinent to marketers, they are potentially applicable to Big Data practices and to the avoidance of data discrimination (that is, using personal data to discriminate against certain protected groups, whether intentionally or unintentionally):
- The Fair Credit Reporting Act: The Fair Credit Reporting Act, or FCRA, applies to the collection and selling of consumer information that may be used in decisions about credit, employment, housing, insurance, or other benefits. The Act is meant to ensure that this type of data is reported with maximum accuracy and provided only for appropriate purposes. It may apply to marketers involved in advertising credit cards or bank loans, so it is important to be cognizant of how these products are marketed. For example, Big Data analytics may exclude someone from receiving marketing for a prime-rate credit card simply because of non-traditional analytic predictors, such as a person’s zip code, relationship status, or even social media use. That could constitute a violation of the FCRA.
- Equal Opportunity Laws: There are many equal opportunity laws in place, including the Equal Credit Opportunity Act and the Genetic Information Nondiscrimination Act, all of which are meant to prohibit discrimination based on protected characteristics, such as race, gender, age, religion, marital status, etc. Again, it is important for marketers to use caution when targeting consumers based purely on information collected via Big Data. You need to be careful to ensure that your advertisements are not excluding a group of people based on characteristics that are protected under equal opportunity laws. With Big Data, it may be easy to make such a mistake, as the human connection is often lost in the mire of information.
Disparate Impact and Unintentional Data Discrimination
Another important concept for marketers to understand is that of disparate impact. If a discrimination case is brought against your company, the plaintiff would have to show evidence of disparate impact. According to the FTC report, disparate impact occurs when a company’s policies or practices have a “disproportionate adverse effect or impact on a protected class, unless those practices or policies further a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact.”
In practice, this means that your team’s use of Big Data could result in marketing campaigns that exclude protected classes based on personal data (see the prime-rate credit card example above). Just as important, marketers must understand that disparate impact does not have to be intentional for a claim to be founded.
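Disparate impact can be put in rough numbers. One common heuristic is the EEOC’s “four-fifths rule” (a regulatory rule of thumb, not something drawn from the FTC report itself): a group’s selection rate should be at least 80% of the highest group’s rate. A minimal sketch, using entirely hypothetical campaign numbers:

```python
# Sketch of the "four-fifths rule" heuristic for spotting possible
# disparate impact. The groups and counts below are hypothetical.

def selection_rate(selected, total):
    """Fraction of a group that received the offer or ad."""
    return selected / total

def disparate_impact_ratio(rate_a, rate_b):
    """Ratio of the lower selection rate to the higher one."""
    lo, hi = sorted([rate_a, rate_b])
    return lo / hi

# Hypothetical: who received a prime-rate credit card offer
group_x = selection_rate(selected=450, total=500)   # 0.90
group_y = selection_rate(selected=270, total=500)   # 0.54

ratio = disparate_impact_ratio(group_x, group_y)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.60
if ratio < 0.8:
    print("Warning: campaign may have a disparate impact on one group")
```

A ratio below 0.8, as here, is a signal to review the campaign, not a legal conclusion; whether a practice actually creates unlawful disparate impact is a question for counsel.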
In a recent article regarding digital discrimination, aimClear CEO Laura Weintraub, who has almost 20 years of prior experience as an attorney, says that marketers need to understand the legal landscape in which they’re marketing. She suggests those that need to be particularly concerned are marketers in the fields of finance, public accommodation, and housing. Further supporting this idea is past action taken by the FTC. AdAge reports that in 2008 the FTC settled with Visa and MasterCard marketing firm CompuCredit Corporation after finding that CompuCredit failed to disclose its use of a behavioral scoring model to reduce some consumers’ lines of credit.
The FTC is paying attention. Is your team complying with the law?
4 Big Data Usage Best Practices
The benefits of Big Data are too great to ignore. However, your marketing team should take steps to ensure that you reap those benefits without exposing your company to the marketing compliance risks of discrimination, exclusion, and a potentially costly lawsuit. A basic understanding of the laws that can be applied to Big Data practices is only the beginning. Following are some suggested practices that will help you target consumers without engaging in data discrimination or creating a disparate impact:
1. Make Sure You Have A Representative Data Set
When researching marketing or purchasing trends, be sure that you have an accurate and thorough representation of consumers. There are groups of consumers who may be more private about sharing information, or who may not have ready access to technology or forms of social media, but that doesn’t mean they don’t have purchasing power or shouldn’t be represented in your data.
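One concrete way to check representativeness is to compare each group’s share of your data set against a reference population (census figures, for instance). A minimal sketch, with hypothetical group labels, a hypothetical tolerance, and made-up numbers:

```python
# Sketch: flag groups whose share of the sample deviates from a
# reference population. All data and thresholds are hypothetical.

def representation_gaps(sample_counts, reference_shares, tolerance=0.05):
    """Return groups whose sample share differs from the reference
    population share by more than `tolerance`."""
    total = sum(sample_counts.values())
    gaps = {}
    for group, ref_share in reference_shares.items():
        sample_share = sample_counts.get(group, 0) / total
        if abs(sample_share - ref_share) > tolerance:
            gaps[group] = round(sample_share - ref_share, 3)
    return gaps

# Hypothetical: a purchase-intent data set skewed toward younger,
# social-media-active consumers
sample = {"18-34": 700, "35-54": 250, "55+": 50}
census = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

print(representation_gaps(sample, census))
# {'18-34': 0.4, '35-54': -0.1, '55+': -0.3}
```

Here the 55+ group is badly underrepresented, so conclusions drawn from this data would undercount their purchasing power.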
2. Account for Biases in Your Data Model
Nearly all forms of data collection have hidden biases, which is why the human component of data analysis is so important. You should look at collected data with the understanding that some bias exists. Further, examine your data model and try to identify and eliminate potential sources of bias. For example, is there an inherent bias (monetary or ethnic, say) stemming from a limited data set? If so, see the previous recommendation.
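One bias worth auditing for is a proxy variable: a seemingly neutral feature (like the zip code example earlier) whose values split sharply along a protected attribute. A minimal sketch of such a check, with hypothetical feature names, records, and threshold:

```python
# Sketch: flag features that may act as proxies for a protected
# attribute by comparing group composition across feature values.
# All records, names, and thresholds below are hypothetical.
from collections import defaultdict

def group_rates_by_feature(records, feature, protected):
    """For each value of `feature`, compute the share of records in
    the protected group. A large spread suggests a proxy variable."""
    counts = defaultdict(lambda: [0, 0])  # value -> [protected, total]
    for rec in records:
        counts[rec[feature]][1] += 1
        if rec[protected]:
            counts[rec[feature]][0] += 1
    return {value: p / t for value, (p, t) in counts.items()}

records = [
    {"zip_tier": "A", "protected": False},
    {"zip_tier": "A", "protected": False},
    {"zip_tier": "A", "protected": True},
    {"zip_tier": "B", "protected": True},
    {"zip_tier": "B", "protected": True},
    {"zip_tier": "B", "protected": False},
]
rates = group_rates_by_feature(records, "zip_tier", "protected")
spread = max(rates.values()) - min(rates.values())
if spread > 0.2:  # hypothetical audit threshold
    print("zip_tier may be acting as a proxy for the protected attribute")
```

If a feature trips a check like this, targeting on it can exclude a protected class even though the feature itself looks innocuous.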
3. Continually Test Data Predictions
Big Data predictions are often accurate, but they are not flawless. Keep track of your data predictors and monitor their accuracy. If your marketing decisions rest on analysis and predictors that keep missing the mark, make changes.
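Monitoring is most useful when accuracy is broken out per segment, so a predictor that works well overall but fails for one group gets noticed. A minimal sketch, with hypothetical segments and results:

```python
# Sketch: track predictor accuracy per consumer segment so that a
# model failing one group is flagged. All data below is hypothetical.

def accuracy_by_segment(predictions):
    """predictions: list of (segment, predicted, actual) tuples.
    Returns the fraction of correct predictions per segment."""
    totals, hits = {}, {}
    for segment, predicted, actual in predictions:
        totals[segment] = totals.get(segment, 0) + 1
        hits[segment] = hits.get(segment, 0) + (predicted == actual)
    return {seg: hits[seg] / totals[seg] for seg in totals}

# Hypothetical campaign log: did the targeted consumer convert as predicted?
results = [
    ("urban", True, True), ("urban", True, True), ("urban", False, False),
    ("rural", True, False), ("rural", True, False), ("rural", False, False),
]
print(accuracy_by_segment(results))  # urban: 1.0, rural: ~0.33
```

A split like this (strong for one segment, weak for another) is exactly the kind of signal that should prompt a review of the underlying predictors.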
4. Put Fairness Above Analytics
If you are making marketing decisions based on Big Data and you are unsure whether there are ethical issues regarding exclusion or discrimination, it is always best to err on the side of caution. Being a fair and ethical marketer is best not only for your consumer but also for your reputation as a brand.
Don’t Let Big Data Lead to Data Discrimination
Big Data has made a huge impact on the marketing world, and will continue to do so. While there are risks involved with the use of Big Data, there is also the potential for significant rewards. Your ultimate goal should be to use data and analytics in such a way that you can focus on your target consumer without excluding or discriminating against others. As in most areas of marketing compliance, this just comes down to using common sense, good judgment, and transparency, to ensure that your campaigns are on the up-and-up.