Big Data Platforms Enable Racial Profiling and the Exploitation of the Most Economically Vulnerable Groups in Society
Once upon a time, people celebrated the Internet as promising a new era in which shoppers, invisible on the web, could not be judged by their race or otherwise discriminated against. In reality, however, online behavioral targeting can combine a home address with a handful of other characteristics to create an almost perfect proxy for race. If anything, such online discrimination can be more vicious for its subtlety and invisibility, since customers do not even know what prices are being offered to people of other races or socioeconomic circumstances. Nor is it clear that current laws could fully address such harms even if they were made visible: as George Mason University professor Rebecca Goldin asked in a 2009 article, what would be the legal status if banks used “the kind of music one buys to determine his or her loan rate?”[i]
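The proxy effect described above can be illustrated with a toy sketch. The data, zip codes, and group labels below are entirely invented; the point is only that in a residentially segregated market, a "race-blind" attribute like zip code lets a marketer guess a customer's group with high accuracy:

```python
# Toy illustration (invented data): how a non-racial attribute such as a
# zip code can act as a near-perfect proxy for race when neighborhoods
# are segregated. Groups "A" and "B" are hypothetical labels.
from collections import Counter, defaultdict

# Hypothetical consumer records: (zip_code, group).
records = [
    ("02101", "A"), ("02101", "A"), ("02101", "A"), ("02101", "B"),
    ("02139", "B"), ("02139", "B"), ("02139", "B"), ("02139", "A"),
]

def proxy_accuracy(records):
    """How well does zip code alone 'predict' group membership?
    For each zip code, guess its majority group; report overall accuracy."""
    by_zip = defaultdict(list)
    for zip_code, group in records:
        by_zip[zip_code].append(group)
    correct = sum(Counter(groups).most_common(1)[0][1]
                  for groups in by_zip.values())
    return correct / len(records)

print(proxy_accuracy(records))  # 0.75 on this toy data; near 1.0 in highly segregated data
```

The more segregated the underlying neighborhoods, the closer this number climbs to 1.0, which is why no explicit racial field is needed for racially disparate targeting.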
Such online “weblining” has been well documented. Along with the location-based price discrimination discussed above, companies listing houses for sale, including Wells Fargo, have collected the zip codes of online browsers and directed those buyers toward neighborhoods of similar racial makeup.[ii] This online discrimination parallels the broader reality of Wells Fargo illegally steering an estimated 30,000 black and Hispanic borrowers into more costly subprime mortgages between 2004 and 2009, or charging them higher fees than comparable white borrowers.[iii]
As ColorLines magazine has noted, a "user’s browsing history, their location and IP information…combined with information available in Google’s public data explorer (including US census, education, population, STD stats, and state financial data) presumably could also be folded into the personalized search algorithm to surmise a lot more than your race."[iv] Latanya Sweeney described in an academic article how, on sites detailing legal information about individuals, when people searched for a name "on the more ad trafficked website, a black-identifying name was 25% more likely to get an ad suggestive of an arrest record.”[v]
What is disturbing is that people online can find themselves losing opportunities as their ongoing behavior or interests lump them in with the “wrong” racial or other group in the algorithms of big data platforms. For example, Kevin Johnson, a condo owner and businessman, found on returning from his honeymoon that his credit limit had been lowered from $10,800 to $3,800. The change was based not on anything he had done but, according to a letter from the credit card company, on the fact that he had shopped at stores whose patrons “have a poor repayment history.”[vi] If your habits associate you with particular categories or groups, you will invisibly find opportunities opening up or closing down based on how data algorithms choose to place you. Similarly, whether you get a refund when complaining to a company will often be heavily influenced by the categories in which data analysis places the caller.
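A minimal sketch can make this kind of guilt-by-category scoring concrete. Everything below is hypothetical: the store names, repayment rates, threshold, and cut factor are invented to mirror the Johnson anecdote, not drawn from any actual issuer's model:

```python
# Hypothetical sketch of category-based behavioral scoring: a cardholder's
# limit is cut not for anything they did, but because the stores they
# visited have patrons with poor average repayment rates.
# All names and numbers are invented for illustration.

store_patron_repayment = {          # share of each store's patrons who repay on time
    "HomeGoodsMart": 0.95,
    "DiscountDepot": 0.60,          # store whose patrons "have a poor repayment history"
}

def adjusted_limit(base_limit, stores_visited, threshold=0.75, cut_factor=0.35):
    """Cut the credit limit if the average repayment rate of the
    visited stores' patrons falls below the threshold."""
    avg = sum(store_patron_repayment[s] for s in stores_visited) / len(stores_visited)
    return round(base_limit * cut_factor) if avg < threshold else base_limit

print(adjusted_limit(10800, ["HomeGoodsMart"]))   # limit unchanged: 10800
print(adjusted_limit(10800, ["DiscountDepot"]))   # limit slashed to 3780, through no act of the cardholder
```

The cardholder's own repayment record never appears in the calculation; only the statistical company they keep does, which is exactly the opacity the example above describes.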
For less ethical companies, big data gives them the ability to seek out the most vulnerable prospects to exploit and entice them with scams and misleading offers. Such niche scams and economically exploitive relationships can be focused on those most vulnerable to the scam’s appeal, while remaining essentially invisible to everyone else, including reporters and researchers trying to evaluate the harms from online advertising methods.
The data broker industry even has a term – “sucker lists” – for the rosters of poor, elderly, and less educated consumers that it compiles for unethical marketers. For example, one data broker puts people who reply to sweepstakes offers onto a list sold to advertisers as an “ideal audience for…subprime credit offers” and other enticements. Other lists include “suffering seniors” identified as having Alzheimer’s or similar maladies.[vii] The Federal Trade Commission itself has noted that when companies use a consumer’s financial status to send targeted advertisements, the practice is not covered by the Fair Credit Reporting Act so long as it does not involve specific pre-approved offers of credit.[viii]
Search advertising is especially attractive to companies looking for micro-markets of vulnerable targets for scams, since the combination of keyword searches and demographic data allows what writer Jaron Lanier calls the “ambulance chasers and snake oil salesmen” of the Internet to get targeted access to victims. The “minimalist link” of a search ad focuses on lead generation for such companies, where users self-select into the advertiser’s target group by clicking on the link.[ix] For example, one company ran advertisements for poisons and chemicals on the Google Groups page for the newsgroup alt.suicide.methods, where users were discussing how to kill themselves.
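The self-selection mechanism can be sketched in a few lines. The ads and keyword lists below are invented; the point is only that an exploitative advertiser never needs to find its victims, because only people who type the vulnerable-sounding query ever see the ad:

```python
# Minimal sketch (invented data) of keyword-targeted lead generation:
# an ad is shown only when a search matches the advertiser's keyword
# phrases, so the audience "self-selects" by searching and clicking.

ads = {
    "debt relief now!": {"payday loan", "bad credit"},      # hypothetical predatory lender
    "miracle cure":     {"chronic pain", "no insurance"},   # hypothetical snake-oil seller
}

def matching_ads(search_query):
    """Return ads with at least one keyword phrase fully contained in the query."""
    terms = set(search_query.lower().split())
    return [ad for ad, phrases in ads.items()
            if any(all(word in terms for word in phrase.split())
                   for phrase in phrases)]

print(matching_ads("payday loan with bad credit"))  # ['debt relief now!']
print(matching_ads("weather today"))                # [] -- invisible to everyone else
```

This invisibility to non-searchers is what the paragraph above means: reporters and researchers who never type the triggering queries never see the scam ads at all.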
Reflecting the broader problem of search advertising targeting the vulnerable, Google in August 2011 agreed to pay a $500 million civil forfeiture to the federal government, one of the largest in history, as part of a settlement for knowingly allowing illegal pharmacies to target users on its search engine.[x] The company had been put on notice by the government as early as 2003 that companies were selling illegal steroids and fake prescription medicine to desperately ill individuals, yet it not only accepted the ads but had staff help foreign-based pharmacies write them for maximum effectiveness. It was only when a felon, David Whitaker, collaborated with the government in a sting operation that the full extent of the company’s complicity with such scam and illegal marketers was documented, including evidence that knowledge of the arrangement went all the way up to CEO Larry Page.[xi]
[i] Rebecca Goldin, “Doting on Data,” Notices of the AMS Book Review (April 2009); http://www.ams.org/notices/200904/rtx090400483p.pdf
[ii] Andrews, ibid., p. 36.
[iii] James O’Toole, “Wells Fargo in $175M discriminatory lending settlement”, CNNMoney.com, July 12, 2012; http://money.cnn.com/2012/07/12/real_estate/wells-fargo-lending-settlement
[iv] Jorge Rivas, “Google Calls Racial Profiling Claims ‘Wildly Inaccurate,’” Colorlines, Sept. 28, 2011; http://colorlines.com/archives/2011/09/google_responds_to_preliminary_study_says_their_ads_dont_racially_profile.html
[v] Latanya Sweeney, “Discrimination in Online Ad Delivery,” Working Paper.
[vi] Andrews, ibid., p. 20.
[vii] http://online.wsj.com/news/articles/SB10001424052970204556804574260062522686326; see also Rockefeller Senate data broker report, p. i; http://op.bna.com/der.nsf/id/sbay-9ehtxt/$File/Rockefeller%20report%20on%20data%20brokers.pdf
[viii] FTC Data Brokers report, p. 25 — http://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf
[ix] John Brockman, “The Local-Global Flip, or, ‘The Lanier Effect’: A Conversation with Jaron Lanier,” Edge, Aug. 29, 2011; http://edge.org/conversation/the-local-global-flip