From Catalogs to Clicks: The Fair Lending Implications of Targeted Internet Marketing
- Source: consumercomplianceoutlook.org
Treliant Takeaway:
Treliant knows fair lending, FinTech, and marketing risk. Digital marketing, including social media and targeted online marketing, is becoming increasingly important for financial institutions. Marketing credit or housing digitally can broaden your institution’s reach, but it can also increase fair lending risks. If you need assistance with assessing and managing your digital compliance risk, Treliant can help.
Article Highlights:
In the most recent issue of Consumer Compliance Outlook, “From Catalogs to Clicks: Fair Lending Implications of Targeted Internet Marketing” by Carol Evans and Westra Miller traces the development of alternatives to brick-and-mortar shopping, from the introduction of the Sears, Roebuck and Co. catalog in the 1880s through today’s online and app-based shopping patterns. The authors focus on the fair lending implications of the explosion of targeted marketing and advances in nontraditional credit scoring made possible by massive consumer data sets combined with sophisticated data mining and consumer scoring algorithms. Using these technologies, lenders can track consumers’ online activities, predict future shopping patterns or customer value, and assess creditworthiness in new and unfamiliar ways.
Although some consumers may appreciate greater personalization of their online banking experiences or the potential to increase fair access to credit and housing, others have raised concerns about privacy and consumer protection, including fair lending. Fair lending risks inherent in targeted digital advertising include digital redlining and steering, among others. One type of fair lending concern is exemplified by a 2019 consent order between a large social media platform and the U.S. Department of Housing and Urban Development (HUD). The platform permitted advertisers to include or exclude users from advertising audiences based on demographic characteristics such as age and sex, as well as personal interests that might serve as proxies for race, sex, age, familial status, or other protected characteristics. In addition, the company varied the pricing of advertising based on target audience characteristics, including characteristics protected under ECOA and the FHA, or proxies for them.
For lenders trying to avoid similar violations, the article offers helpful advice. First, lenders should be aware of the fair lending risks posed by algorithm-driven marketing. Use of marketing models, whether vended or internally developed, must be accompanied by appropriate controls to ensure that the data used to train the models are complete and accurate and do not reflect past discrimination.
Second, lenders using online advertising services or platforms should monitor ad placement criteria to determine whether any audience filters, placement criteria, or placement-service algorithms could result in targeting advertisements based on prohibited characteristics or proxies for such characteristics. Similarly, audience reports should be reviewed to ascertain whether the advertising audience is skewed by personal or neighborhood characteristics.
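As a purely illustrative sketch (the article prescribes no particular tooling or thresholds), the Python snippet below shows one simple way a compliance team might flag skew in an audience report by comparing each group's share of the delivered audience to a benchmark share. The field names, sample figures, and 10-percentage-point threshold are hypothetical assumptions, not drawn from the article or from any platform's actual reporting format.

```python
# Illustrative only: flag demographic skew in an ad audience report.
# The report structure, sample values, and threshold below are
# hypothetical assumptions for demonstration purposes.

AUDIENCE_REPORT = [
    {"group": "Group A", "audience_share": 0.62, "benchmark_share": 0.48},
    {"group": "Group B", "audience_share": 0.25, "benchmark_share": 0.37},
    {"group": "Group C", "audience_share": 0.13, "benchmark_share": 0.15},
]

SKEW_THRESHOLD = 0.10  # flag gaps of 10 percentage points or more


def flag_skewed_groups(report, threshold=SKEW_THRESHOLD):
    """Return (group, gap) pairs whose delivered-audience share differs
    from the benchmark share by at least `threshold` in absolute terms."""
    flagged = []
    for row in report:
        gap = row["audience_share"] - row["benchmark_share"]
        if abs(gap) >= threshold:
            flagged.append((row["group"], gap))
    return flagged


if __name__ == "__main__":
    for group, gap in flag_skewed_groups(AUDIENCE_REPORT):
        print(f"Review: {group} is over/under-represented by {gap:+.1%}")
```

Any flagged group would simply be a prompt for further review of the placement criteria, not a conclusion that discrimination occurred.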
Finally, lenders should take steps to ensure that each consumer applying for credit receives the best terms and conditions for which he or she qualifies, regardless of application channel or marketing platform.
For more information on the potential risks of big data, algorithm-driven decision-making, and digital redlining, check out these offerings in the Treliant Knowledge Center:
- Big Data, Machine Learning, and Bias – The Consumer Protection Implications of Emerging Technologies
- Digital Banking Raises the Specter of Digital Redlining
- Insurtech: Managing Risks to Consumer Privacy, Information Security, and Fairness
- Viral Credit Card Discrimination Claims Offer Lessons in Modern Risk Management