January 20, 2016

FTC Report on Big Data Could Foreshadow Big Compliance Issues: Implications for Unfair Lending, Credit Reporting, and Unfair and Deceptive Practices Compliance

Arnold & Porter Advisory

Using “Big Data,” which can be loosely defined as the amassing and analysis of large consumer data sets and the incorporation of analytical results and conclusions into marketing and lending decisions, is quickly changing the ways in which both traditional and alternative lenders conduct their business. A study conducted by The Economist estimates that at least 74 percent of companies in the banking space have recently invested in new technologies to better leverage Big Data. And an emerging array of new FinTech companies now offers loan products or services based on non-traditional methods for assessing creditworthiness, largely through the use of Big Data.

Given that Big Data is quickly becoming a fixture in the consumer lending industry, it is no surprise that it has caught the eye of government regulators. Earlier this month, the Federal Trade Commission (FTC) issued a report (the Report)1 outlining its concerns with respect to how Big Data is used. In light of those concerns, companies using Big Data should consider compliance implications early and often. While the Report discusses issues that cut across almost every aspect of private sector use of Big Data, in this Advisory we highlight key aspects of the Report that are most relevant in the context of consumer lending and the provision of consumer credit. We also provide some high-level guidance on how to approach an efficient compliance program that balances evolving regulatory requirements against the culture of innovation associated with Big Data.

The Report

The Report is the latest installment in the FTC’s campaign to provide guidance regarding the use of Big Data. On September 15, 2014, the FTC held a public workshop on Big Data to explore, in particular, “the potential impact of big data on low-income and underserved populations.”2 Earlier that year, on March 19, 2014, the FTC had hosted a related seminar on alternative scoring products.3 In May 2014, the FTC released its report entitled “Data Brokers: A Call for Transparency and Accountability,” which focused on the collection, compilation, and analytics of Big Data.4 Based on the information it had collected, the FTC released the Report, which focuses in particular on “how companies use big data to help consumers and the steps they can take to avoid inadvertently harming consumers through big data analytics.”5

While the Report recognizes that Big Data may open up avenues for providing credit to underserved and underrepresented populations, the Report also raises several concerns about potential legal pitfalls in using Big Data in connection with consumer lending. The key overarching consumer lending concerns raised by the Report include potential violations of the Equal Credit Opportunity Act (ECOA)6 and the Fair Credit Reporting Act (FCRA), as well as unfair or deceptive practices related to Big Data.

Use of Big Data May Inadvertently Open the Door to Allegations of Unfair Lending

In relying on Big Data to make determinations about creditworthiness, companies may incorporate biases that disadvantage certain customers based on characteristics protected under ECOA, such as race, religion, gender, or marital status, among others. Lenders typically understand that it is illegal to treat equally qualified borrowers differently because of, for example, their race or gender. In addition to such “disparate treatment” cases, however, federal enforcement actions have also targeted instances of so-called “disparate impact,” where the government alleges that a facially neutral policy disproportionately and negatively impacts a protected class. Just as a company would face potential liability under ECOA if it decided to lend only to women or only to married people, a lender may face regulatory actions and/or civil litigation if the metrics it uses to make credit decisions have a disproportionate impact on a protected group, even if no discriminatory intent is present. A typical example of such a policy is a lender that establishes a minimum loan amount for its mortgage lending, which may result in minority applicants being disproportionately denied loans. The same principles can be applied, however, to any data point used in a lending decision, including education, purchase histories, and even the medium through which the potential borrower communicated with the lender.
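
To make the disparate impact concept concrete, the following minimal sketch, written in Python, shows how a facially neutral minimum-loan-amount policy could be tested empirically for disproportionate effects across groups. The policy threshold, applicant data, and group labels are all invented for illustration; none come from the Report or any regulatory standard.

```python
# A minimal, hypothetical sketch of a disparate impact screen. The policy
# threshold, applicant data, and group labels below are all invented for
# illustration; they do not come from the Report or any regulatory standard.

MIN_LOAN_AMOUNT = 100_000  # the facially neutral policy: no loans below this amount

# Hypothetical applicant pool: (group label, requested loan amount).
applications = [
    ("group_a", 250_000), ("group_a", 180_000), ("group_a", 90_000),
    ("group_a", 300_000), ("group_b", 80_000), ("group_b", 95_000),
    ("group_b", 120_000), ("group_b", 70_000),
]

def approval_rate(group: str) -> float:
    """Share of a group's applications that clear the minimum-amount policy."""
    amounts = [amt for g, amt in applications if g == group]
    approved = sum(1 for amt in amounts if amt >= MIN_LOAN_AMOUNT)
    return approved / len(amounts)

rate_a = approval_rate("group_a")  # 0.75 with the data above
rate_b = approval_rate("group_b")  # 0.25 with the data above

# Compare the less favored group's approval rate to the more favored group's.
# A ratio well below 1.0 can signal a disproportionate effect worth reviewing,
# even though the policy says nothing about any protected characteristic.
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"group_a approval rate: {rate_a:.0%}")
print(f"group_b approval rate: {rate_b:.0%}")
print(f"impact ratio: {impact_ratio:.2f}")
```

The appropriate metric and threshold are themselves legal judgments to be made with counsel; the point is simply that facially neutral inputs can be tested empirically for skewed outcomes.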

The use of Big Data in targeted marketing of credit products, if not properly vetted and documented, also can result in potential ECOA violations. With respect to documentation, Regulation B (the implementing regulation of ECOA) requires lenders to maintain records regarding the criteria used to select recipients of prescreened solicitations. Lenders using algorithms or machine learning for targeted solicitations are not exempt from these requirements, despite the complexity and opacity that Big Data analytical techniques can introduce.

Even if proper documentation is maintained, the use of Big Data in targeted solicitations for credit can result in liability under ECOA if the solicitations create an environment that discourages individuals with protected characteristics from applying for credit, or if they do not target such individuals to the same degree as non-protected potential applicants. While lenders generally are not prohibited from engaging in targeted marketing of lending programs that are open to all, ECOA violations may occur if, by virtue of the solicitations, consumers are discouraged from applying for better offers. For example, if a lender were to use apparently neutral data points, such as the purchase of diapers and self-help books, to select consumers to receive offers of sub-prime credit products, such a practice could create ECOA concerns if the result was the disproportionate targeting of individuals with protected characteristics (such as single women) to receive only sub-prime credit offers. ECOA violations could result if it could be shown that individuals with protected characteristics were deterred from applying for lower-priced products because the marketing steered them to less favorable products. Alternatively, if individuals with protected characteristics targeted for sub-prime credit offers were excluded from marketing for lower-priced products, claims of ECOA violations based on disparate impact could arise even if only facially neutral characteristics were used in the marketing decisions.

Use of Third Parties To Assist in Synthesizing Big Data Can Create Additional Obligations Under FCRA

The FCRA generally applies to companies that compile and sell consumer information for use in credit, employment, housing, or other similar decisions. Under the FCRA, companies meeting the definition of a consumer reporting agency (CRA) have legal duties to ensure the accuracy of consumer reports, provide consumers access to their information, and correct errors. Lenders using information provided by a CRA to make adverse credit determinations (either denial or increased costs) must also provide notice to the consumer of the lenders’ use of the consumer report information.

Given the complexities of Big Data, lenders are increasingly looking to unaffiliated third parties to assist in managing and maximizing the potential of internally collected data. Because the FCRA does not generally apply where lenders use internally generated data in making credit decisions, lenders relying on their own Big Data for credit determinations may believe that they are exempt from FCRA requirements. However, where a lender allows an unaffiliated entity access to the company’s own data to assist in making credit determinations, that arrangement may trigger FCRA obligations. As stated in the Report, the FTC will likely view third parties offering analytical services performed on the company’s own data as meeting the definition of a CRA under the FCRA. Not only does this create legal obligations for the unaffiliated entity with respect to the data, but it also triggers lender-specific notice requirements in the event of adverse credit decisions that would not otherwise exist had the data analysis been kept in-house.

Lenders that look to companies offering new predictive solutions based on Big Data may also inadvertently trigger FCRA-related obligations. While predictive solution products aim to provide better insight into a consumer’s creditworthiness by looking at non-traditional factors, lenders opting to use these products may be subject to reporting requirements under the FCRA. If, for instance, the lender provides customer-specific information to obtain input on a credit decision, the lender will likely be obligated to provide mandatory notice to the consumer in the event of an adverse decision. If, on the other hand, the lender uses aggregate information to develop general lending guidelines, and a consumer is denied credit based on those guidelines, the lender will be required to disclose the nature of the aggregate information upon a specific request from the consumer.

Use of Big Data May Be Challenged as Unfair or Deceptive

Under Section 5 of the Federal Trade Commission Act (Section 5), lenders are prohibited from engaging in unfair or deceptive acts or practices. Big Data presents several pitfalls that may trigger liability under Section 5. Any use of Big Data that does not comport with promises or representations made to consumers regarding how their information will be used could create Section 5 exposure. Less obviously, however, maintaining Big Data without adequately securing the information, or sharing the information with other entities that use the data for fraudulent purposes, may result in allegations that the lender has engaged in unfair practices under Section 5. Such concerns also would exist under the similar prohibitions on “unfair, deceptive, or abusive” acts and practices found in the Consumer Financial Protection Act, which is enforced by the Consumer Financial Protection Bureau (CFPB).

Three Lines of Defense Approach in Light of Compliance Considerations

While the Report raises significant issues for companies in the consumer credit space, it leaves many questions unanswered, including what the FTC’s next steps will be in regulating the use of Big Data. In the absence of formal guidance on where and how the FTC will move forward with the concerns raised in the Report, lenders and others in the consumer credit space should consider how best to incorporate elements of their compliance regimes into their analytic operations in light of the concerns identified by the FTC. While the Report offers a non-exhaustive list of suggested compliance questions, reproduced in the Appendix to this Advisory, those questions leave many practical implementation and legal interpretation issues unaddressed.

A preliminary step in assessing compliance obligations is determining whether a company falls within the enforcement jurisdiction of the FTC or of another regulator with similar enforcement authority. For traditional lenders, the answer is likely self-evident, but, for emerging FinTech companies, it may be less clear. The collective jurisdiction of the FTC, the CFPB, and federal and state prudential banking regulators, not to mention state attorneys general, is quite comprehensive with respect to providers of consumer finance products. While a jurisdictional analysis is fact-specific, most companies that conduct consumer credit analysis, reporting, or underwriting will be subject to one or more consumer laws that can be enforced by one or more regulatory or law-enforcement bodies.

The compliance structure adopted by a company should be tailored to its unique facts and circumstances, including not only how it uses Big Data, but also the regulatory framework that applies to it. Nevertheless, one compliance model that is instructive for companies to consider is the “Three Lines of Defense” model, which has been widely adopted as a best-practice compliance structure, is a regulatory expectation in the banking industry,7 and highlights a significant compliance challenge in the Big Data context. Under the Three Lines of Defense model, compliance is pursued, under the supervision of the firm’s board of directors, through three channels: first are the business lines, which should conduct the firm’s business with compliance in mind; second are the firm’s risk and compliance departments, which continuously monitor the business; and third are the audit professionals, who periodically check risk and compliance programs for effectiveness.

While each of the three lines of defense can serve an important role in the compliance program, the first line is particularly important in the Big Data context. Business functions related to the use of Big Data are driven by technological platforms and technical staff (engineers, computer programmers, data scientists, etc.), and these business-line personnel may be focused primarily on their technical work. Added layers of compliance may constrain technical professionals in ways they may not fully appreciate if they are not accustomed to serving as a first line of defense; indeed, many professionals working with Big Data in the consumer credit space come from significantly less heavily regulated industries. Because of the highly technical nature of Big Data use, it is critical that technical professionals be sufficiently educated and empowered to incorporate the compliance function into their day-to-day roles. Compliance considerations should be explained to the business lines beginning at the product development stage and continuing throughout the lifecycle of the product, so that business-line personnel better understand the compliance aspects of their business and can incorporate compliance into their thinking and practices. They often may be the participants best positioned to identify and avoid potential compliance violations and to make appropriate modifications to the technology platform. It is also imperative that companies incorporate compliance as efficiently as possible so as not to stymie the culture of innovation, conducive to product design and operational efficiency, that made them successful in the first instance. In addition, to the extent that business-line personnel describe (in privacy policies, in FCRA disclosures, and in other regulated contexts) the uses made of Big Data, it is important for the business and technical professionals to work closely together so that statements of how Big Data is analyzed and used are accurate and complete. In this context, subject to regulatory requirements, it is essential for companies to say what they do and to do what they say.

For the second line of defense, a firm should evaluate, and where appropriate augment, its compliance and audit professionals’ knowledge of and familiarity with the technological and business aspects of the Big Data analytic models and the uses made of their results. Compliance and audit professionals (as well as the business lines using Big Data) should also be alert to the possibility that potential disparate impacts created by the use of Big Data may not be intuitive. It is therefore important to incorporate into any compliance program periodic checks to confirm how the use of Big Data is affecting consumer lending in practice, so that appropriate course corrections can be made if necessary. One illustrative form such a check might take is sketched below.
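
The following minimal Python sketch illustrates how a periodic outcomes check might screen lending results each review period and escalate disparities to the compliance function. The group labels, metrics, and review thresholds are invented for illustration; they do not come from the Report or any regulatory standard.

```python
# A minimal, hypothetical sketch of a periodic fair-lending outcomes check.
# The group labels, metrics, and review thresholds are invented for
# illustration; they do not come from the Report or any regulatory standard.

from typing import NamedTuple

class OutcomeMetrics(NamedTuple):
    approval_rate: float  # share of the group's applications approved
    avg_apr: float        # average APR offered to the group's approved applicants

# Snapshot of lending outcomes by (hypothetical) group for the review period.
outcomes = {
    "group_a": OutcomeMetrics(approval_rate=0.62, avg_apr=0.079),
    "group_b": OutcomeMetrics(approval_rate=0.41, avg_apr=0.104),
}

REVIEW_RATIO = 0.80  # flag when a group's approval rate falls below 80% of the top rate
APR_SPREAD = 1.25    # flag when a group's average APR exceeds 1.25x the lowest group average

def flag_disparities(data: dict) -> list:
    """Return human-readable flags for outcomes that warrant compliance review."""
    flags = []
    best_rate = max(m.approval_rate for m in data.values())
    lowest_apr = min(m.avg_apr for m in data.values())
    for group, m in data.items():
        if m.approval_rate < REVIEW_RATIO * best_rate:
            flags.append(f"{group}: approval rate {m.approval_rate:.0%} is below "
                         f"{REVIEW_RATIO:.0%} of the top rate ({best_rate:.0%})")
        if m.avg_apr > APR_SPREAD * lowest_apr:
            flags.append(f"{group}: average APR {m.avg_apr:.1%} exceeds "
                         f"{APR_SPREAD}x the lowest group average ({lowest_apr:.1%})")
    return flags

for flag in flag_disparities(outcomes):
    print(flag)  # each flag would be escalated to the second line of defense
```

The value of such a check lies less in any particular metric or threshold, which are judgments to be made with counsel, than in making outcome monitoring a routine, documented part of the compliance program.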

For the third line of defense, a firm should assess whether the audit professionals it engages have the requisite background and training in Big Data compliance issues. Because Big Data is a growing field pulling from a wide range of disciplines, firms should look for audit professionals that can offer substantial depth in understanding both Big Data’s applied aspects (such as the risks associated with choosing one analytical model over another) and its more self-evident technical aspects (such as coding and database management).

Conclusion

As the use of Big Data continues to expand into new areas and influence consumer lending practices, it is likely to attract increasing attention from regulators and the plaintiffs’ bar. In anticipation of this scrutiny, now is the time for companies to incorporate compliance expectations and processes into their business models and operating cultures, before regulatory issues or litigation arise. Doing so requires an efficient compliance program that balances, on the one hand, the incorporation of a compliance culture into the technology business units using Big Data as a first line of defense with, on the other hand, the burden such compliance may place on technological innovation. An efficient compliance approach must be sensitive both to Big Data’s technological culture and potential to add important business insights and efficiencies and to the evolving regulatory climate.

Appendix: FTC’s Suggested Questions for Legal Compliance

  • If you compile big data for others who will use it for eligibility decisions (such as credit, employment, insurance, housing, government benefits, and the like), are you complying with the accuracy and privacy provisions of the FCRA? FCRA requirements include (1) having reasonable procedures in place to ensure the maximum possible accuracy of the information you provide, (2) providing notices to users of your reports, (3) allowing consumers to access information you have about them, and (4) allowing consumers to correct inaccuracies.
  • If you receive Big Data products from another entity that you will use for eligibility decisions, are you complying with the provisions applicable to users of consumer reports? For example, the FCRA requires that entities that use this information for employment purposes certify that they have a “permissible purpose” to obtain it, certify that they will not use it in a way that violates equal opportunity laws, provide pre-adverse action notice to consumers, and thereafter provide adverse action notices to those same consumers.
  • If you are a creditor using Big Data analytics in a credit transaction, are you complying with the requirement to provide statements of specific reasons for adverse action under ECOA? Are you complying with ECOA requirements related to requests for information and record retention?
  • If you use Big Data analytics in a way that might adversely affect people in their ability to obtain credit, housing, or employment:
    • Are you treating people differently based on a prohibited basis, such as race or national origin?
    • Do your policies, practices, or decisions have an adverse effect or impact on a member of a protected class, and if they do, are they justified by a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact?
  • Are you honoring promises you make to consumers and providing consumers material information about your data practices?
  • Are you maintaining reasonable security over consumer data?
  • Are you undertaking reasonable measures to know the purposes for which your customers are using your data?
    • If you know that your customer will use your Big Data product to commit fraud, do not sell your products to that customer. If you have reason to believe that your data will be used to commit fraud, ask more specific questions about how your data will be used.
    • If you know that your customer will use your Big Data products for discriminatory purposes, do not sell your products to that customer. If you have reason to believe that your data will be used for discriminatory purposes, ask more specific questions about how your data will be used.
  1. FTC, “Big Data: A Tool for Inclusion or Exclusion?” (January 2016).

  2. FTC, “Big Data: A Tool for Inclusion or Exclusion?” Workshop (September 15, 2014).

  3. FTC, “Spring Privacy Series: Alternative Scoring Products,” Seminar (March 19, 2014).

  4. FTC, “Data Brokers: A Call for Transparency and Accountability” (May 2014).

  5. On January 14, 2016, the FTC subsequently held its “PrivacyCon” conference, in which, among other topics, the use of Big Data in connection with consumer privacy and data security was explored. FTC, “PrivacyCon,” Conference (January 14, 2016). Discussion topics included whether discriminatory actions might arise given the analytic design, volitional use, or machine learning of data analytic systems. See, e.g., Amit Datta, et al., “Automated Experiments on Ad Privacy Settings: A Tale of Opacity, Choice, and Discrimination,” Proceedings on Privacy Enhancing Technologies (2015).

  6. It should be noted that the ECOA concerns raised by the FTC in the Report would apply equally in the context of the Fair Housing Act (FHA), which applies to residential (e.g., consumer mortgage) lending; however, the FTC does not have enforcement authority for the FHA and does not discuss it in the Report.

  7. See, e.g., OCC, 79 Fed. Reg. 241, “OCC Guidelines Establishing Heightened Standards for Certain Large Insured National Banks, Insured Federal Savings Associations, and Insured Federal Branches; Integration of Regulations,” (December 16, 2014).