Saturday, August 26, 2023
Last week, Rohit Chopra, the director of the Consumer Financial Protection Bureau (“CFPB”), shared remarks at a White House roundtable convened to address “harmful data broker practices.” Referring to data brokers as part of the “surveillance industry,” Chopra announced forthcoming rulemakings intended to “ensure that modern-day digital data brokers are not misusing or abusing our sensitive data” through artificial intelligence and “other predictive decision-making” technologies. The proposed rule(s), which he forecast would be available for public comment sometime in 2024, would be based upon the Fair Credit Reporting Act (“FCRA”), as well as upon responses to the CFPB’s Request For Information (“RFI,” issued in March) on the data broker industry.
A letter accompanying Chopra’s written remarks provided additional detail on the forthcoming proposed rules. The letter first asserts that many elements of the FCRA already apply to the data broker industry, and then states that the “CFPB plans to propose rules that would ensure that the public is protected from modern-day data brokers.” One of the proposed rules would require all companies using information collected from data brokers to use that information only for purposes that are authorized under the FCRA (i.e., permissible purposes). Another proposed rule would expand the definition of consumer report under the FCRA to affirmatively include data such as “a consumer’s payment history, income or criminal records.” A third proposed rule would focus on so-called “credit header data,” which primarily consists of contact information for consumers, and would similarly restrict the use of that data to purposes that are authorized under the FCRA.
In terms of how these rules would address artificial intelligence, the letter noted that commenters on the RFI observed that “the availability of highly granular data from data brokers, when combined with advanced technology like AI, can create a risky environment where surgically precise scams and fraud can flourish at scale.” In other words, the responses to the RFI warned that the data broker industry’s use of AI solutions could cause problems going forward, and the CFPB’s proposed rules seek to curtail some of those problems by applying the FCRA broadly and sharply restricting the situations in which such data can be used at all.
© Copyright 2023 Cadwalader, Wickersham & Taft LLP
National Law Review, Volume XIII, Number 238