We live in a time when we are all connected in more ways than we can imagine. Our habits are under constant scrutiny: what we buy, how often we buy it, how we drive, where we go, the sites we visit on the Internet…and on and on and on. Big data is here, and it isn’t going away. The big question now is how the massive amounts of data compiled on virtually every person in the United States are handled, and whether that handling is ethical. With so much information up for grabs at any given time, there is real potential for the data to be used insidiously. And that is where the U.S. Federal Trade Commission (FTC) comes in.
To ring in the New Year, the FTC has issued a new report titled “Big Data: A Tool for Inclusion or Exclusion?”, focusing on the benefits and risks of the commercial use of big data on low-income and underserved consumers.
The report is the result of the FTC’s acknowledgement that “the proliferation of smartphones, social networks, cloud computing, and more powerful predictive analytic techniques have enabled the collection, analysis, use, and storage of data in a way that was not possible just a few years ago.” It does not seek to recommend additional legislation or regulation, but instead offers explanation and guidance about how currently existing laws and regulations apply to the challenges presented by big data.
The concern, understandably, is that the use of big data can perpetuate discrimination against those without sufficient credit, employment, or rental history, especially in situations where “inaccuracies and biases…might lead to detrimental effects for low-income and underserved populations.” In particular, the report examined concerns that the data may be used to exclude certain demographics from credit and/or employment opportunities. It was very specific about two existing bodies of law that affect employment background screening: equal opportunity laws (notably the ECOA) and the FCRA.
- Equal Opportunity Laws: Businesses have a number of equal opportunity laws to consider, but the FTC report focuses specifically on the Equal Credit Opportunity Act (ECOA), which makes it unlawful for a creditor to discriminate against applicants on the basis of race, color, religion, national origin, sex, marital status, or age. Failure to comply with the law can result in punitive damages as high as $10,000 in individual cases and the lesser of $500,000 or 1% of the creditor’s net worth in class actions. The FTC report is concerned primarily with the “disparate impact doctrine,” under which consumers may challenge credit decisions that have an “unintentional but disproportionate” adverse effect on minorities. The doctrine originally applied to housing decisions, but it also applies to credit and employment decisions affecting minorities.
- Fair Credit Reporting Act: The FTC specifically stated in the report that the FCRA is not limited to traditional consumer reporting agencies, credit bureaus, and lenders. It also applies to so-called “data brokers,” particularly if those companies “advertise their services for eligibility purposes.” The report emphasized that the FTC applies the same methods and standards whether a company uses traditional or non-traditional methods of collecting and applying data, and that companies should keep the law in mind when using big data to make FCRA-covered decisions.
We are The Cedalius Group, the employment background screening provider you can trust. We understand the complexities of big data, and consumers’ concern that their data be used responsibly. We are advocates for our clients and the data they provide to us, and “compliance” is our keyword. If you have questions about big data and how you use it, give us a call today at 404.963.9862 or visit us online at www.thecedaliusgroup.com.
The Cedalius Group offers insight into the background screening industry for educational purposes. We always recommend you consult with your legal counsel to determine practices that best suit your business needs.