Here's what's happening right now, and nobody's talking about it loud enough.
An AI company called Eightfold AI has been quietly running inside the hiring systems of Microsoft, PayPal, Starbucks, Morgan Stanley, BNY, Chevron, and Bayer. Their platform scrapes personal data on job applicants — work history, education, social footprints — and feeds it through a proprietary large language model that ranks and scores each person based on their "likelihood of success."
Sounds like a resume screener. But there's a problem. A massive one.
These "AI scores" are consumer reports under federal law. And if that's true — and a new class action says it is — then Eightfold AI has been violating the Fair Credit Reporting Act on a massive scale: running secret reports on millions of job seekers without consent, without disclosure, and without giving anyone a chance to dispute what the algorithm says about them.
The FCRA isn't just about credit scores. It covers any consumer report used to make decisions about employment, housing, or credit. If an AI is ranking you for a job without telling you — and using your personal data to do it — that could be an FCRA violation whether there's a number attached or not.
The Lawsuit That Could Change Everything
In January 2026, workers' rights advocates filed Kistler v. Eightfold AI Inc. — described as a first-in-the-nation class action challenging AI hiring tools under the FCRA. The filing alleges that Eightfold's platform generates "consumer reports" on job applicants without their knowledge, without letting them see what's in the report, and without giving them any opportunity to correct errors in the data.
If that sounds exactly like what the credit bureaus did before the FCRA cracked down — it's because it is.
The suit claims Eightfold violated three core FCRA requirements:
- 1. No disclosure. Applicants weren't told that an AI was generating a consumer-style report on them or that it would be used to make or influence employment decisions.
- 2. No right to dispute. The FCRA requires that when a report is used against you, you get to see it and challenge inaccuracies. None of that happened here.
- 3. No consent. The report was generated and used before applicants even knew a system like this existed.
"What Eightfold is doing is what the credit bureaus used to do before Congress said enough — collect data in the dark, score people without telling them, and let companies make life-altering decisions based on invisible reports."
This Isn't Just a Hiring Problem
The credit community needs to pay close attention here because this lawsuit is cracking open a legal question that goes way beyond jobs.
If courts rule that AI scoring tools used in employment decisions are consumer reports, what stops that logic from extending to AI tools used in credit decisions? In insurance underwriting? In apartment approvals? Those markets are already using AI models that operate the same way — scraping data, running scores, making decisions — without telling consumers what's in their file.
We've been fighting for credit report transparency for decades. The FCRA gave us that right. Now AI is trying to create a parallel system that does the same thing — but calls it an "algorithm" instead of a "report" to dodge the rules.
This case could force AI companies across industries to comply with FCRA requirements — meaning disclosure, consent, and the right to dispute errors in AI-generated scores. That's a win for every consumer who's ever been quietly screened by a system they didn't know existed.
What's at Stake for Consumers
Let's keep it 100. If you've applied for a job at any major corporation in the last few years, there's a real chance an AI scored you in ways you never saw. Eightfold's platform is used by companies with hundreds of thousands of employees worldwide. The number of affected applicants could be in the tens of millions.
Under the FCRA, if a company violates your consumer reporting rights, you're entitled to statutory damages of $100–$1,000 per violation — and in class actions, that adds up fast. The Capital One FCRA class action that just wrapped up a final approval hearing this month settled for $2.4 million. These cases are real, the money is real, and consumers win them.
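To see how fast it adds up, here's a rough back-of-envelope sketch. The class size and violations-per-person figures below are illustrative assumptions, not numbers from the complaint; only the $100–$1,000 statutory range comes from the FCRA itself.

```python
# Back-of-envelope FCRA statutory exposure. The class size and
# violations-per-person figures are hypothetical assumptions for
# illustration; only the $100-$1,000 range is set by the statute.

STATUTORY_MIN = 100      # FCRA floor for willful violations, per violation ($)
STATUTORY_MAX = 1_000    # FCRA ceiling for willful violations, per violation ($)

class_members = 10_000_000    # assumed number of affected applicants
violations_each = 1           # assume one undisclosed report per applicant

low = class_members * violations_each * STATUTORY_MIN
high = class_members * violations_each * STATUTORY_MAX

print(f"Hypothetical exposure: ${low:,} to ${high:,}")
# -> Hypothetical exposure: $1,000,000,000 to $10,000,000,000
```

Even at the statutory floor, with just one violation per applicant, a class that size implies hypothetical exposure in the billions. That's the leverage behind these suits.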
| Your FCRA Rights | What They Mean | Status (as alleged) |
|---|---|---|
| Right to disclosure | Be told when a consumer report is being pulled on you | Violated |
| Right to access | See the report used against you | Violated |
| Right to dispute | Challenge inaccurate information | Violated |
| Consent requirement | Give written permission before a report is run | Violated |
| Adverse action notice | Be told if a report was used to deny you a job | Not Addressed |
What You Should Do Right Now
You can't always fight a system you can't see. But you can make sure your side of the ledger is clean. Here's how to protect yourself in the age of AI screening:
- 1. Pull your credit reports from all three bureaus: Experian, TransUnion, and Equifax (free at AnnualCreditReport.com). AI tools often scrape public data and cross-reference it with financial history. Errors on your reports can poison AI scores about you that you never even see.
- 2. Dispute every error immediately. Under the FCRA, bureaus must investigate within 30 days (up to 45 in certain cases). With CFPB enforcement gutted right now, you can't wait for the government to do it for you. Do it yourself, or use an AI-powered credit bot that does it automatically.
- 3. Request your file from specialty reporting agencies. LexisNexis, CoreLogic, and others maintain separate reports that employers use in background checks. You can request them for free once a year under FCRA rules.
- 4. Ask employers what screening tools they use. If a company used an AI tool to screen you and you didn't get the job, request an adverse action notice. If the tool's output counts as a consumer report, the FCRA requires the employer to provide one.
- 5. Monitor the lawsuit. If you applied to Microsoft, PayPal, Starbucks, Morgan Stanley, BNY, Chevron, or Bayer and didn't get the role, you may be a class member in Kistler v. Eightfold AI. Track it at classaction.org.
The Bigger Fight We're All In
This case is bigger than one AI company and one lawsuit. It's about whether the consumer protections that took decades to build — the right to see what's in your file, to dispute errors, to be treated with transparency — survive the age of AI.
Credit bureaus tried to operate in the dark. Congress forced them into the light. AI hiring companies are trying the same thing. The courts are going to have to answer: does the FCRA follow the data, or does it stop at the label?
At NMD ZAZA, we've always said the same thing: knowledge is leverage. The more you know about the systems scoring you — credit, employment, insurance, housing — the better you can position yourself to win. The people who don't know are the ones who get filtered out by algorithms they never knew existed.
Stay locked in. Stay informed. And clean up your credit while you still control the variables.
AI Shouldn't Score You Without Your Knowledge. Fix What You Can Control.
Your credit report is one of the most powerful data points feeding AI systems across hiring, housing, and lending. Get it clean. Get it accurate. Let our AI-powered credit bot work the system for you — automated, 24/7, $29 flat.
Credit law. AI. Consumer rights. Breaking intelligence delivered straight to your inbox.