BIPA Class Action Targets AI Voice Data

by Chief Editor

Microsoft Teams Lawsuit: A Turning Point for Biometric Data Privacy in the Workplace

Five Illinois residents have launched a class action lawsuit against Microsoft, alleging the company’s Teams platform illegally collects and analyzes voice data without proper consent. The lawsuit, filed February 5, 2026, in the US District Court for the Western District of Washington (Basich et al. v. Microsoft Corp.), centers on violations of the Illinois Biometric Information Privacy Act (BIPA).

The Core of the Complaint: Voiceprints and BIPA

Plaintiffs Alex Basich, Kristin Bondlow, and three others claim Microsoft’s real-time transcription feature captures speakers’ voices, assessing qualities like pitch, tone, and timbre to identify individuals – effectively creating “voiceprints.” This process, known as diarization, distinguishes who is speaking at any given moment and underpins accessibility features such as speaker-labeled transcripts. The lawsuit argues, however, that Microsoft failed to inform users about this data collection, violating BIPA’s requirements for notice and consent.
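At its core, diarization works by converting short audio segments into numeric “voice embeddings” and grouping segments whose embeddings are similar, so each group corresponds to one speaker. The toy sketch below illustrates only that clustering idea; the two-dimensional embeddings, the similarity threshold, and the greedy grouping strategy are invented for illustration and bear no relation to how Teams actually implements the feature.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def assign_speakers(embeddings, threshold=0.9):
    """Greedy clustering: each segment joins the first known speaker
    whose representative embedding it resembles, else starts a new one."""
    speakers = []  # one representative embedding per speaker
    labels = []
    for emb in embeddings:
        for i, rep in enumerate(speakers):
            if cosine(emb, rep) >= threshold:
                labels.append(i)
                break
        else:
            speakers.append(emb)
            labels.append(len(speakers) - 1)
    return labels

# Toy "voice embeddings" for four audio segments: two similar pairs,
# so the first two segments share a speaker and the last two share another.
segments = [(1.0, 0.1), (0.98, 0.12), (0.1, 1.0), (0.09, 0.99)]
print(assign_speakers(segments))  # [0, 0, 1, 1]
```

The privacy concern BIPA targets is precisely that such embeddings, when tied to a person, function as a biometric identifier.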

BIPA mandates that companies obtain informed written consent before collecting and using biometric data, including voiceprints. It also requires detailing how the data will be used and how long it will be stored. The plaintiffs allege Microsoft did none of this.

Financial Implications: Billions at Stake?

The potential financial impact is substantial. The suit seeks damages of either actual harm or $1,000 per negligent violation, escalating to $5,000 per violation if the court finds intentional or reckless disregard for BIPA. Given Teams’ widespread adoption, the total liability could be significant.
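To make the scale concrete, BIPA’s two statutory tiers compound quickly across a large class. The sketch below applies the $1,000 and $5,000 figures from the complaint; the class size used in the example is a hypothetical placeholder, not a figure from the lawsuit, and this is arithmetic, not legal analysis.

```python
# BIPA statutory damages tiers as described in the suit:
# $1,000 per negligent violation, $5,000 per intentional or reckless one.
NEGLIGENT_DAMAGES = 1_000
RECKLESS_DAMAGES = 5_000

def statutory_damages(violations: int, reckless: bool = False) -> int:
    """Total statutory damages for a given number of violations."""
    per_violation = RECKLESS_DAMAGES if reckless else NEGLIGENT_DAMAGES
    return violations * per_violation

# Hypothetical: 100,000 affected users, one violation each.
print(statutory_damages(100_000))                 # negligent tier: $100M
print(statutory_damages(100_000, reckless=True))  # reckless tier: $500M
```

Even a modest class at the negligent tier reaches nine figures, which is why BIPA suits so often settle.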

The Broader Trend: AI, Biometrics, and the Rise of Privacy Litigation

This lawsuit isn’t an isolated incident. It’s part of a growing wave of litigation surrounding the use of biometric data and artificial intelligence. Companies are increasingly leveraging AI to enhance productivity, but often without fully addressing the associated privacy risks.

The “Shadow AI” Problem for CIOs

The case highlights a critical gap in the SaaS supply chain. When vendors update terms of service to enable new AI features, do those features automatically comply with local laws like BIPA? CIOs and Compliance Officers face a challenge: they are deploying “smart” collaboration tools without fully understanding the data practices behind them.

This creates a risk of “shadow AI” – AI operating within an organization without proper oversight or compliance measures. Vetting the privacy policies of UC vendors is becoming a financial imperative.

Beyond Illinois: A Global Privacy Landscape

While BIPA is specific to Illinois, similar biometric privacy laws are emerging globally. The EU’s General Data Protection Regulation (GDPR) places strict limits on the processing of biometric data, and other jurisdictions are considering similar legislation. Organizations operating internationally must navigate a complex and evolving privacy landscape.

Microsoft’s History with Regulatory Scrutiny

Microsoft has a long history of navigating regulatory challenges. In 2020, a complaint from Slack led to an investigation by the European Commission into whether Microsoft bundling Teams with Office 365 constituted anti-competitive behavior. Microsoft eventually unbundled its products globally to address these concerns, a process finalized in late 2025.

However, the Basich et al. case presents a different kind of threat. Unlike antitrust fines, which are often viewed as a cost of doing business, biometric privacy violations directly impact user trust.

Did You Know?

The Illinois Biometric Information Privacy Act (BIPA) is considered one of the strictest biometric privacy laws in the United States, setting a high bar for companies collecting and using biometric data.

Future Implications: What’s Next for UC and Privacy?

The Microsoft Teams lawsuit is likely to have several significant consequences:

  • Increased Scrutiny of AI-Powered Features: Companies will face greater scrutiny of how they use AI to process user data, particularly biometric data.
  • Enhanced Transparency and Consent Mechanisms: Expect to see more robust transparency measures and consent mechanisms for data collection.
  • Shift Towards Privacy-Enhancing Technologies: There may be increased adoption of privacy-enhancing technologies, such as differential privacy and federated learning, to minimize data collection.
  • Greater Legal Risk for UC Vendors: Unified Communications vendors will need to prioritize compliance with biometric privacy laws to mitigate legal risk.

Pro Tip:

Review your organization’s data processing agreements with all SaaS vendors, paying close attention to clauses related to biometric data and AI. Ensure these agreements align with applicable privacy laws.

FAQ

  • What is BIPA? The Illinois Biometric Information Privacy Act is a state law regulating the collection, use, and storage of biometric data.
  • What is “diarization”? Diarization is a technology used to identify different speakers in an audio recording, creating speaker profiles.
  • What are the potential damages in this lawsuit? Damages could range from $1,000 to $5,000 per violation, depending on whether the violation is deemed negligent or intentional.
  • Does this lawsuit affect users outside of Illinois? While the lawsuit is specific to Illinois residents, it could set a precedent for similar claims in other jurisdictions.

As AI becomes increasingly integrated into UC and collaboration tools, the legal and ethical considerations surrounding data privacy will only intensify. Organizations must prioritize compliance and transparency to build trust with users and avoid costly legal battles.

Want to learn more about data privacy and compliance? Explore our other articles on the topic or subscribe to our newsletter for the latest updates.
