AG’s Lawsuit Raises New Questions Around AI and Use of Personal Data

FRANKFORT, Ky. — A lawsuit filed by Kentucky Attorney General Russell Coleman against artificial-intelligence chatbot company Character Technologies Inc. is drawing new attention to how rapidly evolving AI platforms collect, store and use personal data — and what that could mean for banks, credit unions and the millions of consumers whose financial information increasingly intersects with AI tools.

According to spokesperson Peg Alizarisray, the civil suit, filed in Franklin Circuit Court, accuses Character.AI of violating state consumer-protection and data-privacy laws by failing to safeguard sensitive personal information, particularly that of minors. While the case centers on child safety, legal and compliance experts say its broader implications could extend to financial institutions that rely on third-party technology vendors, data analytics firms and AI-driven customer-engagement tools.

At issue is how AI platforms gather and retain conversational data — often highly personal in nature — and whether users are adequately informed about how that information is stored, shared or repurposed.

Data Collection and Consent in the Spotlight

Coleman’s lawsuit alleges Character.AI induced users to disclose deeply personal information without meaningful consent or adequate safeguards, potentially violating the Kentucky Consumer Data Protection Act, which took effect in 2026.

That statute, like similar laws adopted in other states, grants consumers the right to know what data companies collect, how it is used and with whom it is shared — requirements that increasingly apply to financial institutions and their vendors.

At least one analyst said any company that touches consumer data, including financial data, should be watching the case closely.

Financial institutions already operate under strict federal privacy rules, including the Gramm-Leach-Bliley Act, but many now integrate AI tools for marketing, chat services, fraud detection and member engagement. Regulators have warned that third-party data practices can expose institutions to compliance and reputational risk.

Vendor Risk and Member Trust

The lawsuit contends Character.AI failed to implement sufficient age verification and controls, allowing sensitive information to be collected from minors. For financial institutions, the case underscores concerns about how AI systems may inadvertently capture or infer financial behavior, emotional states or decision-making patterns — data that could be considered sensitive under state privacy laws.

Consumer advocates argue the litigation could accelerate calls for clearer limits on how AI-generated interactions are stored and whether such data can be used to train future models.

“Once personal data enters an AI system, it can be difficult to determine where it goes or how long it persists,” Coleman said in announcing the suit. “That lack of transparency is unacceptable.”

Broader Regulatory Implications

Character.AI said it is reviewing the complaint and emphasized that it has invested in safety and privacy controls, including protections for younger users.

Even so, experts say the Kentucky case may serve as a template for future actions against technology providers whose platforms intersect with financial services, payments, lending or digital wallets.

For banks and credit unions, the lawsuit is a reminder that data governance extends beyond their own walls — and that state attorneys general are increasingly willing to test how emerging technologies comply with consumer-protection and privacy laws.
