Sam Altman, the CEO of ChatGPT creator OpenAI, recently made headlines when he said that chatbot conversations don't enjoy the same privacy or confidentiality protections as do communications with a lawyer. If you've had tell-all legal confessionals with your favorite artificial intelligence (AI) chatbot, Altman's revelation might come as upsetting news.
But it shouldn't be, for one rather obvious reason: An AI chatbot—no matter how confident, authoritative, or arrogant it might sound—isn't a lawyer. That you take emotional comfort from your dog doesn't make Spot a psychologist. Likewise, that Claude seems to understand child custody and alimony doesn't make it a divorce lawyer.
We'll set the record straight about AI, legal advice, and confidentiality. Here's what you should know:
Long story short: If you want to keep it a secret, don't tell an AI chatbot.
AI chatbots like Claude or ChatGPT (also known as "bots") are sometimes capable of remarkable work. But no bot, no matter how "smart" or well-trained, can give you legal advice. Why? Because only a trained and licensed lawyer can do that.
The problem is that chatbots are taught to sound confident and authoritative. They offer what sounds like legal advice. And it's so convincing that people have a hard time telling the difference between what's real and what isn't.
What are you really getting if you turn to AI for legal advice? Chatbots have been described, accurately, as "next word predictors." They don't think, speak, or write as people do. After gobbling up huge amounts of training data, they learn how to mimic human speech and writing patterns, using mathematical probabilities to predict—often with uncanny accuracy—what word should appear next in a sequence.
Simply stated, chatbots are trained to guess, very quickly and very well. As one lawyer wrote, when you ask AI for legal advice, you aren't getting legal advice. You're playing a game of "legal Mad Libs."
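If you're curious what "next word prediction" actually looks like, here's a deliberately tiny sketch in Python. It's a made-up illustration, nothing like a real chatbot's scale or sophistication, but the core move is the same: count which words tend to follow which, then guess. Notice that nothing in it reasons about the law.

```python
import random

# A toy "next word predictor." Real chatbots use neural networks trained on
# enormous datasets, but the basic idea is the same: pick a likely next word
# based on patterns seen in training text. (Hypothetical example, for
# illustration only.)
training_text = "the motion was denied the motion was granted the case was dismissed"
words = training_text.split()

# Count which words follow which (a simple "bigram" model).
follows = {}
for current, nxt in zip(words, words[1:]):
    follows.setdefault(current, []).append(nxt)

def predict_next(word):
    """Guess the next word from observed frequencies -- no reasoning involved."""
    candidates = follows.get(word)
    return random.choice(candidates) if candidates else "(no guess)"

print(predict_next("motion"))  # always "was" in this tiny dataset
print(predict_next("was"))     # "denied", "granted", or "dismissed" at random
```

A real chatbot's guesses draw on billions of learned patterns rather than a one-sentence sample, which is exactly why its output can sound authoritative while still being nothing more than a statistically good guess.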
When you talk to your lawyer about a legal matter, they're obligated to keep what you say in the strictest confidence. A chatbot doesn't have to follow those rules. Why? Because it isn't a lawyer.
In every state, lawyers must follow strict rules governing their professional behavior. One of those rules prohibits a lawyer, in all but a few situations, from disclosing information about their representation of you. A related rule of evidence, the attorney-client privilege, does much the same: it shields what you tell your lawyer from being forced into the open in court.
An attorney who breaks these rules—even inadvertently—can land in hot water with regulatory authorities like the state bar association or the state supreme court. And on top of all that, a disclosure that harms you can lead to a legal malpractice lawsuit.
None of those rules apply when you tell your legal secrets to a chatbot. Because it isn't a lawyer, a bot doesn't have to follow attorney confidentiality rules. Likewise, conversations with a chatbot aren't protected from disclosure by the attorney-client privilege. And if a bot does spill the beans on your communications, you don't have a legal malpractice claim against it.
The bot's creator might promise to keep your information private—sort of—but there's plenty of wiggle room in those terms of service (discussed below). And even if you can make out a breach of contract claim, the compensation you recover (called "damages") likely will be limited.
In short, you don't have the same legal or other protections with a chatbot that you do with a lawyer.
When you signed up for a chatbot account, you had to agree to the chatbot owner's terms of service, and usually its privacy policy, too. If you're like most people, you agreed to those terms without reading them. That was a mistake, because those documents most likely disclaim any promise of confidentiality and warn you not to rely on the bot for professional advice.
The terms of service you probably ignored—but that you'll be treated as having read and agreed to anyway—form part of the contract between you and the chatbot's owner. In the event of a legal fight, the chatbot's owner will claim that you agreed to everything in them.
For example, Perplexity's Terms of Service expressly tell you not to rely on the service for advice of any kind, including legal advice. In addition, nothing Perplexity tells you is "a substitute for advice from a qualified professional." OpenAI's Terms of Use say that by using ChatGPT, you agree not to rely on it as a substitute for professional advice.
So, what does all this legalese mean? It means that when you ask a chatbot for legal advice, or when you rely on what the bot tells you instead of talking to a lawyer, you're doing things you agreed not to do. In legal terms, you've violated ("breached") the terms of service or use, meaning you've breached the contract.
Have a look at the chatbot's terms of service and privacy policy. Both probably tell you not to expect what you say to remain private. OpenAI's Terms of Use, for instance, say that the company might use "Content"—things you tell the chatbot and that it tells you in response—to "comply with applicable law." The company's Privacy Policy does the same.
You'll find similar terms and privacy conditions in use by all chatbot owners. They'll disclose your data and information to courts, government agencies, and law enforcement when the law requires them to.
In case it isn't clear by now: What you tell a chatbot can land you in jail, or cost you money in a civil lawsuit. In other words, the problem with asking a chatbot for legal advice goes well beyond lack of confidentiality. If you wind up in court, the other side can use your words (and what the bot says in response) against you.
Suppose, for example, that you're arrested on suspicion of DWI. Before hiring a lawyer, you communicate with a chatbot, disclosing potentially incriminating facts about your case. The prosecutor subpoenas the chatbot's owner to produce records of all your communications with the bot since your arrest. They'll get those records and will use anything incriminating against you in court.
When you're communicating with a chatbot, especially about anything having to do with legal matters, pretend that the prosecutor or your opponent's lawyer is listening in on what you say. Even though they aren't, they might as well be. Save the talk about your legal problem for face-to-face conversations with your lawyer.
Getting legal help from a chatbot is like getting legal advice from your hair stylist. Neither one knows what they're talking about, and they might end up blabbing the details of your conversation to the wrong people.
If you need help with a legal problem, speak to a lawyer—someone who knows the law, is trained to analyze legal issues, and can provide you with experienced, sound counsel. And just as important, someone who will keep what you say confidential.