The #1 AI Deposition Summary Tool in Today’s Legal Market
Do You Want AI Reading Your Depositions?
AI-generated deposition summaries are rapidly becoming a staple in the litigation industry, with court reporting companies rolling them out every week. It sounds convenient, but is it as magical as it seems?
Before you embrace AI for digesting deposition transcripts, there are some critical considerations. Here’s what you need to know to make informed decisions about using AI-generated summaries in your practice.
Who’s Really "Reading" Your Deposition?

Spoiler alert: it’s probably not your favorite court reporter. In most instances, court reporting companies are not using their own proprietary AI models to summarize depositions. Instead, they rely on third-party tools like ChatGPT to generate these summaries, and they may even outsource the work to yet another company. While this might sound harmless, it raises important questions: Is the content reviewed for accuracy before reaching your hands? Is the deposition data being handled confidentially? Who bears the cost of these services?
Since you want to protect your bar license, you need answers to these questions.
Your Ethical Duties Don’t Vanish With AI

The ethical rules governing legal practice don’t get set aside just because AI is involved. Even when an AI-generated deposition summary lands in your inbox, you remain responsible for ensuring its accuracy and compliance with professional standards. The American Bar Association’s (ABA) Standing Committee on Ethics and Professional Responsibility and several state bars have issued guidance on generative AI use, which extends to deposition summaries.
According to ABA Model Rules 5.1 and 5.3, you must supervise and verify the accuracy of any AI-generated text used in your work. The guidelines emphasize that lawyers outsourcing tasks to third parties must make reasonable efforts to ensure the work meets professional standards, an obligation that extends to the use of AI tools. Since large language models (LLMs) like ChatGPT can fabricate information, every AI-generated summary should be reviewed for accuracy before you rely on it, whether in motions, at mediation, or in court.
Generic, Over-Inclusive, and Often Inaccurate

AI-generated deposition summaries are notorious for being generic and over-inclusive. Unlike a seasoned trial lawyer or paralegal who zeroes in on the deposition testimony that truly matters to a case, AI can’t yet discern subtle nuances, spot evidentiary issues, or assess witness credibility. Expect to spend considerable time editing AI-generated summaries to make them useful for motions, mediation, or trial preparation. The editing process can be just as time-consuming as drafting a summary from scratch.
Confidentiality: Are You Unknowingly Sharing Sensitive Information?

A critical ethical consideration is the protection of client confidentiality. Major LLM providers, including OpenAI and Anthropic, state that data submitted through their consumer products may be used to train future models unless users opt out or special arrangements are in place. This means that sensitive deposition content, including personal identifiers, financial details, or medical information, could potentially be exposed to unauthorized parties.
Under ABA Model Rule 1.6 on protecting client confidentiality, lawyers must carefully evaluate the risks before entering client information into AI tools. The ABA’s guidance points out that different generative AI tools offer different levels of protection against unauthorized disclosure, meaning that risk assessments must be tailored to the specific client, matter, and AI tool in use.
Depositions often include personally identifiable information (PII), such as Social Security numbers, medical histories, or financial conditions. In many jurisdictions, such information must be redacted from public filings. Sharing PII outside of tightly controlled legal contexts can breach statutory privacy laws or give rise to common-law invasion-of-privacy claims. Simply put, just because sensitive information was disclosed during a deposition doesn’t mean it can be disclosed publicly or processed by LLMs that may use it as training data.
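To make the redaction concern concrete, here is a minimal sketch of pattern-based PII scrubbing applied to transcript text before it leaves your control. The patterns and redaction markers are illustrative assumptions, not a complete solution: regex matching misses names, addresses, and context-dependent identifiers, so a human must still review anything sent to a third-party tool.

```python
import re

# Illustrative only: a few regex patterns for common PII formats.
# Real redaction workflows need far more than pattern matching and
# should be verified by a person before a transcript is shared.
PII_PATTERNS = {
    "[SSN REDACTED]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[PHONE REDACTED]": re.compile(r"\b\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}\b"),
    "[EMAIL REDACTED]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub(text: str) -> str:
    """Replace matched PII patterns with redaction markers."""
    for marker, pattern in PII_PATTERNS.items():
        text = pattern.sub(marker, text)
    return text

excerpt = "Q. Please state your Social Security number. A. 123-45-6789."
print(scrub(excerpt))
# Q. Please state your Social Security number. A. [SSN REDACTED].
```

Even a basic pass like this illustrates the point of Rule 1.6: the lawyer, not the vendor, decides what client data is exposed to an AI tool.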
Are Zero Data Retention (ZDR) Policies in Place?

To ensure sensitive information remains confidential, you should only work with AI solution providers such as esumry that adhere to strict security protocols and ZDR policies. These policies ensure that no user data is stored or used for training the language model after the session ends. If your AI service provider lacks such arrangements, you could be running afoul of your confidentiality obligations, court rules, or state privacy laws.
Additionally, even if AI-generated summaries are being produced by a third-party court reporting company, you are still responsible for ensuring that all parties involved follow the necessary confidentiality requirements. Make sure that the court reporting company has enforceable confidentiality obligations and ZDR arrangements in place with the AI provider.
Communicating with Clients: The Need for Transparency

Transparency with your clients about using AI is not just good practice; it is a professional obligation under ABA Model Rule 1.4. The ABA advises that you should discuss the use of AI tools with your clients, especially when the technology is integral to a specific task or could impact their case. Consider whether your clients need to be informed about how AI will process their information and whether it could affect their confidence in the legal representation you provide.
Another reason for discussing AI use with clients is cost. While AI can save time, the cost of third-party AI services may still be passed on to the client as a recoverable expense. The ABA suggests that when using AI for tasks like deposition summaries, which differ from standard overhead costs like grammar-checking software, it is reasonable to bill clients for these services, provided you disclose this in advance. Clear communication about fees and AI’s role in handling their case can help maintain transparency and client trust.
Conclusion: Embrace AI, but with Caution

AI is a powerful tool that can streamline legal work, but it doesn’t replace the ethical and legal responsibilities that define the practice of law. The use of AI-generated deposition summaries requires careful supervision to ensure accuracy, robust measures to protect client confidentiality, and open communication with clients about costs and technology use. By taking a balanced approach, you can leverage AI’s benefits while still upholding the standards of the legal profession. Technology may assist, but the final responsibility rests with you.
Try It Out!
We invite you to see firsthand how AI and automation can revolutionize your deposition analysis.
Get a 14-day free trial of esumry today and start transforming your litigation practice for the future.