AI Clinical Notetakers: The Real Cost, the Real Limitations, & When They Make Sense
A Practical Framework for Evaluating Whether AI Documentation Tools Are Worth the Investment, & How to Roll Them Out If They Are
Overview of AI Clinical Notetakers for Mental Health Practices
If there is one AI tool that has captured the attention of mental health practice owners faster than any other, it's the AI clinical notetaker. The promise is compelling:
A tool that listens to a therapy session, in person or via telehealth, and generates a structured clinical note in seconds, freeing clinicians from the hours of documentation that eat into their evenings, their weekends, and their capacity to see more patients.
The reality, based on Solomon Advising's work across dozens of group practices, is more nuanced than the pitch, but also more encouraging than the skeptics suggest. Over half of Solomon Advising's client practices are using AI notetakers in some capacity, and the experience has been overwhelmingly positive. The time savings are real. The compliance improvements are measurable. And for most practices, the math works.
But "the math works" only holds if you approach it as an economic decision rather than a technology impulse. The practices that adopt AI notetakers successfully are the ones that establish a baseline, roll out in phases, measure the impact, and make a data-driven determination about whether the cost is justified by the savings. The practices that struggle are the ones that adopt across the board without knowing what they're measuring against, or, worse, treat the tool as a staff perk rather than a business investment.
This guide walks through what practice owners need to know: what the actual adoption landscape looks like, how to evaluate ROI before you commit, how to navigate the EHR-native versus standalone decision, and the emerging policy questions around pre-licensed clinicians, patient disclosure, and session recording that most practices haven't addressed yet.
This topic is part of our comprehensive guide to Technology & AI in Mental Health Practices, which covers the full landscape of how technology is impacting private practices and what owners need to know.
section one
The Adoption Landscape: Who's Using Them & What They're Actually Experiencing
AI notetakers are no longer an early-adopter curiosity.
Based on Solomon Advising's client base, over 50% of group practices are using an AI notetaker in some capacity, whether that's full team adoption, leadership-only access, or a test rollout with a handful of clinicians. The trajectory is clearly toward broader adoption, and the holdouts are increasingly in the minority.
"Generally, the experience has been pretty positive," says Jennifer Guidry, CEO of Solomon Advising. "Almost every client I have that's using it definitely sees the advantage, definitely sees the time savings. There really hasn't been any bad experience that I've seen across the board. It's mostly been good."
That positive signal deserves context, though, because it's grounded in a specific understanding of what these tools are actually doing. The primary value of an AI notetaker in mental health isn't clinical insight; it's compliance efficiency. The documentation requirements that consume clinician time are overwhelmingly driven by insurance requirements, not by clinical necessity.
"To be candid, most of the note-taking requirements are there for the sake of satisfying insurance," Guidry explains. "It is not the case that clinicians are worried about AI taking an inaccurate note and having that impact patient care in any way. This is seen as a potential tool to help practices save time and energy and tighten up compliance, to ensure that everything the insurance wants to see in the note or the treatment plan is in there."
This reframing matters because it changes what you're evaluating. You're not asking whether AI can replace clinical judgment; it can't, and nobody is suggesting it should. You're asking whether AI can generate a compliance-ready draft of a progress note faster and more consistently than a clinician typing from memory at 9 PM on a Thursday. And the answer, for most practices, is unambiguously yes.
"We didn't want to be the practice that jumped on every new tool without thinking it through. We piloted with a small group first, evaluated it honestly, and then made a decision based on what we actually saw, not what the sales pitch promised."
- Laura Slagle, Owner of Olive Leaf Family Therapy
What the Market Looks Like
The AI notetaker landscape for mental health is maturing rapidly.
At the most accessible end, several major EHR platforms have built AI documentation tools directly into their systems:
→ SimplePractice
offers Note Taker, its integrated AI tool that can listen to telehealth sessions conducted within the platform and generate draft progress notes. It can also process voice-transcribed clinician impressions after a session, allowing clinicians to dictate rather than type. The tool requires explicit client consent through a specific authorization form.
→ TherapyNotes
has launched TherapyFuel, its AI suite that includes note summarization and an upcoming ambient scribe feature for both in-person and telehealth sessions. The scribe functionality is still in beta rollout as of early 2026 but represents TherapyNotes' commitment to EHR-native AI.
→ ICANotes
offers an ambient AI scribe that's fully integrated into its EHR, generating structured, bill-ready notes in real time with no permanent audio storage, a notable differentiator for practices concerned about data retention.
→ Jane App
has announced AI Scribe, a feature that will convert voice notes and session recordings into structured SOAP notes using AI-powered charting. As of this writing, the feature is in development.
Beyond the EHR-native options, standalone AI notetakers designed specifically for behavioral health include tools like Mentalyc, Blueprint, Upheal, JotPsych, Berries, and PMHScribe.
More general-purpose AI transcription tools like Freed and Fathom are also used by some mental health practices, though they weren't designed for behavioral health documentation specifically.
The pricing ranges are significant.
EHR-native tools are sometimes included in existing subscription tiers or available as add-ons for a modest fee. Standalone tools range from per-session pricing (Blueprint at $0.49 per session) to monthly subscriptions typically falling in the $25-$50 per clinician range, though some premium platforms charge $90 or more per clinician per month.
section two
The ROI Framework: How to Know If It's Worth It Before You Commit
Step one
Establish a Baseline
Before you introduce any AI tool, you need to know what documentation is currently costing you. For W2 practices that compensate clinicians for administrative time, which includes note-taking, treatment planning, meeting attendance, supervision, and training, the documentation component of that administrative time is a real and measurable cost.
"Are you adequately or accurately tracking how much time your clinicians are spending on taking notes?" Guidry asks. "And what does that cost to you? Most practices don't know."
The answer varies significantly by clinician. A seasoned therapist with a stable caseload of individual adult clients may spend 10-15 minutes per note. A pre-licensed clinician doing complex intakes with adolescents may spend 45 minutes or more. The baseline isn't a single number; it's a distribution across your clinical team, and the clinicians at the top of that distribution are where the economic case for AI notetakers is strongest.
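The baseline arithmetic above can live in a simple spreadsheet, but sketched in code it makes the per-clinician distribution concrete. The names, minutes, and $40/hour burdened rate below are hypothetical illustrations, not figures from any actual practice:

```python
# Hypothetical per-clinician baseline: (minutes per note, sessions per week)
clinicians = {
    "seasoned_therapist": (12, 28),
    "mid_career": (20, 25),
    "pre_licensed": (45, 20),
}

BURDENED_ADMIN_RATE = 40.0  # assumed $/hour cost of administrative time


def weekly_documentation_cost(minutes_per_note, sessions_per_week,
                              rate=BURDENED_ADMIN_RATE):
    """Return (hours, dollars) one clinician spends on notes per week."""
    hours = minutes_per_note * sessions_per_week / 60
    return hours, hours * rate


for name, (mins, sessions) in clinicians.items():
    hours, cost = weekly_documentation_cost(mins, sessions)
    print(f"{name}: {hours:.1f} h/week, ${cost:.0f}/week")
```

With these illustrative inputs, the pre-licensed clinician's documentation costs nearly three times what the seasoned therapist's does, which is exactly why the top of the distribution is where the economic case is strongest.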
Step two
Roll Out in Phases
"I would roll out in phases with a handful of clinicians, and then I would measure how much reduction there is in their actual note-taking time," Guidry explains. "It should significantly reduce the time they spend, significantly."
A well-structured pilot typically involves three to five clinicians representing a range of experience levels, caseload types, and documentation habits. Run the pilot for four to six weeks, long enough to get past the learning curve and capture steady-state performance. Track the same metric you baselined: time per note, measured consistently.
Step three
Measure the Delta
If a clinician was spending an average of 30 minutes per note and the AI tool reduces that to 8 minutes, that's 22 minutes saved per session. For a clinician seeing 25 clients per week, that's over 9 hours per week of documentation time recovered. At a burdened hourly rate of $35-$50 for administrative time, that's $315-$450 per week in cost savings, against a tool cost of $25-$50 per month.
The math often isn't close. For most practices, the ROI is substantial, sometimes an order of magnitude. But it only becomes visible if you've done the work of establishing the baseline and measuring the change.
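The delta calculation is the same arithmetic run twice, before and after. A minimal sketch, using the example figures from this section (30 minutes per note down to 8, 25 sessions per week, a $35/hour burdened rate at the low end):

```python
def weekly_savings(baseline_min, ai_min, sessions_per_week, hourly_rate):
    """Return (hours, dollars) of documentation time recovered per week."""
    minutes_saved = (baseline_min - ai_min) * sessions_per_week
    hours_saved = minutes_saved / 60
    return hours_saved, hours_saved * hourly_rate


# The example above: 22 minutes saved per session across 25 weekly sessions
hours, dollars = weekly_savings(30, 8, 25, 35)
print(f"{hours:.1f} hours and ${dollars:.0f} recovered per week")

# Annualized savings net of a hypothetical $50/month per-clinician tool cost
annual_net = dollars * 52 - 50 * 12
```

Run with these inputs, the weekly recovery comes out to just over nine hours, consistent with the figures above; the point of the sketch is that the comparison is trivial once you have the baseline, and impossible without it.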
Step four
Make the Economic Decision
"What you don't want to do is have no sense of how much time your clinicians are spending taking notes, incur the cost across the board, and then have no way to measure whether there's been any cost savings or benefit," Guidry cautions. "This shouldn't be introduced simply as a perk or a staff benefit. It absolutely needs to have an economic benefit. It's got to have a return."
"The time savings were real, our clinicians are spending significantly less time on notes and more time either with patients or actually leaving at a reasonable hour. That matters for retention as much as it matters for productivity."
- Laura Slagle, Owner of Olive Leaf Family Therapy
section three
EHR-Native vs. Standalone: The Integration Decision That Most Practices Get Wrong
Once a practice decides to adopt an AI notetaker, the next decision is which one, and this is where Guidry's guidance is clear and specific.
"I generally recommend using whatever AI notetaker is built natively into the EHR," she says. "SimplePractice and TherapyNotes both have one. Sessions Health has one. Jane has one coming. It's just not nearly as efficient if you use a standalone tool that requires a secondary platform."
The logic is grounded in the same principle that drives so many operational challenges in mental health practices: integration. Every time you introduce a standalone tool, you create another login, another workflow, another data transfer point, and another potential compliance surface. The clinician generates a note in the standalone tool, then has to copy it into the EHR. That's a manual step that introduces friction, creates opportunities for error, and undermines the time savings the tool was supposed to deliver.
EHR-native tools eliminate that friction. The note is generated within the system the clinician already uses for scheduling, billing, and clinical records. There's no copy-paste step, no formatting adjustment, no toggling between platforms. The documentation workflow stays contained in a single system.
There's also a compliance advantage. "That also covers any concern you would have about HIPAA compliance," Guidry notes. "If you're going to use Fathom or Freed or any of these other notetakers, whether they're designed for mental health or not, you 100% need to ensure that there's a BAA in place. But that's one less step you have to worry about if you use the AI notetaker that's built into your EHR."
When Standalone Tools Make Sense
This isn't to say standalone tools don't have a role. There are scenarios where they may be the better choice:
Your EHR doesn't yet offer an AI notetaker, or its offering is immature and limited.
Some EHR-native tools are still in early stages (TherapyNotes' scribe is in beta, Jane's is in development) and may not yet deliver the functionality a practice needs.
Your practice has clinicians working across modalities that require specialized documentation. Some standalone tools like Mentalyc offer more sophisticated template options, treatment plan generation, and clinical insight features that go beyond what EHR-native tools currently provide.
Your practice is evaluating an EHR switch and doesn't want to invest in a tool that's locked to a platform you may leave. Standalone tools that work across EHRs offer portability that native tools don't.
But for most group practices using SimplePractice, TherapyNotes, or a comparable EHR with a built-in option, the EHR-native path is the right default. It's simpler, it's more efficient, and it eliminates an entire category of compliance and workflow concerns.
section four
The Policy Questions Nobody's Answered Yet
Beyond the economics and the technology, AI notetakers are surfacing a set of policy questions that most practices haven't formally addressed. These are emerging issues, not settled ones, but practice owners who think about them now will be ahead of the curve.
→ Patient Disclosure & Consent
When an AI tool is recording or processing a therapy session, whether in real time or from a post-session dictation, patients need to know. This is both an ethical obligation and, increasingly, a compliance requirement.
Most AI notetaker vendors provide consent language and authorization forms. SimplePractice, for example, requires a specific consent form before the Note Taker can be activated for a session. But beyond the vendor's requirements, practices need to think about their broader AI use policy: what AI tools do we use, how do we use them, what data do they process, and how do we communicate that to patients?
"There is sort of this general ethical discussion around what to disclose and how to disclose to your patients what your AI use model is as a practice," Guidry explains. "Like, do you use it and how do you use it, so that your patients are informed and are aware of your security policies?"
This is an area where having a clear, documented practice-level policy, not just relying on individual vendor consent forms, demonstrates professionalism and builds patient trust.
→ The Session Recording Question
There's a related trend that's emerging independently of AI notetakers but is increasingly intersecting with them: patients wanting to record their own therapy sessions.
"This has been happening quite a bit in the telehealth space," Guidry notes. "Patients want to record their sessions, and many times do that without permission from their therapist. Therapists and practices often make this a violation of their practice policies and say that this is not allowed."
The dynamic is evolving. Patients who record sessions are often doing so for benign reasons; they want to review what was discussed, capture coping strategies they might forget, or even run the recording through an AI tool to generate their own session summaries. But the practice implications are significant: recorded sessions create legal exposure, raise questions about confidentiality, and intersect with consent laws that vary by state.
Whether or not a practice allows AI notetakers, it needs a clear policy on session recording: who can record, under what circumstances, with what consent, and what happens with the recording. This is no longer a hypothetical concern; it's happening now, and practices without a policy are exposed.
→ The Pre-Licensed Clinician Debate
"Most practice owners that I speak with have hesitation about allowing their pre-licensed clinicians to use an AI notetaker out of the gate," Guidry observes. "They feel like it's important and a fundamental part of their training to know how to take an appropriate note. If they jump straight to relying on AI for their note-taking, are they going to have enough discernment to identify if this is a quality note, if this is an accurate note?"
This is a real tension, and there's no universal answer. Pre-licensed clinicians and associates completing their supervised hours spend disproportionately more time on notes than experienced clinicians, precisely because they're still developing the skill. They're slower, less efficient, and the notes often cost the practice more in administrative time than any other clinician category.
The economic argument for giving AI notetakers to pre-licensed clinicians is strong. The training argument against it is also legitimate. Some practices are landing on a hybrid approach: requiring a minimum threshold of supervised hours (1,000 hours is a common benchmark) before granting AI notetaker access, or allowing pre-licensed clinicians to use the tool as a drafting aid that they're required to substantially review and edit, maintaining the learning process while reducing the time burden.
"We put a lot of thought into how we rolled this out, especially for our newer clinicians. The technology is helpful, but it doesn't replace the skill of learning to write a strong clinical note. We wanted to make sure we weren't shortcutting that development."
- Laura Slagle, Owner of Olive Leaf Family Therapy
→ How This Relates to Technology & AI in Mental Health Practices
AI notetakers are the most visible example of how AI tools are entering the clinical workflow, but they're part of a much larger technology landscape that practice owners need to navigate. The integration challenges, compliance considerations, and economic evaluation frameworks discussed here apply to every AI tool a practice considers adopting. This topic is part of our comprehensive Technology & AI Guide for Mental Health Practices, which provides practice owners with the full picture of what's changing and how to respond strategically.
key takeaways
1. AI Notetakers Work, But Treat Adoption as an Economic Decision, Not a Technology Impulse
Over 50% of group practices are already using AI notetakers in some capacity, and the experience has been overwhelmingly positive. But successful adoption requires a baseline measurement of current documentation costs, a phased rollout with a small group of clinicians, and a clear cost-benefit analysis before committing across the team. The ROI is usually substantial, but it only becomes visible if you measure it.
2. Default to Your EHR's Built-In Tool
If your EHR offers a native AI notetaker, start there. It eliminates the copy-paste workflow, reduces compliance overhead, and keeps documentation within a single system. Standalone tools have a role when your EHR doesn't offer a native option or when you need specialized functionality, but for most practices, the EHR-native path is simpler, more efficient, and lower risk.
3. Get Ahead of the Policy Questions Now
Pre-licensed clinician access, patient disclosure and consent, practice-level AI use policies, and session recording rules are all emerging issues that most practices haven't formally addressed. The practices that develop clear policies now, rather than reacting when a problem surfaces, will be better positioned and better protected.
Related Articles & Resources
To further your understanding of AI tools in clinical settings and the broader technology landscape for mental health practices, we've curated a selection of related articles and resources.
frequently asked questions
- Which AI notetaker should my practice choose?
Start with whatever is built into your EHR. If you're on SimplePractice, use Note Taker. If you're on TherapyNotes, explore TherapyFuel. If your EHR doesn't yet offer a native option, evaluate standalone tools designed specifically for behavioral health, such as Mentalyc, Blueprint, Upheal, and JotPsych. Ensure any standalone tool has a signed BAA and a clear data retention policy. Avoid using general-purpose transcription tools that weren't designed for clinical documentation unless you've thoroughly vetted their HIPAA compliance posture.
- How should we disclose AI notetaker use to patients?
Most AI notetaker vendors provide consent language and authorization forms that you can incorporate into your intake process. But don't rely solely on vendor-provided forms. Develop a practice-level AI use disclosure that explains what tools you use, how session data is processed, whether recordings are stored and for how long, and what patients can opt out of. Make this part of your standard informed consent process, not a separate surprise form. Transparency builds trust, and patients who understand what's happening are far more likely to be comfortable with it.
- Who is responsible for the accuracy of an AI-generated note?
The clinician. AI notetakers generate drafts; the clinician is responsible for reviewing, editing, and signing off on the final note. This is no different from reviewing a note written by a human scribe. The tool assists with documentation; the clinical and legal responsibility remains with the clinician. This is why the review step is non-negotiable, and why practices should build review expectations into their documentation policy regardless of whether AI is involved.