Moody Abdul

Is Using AI for Therapy Notes Ethical?

Jun 2, 2025

Using AI for therapy notes is ethical, but only if it’s done responsibly.

As tools like ChatGPT, AI chatbots, and mental health apps become more popular in mental healthcare, therapists are asking more questions. On one hand, they want to get more out of AI tools. On the other hand, they’re unsure whether it’s completely secure and ethical.

More than 50% of mental health practitioners report burnout, with a significant driver being the administrative burden. This includes lengthy documentation and electronic health record (EHR) work.

In this guide, we’ll explore the ethical implications of using AI in psychotherapy note-taking:

  • How to stay compliant with HIPAA, PHIPA, and professional ethical codes

  • How to evaluate tools, especially HIPAA/PHIPA-compliant AI tools like Klarify

  • Best practices for transparent, ethical implementation

Key Ethical Considerations When Using AI for Therapy Notes

As mental health professionals explore the use of AI in therapy sessions, it's essential to address issues related to confidentiality, transparency, professional competence, and accuracy.

Confidentiality and Privacy Protection

In the U.S. and Canada, regulations like HIPAA and PHIPA require that mental health documentation be securely stored and protected.

Ethical practice means choosing AI platforms that are HIPAA- and PHIPA-compliant, guarantee no model training on session data, and store information securely in Canada or the U.S.

Transparency and Consent

Clients should be told clearly and early if AI tools are involved in documenting their therapy. This includes explaining what the tool does, how data is handled, and obtaining specific consent that gives clients the option to ask questions or decline. Transparent communication protects client autonomy and builds the trust necessary for ethical therapeutic relationships.

Need help getting started? Klarify has created a therapist-friendly AI consent form template to help you start these conversations with clarity and confidence.

Professional Competence and Human Oversight

AI can assist with writing notes, but it cannot replace a therapist’s clinical reasoning or professional judgment. Clinicians must review and revise AI-generated content to ensure that each note accurately reflects the session. Maintaining strong documentation skills and oversight ensures that AI supports rather than weakens clinical competence.

Accuracy and Clinical Integrity

When using AI to support therapy documentation, transcription accuracy is critical, especially in clinical contexts where misheard words can distort meaning. Speech-to-text AI technology has come a long way, but it’s not perfect. Background noise, accents, and technical or medical vocabulary can still lead to errors.

In therapy, a small transcription mistake, like confusing “mania” with “mania-like symptoms,” can affect diagnosis or treatment planning. That’s why therapists should use AI tools designed to handle clinical language and always review transcripts before finalizing notes (one simple way to flag uncertain words is sketched after the list below).

To reduce risks, professionals should:

  1. Never rely on AI-generated notes without personal review.

  2. Stay updated on known limitations and ensure oversight at every step.
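
To make that review step concrete, here is a minimal Python sketch of flagging low-confidence words in a speech-to-text transcript for clinician review. The transcript format and the 0.85 threshold are illustrative assumptions, not any particular tool’s output or API.

```python
# A minimal sketch: surface likely transcription errors for clinician review.
# The word/confidence structure is a hypothetical stand-in for whatever your
# speech-to-text tool actually returns.

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff; tune to your tool's scoring

def flag_for_review(transcript: list[dict]) -> list[str]:
    """Return words the clinician should double-check before finalizing."""
    return [
        f"'{item['word']}' (confidence {item['confidence']:.2f})"
        for item in transcript
        if item["confidence"] < CONFIDENCE_THRESHOLD
    ]

# Example: a misheard clinical term gets a low score and is flagged.
transcript = [
    {"word": "client", "confidence": 0.98},
    {"word": "reports", "confidence": 0.97},
    {"word": "mania", "confidence": 0.62},  # possibly "mania-like"
]
for flagged in flag_for_review(transcript):
    print("Review:", flagged)
```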

The Benefits of Ethical AI Implementation in Therapy Notes

When used responsibly, AI apps can provide meaningful support to therapists in terms of clinical care and professional well-being.

Here are some of the benefits when using AI for therapy documentation:

Reduced Administrative Burden and Burnout Prevention

A 2023 survey found that 55% of therapists cited administrative burden as a top cause of burnout. AI tools help reduce time spent on notes, freeing up energy for client care. In practice, AI-generated therapy notes have helped some mental health professionals reclaim 12-15 hours per month. That reclaimed time creates space for rest, supervision, and more sessions, which in turn helps prevent burnout.

Enhanced Therapeutic Presence

Therapists often experience anxiety around forgetting session details or being unable to capture them accurately—a form of stress that directly erodes therapeutic presence.

One therapist shared on the r/therapists subreddit:

“I feel anxiety around progress notes, which has resulted in me either avoiding and putting off notes, or writing unnecessarily long, detailed notes out of fear.”

AI-supported note-taking can act as a form of “emotional armor,” allowing therapists to remain fully present with clients without the distraction or stress of recalling and documenting everything after the session.

As one Klarify user put it, she felt “noticeably more present with clients,” simply because she wasn’t carrying the weight of unfinished notes.

Improved Documentation Consistency

When a mental health therapist manages large caseloads or works across multiple modalities, consistency in note-taking can become a challenge. Ethical AI tools trained specifically on psychotherapy frameworks (like SOAP, BIRP, EMDR, IFS, or CBT) help reinforce documentation standards.

These tools work like a second set of eyes, helping spot missing details, suggest structure, and keep notes complete. For therapists who struggle with documentation, this kind of support can offer both peace of mind and improved clinical continuity without compromising ethical standards.
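
As a purely illustrative example of that “second set of eyes,” here is a minimal Python sketch of a completeness check over a SOAP-structured note. The field names are assumptions for the example, not any tool’s actual schema.

```python
# A minimal sketch of spotting missing sections in a SOAP-structured note.
# Field names are illustrative only, not a real product's schema.

SOAP_SECTIONS = ["subjective", "objective", "assessment", "plan"]

def missing_sections(note: dict) -> list[str]:
    """List SOAP sections that are absent or empty in a draft note."""
    return [s for s in SOAP_SECTIONS if not note.get(s, "").strip()]

draft = {
    "subjective": "Client reports improved sleep this week.",
    "objective": "Affect congruent; engaged throughout the session.",
    "assessment": "",  # left blank, so it gets flagged for the clinician
    "plan": "Continue weekly CBT; review sleep log next session.",
}
print("Incomplete sections:", missing_sections(draft))  # ['assessment']
```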

Comparing AI Therapy Note Tools

Now that we’ve covered the ethical foundations and potential benefits, the next step is choosing the right AI tool. Not all AI note-taking systems are designed with therapists (or ethics) in mind. Some are built for general medical documentation, while others raise concerns around session recording, data storage, or a lack of clinical specificity.

Here’s how various categories of AI therapy tools compare, especially when evaluating them through the lens of PHIPA and HIPAA compliance, client trust, and clinical usefulness.

Generic Tools vs. Therapy-Specific Solutions

Many therapists explore general tools like ChatGPT or features built into electronic health record systems. But these weren’t designed for psychotherapy, and general-purpose chatbots like ChatGPT aren’t HIPAA or PHIPA compliant.

By contrast, tools specifically built for therapy offer structured support for modalities like SOAP, BIRP, EMDR, or IFS, helping therapists meet documentation standards more consistently. They also often include features like note templates, treatment plan support, and clinical language parsing, things general-purpose AI tools simply weren’t designed to handle.

Recording vs. Non-Recording AI Tools

Recording sessions with AI tools can raise ethical and legal questions, especially if full video or audio files are stored long-term or processed on external servers. Some tools like Klarify take a privacy-first approach by recording audio-only, and even then, only temporarily. All recordings are deleted immediately after processing.

For therapists who prefer not to record at all, Klarify also supports dictated session summaries. This gives you full control over what’s documented while still benefiting from AI-generated notes. Always review a tool’s recording policy to ensure it aligns with your clinical and ethical standards.
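
To make “deleted immediately after processing” concrete, here is a generic Python sketch of a delete-after-processing flow. It illustrates the concept only and is not Klarify’s implementation; transcribe() is a hypothetical placeholder for a compliant speech-to-text call.

```python
# A generic illustration of a "delete after processing" flow. Not any
# vendor's actual implementation.
import os
import tempfile

def transcribe(audio_path: str) -> str:
    """Hypothetical placeholder for a HIPAA/PHIPA-compliant transcription call."""
    return "transcript text used to draft the session note"

def draft_note_from_audio(audio_bytes: bytes) -> str:
    # Hold the audio only as long as transcription takes, then delete it,
    # so no recording persists once the draft note exists.
    tmp = tempfile.NamedTemporaryFile(suffix=".wav", delete=False)
    try:
        tmp.write(audio_bytes)
        tmp.close()
        return transcribe(tmp.name)
    finally:
        os.remove(tmp.name)  # audio is gone the moment processing ends

print(draft_note_from_audio(b"fake audio bytes"))
```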

Canadian vs. US-Based Solutions

For therapists in Canada, using a U.S.-based tool that only meets HIPAA standards may not be enough. Tools must comply with PHIPA, PIPEDA, and applicable provincial laws.

For example, Klarify is hosted in Canada and is fully compliant across both Canada and the U.S., including HIPAA, PHIPA, and PIPEDA. This makes it a suitable choice for therapists who prioritize local data residency and legal protection.

If you're looking for a privacy-first, therapist-designed solution, try Klarify for free and experience the difference firsthand.

Red Flags: When AI Therapy Notes Cross Ethical Lines

Using AI therapy tools requires vigilance. Even well-intentioned use can slip into unethical territory. Here are the major warning signs to watch for and how to avoid them.

Inadequate Training and Preparation

Red flag: Using AI tools with live clients before understanding their risks or limitations.

For example, a therapist in private practice began using a new AI note-taker without realizing it stored session transcripts in the cloud without encryption. During a supervision meeting, she discovered her documentation practices could violate PHIPA guidelines, putting her license and client confidentiality at risk.

What to do:

  • Test AI tools in low-stakes environments first.

  • Review vendor privacy documentation and verify HIPAA/PHIPA compliance, including encryption of any stored data (see the sketch below).

  • Make AI part of your practice management review, not an ad hoc experiment.
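
For context on the encryption issue in the example above, here is a minimal sketch of what encrypting a transcript at rest looks like, using Python’s cryptography library (pip install cryptography). It is a conceptual illustration, not a vendor implementation; a real system would load the key from secure key management, never generate it inline.

```python
# A conceptual sketch of encryption at rest for a transcript, using the
# "cryptography" library. In production, keys come from a secure key
# manager; generating one inline is for illustration only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # illustration only; never hard-code keys
cipher = Fernet(key)

transcript = "Client reports improved sleep this week.".encode("utf-8")
stored_blob = cipher.encrypt(transcript)  # this ciphertext is safe to persist
print(cipher.decrypt(stored_blob).decode("utf-8"))  # readable only with the key
```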

Over-Reliance and Skill Erosion

Red flag: Letting AI generate notes without review.

AI can support documentation tasks. However, relying too heavily on AI-generated content over time may cause therapists to disengage from clinical reflection, weakening both documentation quality and ethical judgment.

What to do:

  • Always review and edit AI-generated therapy notes.

  • Treat the documentation process as a form of self-supervision and therapeutic integration.

  • Stay active in continuing education around AI applications in therapy.

Client Deception or Insufficient Disclosure

Red flag: Not telling clients about your use of artificial intelligence tools.

Transparency is a cornerstone of ethical practice. Failing to disclose that an AI system is involved in documentation, even if it’s secure, can damage trust and violate ethical codes.

What to do:

  • Offer a clear, plain-language explanation during intake or informed consent.

  • Example: “I use an AI tool that helps me draft my notes after sessions. It doesn’t store your data or record audio. You’re welcome to ask questions or opt out.”

  • Document client understanding and agreement.

Best Practices for Ethical AI Therapy Note Implementation

Once you’ve identified the risks and red flags, the next step is ensuring that any AI assistant you adopt fits your practice ethically, legally, and clinically. These best practices will help you choose the right tool, communicate with clients, and maintain high standards over time.

Evaluate Tools Before You Use Them

Don’t rely on marketing alone. Look under the hood of any AI therapy note app and ask:

  • Does it meet your country’s privacy regulations (HIPAA or PHIPA)?

  • Is there a signed BAA available?

  • Where is data stored, and is it erased after notes are generated?

  • Has the system been shaped by mental health professionals or tested in clinical settings?

Talk to Clients Clearly and Early

Even if an AI system doesn’t store personal data, you still need to let clients know you’re using it. This is especially important when addressing sensitive mental health concerns, where transparency builds trust and safeguards the therapeutic relationship. Avoid vague language in your intake forms. Terms like "technology-assisted notes" or "digital tools may be used" don’t give clients a clear picture.

Instead, use plain, specific language like:

“I use an AI note assistant that helps me summarize sessions. It doesn’t record video or store your data. You’re welcome to ask questions or opt out.”

Clear communication ensures clients understand what’s involved and supports informed consent.

Keep Your Clinical Standards High

Using AI doesn't mean you stop being a clinician. Your role is still to ensure each note is accurate, professional, and clinically sound. That means:

  • Regularly reviewing your AI-assisted documentation.

  • Staying informed about changes to the tool.

  • Continuing education on AI in psychotherapy.

  • Using note-taking as an opportunity for clinical reflection, not just an administrative task.

By keeping human oversight central, you can integrate AI into your practice without compromising your responsibility or therapeutic presence.

TL;DR

AI is quickly becoming a practical part of clinical workflows, including how therapists document sessions. And while AI note-taking tools can reduce burnout, improve consistency, and support therapist well-being, they also demand thoughtful implementation.

So, is using AI for therapy notes ethical?

Yes, but only when therapists stay in the driver’s seat.

Ethical AI use requires more than compliance with HIPAA or PHIPA. It involves transparency with clients, continuous oversight, and a commitment to preserving clinical judgment. The use of AI in therapy must always center the client’s safety, confidentiality, and trust.

If you're exploring AI-powered tools for your practice, start small. Test features, read the fine print, and have open conversations with your clients. Most importantly, remember that no tool replaces the wisdom, empathy, or presence of a human therapist.

Looking for a privacy-first, therapist-designed AI note assistant? Try Klarify today and see how it can support your practice ethically.

FAQs

Can I use ChatGPT to write therapy notes?

ChatGPT is not HIPAA or PHIPA compliant, so it’s not recommended for clinical documentation. Specialized tools built for psychotherapy are safer and more appropriate.

Is AI note-taking HIPAA compliant?

Only some AI tools meet HIPAA requirements. Klarify, for example, is a HIPAA- and PHIPA-compliant AI note assistant built specifically for therapy. Always check for a signed BAA and secure data handling.

What happens if AI creates incorrect information in my therapy notes?

Therapists are still responsible for reviewing and correcting all AI content. Inaccurate notes can compromise care and violate ethical standards, so human oversight is essential.

Do I need to tell my clients I'm using AI for notes?

Yes. Informed consent is required. Clients have the right to know how their session information is used, especially when involving AI tools in psychotherapy documentation.

Are Canadian therapists allowed to use US-based AI tools?

Yes, but doing so may raise PHIPA compliance concerns if data is stored outside Canada. Tools like Klarify, which are hosted in Canada, help maintain data residency and provide therapists with peace of mind.
