When AI Starts Doing the “New Grad” Work: What It Means for Us as Therapists

If you have been in practice long enough, you probably remember how your early years were shaped by the unglamorous parts of the job. Notes, reports, intake forms, scheduling messages, scoring, and endless documentation cleanup. It was exhausting, but it was also part of the apprenticeship. Writing things down forced us to clarify what we saw, what we thought it meant, and why we chose a particular next step.

Now artificial intelligence is stepping into that exact layer of clinical life across occupational therapy, speech therapy, psychology, and physiotherapy. Not in a dramatic "robots are replacing therapists" way, but in a practical, everyday way. AI can draft documentation, summarize long text, organize information, and generate first-pass templates. That changes what clinics measure, what managers expect, and what early-career clinicians feel pressured to deliver.

From our perspective as clinicians, the biggest shift is that AI is compressing parts of the learning curve. A new graduate can produce something that looks polished very quickly, and that polish can hide the fact that clinical reasoning is still developing. A confident-sounding paragraph is not the same as a sound formulation. A tidy plan is not the same as an individualized plan. When caseloads are heavy and supervision time is limited, it becomes easier to mistake speed for competence, and that is where risk quietly grows.

So the entry-level bar is moving. If routine tasks become faster, the expectation becomes that the clinician will contribute more of what cannot be automated. Stronger clinical reasoning shows up earlier. Ethical judgment stops being a once-a-year training topic and becomes a daily decision, especially around privacy, consent, bias, and what information should never be entered into a public tool. Digital literacy also becomes part of professionalism, not because we need to be tech experts, but because we need to understand enough to use tools responsibly and explain their limits.

We also think it helps to be very clear about what AI is good for in real practice. It can support preparation, structure, and efficiency. It can help us draft, brainstorm, and organize. But it cannot hold accountability. We still have to review every output as if we might be questioned on it. We still have to check sources when research is summarized. We still have to tailor every plan to the person in front of us, because generic recommendations can be subtly wrong for a client's context, culture, risks, and goals.

For supervisors, clinic owners, and senior clinicians, there is a parallel responsibility. If AI reduces documentation time, we should be intentional about where that time goes. It can go toward better reflection, clearer consent conversations, more collaborative care, and stronger follow-up. Or it can go toward squeezing in more sessions while calling it innovation. Only one of those choices protects client-centered care, and only one supports the development of safe clinicians.

In the end, we do not see AI as a replacement for therapy; it reshapes the role. The therapists who will thrive are those who think clearly and use tools carefully, while protecting what stays deeply human: relationship, pacing, and ethical judgment. That also means using AI responsibly: keep client information anonymized or de-identified whenever possible, and prefer local, offline systems (on-device or a secure internal server) over sending data to online tools. If a cloud-based tool is used, it should be only with explicit informed consent, strict limits on what is entered, and a transparent explanation of where data goes, who can access it, and how it is stored. If AI gives us anything worth keeping, it is time to do more of what only clinicians can do.
