
AI in Therapy: Cognitive and Clinical Impacts for Speech, Occupational, Physical, Psychomotor Therapists, and Psychologists

Introduction: AI’s Expanding Role in Therapy

Artificial intelligence (AI), especially large language models (LLMs) such as ChatGPT, is rapidly reshaping the landscape of healthcare and therapy. From generating therapy materials and automating documentation to providing real-time feedback and supporting client communication, AI promises greater efficiency, personalization, and accessibility for practitioners across speech therapy, occupational therapy, physical therapy, psychology, and psychomotor therapy. However, as AI becomes more embedded in daily practice, emerging research urges therapists to consider not just the practical benefits but also the cognitive and clinical implications for both therapists and clients (Kosmyna et al., 2024).

Cognitive Engagement: What Happens When We Use AI?

Recent experimental research has shown that the way therapists and clients interact with AI tools can significantly affect cognitive engagement and learning outcomes. In a study by Kosmyna et al. (2024), participants wrote essays using either only their own knowledge, a traditional search engine, or an LLM such as ChatGPT. EEG recordings measured brain activity, and participants were interviewed about memory, ownership, and satisfaction with their work. Those who relied on LLMs exhibited the weakest neural connectivity, particularly in brain regions involved in memory, attention, and deep processing. By contrast, participants who worked from their own knowledge alone showed the strongest, most widespread brain activity, while search-engine users fell in between. This suggests that LLMs, while effective at reducing immediate cognitive load and making tasks feel easier, may also encourage more passive engagement and shallower processing of information (Kosmyna et al., 2024; Sweller, 2011).

Moreover, LLM users reported lower ownership of their work and struggled to recall or quote from their essays compared with those who used search engines or worked unaided. This impaired memory and reduced sense of authorship may have important implications for therapy, where engagement, self-reflection, and memory are central to progress and learning (Kosmyna et al., 2024).

Clinical Implications for Therapy Disciplines

For Speech and Language Therapists: AI can generate prompts, exercises, and language models for clients, but over-reliance on these tools may reduce clients’ active participation and expressive language development. The process of generating one’s own ideas and sentences is crucial for language acquisition and memory formation (Kosmyna et al., 2024; Yang et al., 2024).

For Occupational and Physical Therapists: AI is increasingly used in physical therapy for movement analysis, remote monitoring, and personalized exercise planning. Wearable sensors and AI-driven platforms can track gait, range of motion, and exercise adherence, providing real-time feedback and automating progress documentation. However, optimal motor learning and transfer to daily life require clients to be actively involved in planning, reflection, and problem-solving. Passively following AI-generated routines may not engage the cognitive and motor systems as robustly as therapist-guided or self-directed activities (Sweller, 2011). For example, a PT might use AI to suggest a progression of exercises, but the best outcomes occur when clients set goals, reflect on their progress, and adapt routines in collaboration with their therapist.

For Psychologists and Psychomotor Therapists: AI tools can assist with psychoeducation, cognitive-behavioral interventions, and emotional support. However, therapists must be vigilant about “cognitive offloading”—the tendency to let AI do the thinking, which can diminish clients’ critical thinking, emotional processing, and self-reflection (Kosmyna et al., 2024; Yang et al., 2024).

For All Disciplines: AI-generated documentation and treatment plans can save time, but therapists may feel less connected to these records and may struggle to recall details later. This can impact continuity of care, clinical judgment, and professional satisfaction. Furthermore, the homogenization of AI-generated content risks undermining the creativity and individualized care that are hallmarks of effective therapy (Kosmyna et al., 2024; Niloy et al., 2024).

Balancing Benefits and Cognitive Risks

AI tools offer clear advantages: they reduce extraneous cognitive load, streamline information retrieval, and can increase productivity (Kosmyna et al., 2024; Sweller, 2011). For PTs, this means more efficient data collection, progress tracking, and even predictive analytics for injury risk or recovery. However, these benefits come with trade-offs. Lower cognitive effort may lead to less deep engagement, weaker memory encoding, and reduced development of problem-solving skills. Studies in educational settings have found that students using AI for writing or programming tasks perform worse on measures of long-term learning, self-efficacy, and creative thinking than those using traditional methods (Yang et al., 2024; Niloy et al., 2024). Moreover, the tendency of AI-generated outputs to resemble one another—less diverse in language and thought—may limit the range of perspectives and approaches explored in therapy. This is especially concerning in fields that value creativity, individualized care, and a holistic understanding of clients (Kosmyna et al., 2024).

Practical Recommendations for Therapists

Conclusion: Navigating the AI Era in Therapy

AI is a powerful tool for therapists, including PTs, but it is not a replacement for the human mind or the therapeutic relationship. The latest research demonstrates that while AI can make tasks easier and more efficient, it may also reduce cognitive engagement, memory, and creativity if overused or used uncritically. Therapists across all disciplines must strive for a thoughtful balance—leveraging AI’s strengths while actively protecting the cognitive, creative, and relational skills that define effective therapy. By doing so, both therapists and clients can continue to grow, learn, and thrive in an increasingly AI-augmented world.

References

Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2024). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. MIT Media Lab.
Niloy, S., Rahman, M., & Sultana, S. (2024). Effects of ChatGPT on creative writing skills among college students. Journal of Educational Technology Development and Exchange, 17(1), 15–28.
Sweller, J. (2011). Cognitive load theory. Psychology of Learning and Motivation, 55, 37–76.
Yang, S., Li, J., & Chen, X. (2024). The impact of ChatGPT on student learning: Evidence from a programming course. Computers & Education, 205, 104889.


OpenAI’s Latest ChatGPT Updates: A Leap Forward for Therapists, Educators, and Allied Health Professionals

The world of artificial intelligence (AI) continues to evolve at a breathtaking pace, and OpenAI’s recent update to ChatGPT is a prime example. Announced just days ago via the @chatgptricks Instagram page, these changes bring ChatGPT closer to being an everyday assistant for professionals across multiple fields—including speech-language pathologists (SLPs), occupational therapists (OTs), physical therapists (PTs), psychologists, psychomotor therapists, and special educators. This article explores the key updates and demonstrates how each professional group can apply them to enhance therapy planning, communication, documentation, and client engagement.

1. Image Generation in WhatsApp (via 1-800-ChatGPT)

OpenAI now enables image generation directly within WhatsApp. By messaging 1-800-ChatGPT, professionals can create high-quality AI-generated visuals on the go, allowing therapists and educators to produce visual supports the moment they are needed. This seamless integration with WhatsApp is ideal for communicating with families and aides in real time. As Khanna et al. (2023) note, integrating AI into familiar platforms reduces tech resistance and promotes real-world utility.

2. Custom GPTs on All Models: Personalized AI for Every Budget

Custom GPTs—essentially specialized AI assistants—can now run on any ChatGPT model, including the faster and more affordable GPT-4o mini. By enabling personalization across all models, OpenAI is empowering practitioners to innovate without needing premium access (Park & Prabhakaran, 2024).

3. Projects Update: Deep Research, Memory, and Voice

ChatGPT’s new Projects feature is perhaps the most powerful update for therapists and educators, combining deep research, persistent memory, and voice input. Imagine an SLP who uploads weekly session notes and asks ChatGPT to summarize progress or generate session reports, or a psychologist using voice input to log observations after a session while driving. According to Xie et al. (2023), memory-enhanced AI significantly improves continuity, reducing time spent on repetitive tasks and increasing focus on clinical reasoning.

Applications Across Professions

SLPs: Visual cards, home carryover plans, social scripts, and dynamic assessments
OTs: Visual schedules, motor planning boards, sensory diet suggestions, and adaptive tasks
PTs: Exercise demonstrations, movement sequences, and balance/stability progression visuals
Psychologists: Therapeutic visuals, journaling prompts, emotion regulation tools, and CBT reflections
Psychomotor Therapists: Body scheme illustrations, bilateral coordination games, and session recaps
Special Educators: Differentiated content creation, reading supports, visual timetables, and gamified learning

These updates make AI more practical, inclusive, and therapeutic—not just “smart.” The ability to store information across sessions brings ChatGPT closer to functioning like a case manager or clinical assistant (Mitchell et al., 2023).

Ethical Use in Clinical and Educational Settings

The ethical integration of AI remains paramount. Practitioners must ensure data privacy, informed consent, and age-appropriate content. As noted by the European Commission (2022), AI systems must be transparent, trustworthy, and non-biased. Custom GPTs used in therapeutic contexts should be evaluated for safety and accuracy, especially when interacting with children or individuals with cognitive challenges. Monitoring and review are essential to avoid over-reliance or misinformation.

Conclusion: A Future of Partnership

OpenAI’s latest ChatGPT updates signal a move toward intelligent collaboration, not automation. These tools are designed to amplify the expertise of professionals—supporting more personalized, efficient, and innovative service delivery. Whether you’re crafting social-emotional learning tools as a psychologist, adapting motor tasks as a PT, or preparing visuals for an AAC user, these new capabilities bring real-time support to your fingertips. The question is no longer if AI belongs in therapy and education, but how we can best harness it to meet the needs of diverse learners and clients.

References

European Commission. (2022). Ethics guidelines for trustworthy AI. Publications Office of the European Union. https://digital-strategy.ec.europa.eu/en/library/ethics-guidelines-trustworthy-ai
Khanna, R., D’Mello, S., & Caine, K. (2023). Human-AI collaboration in education: A meta-analysis of benefits and barriers. Computers & Education: Artificial Intelligence, 4, 100143. https://doi.org/10.1016/j.caeai.2023.100143
Mitchell, M., Wu, S., Zaldivar, A., Barnes, P., Vasserman, L., Hutchinson, B., … & Gebru, T. (2023). Model cards for model reporting. Communications of the ACM, 66(1), 56–65. https://doi.org/10.1145/3522499
Park, A., & Prabhakaran, V. (2024). Tailoring lightweight language models for equity and access. Journal of Artificial Intelligence Research, 76, 321–342. https://doi.org/10.1613/jair.1.13933
Xie, M., Yu, Y., & Yin, Z. (2023). Memory-augmented dialogue systems: A survey. Journal of Computational Linguistics, 49(2), 221–248. https://doi.org/10.1162/coli_a_00480


Typing, Texting, and Aphasia: Rethinking Writing in the Digital Age

The shift toward digital communication has reshaped how we think about writing—not only for the general population but also for individuals with aphasia (PwA). Traditional writing practices such as handwriting are increasingly supplemented or replaced by texting, typing, and other digital modalities. For interdisciplinary therapists supporting clients with neurological conditions, this shift demands a broader clinical lens. Digital writing is no longer optional—it is a lifeline for communication, independence, and social participation. Yet the impact of these tools varies across individuals and must be assessed with nuance and inclusivity.

Why Digital Writing Matters in Aphasia Care

Texting and typing have become dominant forms of everyday communication. People send appointment requests, update caregivers, and maintain friendships through written digital messages. For individuals with aphasia, difficulty with writing—already a major barrier—can be compounded by the challenges of navigating modern platforms and interfaces. As Dietz et al. (2011) noted, integrating technology into therapy offers new ways to support language recovery and social engagement. Today’s norms also accept spelling shortcuts, emojis, and even incomplete phrases. This more flexible standard can reduce pressure and expand opportunities for communicative success in therapy.

Typing vs. Handwriting: Same Goals, New Mediums

Recent studies suggest that at the group level, PwA show comparable performance in typing and handwriting regarding correct information units and utterances (Obermeyer et al., 2024). However, individual differences are important. Typing often results in slower output, more deletions, and higher spelling error rates. These effects can vary based on lesion site, motor planning issues, and familiarity with digital interfaces (Lee et al., 2024). This insight is critical for therapists choosing the most effective and functional writing modality for their clients.

Applications for Speech-Language Pathologists

SLPs can integrate texting and typing into language therapy to target functional writing goals. AI tools like predictive text and autocorrect can scaffold word retrieval. DAAWN (Menger et al., 2021) allows therapists to assess typing skills through structured tasks and receive automatic reports on output quality. Additionally, using the Texting Transactional Success Scale (Lee & Cherney, 2022) lets SLPs evaluate whether clients can successfully complete communication exchanges using messages. This is ideal for social communication goals and generalization into everyday contexts.

Applications for Occupational Therapists

For OTs, keyboarding and touchscreen typing are functional ADLs. Fine motor planning, sensory regulation, and visual-spatial processing are all part of successful digital writing. The Technology Survey (Kinsey et al., 2022) is a helpful tool to determine clients’ device usage, confidence, and motivation. OTs can incorporate texting tasks into broader technology use goals, such as accessing medical portals or coordinating with caregivers. Interventions might include adapting stylus grips, adjusting typing settings, or exploring voice-to-text as compensatory strategies.

Relevance for Physical Therapists

While digital writing might seem outside a PT’s scope, the motor aspects of typing and digital device use are crucial for clients with upper limb motor impairments or coordination difficulties. PTs can collaborate with OTs and SLPs to improve postural support, arm control, and endurance for sustained writing or typing tasks. Moreover, communication tools like texting can support PT-led home programs. For example, patients with aphasia may benefit from receiving simplified text instructions, images, or reminders in therapy-friendly formats.

Psychomotor Therapists and Cognitive-Motor Integration

Psychomotor therapists support the integration of body movement with cognitive and emotional processing. Typing and texting tasks offer rich opportunities to address cognitive-motor planning, visual tracking, bilateral coordination, and timing—all within a communication-focused framework. Integrating free text tasks, emoji use, or sequencing writing prompts into psychomotor sessions can build self-regulation and executive function while reinforcing meaningful communication.

Supporting Special Educators and Assistive Communication

Special educators working with students with acquired brain injury or developmental language disorder also benefit from these tools. Platforms like DAAWN and texting scale assessments allow for personalized, adaptive written expression tasks. AI tools can be used to co-create stories, label images, or send simulated text messages to characters—turning literacy tasks into interactive, communicative learning experiences. For older students or transition-aged youth, these tools foster digital literacy, independence, and social-emotional expression.

Conclusion: Beyond Traditional Writing

As Thiel and Conroy (2022) argue, writing remains deeply personal and powerful for people with aphasia. It’s not just a skill—it’s a means of reconnecting with the world. By integrating texting and typing into therapy, we make communication more real, more modern, and more relevant. For therapists across disciplines, this means adapting goals, tools, and expectations to reflect the way people live, learn, and communicate today. Writing is no longer just about paper and pen—it’s about access, agency, and connection.

References

Dietz, A., Ball, A., & Griffith, J. (2011). Reading and writing with aphasia in the 21st century: Technological applications of supported reading comprehension and written expression. Topics in Stroke Rehabilitation, 18(6), 758–769. https://doi.org/10.1310/tsr1806-758
Kinsey, L. E., Lee, J. B., Larkin, E. M., & Cherney, L. R. (2022). Texting behaviors of individuals with chronic aphasia: A descriptive study. American Journal of Speech-Language Pathology, 31(1), 99–112. https://doi.org/10.1044/2021_AJSLP-20-00287
Lee, J. B., & Cherney, L. R. (2022). Transactional success in the texting of individuals with aphasia. American Journal of Speech-Language Pathology, 31(5S), 2348–2365. https://doi.org/10.1044/2022_AJSLP-21-00291
Lee, J. B., Kinsey, L. E., & Cherney, L. R. (2024). Typing versus handwriting: A preliminary investigation of modality effects in the writing output of people with aphasia. American Journal of Speech-Language Pathology, 33(6S), 3422–3430.
Menger, F., Forshaw, M., Morris, J., & Osselton, R. (2021). Digitised Assessment for Aphasia of WritiNg. https://daawn.ncldata.dev
Menger, F., Morris, J., & Salis, C. (2016). Aphasia in an internet age: Wider perspectives on digital inclusion. Aphasiology, 30(2–3), 112–132. https://doi.org/10.1080/02687038.2015.1100702
Obermeyer, J., Edmonds, L., & Morgan, J. (2024). Handwritten and typed discourse in people with aphasia: Reference data for sequential picture description and comparison of performance across modality. American Journal of Speech-Language Pathology, 33(6S), 3170–3185.
Thiel, L., & Conroy, P. (2022). “I think writing is everything”: An exploration of the writing experiences of people with aphasia. International Journal of Language & Communication Disorders, 57(6), 1381–1398.


ChatGPT’s Latest Updates: Email Integration, Smarter Tools, and What It Means for Therapy Professions

ChatGPT has evolved significantly in recent months. From a powerful language assistant, it’s now becoming a fully integrated tool for productivity and clinical support. The latest releases—including robust email link capabilities, enhanced voice interactions, and improved reasoning models—offer real benefits to professionals in speech-language pathology (SLP), occupational therapy (OT), and psychology.

1. Email Linking & Productivity Plugins

One of the most impactful updates is the ability to link ChatGPT with your email and other applications using tools like Zapier and new productivity integrations. Through ChatGPT’s email plugins, you can draft, summarize, and send messages without leaving the chat. This streamlines administrative workflows—an important gain for SLPs, OTs, and psychologists who often juggle paperwork, session planning, and caregiver communication. According to Sendboard, tools like Zapier plug-ins enable automation directly within ChatGPT, without requiring coding skills.

2. Advanced Reasoning with o3-pro

Released in June 2025, o3-pro enhances ChatGPT’s reasoning power. It’s stronger than earlier models and can handle complex multi-step tasks such as generating therapy plans, interpreting assessment data, or crafting detailed psychoeducational handouts. For therapists, that means greater reliability and precision when using ChatGPT for clinical writing and planning. While responses may take slightly longer, the improved reliability in domains like science, coding, and detailed planning is well worth it.

3. Enhanced Voice Mode

Recent updates to Advanced Voice Mode have made ChatGPT’s spoken responses more natural and expressive—complete with nuanced intonation and empathetic tones. It also supports live translation between languages, expanding accessibility for non-native speakers. This opens up exciting opportunities for clinicians: the voice interactions make ChatGPT feel more conversational and human—ideal for therapeutic training, client role play, or voice modeling.

4. Smarter Projects & Context Management

ChatGPT’s Projects feature, updated in June, now supports “deep research,” file uploads, and conversation sharing. This allows clinicians to maintain context over long-term work—such as ongoing case notes, client resources, and clinical research—in a single workspace. For therapists, this supports richer case continuity and collaborative workflows.

How Therapists Are Using ChatGPT in Smarter Ways

Beyond emails and worksheets, ChatGPT’s newest updates—like memory, Projects, and advanced voice—unlock more dynamic clinical uses. Here are five smart ways SLPs, OTs, and psychologists are using AI in their practice:

1. Realistic Simulation for Supervision. Use ChatGPT to role-play challenging scenarios—like a child resisting transitions or a parent coping with difficult news. Great for training interns in safe, reflective ways.

2. Culturally Adapted Therapy Content. Quickly adapt stories, scripts, or visuals to reflect your client’s culture, language, or religion—boosting engagement and inclusion, especially for multilingual or neurodivergent clients.

3. Internal Script Coaching. Help clients build better self-talk for anxiety, transitions, or emotional regulation. Save and reuse personalized coping scripts across sessions for reinforcement.

4. Environment & Behavior Pattern Analysis. Upload photos or logs and ask ChatGPT to flag sensory challenges, behavior triggers, or inconsistencies across therapy settings—combining your insights with scalable pattern support.

5. Assistive Tech Guides for Families. Break down AAC options or executive function tools into family-friendly, visual guides. This empowers parents and teens to make informed, confident choices.


From Trend to Therapy: What the Latest AI Innovations on Instagram Mean for SLPs, OTs, and Psychologists

Artificial intelligence (AI) is dominating social media platforms like Instagram, where rapid-fire reels, demos, and viral posts showcase exciting new AI-powered tools and breakthroughs daily. For therapists—speech-language pathologists (SLPs), occupational therapists (OTs), and psychologists—these trends offer a glimpse into the future of clinical practice. But with the constant flood of AI hype, how do you separate fleeting fads from game-changing innovations? More importantly, how can you leverage these latest AI trends to enhance your therapeutic impact ethically and effectively? This article unpacks the most viral AI updates currently buzzing on Instagram, translating them into practical insights for therapy professionals.

1. AI-Powered Video and Movement Analysis: Next-Level Remote Assessment

One of the most exciting AI trends gaining traction on Instagram is real-time video analysis tools that track facial expressions, speech articulation, and fine motor movements. These tools use machine learning to provide detailed, objective data on client performance, even from recorded or live video sessions. For SLPs and OTs, this means more precise assessment and progress monitoring without needing in-person observation every time. Imagine AI algorithms flagging subtle changes in a client’s articulation or motor coordination patterns that might otherwise go unnoticed. While these platforms are still emerging and need clinical validation, they represent a promising adjunct for teletherapy and hybrid models—especially valuable given the increase in remote service delivery post-pandemic.

2. Chatbots and Virtual Assistants That Understand Emotion and Context

Instagram reels are buzzing with AI chatbots demonstrating near-human conversational flow and emotional sensitivity. Advanced natural language processing (NLP) models now recognize not just words but the tone and emotional intent behind them. For psychologists and SLPs working on social skills, motivation, or emotional regulation, these AI-driven conversational agents can serve as engaging, low-stakes practice partners outside sessions. They offer clients opportunities to rehearse challenging conversations or receive immediate, non-judgmental feedback on communication attempts. However, it’s critical to view these tools as supplementary aids rather than replacements for human connection and clinical judgment.

3. Creative Content Generation: Saving Time, Boosting Engagement

Social media trends highlight AI tools that instantly generate text, visuals, and even audio content tailored to user needs. Therapists can harness this to create culturally adapted stories, visual schedules, relaxation scripts, or interactive worksheets in minutes. For example, AI can transform a generic narrative retell task into a culturally relevant story for diverse client populations or produce calming mindfulness scripts aligned with a client’s language and spiritual background. By automating these time-consuming tasks, therapists free up more time to focus on individualized clinical work and relationship building.

4. Ethical Considerations: What Social Media Often Misses

While Instagram showcases the shiny potential of AI, it rarely highlights the ethical complexities. Many trending AI tools operate with limited transparency on data privacy, algorithmic bias, and inclusivity. Therapists must critically evaluate new AI products before adopting them. Navigating these concerns ensures AI enhances care without inadvertently perpetuating inequities or privacy violations.

5. Staying Updated Without Overwhelm: Curate Your AI Learning

With AI news flooding social media daily, it’s easy to feel overwhelmed or distracted by hype. An intentional, curated approach to AI learning helps therapists keep pace with innovation without burnout or misinformation.

Conclusion

Instagram’s fast-moving AI trends reveal a dynamic future where therapy and technology increasingly intersect. By understanding and critically engaging with these viral updates, SLPs, OTs, and psychologists can adopt AI tools that truly enhance client care—balancing innovation with ethics and empathy. As AI evolves, therapists who blend technical curiosity with clinical wisdom will be best positioned to harness its full potential for meaningful, inclusive, and effective therapy.


From Policy to Practice: How Therapy Clinics Across Europe Are Implementing the EU AI Act

The EU Artificial Intelligence Act (Regulation 2024/1689), passed in June 2024, is already reshaping how AI is used in therapy. While the legislation outlines clear rules for transparency, risk classification, data governance, and human oversight, the real test lies in how clinics across Europe are putting these principles into action. Clinicians—especially those working in neurodevelopmental, rehabilitative, and mental health services—are asking a vital question: How do we comply with new AI regulations while preserving the human-centered nature of therapy? Translating Regulation Into Clinical Reality Across Europe, clinics are moving from high-level compliance to day-to-day operational changes in how AI tools are selected, monitored, and disclosed. In Belgium, for instance, a pediatric neurorehabilitation center has adopted a formal internal review process. Before any AI-assisted tool is used with children, teams assess the system’s training data, analyze its outcomes across diverse populations, and require therapists to demonstrate understanding of the AI model’s functionality and limits. These steps go beyond mere legal checklists. Under the AI Act, many digital therapy tools—including those used for speech analysis, attention monitoring, or adaptive content delivery—fall into the “high-risk” category. This classification requires clinics to apply standards of robustness, explainability, and human oversight (European Union, 2024). As a result, some clinics now treat AI tools like they would Class II medical devices: requiring structured evaluation, documentation, and clinician sign-off before use. Training the Clinician, Not Just the Tool In Denmark, a national therapy center network has launched mandatory AI ethics workshops. These don’t aim to turn therapists into data scientists but to equip them with foundational AI literacy. 
Therapists learn to ask critical questions: This emphasis on reflective practice aligns with WHO recommendations (2024), which stress that clinicians—not algorithms—must remain the final decision-makers. AI can suggest, but it cannot interpret. A fluency tracker may flag increased pause time, but it’s the therapist who determines whether the change reflects anxiety, illness, or simply a noisy environment. Training also now includes simulated case studies. For instance, therapists might explore how two similar speech samples receive different AI scores and must trace the model’s reasoning—a process that builds their confidence in evaluating AI reliability and limitations (Schubert et al., 2025). Embedding Transparency Into Client Care Clinics in the Netherlands, France, and Germany are leading on transparency. Informed consent now includes plain-language disclosures when AI tools are involved. Families are told if AI contributes to scoring, tailoring interventions, or flagging areas of concern. This kind of transparency, especially in pediatric and disability services, builds trust and satisfies Article 52 of the Act: patients have the right to know when AI is influencing care (European Union, 2024). Some platforms are going further: In the Netherlands, a widely used SLP support app now includes pop-up explanations showing how progress scores are generated and interpreted. This allows families to discuss uncertainties with therapists and contribute to decisions, rather than passively accepting algorithmic output. Addressing Access and Digital Equity While AI tools can optimize therapy, they may also widen the digital divide. In response, clinics in Poland, Slovakia, and Hungary are piloting hybrid care models—combining traditional therapy with AI-supported modules that require minimal hardware or bandwidth. These systems use offline-first design, printable practice modules, or text-based feedback to serve rural or low-resource areas. 
Furthermore, multilingual and cross-cultural validity is becoming a focus. As Pérez and Cheung (2025) point out, many existing AI tools have poor generalization beyond English or neurotypical data. SLPs and OTs across Europe are beginning to collaborate with developers to improve training datasets, ensuring tools work equitably across languages, dialects, and developmental profiles. Accountability and Oversight in Action Traceability—a core principle of the EU AI Act—is being operationalized via updated clinical documentation. In a multidisciplinary clinic in Munich, therapists now log every AI-assisted decision, whether in assessment, goal-setting, or therapy delivery. This includes: These records serve not only legal protection but also longitudinal quality review. For instance, if an AI tool routinely flags phonological issues in bilingual children where clinicians find none, the system may require retraining or discontinuation. As Topol (2024) emphasizes, human oversight is not a safeguard—it’s a necessity. Emerging Lessons from Early Implementation From these clinic-led efforts, several themes are emerging: AI as a Partner, Not a Replacement The EU AI Act is more than a regulatory hurdle—it’s a catalyst for ethical, inclusive innovation. By mandating transparency, clinician oversight, and data accountability, it challenges the therapy field to move slowly and wisely, even amid rapid technological change. European clinicians are not just adapting to AI—they are shaping it. By speaking up about equity gaps, demanding better training data, and insisting on tools that reflect clinical nuance, therapists are reclaiming their role as co-creators, not passive users. The future of AI in therapy will not be about automation—it will be about augmentation, grounded in clinical judgment and compassionate care. Coming Next How clinics are designing therapist-led systems to evaluate and audit AI tools—without overwhelming paperwork or technical complexity. 
To find out more, join our AI webinars for therapists! Go to Courses for details.


From Theory to Therapy: Clinicians Are Already Using AI—Without Losing the Human Touch

AI in Clinical Care: Already Here, Already Changing Practice

Artificial Intelligence is no longer theoretical. It is embedded in our therapy rooms, electronic records, and clinical tools. From speech-language pathology to neuropsychology, AI is reshaping how we assess, document, and intervene. The question is no longer whether therapists will use AI, but how they are doing so already, and how to do it responsibly without compromising therapeutic presence or judgment. As highlighted in our previous issue, AI is influencing not only our tools, but our decisions. Many clinicians now use AI-supported platforms, sometimes unknowingly, raising important questions about transparency, ethics, and outcomes. Schubert et al. (2025) remind us that moving from passive use to informed application requires structured education. Knowing how AI works is not enough; we need to understand how to reason with it, critique it, and lead its ethical use.

Where AI Is Already Supporting Therapy

In real-world practice, AI is already making a difference. Jha and Topol (2024) note that such tools are improving efficiency in fields reliant on pattern recognition. AI can surface meaningful shifts, propose next steps, and adapt tasks in real time. But it cannot make clinical decisions. An algorithm may detect a phonological pattern or attentional lapse. Yet the therapist must decide: Is this clinically relevant? Is it consistent with the client's goals, history, or needs? AI can suggest. The therapist must interpret.

Why Clinical Judgment Still Leads

AI handles large data sets, but it cannot read between the lines. It cannot recognize when a data point reflects fatigue, emotional strain, cultural difference, or an artifact. Clinical work is not just about data. It is about human context, developmental history, motivation, and values. AI cannot weigh competing goals or make value-based decisions.
As Schubert and colleagues (2025) propose, responsible AI use develops across three levels: basic use, proficient appraisal, and expert collaboration. This framework positions clinicians as decision-makers: not passive tool users, but ethical leaders.

Personalization Without Losing the Personal

AI can make therapy more adaptive. Some apps modify pacing or feedback in real time, slowing exercises during stuttering episodes or increasing visual prompts for distracted learners. These features enhance responsiveness. But adaptation alone is not therapy. It becomes meaningful only when interpreted and guided by a clinician. Therapists remain essential in deciding whether a pattern is significant, whether a tool supports or distracts from the goal, and how to communicate these decisions to clients and families. As Topol (2024) states, AI should inform, not replace, clinical reasoning.

Will AI Replace Therapists? Not Likely.

Concerns about AI replacing clinicians are understandable, but not supported by current evidence. The World Health Organization (2024) affirms that the best outcomes occur when clinicians retain authority and AI acts as a support. AI enhances, rather than diminishes, the role of skilled professionals. Clinicians with AI literacy are better equipped to recognize tool limitations, question AI suggestions, and advocate for their clients. By engaging with AI critically and ethically, therapists remain stewards of care, not spectators to technological change.

AI Engagement Doesn't Require Coding: It Requires Questions

You don't need programming skills to use AI well. But you do need to ask critical questions. Functional AI literacy includes understanding key concepts like algorithmic bias, training data, and model reliability. It also involves separating evidence-based innovation from marketing hype. As WHO (2024) reminds us, clinicians are responsible for the tools they choose, even when the AI is invisible.

Coming Next: Evaluating AI Tools Before You Use Them

In our next article, we'll explore how to critically evaluate an AI product before introducing it to clients.
With new technologies entering the field monthly, we must remain discerning. AI can't replace what makes therapy powerful, but when used well, it can enhance connection, clarity, and compassion. To find out more, join our AI webinars for therapists! Go to Courses for details.

References

Jha, S., & Topol, E. J. (2024). Adapting clinical practice to artificial intelligence: Opportunities, challenges, and ethical considerations. The Lancet Digital Health, 6(3), e175–e183. https://doi.org/10.1016/S2589-7500(24)00020-9

Schubert, T., Oosterlinck, T., Stevens, R. D., Maxwell, P. H., & van der Schaar, M. (2025). AI education for clinicians. eClinicalMedicine, 79, 102968. https://doi.org/10.1016/j.eclinm.2024.102968

World Health Organization. (2024). Ethics and governance of artificial intelligence for health: Guidance and tools. https://www.who.int/publications/i/item/9789240077925


Therapy in the Age of AI: Putting the EU AI Act 2024 into Practice

Under the Act, many AI systems used in therapy fall into the high-risk category. This classification demands that these tools meet stringent standards, including:
• Robust, diverse, and accurate data governance to minimize bias
• Transparency, so therapists can understand how AI tools generate recommendations
• Human oversight, to ensure therapists retain full responsibility for clinical judgment
• Accountability and auditability, allowing systems to be reviewed and challenged as needed
These requirements safeguard patient welfare and help maintain trust in AI-assisted therapy.

What Therapists Need to Ask Before Using AI Tools

Before integrating any AI-supported tool into your practice, ask yourself critical questions: Is this tool approved under the EU AI Act or local regulations? What kind of data does it use, and where is it stored? Does the algorithm reflect diverse populations? Can you clearly understand and explain how it makes decisions? If the answers are unclear or unsatisfactory, it may be unwise or even unsafe to use that tool in clinical settings.

The Irreplaceable Human Element in Therapy

Despite the technical advances AI offers, it can never replace the deeply human elements of therapy. Empathy, cultural sensitivity, ethical reasoning, and clinical intuition remain skills that only therapists bring. AI is a powerful assistant, but therapists must continue to interpret AI outputs within the broader context of the client's needs, ensuring all interventions remain person-centered and ethically grounded. Clear communication with clients about how AI tools are used is also essential to maintain transparency and trust.

Collaboration and Digital Equity: Shaping the Future

The summit highlighted the importance of collaboration between therapists and AI developers. Practitioners provide invaluable insights into real-world challenges and client diversity, helping guide innovation toward tools that are culturally competent, accessible, and effective.
For example, speech-language therapists working with multilingual populations can advocate for AI systems that better handle linguistic diversity, a known limitation of many current models.

Digital equity is another crucial aspect emphasized during the summit. Many clients face barriers to accessing AI-enhanced therapy due to age, socioeconomic status, disability, or geographic location. Ethical practice demands that therapists:
• Offer alternatives for clients without access to AI tools
• Educate clients on how to safely and effectively use digital resources
• Consider hybrid models combining traditional and AI-supported therapy
Ensuring no one is left behind helps make the benefits of AI truly inclusive.

Key Updates from the 2025 EU AI Summit

In addition to these points, the 2025 EU AI Summit brought several key updates to light:
• AI tools must undergo continuous auditing and improvement to address new risks and biases
• Patients now have strengthened rights to be informed when AI plays a role in their care decisions
• EU member states are working toward greater regulatory alignment to create a consistent framework across countries
• A major focus on "explainable AI" aims to prevent opaque "black-box" algorithms from undermining clinical transparency

Looking Forward: AI as a Partner in Ethical Therapy

The EU AI Act and the latest summit remind us that AI should augment therapists' skills, not replace them. The future of therapy lies in balancing innovative tools with human judgment, ethical integrity, and cultural awareness. As therapists, staying informed about AI's evolving landscape, critically evaluating tools before adoption, and actively engaging in their development allows us to lead the way in shaping ethical, person-centered AI care.

Coming next: A closer look at how therapy clinics across Europe are implementing the EU AI Act in their daily practices, and the lessons we can all learn from their experiences.

Reference

European Union. (2024).
Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending certain Union legislative acts (Artificial Intelligence Act). Official Journal of the European Union, L 168, 1–158. https://eur-lex.europa.eu/eli/reg/2024/1689/oj


AI in Healthcare Is No Longer Optional—But Are Clinicians Ready?

A recent publication in eClinicalMedicine delivers a timely and urgent message: Artificial Intelligence (AI) is no longer a futuristic concept in healthcare; it is already reshaping clinical practice. From diagnostics and treatment planning to patient communication and workflow optimization, AI is increasingly present in how we work. But with its growing presence comes a vital question: are clinicians truly prepared to use it effectively and responsibly? The answer, for many, is no. Despite the rapid advancement of AI tools, healthcare professionals are often left with minimal training or support in how to use them. We need more than just exposure; we need structured, role-specific education that equips clinicians to understand, evaluate, and apply AI in a way that prioritizes patient safety and clinical integrity.

A Practical Framework for AI Competency

The article proposes a three-level framework for AI competency among clinicians, offering a roadmap for growth regardless of experience or specialty. At the basic level, clinicians should be able to use AI tools already integrated into their systems with confidence and safety. This includes knowing when an AI recommendation is helpful, and when clinical judgment must override it. The proficient level goes deeper, requiring the ability to critically assess AI outputs, recognize and explain ethical risks, and communicate clearly with patients about how AI is influencing their care. It is about transparency and accountability. At the expert level, clinicians are not just users of AI; they become collaborators. They work with developers, contribute their field experience to guide innovation, and play an active role in shaping tools that reflect clinical realities rather than abstract models.

No Need to Code, But a Need to Understand

Clinicians don't need to be programmers. But they do need to understand the foundations of how AI works: how it is trained, where bias can enter, and how outputs are generated.
Without this understanding, there is a risk of misuse, or worse, harm. Patients trust us to explain the care they are receiving, including the role of AI. That means we must be able to break down complex processes into language that empowers and informs. Whether we are discussing a speech analysis tool, a diagnostic prediction model, or a therapy planning assistant, we remain the point of accountability.

Why This Matters for Every Health Professional

This is especially relevant for therapists, neuropsychologists, educators, and other professionals whose roles may not traditionally intersect with technology. As AI tools begin to assist with assessments, track progress, and even suggest interventions, we need to be prepared not only to use these tools, but to recognize their limitations, question their suggestions, and advocate for patients when needed. In addition, AI literacy empowers clinicians to push for better tools. When we can articulate what works, and what doesn't, we become partners in shaping the future of clinical technology. We ensure that AI evolves with human care at its core.

Two More Reasons This Shift Matters

First, equity in care is at stake. AI systems are only as good as the data they are trained on, and historical data can carry bias. Clinicians must be equipped to recognize these patterns and intervene when necessary to protect vulnerable populations. Second, the pace of change is accelerating. New tools are being released faster than guidelines can keep up. Without foundational AI knowledge, clinicians may struggle to keep pace, or worse, may become passive users in a system they don't fully understand.

The Shift Is Already Happening: Let's Lead It

Healthcare is undergoing a transformation, and AI is a key driver of that change. But technology alone cannot improve care. It takes knowledgeable, critically minded clinicians to ensure that AI is used ethically, safely, and meaningfully.
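The three-level competency framework discussed in this article can be summarized as a small lookup that a training program might use when planning a clinician's next learning step. This is a sketch only: the level summaries paraphrase the article, and the function and variable names are illustrative, not part of Schubert et al.'s published framework.

```python
# Three competency levels from the framework, in ascending order,
# each paired with a one-line paraphrase of what it requires.
COMPETENCY_LEVELS = {
    "basic": "Use integrated AI tools safely; know when clinical judgment must override them.",
    "proficient": "Critically assess AI outputs, explain ethical risks, communicate AI's role to patients.",
    "expert": "Collaborate with developers and shape tools that reflect clinical realities.",
}

LEVEL_ORDER = ["basic", "proficient", "expert"]

def next_level(current):
    """Return the next level to work toward, or None if already at expert."""
    idx = LEVEL_ORDER.index(current)
    return LEVEL_ORDER[idx + 1] if idx + 1 < len(LEVEL_ORDER) else None
```

A continuing-education curriculum could use such a mapping to match workshops or supervision goals to where each clinician currently sits on the ladder.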
Whether you’re a student, a practicing clinician, or an educator training the next generation, this framework offers a roadmap—not just to catch up with the future, but to help lead it. AI is here. Education is not optional. It’s time to prepare—and to take part in shaping what comes next. In our next edition, we’ll explore how therapists across disciplines can begin to integrate AI tools into everyday practice—without losing the human connection that makes therapy so powerful. From digital progress tracking to personalized interventions, we’ll look at what’s already working—and what still needs to improve. Reference: Schubert, T., Oosterlinck, T., Stevens, R. D., Maxwell, P. H., & van der Schaar, M. (2025). AI education for clinicians. eClinicalMedicine, 79, 102968. https://doi.org/10.1016/j.eclinm.2024.102968ScienceDirect


Generative AI-Driven Visual Supports in Therapy

Visual supports have long been foundational tools across therapy domains. From Occupational Therapy (OT) to Speech and Language Therapy (SLT), these aids help clients, especially children, navigate routines, communicate needs, and participate meaningfully in therapy. Common tools include visual schedules, social stories, and augmentative and alternative communication (AAC) boards. These supports offer structure, clarity, and predictability, reducing anxiety and enhancing engagement.

Traditionally, however, creating these supports has been a time-consuming task. Therapists often rely on pre-made clipart libraries or manually designed visuals, which may not fully represent a client's cultural background, personal interests, or environment. For instance, a visual sequence for toothbrushing may only feature characters of a single ethnicity or depict routines unfamiliar to the child. This disconnect, while subtle, can impact the effectiveness and relatability of therapy materials.

This is where generative AI offers a powerful solution. Tools like Midjourney, Veo 3, NightCafe Studio, Fotor AI Art Generator, Dream by Wombo, and StarryAI allow therapists to generate personalized, contextually relevant visuals in seconds. By simply entering a description, such as "a child brushing their teeth with diverse family members in a home bathroom", therapists can instantly create visuals tailored to a client's daily routines, preferences, and cultural identity.

Benefits of Using Generative AI for Visual Supports

Time Efficiency
Generative AI tools dramatically reduce the time needed to produce custom visuals, allowing therapists to spend more time on clinical work and less on material preparation.

Personalization and Cultural Relevance
Visuals that reflect a client's ethnicity, environment, or family structure foster stronger engagement. When children see themselves represented, their motivation and sense of connection increase.
Inclusivity
Clients from diverse linguistic, cultural, or neurodiverse backgrounds benefit from visual supports that are tailored to their realities, not just generic templates.

Flexibility
With generative AI, therapists can quickly adapt visuals to match changing therapy goals, environments, or client needs without being bound to a fixed image library.

Client and Family Empowerment
By involving clients and caregivers in describing the visuals they want to see, therapists promote collaboration, personalization, and a sense of ownership over the therapeutic process.

How to Start Using Generative AI Tools for Visuals

Veo 3 – [deepmind.google/technologies/veo]
Developed by Google DeepMind, Veo 3 generates cinematic, realistic, and context-aware video visuals, ideal for creating dynamic step-by-step guides or routine modeling.

Midjourney – [midjourney.com]
Operates via Discord and uses prompt commands such as:
/imagine a boy brushing teeth in a colorful bathroom with Arabic design
It produces artistic, detailed visuals suitable for social stories or routine visuals.

NightCafe Studio – [nightcafe.studio]
User-friendly drag-and-drop interface, supports multiple AI models, includes mobile access, and offers free credits. Great for accessible and varied visual generation.

Fotor AI Art Generator – [fotor.com]
Part of an online photo editing suite. Allows beginner-friendly image generation in styles like cartoon, watercolor, or oil painting, ideal for younger clients.

Dream by Wombo – [dream.ai]
Mobile and web-based. Enter a prompt, pick an art style, and create an image: simple, intuitive, and quick for therapists on the go.

StarryAI – [starryai.com]
Offers customization (aspect ratio, detail level, style). Designed for easy social media sharing or integration into therapy boards and educational materials.

Reflective Question

How might culturally relevant, generative AI-created visuals improve client engagement and inclusion?
When clients see characters who look like them, performing routines they recognize, visual supports become more than instructional tools—they become affirmations of identity. For non-verbal children using AAC, personalized image boards feel less clinical and more intuitive, encouraging spontaneous communication and deeper connection. As generative AI becomes increasingly accessible, therapists are uniquely positioned to revolutionize how visual supports are created and used. Embracing these tools thoughtfully can promote inclusion, engagement, and stronger therapeutic outcomes—while saving valuable time in the process.