AI workforce readiness is becoming a critical factor in rehabilitation pathways as artificial intelligence becomes embedded across triage, assessment, monitoring, and digital prescription.
As these systems scale, attention has rightly focused on technical capability, governance frameworks, and clinical safety. Yet one of the most significant risks in AI-enabled rehabilitation is often overlooked: workforce readiness.
How physiotherapists perceive AI, including their optimism, concerns, and confidence in using it, directly influences how safely and effectively AI-supported systems are deployed.
Recent evidence suggests that while enthusiasm for AI is growing within physiotherapy, preparedness is not keeping pace. This gap represents not merely an educational issue, but a system-level safety and delivery risk.
How AI Workforce Readiness Gaps Appear in Training
Insights into workforce readiness often emerge earliest during training. A large Canadian survey of more than 2,000 healthcare students, including over 200 physiotherapy students, provides a useful lens on how the next generation views AI.
The findings are instructive. Physiotherapy students, alongside medical and dental peers, broadly recognised AI’s potential to improve efficiency and quality of care.
Many expressed optimism that AI could streamline processes and free clinicians to focus on patient interaction.
However, physiotherapy students also reported higher levels of concern about AI’s impact on professional roles than several other disciplines.
Worries about job displacement, role erosion, or inappropriate replacement of clinical tasks were common. At the same time, students acknowledged limited confidence in their own understanding of AI, despite recognising its growing presence in practice.
This combination, optimism paired with uncertainty, is a warning signal. It suggests a workforce that is open to AI, but not yet equipped to work with it safely and critically.
Ethical Awareness Without Technical Confidence
One particularly notable finding from the survey was that physiotherapy students rated their awareness of AI’s ethical implications higher than their perceived technical understanding.
This reflects a profession deeply grounded in values such as patient-centred care, autonomy, and professional accountability.
While this ethical sensitivity is a strength, it also highlights a vulnerability. Ethical awareness without sufficient technical literacy can lead to two problematic extremes:
- Over-caution, where clinicians avoid useful AI-supported tools due to uncertainty or mistrust.
- Over-reliance, where AI outputs are accepted uncritically because their limitations are not well understood.
Neither outcome supports safe, scalable rehabilitation delivery.
AI Workforce Readiness and Practice Variability
From a system perspective, uneven AI literacy creates variability, and variability is the enemy of safety and outcomes at scale.
As AI becomes embedded in musculoskeletal triage, remote monitoring, motion analysis, and outcome prediction, clinicians are increasingly expected to:
- interpret AI-generated insights
- understand data quality and bias
- know when to challenge or override recommendations
- explain AI-supported decisions to patients
If workforce understanding varies widely, organisations face:
- inconsistent application of digital pathways
- unclear accountability when outcomes differ
- increased governance burden
- difficulty defending AI-supported decisions under audit or scrutiny
For healthcare operators and insurers, this undermines predictability, a core requirement for outcomes-based and value-based models of care.
A Digital Divide With System Consequences
The student perception data also reflects a broader digital divide within physiotherapy education.
While AI is increasingly discussed in medical and dental training, physiotherapy curricula have been slower to engage with AI beyond surface-level digital health exposure.
This lag matters because physiotherapy is already a high-utilisation, high-variability service within healthcare systems. As digital rehabilitation platforms scale, physiotherapists will be central users of AI-enabled tools.
A workforce that is enthusiastic but underprepared introduces latent risk into otherwise well-designed systems.
In other words, the technology may be ready, but the human layer may not be.
Workforce Readiness Is a Governance Issue
AI literacy in physiotherapy should not be framed as optional professional development or an academic interest. It is a governance issue.
Just as clinicians are expected to understand the limitations of imaging, outcome measures, or clinical guidelines, they must also understand:
- what AI can and cannot do
- where algorithms support judgement
- where human reasoning must prevail
- how bias and error can arise
Without this baseline, organisations deploying AI-enabled rehabilitation systems carry avoidable safety, reputational, and operational risk.
From Education Gap to System Capability
Addressing this gap requires a shift in thinking. AI readiness cannot be confined to pre-registration education alone. Because AI systems evolve, workforce preparedness must be continuous, supported through:
- structured AI literacy frameworks
- ongoing professional development
- clear organisational guidance on AI use
- integration of AI understanding into supervision, audit, and governance
In this sense, education becomes delivery infrastructure: a prerequisite for safe scaling, not a peripheral activity.
Rehbox and Workforce-Aware Digital Rehabilitation
Rehbox is being developed with this workforce reality in mind. Digital rehabilitation platforms succeed not only through technical performance, but through how confidently and consistently clinicians can work with them.
By embedding transparency, clinician oversight, and interpretable insights into the rehabilitation workflow, Rehbox aims to support AI-enabled care without eroding professional authority. The goal is not to push adoption faster, but to ensure that adoption is safe, informed, and sustainable.
Looking Ahead
The student perceptions highlighted here are not a criticism of physiotherapy; they are an early signal. They show a profession that is ethically grounded and optimistic about innovation, but at risk of being underprepared for the realities of AI-enabled practice.
For healthcare organisations, insurers, and professional bodies, the implication is clear:
Workforce readiness is now a limiting factor in digital rehabilitation.
Addressing AI literacy is not about future graduates alone. It is about ensuring today’s physiotherapy workforce can safely adopt, govern, and evolve alongside AI-supported systems — protecting outcomes, trust, and professional integrity as care delivery changes.