The hidden challenge in general practice

Deciding who is safe to drive might sound straightforward — but as Road Safety Committee chair Dr Monika Moy writes, it’s often a far more complex dilemma for GPs.

Assessing a patient’s fitness to drive is one of those bread‑and‑butter tasks that quietly sits in the background of general practice until it suddenly becomes the most difficult consult of the day. In South Australia, I’m required by law to assess patients at regular intervals if they have certain medical conditions, whenever I suspect they may be unsafe on the road, and when police or the Department of Transport request an opinion after an accident. On paper, it sounds straightforward. In reality, it is anything but.

Most of the time, the physical side is simple enough. Austroads’ Assessing Fitness to Drive guidelines are clear about things like seizure‑free periods or visual acuity thresholds. Many of us start to flounder, however, when we turn to the section on dementia.

The guidelines make it explicit: a dementia diagnosis does not in itself automatically remove someone’s right to hold a licence. Yet I am often sitting across from a long‑standing patient who manages reasonably well day‑to‑day but whose short‑term memory is declining. The patient tells me their driving is fine. The family reassures me or, sometimes, insists otherwise. And I’m left trying to assemble a clinical jigsaw puzzle with several missing pieces.

Self‑reporting is notoriously unreliable when cognition is impaired. Family reports can be more accurate, but they are often coloured by emotion. I can’t take the patient for a drive around the block — nor would I want to. But I’m somehow expected to make a call with profound implications.

Revoking a licence is one of the most fraught conversations a GP has. Some doctors have received threats for doing so — extreme examples, yes, but the emotional fallout is real. More commonly, trust erodes. Some patients quietly move on to another GP. Others push back, insisting they are being unfairly targeted. No one wins.

What can we do? Many of us use cognitive tests such as the Mini-Mental State Examination, despite knowing it was never designed for assessing fitness to drive. Tools like Trails A and B or the Snellgrove Maze get us a bit closer, but even then, we’ve all seen patients perform reasonably well while our gut tells us something is not right.

And that gut feeling matters, but it also leaves us exposed. Unlike occupational therapists trained in driving assessments, most GPs have no specialised expertise in evaluating real‑world driving ability. The gold standard remains the on‑road test. Yet access is a major barrier — at least it is in my state of South Australia. In the public system, waits can be long and entry criteria restrictive. In the private system, cost can be significant. There is also the uncomfortable reality that even in a dual‑control vehicle, these tests carry risk for the assessor.

This topic comes up repeatedly in my work with the AMA SA Road Safety Committee. It’s an area of general practice that demands better support, better structure, and better tools for clinicians. A recent scoping review by Belinda Johnston and colleagues published in Disability and Rehabilitation caught my attention. It looked at a wide range of cognitive tests (paper‑based, digital, even simulator‑based) and compared them with on‑road performance. Most tests were either insufficiently accurate or impractical for a general practice setting. Some were easy to administer but unreliable, while others were good predictors but time‑consuming or reliant on specialist equipment.

However, the review did highlight a tool already widely used by occupational therapy driving assessors in Australia: the DriveSafe DriveAware system. While further research is required, it stood out as one of the few options with the potential to bridge the gap between office‑based cognitive screening and on‑road outcomes.

I recently had a demonstration of the tool. What struck me was not that it claims to replace clinical judgement (it doesn’t) but that it offers structured, evidence‑informed support for a decision we are otherwise largely making in the dark. It’s iPad‑administered, takes about 15 minutes, and classifies patients broadly into those likely to pass an on‑road test, those unlikely to pass, and those in the grey zone who should be referred for a practical test.

I’m not suggesting this tool is “the answer”. There are costs involved, there is no Medicare item number, workflow integration needs thought, and further research in general practice settings is needed.

But for me, it represents something more important: proof that the gap we all feel can be addressed. Soon we might not have to rely solely on gut instinct, incomplete cognitive tests, or emotionally charged conversations with anxious families. There might be a way to bring more objectivity, more consistency and more transparency to a decision that carries huge safety and personal implications.

I believe greater investment in developing and validating tools of this kind would genuinely help GPs, patients, families and the broader community. GPs need cognitive assessments that are practical, evidence‑based, affordable, accessible, and aligned with the functional demands of driving.

Experience and empathy will always be central to general practice, but they shouldn’t be the only guide when lives and livelihoods are at stake. We need evidence‑based, accessible solutions that support clinicians and protect the community.