When a hard mass appeared in her abdomen a few months before her 40th birthday, Flic Manning knew something was seriously wrong. What began as occasional pain and bloating had become more persistent, and she had difficulty eating.
After two decades managing the inflammatory condition Crohn's disease, the author and radio host was used to gastrointestinal symptoms, but this felt different.
Her GP told her it was probably just Crohn's but referred her to a gastroenterologist to be safe. That specialist dismissed her concerns. "He didn't even bother to touch my abdomen," says Manning. "He said it was nothing to worry about."
Still clearly worried, Manning turned to ChatGPT next, which suggested she had a possible twisted bowel and intussusception (where part of the intestine slides into another), conditions that can be life-threatening. Concerned, Manning went in search of a third opinion, this time from her human gynaecologist.
"As soon as my gyno touched the hard area she was alarmed," says Manning. "She said I need a CT, like, today!" Sure enough, the scan confirmed the chatbot's diagnosis. Manning had both intussusception and a twisted bowel, requiring multiple surgeries.
"[ChatGPT] even told me the most likely course of treatment, which also turned out to be correct," she says. Manning is one of 400 million weekly users who have embraced ChatGPT, OpenAI's generative artificial intelligence model.
Launched in 2022, it's used globally to write emails and create presentations, résumés and letters. But it also answers questions, compiles research and creates art, and increasingly people are relying on it for medical advice.
It's not surprising; after all, we've been turning to Dr Google for decades. The difference is that with AI models like ChatGPT, the experience more closely resembles what you might get from a caring doctor, complete with sympathetic preamble ("I'm sorry you're not feeling well") before suggested solutions, though you may also get the disclaimer: "I'm not a doctor so I can't give medical advice."
There are many reasons people are going online to seek prompt medical advice, including delays getting to see a GP, the increasing cost when you do (up by just over 4 per cent in the past year), fewer doctors who bulk-bill, and years-long waitlists for specialists.
Skyrocketing living costs, an ageing population and rising chronic disease rates are overwhelming an already strained system. On top of that, there's a dire shortage of full-time GPs, a shortfall predicted to double by 2033, and surveys have shown more than 70 per cent of GPs are experiencing burnout.
Add the fact that women bear the brunt of healthcare inequities, from medical gaslighting to underfunded research, and it's easy to see the appeal of opening a new Chrome window in your own home.
"Globally, women spend 25 per cent more of their life in poor health compared to men," says Dr Ariella Heffernan-Marks, CEO and founder of Ovum AI, who views AI as a promising remedy for the diagnostic and treatment gaps women face.

"It takes five years on average for any Australian woman to be diagnosed with a general condition, and it's seven to 12 years for endometriosis. Women are coming in saying, 'I've got extreme pain,' and they're being told, 'You're anxious. You're just over-emotional.' On top of that, women are experiencing a gender pay gap, so they need to see healthcare providers more often but [can't] afford it."
So, are we headed for an AI healthcare revolution, or falling down a dangerous rabbit hole? The experts are divided: it's either a bit of both or too early to say.
The good news is that where Dr Google is hit and miss, ChatGPT is an excellent young medico, having passed the US medical licensing exam with flying colours, according to researchers at Harvard. Another study found it had a 92 per cent diagnostic accuracy rate. However, when interpreting MRIs, ChatGPT at times performed worse than radiologists, and one-third of the cancer treatment plans it devised contained errors, according to studies.
There's also the way ChatGPT presents the information it scrapes from the internet. On a first read, it sounds pretty good, but look closer and it's often gibberish. Dr Piers Howe, an associate professor at the University of Melbourne, says AI "often produces nonsense, but it presents it so well it looks plausible, and that's the real danger".
That also concerns Dr Grant Blashki, a GP, associate professor at the University of Melbourne, and editor of Artificial Intelligence, For Better or Worse. He's excited by the potential AI has to improve the patient experience, but stresses the need to always check with a doctor. "Patients can be overconfident [with AI advice] and then delay getting care," he says.
Unfortunately, AI is also perpetuating the gender bias women face in traditional healthcare. "It's assessing global data sets which typically contain gender bias, so it's providing either gender or culturally biased responses," says Heffernan-Marks.
But one place AI is having an undeniably positive impact is in the hands of physicians, where it can potentially streamline diagnosis, treatment, testing and admin. A UK study found one in five doctors is already using it for admin or help with diagnosis.
"It's extraordinary, the pace of uptake," says Blashki, who uses it to augment care, not replace it, "but doctors need to remember the buck stops with them."
Heffernan-Marks is realistic about the limits. "AI is not diagnostic and … should be used to help you gather and understand information," she says. But it does offer something valuable to women who've been brushed off: "a non-judgemental space" in which to prepare to advocate for themselves.
In our struggling healthcare system, perhaps AI can alleviate ills rather than cure them. As the bot itself warns, it's user beware. Like any technology, the risk is not in the machine, but in how we use it.