One winter morning, at a small pharmacy along the inter-communal road in Nhan Nghia commune, Phu Tho province, a middle-aged man stopped his motorbike and walked in with his phone screen still lit.
Without describing his symptoms, he handed the phone straight to the pharmacist and said curtly: "I have a cough, fever, and headaches. ChatGPT told me which medicines to buy, so sell me exactly these."
The list of drugs on the screen included many different types, among them strong anti-inflammatory drugs. When asked about his medical history, allergies, or medicines he was already taking, the man grew impatient: "The AI has already advised me. Just sell me exactly what it says."
Stories like this are no longer rare. The habit of asking ChatGPT for health advice, even asking it to "prescribe" drugs, long common in urban areas, has now quietly spread to rural and mountainous regions, exposing people to unpredictable health risks.
Speaking to Lao Dong Newspaper reporters, university-trained pharmacist Bui Thi Lan Huong of Phuc Nguyen Pharmacy (Nhan Nghia commune, Phu Tho province) said that she has recently seen a steady stream of customers bringing "prescriptions" from ChatGPT to the counter.
"Drugs fall into two categories: prescription and over-the-counter. Prescription drugs may only be sold against a valid prescription from a doctor. Even for over-the-counter drugs, we still have to gather patient information carefully before giving advice," Ms. Huong said.
According to the pharmacist, ChatGPT tends to answer in generalities and cannot tailor its advice to each individual patient.
"When selling any medicine, a pharmacist must ask about age, allergy history, underlying conditions, and any drugs or supplements the patient is already taking. AI cannot do all of that. If people trust it absolutely, the risk of adverse drug reactions is very high," Ms. Huong warned.
More worryingly, if a patient happens to recover after once following ChatGPT's advice, they can easily fall into the habit of relying on the AI and dismissing the role of doctors and pharmacists.
In a similar situation, college-trained pharmacist Bui Ha Chi (Hop Kim commune, Phu Tho province) said that customers asking to buy drugs on ChatGPT's advice is becoming more and more frequent.
Ms. Chi said: "Some customers consult ChatGPT first and then ask our staff to sell exactly what it suggests. When we refuse and explain the risks, they get upset. Some, after hearing our advice, even open ChatGPT to double-check it right there at the pharmacy."
According to Ms. Chi, the danger is that the AI tends to suggest multiple drugs at once, including strong ones that are unnecessary for common illnesses.
"In winter, flu is very common. For mild cases, a fever reducer, symptom relief, a nasal spray, and something to boost the immune system are enough. But ChatGPT often recommends adding corticosteroids, strong anti-inflammatory drugs that affect the adrenal glands. Abusing them carries a very high risk of kidney failure and endocrine disorders," Ms. Chi explained.
According to the pharmacist, being sick does not always mean needing many medicines.
"The body needs to build its own natural resistance. Taking drugs indiscriminately, especially strong ones, can sometimes do more harm than good," she said.
In mountainous areas, where access to health services remains limited, trusting "virtual prescriptions" from AI is riskier still. By the time an illness worsens or complications set in, people often reach medical facilities late, making treatment far more difficult.
Pharmacists agree that ChatGPT and other AI tools should be treated only as a general source of reference information; they can never replace a doctor's diagnosis or the professional advice of health workers.