ChatGPT Outperformed Doctors in Diagnosing Diseases and Medical Conditions, Says Study

ChatGPT outperformed human doctors in diagnosing diseases and medical conditions in a study. The findings, published last month, suggest that artificial intelligence (AI) chatbots might be more efficient at analysing patient histories and conditions and might provide more accurate diagnoses. While the study aimed to understand whether AI chatbots could help doctors provide better diagnoses, the results unexpectedly revealed that OpenAI’s GPT-4-powered chatbot performed much better on its own than when paired with a doctor.

ChatGPT Outperforms Doctors in Diagnosing Diseases

The study, published in the JAMA Network Open journal, was conducted by a group of researchers at the Beth Israel Deaconess Medical Center in Boston. The experiment aimed to find out whether AI could help doctors diagnose diseases better than traditional methods.

According to a New York Times report, the experiment involved 50 doctors, a mix of resident and attending physicians. They were recruited through multiple large hospital systems in the US and were given six case histories of patients. The participants were reportedly asked to suggest a diagnosis for each of the cases and explain why they favoured or ruled out certain diagnoses. The doctors were also said to be graded on whether their final diagnosis was correct.

To evaluate each participant’s performance, medical experts were reportedly selected as graders. While the graders were said to be shown the answers, they were not told whether each response came from a doctor with access to AI, a doctor working alone, or ChatGPT on its own.

Further, to eliminate the possibility of unrealistic case histories, the researchers reportedly picked case histories of real patients that had been used by researchers for decades but had never been published, so as to avoid contamination. This point is important because ChatGPT could not have been trained on data that was never published.

The findings of the study were surprising. Doctors who did not use any AI tool to diagnose the case histories scored an average of 74 percent, while physicians who used the chatbot scored 76 percent on average. However, when ChatGPT alone analysed the case histories and provided diagnoses, it scored an average of 90 percent.

While various factors could have influenced the outcome of the study, from the experience level of the doctors to individual biases toward certain diagnoses, the researchers believe the results show that the potential of AI systems in medical institutions cannot be ignored.