Can ChatGPT be guide in pediatric dentistry?

Abstract

Background: The use of ChatGPT in the health field has recently gained popularity. In dentistry, ChatGPT can provide services in areas such as dental education and patient education. The aim of this study was to evaluate the quality, readability, and originality of pediatric patient/parent information and academic content produced by ChatGPT in the field of pediatric dentistry.

Methods: A total of 60 questions, consisting of pediatric patient/parent questions and academic questions, were posed to ChatGPT across three topics (dental trauma, fluoride, and tooth eruption/oral health). The modified Global Quality Scale (scored from 1, poor quality, to 5, excellent quality) was used to evaluate the quality of the answers, and the Flesch Reading Ease and Flesch-Kincaid Grade Level were used to evaluate readability. A similarity index was used to quantify how closely the software's answers matched guidelines and academic references in different databases.

Results: The evaluation of answer quality revealed an average score of 4.3 ± 0.7 for pediatric patient/parent questions and 3.7 ± 0.8 for academic questions, a statistically significant difference (p < 0.05). Academic questions regarding dental trauma received the lowest scores (p < 0.05). However, no significant differences were observed in readability or similarity between ChatGPT answers across question groups and topics (p > 0.05).

Conclusions: In pediatric dentistry, ChatGPT provides quality information to patients/parents. However, its answers are difficult for patients/parents to read; although the similarity rate is acceptable, ChatGPT needs to be improved in order to interact with people more efficiently and fluently.
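
The readability measures named in the Methods are standard published formulas: Flesch Reading Ease = 206.835 − 1.015 × (words/sentence) − 84.6 × (syllables/word), and Flesch-Kincaid Grade Level = 0.39 × (words/sentence) + 11.8 × (syllables/word) − 15.59. As a minimal illustrative sketch (not the study's own tooling, and using a naive vowel-group syllable heuristic), the following Python computes both scores from raw text:

    import re

    # Rough heuristic: count groups of consecutive vowels as syllables.
    # Real readability tools use dictionaries or better rules, so treat
    # these scores as approximate.
    def count_syllables(word: str) -> int:
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability(text: str) -> tuple[float, float]:
        """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level)."""
        n_sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n_words = max(1, len(words))
        n_syllables = sum(count_syllables(w) for w in words)
        words_per_sentence = n_words / n_sentences
        syllables_per_word = n_syllables / n_words
        fre = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
        fkgl = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
        return fre, fkgl

    fre, fkgl = readability("Brush your teeth twice a day. See your dentist for regular checkups.")
    print(f"Flesch Reading Ease: {fre:.1f}, Flesch-Kincaid Grade: {fkgl:.1f}")

Higher Reading Ease scores indicate easier text, while the Grade Level approximates the U.S. school grade needed to understand it; patient-facing material is commonly targeted at roughly a sixth-grade level, which is the benchmark against which the study found ChatGPT's answers hard to read.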

Bibliographic Details
Main Author: Canan Bayraktar Nahir (Department of Pediatric Dentistry, Faculty of Dentistry, Tokat Gaziosmanpaşa University)
Format: Article
Language: English
Published: BMC, 2025-01-01
Series: BMC Oral Health (ISSN 1472-6831)
Subjects: Artificial intelligence; Fluorides; Tooth injuries; Tooth eruption; Oral health; Public health informatics
Online Access: https://doi.org/10.1186/s12903-024-05393-1