OrthoGPT: multimodal generative pre-trained transformer models for precise diagnosis and treatment of orthopedics

Bibliographic Details
Main Authors: ZHANG Zhicheng, WANG Jing, ZHANG Yang, TIAN Yonglin, ZHANG Mengmeng, LYU Yisheng, WANG Fei-Yue
Format: Article
Language: Chinese (zho)
Published: Posts & Telecom Press Co., Ltd., 2024-09-01
Series: 智能科学与技术学报 (Chinese Journal of Intelligent Science and Technology)
Online Access:http://www.cjist.com.cn/thesisDetails#10.11959/j.issn.2096-6652.202433
Description
Summary: This paper proposes OrthoGPT, a multimodal generative pre-trained transformer model designed for the precise diagnosis and treatment of orthopedic diseases. It addresses several challenges in orthopedic care: the difficulty of emergency management outside hospital settings, the complexity and high risk of surgical planning, and the strong demand for personalized rehabilitation therapy. The model combines multimodal feature alignment and fusion, multi-scenario personalized diagnosis and treatment plan generation, and an orthopedic world model based on parallel theory and multi-agent methods. It aims to provide preliminary diagnosis and emergency guidance in non-hospital scenarios, simulate and predict personalized postoperative treatment outcomes, assist doctors in surgical planning, and offer personalized rehabilitation suggestions. The paper explores OrthoGPT's potential to improve the efficiency and effectiveness of orthopedic diagnosis and treatment, enhance the patient experience, and advance the development and application of artificial intelligence in orthopedics.
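The "multimodal feature alignment and fusion" named in the abstract can be illustrated with a minimal sketch; this is not the authors' implementation, and all names, dimensions, and the fusion strategy (linear projection into a shared space, L2 normalization, and averaging) are illustrative assumptions:

```python
import numpy as np

def align_and_fuse(img_feat, txt_feat, w_img, w_txt):
    """Project each modality into a shared embedding space, L2-normalize
    (so the modalities are comparable), then fuse by averaging."""
    z_img = img_feat @ w_img            # (d_img,)  -> (d_shared,)
    z_txt = txt_feat @ w_txt            # (d_txt,)  -> (d_shared,)
    z_img = z_img / np.linalg.norm(z_img)
    z_txt = z_txt / np.linalg.norm(z_txt)
    return 0.5 * (z_img + z_txt)        # simple late fusion

rng = np.random.default_rng(0)
img = rng.normal(size=256)              # e.g. radiograph image features (hypothetical)
txt = rng.normal(size=128)              # e.g. clinical-note text features (hypothetical)
fused = align_and_fuse(img, txt,
                       rng.normal(size=(256, 64)),
                       rng.normal(size=(128, 64)))
print(fused.shape)                      # shared-space fused embedding, (64,)
```

In practice the projections would be learned (e.g. with a contrastive alignment objective) and fusion could be attention-based rather than a fixed average; the sketch only shows the shape of the idea.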
ISSN:2096-6652