PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks
Language Models (LMs) have shown remarkable potential in healthcare applications, yet their widespread adoption faces challenges in achieving consistent performance across diverse medical specialties while maintaining parameter efficiency. Current approaches to fine-tuning language models for medica...
Main Authors: | Jieui Kang, Hyungon Ryu, Jaehyeong Sim |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Subjects: | Deep learning; domain adaptive adapter; low rank adapter; medical AI; small language model |
Online Access: | https://ieeexplore.ieee.org/document/10820505/ |
_version_ | 1841550738459721728 |
---|---|
author | Jieui Kang; Hyungon Ryu; Jaehyeong Sim |
author_facet | Jieui Kang; Hyungon Ryu; Jaehyeong Sim |
author_sort | Jieui Kang |
collection | DOAJ |
description | Language Models (LMs) have shown remarkable potential in healthcare applications, yet their widespread adoption faces challenges in achieving consistent performance across diverse medical specialties while maintaining parameter efficiency. Current approaches to fine-tuning language models for medical tasks often require extensive computational resources and struggle with managing specialized medical knowledge across different domains. To address these challenges, we present PRISM-Med (Parameter-efficient Robust Interdomain Specialty Model), a novel framework that enhances domain-specific performance through supervised domain classification and specialized adaptation. Our framework introduces three key innovations: (1) a domain detection model that accurately classifies medical text into specific medical domains using supervised learning, (2) a domain-specific Low-Rank Adaptation (LoRA) strategy that enables efficient parameter utilization while preserving specialized knowledge, and (3) a neural domain detector that dynamically selects the most relevant domain-specific models during inference. Through comprehensive empirical evaluation across multiple medical benchmarks (MedProb, MedNER, MedQuAD), we demonstrate that PRISM-Med achieves consistent performance improvements, with gains of up to 10.1% in medical QA tasks and 2.7% in medical knowledge evaluation compared to traditional fine-tuning baselines. Notably, our framework achieves these improvements while using only 0.1% to 0.4% of the parameters required for traditional fine-tuning approaches. PRISM-Med represents a significant advancement in developing efficient and robust medical language models, providing a practical solution for specialized medical applications where both performance and computational efficiency are crucial. |
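The design the abstract describes, a supervised domain detector routing each input to a domain-specific LoRA adapter over a frozen base model, can be made concrete with a short sketch. The code below is a minimal illustration, not the authors' released implementation: the names (`DomainLoRALinear`, `DomainRouter`), the three example domains, and all dimensions are hypothetical assumptions for exposition.

```python
# Minimal sketch of domain-routed LoRA inference (assumed names, not the
# paper's code): a frozen base linear layer shared across domains, one
# rank-r LoRA delta per medical domain, and a small supervised classifier
# standing in for the paper's domain detector.
import torch
import torch.nn as nn

DOMAINS = ["cardiology", "oncology", "radiology"]  # hypothetical domain set

class DomainLoRALinear(nn.Module):
    """Frozen base weight W plus one low-rank update B @ A per domain."""
    def __init__(self, d_in, d_out, rank=8, alpha=16, domains=DOMAINS):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)
        self.base.weight.requires_grad_(False)  # base LM stays frozen
        self.base.bias.requires_grad_(False)
        self.scale = alpha / rank
        # Per-domain LoRA factors: A starts small, B starts at zero, so each
        # adapter is initially a no-op (the standard LoRA initialization).
        self.A = nn.ParameterDict(
            {d: nn.Parameter(torch.randn(rank, d_in) * 0.01) for d in domains})
        self.B = nn.ParameterDict(
            {d: nn.Parameter(torch.zeros(d_out, rank)) for d in domains})

    def forward(self, x, domain):
        delta = x @ self.A[domain].T @ self.B[domain].T  # low-rank path
        return self.base(x) + self.scale * delta

class DomainRouter(nn.Module):
    """Stand-in for the supervised domain detector: a linear classifier
    over a pooled text embedding, routing one example at a time."""
    def __init__(self, d_model, domains=DOMAINS):
        super().__init__()
        self.head = nn.Linear(d_model, len(domains))
        self.domains = domains

    def forward(self, pooled):
        return self.domains[self.head(pooled).argmax(dim=-1).item()]

# Usage: classify the input's domain once, then run every adapted layer
# with the selected domain's LoRA factors.
layer = DomainLoRALinear(d_in=64, d_out=64)
router = DomainRouter(d_model=64)
x = torch.randn(1, 64)
domain = router(x)   # e.g. "oncology"
y = layer(x, domain)
```

For a rough sense of the parameter budget the abstract cites (dimensions here are illustrative, not from the paper): a rank-8 adapter on a 4096x4096 weight adds 2 x 8 x 4096 = 65,536 trainable parameters against 4096^2 ≈ 16.8M for fully fine-tuning that matrix, i.e. about 0.4%, consistent with the reported 0.1% to 0.4% range.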
format | Article |
id | doaj-art-1056443a0dc84ac8ad0727c1fbe3c118 |
institution | Kabale University |
issn | 2169-3536 |
language | English |
publishDate | 2025-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj-art-1056443a0dc84ac8ad0727c1fbe3c118; 2025-01-10T00:01:42Z; eng; IEEE; IEEE Access; 2169-3536; 2025-01-01; Vol. 13, pp. 4957-4965; DOI 10.1109/ACCESS.2024.3525041; Article 10820505; PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks; Jieui Kang (https://orcid.org/0009-0000-7691-0930), Artificial Intelligence Convergence, Ewha Womans University, Seoul, South Korea; Hyungon Ryu, Nvidia Corporation, Seoul, South Korea; Jaehyeong Sim (https://orcid.org/0000-0001-8722-8486), Department of Computer Science and Engineering, Ewha Womans University, Seoul, South Korea; [abstract as in the description field above]; https://ieeexplore.ieee.org/document/10820505/; Deep learning; domain adaptive adapter; low rank adapter; medical AI; small language model |
spellingShingle | Jieui Kang; Hyungon Ryu; Jaehyeong Sim; PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks; IEEE Access; Deep learning; domain adaptive adapter; low rank adapter; medical AI; small language model |
title | PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks |
title_full | PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks |
title_fullStr | PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks |
title_full_unstemmed | PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks |
title_short | PRISM-Med: Parameter-Efficient Robust Interdomain Specialty Model for Medical Language Tasks |
title_sort | prism med parameter efficient robust interdomain specialty model for medical language tasks |
topic | Deep learning; domain adaptive adapter; low rank adapter; medical AI; small language model |
url | https://ieeexplore.ieee.org/document/10820505/ |
work_keys_str_mv | AT jieuikang prismmedparameterefficientrobustinterdomainspecialtymodelformedicallanguagetasks AT hyungonryu prismmedparameterefficientrobustinterdomainspecialtymodelformedicallanguagetasks AT jaehyeongsim prismmedparameterefficientrobustinterdomainspecialtymodelformedicallanguagetasks |