Prefix Tuning Using Residual Reparameterization
Fine-tuning large language models for specific tasks requires updating and storing all of their parameters, leading to significant computational and storage costs. To address these challenges, parameter-efficient methods such as prefix tuning have gained attention. However, prefix tuning can suffer fro...
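The abstract is truncated in this record, but the title describes a prefix-tuning setup in which the trainable prefix is reparameterized through a residual connection. The sketch below is a minimal PyTorch illustration under that assumption (prefix = skip(E) + MLP(E), with the base language model frozen); the class name `ResidualPrefix`, all dimensions, and the exact form of the residual path are illustrative and not taken from the paper.

```python
# Minimal sketch (assumption): a prefix generator whose trainable prefix is
# reparameterized with a residual connection, prefix = skip(E) + MLP(E),
# where E is a small learned embedding. Only this module would be trained;
# the backbone language model stays frozen and consumes the returned
# key/value prefixes in its attention layers.
import torch
import torch.nn as nn


class ResidualPrefix(nn.Module):
    def __init__(self, prefix_len=20, hidden_dim=768, bottleneck_dim=256,
                 num_layers=12, num_heads=12):
        super().__init__()
        self.num_layers = num_layers
        self.num_heads = num_heads
        self.head_dim = hidden_dim // num_heads
        # Small trainable embedding E over the prefix positions.
        self.embedding = nn.Parameter(torch.randn(prefix_len, hidden_dim))
        # Bottleneck MLP path of the reparameterization.
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, bottleneck_dim),
            nn.Tanh(),
            nn.Linear(bottleneck_dim, num_layers * 2 * hidden_dim),
        )
        # Linear skip path so the residual sum is shape-compatible.
        self.skip = nn.Linear(hidden_dim, num_layers * 2 * hidden_dim)

    def forward(self, batch_size):
        # Residual reparameterization of the prefix.
        prefix = self.skip(self.embedding) + self.mlp(self.embedding)
        # Reshape into per-layer key/value prefixes:
        # (num_layers, 2, batch, prefix_len, num_heads, head_dim)
        prefix = prefix.view(-1, self.num_layers, 2,
                             self.num_heads, self.head_dim)
        prefix = prefix.permute(1, 2, 0, 3, 4).unsqueeze(2)
        return prefix.expand(-1, -1, batch_size, -1, -1, -1)


# Usage: generate past_key_values-style prefixes for a frozen backbone.
prefixes = ResidualPrefix()(batch_size=4)
print(prefixes.shape)  # torch.Size([12, 2, 4, 20, 12, 64])
```

Because only the prefix generator's parameters are optimized, the per-task storage footprint stays small; the residual skip path is one common way to stabilize training of such reparameterized prefixes, though the paper's exact formulation may differ.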
| Main Authors: | Youngjun Jung, Hyunsun Hwang, Changki Lee |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10938609/ |
Similar Items
- Simple reparameterization to improve convergence in linear mixed models
  by: Gregor GORJANC, et al.
  Published: (2010-12-01)
- Verbs with Prefix po- in Russian Dialects of Amur Region
  by: V. T. Sadchenko
  Published: (2022-03-01)
- Reparameterized Feature Aggregation Convolutional Neural Network for Remote Sensing Scene Image Classification
  by: Cuiping Shi, et al.
  Published: (2025-01-01)
- Pronunciation of Prefixed Words in Speech: The Importance of Semantic and Intersubjective Parameters
  by: Nicolas Videau, et al.
  Published: (2015-05-01)
- Design of a low-delay 4-bit parallel prefix adder using QCA technology
  by: Tushar Niranjan, et al.
  Published: (2025-07-01)