PreparedLLM: effective pre-pretraining framework for domain-specific large language models
The direct application of large language models (LLMs) to specific domain tasks frequently encounters challenges due to the scarcity of domain data, variations in domain semantics, and the complexity of domain knowledge. Further pretraining of advanced foundational models on extensive domain-specifi...
Saved in:
| Main Authors: | Zhou Chen, Ming Lin, Zimeng Wang, Mingrun Zang, Yuqi Bai |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2024-10-01 |
| Series: | Big Earth Data |
| Subjects: | |
| Online Access: | https://www.tandfonline.com/doi/10.1080/20964471.2024.2396159 |
Similar Items
- JiuZhou: open foundation language models and effective pre-training framework for geoscience
  by: Zhou Chen, et al.
  Published: (2025-12-01)
- TCM-GPT: Efficient pre-training of large language models for domain adaptation in Traditional Chinese Medicine
  by: Guoxing Yang, et al.
  Published: (2024-01-01)
- Pretraining Enhanced RNN Transducer
  by: Junyu Lu, et al.
  Published: (2024-12-01)
- Comparative Analysis of Conventional CNN v’s ImageNet Pretrained ResNet in Medical Image Classification
  by: Christos Raptis, et al.
  Published: (2024-12-01)
- Exploring Fragment Adding Strategies to Enhance Molecule Pretraining in AI-Driven Drug Discovery
  by: Zhaoxu Meng, et al.
  Published: (2024-09-01)