Large-Language-Model-Enabled Text Semantic Communication Systems
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/13/7227 |
| Summary: | Large language models (LLMs) have recently demonstrated state-of-the-art performance in various natural language processing (NLP) tasks, achieving near-human levels in multiple language understanding challenges and aligning closely with the core principles of semantic communication. Inspired by LLMs' advancements in semantic processing, we propose LLM-SC, an innovative LLM-enabled semantic communication system framework that applies LLMs directly to physical-layer coding and decoding for the first time. By analyzing the relationship between the training process of LLMs and the optimization objectives of semantic communication, we propose training a semantic encoder through the LLMs' tokenizer training and establishing a semantic knowledge base via the LLMs' unsupervised pre-training process. This knowledge base facilitates the construction of an optimal decoder by providing the prior probability of the transmitted language sequence. Based on this, we derive the optimal decoding criterion for the receiver and introduce a beam search algorithm to further reduce complexity. Furthermore, we assert that existing LLMs can be employed directly for LLM-SC without additional re-training or fine-tuning. Simulation results reveal that LLM-SC outperforms the conventional DeepSC at signal-to-noise ratios (SNRs) exceeding 3 dB, as it enables error-free transmission of semantic information at high SNRs, whereas DeepSC fails to do so. In addition to semantic-level performance, LLM-SC demonstrates compatibility with technical-level performance, achieving approximately an 8 dB coding gain at a bit error ratio (BER) of 10⁻³ without any channel coding, while maintaining the same joint source–channel coding rate as traditional communication systems. |
| ISSN: | 2076-3417 |
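
The abstract describes a receiver that scores candidate token sequences by combining a channel likelihood with a prior probability supplied by the LLM's knowledge base, pruning the search with beam search. The following Python sketch illustrates that decoding idea on toy data only; `TOY_PRIOR`, `log_channel_likelihood`, and `beam_search_decode` are hypothetical stand-ins invented for this example, not the paper's actual implementation or model.

```python
import math

# Hypothetical toy prior: in LLM-SC the prior p(x) would come from an
# LLM's next-token probabilities; a hand-written bigram table stands in
# here so the sketch runs without any model weights.
TOY_PRIOR = {
    ("<s>", "hello"): 0.6, ("<s>", "help"): 0.4,
    ("hello", "world"): 0.9, ("hello", "word"): 0.1,
    ("help", "world"): 0.2, ("help", "word"): 0.8,
}

def log_prior(prev_tok: str, tok: str) -> float:
    """log p(tok | prev_tok) from the stand-in language prior."""
    return math.log(TOY_PRIOR.get((prev_tok, tok), 1e-9))

def log_channel_likelihood(tok: str, observation: str) -> float:
    """Toy stand-in for the channel term log p(y | x): rewards tokens
    that match the noisy observation character by character."""
    matches = sum(a == b for a, b in zip(tok, observation))
    return matches - max(len(tok), len(observation))

def beam_search_decode(observations, vocab, beam_width=2):
    """MAP-style decoding in the spirit of the abstract: keep the
    beam_width hypotheses maximizing log p(y|x) + log p(x)."""
    beams = [(["<s>"], 0.0)]  # (token sequence, accumulated log score)
    for obs in observations:
        candidates = []
        for seq, score in beams:
            for tok in vocab:
                new_score = (score
                             + log_channel_likelihood(tok, obs)
                             + log_prior(seq[-1], tok))
                candidates.append((seq + [tok], new_score))
        # Prune to the best beam_width hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    best_seq, best_score = beams[0]
    return best_seq[1:], best_score  # drop the <s> start symbol

if __name__ == "__main__":
    vocab = ["hello", "help", "word", "world"]
    noisy = ["helxo", "worlx"]  # corrupted channel observations
    decoded, score = beam_search_decode(noisy, vocab)
    print(decoded, round(score, 3))  # -> ['hello', 'world']
```

Note how the prior rescues the corrupted second token: "worlx" is nearly equidistant from "world" and "word" under the toy channel score, but the stand-in prior p("world" | "hello") tips the decision, mirroring how an LLM's sequence prior would resolve channel ambiguity in the framework the abstract describes.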