Probing the Pitfalls: Understanding SVD’s Shortcomings in Language Model Compression
Background: Modern computational linguistics relies heavily on large language models that demonstrate strong performance across a variety of Natural Language Inference (NLI) tasks. These models, however, require substantial computational resources for both training and deployment. To address this challenge,...
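The abstract in this record is truncated, but the title refers to SVD-based compression of language-model weights. As a rough illustration of the general technique the article scrutinizes (not the author's specific method; the function name svd_compress and the matrix sizes below are purely illustrative), a dense weight matrix can be replaced by a truncated low-rank factorization obtained from its singular value decomposition:

```python
import numpy as np

def svd_compress(weight: np.ndarray, rank: int):
    """Return factors (A, B) of the given rank such that weight ≈ A @ B."""
    # Thin SVD: weight = U @ diag(s) @ Vt
    U, s, Vt = np.linalg.svd(weight, full_matrices=False)
    # Keep only the top-`rank` singular directions.
    A = U[:, :rank] * s[:rank]   # shape (out_features, rank)
    B = Vt[:rank, :]             # shape (rank, in_features)
    return A, B

# Illustrative example: factor a 768x768 projection matrix at rank 64.
rng = np.random.default_rng(0)
W = rng.standard_normal((768, 768))
A, B = svd_compress(W, rank=64)
rel_error = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"parameters: {W.size} -> {A.size + B.size}, relative error: {rel_error:.3f}")
```

The parameter count drops from out_features × in_features to rank × (out_features + in_features), at the cost of an approximation error that grows as the rank is reduced; understanding where that trade-off breaks down is the kind of question the article's title points to.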
Saved in:

Main Author: Сергей Александрович Плетенев
Format: Article
Language: English
Published: National Research University Higher School of Economics, 2024-12-01
Series: Journal of Language and Education
Online Access: https://jle.hse.ru/article/view/22368
Similar Items
- Analysis of argument structure constructions in the large language model BERT
  by: Pegah Ramezani, et al.
  Published: (2025-01-01)
- Correlation of Periodontal Phenotype with Periodontal Probing Depth in Maxillary Anterior Teeth: A Cross-Sectional Study Using Probe Transparency Method
  by: Maha Maqool, et al.
  Published: (2024-12-01)
- Digital Diagnostics: The Potential of Large Language Models in Recognizing Symptoms of Common Illnesses
  by: Gaurav Kumar Gupta, et al.
  Published: (2025-01-01)
- DESIGN OF THE CONTACT POTENTIALS DIFFERENCE PROBES
  by: K. U. Pantsialeyeu, et al.
  Published: (2016-06-01)
- DIGITAL CONTACT POTENTIAL DIFFERENCE PROBE
  by: K. U. Pantsialeyeu, et al.
  Published: (2016-09-01)