A Green AI Methodology Based on Persistent Homology for Compressing BERT
Large Language Models (LLMs) like BERT have gained significant prominence due to their remarkable performance in various natural language processing tasks. However, they come with substantial computational and memory costs. Additionally, they are essentially black-box models, making them challenging to ex...
| Main Authors: | Luis Balderas, Miguel Lastra, José M. Benítez |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-01-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/15/1/390 |
Similar Items
- Persistent Homology Combined with Machine Learning for Social Network Activity Analysis
  by: Zhijian Zhang, et al.
  Published: (2024-12-01)
- Identifying key genes in cancer networks using persistent homology
  by: Rodrigo Henrique Ramos, et al.
  Published: (2025-01-01)
- Enhancing Abstractive Multi-Document Summarization with Bert2Bert Model for Indonesian Language
  by: Aldi Fahluzi Muharam, et al.
  Published: (2025-01-01)
- Classification Performance Comparison of BERT and IndoBERT on Self-Report of COVID-19 Status on Social Media
  by: Irwan Budiman, et al.
  Published: (2024-03-01)
- AI anxiety: Explication and exploration of effect on state anxiety when interacting with AI doctors
  by: Hyun Yang, et al.
  Published: (2025-03-01)