Retrieval-Pretrained Transformer: Long-range Language Modeling with Self-retrieval
| Main Authors: | Ohad Rubin, Jonathan Berant |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2024-10-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | http://dx.doi.org/10.1162/tacl_a_00693 |
Similar Items
- In-Context Retrieval-Augmented Language Models, by: Ori Ram, et al. Published: (2023-11-01)
- Layered Query Retrieval: An Adaptive Framework for Retrieval-Augmented Generation in Complex Question Answering for Large Language Models, by: Jie Huang, et al. Published: (2024-11-01)
- Geographic Adaptation of Pretrained Language Models, by: Valentin Hofmann, et al. Published: (2024-04-01)
- MVR: Synergizing Large and Vision Transformer for Multimodal Natural Language-Driven Vehicle Retrieval, by: Tareq Mahmod AlZubi, et al. Published: (2025-01-01)
- Opportunities for retrieval and tool augmented large language models in scientific facilities, by: Michael H. Prince, et al. Published: (2024-11-01)