Measuring the Correctness of Double-Keying: Error Classification and Quality Control in a Large Corpus of TEI-Annotated Historical Text
Among mass digitization methods, double-keying is considered the one with the lowest error rate. The method requires two independent transcriptions of a text by two different operators. It is particularly well suited to historical texts, which often exhibit deficiencies such as poor master copies…
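The abstract only names the comparison step behind double-keying; purely as a hedged illustration (not drawn from the article), the Python sketch below shows one way two independently keyed versions of the same page could be aligned line by line to surface candidate keying errors for manual adjudication. The function name, the example passage, and the line-based granularity are all assumptions made for this sketch.

```python
# Rough illustration (not taken from the article): compare two independently
# keyed transcriptions of the same page line by line and report every
# discrepancy so a reviewer can adjudicate it. Names, the example passage,
# and the line-based granularity are assumptions for this sketch.

def find_discrepancies(keying_a: str, keying_b: str) -> list[tuple[int, str, str]]:
    """Return (line number, version A, version B) for each differing line."""
    lines_a = keying_a.splitlines()
    lines_b = keying_b.splitlines()
    discrepancies = []
    for lineno, (a, b) in enumerate(zip(lines_a, lines_b), start=1):
        if a != b:
            discrepancies.append((lineno, a, b))
    return discrepancies

if __name__ == "__main__":
    # Two operators key the same (hypothetical) passage; one drops an umlaut.
    operator_a = "Die Vernunft ist ein Vermögen.\nSie urtheilet nach Regeln."
    operator_b = "Die Vernunft ist ein Vermogen.\nSie urtheilet nach Regeln."
    for lineno, a, b in find_discrepancies(operator_a, operator_b):
        print(f"line {lineno}: {a!r} != {b!r}")
```

In a real double-keying workflow the flagged lines would go to a third pass or an adjudicator rather than being resolved automatically.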
Main Authors: Susanne Haaf, Frank Wiegand, Alexander Geyken
Format: Article
Language: German (deu)
Published: Text Encoding Initiative Consortium, 2015-03-01
Series: Journal of the Text Encoding Initiative
Online Access: https://journals.openedition.org/jtei/739
Similar Items

- The DTA “Base Format”: A TEI Subset for the Compilation of a Large Reference Corpus of Printed Text from Multiple Sources
  by: Susanne Haaf, et al.
  Published: (2015-04-01)
- Texts and Documents: New Challenges for TEI Interchange and Lessons from the Shelley-Godwin Archive
  by: Trevor Muñoz, et al.
  Published: (2015-09-01)
- Building, Encoding, and Annotating a Corpus of Parliamentary Debates in TEI XML: A Cross-Linguistic Account
  by: Naomi Truan, et al.
  Published: (2022-06-01)
- A TEI-based Approach to Standardising Spoken Language Transcription
  by: Thomas Schmidt
  Published: (2011-06-01)
- Opinion: Strategy of Semi-Automatically Annotating a Full-Text Corpus of
  by: Hyun-Seok Park
  Published: (2018-12-01)