Generating Authentic Grounded Synthetic Maintenance Work Orders


Bibliographic Details
Main Authors: Allison Lau, Jadeyn Feng, Melinda Hodkiewicz, Caitlin Woods, Michael Stewart, Adriano Polpo
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11124200/
Description
Summary: Large language models (LLMs) are promising for generating synthetic technical data, particularly for industrial maintenance, where real datasets are often limited and unbalanced. This study generates synthetic maintenance work orders (MWOs) that are grounded, accurately representing engineering knowledge, and authentic, reflecting technician language, jargon, and abbreviations. First, we extracted valid engineering paths from a knowledge graph constructed using the MaintIE gold-annotated industrial MWO dataset. Each path encodes engineering knowledge as a triple. These paths are used to constrain the output of an LLM (GPT-4o mini) to generate grounded synthetic MWOs using few-shot prompting. The synthetic MWOs are made authentic by incorporating human-like elements, such as contractions, abbreviations, and typos. Evaluation results show that the synthetic data is 86% as natural and 95% as correct as real MWOs. Turing test experiments reveal that subject matter experts could distinguish real from synthetic data only 51% of the time while exhibiting near-zero agreement, indicating random guessing. Statistical hypothesis testing confirms the results of the Turing test. This research offers a generic approach to extracting legitimate paths from a knowledge graph to ensure that the synthetic data generated are grounded in engineering knowledge while reflecting the style and language of the technicians who write them. To enable replication and reuse, code, data, and documentation are available at https://github.com/nlp-tlp/LLM-KG-Synthetic-MWO
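The "authenticity" step described above, in which clean LLM output is perturbed with contractions, abbreviations, and typos, can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the abbreviation map, the seeded typo model, and all function names here are illustrative assumptions.

```python
import random

# Hypothetical technician shorthand; real abbreviations vary by site and trade.
ABBREVIATIONS = {
    "replace": "repl",
    "pump": "pmp",
    "not working": "n/w",
    "hydraulic": "hyd",
}

def introduce_typo(word: str, rng: random.Random) -> str:
    """Drop one interior character to simulate a common typing slip."""
    if len(word) < 4:
        return word
    i = rng.randrange(1, len(word) - 1)
    return word[:i] + word[i + 1:]

def make_authentic(text: str, typo_rate: float = 0.1, seed: int = 0) -> str:
    """Apply abbreviations, then occasional typos, to a clean synthetic MWO."""
    rng = random.Random(seed)
    lowered = text.lower()
    for phrase, abbrev in ABBREVIATIONS.items():
        lowered = lowered.replace(phrase, abbrev)
    words = [
        introduce_typo(w, rng) if rng.random() < typo_rate else w
        for w in lowered.split()
    ]
    return " ".join(words)

print(make_authentic("Replace hydraulic pump not working"))
# e.g. "repl hyd pmp n/w"
```

A seeded generator keeps the perturbations reproducible, which matters for the kind of human-evaluation experiments (Turing tests, naturalness ratings) the study reports.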
ISSN: 2169-3536