Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments
A major task in particle physics is the measurement of rare signal processes. Even modest improvements in background rejection, at a fixed signal efficiency, can significantly enhance the measurement sensitivity. Building on prior research by others that incorporated physical symmetries into neural networks, this work extends those ideas to include additional physics-motivated features. Specifically, we introduce energy-dependent particle interaction strengths, derived from leading-order SM predictions, into modern deep learning architectures, including Transformer Architectures (Particle Transformer), and Graph Neural Networks (Particle Net). These interaction strengths, represented as the SM interaction matrix, are incorporated into the attention matrix (transformers) and edges (graphs). Our results in event classification show that the integration of all physics-motivated features improves background rejection by $10\%-40\%$ over baseline models, with an additional gain of up to $9\%$ due to the SM interaction matrix. This study also provides one of the broadest comparisons of event classifiers to date, demonstrating how various architectures perform across this task. A simplified statistical analysis demonstrates that these enhanced architectures yield significant improvements in signal significance compared to a graph network baseline.
| Main Author: | Luc Builtjes, Sascha Caron, Polina Moskvitina, Clara Nellist, Roberto Ruiz de Austri, Rob Verheyen, Zhongyi Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | SciPost, 2025-07-01 |
| Series: | SciPost Physics |
| Online Access: | https://scipost.org/SciPostPhys.19.1.028 |
| Field | Value |
|---|---|
| author | Luc Builtjes, Sascha Caron, Polina Moskvitina, Clara Nellist, Roberto Ruiz de Austri, Rob Verheyen, Zhongyi Zhang |
| collection | DOAJ |
| description | A major task in particle physics is the measurement of rare signal processes. Even modest improvements in background rejection, at a fixed signal efficiency, can significantly enhance the measurement sensitivity. Building on prior research by others that incorporated physical symmetries into neural networks, this work extends those ideas to include additional physics-motivated features. Specifically, we introduce energy-dependent particle interaction strengths, derived from leading-order SM predictions, into modern deep learning architectures, including Transformer Architectures (Particle Transformer), and Graph Neural Networks (Particle Net). These interaction strengths, represented as the SM interaction matrix, are incorporated into the attention matrix (transformers) and edges (graphs). Our results in event classification show that the integration of all physics-motivated features improves background rejection by $10\%-40\%$ over baseline models, with an additional gain of up to $9\%$ due to the SM interaction matrix. This study also provides one of the broadest comparisons of event classifiers to date, demonstrating how various architectures perform across this task. A simplified statistical analysis demonstrates that these enhanced architectures yield significant improvements in signal significance compared to a graph network baseline. |
| format | Article |
| id | doaj-art-3cdf7fa308a5413bb0d8fbe70e7f04c3 |
| institution | Kabale University |
| issn | 2542-4653 |
| language | English |
| publishDate | 2025-07-01 |
| publisher | SciPost |
| record_format | Article |
| series | SciPost Physics |
| doi | 10.21468/SciPostPhys.19.1.028 |
| title | Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments |
| url | https://scipost.org/SciPostPhys.19.1.028 |
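The abstract describes injecting the SM interaction matrix into the attention matrix of a transformer (and into graph edges for ParticleNet-style models). The minimal PyTorch sketch below illustrates only the general idea of adding a pairwise interaction-strength bias to the pre-softmax attention logits; it is not the authors' code. The module name, the per-head bias scale, and all tensor shapes are assumptions, and the construction of the interaction matrix from leading-order SM predictions is not shown.

```python
import torch
from torch import nn


class BiasedSelfAttention(nn.Module):
    """Multi-head self-attention over per-particle embeddings with an
    additive pairwise bias, sketching how an (N x N) interaction-strength
    matrix could enter the attention matrix. Illustrative only."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        # Hypothetical choice: one learnable scale per head for the physics bias.
        self.bias_scale = nn.Parameter(torch.ones(num_heads))

    def forward(self, x: torch.Tensor, interaction: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_particles, dim); interaction: (batch, n_particles, n_particles)
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape each to (batch, heads, n_particles, head_dim).
        q, k, v = (t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        logits = q @ k.transpose(-2, -1) / self.head_dim ** 0.5   # (b, h, n, n)
        # Add the pairwise interaction-strength bias to the attention logits.
        logits = logits + self.bias_scale.view(1, -1, 1, 1) * interaction.unsqueeze(1)
        attn = logits.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)


# Toy usage: 2 events, 10 particles, 64-dimensional embeddings,
# with a placeholder interaction matrix standing in for the SM-derived one.
layer = BiasedSelfAttention(dim=64)
y = layer(torch.randn(2, 10, 64), torch.randn(2, 10, 10))
```

An analogous idea for the graph-based models would be to use the same pairwise strengths as edge features or edge weights; the exact placement used in the paper is described in the article itself.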