Using low-discrepancy points for data compression in machine learning: an experimental comparison
Abstract: Low-discrepancy points (also called Quasi-Monte Carlo points) are deterministically and cleverly chosen point sets in the unit cube, which provide an approximation of the uniform distribution. We explore two methods based on such low-discrepancy points to reduce large data sets in order to...
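To make the abstract's notion concrete: a classical example of a deterministic low-discrepancy construction is the Halton sequence, whose coordinates are radical inverses in coprime bases. This is a minimal illustrative sketch of that general idea, not necessarily the specific construction used by the authors.

```python
def radical_inverse(n, base):
    """Van der Corput radical inverse: mirror the base-b digits of n
    across the radix point, yielding a value in [0, 1)."""
    inv, denom = 0.0, 1.0
    while n > 0:
        n, digit = divmod(n, base)
        denom *= base
        inv += digit / denom
    return inv

def halton(n_points, bases=(2, 3)):
    """First n_points of the Halton sequence in the unit cube.

    Dimension equals len(bases); the bases must be pairwise coprime
    (typically the first primes) for low discrepancy.
    """
    return [tuple(radical_inverse(i, b) for b in bases)
            for i in range(1, n_points + 1)]

# 8 two-dimensional Halton points, evenly spread over [0, 1)^2
points = halton(8)
```

Unlike pseudo-random samples, these points are reproducible and fill the cube more evenly, which is what makes such sets candidates for representing (compressing) a large data set by a small one.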
Main Authors: S. Göttlich, J. Heieck, A. Neuenkirch
Format: Article
Language: English
Published: SpringerOpen, 2025-01-01
Series: Journal of Mathematics in Industry
Online Access: https://doi.org/10.1186/s13362-024-00166-5
Similar Items
- Variation Comparison of OLS and GLS Estimators using Monte Carlo Simulation of Linear Regression Model with Autoregressive Scheme
  by: Sajid AliKhan, et al. Published: (2021-02-01)
- Research of the Algorithm for Robot Workspace Boundary Extraction
  by: Cui Zhihong, et al. Published: (2016-01-01)
- STATISTICAL SIMULATION OF RELIABILITY OF NETWORKS WITH EXPONENTIALLY DISTRIBUTED UNIT LIFETIMES
  by: ROTARU, Maria. Published: (2024-09-01)
- Bregman divergences for physically informed discrepancy measures for learning and computation in thermomechanics
  by: Andrieux, Stéphane. Published: (2023-02-01)
- Discrepancy evaluation of social reconstruction-based curriculum implementation at Sekolah Rimba Indonesia
  by: Rika Yustikarini. Published: (2023-12-01)