A case study on entropy-aware block-based linear transforms for lossless image compression


Bibliographic Details
Main Authors: Borut Žalik, David Podgorelec, Ivana Kolingerová, Damjan Strnad, Štefan Kohek
Format: Article
Language: English
Published: Nature Portfolio 2024-11-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-024-79038-2
Description
Summary: Data compression algorithms tend to reduce information entropy, which is crucial especially for images, as they are data-intensive. In this regard, lossless image compression is especially challenging. Many popular lossless compression methods incorporate prediction and various types of pixel transformations to reduce the information entropy of an image. In this paper, a block optimisation programming framework $$\Phi$$ is introduced to support various experiments on raster images divided into blocks of pixels. Eleven methods were implemented within $$\Phi$$, including prediction methods, string transformation methods, and inverse distance weighting as a representative of interpolation methods. Thirty-two greyscale raster images with varying resolutions and contents were used in the experiments. It was shown that $$\Phi$$ reduces information entropy better than the popular JPEG LS and CALIC predictors. The additional information associated with each block in $$\Phi$$ was then evaluated. It was confirmed that, despite this additional cost, the estimated size in bytes is smaller than the sizes achieved by the JPEG LS and CALIC predictors.
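The per-block entropy measure the abstract refers to can be illustrated with a short sketch: the Shannon entropy of the pixel values in a greyscale block, computed with NumPy. The function name and block sizes below are illustrative and not taken from the paper.

```python
import numpy as np

def block_entropy(block: np.ndarray) -> float:
    """Shannon entropy, in bits per pixel, of a greyscale pixel block."""
    _, counts = np.unique(block, return_counts=True)
    p = counts / counts.sum()
    # H = -sum(p * log2 p); a single-valued block yields 0 bits.
    return float(-np.sum(p * np.log2(p)))

# A flat 8x8 block: one symbol, so zero entropy.
flat = np.full((8, 8), 128, dtype=np.uint8)

# 64 distinct values: the maximum for a 64-pixel block, log2(64) = 6 bits.
ramp = np.arange(64, dtype=np.uint8).reshape(8, 8)
```

A predictor or transform that concentrates block values (e.g. residuals near zero) lowers this measure, which is the quantity the compared methods aim to reduce before entropy coding.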
ISSN:2045-2322