Kolmogorov Capacity with Overlap
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-04-01 |
| Series: | Entropy |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1099-4300/27/5/472 |
| Summary: | The notion of δ-mutual information between non-stochastic uncertain variables is introduced as a generalization of Nair’s non-stochastic information functional. Several properties of this new quantity are illustrated and used in a communication setting to show that the largest δ-mutual information between received and transmitted codewords over ϵ-noise channels equals the (ϵ, δ)-capacity. This notion of capacity generalizes the Kolmogorov ϵ-capacity to packing sets of overlap at most δ and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, including non-stochastic, memoryless, and stationary channels. The presented theory admits the possibility of decoding errors, as in classical information theory, while retaining the worst-case, non-stochastic character of Kolmogorov’s approach. |
| ISSN: | 1099-4300 |
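For context on the quantity the summary says is being generalized, the sketch below states the classical Kolmogorov ϵ-capacity in its standard form. The notation (a set A, a metric, and the packing number M_ϵ(A)) is assumed here and is not taken from the paper, whose exact (ϵ, δ)-definition may differ in detail.

```latex
% Sketch only: the classical Kolmogorov epsilon-capacity of a set A in a
% metric space, stated in the standard Kolmogorov--Tikhomirov form.
% The symbols A and M_epsilon(A) are our notation, not the paper's.
\[
  \mathcal{C}_\epsilon(A) \;=\; \log_2 M_\epsilon(A),
\]
% where M_epsilon(A) is the maximal cardinality of an epsilon-distinguishable
% subset of A, i.e. a set of points of A whose pairwise distances all exceed
% epsilon, so that the associated uncertainty balls form a disjoint packing.
% Per the summary above, the (epsilon, delta)-capacity relaxes this disjointness,
% allowing the packing sets to overlap by at most delta; presumably the choice
% delta = 0 recovers the classical quantity.
```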