Complex systems often exhibit multiple levels of organization covering a wide
range of physical scales, so it is frequently convenient to study the
hierarchical decomposition of their structure and function. To better understand
this phenomenon, we introduce a generalization of information theory that works
with hierarchical partitions. We begin by revisiting the recently introduced
Hierarchical Mutual Information (HMI) and show that it can be written as a
level-by-level summation of classical conditional mutual information terms.
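Schematically, for hierarchies $t$ and $s$ of depth $L$, with $t_l$ and $s_l$
the level-$l$ partitions and $t_0$, $s_0$ the trivial one-block partitions,
one such decomposition reads
\[
  I(t\,;s) \;=\; \sum_{l=1}^{L} I\!\left(t_l\,; s_l \,\middle|\, t_{l-1}, s_{l-1}\right),
\]
where each summand is a classical mutual information between the level-$l$
partitions, conditioned on the coarser partitions directly above them; the
notation here is a sketch for illustration.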
Then, we prove that the HMI is bounded from above by the corresponding
hierarchical joint entropy. In this way, in analogy to the classical case, we
derive hierarchical generalizations of many other classical
information-theoretic quantities. In particular, we prove that, unlike its
classical counterpart, the hierarchical generalization of the Variation of
Information is not a metric distance, although it admits a transformation into one.
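For reference, the classical Variation of Information satisfies the identity
$\mathrm{VI}(X;Y) = H(X,Y) - I(X;Y)$; in the same schematic notation as above,
the entropy bound guarantees that the analogous hierarchical quantity,
\[
  \mathrm{VI}(t\,;s) \;=\; H(t,s) \;-\; I(t\,;s) \;\geq\; 0,
\]
is non-negative, even though, unlike in the classical case, it fails the
triangle inequality.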
Moreover, focusing on potential applications of the existing developments of
the theory, we show how to adjust the HMI for chance, as sketched below. We
also corroborate and
analyze all the presented theoretical results with exhaustive numerical
computations, and include an illustrative application example of the introduced
formalism. Finally, we mention some open problems that should eventually be
addressed for the proposed generalization of information theory to reach
maturity.
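Regarding the chance adjustment referred to above, a minimal sketch, by analogy
with the classical Adjusted Mutual Information and under a generic random null
model for the hierarchies (both the null model and the normalization are
assumptions of this sketch), is
\[
  \mathrm{AHMI}(t\,;s) \;=\;
  \frac{I(t\,;s) \;-\; \mathbb{E}\!\left[I(t\,;s)\right]}
       {\max I \;-\; \mathbb{E}\!\left[I(t\,;s)\right]},
\]
so that the adjusted index vanishes on average for chance-level agreement and
equals one at the chosen upper bound $\max I$.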