We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. This paper is an introduction to logical entropy as the direct measure of that definition of information. The formula for logical entropy goes back to the early twentieth century, but the current development comes from seeing the formula as the quantification of the information in a partition as the normalized number of distinctions or dits (ordered pairs of elements in different blocks) of the partition.

Just as the Laplace–Boole notion of probability, as the normalized number of elements in a subset, quantifies the logic of subsets, so logical entropy, as the normalized number of distinctions in a partition, quantifies the logic of partitions – hence the adjective “logical.” Partitions and subsets are mathematically dual concepts, so the logic of partitions is dual in that sense to the usual Boolean logic of subsets. The logical entropy of a partition is, in fact, a probability measure: the probability that a distinction or dit of the partition is obtained in two independent draws from the underlying set, just as the logical Laplace–Boole probability of a subset (or event) is the one-draw probability of obtaining an element of the subset.

Far from displacing the usual notion of Shannon entropy, the point is to show that the Shannon entropy of a partition is a different quantification of the same notion of information-as-distinctions: the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions of the given partition. Hence there is a uniform non-linear dit-to-bit transform that carries all the concepts of simple, joint, conditional, and mutual logical entropy into the corresponding formulas for Shannon entropy, where the latter are especially suited for the theory of coding and communications. And finally, logical entropy linearizes naturally to the corresponding quantum concept: the quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
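The two-draw interpretation and the dit-bit transform can be checked numerically. Below is a minimal sketch in plain Python; the three-block distribution p is an illustrative choice, not an example taken from the paper.

# A minimal numeric sketch. The distribution p is an illustrative
# choice, not an example from the paper.
from itertools import product
from math import log2

p = [0.5, 0.25, 0.25]  # block probabilities of a partition

# Logical entropy h(p) = 1 - sum(p_i^2): the probability that two
# independent draws fall in different blocks, i.e., yield a dit.
h = 1 - sum(pi**2 for pi in p)

# Check the two-draw interpretation directly by summing the probability
# of every ordered pair of draws that lands in two different blocks.
h_two_draw = sum(pi * pj
                 for (i, pi), (j, pj) in product(enumerate(p), repeat=2)
                 if i != j)
assert abs(h - h_two_draw) < 1e-12

# The dit-bit transform: write h(p) = sum(p_i * (1 - p_i)) and replace
# each one-draw dit probability (1 - p_i) by the bit count log2(1/p_i);
# the same template then gives the Shannon entropy H(p).
h_again = sum(pi * (1 - pi) for pi in p)   # logical entropy, 0.625 here
H = sum(pi * log2(1 / pi) for pi in p)     # Shannon entropy, 1.5 here
print(h, h_again, H)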
Edwin Jaynes’ MaxEntropy method is intended to generalize the Laplace indifference principle by determining the “best” probability distribution consistent with given constraints (e.g., constraints that rule out the uniform distribution of the indifference principle) by maximizing Shannon entropy subject to those constraints. We show that maximizing logical entropy subject to the same constraints gives a different probability distribution. The logical entropy solution is the closest to the uniform distribution in terms of the usual (Euclidean) notion of distance, while the Jaynes solution is the closest in terms of the Kullback–Leibler (KL) divergence from the uniform distribution.
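The contrast between the two maximizers is easy to see on a small case. The following optimization sketch assumes NumPy and SciPy are available; the three-outcome variable with values 1, 2, 3 and the mean constraint E[X] = 2.5 are illustrative choices, not an example taken from the paper.

# An optimization sketch (assumes NumPy and SciPy). The variable and
# constraint below are illustrative choices, not from the paper.
import numpy as np
from scipy.optimize import minimize

values = np.array([1.0, 2.0, 3.0])   # outcomes of a three-valued variable
target_mean = 2.5                    # E[X] = 2.5 rules out the uniform distribution
u = np.full(3, 1 / 3)                # the uniform distribution

cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # probabilities sum to 1
    {"type": "eq", "fun": lambda p: p @ values - target_mean}, # mean constraint
]
bounds = [(1e-9, 1.0)] * 3

# Jaynes: maximize Shannon entropy, i.e., minimize sum(p_i * log(p_i)).
jaynes = minimize(lambda p: np.sum(p * np.log(p)), u,
                  bounds=bounds, constraints=cons, method="SLSQP").x

# Logical: maximize logical entropy 1 - sum(p_i^2), i.e., minimize sum(p_i^2).
logical = minimize(lambda p: np.sum(p**2), u,
                   bounds=bounds, constraints=cons, method="SLSQP").x

def euclid(p):  # Euclidean distance to the uniform distribution
    return np.linalg.norm(p - u)

def kl(p):      # KL divergence KL(p || u) from the uniform distribution
    return float(np.sum(p * np.log(p / u)))

print("logical:", logical, "Euclid:", euclid(logical), "KL:", kl(logical))
print("Jaynes: ", jaynes, "Euclid:", euclid(jaynes), "KL:", kl(jaynes))
# The logical solution has the smaller Euclidean distance to uniform;
# the Jaynes solution has the smaller KL divergence from uniform.

The Euclidean claim is not an accident of the example: since sum(p_i) = 1 on the simplex, minimizing sum(p_i^2) is the same as minimizing the squared Euclidean distance to the uniform distribution, so the logical-entropy maximizer is the Euclidean projection of the uniform distribution onto the constraint set.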