67 Amount of self-information
Self-information and entropy are fundamental concepts that capture how rare a piece of information is and how uncertain a situation is on average. The amount of self-information represents the amount of "surprise" when an event occurs: the less likely an event is, the greater the surprise when it does occur, and the greater the self-information. For example, suppose someone lives in a ten-story apartment building and every floor is equally likely; each specific floor then has only a one-in-ten chance, so learning that the person lives on the tenth floor carries a lot of surprise. Conversely, if you know that most of the residents are concentrated on the first floor, it is much less of a surprise to learn that this person also lives on the first floor.
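A worked version of this intuition, assuming the conventional definition with a base-2 logarithm (so the units are bits), and with the 0.7 in the second case chosen here purely for illustration:

\[
I(x) = -\log_2 p(x)
\]
\[
p(\text{tenth floor}) = \tfrac{1}{10} \;\Rightarrow\; I = \log_2 10 \approx 3.32 \text{ bits}, \qquad
p(\text{first floor}) = 0.7 \;\Rightarrow\; I = -\log_2 0.7 \approx 0.51 \text{ bits}.
\]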
Entropy is the probability-weighted average of the surprise over all possible events. You can think of it as a measure of the "average uncertainty" of the entire situation. In the apartment metaphor, if every floor is equally likely, it is difficult to guess which floor the answer will be, uncertainty is at its maximum, and entropy is high. If the possibilities are concentrated on a particular floor, prediction is easier, uncertainty is lower, and so is the entropy.
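A minimal numerical sketch of this comparison in Python (the biased probabilities below are illustrative numbers chosen here, not values from the text):

import math

def entropy(probs):
    # Shannon entropy in bits: the probability-weighted average surprise.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform case: each of the ten floors is equally likely.
uniform = [0.1] * 10

# Biased case (illustrative numbers): most residents on the first floor.
biased = [0.7] + [0.3 / 9] * 9

print(entropy(uniform))  # log2(10), about 3.32 bits: maximal uncertainty
print(entropy(biased))   # about 1.83 bits: easier to predict, lower entropy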
From a mathematical point of view, self-information is the quantity uniquely determined (up to the base of the logarithm) by three requirements: it grows as the event becomes less likely, it adds up when independent events occur together, and it depends only on the event's probability. Entropy is the average of that self-information and is a tool for measuring the spread of a probability distribution. It acts as the criterion when choosing the shortest average code length, or the least biased distribution under given constraints. The same idea carries over to continuous quantities and shares the same framework as entropy in statistical physics.
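A compact sketch of the derivation behind these three requirements, written in the usual notation (not given in the text itself): let I(p) denote the self-information of an event with probability p. Additivity over independent events, together with monotonicity and continuity, forces a logarithmic form,

\[
I(p\,q) = I(p) + I(q) \quad\Rightarrow\quad I(p) = -k \log p \;\;(k > 0),
\]

and choosing the base of the logarithm (base 2 for bits) fixes k. Entropy is then the expectation of this quantity over the distribution,

\[
H(X) = \mathbb{E}[I(X)] = -\sum_x p(x) \log_2 p(x),
\]

with the continuous analogue replacing the sum by an integral, \( h(X) = -\int p(x) \log p(x)\, dx \), which is the form that connects to entropy in statistical physics.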
Philosophically, self-information measures the "intensity of the change in perception", the degree to which an observer's beliefs are updated. The more unlikely an event is, the greater its deviation from the existing outlook and the stronger the revision of the worldview. Entropy averages the magnitude of that revision over the whole situation and can also be read as a quantification of the "degree of ignorance". The more widely and evenly the possibilities are left open, the more diverse the futures that may unfold and the higher the entropy; the more biased the possibilities, the more deterministic the future and the lower the entropy. This view echoes the intuition of irreversibility in physics, and in historical and social discussions it can be read as the process by which a bundle of open possibilities is gradually sorted out over time.