Understanding the Constraints in Maximum Entropy Methods for Modeling and Inference
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 45(2023), 3, 23 March, pages 3994-3998
Author:
Additional authors:
Format: Online article
Language: English
Published: 2023
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Keywords: Journal Article
Abstract: The principle of maximum entropy, developed more than six decades ago, provides a systematic approach to modeling, inference, and data analysis grounded in the principles of information theory, Bayesian probability, and constrained optimization. Since its formulation, criticisms about the consistency of the method and the role of constraints have been raised. Chief among these is the claim that maximum entropy does not satisfy the principle of causation, or, similarly, that maximum entropy updating is inconsistent because it represents causal information inadequately. We show that these criticisms rest on a misunderstanding and misapplication of the way constraints have to be specified within the maximum entropy method. Correcting these problems eliminates the seeming paradoxes and inconsistencies that critics claim to have detected. We demonstrate that properly formulated maximum entropy models satisfy the principle of causation.
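As a point of reference for the constrained-optimization setting the abstract alludes to, the following is a minimal, hedged sketch (not code from the paper) of the textbook maximum entropy problem: choose the distribution that maximizes Shannon entropy subject to an expectation constraint. The six-sided die and the target mean of 4.5 are illustrative assumptions; the exponential-family solution and the Lagrange multiplier search are standard, not specific to the article's argument.

```python
# Sketch of maximum entropy under a single expectation constraint
# (Jaynes' die example, used here purely for illustration).
# The entropy-maximizing distribution has the form p_k ∝ exp(-lam * k),
# where the Lagrange multiplier lam is chosen so the constraint holds.
import numpy as np
from scipy.optimize import brentq

outcomes = np.arange(1, 7)   # faces of a six-sided die (assumed example)
target_mean = 4.5            # the expectation constraint E[X] = 4.5 (assumed)

def mean_given_lambda(lam):
    """Mean of the maximum-entropy distribution p_k ∝ exp(-lam * k)."""
    w = np.exp(-lam * outcomes)
    p = w / w.sum()
    return np.dot(p, outcomes)

# Solve the constraint equation mean(lam) = target_mean for lam.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)

weights = np.exp(-lam * outcomes)
p = weights / weights.sum()
print("Lagrange multiplier:", lam)
print("MaxEnt distribution:", np.round(p, 4))
print("Check constraint (mean):", np.dot(p, outcomes))
```

The point of the sketch is only that the constraints are what pin down the distribution; the paper's argument concerns how such constraints must be specified for the method to respect causal structure.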
Description: Date Completed 07.04.2023; Date Revised 11.04.2023; published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1939-3539
DOI: 10.1109/TPAMI.2022.3185394