Abstract
The principle of minimum cross-entropy (ME-principle) is often used as an elegant and powerful tool for building complete probability distributions when only partial knowledge is available. Its inputs are a prior distribution P and some new information R, and it yields as a result the one distribution P* that satisfies R and is closest to P in an information-theoretic sense. More generally, it provides a “best” solution to the problem “How to adjust P to R?”
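The adjustment described above can be sketched concretely for a simple case. The example below is illustrative and not taken from the abstract: it assumes a finite prior distribution and new information R given as a single mean constraint, for which the minimum cross-entropy solution is known to be an exponential tilt of the prior, P*_i ∝ P_i·exp(λx_i), with λ chosen so the constraint holds (found here by bisection). The die scenario is a hypothetical worked example.

```python
import math

def me_adjust(prior, x, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Minimum cross-entropy update of `prior` subject to E[x] = target_mean.

    For a single linear (mean) constraint, the minimizer of the
    cross-entropy (KL divergence) to the prior has the exponential-tilt
    form p*_i ∝ prior_i * exp(lam * x_i); we find lam by bisection,
    using the fact that the tilted mean is increasing in lam.
    """
    def tilted(lam):
        w = [p * math.exp(lam * xi) for p, xi in zip(prior, x)]
        z = sum(w)
        return [wi / z for wi in w]

    def mean(dist):
        return sum(p * xi for p, xi in zip(dist, x))

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean(tilted(mid)) < target_mean:
            lo = mid
        else:
            hi = mid
    return tilted((lo + hi) / 2.0)

# Hypothetical example: prior P is a fair six-sided die;
# new information R says the mean roll is 4.5.
prior = [1.0 / 6.0] * 6
faces = [1, 2, 3, 4, 5, 6]
post = me_adjust(prior, faces, 4.5)
```

The resulting P* still sums to one, meets the mean constraint exactly (to numerical tolerance), and shifts probability mass toward the high faces while staying as close to the uniform prior as the constraint allows.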