Minimum Divergence Methods in Statistical Machine Learning: From an Information Geometric Viewpoint
Typically, the KL divergence leads to the exponential model and maximum likelihood estimation. If U is chosen to be the exponential function, then the corresponding U-entropy and U-divergence reduce to the Boltzmann-Shannon entropy and the KL divergence, and the minimum U-divergence estimator is equivalent to the MLE.
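To make this reduction concrete, here is a short worked derivation as a sketch. It assumes the standard definition of the U-divergence from this literature, built from a convex function $U$ with derivative $u = U'$ and inverse link $\xi = u^{-1}$; these formulas are supplied here for illustration and are not quoted from the excerpt, and sign and normalization conventions may differ slightly from the book's.

% Assumed U-divergence between densities f and g:
%   D_U(f, g) = \int \{ U(\xi(g)) - U(\xi(f)) - f (\xi(g) - \xi(f)) \} \, dx.
% Choosing U(t) = e^t gives u(t) = e^t and \xi(s) = \log s, hence
\begin{align*}
D_U(f, g) &= \int \bigl\{ e^{\log g} - e^{\log f} - f(\log g - \log f) \bigr\} \, dx \\
          &= \int (g - f) \, dx + \int f \log \frac{f}{g} \, dx
           = \mathrm{KL}(f, g),
\end{align*}
% since \int f \, dx = \int g \, dx = 1. Likewise, the U-entropy
%   H_U(f) = \int \{ U(\xi(f)) - f \, \xi(f) \} \, dx = 1 - \int f \log f \, dx
% equals the Boltzmann-Shannon entropy up to an additive constant, so in this
% case minimizing the empirical U-divergence reproduces maximum likelihood.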