An accessible explanation of KL divergence (in Korean):
https://angeloyeo.github.io/2020/10/27/KL_divergence.html (KL divergence - 공돌이의 수학정리노트)
Why cross-entropy is always greater than entropy:
"Why is the cross-entropy always more than the entropy? I understand intuitively why cross-entropy is always bigger. However, could someone show that mathematically?" (stats.stackexchange.com)
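The mathematical answer to that question rests on the decomposition H(p, q) = H(p) + KL(p ‖ q) together with Gibbs' inequality, KL(p ‖ q) ≥ 0. A minimal numeric sketch (the distributions `p` and `q` below are arbitrary examples, not from either linked post):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum p_i * log(q_i), in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = sum p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distributions (chosen arbitrarily for illustration).
p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]

h = entropy(p)
ce = cross_entropy(p, q)
kl = kl_divergence(p, q)

# Cross-entropy decomposes as H(p, q) = H(p) + KL(p || q);
# since KL >= 0 (Gibbs' inequality), H(p, q) >= H(p) always holds,
# with equality exactly when p == q.
assert abs(ce - (h + kl)) < 1e-12
assert kl >= 0
print(f"H(p)={h:.4f}  H(p,q)={ce:.4f}  KL={kl:.4f}")
```

The assertions verify the decomposition term by term; the inequality becomes an equality only when q matches p, which is why minimizing cross-entropy against a fixed p drives q toward p.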