Team:TU Darmstadt/Modelling/Statistics
Information Theory
The DKL Analysis

In information theory, the Kullback–Leibler divergence (DKL) describes and quantifies the difference between two probability distributions P and Q, where P denotes an experimental distribution that is compared against a reference distribution Q. DKL is also known as 'relative entropy'; it is closely related to mutual information, which is the Kullback–Leibler divergence between a joint distribution and the product of its marginals. Although DKL is often used as a distance measure, it is not a true metric because it is not symmetric.

<center>
<img alt="DKL" src="/wiki/images/7/71/DKL.png" width="555" height="138">
</center>
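To illustrate the definition and the asymmetry mentioned above, here is a minimal Python sketch of the discrete form of DKL, assuming the standard definition D_KL(P||Q) = Σ_i P(i) ln(P(i)/Q(i)) (presumably what the formula image above shows). The distributions p and q below are hypothetical examples, not project data.

<pre>
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D_KL(P || Q) = sum_i P(i) * ln(P(i) / Q(i))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalise so both inputs are proper probability distributions.
    p = p / p.sum()
    q = q / q.sum()
    # Terms with P(i) = 0 contribute 0 by convention.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical experimental distribution P and reference distribution Q.
p = [0.6, 0.3, 0.1]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # ~0.088 nats
print(kl_divergence(q, p))  # ~0.092 nats: a different value, so DKL is not symmetric
</pre>

Because D_KL(P||Q) and D_KL(Q||P) generally differ, the order of the arguments matters: the experimental distribution is conventionally placed first and the reference distribution second.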