Team:TU Darmstadt/Modelling/Statistics











Information Theory

The DKL Analysis

In information theory, the Kullback-Leibler divergence (DKL) describes and quantifies the difference between two probability distributions P and Q, where P denotes an experimental distribution that is compared against a reference distribution Q. DKL is also known as relative entropy; the closely related mutual information is the DKL between a joint distribution and the product of its marginals. Although DKL is often used as a distance measure, it is not a true metric because it is not symmetric: D_KL(P||Q) generally differs from D_KL(Q||P).

D_{\mathrm{KL}}(P \parallel Q) = \sum_{i} P(i) \, \ln \frac{P(i)}{Q(i)}
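The following is a minimal sketch (not part of the original page) of how this definition can be evaluated for two discrete distributions given as probability vectors; the function name kl_divergence and the example distributions P and Q are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) for two discrete distributions.

    p, q -- probability vectors over the same support; both should sum to 1,
            and q must be nonzero wherever p is nonzero.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with p > 0 contribute; by convention 0 * ln(0/q) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: an experimental distribution P compared with a uniform reference Q.
P = [0.5, 0.3, 0.2]
Q = [1/3, 1/3, 1/3]
print(kl_divergence(P, Q))   # D_KL(P || Q)
print(kl_divergence(Q, P))   # differs from the value above: DKL is not symmetric
```

Running the two calls with the arguments swapped illustrates the asymmetry mentioned above: the two results are not equal, which is why DKL is not a true metric.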










