Team:TU Darmstadt/Modelling/Statistics
Information Theory
The DKL Analysis

In information theory, the Kullback-Leibler divergence (DKL) quantifies the difference between two probability distributions P and Q. Here P denotes an experimental distribution, which is compared against a reference distribution Q. DKL is also known as 'relative entropy'; the closely related mutual information is the DKL between a joint distribution and the product of its marginals. Although DKL is often used as a distance measure, it is not a true metric, because it is neither symmetric nor does it satisfy the triangle inequality.
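For discrete distributions defined over the same sample space, the standard definition (here in nats, using the natural logarithm) is:

D_{KL}(P \,\|\, Q) = \sum_{i} P(i)\,\ln\!\frac{P(i)}{Q(i)}

The following is a minimal sketch in Python/NumPy of how this quantity can be computed and of its asymmetry; the function name kl_divergence and the example distributions are illustrative and not taken from the original page:

import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability vectors.
    q must be nonzero wherever p is nonzero; otherwise the divergence is infinite."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0               # terms with P(i) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Illustrative distributions: swapping the arguments gives different values,
# which is why DKL is not a symmetric distance.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(kl_divergence(p, q))  # about 0.085 nats
print(kl_divergence(q, p))  # about 0.092 nats, not equal to the value above

Swapping P and Q yields different values because each term is weighted by P(i), so the divergence emphasizes regions where the first distribution places its probability mass.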