Function std.numeric.kullbackLeiblerDivergence
Computes the Kullback-Leibler divergence between input ranges a and b, which is the sum over i of a[i] * log2(a[i] / b[i]). The base of the logarithm is 2. The ranges are assumed to contain elements in [0, 1]. Usually the ranges are normalized probability distributions, but this is neither required nor checked by kullbackLeiblerDivergence. If any element b[i] is zero and the corresponding element a[i] is nonzero, the function returns infinity. (Otherwise, if a[i] == 0 && b[i] == 0, the term a[i] * log2(a[i] / b[i]) is considered zero.) If the inputs are normalized, the result is nonnegative.
CommonType!(ElementType!Range1, ElementType!Range2)
kullbackLeiblerDivergence(Range1, Range2)(Range1 a, Range2 b)
if (isInputRange!Range1 && isInputRange!Range2);
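For illustration, the rules above can be transcribed directly into code. The following is a minimal sketch of the computation on plain arrays, under the stated zero-handling rules; naiveKLDivergence is a hypothetical helper for exposition, not the Phobos implementation, which accepts arbitrary input ranges.

import std.math : log2, isInfinity;

double naiveKLDivergence(const double[] a, const double[] b)
{
    double result = 0.0;
    foreach (i; 0 .. a.length)
    {
        if (a[i] == 0) continue;                // a 0/0 term is treated as zero
        if (b[i] == 0) return double.infinity;  // a has mass where b has none
        result += a[i] * log2(a[i] / b[i]);
    }
    return result;
}

unittest
{
    double[] p  = [ 0.0, 0, 0, 1 ];
    double[] p1 = [ 0.25, 0.25, 0.25, 0.25 ];
    assert(naiveKLDivergence(p, p1) == 2);       // single term: 1 * log2(4) = 2
    assert(naiveKLDivergence(p1, p).isInfinity); // p1 is nonzero where p is zero
}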
Example
import std.math.operations : isClose;
import std.stdio : writeln;

double[] p = [ 0.0, 0, 0, 1 ];
writeln(kullbackLeiblerDivergence(p, p)); // 0
double[] p1 = [ 0.25, 0.25, 0.25, 0.25 ];
writeln(kullbackLeiblerDivergence(p1, p1)); // 0
writeln(kullbackLeiblerDivergence(p, p1)); // 2
writeln(kullbackLeiblerDivergence(p1, p)); // double.infinity
double[] p2 = [ 0.2, 0.2, 0.2, 0.4 ];
assert(isClose(kullbackLeiblerDivergence(p1, p2), 0.0719281, 1e-5));
assert(isClose(kullbackLeiblerDivergence(p2, p1), 0.0780719, 1e-5));
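To verify two of these values by hand: kullbackLeiblerDivergence(p, p1) reduces to the single nonzero term 1 * log2(1 / 0.25) = log2(4) = 2, and kullbackLeiblerDivergence(p1, p2) = 3 * 0.25 * log2(0.25 / 0.2) + 0.25 * log2(0.25 / 0.4) = 0.75 * log2(1.25) + 0.25 * log2(0.625) ≈ 0.241446 - 0.169518 ≈ 0.0719281.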
Authors
Andrei Alexandrescu, Don Clugston, Robert Jacques, Ilya Yaroshenko
License
Copyright © 1999-2024 by the D Language Foundation.