						Function std.numeric.jensenShannonDivergence
Computes the Jensen-Shannon divergence between a and b, which is the
sum over i of (a[i] * log(2 * a[i] / (a[i] + b[i])) + b[i] * log(2 *
b[i] / (a[i] + b[i]))) / 2. The base of the logarithm is 2. The ranges are
assumed to contain elements in [0, 1]. Usually the ranges are
normalized probability distributions, but this is not required or
checked by jensenShannonDivergence. If the inputs are normalized,
the result is bounded within [0, 1]. The three-parameter version
stops evaluation as soon as the intermediate result is greater than
or equal to limit.
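To make the formula above concrete, here is a minimal sketch of the same computation written directly from the definition. The helper name jsd is hypothetical and not part of std.numeric; it assumes both slices have equal length and uses base-2 logarithms via std.math.log2.

```d
import std.math : log2, isClose;

// Hypothetical helper illustrating the formula:
// sum over i of (a[i] * log2(2*a[i]/(a[i]+b[i])) + b[i] * log2(2*b[i]/(a[i]+b[i]))) / 2
double jsd(const double[] a, const double[] b)
{
    double result = 0.0;
    foreach (i; 0 .. a.length)
    {
        immutable s = a[i] + b[i];
        // A zero numerator contributes nothing (the limit of x * log(x) as x -> 0 is 0).
        if (a[i] > 0) result += a[i] * log2(2 * a[i] / s);
        if (b[i] > 0) result += b[i] * log2(2 * b[i] / s);
    }
    return result / 2;
}

void main()
{
    double[] p1 = [0.25, 0.25, 0.25, 0.25];
    double[] p2 = [0.2, 0.2, 0.2, 0.4];
    assert(jsd(p1, p1) == 0);                           // identical inputs diverge by 0
    assert(isClose(jsd(p1, p2), 0.0186218, 1e-4));      // matches the library example below
    assert(isClose(jsd(p1, p2), jsd(p2, p1), 1e-12));   // the divergence is symmetric
}
```

Because each element contributes a non-negative term, a running sum can only grow, which is what makes the early-exit limit parameter of the three-parameter overload possible.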
						
CommonType!(ElementType!Range1, ElementType!Range2) jensenShannonDivergence(Range1, Range2)
(
  Range1 a,
  Range2 b
)
if (isInputRange!Range1 && isInputRange!Range2 && is(CommonType!(ElementType!Range1, ElementType!Range2)));
				
CommonType!(ElementType!Range1, ElementType!Range2) jensenShannonDivergence(Range1, Range2, F)
(
  Range1 a,
  Range2 b,
  F limit
)
if (isInputRange!Range1 && isInputRange!Range2 && is(typeof(CommonType!(ElementType!Range1, ElementType!Range2).init >= F.init) : bool));
Example
import std.math.operations : isClose;
import std.stdio : writeln;

double[] p = [ 0.0, 0, 0, 1 ];
writeln(jensenShannonDivergence(p, p)); // 0

double[] p1 = [ 0.25, 0.25, 0.25, 0.25 ];
writeln(jensenShannonDivergence(p1, p1)); // 0
assert(isClose(jensenShannonDivergence(p1, p), 0.548795, 1e-5));

double[] p2 = [ 0.2, 0.2, 0.2, 0.4 ];
assert(isClose(jensenShannonDivergence(p1, p2), 0.0186218, 1e-5));
assert(isClose(jensenShannonDivergence(p2, p1), 0.0186218, 1e-5));

// With a limit, evaluation stops early once the partial sum reaches it.
assert(isClose(jensenShannonDivergence(p2, p1, 0.005), 0.00602366, 1e-5));
Authors
Andrei Alexandrescu, Don Clugston, Robert Jacques, Ilya Yaroshenko
License
					Copyright © 1999-2024 by the D Language Foundation | Page generated by ddox.