Function std.algorithm.iteration.cacheBidirectional
cache eagerly evaluates front of range
on each construction or call to popFront,
to store the result in a cache.
The result is then directly returned when front is called,
rather than re-evaluated.
						
    auto cacheBidirectional(Range)(Range range)
    if (isBidirectionalRange!Range);
This can be a useful function to place in a chain, after functions
that have expensive evaluation, as a lazy alternative to std.array.array.
In particular, it can be placed after a call to map, or before a call
to filter or tee.
cache may provide
bidirectional range
iteration if needed, but since this comes at an increased cost, it must be explicitly requested via the
call to cacheBidirectional. Furthermore, a bidirectional cache will
evaluate the "center" element twice, when there is only one element left in
the range.
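For illustration, a minimal sketch (not the library's own example) of requesting a bidirectional cache so that the range can be consumed from either end:

```d
import std.algorithm.comparison : equal;
import std.algorithm.iteration : cacheBidirectional, map;
import std.range : iota, retro;

void main()
{
    // The map call stands in for an expensive computation; the
    // bidirectional cache stores the current front and the current back.
    auto r = iota(1, 4)
        .map!(a => a * 10)
        .cacheBidirectional();

    assert(r.front == 10);
    assert(r.back == 30);
    // Bidirectional iteration, e.g. via retro, is available.
    // Note the caveat above: once a single element remains, it may be
    // evaluated twice (once as the new front, once as the new back).
    assert(r.retro.equal([30, 20, 10]));
}
```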
cache does not provide random access primitives,
as cache would be unable to cache the random accesses.
If Range provides slicing primitives,
then cache will provide the same slicing primitives,
but hasSlicing!Cache will not yield true (as the hasSlicing
trait also checks for random access).
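A small sketch of the trait behavior described above, assuming a random-access, sliceable source range:

```d
import std.algorithm.iteration : cache, map;
import std.range : iota;
import std.range.primitives : hasSlicing, isRandomAccessRange;

void main()
{
    auto r = iota(0, 5).map!(a => a * 2).cache();

    // The source range is random access, but the cached range is not,
    // so hasSlicing reports false even though slicing may be forwarded.
    static assert(!isRandomAccessRange!(typeof(r)));
    static assert(!hasSlicing!(typeof(r)));
}
```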
Parameters
| Name | Description | 
|---|---|
| range | an input range | 
Returns
An input range with the cached values of range
Example
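An illustrative sketch (with a hypothetical evaluation counter, not the documentation's original example) showing that each element of the expensive map is computed exactly once:

```d
import std.algorithm.comparison : equal;
import std.algorithm.iteration : cache, filter, map;
import std.range : iota;

void main()
{
    int evaluations = 0;

    // The map lambda stands in for an expensive computation.
    auto r = iota(0, 4)
        .map!((a) { ++evaluations; return a * a; })
        .cache()               // each mapped element is evaluated once and stored
        .filter!(a => a > 0);  // filter reads front repeatedly, but hits the cache

    assert(r.equal([1, 4, 9]));
    // One evaluation at construction plus one per popFront that leaves
    // a non-empty range: 4 in total for the 4 source elements.
    assert(evaluations == 4);
}
```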
Tip
cache is eager when evaluating elements. If calling front on the
underlying range has a side effect, it will be observable before calling
front on the actual cached range.
Furthermore, care should be taken when composing cache with std.range.take.
By placing take before cache, cache will be "aware"
of when the range ends, and correctly stop caching elements when needed.
If calling front has no side effect though, placing take after cache
may yield a faster range.
Either way, the resulting ranges will be equivalent, but possibly not at the same cost or with the same side effects.
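A sketch of the difference (with an illustrative evaluation counter, not the documentation's own example):

```d
import std.algorithm.comparison : equal;
import std.algorithm.iteration : cache, map;
import std.range : iota, take;

void main()
{
    int evaluations = 0;
    auto source = iota(0, 10).map!((a) { ++evaluations; return a; });

    // take before cache: cache sees the shortened range, so it stops
    // evaluating after the third element.
    assert(source.take(3).cache().equal([0, 1, 2]));
    assert(evaluations == 3);

    // take after cache: cache eagerly evaluates one element beyond the
    // last one that take yields.
    evaluations = 0;
    assert(source.cache().take(3).equal([0, 1, 2]));
    assert(evaluations == 4);
}
```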