Function std.algorithm.iteration.cacheBidirectional

cache eagerly evaluates the front of range on construction and on each call to popFront, storing the result in a cache. When front is then called, the cached result is returned directly rather than re-evaluated.

auto cacheBidirectional(Range) (
  Range range
)
if (isBidirectionalRange!Range);

This can be a useful function to place in a chain after functions that have expensive evaluation, as a lazy alternative to array. In particular, it can be placed after a call to map, or before a call to filter or std.range.tee.
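
For instance, here is a minimal sketch of the caching behaviour described above (the squares and evaluations names are purely illustrative):

import std.algorithm.iteration : cache, map;

int evaluations = 0;
auto squares = [1, 2, 3].map!((x) { ++evaluations; return x * x; }).cache();

// the first element was computed eagerly when cache was constructed
assert(evaluations == 1);

// reading front repeatedly returns the stored value without re-evaluating
assert(squares.front == 1);
assert(squares.front == 1);
assert(evaluations == 1);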

cache may provide bidirectional range iteration if needed, but since this comes at an increased cost, it must be explicitly requested via the call to cacheBidirectional. Furthermore, a bidirectional cache will evaluate the "center" element twice, when there is only one element left in the range.
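A short sketch of requesting bidirectional iteration explicitly (the names here are illustrative):

import std.algorithm.comparison : equal;
import std.algorithm.iteration : cacheBidirectional, map;
import std.range : retro;

auto r = [1, 2, 3].map!(x => x * 10).cacheBidirectional();

// both ends are available, at the cost of a second internal cache
assert(r.front == 10);
assert(r.back == 30);
assert(r.retro.equal([30, 20, 10]));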

cache does not provide random access primitives, as cache would be unable to cache the random accesses. If Range provides slicing primitives, then cache will provide the same slicing primitives, but hasSlicing!Cache will not yield true (as the hasSlicing trait also checks for random access).
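A sketch of the documented trait behaviour, assuming a sliceable source range such as an array:

import std.algorithm.iteration : cache;
import std.range.primitives : hasSlicing;

auto c = [1, 2, 3, 4].cache();

// slicing is forwarded to the source range ...
auto s = c[1 .. 3];
assert(s.front == 2);

// ... but the trait stays false, because random access is not provided
static assert(!hasSlicing!(typeof(c)));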

Parameters

Name     Description
range    an input range

Returns

An input range with the cached values of range

Example

import std.algorithm.comparison : equal;
import std.algorithm.iteration : cache, filter, map;
import std.array : array;
import std.range, std.stdio;
import std.typecons : tuple;

ulong counter = 0;
double fun(int x)
{
    ++counter;
    // http://en.wikipedia.org/wiki/Quartic_function
    return ( (x + 4.0) * (x + 1.0) * (x - 1.0) * (x - 3.0) ) / 14.0 + 0.5;
}
// Without cache, with array (greedy)
auto result1 = iota(-4, 5).map!(a => tuple(a, fun(a)))()
                          .filter!(a => a[1] < 0)()
                          .map!(a => a[0])()
                          .array();

// the values of x that have a negative y are:
assert(equal(result1, [-3, -2, 2]));

// Check how many times fun was evaluated.
// As many times as the number of items in both source and result.
writeln(counter); // iota(-4, 5).length + result1.length

counter = 0;
// Without array, with cache (lazy)
auto result2 = iota(-4, 5).map!(a => tuple(a, fun(a)))()
                          .cache()
                          .filter!(a => a[1] < 0)()
                          .map!(a => a[0])();

// the values of x that have a negative y are:
assert(equal(result2, [-3, -2, 2]));

// Check how many times fun was evaluated.
// Only as many times as the number of items in source.
writeln(counter); // iota(-4, 5).length

Example

Tip

cache is eager when evaluating elements. If calling front on the underlying range has a side effect, it will be observable before calling front on the actual cached range.

Furthermore, care should be taken when composing cache with std.range.take. If take is placed before cache, then cache will be "aware" of when the range ends and will correctly stop caching elements when needed. If calling front has no side effects, though, placing take after cache may yield a faster range.

Either way, the resulting ranges will be equivalent, but possibly not with the same cost or side effects.

import std.algorithm.comparison : equal;
import std.algorithm.iteration : cache;
import std.range;
import std.typecons : No;
int i = 0;

auto r = iota(0, 4).tee!((a){i = a;}, No.pipeOnPop);
auto r1 = r.take(3).cache();
auto r2 = r.cache().take(3);

assert(equal(r1, [0, 1, 2]));
assert(i == 2); //The last "seen" element was 2. The data in cache has been cleared.

assert(equal(r2, [0, 1, 2]));
assert(i == 3); //cache has accessed 3. It is still stored internally by cache.

Authors

Andrei Alexandrescu

License

Boost License 1.0.