Class ValueBiasedStochasticSampling<T extends Copyable<T>>
- Type Parameters:
T - The type of object under optimization.
- All Implemented Interfaces:
Splittable<TrackableSearch<T>>, Metaheuristic<T>, SimpleMetaheuristic<T>, TrackableSearch<T>
Although the VBSS algorithm itself is not restricted to permutation problems, the examples that follow in this documentation focus on permutations for illustrative purposes.
Consider an illustrative example: a problem whose solution is to be represented by a permutation of length L of the integers {0, 1, ..., (L-1)}. For the sake of this example, let L=4. Thus, we are searching for a permutation of the integers {0, 1, 2, 3}. We will iteratively build a partial permutation, containing a subset of the elements, into a complete permutation.
We begin with an empty partial permutation: p = []. The first iteration will select the first element. To do so, we use a heuristic to evaluate each option within the context of the problem. Let S be the set of elements not yet in the permutation, which in this case is initially S = {0, 1, 2, 3}. Let h(p, e) be a constructive heuristic that takes as input the current partial permutation p, and an element e from S under consideration for addition to p, and which produces a real value as output that increases as the importance of adding e to p increases. That is, higher values of the heuristic imply that the heuristic has a higher level of confidence that the element e should be added next to p. For the sake of this example, consider that the heuristic values are as follows: h([], 0) = 5, h([], 1) = 7, h([], 2) = 1, and h([], 3) = 1. The heuristic seems to favor element 1 the most, and has element 0 as its second choice, and doesn't seem to think that elements 2 and 3 are very good choices relative to the others.
Now, VBSS also uses a bias function b. We will compute b(h(p, e)) for each element e under consideration for addition to partial permutation p. A common form of the bias function is b(h(p, e)) = h(p, e)^a for some exponent a. The greater your confidence is in the decision making ability of the heuristic, the greater the exponent a should be. We'll use a=2 for this example. So given our heuristic values from above, we have the following: b(h([], 0)) = 25, b(h([], 1)) = 49, b(h([], 2)) = 1, and b(h([], 3)) = 1.
The element that is added to the partial permutation p is then determined randomly such that the probability P(e) of choosing element e is proportional to b(h(p, e)). In this example, P(0) = 25 / (25+49+1+1) = 25 / 76 = 0.329. P(1) = 49 / 76 = 0.645. P(2) = 1 / 76 = 0.013. And likewise P(3) = 0.013. So in slightly less than two out of every three samples, VBSS will begin the permutation with element 1 in this example.
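To make the biased selection step concrete, here is a minimal, self-contained Java sketch (illustration only, not code from this library) that computes the biases and selection probabilities for the heuristic values assumed above, and then makes one draw proportional to the biases.

import java.util.concurrent.ThreadLocalRandom;

public class BiasedSelectionSketch {
  public static void main(String[] args) {
    // Heuristic values h([], e) for elements 0..3, as assumed in the example above.
    double[] h = {5, 7, 1, 1};
    double exponent = 2.0;

    // Bias each heuristic value: b(h) = h^2, giving 25, 49, 1, 1.
    double[] bias = new double[h.length];
    double total = 0;
    for (int e = 0; e < h.length; e++) {
      bias[e] = Math.pow(h[e], exponent);
      total += bias[e];
    }

    // Selection probabilities P(e) = bias[e] / total: 0.329, 0.645, 0.013, 0.013.
    for (int e = 0; e < h.length; e++) {
      System.out.printf("P(%d) = %.3f%n", e, bias[e] / total);
    }

    // One proportional (roulette wheel) draw over the biases.
    double r = ThreadLocalRandom.current().nextDouble(total);
    int chosen = 0;
    while (r >= bias[chosen]) {
      r -= bias[chosen];
      chosen++;
    }
    System.out.println("chosen element: " + chosen);
  }
}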
For the sake of the example, let's assume that element 1 was chosen above. We now have partial permutation p = [1], and the set of elements not yet added to p is S = {0, 2, 3}. We need to compute h([1], e) for each of the elements e from S, and then compute b(h([1], e)). For most problems, the heuristic values would have changed. For example, if the problem were the traveling salesperson, then the heuristic might be in terms of the distance from the last city already in the partial permutation (favoring nearby cities). As you move from city to city, which cities are nearest will change. This is why one of the inputs to the heuristic must be the current partial permutation. So let's assume for the example that we recompute the heuristic and get the following values: h([1], 0) = 1, h([1], 2) = 4, and h([1], 3) = 3. When we compute the biases, we get: b(h([1], 0)) = 1, b(h([1], 2)) = 16, and b(h([1], 3)) = 9. The selection probabilities are thus: P(0) = 1 / (1+16+9) = 1 / 26 = 0.038; P(2) = 16 / 26 = 0.615; and P(3) = 9 / 26 = 0.346. Although there is a much higher probability of selecting element 2, there is also a reasonably high chance of VBSS selecting element 3 in this case. Let's consider that it did choose element 3. Thus, p is now p = [1, 3] and S = {0, 2}.
One final decision is needed in this example. Let h([1, 3], 0) = 5, and h([1, 3], 2) = 6, which means b(h([1, 3], 0)) = 25, and b(h([1, 3], 2)) = 36. The selection probabilities are then: P(0) = 25 / (25+36) = 25 / 61 = 0.41, and P(2) = 36 / 61 = 0.59. Let's say that element 2 was chosen, so that now we have p = [1, 3, 2]. Since only one element remains, it is added as well to get p = [1, 3, 2, 0]. This permutation is then evaluated with the optimization problem's cost function, and the entire process is repeated N times, ultimately returning the best (lowest cost) of the N randomly sampled solutions.
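Putting the steps of the example together, the following self-contained sketch shows the full construction loop: repeatedly bias the heuristic values of the remaining elements, pick one proportionally at random, and append it to the partial permutation. It is purely illustrative; the heuristic here is a hypothetical stand-in passed as a plain function, whereas this class works with the ConstructiveHeuristic interface described below.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ThreadLocalRandom;
import java.util.function.BiFunction;

public class VbssConstructionSketch {

  // Builds one permutation of {0, ..., length-1} by repeated biased random selection.
  // The heuristic maps (partial permutation p, candidate element e) to a positive
  // value, playing the role of h(p, e) in the example above.
  public static List<Integer> construct(
      int length, double exponent, BiFunction<List<Integer>, Integer, Double> heuristic) {
    List<Integer> p = new ArrayList<>();          // the partial permutation, initially empty
    List<Integer> remaining = new ArrayList<>();  // the set S of elements not yet in p
    for (int e = 0; e < length; e++) remaining.add(e);

    while (!remaining.isEmpty()) {
      // Compute b(h(p, e)) = h(p, e)^exponent for each candidate e still in S.
      double[] bias = new double[remaining.size()];
      double total = 0;
      for (int i = 0; i < remaining.size(); i++) {
        bias[i] = Math.pow(heuristic.apply(p, remaining.get(i)), exponent);
        total += bias[i];
      }
      // Choose an element with probability proportional to its bias, then add it to p.
      double r = ThreadLocalRandom.current().nextDouble(total);
      int i = 0;
      while (r >= bias[i]) {
        r -= bias[i];
        i++;
      }
      p.add(remaining.remove(i));
    }
    return p;
  }
}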
To use this implementation of VBSS, you will need to implement a constructive heuristic for
your problem using the ConstructiveHeuristic
interface. The ValueBiasedStochasticSampling
class also provides a variety of constructors that enable defining the bias function in different
ways. The most basic follows the approach of the above example, allowing you to specify the exponent;
the default is simply an exponent of 1. The most general allows you to specify any arbitrary
bias function using the ValueBiasedStochasticSampling.BiasFunction
interface.
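As a rough usage sketch (not taken from the library's own examples), suppose you have already implemented a ConstructiveHeuristic for your problem, here assumed to be over the library's Permutation type (which is assumed to satisfy the Copyable bound); the package names in the imports are likewise assumptions based on the library's usual layout. You might then construct the search with the exponent-based constructor and take the best of a number of samples as follows.

import org.cicirello.permutations.Permutation;
import org.cicirello.search.SolutionCostPair;
import org.cicirello.search.ss.ConstructiveHeuristic;
import org.cicirello.search.ss.ValueBiasedStochasticSampling;

public class VbssUsageSketch {

  // Runs VBSS with the polynomial bias b(h) = h^2 (the exponent constructor) and
  // returns the best of numSamples heuristically constructed solutions.
  public static SolutionCostPair<Permutation> run(
      ConstructiveHeuristic<Permutation> heuristic, int numSamples) {
    ValueBiasedStochasticSampling<Permutation> vbss =
        new ValueBiasedStochasticSampling<>(heuristic, 2.0);
    return vbss.optimize(numSamples);
  }
}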
Assuming that the length of the permutation is L, and that the runtime of the constructive heuristic is O(f(L)), the runtime to construct one permutation using VBSS is O(L^2 f(L)). If the cost f(L) of heuristically evaluating one permutation element is simply O(1), constant time, then the cost to heuristically construct one permutation with VBSS is simply O(L^2).
See the following two publications for the original description of the VBSS algorithm:
- Vincent A. Cicirello. "Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance." Ph.D. thesis, The Robotics Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, July 2003.
- Vincent A. Cicirello and Stephen F. Smith. "Enhancing Stochastic Search Performance by Value-Biased Randomization of Heuristics." Journal of Heuristics, 11(1):5-34, January 2005. doi:10.1007/s10732-005-6997-8.
-
Nested Class Summary
- static interface ValueBiasedStochasticSampling.BiasFunction: Implement this interface to implement the bias function used by VBSS.
-
Constructor Summary
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic): Constructs a ValueBiasedStochasticSampling search object.
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, double exponent): Constructs a ValueBiasedStochasticSampling search object.
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, double exponent, ProgressTracker<T> tracker): Constructs a ValueBiasedStochasticSampling search object.
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ProgressTracker<T> tracker): Constructs a ValueBiasedStochasticSampling search object.
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ValueBiasedStochasticSampling.BiasFunction bias): Constructs a ValueBiasedStochasticSampling search object.
- ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ValueBiasedStochasticSampling.BiasFunction bias, ProgressTracker<T> tracker): Constructs a ValueBiasedStochasticSampling search object.
-
Method Summary
- ValueBiasedStochasticSampling.BiasFunction createExponentialBias(double scale): Creates an exponential bias function of the form: exp(scale * value).
- getProblem(): Gets a reference to the problem that this search is solving.
- final ProgressTracker<T> getProgressTracker(): Gets the ProgressTracker object that is in use for tracking search progress.
- final long getTotalRunLength(): Gets the total run length of the metaheuristic.
- final SolutionCostPair<T> optimize(): Executes a single run of a metaheuristic whose run length cannot be specified (e.g., a hill climber that terminates when it reaches a local optimum, or a stochastic sampler that terminates when it constructs one solution).
- final SolutionCostPair<T> optimize(int numSamples): Generates multiple stochastic heuristic samples.
- final void setProgressTracker(ProgressTracker<T> tracker): Sets the ProgressTracker object that is in use for tracking search progress.
- split(): Generates a functionally identical copy of this object, for use in multithreaded implementations of search algorithms.
-
Constructor Details
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic)
Constructs a ValueBiasedStochasticSampling search object. A ProgressTracker is created for you. The bias function simply returns the heuristic value (random decisions are simply proportional to the element's heuristic value).
- Parameters:
heuristic - The constructive heuristic.
- Throws:
NullPointerException - if heuristic is null
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ProgressTracker<T> tracker)
Constructs a ValueBiasedStochasticSampling search object. The bias function simply returns the heuristic value (random decisions are simply proportional to the element's heuristic value).
- Parameters:
heuristic - The constructive heuristic.
tracker - A ProgressTracker
- Throws:
NullPointerException - if heuristic or tracker is null
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, double exponent)
Constructs a ValueBiasedStochasticSampling search object. A ProgressTracker is created for you.
- Parameters:
heuristic - The constructive heuristic.
exponent - The bias function is defined as: bias(value) = pow(value, exponent).
- Throws:
NullPointerException - if heuristic is null
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, double exponent, ProgressTracker<T> tracker)
Constructs a ValueBiasedStochasticSampling search object.
- Parameters:
heuristic - The constructive heuristic.
exponent - The bias function is defined as: bias(value) = pow(value, exponent).
tracker - A ProgressTracker
- Throws:
NullPointerException - if heuristic or tracker is null
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ValueBiasedStochasticSampling.BiasFunction bias)
Constructs a ValueBiasedStochasticSampling search object. A ProgressTracker is created for you.
- Parameters:
heuristic - The constructive heuristic.
bias - The bias function. If null, then the default bias is used.
- Throws:
NullPointerException - if heuristic is null
-
ValueBiasedStochasticSampling
public ValueBiasedStochasticSampling(ConstructiveHeuristic<T> heuristic, ValueBiasedStochasticSampling.BiasFunction bias, ProgressTracker<T> tracker)
Constructs a ValueBiasedStochasticSampling search object.
- Parameters:
heuristic - The constructive heuristic.
bias - The bias function. If null, then the default bias is used.
tracker - A ProgressTracker
- Throws:
NullPointerException - if heuristic or tracker is null
-
-
Method Details
-
split
Description copied from interface: Splittable
Generates a functionally identical copy of this object, for use in multithreaded implementations of search algorithms. The state of the object that is returned may or may not be identical to that of the original. Thus, this is a distinct concept from the functionality of the Copyable interface. Classes that implement this interface must ensure that the object returned performs the same functionality, and that it does not share any state data that would be either unsafe or inefficient for concurrent access by multiple threads. The split method is allowed to simply return the this reference, provided that it is both safe and efficient for multiple threads to share a single copy of the Splittable object. The intention is to provide a multithreaded search with the capability to provide spawned threads with their own distinct search operators. Such a multithreaded algorithm can call the split method for each thread it spawns to generate a functionally identical copy of the operator, but with independent state.
- Specified by:
split in interface Metaheuristic<T extends Copyable<T>>
- Specified by:
split in interface SimpleMetaheuristic<T extends Copyable<T>>
- Specified by:
split in interface Splittable<T extends Copyable<T>>
- Returns:
- A functionally identical copy of the object, or a reference to this if it is both safe and efficient for multiple threads to share a single instance of this Splittable object.
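As a brief, hedged sketch of this intended usage (not from the library's documentation), a multithreaded caller might hand each worker thread its own split-off copy of an existing search instance. The Permutation solution type, the sample count, and the bare-bones thread management are assumptions for illustration; the fragment presumes the imports from the usage sketch earlier on this page.

  // Spawn worker threads, giving each its own functionally identical copy of the
  // search so that no mutable search state is shared across threads.
  static void runInParallel(ValueBiasedStochasticSampling<Permutation> vbss, int threads) {
    for (int t = 0; t < threads; t++) {
      var mine = vbss.split();                 // this thread's own copy, independent state
      new Thread(() -> mine.optimize(1000))    // e.g., 1000 samples per thread
          .start();
    }
  }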
-
createExponentialBias
Creates an exponential bias function of the form: exp(scale * value). If you want to use exponential bias with VBSS, carefully set the scale parameter based on the scale of the cost function you are optimizing. There is no good general purpose default that can be provided here since the scale of cost function values can vary greatly from one problem to another. As you consider how to set the scale parameter, consider that if not set well, the bias function can easily exceed the range of doubles for some cost functions.
The intended usage of this method is to provide a convenient way of constructing exponential bias functions that can be passed to one of the constructors of the class that take a BiasFunction as a parameter.
- Parameters:
scale - A parameter to scale the heuristic values.
- Returns:
- A BiasFunction object representing the function: exp(scale * value)
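For example (an assumption-laden fragment, not from the library's documentation: it presumes a heuristic variable like the one in the usage sketch earlier on this page, and that this factory method is invocable on the class itself), the returned bias function can be passed to the constructor that accepts a BiasFunction:

  // Build an exponential bias exp(0.5 * value); the scale 0.5 is arbitrary and should
  // be tuned to the scale of your cost function, per the caution above.
  ValueBiasedStochasticSampling.BiasFunction bias =
      ValueBiasedStochasticSampling.createExponentialBias(0.5);
  ValueBiasedStochasticSampling<Permutation> vbss =
      new ValueBiasedStochasticSampling<>(heuristic, bias);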
-
optimize
Description copied from interface: SimpleMetaheuristic
Executes a single run of a metaheuristic whose run length cannot be specified (e.g., a hill climber that terminates when it reaches a local optimum, or a stochastic sampler that terminates when it constructs one solution, etc.). If this method is called multiple times, each call is randomized in some algorithm-dependent way (e.g., a hill climber begins at a new randomly generated starting solution), and reinitializes any control parameters that may have changed during the previous call to optimize to the start-of-run state.
- Specified by:
optimize in interface SimpleMetaheuristic<T extends Copyable<T>>
- Returns:
- The current solution at the end of this run and its cost, which may or may not be the same as the solution contained in this metaheuristic's ProgressTracker, which contains the best of all runs. Returns null if the run did not execute, such as if the ProgressTracker already contains the theoretical best solution.
-
optimize
Generates multiple stochastic heuristic samples. Returns the best solution of the set of samples.
- Specified by:
optimize in interface Metaheuristic<T extends Copyable<T>>
- Parameters:
numSamples - The number of samples to perform.
- Returns:
- The best solution of this set of samples, which may or may not be the same as the solution contained in this search's ProgressTracker, which contains the best of all runs across all calls to the various optimize methods. Returns null if no runs executed, such as if the ProgressTracker already contains the theoretical best solution.
-
getProgressTracker
Description copied from interface: TrackableSearch
Gets the ProgressTracker object that is in use for tracking search progress. The object returned by this method contains the best solution found during the search (including across multiple concurrent runs if the search is multithreaded, or across multiple restarts if the run methods were called multiple times), as well as the cost of that solution, among other information. See the ProgressTracker documentation for more information about the search data tracked by this object.
- Specified by:
getProgressTracker in interface TrackableSearch<T extends Copyable<T>>
- Returns:
- the ProgressTracker in use by this metaheuristic.
-
setProgressTracker
Description copied from interface: TrackableSearch
Sets the ProgressTracker object that is in use for tracking search progress. Any previously set ProgressTracker is replaced by this one.
- Specified by:
setProgressTracker in interface TrackableSearch<T extends Copyable<T>>
- Parameters:
tracker - The new ProgressTracker to set. The tracker must not be null. This method does nothing if tracker is null.
-
getTotalRunLength
public final long getTotalRunLength()
Description copied from interface: TrackableSearch
Gets the total run length of the metaheuristic. This is the total run length across all calls to the search. This may differ from what may be expected based on run lengths. For example, the search terminates if it finds the theoretical best solution, and also immediately returns if a prior call found the theoretical best. In such cases, the total run length may be less than the requested run length.
The meaning of run length may vary from one metaheuristic to another. Therefore, implementing classes should provide fresh documentation rather than relying entirely on the interface documentation.
- Specified by:
getTotalRunLength in interface TrackableSearch<T extends Copyable<T>>
- Returns:
- the total run length of the metaheuristic
-
getProblem
Description copied from interface: TrackableSearch
Gets a reference to the problem that this search is solving.
- Specified by:
getProblem in interface TrackableSearch<T extends Copyable<T>>
- Returns:
- a reference to the problem.
-