Monte Carlo
Sequential importance sampling is an instantiation of importance sampling, which approximates integrals by sampling.
The instantiation merely specifies that sampling takes place one dimension at a time.
The technique becomes useful with resampling, which, between dimensions, redistributes the probability mass to the most likely regions of the space.
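To make the idea concrete, here is a minimal self-contained sketch of the per-dimension sample/weight/resample loop. It is an illustration, not the library's implementation: the function name `toy_sir`, the standard-normal target factors, and the wider N(0, 2) proposal are all assumptions chosen so the true normalisation constant is 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_pdf(x, mean=0.0, std=1.0):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

def toy_sir(n_dims, number):
    """Toy sequential importance resampling: the target is a product of
    standard normals, the per-dimension proposal is N(0, 2)."""
    samples = np.empty((number, 0))  # empty partial samples, one row each
    normalisation = 1.0
    for _ in range(n_dims):
        # propose the next dimension for every partial sample
        new = rng.normal(0.0, 2.0, size=number)
        # importance weight: ratio of target and proposal densities
        weights = gaussian_pdf(new) / gaussian_pdf(new, std=2.0)
        # the mean weight estimates this factor's normalisation constant
        normalisation *= weights.mean()
        samples = np.column_stack([samples, new])
        # resample: redistribute mass towards high-weight partial samples
        idx = rng.choice(number, size=number, p=weights / weights.sum())
        samples = samples[idx]
    return samples, normalisation

samples, z = toy_sir(n_dims=2, number=5000)
```

Since the target factors are normalised densities, the estimated normalisation constant should come out close to 1 here.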

probability.sequential_importance_resample.sequentialImportanceResample(targetFactors, proposers, number, weighters=…, combiners=…, emptySample=…)
Apply sequential importance resampling.
Parameters: 
 targetFactors – an iterable of
conditional distributions that
represent the per-dimension factors of the target distribution.
A factor’s given
method should take a partial sample (initially, emptySample) and
return a distribution over the next dimension.
 proposers – an iterable of functions that take the distribution
that the corresponding factor returns and return a proposal
distribution.
The proposal distribution should approximate the target distribution.
It must be possible to sample from this proposal distribution.
 number – the number of samples that is drawn at each dimension.
 weighters – an iterable of functions
(sample, targetFactor, proposal) that return the importance weight
(the ratio between target and proposal densities) at the sample.
By default, this is an iterable of
densityWeighter functions.
 combiners – an iterable of functions (partialSample, newDimension)
that take a current partial sample and return it with the new dimension
appended.
By default, this is an iterable of
appendToVector functions.
 emptySample – the empty partial sample.
By default, a 0×1-dimensional vector, numpy.empty((0, 1)).

Returns:  (samples, normalisation) where samples is a
Categorical distribution of the full
samples, and normalisation is the estimated normalisation constant.

The following are small helper functions.

probability.sequential_importance_resample.densityWeighter(sample, localTarget, localProposal)
Weighter for continuous distributions.
Returns:  localTarget.density(sample) / localProposal.density(sample)

probability.sequential_importance_resample.unnormalisedDensityWeighter(sample, localTarget, localProposal)
Weighter for continuous distributions with no known normalisation.
Returns:  localTarget.unnormalisedDensity(sample) / localProposal.density(sample)

probability.sequential_importance_resample.massWeighter(sample, localTarget, localProposal)
Weighter for discrete distributions.
Returns:  localTarget.mass(sample) / localProposal.mass(sample)

probability.sequential_importance_resample.appendToVector(partialSample, newDimension)
Returns:  the numpy vector partialSample with newDimension 
appended.

probability.sequential_importance_resample.appendToTuple(partialSample, newDimension)
Returns:  the tuple partialSample with newDimension appended. 
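As hedged sketches of what these helpers might look like (the bodies below are assumptions reconstructed from the documented return values, not the library's source; the tiny `_Dist` class in the usage lines is purely illustrative):

```python
import numpy as np

def densityWeighter(sample, localTarget, localProposal):
    # importance weight: ratio of target and proposal densities at the sample
    return localTarget.density(sample) / localProposal.density(sample)

def appendToVector(partialSample, newDimension):
    # stack the new dimension below the existing column vector
    return np.vstack([partialSample, [[newDimension]]])

def appendToTuple(partialSample, newDimension):
    # tuples are immutable, so appending builds a new tuple
    return partialSample + (newDimension,)

class _Dist:
    """Illustrative stand-in for a distribution with a density method."""
    def __init__(self, density):
        self.density = density

v = appendToVector(np.empty((0, 1)), 3.0)       # the default emptySample
w = densityWeighter(0.0, _Dist(lambda x: 0.5), _Dist(lambda x: 0.25))
```

Starting from the default empty sample `numpy.empty((0, 1))`, the first append yields a 1×1 column vector, so partial samples stay column vectors throughout.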
Resampling
The element that makes sequential importance resampling useful is resampling.
It approximates a categorical distribution by a distribution with integer weights.
It can be seen as a way of refocusing the probability mass on high-probability areas of the space.

probability.categorical.resample(source, number, strategy=systematicResampling)
Resample a Categorical distribution.
Resampling returns an approximated categorical distribution that has only
integer counts.
The expected value of the new integer count for one value is equal to the
count in the source distribution.
Returns:  A Categorical distribution
representing the resampled distribution.

Parameters: 
 source – the distribution that is resampled.
 number – the total number of samples.
 strategy – a function (source, number) that returns an iterable
over (value, count) pairs where count is an int.
This defaults to categorical.systematicResampling(), which
should work well, even though it is slightly counterintuitive.
Other options provided are categorical.multinomialResampling()
and categorical.residualResampling().
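The unbiasedness property stated above (the expected integer count for each value equals its count in the source) can be checked empirically. The sketch below uses numpy's built-in multinomial sampler as a stand-in for a resampling strategy and averages the integer counts over many runs:

```python
import numpy as np

rng = np.random.default_rng(0)

probabilities = np.array([0.5, 0.3, 0.2])
number = 1000
runs = 2000

# average the integer counts produced by repeated multinomial resampling
mean_counts = np.zeros(3)
for _ in range(runs):
    counts = rng.multinomial(number, probabilities)
    mean_counts += counts / runs

# mean_counts should approach number * probabilities = [500, 300, 200]
```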


probability.categorical.multinomialResampling(source, number, uniformUnitSampler)
Apply multinomial resampling.
Returns:  an iterable over independently and identically distributed
samples from the source.

Parameters: 
 source – the distribution that samples are drawn from.
 number – the total number of samples.
 uniformUnitSampler – the sampler used to draw the underlying samples
Unif[0, 1].
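A minimal sketch of multinomial resampling, assuming a `(values, counts)` representation of the categorical distribution (the library's Categorical type is not used here): each sample independently inverts the cumulative distribution at a fresh uniform draw.

```python
import numpy as np

def multinomial_resampling(values, counts, number, uniform_unit_sampler):
    """Draw `number` i.i.d. samples from the categorical given by
    (values, counts); uniform_unit_sampler() must return u ~ Unif[0, 1]."""
    cumulative = np.cumsum(counts) / np.sum(counts)
    for _ in range(number):
        u = uniform_unit_sampler()
        index = np.searchsorted(cumulative, u)  # invert the CDF
        yield values[index]

rng = np.random.default_rng(0)
samples = list(multinomial_resampling(["a", "b"], [9, 1], 100, rng.random))
```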


probability.categorical.residualResampling(source, number, uniformUnitSampler)
Apply residual resampling.
For each value, the expected number of samples is rounded down to an
integer value, and this many copies of this value are returned.
The remaining samples are then drawn as if with multinomial resampling
from a categorical distribution with the probabilities set to the
remainders.
Returns:  an iterable over samples from the source.

Parameters: 
 source – the distribution that samples are drawn from.
 number – the total number of samples.
 uniformUnitSampler – the sampler used to draw the underlying samples
Unif[0, 1].
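The two stages described above — deterministic floors, then multinomial draws on the remainders — can be sketched as follows. This assumes a `(values, probabilities)` representation rather than the library's Categorical type:

```python
import numpy as np

def residual_resampling(values, probabilities, number, uniform_unit_sampler):
    probabilities = np.asarray(probabilities, dtype=float)
    expected = number * probabilities
    floors = np.floor(expected).astype(int)
    # deterministic part: the floor of each expected count is returned as-is
    for value, count in zip(values, floors):
        for _ in range(count):
            yield value
    # stochastic part: multinomial resampling on the fractional remainders
    residual_total = number - floors.sum()
    if residual_total > 0:
        remainder = expected - floors
        cumulative = np.cumsum(remainder) / remainder.sum()
        for _ in range(residual_total):
            u = uniform_unit_sampler()
            yield values[np.searchsorted(cumulative, u)]

rng = np.random.default_rng(0)
samples = list(
    residual_resampling(["a", "b", "c"], [0.45, 0.35, 0.2], 10, rng.random)
)
```

With these probabilities and 10 samples, "a" and "b" get at least their floors (4 and 3), "c" gets exactly 2, and a single residual draw decides the last sample.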


probability.categorical.systematicResampling(source, number, uniformUnitSampler)
Apply systematic resampling.
Draw offset ~ Unif[0, 1].
Break a stick at 0 + offset, 1 + offset, …, (number − 1) + offset
and choose the value at each of the corresponding points on the
cumulative distribution function.
Returns:  an iterable over samples from the source.

Parameters: 
 source – the categorical distribution that samples are drawn from.
 number – the total number of samples.
 uniformUnitSampler – the sampler used to draw the underlying samples
Unif[0, 1].
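A sketch of the stick-breaking procedure, with the evenly spaced points rescaled to the unit interval so they can be compared against a normalised cumulative distribution (again assuming a `(values, probabilities)` representation, not the library's Categorical type):

```python
import numpy as np

def systematic_resampling(values, probabilities, number, uniform_unit_sampler):
    """One shared uniform offset generates all sample positions on the CDF."""
    offset = uniform_unit_sampler()                    # offset ~ Unif[0, 1]
    positions = (np.arange(number) + offset) / number  # evenly spaced points
    cumulative = np.cumsum(probabilities)
    for index in np.searchsorted(cumulative, positions):
        yield values[index]

rng = np.random.default_rng(0)
samples = list(systematic_resampling(["a", "b"], [0.7, 0.3], 10, rng.random))
```

Because the points are evenly spaced, the counts can deviate from the expected values by at most one, which is why this strategy tends to have lower variance than multinomial resampling.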
