learners

Module of learners used to determine what parameters to try next given previous cost evaluations.

Each learner is created and controlled by a controller.

class mloop.learners.DifferentialEvolutionLearner(first_params=None, trust_region=None, evolution_strategy='best1', population_size=15, mutation_scale=(0.5, 1), cross_over_probability=0.7, restart_tolerance=0.01, **kwargs)

Bases: mloop.learners.Learner, threading.Thread

Adaptation of the differential evolution algorithm in SciPy.

Parameters:
  • params_out_queue (queue) – Queue for parameters sent to controller.
  • costs_in_queue (queue) – Queue for costs sent to the learner. Each entry must be a tuple.
  • end_event (event) – Event to trigger end of learner.
Keyword Arguments:
 
  • first_params (Optional [array]) – The first parameters to test. If None, the initial parameters are randomly sampled. Default None.
  • trust_region (Optional [float or array]) – The trust region defines the maximum distance the learner will travel from the current best set of parameters. If None, the learner will search everywhere. If a float, this number must be between 0 and 1 and defines maximum distance the learner will venture as a percentage of the boundaries. If it is an array, it must have the same size as the number of parameters and the numbers define the maximum absolute distance that can be moved along each direction.
  • evolution_strategy (Optional [string]) – The differential evolution strategy to use; options are ‘best1’, ‘best2’, ‘rand1’ and ‘rand2’. Default ‘best1’.
  • population_size (Optional [int]) – Multiplier that sets the number of members in a generation. The generation population is set to population_size * parameter_num. Default 15.
  • mutation_scale (Optional [tuple]) – The mutation scale when picking new points. Otherwise known as differential weight. When provided as a tuple (min,max) a mutation constant is picked randomly in the interval. Default (0.5,1.0).
  • cross_over_probability (Optional [float]) – The recombination constant, or crossover probability: the probability that a new point will be added to the population. Default 0.7.
  • restart_tolerance (Optional [float]) – When the current population has a spread smaller than restart_tolerance times the initial spread, namely stdev(curr_pop) < restart_tolerance * stdev(init_pop), the population has likely converged into a minimum and the search is restarted. Default 0.01.
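
In practice this learner is usually constructed by the controller rather than instantiated directly. Below is a minimal sketch of selecting it through M-LOOP's standard create_controller entry point; the toy interface class and the keyword values are illustrative assumptions, not requirements of this class.

    import mloop.interfaces as mli
    import mloop.controllers as mlc

    class QuadraticInterface(mli.Interface):
        # Toy cost function with its minimum at the origin.
        def get_next_cost_dict(self, params_dict):
            params = params_dict['params']
            return {'cost': float(sum(p**2 for p in params)), 'bad': False}

    controller = mlc.create_controller(
        QuadraticInterface(),
        controller_type='differential_evolution',
        num_params=2, min_boundary=[-2, -2], max_boundary=[2, 2],
        evolution_strategy='best1', population_size=15,
        mutation_scale=(0.5, 1.0), cross_over_probability=0.7,
        max_num_runs=300)
    controller.optimize()
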
has_trust_region

Whether the learner has a trust region.

Type:bool
num_population_members

The number of parameters in a generation.

Type:int
params_generations

History of the parameter generations. A list of all the parameters in the population, for each generation created.

Type:list
costs_generations

History of the cost generations. A list of all the costs in the population, for each generation created.

Type:list
init_std

The initial standard deviation in costs of the population. Calculated after sampling (or resampling) the initial population.

Type:float
curr_std

The current standard deviation in costs of the population. Calculated after sampling each generation.

Type:float
_best1(index)

Use best parameters and two others to generate mutation.

Parameters:index (int) – Index of member to mutate.
_best2(index)

Use best parameters and four others to generate mutation.

Parameters:index (int) – Index of member to mutate.
_rand1(index)

Use three random parameters to generate mutation.

Parameters:index (int) – Index of member to mutate.
_rand2(index)

Use five random parameters to generate mutation.

Parameters:index (int) – Index of member to mutate.
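
The four strategies differ only in how the mutant vector is built from existing population members. The NumPy sketch below illustrates the ‘best1’ and ‘rand2’ rules; it mirrors the usual differential evolution formulas and is not a copy of the library internals.

    import numpy as np

    # population: (num_members, num_params) array of current parameters.
    # F: mutation scale drawn from the mutation_scale interval.
    # The member being mutated ('index') is excluded from the random picks,
    # as in random_index_sample().
    def best1_mutant(population, best_params, index, F, rng):
        others = [i for i in range(len(population)) if i != index]
        r1, r2 = rng.choice(others, size=2, replace=False)
        return best_params + F * (population[r1] - population[r2])

    def rand2_mutant(population, index, F, rng):
        others = [i for i in range(len(population)) if i != index]
        r1, r2, r3, r4, r5 = rng.choice(others, size=5, replace=False)
        return (population[r1]
                + F * (population[r2] - population[r3]
                       + population[r4] - population[r5]))
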
generate_population()

Sample a new random set of parameters.

mutate(index)

Mutate the parameters at index.

Parameters:index (int) – Index of the point to be mutated.
next_generation()

Evolve the population by a single generation.

random_index_sample(index, num_picks)

Randomly select num_picks indexes, excluding index.

Parameters:
  • index (int) – The index that is excluded from the sample.
  • num_picks (int) – The number of picks.
run()

Runs the Differential Evolution Learner.

save_generation()

Save history of generations.

update_archive()

Update the archive.

class mloop.learners.GaussianProcessLearner(length_scale=None, update_hyperparameters=True, cost_has_noise=True, noise_level=1.0, trust_region=None, default_bad_cost=None, default_bad_uncertainty=None, minimum_uncertainty=1e-08, gp_training_filename=None, gp_training_file_type='txt', predict_global_minima_at_end=True, **kwargs)

Bases: mloop.learners.Learner, multiprocessing.context.Process

Gaussian process learner. Generates new parameters based on a Gaussian process fitted to all previous data.

Parameters:
  • params_out_queue (queue) – Queue for parameters sent to controller.
  • costs_in_queue (queue) – Queue for costs for the Gaussian process. Each entry must be a tuple.
  • end_event (event) – Event to trigger end of learner.
Keyword Arguments:
 
  • length_scale (Optional [array]) – The initial guess for the length scale(s) of the Gaussian process. The array can be of size one, of size equal to the number of parameters, or None. If it is of size one, all the correlation lengths are assumed to be the same. If it is the size of the number of parameters, each parameter has its own independent length scale. If it is None, all the length scales are assumed to be independent and are each given an initial value of 1. Default None.
  • cost_has_noise (Optional [bool]) – If true the learner assumes there is common additive white noise that corrupts the costs provided. This noise is assumed to be on top of the uncertainty in the costs (if it is provided). If false, it is assumed that there is no noise in the cost (or if uncertainties are provided no extra noise beyond the uncertainty). Default True.
  • noise_level (Optional [float]) – The initial guess for the noise level in the costs. Only used if cost_has_noise is True. Default 1.0.
  • update_hyperparameters (Optional [bool]) – Whether the length scales and noise estimate should be updated when new data is provided. Default True.
  • trust_region (Optional [float or array]) – The trust region defines the maximum distance the learner will travel from the current best set of parameters. If None, the learner will search everywhere. If a float, this number must be between 0 and 1 and defines maximum distance the learner will venture as a percentage of the boundaries. If it is an array, it must have the same size as the number of parameters and the numbers define the maximum absolute distance that can be moved along each direction.
  • default_bad_cost (Optional [float]) – If a run is reported as bad and default_bad_cost is provided, the cost for the bad run is set to this default value. If default_bad_cost is None, then the worst cost received is set to all the bad runs. Default None.
  • default_bad_uncertainty (Optional [float]) – If a run is reported as bad and default_bad_uncertainty is provided, the uncertainty for the bad run is set to this default value. If default_bad_uncertainty is None, then the uncertainty is set to a tenth of the best to worst cost range. Default None.
  • minimum_uncertainty (Optional [float]) – The minimum uncertainty associated with provided costs. Must be above zero to avoid fitting errors. Default 1e-8.
  • predict_global_minima_at_end (Optional [bool]) – If True, find the predicted global minimum when the learner is ended; if False, do not. Default True.
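
A minimal sketch of configuring this learner through create_controller follows; here ‘interface’ stands for an mloop.interfaces.Interface instance and the keyword values are illustrative assumptions.

    import mloop.controllers as mlc

    controller = mlc.create_controller(
        interface,
        controller_type='gaussian_process',
        num_params=3, min_boundary=[0, 0, 0], max_boundary=[1, 1, 1],
        length_scale=[0.2, 0.2, 0.5],   # one independent length scale per parameter
        cost_has_noise=True, noise_level=0.1,
        update_hyperparameters=True,
        trust_region=0.3)               # stay within 30% of the boundary range
    controller.optimize()
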
all_params

Array containing all parameters sent to learner.

Type:array
all_costs

Array containing all costs sent to learner.

Type:array
all_uncers

Array containing all uncertainties sent to learner.

Type:array
scaled_costs

Array containing all the costs scaled to have zero mean and a standard deviation of 1. Needed for training the Gaussian process.

Type:array
bad_run_indexs

List of indexes of all runs that were marked as bad.

Type:list
best_cost

Minimum received cost, updated during execution.

Type:float
best_params

Parameters of the best run (a reference to an element of the params array).

Type:array
best_index

Index of the best cost and parameters.

Type:int
worst_cost

Maximum received cost, updated during execution.

Type:float
worst_index

Index of the run with the worst cost.

Type:int
cost_range

Difference between worst_cost and best_cost.

Type:float
generation_num

Number of sets of parameters to generate each generation. Set to 5.

Type:int
length_scale_history

List of length scales found after each fit.

Type:list
noise_level_history

List of noise levels found after each fit.

Type:list
fit_count

Counter for the number of times the gaussian process has been fit.

Type:int
cost_count

Counter for the number of costs, parameters and uncertainties added to learner.

Type:int
params_count

Counter for the number of parameters asked to be evaluated by the learner.

Type:int
gaussian_process

Gaussian process that is fitted to data and used to make predictions.

Type:GaussianProcessRegressor
cost_scaler

Scaler used to normalize the provided costs.

Type:StandardScaler
has_trust_region

Whether the learner has a trust region.

Type:bool
create_gaussian_process()

Create the initial Gaussian process.

find_global_minima()

Performs a quick search for the predicted global minimum from the learner. Does not return any values, but creates the following attributes.

predicted_best_parameters

The parameters of the predicted global minimum.

Type:array
predicted_best_cost

The cost at the predicted global minimum.

Type:float
predicted_best_uncertainty

The uncertainty of the predicted global minimum.

Type:float
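
A search of the same flavour can be sketched with SciPy by minimizing predict_cost() starting from the best observed parameters. The helper below is an illustration only; the boundary attribute names are assumptions, not part of the documented API.

    from scipy.optimize import minimize

    def quick_minimum_search(learner):
        # Start from the best parameters seen so far and respect the boundaries.
        result = minimize(lambda x: learner.predict_cost(x),
                          x0=learner.best_params,
                          bounds=list(zip(learner.min_boundary, learner.max_boundary)))
        return result.x, result.fun
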
find_next_parameters()

Returns the next parameters to test. Increments the counters and updates the bias function appropriately.

Returns:Returns next parameters from biased cost search.
Return type:next_params (array)
fit_gaussian_process()

Fit the Gaussian process to the current data.

get_params_and_costs()

Get the parameters and costs from the queue and place in their appropriate all_[type] arrays. Also updates bad costs, best parameters, and search boundaries given trust region.

predict_biased_cost(params)

Predicts the biased cost at the given parameters. The bias function is:

biased_cost = cost_bias * pred_cost - uncer_bias * pred_uncer

Returns:Biased cost predicted at the given parameters.
Return type:pred_bias_cost (float)
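
The rule above can be written directly as a small helper; cost_bias and uncer_bias are the constants set by update_bias_function(), and their exact values are internal to the learner.

    def biased_cost(pred_cost, pred_uncer, cost_bias, uncer_bias):
        # A lower biased cost favours either a low predicted cost (exploitation)
        # or a high predicted uncertainty (exploration).
        return cost_bias * pred_cost - uncer_bias * pred_uncer
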
predict_cost(params)

Produces a prediction of cost from the gaussian process at params.

Returns:Predicted cost at parameters
Return type:float
run()

Starts running the Gaussian process learner. When the new parameters event is triggered, reads the cost information provided and updates the Gaussian process with the information. Then searches the Gaussian process for new optimal parameters to test based on the biased cost. Parameters to test next are put on the output parameters queue.

update_archive()

Update the archive.

update_bads()

Best and/or worst costs have changed, update the values associated with bad runs accordingly.

update_bias_function()

Set the constants for the cost bias function.

update_search_params()

Update the list of parameters to use for the next search.

update_search_region()

If the trust region is not None, updates the search boundaries based on the defined trust region.

wait_for_new_params_event()

Waits for a new parameters event and starts a new parameter generation cycle. Also checks end event and will break if it is triggered.

class mloop.learners.Learner(num_params=None, min_boundary=None, max_boundary=None, learner_archive_filename='learner_archive', learner_archive_file_type='txt', start_datetime=None, **kwargs)

Bases: object

Base class for all learners. Contains default boundaries and some useful functions that all learners use.

The class that inherits from this class should also inherit from threading.Thread or multiprocessing.Process, depending on whether you need the learner to be a genuine parallel process or not.
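
A hypothetical subclass illustrating this pattern is sketched below. The boundary attribute names and the random-walk logic are assumptions for illustration only; the documented helpers put_params_and_get_cost(), check_in_boundary() and _shut_down() do the queue handling.

    import threading
    import numpy as np
    import mloop.learners as mll

    class RandomWalkLearner(mll.Learner, threading.Thread):
        # Hypothetical learner: a clipped random walk over the parameters.
        def __init__(self, step_size=0.1, **kwargs):
            super(RandomWalkLearner, self).__init__(**kwargs)
            self.step_size = float(step_size)

        def run(self):
            # min_boundary/max_boundary/num_params attribute names are assumed here.
            params = 0.5 * (np.asarray(self.min_boundary) + np.asarray(self.max_boundary))
            best_cost = float('inf')
            try:
                while not self.end_event.is_set():
                    cost = self.put_params_and_get_cost(params)
                    if cost < best_cost:
                        best_cost, best_params = cost, params
                    # Propose the next point as a small step from the best seen so far.
                    candidate = best_params + self.step_size * np.random.uniform(-1, 1, self.num_params)
                    if self.check_in_boundary(candidate):
                        params = candidate
            except mll.LearnerInterrupt:
                pass
            self._shut_down()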

Keyword Arguments:
 
  • num_params (Optional [int]) – The number of parameters to be optimized. If None defaults to 1. Default None.
  • min_boundary (Optional [array]) – Array with minimum values allowed for each parameter. Note that if certain parameters have no minimum you can set them to -inf; for example, [-1, 2, float('-inf')] is a valid min_boundary. If None, sets all the boundaries to -1. Default None.
  • max_boundary (Optional [array]) – Array with maximum values allowed for each parameter. Note that if certain parameters have no maximum you can set them to +inf; for example, [0, float('inf'), 3, -12] is a valid max_boundary. If None, sets all the boundaries to 1. Default None.
  • learner_archive_filename (Optional [string]) – Name for the Python archive of the learner's current state. If None, no archive is saved. Default 'learner_archive', though this is typically overridden by the child class.
  • learner_archive_file_type (Optional [string]) – File type for the archive. Can be 'txt' (a human-readable text file), 'pkl' (a Python dill file), 'mat' (a MATLAB file), or None if there is no archive. Default 'txt'.
  • log_level (Optional [int]) – Level for the learners logger. If None, set to warning. Default None.
  • start_datetime (Optional [datetime]) – Start date time, if None, is automatically generated.
params_out_queue

Queue for parameters created by learner.

Type:queue
costs_in_queue

Queue for costs to be used by learner.

Type:queue
end_event

Event to trigger end of learner.

Type:event
_set_trust_region(trust_region)

Sets trust region properties for learners that have one. Common function for learners with trust regions.

Parameters:trust_region (float or array) – Property that defines the trust region.
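
A sketch of how the two accepted forms can be reduced to per-parameter distances is given below; the helper name and the exact normalisation are illustrative assumptions, not the library internals.

    import numpy as np

    def trust_region_to_distances(trust_region, min_boundary, max_boundary):
        diff = np.asarray(max_boundary, dtype=float) - np.asarray(min_boundary, dtype=float)
        if np.isscalar(trust_region):
            # A float between 0 and 1: a fraction of each boundary range.
            return float(trust_region) * diff
        # An array: absolute maximum distances, one per parameter.
        return np.asarray(trust_region, dtype=float)
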
_shut_down()

Shut down and perform one final save of learner.

check_in_boundary(param)

Check that the given parameters are within the stored boundaries.

Parameters:param (array) – array of parameters
Returns:True if the parameters are within boundaries, False otherwise.
Return type:bool
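
An equivalent vectorised test, shown only for illustration:

    import numpy as np

    def in_boundary(param, min_boundary, max_boundary):
        param = np.asarray(param)
        return bool(np.all((param >= min_boundary) & (param <= max_boundary)))
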
check_in_diff_boundary(param)

Check that the given distances are less than the boundaries.

Parameters:param (array) – array of distances
Returns:True if the distances are smaller or equal to boundaries, False otherwise.
Return type:bool
check_num_params(param)

Check that the number of parameters is correct.

put_params_and_get_cost(params, **kwargs)

Send parameters to the queue along with any additional keywords. Saves the sent variables in the appropriate storage arrays.

Parameters:params (array) – array of parameter values to be sent to the controller
Returns:cost from the cost queue
save_archive()

Save the archive associated with the learner class. Only occurs if the filename for the archive is not None. Saves with the format previously set.

update_archive()

Abstract method for update to the archive. To be implemented by child class.

exception mloop.learners.LearnerInterrupt

Bases: Exception

Exception that is raised when the learner is ended with the end flag or event.

class mloop.learners.NelderMeadLearner(initial_simplex_corner=None, initial_simplex_displacements=None, initial_simplex_scale=None, **kwargs)

Bases: mloop.learners.Learner, threading.Thread

Nelder-Mead learner. Executes the Nelder-Mead learner algorithm and stores the needed simplex to estimate the next points.

Parameters:
  • params_out_queue (queue) – Queue for parameters sent to the controller.
  • costs_in_queue (queue) – Queue for costs for the Nelder-Mead learner. The queue should be populated with the cost (float) corresponding to the last parameters sent from the Nelder-Mead learner. Can be float('inf') if it was a bad run.
  • end_event (event) – Event to trigger end of learner.
Keyword Arguments:
 
  • initial_simplex_corner (Optional [array]) – Array for the initial set of parameters, which is the lowest corner of the initial simplex. If None the initial parameters are randomly sampled if the boundary conditions are provided, or all are set to 0 if boundary conditions are not provided.
  • initial_simplex_displacements (Optional [array]) – Array used to construct the initial simplex. Each element is the positive displacement of the corresponding parameter above init_params. If None and there are no boundary conditions, all are set to 1. If None and there are boundary conditions, the initial displacements are assumed to be scaled to the boundaries. Default None.
  • initial_simplex_scale (Optional [float]) – Creates a simplex using the boundary conditions and the scaling factor provided. If None, uses init_simplex if provided. If None and init_simplex is not provided but boundary conditions are, it is set to 0.5. Default None.
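
A minimal sketch of configuring the initial simplex through create_controller; ‘interface’ stands for an mloop.interfaces.Interface instance and the values are illustrative assumptions.

    import mloop.controllers as mlc

    controller = mlc.create_controller(
        interface,
        controller_type='nelder_mead',
        num_params=2, min_boundary=[-1, -1], max_boundary=[1, 1],
        initial_simplex_corner=[-0.5, -0.5],
        initial_simplex_displacements=[0.2, 0.2])
    controller.optimize()
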
init_simplex_corner

Parameters for the corner of the initial simplex used.

Type:array
init_simplex_disp

Parameters for the displacements about the simplex corner used to create the initial simplex.

Type:array
simplex_params

Parameters of the current simplex.

Type:array
simplex_costs

Costs associated with the parameters of the current simplex.

Type:array
run()

Runs Nelder-Mead algorithm to produce new parameters given costs, until end signal is given.

update_archive()

Update the archive.

class mloop.learners.NeuralNetLearner(trust_region=None, default_bad_cost=None, default_bad_uncertainty=None, nn_training_filename=None, nn_training_file_type='txt', minimum_uncertainty=1e-08, predict_global_minima_at_end=True, **kwargs)

Bases: mloop.learners.Learner, multiprocessing.context.Process

Learner that uses a neural network for function approximation.

Parameters:
  • params_out_queue (queue) – Queue for parameters sent to controller.
  • costs_in_queue (queue) – Queue for costs.
  • end_event (event) – Event to trigger end of learner.
Keyword Arguments:
 
  • trust_region (Optional [float or array]) – The trust region defines the maximum distance the learner will travel from the current best set of parameters. If None, the learner will search everywhere. If a float, this number must be between 0 and 1 and defines maximum distance the learner will venture as a percentage of the boundaries. If it is an array, it must have the same size as the number of parameters and the numbers define the maximum absolute distance that can be moved along each direction.
  • default_bad_cost (Optional [float]) – If a run is reported as bad and default_bad_cost is provided, the cost for the bad run is set to this default value. If default_bad_cost is None, then the worst cost received is set to all the bad runs. Default None.
  • default_bad_uncertainty (Optional [float]) – If a run is reported as bad and default_bad_uncertainty is provided, the uncertainty for the bad run is set to this default value. If default_bad_uncertainty is None, then the uncertainty is set to a tenth of the best to worst cost range. Default None.
  • minimum_uncertainty (Optional [float]) – The minimum uncertainty associated with provided costs. Must be above zero to avoid fitting errors. Default 1e-8.
  • predict_global_minima_at_end (Optional [bool]) – If True, find the predicted global minimum when the learner is ended; if False, do not. Default True.
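
A minimal sketch of selecting this learner through create_controller; ‘interface’ stands for an mloop.interfaces.Interface instance and the keyword values are illustrative assumptions.

    import mloop.controllers as mlc

    controller = mlc.create_controller(
        interface,
        controller_type='neural_net',
        num_params=4, min_boundary=[0, 0, 0, 0], max_boundary=[1, 1, 1, 1],
        trust_region=0.4,
        default_bad_cost=10.0, default_bad_uncertainty=1.0)
    controller.optimize()
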
all_params

Array containing all parameters sent to learner.

Type:array
all_costs

Array containing all costs sent to learner.

Type:array
all_uncers

Array containing all uncertainties sent to learner.

Type:array
scaled_costs

Array containing all the costs scaled to have zero mean and a standard deviation of 1.

Type:array
bad_run_indexs

List of indexes of all runs that were marked as bad.

Type:list
best_cost

Minimum received cost, updated during execution.

Type:float
best_params

Parameters of the best run (a reference to an element of the params array).

Type:array
best_index

Index of the best cost and parameters.

Type:int
worst_cost

Maximum received cost, updated during execution.

Type:float
worst_index

Index of the run with the worst cost.

Type:int
cost_range

Difference between worst_cost and best_cost.

Type:float
generation_num

Number of sets of parameters to generate each generation. Set to 5.

Type:int
noise_level_history

List of noise levels found after each fit.

Type:list
cost_count

Counter for the number of costs, parameters and uncertainties added to learner.

Type:int
params_count

Counter for the number of parameters asked to be evaluated by the learner.

Type:int
neural_net

Neural net that is fitted to data and used to make predictions.

Type:NeuralNet
cost_scaler

Scaler used to normalize the provided costs.

Type:StandardScaler
cost_scaler_init_index

The number of params to use to initialise cost_scaler.

Type:int
has_trust_region

Whether the learner has a trust region.

Type:bool
_fit_neural_net(index)

Fits a neural net to the data.

cost_scaler must have been fitted before calling this method.

_init_cost_scaler()

Initialises the cost scaler. cost_scaler_init_index must be set.

create_neural_net()

Creates the neural net. Must be called from the same process as fit_neural_net, predict_cost and predict_costs_from_param_array.

find_global_minima()

Performs a quick search for the predicted global minimum from the learner. Does not return any values, but creates the following attributes.

predicted_best_parameters

The parameters of the predicted global minimum.

Type:array
predicted_best_cost

The cost at the predicted global minimum.

Type:float
find_next_parameters(net_index=None)

Returns the next parameters to test. Increments the counters appropriately.

Returns:Returns next parameters from cost search.
Return type:next_params (array)
get_losses()
get_params_and_costs()

Get the parameters and costs from the queue and place in their appropriate all_[type] arrays. Also updates bad costs, best parameters, and search boundaries given trust region.

import_neural_net()

Imports neural net parameters from the training dictionary provided at construction. Must be called from the same process as fit_neural_net, predict_cost and predict_costs_from_param_array. You must call exactly one of this and create_neural_net before calling other methods.

predict_cost(params, net_index=None)

Produces a prediction of cost from the neural net at params.

Returns:Predicted cost at parameters
Return type:float
predict_cost_gradient(params, net_index=None)

Produces a prediction of the gradient of the cost function at params.

Returns:Predicted gradient at parameters
Return type:float
predict_costs_from_param_array(params, net_index=None)

Produces a prediction of costs from an array of params.

Returns:Predicted costs at the parameters
Return type:float
run()

Starts running the neural network learner. When the new parameters event is triggered, reads the cost information provided and updates the neural network with the information. Then searches the neural network for new optimal parameters to test based on the biased cost. Parameters to test next are put on the output parameters queue.

update_archive()

Update the archive.

update_bads()

Best and/or worst costs have changed, update the values associated with bad runs accordingly.

update_search_params()

Update the list of parameters to use for the next search.

update_search_region()

If the trust region is not None, updates the search boundaries based on the defined trust region.

wait_for_new_params_event()

Waits for a new parameters event and starts a new parameter generation cycle. Also checks end event and will break if it is triggered.

class mloop.learners.RandomLearner(trust_region=None, first_params=None, **kwargs)

Bases: mloop.learners.Learner, threading.Thread

Random learner. Simply generates new parameters randomly with a uniform distribution over the boundaries. Learner is perhaps a misnomer for this class.

Parameters:

**kwargs (Optional dict) – Other values to be passed to Learner.

Keyword Arguments:
 
  • min_boundary (Optional [array]) – If set to None, overrides the default learner values and sets all minimum boundaries to 0. Default None.
  • max_boundary (Optional [array]) – If set to None, overrides the default learner values and sets all maximum boundaries to 1. Default None.
  • first_params (Optional [array]) – The first parameters to test. If None, the initial parameters are randomly sampled.
  • trust_region (Optional [float or array]) – The trust region defines the maximum distance the learner will travel from the current best set of parameters. If None, the learner will search everywhere. If a float, this number must be between 0 and 1 and defines maximum distance the learner will venture as a percentage of the boundaries. If it is an array, it must have the same size as the number of parameters and the numbers define the maximum absolute distance that can be moved along each direction.
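
The sampling rule can be sketched with NumPy as uniform draws over the boundaries, optionally clipped to a trust region around the best parameters; the helper below is illustrative only, not the library internals.

    import numpy as np

    def next_random_params(rng, min_boundary, max_boundary, best_params=None, trust_dist=None):
        lo = np.asarray(min_boundary, dtype=float)
        hi = np.asarray(max_boundary, dtype=float)
        if best_params is not None and trust_dist is not None:
            # Restrict the search to the trust region around the current best point.
            lo = np.maximum(lo, np.asarray(best_params) - trust_dist)
            hi = np.minimum(hi, np.asarray(best_params) + trust_dist)
        return rng.uniform(lo, hi)
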
run()

Puts the next parameters on the queue, picked randomly from a uniform distribution between the minimum and maximum boundaries, whenever a cost is added to the cost queue.