controllers
Module of all the controllers used in M-LOOP. The controllers, as the name suggests, control the interface to the experiment and all the learners employed to find optimal parameters.
class mloop.controllers.Controller(interface, max_num_runs=inf, target_cost=-inf, max_num_runs_without_better_params=inf, max_duration=inf, controller_archive_filename='controller_archive', controller_archive_file_type='txt', archive_extra_dict=None, start_datetime=None, **kwargs)
Bases: object
Abstract class for controllers.
The controller controls the entire M-LOOP process. The controllers for each algorithm all inherit from this class. The class stores a variety of data which all algorithms use and also includes all of the archiving and saving features.
To implement your own controller class, the minimum requirement is to assign a learner to the learner attribute and implement the _next_params() method, which provides the appropriate information to the learner and gets the next parameters from it. See RandomController for a simple implementation of a controller. Note that the first three keyword arguments are all halting conditions for the controller; if any one of them is satisfied, the controller halts (an OR condition is used). This base class also creates an empty attribute self.learner. The simplest way to make a working controller is to assign a learner of some kind to this attribute and connect its queues and events appropriately.
Parameters: interface (interface) – The interface process. It is run by the controller.
Keyword Arguments: - max_num_runs (Optional [float]) – The maximum number of runs before the controller stops. Default float(‘+inf’), meaning the controller will run until another halting condition is met.
- target_cost (Optional [float]) – The target cost for the run. If a run achieves a cost lower than the target, the controller is stopped. Default float(‘-inf’), meaning the controller will run until another halting condition is met.
- max_num_runs_without_better_params (Optional [float]) – The optimization will halt if the number of consecutive runs without improving over the best measured value thus far exceeds this number. Default float(‘+inf’), meaning the controller will run until another halting condition is met.
- max_duration (Optional [float]) – The maximum duration of the optimization, in seconds of wall time. The actual duration may exceed this value slightly, but no new iterations will start after max_duration seconds have elapsed since start_datetime. Default is float(‘+inf’), meaning the controller will run until another halting condition is met.
- controller_archive_filename (Optional [string]) – Filename for archive. The archive contains costs, parameter history and other details depending on the controller type. Default ‘controller_archive’.
- controller_archive_file_type (Optional [string]) – File type for archive. Can be either ‘txt’ for a human readable text file, ‘pkl’ for a python pickle file, ‘mat’ for a matlab file, or None to forgo saving a controller archive. Default ‘txt’.
- archive_extra_dict (Optional [dict]) – A dictionary with any extra variables that are to be saved to the archive. If None, nothing is added. Default None.
- start_datetime (Optional [datetime]) – Datetime for when the controller was started.
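The subclassing recipe above can be sketched in plain Python. This is a hypothetical mini-implementation for illustration only, not M-LOOP's actual classes: real controllers inherit from mloop.controllers.Controller and exchange data with the learner and interface through queues and events, and all names below are assumptions.

```python
import random

class MiniController:
    """Plain-Python sketch of the Controller pattern (not M-LOOP itself)."""

    def __init__(self, max_num_runs=10):
        self.max_num_runs = max_num_runs
        self.num_in_costs = 0
        self.in_costs = []
        self.learner = None  # subclasses must assign a learner of some kind

    def _next_params(self):
        # Subclasses query their learner here for the next parameters.
        raise NotImplementedError

    def optimize(self, experiment):
        # Main loop: send parameters to the experiment, record the cost.
        while self.num_in_costs < self.max_num_runs:
            params = self._next_params()
            cost = experiment(params)
            self.in_costs.append(cost)
            self.num_in_costs += 1
        return min(self.in_costs)

class MiniRandomController(MiniController):
    """Analogue of RandomController: the 'learner' draws uniform samples."""

    def __init__(self, min_boundary, max_boundary, **kwargs):
        super().__init__(**kwargs)
        self.learner = random.Random(42)  # stands in for the learner object
        self.min_boundary = min_boundary
        self.max_boundary = max_boundary

    def _next_params(self):
        return [self.learner.uniform(lo, hi)
                for lo, hi in zip(self.min_boundary, self.max_boundary)]

controller = MiniRandomController([0.0, 0.0], [1.0, 1.0], max_num_runs=20)
best = controller.optimize(lambda p: (p[0] - 0.5) ** 2 + (p[1] - 0.5) ** 2)
```

The sketch shows why assigning a learner and implementing _next_params() suffices: the base-class loop handles everything else.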
Attributes:
- params_out_queue (queue) – Queue for parameters to next be run by the experiment.
- costs_in_queue (queue) – Queue for costs (and other details) that have been returned by the experiment.
- interface_error_queue (queue) – Queue for returning errors encountered by the interface.
- end_interface (event) – Event used to trigger the end of the interface.
- learner (None) – Placeholder for the learner. Assigning a learner to this attribute is the minimum requirement to make a working controller class.
- learner_params_queue (queue) – The parameters queue for the learner.
- learner_costs_queue (queue) – The costs queue for the learner.
- end_learner (event) – Event used to trigger the end of the learner.
- num_in_costs (int) – Counter for the number of costs received.
- num_out_params (int) – Counter for the number of parameters sent out.
- out_params (list) – List of all parameters sent out by the controller.
- out_extras (list) – Any extras associated with the output parameters.
- in_costs (list) – List of costs received by the controller.
- in_uncers (list) – List of uncertainties received by the controller.
- best_cost (float) – The lowest, and best, cost received by the learner.
- best_uncer (float) – The uncertainty associated with the best cost.
- best_params (array) – The best parameters received by the learner.
- best_index (int) – The run number that produced the best cost.
_enforce_boundaries(params)
Enforce the minimum and maximum parameter boundaries.
If the values in params extend outside of the allowed boundaries set by self.min_boundary and self.max_boundary by a nontrivial amount, then this method raises a RuntimeError.
To avoid numerical precision problems, this method actually gently clips values which barely exceed the boundaries. This is because variables are internally scaled and thus may very slightly violate the boundaries after being unscaled. If a parameter’s value only slightly exceeds the boundaries, then its value will be set equal to the boundary. If a value exceeds the boundary by a nontrivial amount, then a RuntimeError will be raised.
Note that although this method is forgiving about input parameter values very slightly exceeding the boundaries, it is completely strict about returning parameter values which obey the boundaries. Thus it is safe to assume that the returned values are within the range set by self.min_boundary and self.max_boundary (inclusively).
Parameters: params (array) – Array of values to be experimentally tested.
Raises: RuntimeError – Raised if any value in params exceeds the parameter value boundaries by a nontrivial amount.
Returns: The input params, except that any values which slightly exceed the boundaries will have been clipped to stay within the boundaries exactly.
Return type: array
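The clip-or-raise behaviour described above can be sketched as a stand-alone function. This is illustrative only: the tolerance name and value are assumptions, not M-LOOP's actual implementation.

```python
def enforce_boundaries(params, min_boundary, max_boundary, tol=1e-8):
    """Clip values that barely exceed the boundaries; raise on real violations.

    `tol` (an assumed name/value) absorbs the round-off introduced by
    internally scaling and unscaling parameters.
    """
    clipped = []
    for value, lo, hi in zip(params, min_boundary, max_boundary):
        if value < lo - tol or value > hi + tol:
            # Nontrivial violation: refuse rather than silently repair.
            raise RuntimeError(
                f'Parameter value {value} is outside [{lo}, {hi}].')
        # Gently clip values that only barely exceed the boundaries.
        clipped.append(min(max(value, lo), hi))
    return clipped
```

The returned list always obeys the boundaries exactly, matching the guarantee described above.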
_first_params()
Check the queue to get the first parameters.
Returns: Parameters for the first experiment.
_get_cost_and_in_dict()
Get cost, uncertainty, parameters, bad, and extra data from the experiment.
This method stores results in lists and also puts data in the appropriate ‘current’ variables. It does not return anything; all results are kept in the internal storage arrays and the ‘current’ variables.
If the interface encounters an error, it will pass the error to the controller here so that the error can be re-raised in the controller’s thread (note that the interface runs in a separate thread).
_next_params()
Send the latest cost information to the learner and get the next parameters from it.
_optimization_routine()
Run the controller main loop: give parameters to the experiment and save the costs returned.
_put_params_and_out_dict(params, param_type=None, **kwargs)
Send parameters to the queue with optional additional keyword arguments.
This method also saves sent variables in appropriate storage arrays.
Parameters: - params (array) – Array of values to be experimentally tested.
- param_type (Optional, str) – The learner type which generated the parameter values. Because some learners use other learners as trainers, the parameter type can be different for different iterations during a given optimization. This value will be stored in self.out_type and in the out_type list in the controller archive. If None, then it will be set to self.learner.OUT_TYPE. Default None.
Keyword Arguments: **kwargs – Any additional keyword arguments will be stored in self.out_extras and in the out_extras list in the controller archive.
_send_to_learner()
Send the latest cost information to the learner.
_shut_down()
Shut down and clean up the resources of the controller: end the learners and queue_listener, and make one last save of the archive.
_start_up()
Start the learner and interface threads/processes.
_update_controller_with_learner_attributes()
Update the controller with properties from the learner.
check_end_conditions()
Check whether any of the end conditions have been met.
In particular, this method checks for any of the following conditions:
- If the number of iterations has reached max_num_runs.
- If the target_cost has been reached.
- If max_num_runs_without_better_params iterations in a row have occurred without any improvement.
- If max_duration seconds or more has elapsed since start_datetime.
Returns: True if the controller should continue, or False if one or more halting conditions have been met and the controller should end.
Return type: bool
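The OR-combination of the four halting conditions listed above can be sketched as a stand-alone function. This is illustrative only; the real method reads these values from the controller's instance attributes, and the argument names here are assumptions.

```python
import datetime

def check_end_conditions(num_in_costs, best_cost, num_runs_without_better,
                         start_datetime, now,
                         max_num_runs=float('inf'), target_cost=float('-inf'),
                         max_num_runs_without_better_params=float('inf'),
                         max_duration=float('inf')):
    """Return True to continue, False if any halting condition is met."""
    if num_in_costs >= max_num_runs:
        return False  # run budget exhausted
    if best_cost <= target_cost:
        return False  # target cost reached
    if num_runs_without_better >= max_num_runs_without_better_params:
        return False  # too many consecutive runs without improvement
    if (now - start_datetime).total_seconds() >= max_duration:
        return False  # wall-time budget exhausted
    return True
```

Because the checks short-circuit independently, satisfying any single condition halts the controller, exactly the OR semantics described for the base class.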
optimize()
Optimize the experiment. The learner and interface processes/threads are launched and appropriately ended. Starts both threads, catches kill signals, and shuts down appropriately.
print_results()
Print the results from the optimization run to the logs.
save_archive()
Save the archive associated with the controller class. Only occurs if the filename for the archive is not None. Saves with the format previously set.
exception mloop.controllers.ControllerInterrupt
Bases: Exception
Exception that is raised when the controller is ended with the end flag or event.
class mloop.controllers.DifferentialEvolutionController(interface, **kwargs)
Bases: mloop.controllers.Controller
Controller for the differential evolution learner.
Parameters:
- params_out_queue (queue) – Queue for parameters to next be run by the experiment.
- costs_in_queue (queue) – Queue for costs (and other details) that have been returned by the experiment.
Keyword Arguments: **kwargs (Optional [dict]) – Dictionary of options to be passed to the Controller parent class and the differential evolution learner.
class mloop.controllers.GaussianProcessController(interface, num_params=None, min_boundary=None, max_boundary=None, trust_region=None, learner_archive_filename='learner_archive', learner_archive_file_type='txt', param_names=None, **kwargs)
Bases: mloop.controllers.MachineLearnerController
Controller for the Gaussian Process solver. Primarily suggests new points from the Gaussian Process learner. However, during the initial few runs it must rely on a different optimization algorithm to get some points to seed the learner.
Parameters: interface (Interface) – The interface to the experiment under optimization.
Keyword Arguments: **kwargs (Optional [dict]) – Dictionary of options to be passed to the MachineLearnerController parent class and the Gaussian Process learner.
class mloop.controllers.MachineLearnerController(interface, training_type='differential_evolution', num_training_runs=None, no_delay=True, num_params=None, min_boundary=None, max_boundary=None, trust_region=None, learner_archive_filename='learner_archive', learner_archive_file_type='txt', param_names=None, **kwargs)
Bases: mloop.controllers.Controller
Abstract Controller class for the machine learning based solvers.
Parameters: interface (Interface) – The interface to the experiment under optimization. Any additional **kwargs (Optional [dict]) are passed to the Controller parent class and the initial training learner.
Keyword Arguments: - training_type (Optional [string]) – The type for the initial training source can be ‘random’ for the random learner, ‘nelder_mead’ for the Nelder–Mead learner or ‘differential_evolution’ for the Differential Evolution learner. This learner is also called if the machine learning learner is too slow and a new point is needed. Default ‘differential_evolution’.
- num_training_runs (Optional [int]) – The number of training runs to perform before starting the machine learning learner. If None, it will be ten or double the number of parameters, whichever is larger.
- no_delay (Optional [bool]) – If True, there is never any delay between a returned cost and the next parameters to run for the experiment. In practice, this means that if the machine learning learner has not prepared the next parameters in time, the learner defined by the initial training source is used instead. If False, the controller will wait for the machine learning learner to predict the next parameters, and there may be a delay between runs. Default True.
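The no_delay fallback policy can be sketched with the standard queue module. This is an illustrative stand-alone function, not M-LOOP's internals; the function and argument names are assumptions.

```python
import queue

def choose_next_params(ml_params_queue, training_learner_next, no_delay=True,
                       ml_timeout=None):
    """Pick the next parameters, falling back to the training learner.

    With no_delay=True, if the machine learning learner has not yet put its
    next parameters on the queue, the training learner supplies them so the
    experiment is never kept waiting. Otherwise, block until the machine
    learning learner delivers.
    """
    if no_delay:
        try:
            return ml_params_queue.get(block=False), 'machine_learner'
        except queue.Empty:
            return training_learner_next(), 'training_learner'
    return ml_params_queue.get(timeout=ml_timeout), 'machine_learner'

ml_queue = queue.Queue()
# The ML learner was too slow here, so the training learner steps in.
params, source = choose_next_params(ml_queue, lambda: [0.1, 0.2])
```

This captures the trade-off described above: no_delay=True keeps the duty cycle high at the cost of occasionally running non-ML-suggested points.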
_get_cost_and_in_dict()
Get cost, uncertainty, parameters, bad, and extra data from the experiment.
This method calls _get_cost_and_in_dict() of the parent Controller class and additionally sends the results to the machine learning learner.
_optimization_routine()
Overrides _optimization_routine. Uses the parent routine for the training runs, and implements a customized routine when running the machine learning learner.
_shut_down()
Shut down and clean up the resources of the machine learning controller.
_start_up()
Runs the parent method and also starts the training_learner.
print_results()
Adds some additional output to the results, specific to this controller.
class mloop.controllers.NelderMeadController(interface, **kwargs)
Bases: mloop.controllers.Controller
Controller for the Nelder–Mead solver. Suggests new parameters based on the Nelder–Mead algorithm. Can take no boundaries or hard boundaries. More details on the Nelder–Mead options are in the learners section.
Parameters:
- params_out_queue (queue) – Queue for parameters to next be run by the experiment.
- costs_in_queue (queue) – Queue for costs (and other details) that have been returned by the experiment.
Keyword Arguments: **kwargs (Optional [dict]) – Dictionary of options to be passed to the Controller parent class and the Nelder–Mead learner.
class mloop.controllers.NeuralNetController(interface, num_params=None, min_boundary=None, max_boundary=None, trust_region=None, learner_archive_filename='learner_archive', learner_archive_file_type='txt', param_names=None, **kwargs)
Bases: mloop.controllers.MachineLearnerController
Controller for the Neural Net solver. Primarily suggests new points from the Neural Net learner. However, during the initial few runs it must rely on a different optimization algorithm to get some points to seed the learner.
Parameters: interface (Interface) – The interface to the experiment under optimization.
Keyword Arguments: **kwargs (Optional [dict]) – Dictionary of options to be passed to the MachineLearnerController parent class and the Neural Net learner.
class mloop.controllers.RandomController(interface, **kwargs)
Bases: mloop.controllers.Controller
Controller that simply returns random variables for the next parameters. Costs are stored but do not influence future points picked.
Parameters:
- params_out_queue (queue) – Queue for parameters to next be run by the experiment.
- costs_in_queue (queue) – Queue for costs (and other details) that have been returned by the experiment.
Keyword Arguments: **kwargs (Optional [dict]) – Dictionary of options to be passed to the Controller parent class and the Random learner.
mloop.controllers.create_controller(interface, controller_type='gaussian_process', **controller_config_dict)
Start the controller with the options provided.
Parameters: interface (interface) – Interface with queues and events to be passed to the controller.
Keyword Arguments: - controller_type (Optional [str]) – Defines the type of controller. Can be ‘random’, ‘nelder’, ‘gaussian_process’ or ‘neural_net’. Alternatively, the controller can belong to an external module, in which case this parameter should be ‘module_name:controller_name’. Default ‘gaussian_process’.
- **controller_config_dict – Options to be passed to controller.
Returns: Threadable object which must be started with start() to get the controller running.
Return type: Controller
Raises: ValueError – Raised if controller_type is an unrecognized string.
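The string-to-class dispatch, including the ‘module_name:controller_name’ form, might look like the following sketch. This is a hypothetical re-creation for illustration: the real function instantiates M-LOOP's controller classes directly and would import the external module (e.g. with importlib) rather than just splitting the string.

```python
# Assumed mapping from type strings to class names, mirroring the docs above.
KNOWN_CONTROLLERS = {'random': 'RandomController',
                     'nelder': 'NelderMeadController',
                     'gaussian_process': 'GaussianProcessController',
                     'neural_net': 'NeuralNetController'}

def resolve_controller(controller_type='gaussian_process'):
    """Map a controller_type string to a controller class name (sketch)."""
    if controller_type in KNOWN_CONTROLLERS:
        return KNOWN_CONTROLLERS[controller_type]
    if ':' in controller_type:
        # External controllers are addressed as 'module_name:controller_name';
        # real code would import module_name and fetch the class from it.
        module_name, class_name = controller_type.split(':', 1)
        return class_name
    raise ValueError(f'Unknown controller type: {controller_type!r}')
```

The ValueError branch corresponds to the Raises clause documented above.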