
---
title: ADAM
description: API reference for qiskit.aqua.components.optimizers.ADAM
in_page_toc_min_heading_level: 1
python_api_type: class
python_api_name: qiskit.aqua.components.optimizers.ADAM
---
# ADAM
<Class id="qiskit.aqua.components.optimizers.ADAM" isDedicatedPage={true} github="https://github.com/qiskit-community/qiskit-aqua/tree/stable/0.9/qiskit/aqua/components/optimizers/adam_amsgrad.py" signature="ADAM(maxiter=10000, tol=1e-06, lr=0.001, beta_1=0.9, beta_2=0.99, noise_factor=1e-08, eps=1e-10, amsgrad=False, snapshot_dir=None)" modifiers="class">
Bases: `qiskit.aqua.components.optimizers.optimizer.Optimizer`
Adam and AMSGRAD optimizers.
Adam \[1] is a gradient-based optimization algorithm that relies on adaptive estimates of lower-order moments. The algorithm requires little memory and is invariant to diagonal rescaling of the gradients. Furthermore, it can cope with non-stationary objective functions and noisy and/or sparse gradients.
AMSGRAD \[2] (a variant of Adam) uses a long-term memory of past gradients and, thereby, improves convergence properties.
**References**
\[1]: Kingma, Diederik & Ba, Jimmy (2014), Adam: A Method for Stochastic Optimization. [arXiv:1412.6980](https://arxiv.org/abs/1412.6980)
\[2]: Reddi, Sashank J., Kale, Satyen & Kumar, Sanjiv (2018), On the Convergence of Adam and Beyond. [arXiv:1904.09237](https://arxiv.org/abs/1904.09237)
**Parameters**
* **maxiter** (`int`) – Maximum number of iterations.
* **tol** (`float`) – Tolerance for termination.
* **lr** (`float`) – Value >= 0. Learning rate.
* **beta\_1** (`float`) – Value in range 0 to 1. Generally close to 1.
* **beta\_2** (`float`) – Value in range 0 to 1. Generally close to 1.
* **noise\_factor** (`float`) – Value >= 0. Noise factor.
* **eps** (`float`) – Value >= 0. Epsilon to be used for finite differences if no analytic gradient method is given.
* **amsgrad** (`bool`) – True to use AMSGRAD, False if not.
* **snapshot\_dir** (`Optional`\[`str`]) – If not None, save the optimizer's parameters after every step to the given directory.
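As a quick orientation before the method reference, here is a minimal construction sketch; the hyperparameter values are illustrative choices, not recommendations:

```python
from qiskit.aqua.components.optimizers import ADAM

# Plain Adam with illustrative hyperparameters
adam = ADAM(maxiter=1000, lr=0.01, tol=1e-6)

# The AMSGRAD variant is selected via the amsgrad flag
amsgrad = ADAM(maxiter=1000, lr=0.01, amsgrad=True)
```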
## Methods
### get\_support\_level
<Function id="qiskit.aqua.components.optimizers.ADAM.get_support_level" signature="ADAM.get_support_level()">
Return the support level dictionary.
</Function>
### gradient\_num\_diff
<Function id="qiskit.aqua.components.optimizers.ADAM.gradient_num_diff" signature="ADAM.gradient_num_diff(x_center, f, epsilon, max_evals_grouped=1)" modifiers="static">
Compute the gradient of the function `f` around the point `x_center` by numeric differentiation, grouping evaluations so they can be performed in parallel.
**Parameters**
* **x\_center** (*ndarray*) – Point around which the gradient is computed.
* **f** (*func*) – The function whose gradient is to be computed.
* **epsilon** (*float*) – The epsilon used in the numeric differentiation.
* **max\_evals\_grouped** (*int*) – The maximum number of point evaluations grouped into a single call to `f`.
**Returns**
The gradient computed.
**Return type**
`numpy.ndarray`
</Function>
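A sketch of calling this static helper directly; the quadratic `f` and the epsilon value are illustrative choices:

```python
import numpy as np
from qiskit.aqua.components.optimizers import ADAM

def f(x):
    # Illustrative objective; its analytic gradient is 2 * x
    return float(np.sum(x ** 2))

x_center = np.array([1.0, -3.0])
grad = ADAM.gradient_num_diff(x_center, f, epsilon=1e-6)
print(grad)  # approximately [2., -6.]
```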
### load\_params
<Function id="qiskit.aqua.components.optimizers.ADAM.load_params" signature="ADAM.load_params(load_dir)">
Load iteration parameters from a file called `adam_params.csv`.
**Parameters**
**load\_dir** (`str`) – The directory containing `adam_params.csv`.
**Return type**
`None`
</Function>
### minimize
<Function id="qiskit.aqua.components.optimizers.ADAM.minimize" signature="ADAM.minimize(objective_function, initial_point, gradient_function)">
Run the minimization.
**Parameters**
* **objective\_function** (`Callable`\[\[`ndarray`], `float`]) – A function handle to the objective function.
* **initial\_point** (`ndarray`) – The initial iteration point.
* **gradient\_function** (`Callable`\[\[`ndarray`], `float`]) – A function handle to the gradient of the objective function.
**Return type**
`Tuple`\[`ndarray`, `float`, `int`]
**Returns**
A tuple of (optimal parameters, optimal value, number of iterations).
</Function>
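A minimal sketch of calling `minimize` with an analytic gradient; the objective and its gradient are example choices:

```python
import numpy as np
from qiskit.aqua.components.optimizers import ADAM

objective = lambda x: float(np.sum((x - 1.0) ** 2))  # minimum at x = [1, 1, 1]
gradient = lambda x: 2.0 * (x - 1.0)                 # analytic gradient

adam = ADAM(maxiter=2000, lr=0.05)
point, value, niter = adam.minimize(objective, np.zeros(3), gradient)
print(point, value, niter)
```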
### optimize
<Function id="qiskit.aqua.components.optimizers.ADAM.optimize" signature="ADAM.optimize(num_vars, objective_function, gradient_function=None, variable_bounds=None, initial_point=None)">
Perform optimization.
**Parameters**
* **num\_vars** (`int`) – Number of parameters to be optimized.
* **objective\_function** (`Callable`\[\[`ndarray`], `float`]) – Handle to a function that computes the objective function.
* **gradient\_function** (`Optional`\[`Callable`\[\[`ndarray`], `float`]]) – Handle to a function that computes the gradient of the objective function.
* **variable\_bounds** (`Optional`\[`List`\[`Tuple`\[`float`, `float`]]]) – Deprecated; ignored by this optimizer.
* **initial\_point** (`Optional`\[`ndarray`]) – The initial point for the optimization.
**Return type**
`Tuple`\[`ndarray`, `float`, `int`]
**Returns**
A tuple (point, value, nfev) where
> point: a 1D numpy.ndarray\[float] containing the solution
>
> value: a float with the objective function value
>
> nfev: the number of objective function calls
</Function>
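A sketch of this higher-level entry point; when `gradient_function` is omitted, the gradient is approximated by finite differences using `eps`, as described under the constructor parameters (the objective here is illustrative):

```python
import numpy as np
from qiskit.aqua.components.optimizers import ADAM

def objective(x):
    # Illustrative convex quadratic with minimum at (1, -2)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

adam = ADAM(maxiter=1000, lr=0.1)
point, value, nfev = adam.optimize(
    num_vars=2,
    objective_function=objective,
    initial_point=np.array([0.0, 0.0]),
)
print(point, value, nfev)
```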
### print\_options
<Function id="qiskit.aqua.components.optimizers.ADAM.print_options" signature="ADAM.print_options()">
Print algorithm-specific options.
</Function>
### save\_params
<Function id="qiskit.aqua.components.optimizers.ADAM.save_params" signature="ADAM.save_params(snapshot_dir)">
Save the current iteration parameters to a file called `adam_params.csv`.
<Admonition title="Note" type="note">
If the file already exists, the current parameters are appended to it; the file is not overwritten.
</Admonition>
**Parameters**
**snapshot\_dir** (`str`) – The directory to store the file in.
**Return type**
`None`
</Function>
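A sketch of snapshotting, assuming a writable directory (the path below is a hypothetical choice); with `snapshot_dir` passed to the constructor, parameters are also written automatically after every step:

```python
import os
from qiskit.aqua.components.optimizers import ADAM

snapshot_dir = "/tmp/adam_snapshots"  # hypothetical location
os.makedirs(snapshot_dir, exist_ok=True)

adam = ADAM(maxiter=1000, snapshot_dir=snapshot_dir)
# ... run an optimization; each step appends to adam_params.csv ...
adam.save_params(snapshot_dir)  # explicitly append the current state
adam.load_params(snapshot_dir)  # restore state from adam_params.csv
```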
### set\_max\_evals\_grouped
<Function id="qiskit.aqua.components.optimizers.ADAM.set_max_evals_grouped" signature="ADAM.set_max_evals_grouped(limit)">
Set the maximum number of function evaluations that may be grouped into a single call.
</Function>
### set\_options
<Function id="qiskit.aqua.components.optimizers.ADAM.set_options" signature="ADAM.set_options(**kwargs)">
Sets or updates values in the options dictionary.
The options dictionary may be used internally by a given optimizer to pass additional optional values for the underlying optimizer/optimization function used. The options dictionary may be initially populated with a set of key/values when the given optimizer is constructed.
**Parameters**
**kwargs** (*dict*) – options, given as name=value.
</Function>
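A small sketch; per the note above, whether a given key is actually consumed depends on the underlying optimizer, and the key shown here is a hypothetical example:

```python
from qiskit.aqua.components.optimizers import ADAM

adam = ADAM()
adam.set_options(ftol=1e-8)  # hypothetical key, stored in the options dict
adam.print_options()         # inspect the algorithm-specific options
```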
### wrap\_function
<Function id="qiskit.aqua.components.optimizers.ADAM.wrap_function" signature="ADAM.wrap_function(function, args)" modifiers="static">
Wrap the function so that the given args are implicitly injected on every call.
**Parameters**
* **function** (*func*) – the target function
* **args** (*tuple*) – the args to be injected
**Returns**
wrapper
**Return type**
function\_wrapper
</Function>
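An illustrative sketch of this static helper; `shifted_norm` and its extra arguments are example choices:

```python
from qiskit.aqua.components.optimizers import ADAM

def shifted_norm(x, offset, scale):
    # Example target function taking two extra arguments
    return scale * sum((xi - offset) ** 2 for xi in x)

# Fix offset=1.0 and scale=2.0 at wrap time
wrapped = ADAM.wrap_function(shifted_norm, (1.0, 2.0))
print(wrapped([0.0, 0.0]))  # shifted_norm([0., 0.], 1.0, 2.0) -> 4.0
```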
## Attributes
### bounds\_support\_level
<Attribute id="qiskit.aqua.components.optimizers.ADAM.bounds_support_level">
Returns bounds support level
</Attribute>
### gradient\_support\_level
<Attribute id="qiskit.aqua.components.optimizers.ADAM.gradient_support_level">
Returns gradient support level
</Attribute>
### initial\_point\_support\_level
<Attribute id="qiskit.aqua.components.optimizers.ADAM.initial_point_support_level">
Returns initial point support level
</Attribute>
### is\_bounds\_ignored
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_bounds_ignored">
Returns whether bounds are ignored.
</Attribute>
### is\_bounds\_required
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_bounds_required">
Returns whether bounds are required.
</Attribute>
### is\_bounds\_supported
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_bounds_supported">
Returns whether bounds are supported.
</Attribute>
### is\_gradient\_ignored
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_gradient_ignored">
Returns whether the gradient is ignored.
</Attribute>
### is\_gradient\_required
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_gradient_required">
Returns whether the gradient is required.
</Attribute>
### is\_gradient\_supported
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_gradient_supported">
Returns whether the gradient is supported.
</Attribute>
### is\_initial\_point\_ignored
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_initial_point_ignored">
Returns whether the initial point is ignored.
</Attribute>
### is\_initial\_point\_required
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_initial_point_required">
Returns whether the initial point is required.
</Attribute>
### is\_initial\_point\_supported
<Attribute id="qiskit.aqua.components.optimizers.ADAM.is_initial_point_supported">
Returns whether the initial point is supported.
</Attribute>
### setting
<Attribute id="qiskit.aqua.components.optimizers.ADAM.setting">
Return setting
</Attribute>
</Class>