---
title: ADAM
description: API reference for qiskit.algorithms.optimizers.ADAM
in_page_toc_min_heading_level: 1
python_api_type: class
python_api_name: qiskit.algorithms.optimizers.ADAM
---
# ADAM
<Class id="qiskit.algorithms.optimizers.ADAM" isDedicatedPage={true} github="https://github.com/qiskit/qiskit/tree/stable/0.45/qiskit/algorithms/optimizers/adam_amsgrad.py" signature="qiskit.algorithms.optimizers.ADAM(maxiter=10000, tol=1e-06, lr=0.001, beta_1=0.9, beta_2=0.99, noise_factor=1e-08, eps=1e-10, amsgrad=False, snapshot_dir=None)" modifiers="class">
Bases: [`Optimizer`](qiskit.algorithms.optimizers.Optimizer "qiskit.algorithms.optimizers.optimizer.Optimizer")
Adam and AMSGRAD optimizers.
Adam \[1] is a gradient-based optimization algorithm that relies on adaptive estimates of lower-order moments. The algorithm requires little memory and is invariant to diagonal rescaling of the gradients. Furthermore, it is able to cope with non-stationary objective functions and noisy and/or sparse gradients.
AMSGRAD \[2] (a variant of Adam) uses a long-term memory of past gradients and thereby improves convergence properties.
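For reference, the standard Adam update from \[1], with learning rate $\alpha$ (the `lr` parameter), decay rates $\beta_1$ and $\beta_2$, and gradient $g_t$ of the objective at step $t$, is:
$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1 - \beta_1^t), \qquad \hat{v}_t = v_t / (1 - \beta_2^t) \\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)
\end{aligned}
$$
AMSGRAD \[2] keeps a running maximum of the second-moment estimate, $\hat{v}_t = \max(\hat{v}_{t-1}, v_t)$, so the effective step size never grows. (In this implementation the denominator offset $\epsilon$ corresponds to the `noise_factor` parameter; the `eps` parameter is used only for finite-difference gradients.)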
**References**
\[1]: Kingma, Diederik & Ba, Jimmy (2014), Adam: A Method for Stochastic Optimization. [arXiv:1412.6980](https://arxiv.org/abs/1412.6980)
\[2]: Sashank J. Reddi, Satyen Kale, and Sanjiv Kumar (2018), On the Convergence of Adam and Beyond. [arXiv:1904.09237](https://arxiv.org/abs/1904.09237)
<Admonition title="Note" type="note">
Parts of this component are randomized by default. If you want to reproduce behavior, set the random number generator seed in `algorithm_globals` (`qiskit.utils.algorithm_globals.random_seed = seed`).
</Admonition>
**Parameters**
* **maxiter** ([*int*](https://docs.python.org/3/library/functions.html#int "(in Python v3.12)")) – Maximum number of iterations.
* **tol** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Tolerance for termination.
* **lr** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Value >= 0; the learning rate.
* **beta\_1** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Value in range 0 to 1; generally close to 1.
* **beta\_2** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Value in range 0 to 1; generally close to 1.
* **noise\_factor** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Value >= 0; the noise factor.
* **eps** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – Value >= 0; the epsilon used for finite-difference gradients if no analytic gradient method is given.
* **amsgrad** ([*bool*](https://docs.python.org/3/library/functions.html#bool "(in Python v3.12)")) – True to use AMSGRAD, False if not.
* **snapshot\_dir** ([*str*](https://docs.python.org/3/library/stdtypes.html#str "(in Python v3.12)") *| None*) – If not None, save the optimizer's parameters after every step to the given directory.
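A minimal usage sketch (the quadratic objective below is illustrative, not from the original reference):
```python
import numpy as np
from qiskit.algorithms.optimizers import ADAM

# Illustrative objective: a quadratic with its minimum at (1, -2).
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

adam = ADAM(maxiter=200, lr=0.1, amsgrad=True)
result = adam.minimize(fun=objective, x0=np.zeros(2))
print(result.x, result.fun, result.nfev)
```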
## Attributes
### bounds\_support\_level
<Attribute id="qiskit.algorithms.optimizers.ADAM.bounds_support_level">
Returns bounds support level
</Attribute>
### gradient\_support\_level
<Attribute id="qiskit.algorithms.optimizers.ADAM.gradient_support_level">
Returns gradient support level
</Attribute>
### initial\_point\_support\_level
<Attribute id="qiskit.algorithms.optimizers.ADAM.initial_point_support_level">
Returns initial point support level
</Attribute>
### is\_bounds\_ignored
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_bounds_ignored">
Returns is bounds ignored
</Attribute>
### is\_bounds\_required
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_bounds_required">
Returns is bounds required
</Attribute>
### is\_bounds\_supported
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_bounds_supported">
Returns is bounds supported
</Attribute>
### is\_gradient\_ignored
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_gradient_ignored">
Returns is gradient ignored
</Attribute>
### is\_gradient\_required
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_gradient_required">
Returns is gradient required
</Attribute>
### is\_gradient\_supported
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_gradient_supported">
Returns is gradient supported
</Attribute>
### is\_initial\_point\_ignored
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_initial_point_ignored">
Returns is initial point ignored
</Attribute>
### is\_initial\_point\_required
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_initial_point_required">
Returns is initial point required
</Attribute>
### is\_initial\_point\_supported
<Attribute id="qiskit.algorithms.optimizers.ADAM.is_initial_point_supported">
Returns is initial point supported
</Attribute>
### setting
<Attribute id="qiskit.algorithms.optimizers.ADAM.setting">
Return setting
</Attribute>
### settings
<Attribute id="qiskit.algorithms.optimizers.ADAM.settings" />
## Methods
### get\_support\_level
<Function id="qiskit.algorithms.optimizers.ADAM.get_support_level" signature="get_support_level()">
Return support level dictionary
</Function>
### gradient\_num\_diff
<Function id="qiskit.algorithms.optimizers.ADAM.gradient_num_diff" signature="gradient_num_diff(x_center, f, epsilon, max_evals_grouped=None)" modifiers="static">
Compute the gradient around the point `x_center` by numeric differentiation, evaluating the function in parallel where possible.
**Parameters**
* **x\_center** (*ndarray*) – The point around which the gradient is computed.
* **f** (*func*) – The function whose gradient is to be computed.
* **epsilon** ([*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")) – The epsilon used in the numeric differentiation.
* **max\_evals\_grouped** ([*int*](https://docs.python.org/3/library/functions.html#int "(in Python v3.12)")) – Maximum number of evaluations grouped per call; defaults to 1 (i.e. no batching).
**Returns**
The gradient computed.
**Return type**
grad
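A short sketch of calling this static method directly (the function below is a placeholder):
```python
import numpy as np
from qiskit.algorithms.optimizers import ADAM

# Finite-difference gradient of a placeholder function f(x) = sum(x^2),
# whose exact gradient at (1, 2) is (2, 4).
f = lambda x: float(np.sum(x**2))
grad = ADAM.gradient_num_diff(np.array([1.0, 2.0]), f, 1e-6)
print(grad)  # approximately [2.0, 4.0]
```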
</Function>
### load\_params
<Function id="qiskit.algorithms.optimizers.ADAM.load_params" signature="load_params(load_dir)">
Load iteration parameters from a file called `adam_params.csv`.
**Parameters**
**load\_dir** ([*str*](https://docs.python.org/3/library/stdtypes.html#str "(in Python v3.12)")) The directory containing `adam_params.csv`.
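A sketch of the snapshot workflow, assuming a writable directory (the objective and location below are illustrative):
```python
import tempfile
import numpy as np
from qiskit.algorithms.optimizers import ADAM

snapshot_dir = tempfile.mkdtemp()  # illustrative location for adam_params.csv

# With snapshot_dir set, the optimizer appends its parameters to
# adam_params.csv after every step of the run.
adam = ADAM(maxiter=50, snapshot_dir=snapshot_dir)
adam.minimize(fun=lambda x: float(x[0] ** 2), x0=np.array([3.0]))

# Later, restore the saved iteration parameters from that directory.
adam.load_params(snapshot_dir)
```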
</Function>
### minimize
<Function id="qiskit.algorithms.optimizers.ADAM.minimize" signature="minimize(fun, x0, jac=None, bounds=None, objective_function=None, initial_point=None, gradient_function=None)">
Minimize the scalar function.
<Admonition title="Deprecated since version 0.19.0" type="danger">
`qiskit.algorithms.optimizers.adam_amsgrad.ADAM.minimize()`'s argument `gradient_function` is deprecated as of qiskit-terra 0.19.0. It will be removed no earlier than 3 months after the release date. Instead, use the argument `jac`, which behaves identically.
</Admonition>
<Admonition title="Deprecated since version 0.19.0" type="danger">
`qiskit.algorithms.optimizers.adam_amsgrad.ADAM.minimize()`'s argument `initial_point` is deprecated as of qiskit-terra 0.19.0. It will be removed no earlier than 3 months after the release date. Instead, use the argument `x0`, which behaves identically.
</Admonition>
<Admonition title="Deprecated since version 0.19.0" type="danger">
`qiskit.algorithms.optimizers.adam_amsgrad.ADAM.minimize()`'s argument `objective_function` is deprecated as of qiskit-terra 0.19.0. It will be removed no earlier than 3 months after the release date. Instead, use the argument `fun`, which behaves identically.
</Admonition>
**Parameters**
* **fun** (*Callable\[\[POINT],* [*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")*]*) – The scalar function to minimize.
* **x0** (*POINT*) – The initial point for the minimization.
* **jac** (*Callable\[\[POINT], POINT] | None*) – The gradient of the scalar function `fun`.
* **bounds** ([*list*](https://docs.python.org/3/library/stdtypes.html#list "(in Python v3.12)")*\[*[*tuple*](https://docs.python.org/3/library/stdtypes.html#tuple "(in Python v3.12)")*\[*[*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")*,* [*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")*]] | None*) – Bounds for the variables of `fun`. This argument might be ignored if the optimizer does not support bounds.
* **objective\_function** (*Callable\[\[np.ndarray],* [*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")*] | None*) – DEPRECATED. A function handle to the objective function.
* **initial\_point** (*np.ndarray | None*) – DEPRECATED. The initial iteration point.
* **gradient\_function** (*Callable\[\[np.ndarray],* [*float*](https://docs.python.org/3/library/functions.html#float "(in Python v3.12)")*] | None*) – DEPRECATED. A function handle to the gradient of the objective function.
**Returns**
The result of the optimization, containing, for example, the optimal point as the attribute `x`.
**Return type**
[OptimizerResult](qiskit.algorithms.optimizers.OptimizerResult "qiskit.algorithms.optimizers.OptimizerResult")
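A brief sketch of passing an analytic gradient via `jac` (the objective below is a placeholder):
```python
import numpy as np
from qiskit.algorithms.optimizers import ADAM

def fun(x):
    return float((x[0] - 1.0) ** 2)

def jac(x):
    return np.array([2.0 * (x[0] - 1.0)])  # analytic gradient of fun

result = ADAM(maxiter=100, lr=0.1).minimize(fun=fun, x0=np.array([5.0]), jac=jac)
print(result.x)  # should approach [1.0]
```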
</Function>
### print\_options
<Function id="qiskit.algorithms.optimizers.ADAM.print_options" signature="print_options()">
Print algorithm-specific options.
</Function>
### save\_params
<Function id="qiskit.algorithms.optimizers.ADAM.save_params" signature="save_params(snapshot_dir)">
Save the current iteration parameters to a file called `adam_params.csv`.
<Admonition title="Note" type="note">
The current parameters are appended to the file, if it exists already. The file is not overwritten.
</Admonition>
**Parameters**
**snapshot\_dir** ([*str*](https://docs.python.org/3/library/stdtypes.html#str "(in Python v3.12)")) The directory to store the file in.
</Function>
### set\_max\_evals\_grouped
<Function id="qiskit.algorithms.optimizers.ADAM.set_max_evals_grouped" signature="set_max_evals_grouped(limit)">
Set the maximum number of function evaluations to group into a single objective call.
</Function>
### set\_options
<Function id="qiskit.algorithms.optimizers.ADAM.set_options" signature="set_options(**kwargs)">
Sets or updates values in the options dictionary.
The options dictionary may be used internally by a given optimizer to pass additional optional values for the underlying optimizer/optimization function used. The options dictionary may be initially populated with a set of key/values when the given optimizer is constructed.
**Parameters**
**kwargs** ([*dict*](https://docs.python.org/3/library/stdtypes.html#dict "(in Python v3.12)")) – The options to set or update, given as name=value pairs.
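A small sketch; note that whether a given key actually affects a particular optimizer depends on its implementation, as described above:
```python
from qiskit.algorithms.optimizers import ADAM

adam = ADAM()
# Update the internal options dictionary; keys are given as name=value.
adam.set_options(maxiter=500)
```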
</Function>
### wrap\_function
<Function id="qiskit.algorithms.optimizers.ADAM.wrap_function" signature="wrap_function(function, args)" modifiers="static">
Wrap the function to implicitly inject the args at the call of the function.
**Parameters**
* **function** (*func*) – The target function.
* **args** ([*tuple*](https://docs.python.org/3/library/stdtypes.html#tuple "(in Python v3.12)")) – The args to be injected.
**Returns**
wrapper
**Return type**
function\_wrapper
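A quick sketch of the wrapping behavior (the `power` function is a placeholder):
```python
from qiskit.algorithms.optimizers import ADAM

def power(x, exponent):
    return x ** exponent

# The returned wrapper appends the injected args after the caller's arguments.
square = ADAM.wrap_function(power, (2,))
print(square(3))  # 9, i.e. power(3, 2)
```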
</Function>
</Class>