- Source: Gradient method
In optimization, a gradient method is an algorithm to solve problems of the form
min_{x ∈ ℝⁿ} f(x)
with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are the gradient descent and the conjugate gradient.
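As an illustration, the simplest gradient method, gradient descent, repeatedly steps in the direction of the negative gradient. Below is a minimal sketch (the function name, step size, and stopping rule are illustrative choices, not prescribed by the source):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Minimize f over R^n by stepping against its gradient grad(x).

    Stops when the gradient norm falls below tol (an assumed
    convergence criterion) or after max_iter iterations.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * g  # search direction is the negative gradient
    return x

# Example: f(x) = ||x - 1||^2 has gradient 2*(x - 1) and minimum at x = (1, 1).
x_min = gradient_descent(lambda x: 2 * (x - np.array([1.0, 1.0])),
                         x0=[0.0, 0.0])
```

For this quadratic, each step shrinks the error by a constant factor, so the iterates converge to the minimizer (1, 1). Other gradient methods, such as conjugate gradient, differ only in how they choose the search direction and step size.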