
Optimization methods

Author: Source: Wikipedia


Made to order, delivery in 2-4 weeks

15.31 €

regular price: 17.40 €

About the book

Source: Wikipedia. Pages: 35. Chapters: Expectation-maximization algorithm, Levenberg-Marquardt algorithm, Gauss-Newton algorithm, Gradient descent, Derivation of the conjugate gradient method, Luus-Jaakola, BFGS method, Cutting-plane method, Golden section search, Karmarkar's algorithm, Newton's method in optimization, Nonlinear programming, Quasi-Newton method, Interior point method, Simultaneous perturbation stochastic approximation, L-BFGS, WORHP, Nonlinear conjugate gradient method, Kantorovich theorem, Frank-Wolfe algorithm, Trust region, Line search, Sequential quadratic programming, Davidon-Fletcher-Powell formula, IPOPT, Successive parabolic interpolation, SR1 formula, Powell's method, Local convergence, Optimization algorithm.

Excerpt: In statistics, an expectation-maximization (EM) algorithm is a method for finding maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. EM is an iterative method which alternates between an expectation (E) step, which computes the expectation of the log-likelihood evaluated using the current estimate of the parameters, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step.

The EM algorithm was explained and given its name in a classic 1977 paper by Arthur Dempster, Nan Laird, and Donald Rubin. They pointed out that the method had been "proposed many times in special circumstances" by earlier authors. In particular, a very detailed treatment of the EM method for exponential families was published by Rolf Sundberg in his thesis and in several papers following his collaboration with Per Martin-Löf and Anders Martin-Löf. The Dempster-Laird-Rubin paper of 1977 generalized the method and sketched a convergence analysis for a wider class of problems.

Regardless of earlier inventions, the innovative Dempster-Laird-Rubin paper in the Journal of the Royal Statistical Society received an enthusiastic discussion at the Royal Statistical Society meeting, with Sundberg calling the paper "brilliant". The Dempster-Laird-Rubin paper established the EM method as an important tool of statistical analysis. The convergence analysis in the Dempster-Laird-Rubin paper was flawed, and a correct convergence analysis was published by C. F. Jeff Wu in 1983. Wu's proof established the EM method's convergence outside of the exponential family, as claimed by Dempster-Laird-Rubin. Given a statistical model consisting of a set of observed data, a set of unobserved latent data or missing values, and a vector of unknown parameters …
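The E-step/M-step alternation described in the excerpt can be sketched for the simplest common case, a two-component one-dimensional Gaussian mixture. This is an illustrative sketch, not code from the book: the function name `em_gmm_1d`, the quartile-based initialization, and the fixed iteration count are all choices made here for brevity.

```python
import math

def em_gmm_1d(data, iters=50):
    """Minimal EM sketch for a two-component 1D Gaussian mixture.

    Returns (pi, mu1, sigma1, mu2, sigma2), where pi is the mixing
    weight of component 1. Initialization and iteration count are
    crude, illustrative choices.
    """
    xs = sorted(data)
    n = len(xs)
    # Crude initialization: means at the lower and upper quartiles.
    mu1, mu2 = xs[n // 4], xs[3 * n // 4]
    s1 = s2 = (xs[-1] - xs[0]) / 4 or 1.0
    pi = 0.5

    def pdf(x, mu, s):
        # Gaussian density with mean mu and standard deviation s.
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E step: responsibility of component 1 for each data point,
        # computed from the current parameter estimates.
        r = []
        for x in data:
            p1 = pi * pdf(x, mu1, s1)
            p2 = (1 - pi) * pdf(x, mu2, s2)
            r.append(p1 / (p1 + p2))
        # M step: re-estimate parameters by maximizing the expected
        # log-likelihood, i.e. responsibility-weighted averages.
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2) or 1e-6
        pi = n1 / len(data)
    return pi, mu1, s1, mu2, s2
```

On data drawn from two well-separated Gaussians, the recovered means converge to the two cluster centers; each M step provably does not decrease the observed-data likelihood, which is the core property established in the convergence analyses mentioned above.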

  • Publisher: Books LLC, Reference Series
  • Year of publication: 2013
  • Format: Paperback
  • Dimensions: 246 x 189 mm
  • Language: English
  • ISBN: 9781233138050
