Course currently unavailable

Large-Scale Optimization

Course

In Mendrisio

Price on request


Description

  • Type

    Course

Computational science inevitably leads to systems of equations, and to functions to optimize subject to more equations. The course starts with iterative methods for solving sparse systems Ax = b and least-squares problems, using the Lanczos process for symmetric systems and the Golub-Kahan bidiagonalization for more general systems. The associated solvers are CG, MINRES, SYMMLQ, LSQR, and LSMR. All of these methods need minimal storage and are guaranteed to converge.

We then study the simplex and reduced-gradient methods for optimization subject to sparse linear constraints (and bounds on the variables), with the LUSOL package providing reliable basis factorizations and updates. Interior methods handle bounds differently but still need sparse-matrix methods, as illustrated by PDCO.

Finally, we explore augmented Lagrangian methods and SQP methods for handling sparse linear and nonlinear constraints (LANCELOT, MINOS, SQOPT, SNOPT).
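
To make the first topic concrete, here is a minimal SciPy sketch (illustrative only, not course material) of solving a sparse symmetric system with CG and MINRES; the 1-D Laplacian test matrix is an arbitrary choice.

```python
# Minimal sketch: iterative solution of a sparse symmetric system Ax = b.
# CG assumes A is symmetric positive definite; MINRES needs only symmetry.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, minres

n = 1000
# Symmetric positive definite test matrix: the 1-D Laplacian.
A = sp.diags([-np.ones(n - 1), 2 * np.ones(n), -np.ones(n - 1)],
             offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

x_cg, info_cg = cg(A, b)          # info == 0 signals convergence
x_mr, info_mr = minres(A, b)

print(info_cg, np.linalg.norm(A @ x_cg - b))   # residual check
print(info_mr, np.linalg.norm(A @ x_mr - b))
```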

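The least-squares solvers LSQR and LSMR, both built on the Golub-Kahan bidiagonalization, are likewise available in scipy.sparse.linalg; a small sketch on a made-up random sparse problem:

```python
# Minimal sketch: sparse least squares, min ||Ax - b||_2, via LSQR and LSMR.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr, lsmr

rng = np.random.default_rng(0)
m, n = 2000, 500
A = sp.random(m, n, density=0.01, format="csr", random_state=0)
b = rng.standard_normal(m)

x_lsqr = lsqr(A, b)[0]    # the solution is the first entry of the result tuple
x_lsmr = lsmr(A, b)[0]

# At a least-squares solution the residual is orthogonal to range(A),
# so ||A'r|| should be near zero for both methods.
for x in (x_lsqr, x_lsmr):
    r = b - A @ x
    print(np.linalg.norm(A.T @ r))
```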

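The simplex method can be tried through scipy.optimize.linprog, whose "highs-ds" option runs the HiGHS dual simplex code; the data below are invented for illustration:

```python
# Minimal sketch: a small linear program with bounds,
#   minimize c'x  subject to  A x = b,  0 <= x <= 10,
# solved with the HiGHS dual simplex via scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0, -1.0])
A = np.array([[1.0,  1.0, 1.0],
              [1.0, -1.0, 0.0]])
b = np.array([4.0, 1.0])

res = linprog(c, A_eq=A, b_eq=b, bounds=(0, 10), method="highs-ds")
print(res.status, res.x, res.fun)   # expect x = [1, 0, 3], fun = -2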

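Interior methods replace the bounds with a logarithmic barrier and take Newton steps on the remaining equality-constrained problem. The loop below is a didactic dense sketch of that idea under simple assumptions (a strictly feasible start, full-rank A); it is not PDCO, which solves a regularized formulation with sparse linear algebra at every Newton step.

```python
# Toy log-barrier interior method for  min c'x  s.t.  Ax = b, x >= 0.
import numpy as np

def barrier_lp(A, b, c, x, mu=1.0, shrink=0.2, tol=1e-8, inner=50):
    """x must be strictly feasible: Ax = b and x > 0."""
    while mu > tol:
        for _ in range(inner):
            g = c - mu / x                 # gradient of c'x - mu*sum(log x)
            Hinv = np.diag(x * x / mu)     # inverse of the (diagonal) Hessian
            # Newton step restricted to the null space of A (KKT system).
            y = np.linalg.solve(A @ Hinv @ A.T, A @ (Hinv @ g))
            dx = -Hinv @ (g - A.T @ y)
            ratio = np.max(-dx / x)        # fraction-to-the-boundary rule
            alpha = 1.0 if ratio <= 0 else min(1.0, 0.9 / ratio)
            x = x + alpha * dx
            if np.linalg.norm(dx) < 1e-12:
                break
        mu *= shrink                       # shrink the barrier parameter
    return x

A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.0])
c = np.array([1.0, 2.0, -1.0])
print(barrier_lp(A, b, c, x=np.full(3, 4.0 / 3.0)))   # approaches (0, 0, 4)
```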

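Finally, a bare-bones augmented-Lagrangian iteration on an invented two-variable problem; LANCELOT and the other solvers named above implement these ideas far more carefully, with sparse linear algebra throughout. SciPy's SLSQP, called at the end for comparison, is itself a small dense SQP method.

```python
# Toy augmented-Lagrangian loop for  min f(x)  s.t.  c(x) = 0.
# Each outer iteration minimizes  f + lam'c + (rho/2)||c||^2,
# then applies the first-order multiplier update  lam <- lam + rho*c(x).
import numpy as np
from scipy.optimize import minimize

def f(x):                                   # smooth objective
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

def c(x):                                   # equality constraint c(x) = 0
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0])

lam, rho = np.zeros(1), 10.0
x = np.array([0.0, 0.0])
for _ in range(20):
    def aug_lag(z, lam=lam):                # bind the current multiplier
        cz = c(z)
        return f(z) + lam @ cz + 0.5 * rho * (cz @ cz)
    x = minimize(aug_lag, x, method="BFGS").x
    lam = lam + rho * c(x)                  # multiplier update
    if np.linalg.norm(c(x)) < 1e-6:         # stop once essentially feasible
        break
print(x, c(x))

# An SQP method handles the constraint directly (SLSQP is a dense SQP code).
res = minimize(f, [2.0, 0.0], method="SLSQP",
               constraints={"type": "eq", "fun": c})
print(res.x)
```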


BIO

Michael Saunders holds a PhD in Computer Science from Stanford University. Since 1987 he has been a Professor in Operations Research at Stanford, specializing in numerical algorithms for sparse linear systems and large-scale constrained optimization. He is a coauthor of the linear-equation solvers SYMMLQ, MINRES, LSQR, LSMR, LUSOL, LUMOD, and LSRN, and of the optimization solvers MINOS, LSSOL, NPSOL, QPOPT, SNOPT, SQOPT, and PDCO. He is an Honorary Fellow of the Royal Society of New Zealand (2007) and a SIAM Fellow (2013). In 2012 he won the SIAM Linear Algebra Prize and was elected to the Stanford University Invention Hall of Fame.

REFERENCES

  • Jorge Nocedal and Stephen J. Wright, Numerical Optimization, 2nd ed., Springer Series in Operations Research and Financial Engineering, Springer, 2006.

