Optimization techniques are generally classified into deterministic/local and stochastic/global methods. Although fast to converge, the former generally require domain knowledge: for non-linear, multi-minima functionals, the initial trial solution must lie in the so-called 'attraction basin' of the global optimum, otherwise the search is trapped in a local minimum of the functional (i.e., a wrong solution of the problem at hand). In contrast, global optimization methods can in principle find the global optimum of the functional regardless of the initial point(s) of the search.
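A minimal sketch of this contrast, on an illustrative bimodal test function of my own choosing (the function, step size, and restart count are assumptions, not course material): plain gradient descent started in the wrong attraction basin converges to the local minimum, while a crude global strategy (random multistart) recovers the global one.

```python
import random

def f(x):
    # Illustrative bimodal functional: two minima near x = +1 and x = -1;
    # the tilt term 0.3*x makes the left minimum the global one.
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    # Analytical derivative of f.
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.01, steps=2000):
    # Deterministic local search: follows the negative gradient from x0.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Started at x0 = 0.8, inside the attraction basin of the LOCAL minimum,
# the local search cannot escape it.
x_local = gradient_descent(0.8)

# A simple global strategy: restart the local search from random points
# and keep the best result found.
random.seed(0)
starts = [random.uniform(-2, 2) for _ in range(20)]
x_global = min((gradient_descent(s) for s in starts), key=f)

print(f"local search from 0.8 -> x = {x_local:.3f}, f = {f(x_local):.3f}")
print(f"multistart           -> x = {x_global:.3f}, f = {f(x_global):.3f}")
```

The multistart result lands near x = -1 with a lower functional value than the single local run, which stalls near x = +1.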
The course will first review the fundamentals and main issues of optimization problems, then focus on classical, state-of-the-art, and recently introduced global optimization approaches. Applicative examples and exercises drawn from advanced engineering applications will reinforce the theoretical concepts.
- Fundamentals of global optimization, “No-Free-Lunch” theorem for optimization
- Deterministic optimization: Steepest Descent (SD) and Conjugate-Gradient (CG) methods
- Stochastic “Nature-Inspired” optimization algorithms
- “Competitive” methods: Genetic Algorithm (GA), Differential Evolution (DE)
- “Cooperative” methods: Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO)
- Application of global optimization methods to advanced engineering synthesis and design problems
- Recent advances within the System-by-Design (SbD) framework for the computationally-efficient solution of complex engineering problems
- Applicative examples, including exercises on specific engineering applications of global optimization methodologies
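To give a flavor of the "competitive" methods listed above, here is a minimal sketch of Differential Evolution in its standard DE/rand/1/bin variant; the parameter values (population size, F, CR) are common textbook defaults, not settings prescribed by the course, and the sphere function is just a simple test case.

```python
import random

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           gens=200, seed=1):
    """Minimal DE/rand/1/bin sketch minimizing f over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial population inside the search box.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation (rand/1): combine three distinct random individuals.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Binomial crossover: mix mutant and current individual;
            # j_rand guarantees at least one mutant component survives.
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to the search box
                else:
                    v = pop[i][j]
                trial.append(v)
            # Greedy selection: the trial replaces the target only if better.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

# Test case: 2-D sphere function, global minimum 0 at the origin.
x, fx = differential_evolution(lambda x: sum(v * v for v in x),
                               [(-5, 5)] * 2)
print(f"best x = {x}, f(x) = {fx:.2e}")
```

The same skeleton carries over to the engineering synthesis problems mentioned above by swapping in the problem-specific cost functional and bounds.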
From: 10 July 2023
To: 14 July 2023
Onsite (Trento) and Online