Optimization problems and algorithms

How to understand, formulate, and tackle the difficulties of optimization problems using heuristic algorithms in Matlab
Bestseller
4.5 (760 ratings)
Course Ratings are calculated from individual students’ ratings and a variety of other signals, like age of rating and reliability, to ensure that they reflect course quality fairly and accurately.
3,394 students enrolled
Last updated 10/2018
English
English [Auto-generated]
30-Day Money-Back Guarantee
This course includes
  • 8 hours on-demand video
  • 40 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Identify, understand, formulate, and solve optimization problems
  • Understand the concepts of stochastic optimization algorithms
  • Analyse and adapt modern optimization algorithms
Requirements
  • You should have basic knowledge of programming
  • You should be familiar with the Matlab programming language
Description

This is an introductory course on stochastic optimization problems and algorithms, a foundational sub-field of Artificial Intelligence. We will cover the most fundamental concepts in the field of optimization, including metaheuristics and swarm intelligence. By the end of this course, you will be able to identify and implement the main components of an optimization problem. Optimization problems differ, yet they mostly share similar challenges and difficulties, such as constraints, multiple objectives, discrete variables, and noise. This course will show you how to tackle each of these difficulties. Most of the lectures come with coding videos, in which the step-by-step process of implementing the optimization algorithms or problems is presented. There are also a number of quizzes and exercises to practice the theoretical knowledge covered in the lectures.

Here is the list of topics covered:

  • History of optimization  

  • Optimization problems 

  • Single-objective optimization algorithms

  • Particle Swarm Optimization 

  • Optimization of problems with constraints 

  • Optimization of problems with binary and/or discrete variables 

  • Optimization of problems with multiple objectives

  • Optimization of problems with uncertainties 

Particle Swarm Optimization will be the main algorithm, which is a search method that can be easily applied to different applications including Machine Learning, Data Science, Neural Networks, and Deep Learning.

I am proud of 200+ 5-star reviews. Some of the reviews are as follows: 

David said: "This course is one of the best online course I have ever taken. The instructor did an excellent job to very carefully prepare the contents, slides, videos, and explains the complicated code in a very careful way. Hope the instructor can develop much more courses to enrich the society. Thanks!"

Khaled said: "Dr. Seyedali is one of the greatest instructor that i had the privilege to take a course with. The course was direct to the point and the lessons are easy to understand and comprehensive. He is very helpful during and out of the course. i truly recommend this course to all who would like to learn optimization\PSO or those who would like to sharpen their understanding in optimization. best of luck to all and THANK YOU Dr. Seyedali."

Biswajit said: "This coursework has really been very helpful for me as I have to frequently deal with optimization. The most prominent feature of the course is the emphasis given on coding and visualization of results. Further, the support provided by Dr. Seyedali through personal interaction is top notch."


Boumaza said:  "Good Course from Dr. Seyedali Mirjalili. It gives us clear picture of the algorithms used in optimization. It covers technical as well as practical aspects of optimization. Step by step and very practical approach to optimization through well though and properly explained topics, highly recommended course You really help me a lot. I hope, someday, I will be one of the players in this exciting field! Thanks to Dr. Seyedali Mirjalili."


Join 1000+ students and start your optimization journey with us. If you are in any way not satisfied, for any reason, you can get a full refund from Udemy within 30 days. No questions asked. But I am confident you won't need to. I stand behind this course 100% and am committed to helping you along the way.

Who this course is for:
  • Anyone who wants to learn optimization
  • Anyone who wants to solve an optimization problem
Course content
36 lectures • 08:03:35
+ Introduction
2 lectures 14:30

In this video, the structure of the course is discussed in detail. There are also some tips on how to use the Udemy video player. 

Preview 07:51

This video covers the history of using optimization to solve engineering design problems. 

History of optimization
06:39
+ Optimization problems
1 lecture 11:38

In this lecture, we talk about optimization problems in general. We will cover the main components of optimization problems and the concepts of search space and search landscape. There is also a very simple and intuitive example to help you understand the theory covered in this lecture. 

The learning outcomes are as given below: 

  • Understanding the difference between search space and search landscape 
  • Demonstrating the ability to identify the main components of an optimization problem (system)
  • Demonstrating the ability to formulate single-objective optimization problems 
  • Understanding the most common difficulties when solving optimization problems 
Optimization problems
11:38

This quiz has been designed to assess your understanding of the concepts covered in Section 2 of this course. 

Quiz 1
6 questions
+ Optimization of problems with one objective
2 lectures 23:09

This lecture discusses the structure of single-objective optimization algorithms. All the concepts are discussed with an intuitive analogy. 

The learning outcomes are as follows: 

  • Understanding the terminologies in the field of optimization: variables, objective value, objective function, constraint, global optimum, local optimum, search agents, algorithm, and iteration
  • Understanding the main differences between conventional and modern optimization algorithms 
  • Understanding the differences between a deterministic algorithm and a stochastic algorithm 
  • Understanding the advantages and drawbacks of stochastic and deterministic algorithms
  • Understanding the concepts of gradient and the structure of the gradient descent algorithm
Optimization of problems with one objective
11:54

This lecture covers the family of stochastic optimization algorithms. 

The learning outcomes are as follows: 

  • Understanding the differences between individual-based and population-based algorithms 
  • Understanding the advantages and drawbacks of individual-based and population-based algorithms 
  • Understanding the concepts of function evaluations needed for an optimization algorithm
  • Understanding the concepts of exploration (diversification) and exploitation (intensification)
  • Understanding the No Free Lunch theorem 
Stochastic Optimization Algorithms
11:15

This quiz has been designed to assess your understanding of the concepts covered in Section 3 of this course. 

Quiz 2
9 questions
+ Particle Swarm Optimization (PSO)
4 lectures 01:11:08

The lecture covers the most fundamental concepts for understanding the PSO algorithm as one of the most well-regarded stochastic population-based algorithms. We use the same analogy to understand the way that this algorithm searches for the global optimum of optimization problems. 

The learning outcomes are as follows: 

  • Understanding the mathematical formulation of the PSO algorithm 
  • Understanding the main components of the velocity vector in PSO: inertial component, cognitive component, and social component 
  • Analyzing the performance of PSO in terms of exploration and exploitation 
  • Understanding the impact of inertia weight, c1, and c2 on the performance of PSO
Preview 19:07

This video is a step-by-step implementation of the PSO algorithm in Matlab. 

The learning outcomes are as follows: 

  • Understanding the main logical steps in PSO for implementation 
  • Testing and analyzing the results of PSO
  • Drawing the convergence curve of the PSO algorithm 
Implementing Particle Swarm Optimization in Matlab
33:39
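
For a rough idea of what the implementation involves, here is a minimal, generic PSO sketch in Matlab. It is illustrative only, not the exact code built in the lecture; the objective function, swarm size, bounds, and coefficient values are assumptions chosen for the example.

    % Minimal PSO sketch in Matlab (illustrative only, not the course's exact code)
    nVar = 2; nParticles = 30; maxIter = 100;     % problem size and algorithm settings (assumed)
    lb = -10; ub = 10;                            % lower and upper bounds of the variables
    w = 0.7; c1 = 2; c2 = 2;                      % inertia weight and acceleration coefficients
    obj = @(x) sum(x.^2);                         % example objective function (sphere)

    X = lb + (ub - lb) .* rand(nParticles, nVar); % random initial positions
    V = zeros(nParticles, nVar);                  % initial velocities
    pbest = X; pbestCost = inf(nParticles, 1);    % personal bests
    gbest = X(1,:); gbestCost = inf;              % global best
    convergence = zeros(maxIter, 1);

    for t = 1:maxIter
        for i = 1:nParticles
            cost = obj(X(i,:));
            if cost < pbestCost(i), pbestCost(i) = cost; pbest(i,:) = X(i,:); end
            if cost < gbestCost,    gbestCost = cost;    gbest = X(i,:);      end
        end
        for i = 1:nParticles
            V(i,:) = w*V(i,:) + c1*rand(1,nVar).*(pbest(i,:) - X(i,:)) ...
                              + c2*rand(1,nVar).*(gbest - X(i,:));
            X(i,:) = X(i,:) + V(i,:);
        end
        convergence(t) = gbestCost;               % best cost found so far
    end
    plot(convergence); xlabel('Iteration'); ylabel('Best cost');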

In the PSO algorithm, the velocity vectors might incrementally grow larger and larger. As a consequence, particles can go outside the boundaries of the landscape, which is undesirable. In this video, we learn how to prevent particles from leaving the landscape. In fact, we introduce two mechanisms to reduce the probability of particles overshooting and to re-initialize them when overshooting occurs: velocity bounding and re-positioning.  

The learning outcomes are as follows: 

  • Demonstrating an understanding of the need for limiting the velocity vector in PSO 
  • Implementing a mechanism to limit velocity 
  • Implementing a mechanism to re-initialize the particles that go outside the landscape 
PSO with bounding velocity
12:42
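
As a rough sketch of these two mechanisms, using the variable names of the generic PSO sketch earlier on this page (not the lecture's code), the position update inside the loop could be replaced with something like:

    % Velocity bounding and re-positioning sketch (inside the PSO update loop)
    vMax = 0.1 * (ub - lb);                                  % cap velocity to a fraction of the variable range (assumed factor)
    V(i,:) = max(min(V(i,:), vMax), -vMax);                  % clamp each velocity component
    X(i,:) = X(i,:) + V(i,:);                                % position update as usual

    outside = X(i,:) < lb | X(i,:) > ub;                     % which variables left the landscape
    X(i,outside) = lb + (ub - lb) .* rand(1, sum(outside));  % re-initialize only those variables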

In this lecture, we solve the simple case study presented in the earlier lectures using the PSO algorithm. This lecture mainly demonstrates how to replace the default objective function with the one for the problem you want to solve. 

The learning outcomes are as follows: 

  • Understanding the steps of solving an optimization problem using the PSO algorithm 
  • Demonstrating the skills to tune the parameters of the PSO algorithm
  • How to replace the default objective function with a desired objective function 
Applying PSO to the table design problem
05:40
Quiz 3
6 questions
+ Optimization of problems with constraints
2 lectures 25:18

This lecture introduces different types of constraints when solving optimization problems. It then covers a very simple technique to handle constraints of different types. 

The learning outcomes are as follows: 

  • Demonstrating the ability to formulate a constrained optimization problem 
  • Understanding the difference between equality and inequality constraints 
  • Understanding the process of converting equality constraints to inequality constraints 
  • Understanding the concepts of penalty and barrier functions 
Constrained optimization
08:02

In this video, we will demonstrate how to employ a barrier function to handle constraints in the objective function without modifying the algorithm. 

The learning outcomes are as follows: 

  • Demonstrating the ability to implement constraints in the objective function 
  • Demonstrating the ability to apply barrier functions in the objective function 
  • Understanding the process of using PSO to estimate the global optimum of a constrained problem 
  • Demonstrating the ability to visualize the landscape before and after applying the barrier function 
  • Understanding the impact of a barrier function on the shape of landscapes 
Coding a constrained objective function and how to solve it in Matlab
17:16
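
One common way to fold a constraint into the objective is a penalty-style term, sketched below; this is illustrative of the general idea rather than the lecture's exact barrier code, and the constraint and weight are made up for the example.

    % Handling a constraint inside the objective function (illustrative sketch)
    obj = @(x) sum(x.^2);                        % original objective (example)
    g   = @(x) 1 - x(1) - x(2);                  % example inequality constraint, feasible when g(x) <= 0
    mu  = 1e6;                                   % large penalty weight (assumed value)

    constrainedObj = @(x) obj(x) + mu * max(0, g(x))^2;  % violations are penalized, feasible points are unchanged
    % PSO can now minimize constrainedObj with no change to the algorithm itself.
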
Quiz 4
4 questions
+ Optimization of problems with discrete variables
3 lectures 34:36

Problems with discrete variables are very common. In this lecture, we learn how to solve such problems with stochastic optimization algorithms. 

The learning outcomes are as follows: 

  • Demonstrating the ability to formulate problems with discrete variables 
  • Understanding the shape of the landscape of discrete optimization problems 
  • Demonstrating an understanding of the transfer function in Binary Particle Swarm Optimization 
  • Understanding the mathematical model of the BPSO algorithm
  • Using multiple bits to choose one of the values in a set of discrete values 
Discrete optimization
06:01

In this video, we write the code for a binary PSO. Several modifications will be made to PSO to design the Binary PSO (BPSO) algorithm. A test function is also solved as an example of a binary optimization problem. 

The learning outcomes are: 

  • Understanding the process of initializing the solutions for a binary problem 
  • Demonstrating the ability to define the upper and lower bounds for binary variables 
  • Demonstrating the ability to implement the sigmoid transfer function 
  • Understanding the process of using the sigmoid transfer function to generate probability values and update the positions of particles in BPSO 
  • Demonstrating the ability to apply BPSO to binary problems 
Coding a binary PSO
13:19
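
The key difference from continuous PSO is how a position is built from the velocity. A minimal sketch of the sigmoid-based update, again reusing the variable names of the generic PSO sketch earlier on this page rather than the lecture's code:

    % Binary PSO position update sketch (sigmoid transfer function, inside the update loop)
    s = 1 ./ (1 + exp(-V(i,:)));       % map each velocity component to a probability in (0,1)
    X(i,:) = rand(1, nVar) < s;        % each bit is set to 1 with probability s, otherwise 0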

Since the BPSO algorithm cannot solve discrete problems in which each parameter takes more than two values, a mechanism is implemented in this video to choose among multiple values from a given set of discrete values. The process of solving problems with discrete variables is also given, and a simple case study is solved to demonstrate the application of BPSO.

The learning outcomes are:

  • Understanding the method of increasing the number of variables in each particle to choose more than two values from a given set of discrete values 
  • Demonstrating the ability to find the correct number of bits for each variable of a particle to choose one of N discrete values 
  • Demonstrating the ability to use the variables of particles in BPSO to choose discrete values 
  • Demonstrating the ability to implement a discrete version of BPSO
  • Demonstrating the ability to apply BPSO to problems with discrete values 
Coding a discrete PSO
15:16
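
As an illustration of the decoding idea, where a group of bits selects one value from a set (the value set, bit group, and layout are assumptions, not the lecture's case study):

    % Decoding a group of bits into one of N discrete values (illustrative)
    values = [0.5 1.0 1.5 2.0 2.5];               % allowed discrete values for one variable (N = 5)
    nBits  = ceil(log2(numel(values)));           % bits needed to encode one choice (3 here)
    bits   = [1 0 1];                             % example bit group taken from a BPSO particle
    index  = bits * (2.^(nBits-1:-1:0))' + 1;     % binary to integer, shifted to a 1-based index
    index  = min(index, numel(values));           % map unused codes back into the valid range
    decodedValue = values(index)                  % the discrete value represented by these bits
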
Quiz 5
2 questions
+ Optimization of problems with multiple objectives
2 lectures 16:02

Problems with more than one objective are very common in both science and industry. In this lecture, we learn the most fundamental concepts of such problems. The problem formulation of multi-objective problems is also covered. 

The learning outcomes are as follows: 

  • Understanding the main components of a multi-objective problem/system
  • Demonstrating the ability to formulate multi-objective optimization problems 
  • Understanding the concepts of Pareto optimal solutions, Pareto dominance, and the Pareto optimal front
  • Demonstrating the ability to analyze Pareto optimal sets and Pareto optimal fronts 
Multi-objective optimization
07:42
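
For intuition, a Pareto dominance check for a minimization problem can be written in one line of Matlab (the objective vectors below are illustrative):

    % Pareto dominance check for a minimization problem
    dominates = @(a, b) all(a <= b) && any(a < b);   % a dominates b if it is no worse everywhere and strictly better somewhere

    dominates([1 2], [2 3])   % returns 1 (true): the first vector dominates the second
    dominates([1 3], [2 2])   % returns 0 (false): neither dominates, both are non-dominated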

In this video, three main classes of methods to solve multi-objective optimization problems using multi-objective stochastic algorithms are covered. The Multi-objective Particle Swarm Optimization algorithm is discussed as one of the most well-regarded algorithms as well. 

The learning outcomes are: 

  • Understanding different types of multi-objective optimization: a posteriori, a priori, and interactive methods  
  • Demonstrating the ability to convert a multi-objective problem into a single-objective problem 
  • Understanding the general framework of a posteriori methods 
  • Understanding the concepts of convergence and coverage in a posteriori methods 
  • Understanding the components required for MOPSO to solve multi-objective problems
Multi-objective algorithms
08:20
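
As a taste of the a priori class, here is a weighted-sum sketch that collapses two objectives into one; the objectives and weights are illustrative, not taken from the lecture.

    % A priori (weighted-sum) sketch: two objectives combined into one scalar cost
    f1 = @(x) sum(x.^2);                 % example objective 1
    f2 = @(x) sum((x - 2).^2);           % example objective 2
    w1 = 0.7; w2 = 0.3;                  % preference weights chosen before the run

    singleObj = @(x) w1*f1(x) + w2*f2(x);
    % A single-objective algorithm such as PSO can now be applied to singleObj;
    % re-running with different weights yields different Pareto optimal solutions.
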
Quiz 6
4 questions
+ Optimization of problems with uncertainties
3 lectures 49:54

In this video, we are going to focus on robust optimization using stochastic optimization algorithms. The lecture starts by discussing the main types of uncertainties: in operating conditions, inputs, outputs, and constraints. Then, two methods are covered to handle uncertainties in the inputs only, as these are the most common type of error during manufacturing processes. Since sampling is a main part of robust optimization, three sampling methods are covered as well. 

The learning outcomes are as follows:

  • Understanding different types of uncertainties in an optimization problem/system 
  • Demonstrating the ability to formulate a problem with uncertainties in the variables (inputs) 
  • Understanding the purpose of an expectation measure 
  • Understanding the purpose of a variance measure 
  • Demonstrating the ability to formulate an optimization problem when using an expectation or a variance measure 
  • Understanding the limitation of an expectation measure calculated with an integral
  • Understanding the process of using the Monte Carlo technique and sampled points to calculate an expected objective value or a variance measure 
  • Understanding the differences between random, Latin Hypercube, and Orthogonal sampling techniques 
Robust optimization
26:36

In this video, an expectation measure is implemented in Matlab. The PSO algorithm is then used to find the robust optimum for a given test function. 

The learning outcomes are as follows: 

  • Demonstrating the ability to replace an objective function with an expectation measure 
  • Demonstrating the ability to calculate an approximation of the expected objective value using the Monte Carlo technique 
  • Understanding the steps of finding robust solutions for an optimization problem 
  • Understanding the impact of a variance measure on the shape of the landscape 
  • Understanding the mechanism of random sampling 
  • Demonstrating the ability to implement an expectation measure
Coding an expectation measure
14:29
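
A rough sketch of the underlying idea, assuming uncertainty only in the inputs and plain random (Monte Carlo) sampling; the objective, noise level, and sample count are assumptions, not the lecture's test function.

    % Expectation measure sketch: average the objective over perturbed inputs (Monte Carlo)
    obj      = @(x) sum(x.^2);                  % example objective
    delta    = 0.1;                             % assumed uncertainty in each variable
    nSamples = 50;                              % number of random samples per evaluation

    robustObj = @(x) mean(arrayfun(@(k) obj(x + delta*(2*rand(size(x)) - 1)), 1:nSamples));
    % Replacing obj with robustObj lets PSO search for solutions whose expected cost is low.
    % A variance measure follows the same pattern with std(...) in place of mean(...).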

This video implements a variance measure in Matlab and employs it to find the robust solutions for a given optimization problem. The PSO algorithm is used as the main algorithm and the objective function is changed to simulate a variance measure. 

The learning outcomes are as follows: 

  • Understanding the process of implementing a variance measure 
  • Demonstrating the ability to use a variance measure as a constraint 
  • Understanding the impact of a variance measure on the landscape 
  • Demonstrating the ability to calculate an approximation of the variance measure using the Monte Carlo technique 
  • Demonstrating the ability to implement a variance measure
Coding a variance measure
08:49
Quiz 7
3 questions
+ Bonus videos (FAQ)
16 lectures 03:55:39

This video shows the steps to write PSO as a Matlab function instead of a script file. 

How to write PSO as a function?
16:10

This video shows the steps to run PSO multiple times and calculate the average and standard deviation of the results across the runs.

How to run an algorithm (PSO) multiple times and collect statistical results?
13:27
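
The general pattern is simply a loop around the algorithm. A sketch, assuming a hypothetical function pso that returns the best cost of a single run and the settings from the generic PSO sketch earlier on this page (the name and signature are placeholders, not the course's function):

    % Running an algorithm several times and collecting statistics (illustrative)
    nRuns = 30;
    bestCosts = zeros(nRuns, 1);
    for r = 1:nRuns
        bestCosts(r) = pso(obj, nVar, lb, ub);   % hypothetical PSO function; one independent run per call
    end
    fprintf('Average best cost:  %g\n', mean(bestCosts));
    fprintf('Standard deviation: %g\n', std(bestCosts));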

In this video, a Matlab function is used to conduct the Wilcoxon ranksum test when comparing two versions of PSO. 

How to conduct a statistical test (Wilcoxon ranksum) when comparing algorithms?
14:02
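
Matlab's Statistics and Machine Learning Toolbox provides ranksum for this test. A minimal usage sketch with placeholder result vectors (in practice these would be the best costs collected from repeated runs of the two PSO variants):

    % Wilcoxon rank-sum test on results from two algorithms (requires the Statistics and Machine Learning Toolbox)
    resultsA = randn(30, 1) + 1.0;     % placeholder: best costs from 30 runs of algorithm A
    resultsB = randn(30, 1) + 1.5;     % placeholder: best costs from 30 runs of algorithm B

    p = ranksum(resultsA, resultsB);   % p-value of the Wilcoxon rank-sum test
    if p < 0.05
        disp('The difference between the two algorithms is statistically significant.');
    else
        disp('No statistically significant difference at the 5% level.');
    end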

In this lecture, we learn how to compare the convergence curves of two algorithms. As an example, two variants of PSO are compared. 

How to compare the convergence curves of two algorithms?
12:19

This lecture shows you two methods to solve a maximization problem using the PSO algorithm. The methods presented are algorithm independent, so you can use them to solve maximization problems with any optimization algorithm. 

How to solve maximization problems?
15:49
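
One common, algorithm-independent approach is simply to negate the objective; a minimal sketch with an illustrative function:

    % Maximization via minimization: negate the objective
    maxObj = @(x) -sum((x - 3).^2) + 10;     % example function to maximize (peak value 10 at x = 3)
    minObj = @(x) -maxObj(x);                % minimize the negative instead
    % After minimizing minObj with PSO, the maximum of maxObj is -gbestCost, located at gbest.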

This video takes you through the steps to first store the average objectives of all solutions in each iteration. You will then learn how to visualize this vector and interpret its behaviour qualitatively. 

How to calculate and visualize the average objective of solutions?
11:29

This lecture demonstrates how to observe the exploratory and exploitative behaviour of a particle by looking at the fluctuations in one of the variables of the first particle in the PSO algorithm. You can use this technique to see the changes in any variable of any particle. 

How to visualize the fluctuations of a variable in a solution?
14:11

This lecture covers the process of visualizing the history of searched points using PSO in Matlab. This allows you to qualitatively analyze the results of the PSO algorithm. 

How to visualize the search history of particles in PSO?
18:55

This lecture shows how to shift and bias the global optimum of a simple test function. 

How to shift and bias a test function?
08:39
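
The transformation itself is a one-liner; a sketch with assumed shift and bias values (the test function is illustrative):

    % Shifting and biasing a test function
    f        = @(x) sum(x.^2);               % original test function: optimum 0 at the origin
    shift    = 3;  bias = 100;               % assumed shift and bias
    fShifted = @(x) f(x - shift) + bias;     % optimum is now 100, located at x = [3 3 ... 3]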

This lecture takes you through the steps of using subplots to better present the convergence curve, the average fitness of all particles, the fluctuations in the variables, and the search history. 

How to better visualize qualitative data of PSO?
18:12

In this video, we will learn a technique to save the progress of PSO in each iteration into files. In other words, the process of storing the results of each iteration in a separate file is covered. 

How to store the data generated during the optimization process in files?
06:53

This lecture presents two methods to change the stopping condition of PSO. You can apply these methods to any optimization algorithm. 

How to change the stopping criterion of an optimization algorithm?
14:41

This video shows how to update GBEST in the PSO algorithm more frequently. The PSO developed in this course updates GBEST at the beginning of each iteration, and all particles use it to update their positions. However, we might need to update GBEST as soon as a better solution is found. The steps to do so are discussed in this video. 

How to frequently update GBEST in PSO?
10:26
How to solve problems with both binary/discrete and continuous variables: PART 1
13:12
How to solve problems with both binary/discrete and continuous variables: PART 2
19:48
How to solve problems with both binary/discrete and continuous variables: PART 3
27:26
+ Thank You
1 lecture 01:41
Thank You: voucher for my other courses
01:41