AlphaOpt
Joined Mar 29, 2017
A basic introduction to optimization including examples, methods, principles, and applications.
What Is Linear Regression?
A quick introduction to linear regression, a technique for fitting a linear model to data.
TRANSCRIPT:
Hello, and welcome to Introduction to Optimization. This video provides a basic answer to the question, what is Linear Regression?
Put simply, linear regression is a technique for fitting or matching a line to a set of data points.
As a basic example, imagine you’re driving a car, and every time you take a trip you keep track of how far you drive, and how much gas the car uses. After a few trips, you make a graph showing the miles driven on each trip on the X axis, and the gas used on each trip on the Y axis.
Linear regression is finding the line that best fits this data, and there are several ways to find it, including using optimization to minimize the error between the line and the data points, which is also known as least squares. Once we find the line, it gives us an equation y = mx + b. In this case, b is zero, x is our trip distance, y is the gas used, and m is our car’s gas mileage. We can then use this equation to predict about how much gas we can expect to use on our next trip.
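The least-squares fit described here can be sketched in a few lines of Python; the trip distances and gas amounts below are made-up illustration numbers, not data from the video:

```python
import numpy as np

# Hypothetical trip log: miles driven (x) and gallons of gas used (y)
miles = np.array([10.0, 25.0, 40.0, 60.0, 80.0])
gas = np.array([0.5, 1.2, 2.1, 2.9, 4.1])

# Fit y = m*x + b by least squares (minimizes squared error between line and points)
m, b = np.polyfit(miles, gas, 1)

# Use the fitted line to predict gas for a 50-mile trip
predicted = m * 50 + b
```

Here `m` comes out near the car's gas mileage in gallons per mile, and `b` is close to zero, matching the transcript's reading of the equation.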
So far we’ve been looking at very simple cases, with one input and one output, but what if there are multiple factors involved? This is called multiple linear regression. For example, we could include the percentage of time spent driving in the city in our trip data. In this case, instead of a line, we fit a plane to the data points.
Linear regression is a simple technique, but it’s applied every day in many areas, from engineering to machine learning, finance, healthcare, manufacturing, and more.
Linear regression is an important technique that helps us understand relationships between variables and make predictions. Thanks for watching!
Views: 1,679
Videos
What is Machine Learning?
1.7K views · 1 year ago
A basic introduction to the ideas behind machine learning, some of the major categories, and some examples of where it can be applied. TRANSCRIPT: Hello, and welcome to Introduction to Machine Learning. Have you ever wondered how computers can predict outcomes or classify data? Well, that's where machine learning comes in. In this video, we'll explore different types of machine learning algorit...
What is Newton's Method?
3.5K views · 2 years ago
A quick introduction to Newton's Method, a technique for finding the roots, or zeros of a function or equation.
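The iteration behind Newton's method is x ← x − f(x)/f′(x); a minimal sketch, using x² − 2 = 0 as an illustrative equation:

```python
# Newton's method: repeatedly step x <- x - f(x)/f'(x) to find a root of f
def newton(f, df, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:  # stop when the update is negligible
            break
    return x

# Solve x^2 - 2 = 0; the positive root is sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```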
What is Least Squares?
72K views · 2 years ago
A quick introduction to Least Squares, a method for fitting a model, curve, or function to a set of data. TRANSCRIPT Hello, and welcome to Introduction to Optimization. This video provides a basic answer to the question, what is Least Squares? Least squares is a technique for fitting an equation, line, curve, function, or model to a set of data. This simple technique has applications in many fi...
What is the Traveling Salesman Problem?
129K views · 3 years ago
A quick introduction to the Traveling Salesman Problem, a classic problem in mathematics, operations research, and optimization.
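A brute-force sketch on a tiny instance shows why the problem is hard: checking every tour is O(n!). The 4-city distance matrix below is made up for illustration:

```python
from itertools import permutations

# Hypothetical symmetric distances between 4 cities
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(order):
    # Total distance of a round trip starting and ending at city 0
    route = (0,) + order + (0,)
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

# Enumerate every ordering of the remaining cities: (n-1)! tours
best = min(permutations(range(1, 4)), key=tour_length)
best_len = tour_length(best)
```

With only 4 cities there are 3! = 6 tours to check; at 20 cities the same loop would need about 1.2 × 10^17 evaluations, which is why heuristics are used in practice.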
How to solve the Rosenbrock optimization problem in Matlab with fminunc
15K views · 5 years ago
A quick example of solving the Rosenbrock problem, which is a classic unconstrained optimization test problem, using fminunc in Matlab. CODE: github.com/abe-mart/alphaopt/blob/master/rosenbrock.m
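For readers without Matlab, a rough SciPy analogue of the same exercise (this is a sketch, not the code from the video, which is linked above):

```python
from scipy.optimize import minimize

# Rosenbrock function: narrow curved valley, minimum at (1, 1) with value 0
def rosenbrock(x):
    return 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

# BFGS is an unconstrained quasi-Newton method, comparable in spirit to fminunc
result = minimize(rosenbrock, x0=[-1.2, 1.0], method="BFGS")
```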
Introduction to Optimization: Calculating Derivatives
11K views · 5 years ago
This video gives an overview of three ways to obtain derivatives for optimization, symbolic differentiation, numerical differentiation, and automatic differentiation.
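Numerical differentiation, one of the three approaches mentioned, can be sketched with a central finite difference:

```python
# Central finite difference: approximate f'(x) from two nearby evaluations
def central_diff(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: d/dx of x^3 at x = 2 is exactly 12
approx = central_diff(lambda x: x ** 3, 2.0)
```

The central form has O(h²) truncation error, which is why it is usually preferred over the one-sided forward difference when the extra function evaluation is affordable.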
Python GEKKO Optimization Suite - Free Engineering Optimization Software
13K views · 6 years ago
This video provides an overview of the GEKKO Optimization Suite, an open-source Python package for optimization and control of dynamic systems. Gekko Docs and Download gekko.readthedocs.io/en/latest/index.html
Matlab Fmincon Optimization Example: Constrained Box Volume
38K views · 6 years ago
This video shows how to perform a simple constrained optimization problem with fmincon in Matlab. This video is part of an introductory series on optimization.
Python Scipy Optimization Example: Constrained Box Volume
40K views · 6 years ago
This video shows how to perform a simple constrained optimization problem with scipy.minimize in Python. This video is part of an introductory series on optimization. GEKKO Optimization Version: th-cam.com/video/UFMFMMHVMp0/w-d-xo.html
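A sketch of this kind of constrained box-volume problem with scipy.optimize.minimize, assuming a surface-area limit of 10 (the limit and starting point are illustrative choices, not taken from the video):

```python
from scipy.optimize import minimize

# Maximize box volume l*w*h subject to surface area <= 10.
# minimize() minimizes, so we negate the volume.
def neg_volume(x):
    l, w, h = x
    return -(l * w * h)

# SciPy inequality constraints require fun(x) >= 0 at feasible points
def surface_slack(x):
    l, w, h = x
    return 10 - (2 * l * w + 2 * l * h + 2 * w * h)

cons = {"type": "ineq", "fun": surface_slack}
res = minimize(neg_volume, x0=[1.0, 1.0, 1.0], method="SLSQP", constraints=cons)
```

The optimum is a cube with side sqrt(10/6), so the maximum volume is (10/6)^(3/2) ≈ 2.152.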
Python Optimization Example: Constrained Box Volume with GEKKO
4.1K views · 6 years ago
This video shows how to perform a simple constrained optimization problem with the GEKKO optimization package in Python. This video is part of an introductory series on optimization. GEKKO Package: gekko.readthedocs.io/en/latest/#
Python Optimization Example Snowball Rolling with Scipy Minimize
20K views · 6 years ago
How big does a snowball need to be to knock down a tree after rolling for 30 seconds? We answer this question using optimization in Python. Tools used: Python, numpy, scipy odeint, scipy minimize. This video is part of an introductory series on optimization. Code available on GitHub: github.com/abe-mart/alphaopt/blob/master/Snowball Optimization/Python Version/snowball.py
Introduction To Optimization: Gradient Free Algorithms (2/2) Simulated Annealing, Nelder-Mead
41K views · 7 years ago
A brief overview of Simulated Annealing, the Nelder-Mead method, and a survey of various metaphor-inspired and biologically inspired optimization algorithms. This video is part of an introductory series on optimization. TRANSCRIPT: Simulated Annealing Annealing is a process in which metal or glass is heated, and then allowed to slowly cool at a controlled rate. Annealing changes the properties of a metal,...
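The core idea, accepting worse moves with probability e^(−Δ/T) while the temperature cools, can be sketched on a toy 1-D function (all parameters below are arbitrary illustration choices):

```python
import math
import random

# Minimal simulated annealing sketch for a 1-D function (illustration only)
def anneal(f, x0, temp=5.0, cooling=0.995, steps=5000):
    random.seed(0)  # fixed seed so the sketch is reproducible
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(steps):
        cand = x + random.uniform(-1, 1)  # random neighbor
        fc = f(cand)
        # Always accept improvements; accept worse moves with prob e^(-delta/T)
        if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling  # slowly cool at a controlled rate
    return best, fbest

best_x, best_f = anneal(lambda x: (x - 3) ** 2, x0=-5.0)
```

Early on, the high temperature lets the search escape poor regions; as the temperature drops, it settles near the minimum at x = 3.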
Introduction To Optimization: Gradients, Constraints, Continuous and Discrete Variables
46K views · 7 years ago
A brief introduction to the concepts of gradients, constraints, and the differences between continuous and discrete variables. This video is part of an introductory optimization series. NOTE: There is a typo in the slope formula at 00:30. It should be delta_y/delta_x. TRANSCRIPT: Hello, and welcome to Introduction To Optimization. This video continues our discussion of basic optimization voc...
Introduction To Optimization: Gradient Free Algorithms (1/2) - Genetic - Particle Swarm
45K views · 7 years ago
A conceptual overview of gradient free optimization algorithms, part one of two. This video is part of an introductory optimization series. TRANSCRIPT: Hello, and welcome to Introduction To Optimization. This video covers gradient free algorithms. Gradient based algorithms and gradient free algorithms are the two main types of methods for solving optimization problems. In this video, we will su...
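A minimal particle swarm sketch on a toy function; the inertia (0.7) and attraction coefficients (1.5) below are common textbook-style choices, not values from the video:

```python
import random

# Minimal particle swarm optimization sketch minimizing f(x) = x^2
random.seed(1)  # fixed seed so the sketch is reproducible
f = lambda x: x * x
n, iters = 10, 100
pos = [random.uniform(-10, 10) for _ in range(n)]
vel = [0.0] * n
pbest = pos[:]                 # each particle's best-seen position
gbest = min(pos, key=f)        # the swarm's best-seen position
for _ in range(iters):
    for i in range(n):
        r1, r2 = random.random(), random.random()
        # Velocity keeps some inertia and is pulled toward personal and global bests
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
        pos[i] += vel[i]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i]
        if f(pos[i]) < f(gbest):
            gbest = pos[i]
```

Note that no gradient of f is ever computed; the swarm only compares function values, which is what makes these methods "gradient free".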
Introduction To Optimization: Objective Functions and Decision Variables
104K views · 7 years ago
Introduction To Optimization: Gradient Based Algorithms
75K views · 7 years ago
Introduction to Optimization: What Is Optimization?
268K views · 7 years ago
you are the best 😭😭💥💥❤❤❤❤💯💯💯
Amazing videos, Clear and Concise! Thank you for covering this topic in such a brilliant way!
Very good, finally understand how we get that best fit line!!
God, I know this sounds silly but I didn't know that the plane was the equivalent to the line. Now I comprehend this as I should've since the beginning. THANK YOU AlphaOpt
Superb!
drink some water man
u can say thanks instead of this
clear and brief idea
Thanks a lot, I have the curse of being a visual learner and this was amazing.
Woah whole meat
Thanks!
This is close to applied math; this is so good for giving us the big picture of it!
Visual learning makes things so much better.
slope is delta Y divided by delta X
Thank you, this was a great introduction to the topic.
Great video and helpful channel! Khan academy and the organic chemistry guy are getting old and less helpful as school curriculums develop. Super grateful for these simple, direct explanations
Teacher, I developed a heuristic and would like to share it. My heuristic uses topology and concentric circles. What do you think?
more videos please
in 2 mins you just explained everything
How to solve this when the x and y are bounded?
Oh, it's really exciting, i hope to see more videos from you! So far i've seen some of IBM Technology's videos on it, but i'd like to know what the current knowledge and technology of AI is as of september 2023... I wish they'd post longer than 5-8 minute videos on these subjects... It is not that they are not informative, just that they are never in-depth. I remember MIT OpenCourseWare courses were hours upon hours, but nowadays, you do searches about AI, Machine Learning, NLP, Data Analysis, and they're all very general, like an overview at best, it is good for me as a beginner, and i do not promise that i would understand the more extended videos, but i am so curious and intrigued by them, i could easily spend hours learning about it, if only the content was there.
Simple yet extremely informative👍
Just perfect. Thanks
lovely brooo, such good animation, now i have the concept in my head.
This is a great little video !
So clear
how the hell is this O(n!) ??
Thank you, what is the name of the algoodo tolbox you used for simulated annealing?
compact and thorough at the same time. thanks !
Thank you... ❤
Imagine being a Salesman and this actually happens (I k it can happen irl on godddd it's a joke)
Explain the christofian 1.5 solution and give an heuristic example as well please
Excellent explanation. Thank you so much.
But residual != error?
Works fine in R2021b.

% set initial guess values for box dimensions
lengthGuess = 1;
widthGuess = 1;
heightGuess = 1;

% load guess values into array
x0 = [lengthGuess widthGuess heightGuess];

% call solver to minimize the objective function given the constraint
xopt = fmincon(@objective,x0,[],[],[],[],[],[],@constraint,[])

% retrieve optimized box sizing and volume
volumeOpt = calcVolume(xopt)

% calculate surface area with optimized values just to double check
surfaceAreaOpt = calcSurface(xopt)

% define function to calculate volume of box
function volume = calcVolume(x)
    length = x(1);
    width = x(2);
    height = x(3);
    volume = length * width * height;
end

% define function to calculate surface area of box
function surfaceArea = calcSurface(x)
    length = x(1);
    width = x(2);
    height = x(3);
    surfaceArea = 2*length*width + 2*length*height + 2*height*width;
end

% define objective function for optimization
function obj = objective(x)
    obj = -calcVolume(x);
end

% define constraint for optimization
function [c, ceq] = constraint(x)
    c = calcSurface(x) - 10;
    ceq = [];
end
Excellent! Thank you.
great explanation!
Omg, your explanation is better than other youtube videos and my teacher because I'm a visual learner.
Noice
Why use the squares instead of the absolute values?
because they are easier to compute and deal with mathematically. But we can use absolute values too!
because it gives a clearer picture; if we have an error of 0.1 and we square it, we get 0.01, which is a kind of scaling.
@@Rashidiill Actually it's the other way around. It can be better to use absolute values instead of squares, since squaring can amplify the outliers and influence the final fit.
The Nelder-Mead method is not explained long enough to understand it.
Thank you for simplifying this
<3
<3
no example, pretty useless
Good explanation 👍
Excellent video and also quite easy to understand
This is a very similar problem I have but I have a linear obj func and constraint, and constraint is an equality. When using SLSQP I get error "singular matrix c in lsq subproblem". Seems I should use linprog but I'm not sure how or whether this type of problem can be converted to linprog. Any ideas?
I suppose we can also say optimization is choosing the best input or best process, or both the best process and best input to yield the best output
Have you tried slime mold?
Nice 👍🏽