You can find the spreadsheets for this video and some additional materials here: drive.google.com/drive/folders/1sP40IW0p0w5IETCgo464uhDFfdyR6rh7 Please consider supporting NEDL on Patreon: www.patreon.com/NEDLeducation
Your explanations are perfect )
Thank You for the Video. It was very helpful.
This is indeed the best platform !
Dude, you are doing god's work. Thank you so much!
Thank you, sir, for the way you explain, step by step as well; it is easy to understand for a newbie like me
My respects NEDL, you have really helped me a lot
Thank you so much for the regular content. Appreciate the intuitive explanation behind model parameters.
Great explanation. Thank you.
Thank you, this is what I miss in stats/AI; most just show how to use libraries, but not how to calculate.
Thank you very much
Amazing content! keep it going.
Thanks! Glad you found the tutorials helpful :)
Great video. I suggest more videos, like the ARIMA model (Excel) and also ARMA-GARCH (Excel) if possible.
Hi Tan, we have got several videos on ARCH/GARCH modelling and various specifications in the Mathematical Finance playlist. Might do a video on ARIMA and ARMA at some point in the future as well!
Hey there,
another question about the ARCH values: I am analysing the S&P, Hang Seng, and DAX indexes, and while I have an alpha of 0.52 for the S&P (which graphically means the model reacts nicely to the actual volatility), I only get 0.20 and even 0.14 for the DAX and Hang Seng respectively (in the latter two cases the log-likelihood is also only a little better than even the constant-volatility assumption). How is it possible that for the S&P it follows actual volatility that nicely while the other two models don't seem to work that well (are the values wrong? :/ )
Hi David, and glad you have applied the model to your own datasets! 0.20 and 0.14 are quite typical ARCH coefficients. Have you tried estimating a GARCH model? It just might be that the persistence of variance in Germany and Hong Kong is a little bit more long-term :)
Thanks a lot for the helpful video!!!
Ever grateful for the effort you are taking to make these abstruse topics accessible to students. However, with regard to the constraints on ARCH (Alpha), should it not be strictly < 1? Please guide.
Hi Balkrishna, glad you liked the video! As for your comment, yes, you are right! Theoretically, you could approach the ARCH alpha exactly the same way I approach unconditional variance omega (insert a number very close to 1 but smaller, like 0.9999, in Solver, as Solver does not accept strict inequalities as restrictions). However, I decided not to emphasise it too much in the video, as in practice it does not impact the convergence of the Solver algorithm when maximising log-likelihood, while the restriction on omega does. Hope it helps!
Thank you for your response to my previous comment. I have two questions. From what I have read, the (mu) is supposed to come from an ARIMA process. But here, your (mu) is coming from the log-likelihood function, even though I cannot see (mu) in your log-likelihood formula. How can you estimate (mu) from the log-likelihood function when it is not even present there? Why are you not using ARIMA first?
Hi, and thanks for the excellent question! Here, mu is implicitly going into the likelihood calculations as it is involved in residual (and thus squared residual) calculations. Of course, you could theoretically estimate mu first using some other model, but in my opinion it is more intuitive and less assumption-sensitive to maximise log-likelihood and "endogenise" the calculation of mu in the ARCH/GARCH framework.
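To make this concrete, here is a minimal Python sketch (not from the video; purely illustrative, with simulated returns) of how mu enters the likelihood only through the residuals. scipy's optimiser plays the role of Excel Solver, and the bounds mirror the Solver-style restrictions on omega and alpha:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, returns):
    """ARCH(1) negative log-likelihood; mu enters only through the residuals."""
    mu, omega, alpha = params
    e = returns - mu                        # residuals depend on mu
    var = np.empty_like(returns)
    var[0] = omega / (1.0 - alpha)          # initialise at long-run variance
    for t in range(1, len(returns)):
        var[t] = omega + alpha * e[t - 1] ** 2
    ll = -0.5 * np.sum(np.log(2 * np.pi * var) + e ** 2 / var)
    return -ll

# Simulated returns purely for illustration
rng = np.random.default_rng(42)
returns = rng.normal(0.0005, 0.01, 1000)

# Solver-style restrictions: omega > 0, 0 <= alpha <= 0.9999
start = np.array([returns.mean(), returns.var() * 0.7, 0.3])
res = minimize(neg_log_likelihood, start, args=(returns,),
               bounds=[(None, None), (1e-10, None), (0.0, 0.9999)])
mu_hat, omega_hat, alpha_hat = res.x
```

Maximising the log-likelihood over mu, omega, and alpha jointly is exactly the "endogenised" approach described in the reply: no separate mean model is needed first.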
@@NEDLeducation Dear NEDL, thanks a lot. That is helpful.
Can you please make a video on ARIMA and ARMA model considering S & P 500 Index value
Hi Vaibhav, videos on ARMA and ARIMA are in my plans for the next couple of months so stay tuned and thanks for the suggestion :)
In your error term (epsilon^2), you are adding the white noise (u_t). But in many other places for the ARCH model, I see it has been multiplied with v_t. Will that make a difference? Why is this so?
Hi, and thanks for the question! This is simply a notational difference that does not affect the calculations. You either assume white noise that is distributed with mean zero and time-varying conditional volatility, or a product of white noise with mean zero and standard deviation one multiplied by the v_t term.
Hi, it would be interesting to compare and provide an example of the EWMA approach that Riskmetrics improved in 2006 (RM 2006).
Hi, and thanks for the suggestion! Will definitely record a tutorial on that in the distant future. As for now, there is a relatively old video on BRW VaR that you might find interesting on the extension of EWMA to the historical simulation approach: th-cam.com/video/CAsgjA7KodQ/w-d-xo.html
@@NEDLeducation Thanks for the response and keep up the good work!
Everything came up fine to me
Minus the graph. How did you get it to just show up
Hi Paloma, and thanks for the question! The graph is pre-constructed so it shows up automatically when the columns are filled and so that the video does not take extra time to build the graphs and format them. You can access the spreadsheet and see how the chart is built for yourself via our Google Drive link, please check the pinned comment. Hope it helps!
Hello! Here's a question about the comparison of realised vs conditional vol. You call the residuals the realised vol, but the residuals themselves (observation - mu) have been given by the solution to the optimisation problem, since mu has changed after using Solver. To what extent can these residuals be called realised vol? Your channel is amazing! Thank you.
Hi Rafael, and glad you liked the video. Thanks so much for a very profound question :) Yes indeed, the "realised" volatility changes with the model parameters, but it is only the mean parameters that affect that (as observations fluctuate around the expected value mu). It can be more important in GARCH-M models where the expected value depends on conditional volatility (check out a video on these models if you are interested: th-cam.com/video/Wj-MeWxhK_E/w-d-xo.html). So the idea is that while mean parameters (mu and risk premium parameters in GARCH-M) affect the realised volatility, the impact is very slight, and the main GARCH parameters that seek to explain volatility persistence (alpha in ARCH, beta in GARCH, theta in TGARCH) do not. If you would like to run a model with a given realised volatility, you could hold mu constant (leave it equal to the sample average, for example, or 0). Hope it helps!
You are a great and wise teacher. I will do my best to spread this knowledge.
Hi Hanh and many thanks for such kind words. Doing my best! Stay tuned for even more educational videos :)
can you please explain real life example of shock/innovations/error term in financial markets and how it impacts volatility
Hi Vaibhav, and thanks for the suggestion! In general, ARCH/GARCH models do not specify what is the source of the initial shocks or turbulences, they simply model how these dissipate with time and impact future conditional volatility. You can, however, include exogeneous variance regressors in the ARCH/GARCH framework to see how a particular source of risk impacts conditional variance. These could be policy uncertainty variables (e.g., EPU), or sentiment variables. I might do a video on this in the future! Hope it helps!
Can you do a video on this specifically? I feel like this will tie everything together for me@@NEDLeducation
Dear Sir, so how could we use it to estimate VaR? Am I right saying that we would assume there is a distribution, and we can use variance from above ARCH model and the assumed distribution to estimate VaR? Thank you.
Hi Trung, and thanks for the question! You can just plug today's variance into the pre-estimated ARCH or GARCH model and forecast the variance for tomorrow. This value can be used in parametric VaR estimations. For example, if the expected return is 0%, the residual (abnormal return) today is 3%, and the ARCH variance equation is v(t)^2 = 0.0001 + 0.3333*e(t-1)^2, you can predict that conditional variance tomorrow is 0.0004, and thus the expected standard deviation is 2%. If the returns are normally distributed, this would give a 99% parametric VaR of -(0%+2%*z(1 - 99%)) = 4.653%, i.e., a 4.653% daily loss. Hope it helps!
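The same calculation can be sketched in a few lines of Python (numbers taken from the example above; this is an illustration, not the spreadsheet itself):

```python
from scipy.stats import norm

# Illustrative numbers matching the example above
omega, alpha = 0.0001, 0.3333        # ARCH variance equation parameters
mu = 0.0                             # expected return
e_today = 0.03                       # today's residual (abnormal return)

var_tomorrow = omega + alpha * e_today ** 2   # conditional variance, about 0.0004
sigma_tomorrow = var_tomorrow ** 0.5          # expected standard deviation, about 2%

confidence = 0.99
# Parametric VaR under normality: -(mu + sigma * z(1 - confidence))
var_99 = -(mu + sigma_tomorrow * norm.ppf(1 - confidence))  # about a 4.65% daily loss
```

Note that norm.ppf(0.01) is negative (around -2.33), which is why the leading minus sign turns the quantile into a positive loss figure.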
Please, could you consider making an FX forecast using an ARCH model? It would be very helpful for me!
Regards.
Hi Librado, and thanks for the question! You can simply estimate an ARCH model based on the returns of your currency and forecast the volatility for tomorrow the following way. If the expected return is 0%, the residual (abnormal return) today is 3%, and the ARCH variance equation is v(t)^2 = 0.0001 + 0.3333*e(t-1)^2, you can predict that conditional variance tomorrow is 0.0001 + 0.3333*(0.03)^2 = 0.0004, and thus the expected standard deviation is sqrt(0.0004) = 2%. Hope it helps!
Hi, where did you source the data from?
Hi! Thank you so much for this tutorial! As I was putting my data through the ARCH process and following your steps, my log-likelihood value would increase as in yours, however, my ARCH (alpha) value would become zero, and my conditional variance would stay the same. My conditional variance would equal my unconditional variance, and when I would plot my graph as in yours, the conditional volatility would just be a straight horizontal line. Am I missing a step or does that suggest something in the data?
Hi Priyansh, and glad you liked the video! As for your question, it seems you have not run the optimisation with Solver, or that there are some Solver constraints that prevent the model from converging to the optimal alpha value. Please double-check these steps!
@@NEDLeducation Hi! Thanks for the reply! Yes, I had put it through the optimization solver and put the same constraints as yours but had still resulted in the same issue. How do I fix this?
?
Hi! Did you find the answer for the problem?
Thank you for sharing this great video, I have followed your video step by step but I have got negative values of Log-Likelihood. Is that OK? If it is not OK. Please guide me on how can I solve this issue? Thank you
Hi Badiah, and glad the video helped! Negative values of log-likelihood are nothing extraordinary; this can happen.
@@NEDLeducation Thank you for your response, so I can go ahead with the rest of the steps, right?
@@badiahahmed2085 Sure! If there are any issues with the final results, let me know and we can look into it in more detail.
@@NEDLeducation Dear NEDL, I ran the Solver, but I got this message: "Solver encountered an error value in the objective cell or a constraint cell." I think the reason is that the value of alpha changed from 0 to 1 during the Solver process, which caused error values for the long-run volatility, the first value of the conditional variance, and the log-likelihood.
So, what did I do? I estimated the ARCH model using Stata, took the ARCH coefficient (alpha) from the Stata results, and then used it in Excel.
Is that OK?
One last question, please:
Can I use the conditional volatility time series (that I get from Excel) as a volatility variable? My variable is inflation (annual data, 1996-2017) and I need an inflation volatility time series.
Thank you
Hi, really loved the videos you make, surprised that it took me so long to discover them. I had a query though. Let's say I am attempting a sales forecast of Walmart (a well-known Kaggle competition) using an ARIMA model and I have its output, like 1-Feb-2021 = 221 pcs of XYZ, 2-Feb-2021 = 242 pcs of XYZ. Can GARCH be used in this scenario, and how do I combine it with the ARIMA estimates?
Hi Vipul, and glad you liked the videos! As for your question - you can combine pretty much any mean regression model with GARCH effects in the error terms by maximising the log-likelihood (you include mean regressors to estimate residuals and use the logic of GARCH to optimise all parameters at once). I am planning to record videos regarding various extensions of GARCH in the near future, so stay tuned for more content on this!
@@NEDLeducation GARCH would just give me the current standard deviation. So even if I know that the current predicted mean is 212 and the GARCH estimate of volatility is 2%, should I change my prediction of demand to 212 + 2%? Looking forward to your upcoming videos.
Saava, what is the constant (mu) in A5 and what is its relation to the main formulas in D2 and D3? Why do you assume that it is equal to A1? 2:55 What is an ARCH process? Could you explain it please?
I would appreciate it if you made a video on the time-varying Markov-switching (TVMS) model by Hamilton (1989). I am currently working on a project where I need to use the TVMS model. Once again, thank you so much.
Hi Muhammad, and thanks for the suggestion! I am considering doing a video on this at some point over the summer. Markov-switching GARCH is quite bulky to implement in Excel but it is certainly doable. Will let you know as soon as the video is done!
Hi! Can you please make a video with other time frames? For this case you compared daily variance; I want you to show how we can do it with one-month variance (STDEV.S over 22 days) and ARCH vol over 22 days. Please, thank you!
Hi Jawed, and thanks for the suggestion. Overall, the lower the data frequency, the more sensible the assumption that volatility is constant (the less prominent ARCH and GARCH effects are). Therefore, as a rule of thumb, ARCH effects are less relevant for monthly data (I discussed this in the videos as well). I will consider either recording a full short video on that sometime in the future or just posting a supplementary Excel file with calculations onto Google Drive.
wow
Thank you very much, teacher, for all these awesome videos! They expand my practical knowledge far beyond, so I am intrigued to ask you further questions, OK?! So, in the model demonstrated here we are trying to predict future volatility based on past volatility data, right? At any given moment we know only the volatility in the past, not in the future: for example, on 13 August 2019 we know the volatility of 11 August 2019 and 12 August 2019, but not of 20 August 2019, so up to some date we can only use past volatility, right?! So in reality, what time period in the past is optimal for predicting future volatility? In this particular case it is 5 years?! Can we use a moving average of, say, around 1000 working days in the past, or say 1200 as an equivalent of 5 years? What is the optimal time period to use?
Hi again Ivan, and glad you are enjoying the videos! ARCH/GARCH models are actually very useful for exactly what you are asking here. If you want to forecast volatility using an ARCH model for tomorrow knowing volatility today, you can simply plug the numbers into the ARCH equation. For example, if unconditional variance (omega) is 0.0001, alpha is 0.3, and the residual (abnormal return) today is 2%, then conditional variance tomorrow is omega + alpha*e^2 = 0.0001 + 0.3*(0.02)^2 = 0.00022, and conditional volatility tomorrow is sqrt(0.00022) = 1.48%. If you want to forecast volatility the day after tomorrow, just plug 1.48% as the residual, and apply it iteratively from then on. If you are concerned with volatility very far into the future, you can use the long-run variance formula, which in the case of simple ARCH is omega/(1 - alpha). It means that in the long run, variance converges to 0.0001/0.7 = 0.000143, and long-run volatility is its square root, (0.000143)^(1/2) = 1.20%. Hope it helps!
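The iteration described above can be sketched in a few lines of Python (numbers taken from the example in the reply; in the loop the squared residual is replaced by its expectation, the conditional variance, which is the same trick as plugging 1.48% back in as the residual):

```python
# Illustrative ARCH(1) numbers matching the example above
omega, alpha = 0.0001, 0.3
e_today = 0.02                       # today's residual (abnormal return)

var_next = omega + alpha * e_today ** 2       # tomorrow: 0.00022, vol about 1.48%
vols = []
for _ in range(10):
    vols.append(var_next ** 0.5)
    # iterate: the expected squared residual equals the conditional variance
    var_next = omega + alpha * var_next

long_run_var = omega / (1 - alpha)            # 0.0001 / 0.7, vol about 1.20%
```

Because alpha < 1, each iteration pulls the forecast geometrically toward the long-run variance omega/(1 - alpha), so after only about ten days the forecast is essentially at its long-run level.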
@@NEDLeducation thanks for the answer. In relation to the spreadsheet in the video, what cells are in that equation for forecast one day ahead? Thanks for the indulgence. Great job on the channel btw. Subscribed and turned alerts on! =)
I am trying to replicate all the numbers in the example, but I cannot replicate the result using the Microsoft Solver add-in. Anyone have any ideas? Starting from the log-likelihood and using the variables brings back a different set of results.
Dear Teacher, can you teach me how to do GARCH-M by excel? thank you very much
Please make a video on GARCH-MIDAS with adaptive LASSO.
Hi and thanks for the suggestion! Adaptive LASSO has been already on my to-do list, so you can reasonably expect such a video to appear soon :) GARCH-MIDAS, however, can be tricky to implement in Excel so I have to think it over.