I rarely comment on any YouTube video, but the clarity of explanation in this video, given the scarcity of this topic on YT, is immense.
You explain everything clearly and with ease! I'm always stressed when watching educational videos, but it's never the case with your videos! Many thanks.
Thank you very much from the bottom of my heart for this series on Markov chains. I am really glad people like you understand students' minute problems and come up with detailed, lucid explanations, so that even those with no base can grasp the concept seamlessly.
Watching from Nigeria, and you just helped me understand this Markov chain so well... thank you so much... I'm so happy... preparing for my final-year exam in statistics.
Sending my regards from Turkey for your clear teaching, wishing you the best, doctor.
Thanks, and my pleasure... Keep sharing with other students too, so that they can also learn in the easiest manner... Happy learning...
You are my savior, professor! I hope you can make more videos on stochastic processes.
Already 7 videos are made... Check them all.
Thank you so much sir...
I searched through so many videos that were of no use. Finally, I found this video, which helped me clear up all the concepts.
Once again, thank you, sir...
Thanks. Kindly watch the other parts also... You will definitely learn the full concept of the Markov chain.
My Sir is GOD in Mathematics and I am very lucky to learn statistics from Sir :)
Clearly explained each concept and correctly divided the topics. From Kerala💥💥
Thanks for watching... Kindly explore other videos also
Every step is well understood thanks for this wonderful and informative video
Many thanks
Dr.Harish, wonderful session. Thank you -Dr.Kumara Raja Singh.G
Thank.. Keep watching other videos too..
I tried to understand this topic and watched some videos. However, in this video he explained this very difficult topic in an easy way. Thanks for your video :)
Many thanks... Keep watching the other videos also; you will get a lot of benefit.
This video is really informative... I searched for this explanation for months on YouTube.
However, I would like to seek the audience's help on a particular Markov chain question.
Thanks
Good morning, sir! You do your duties greatly, and may Nature bless you to keep doing your duties in such a way consistently.
Your explanation is really great, sir!
Wow, thank you so much, professor, clearly explained. You are the best.
Many thanks for watching...keep watching next parts too for more understanding
It really helps. Thank you!
Glad to hear that!
Thanks a lot, you make statistics easy for all students 🙏💗💗💗🙏
It's my pleasure.... Thanks
Very informative! Eagerly waiting for other videos.
Thanks. Next video is uploaded. Kindly watch.
#OptimizationProbStat
Thank you very much for such a nice explanation
Thank you so much for explaining so clearly. God bless you!
Really, thank you so much for this... this is just amazing...
Sir, these lectures are very helpful for CSIR NET. Please make videos on PYQs on Markov chains from CSIR NET. Thank you 🙏
I have a doubt. In the first example, you said q0 is the probability on the first day; then shouldn't q1 be the probability on the second day? How is q2 the probability on the second day?
Very well explained.
Glad it was helpful!
Very helpful, sir; very easy to understand.
Most welcome... Watch the other lectures on the Markov chain (Lectures 3, 4, 5) for complete understanding.
Wonderful explanation. Thank you, sir.
My pleasure... You can also watch Lectures #3-#5 on Markov processes for the full topic.
Very helpful. Thank you.
Thanks Sir 🙏...You made it so easy
Thanks for watching ... Keep watching other videos too and share with others...
Was very helpful thank you sooooo much
Best
Can you do a tutorial on a Markov chain lottery? Thank you, Dr. Harish Garg.
OK. I will consider it in my video on Markov chains.
Sir, please make a video on the classification of states of a Markov chain.
Sure.... I will upload soon.
@@DrHarishGarg Thank you sir,
Massive thank you !
Thank you for the clear explanation, sir :-)
My pleasure dear
Very well explained sir
You are like a deity incarnate, sir ji.. 🤗🙏🙏
Thanks for watching... Kindly watch other parts too for complete understanding
Thank you sir for this topic
Can you please upload videos about Brownian motion?
Sir, I have export data; will you please guide me on how to find the TPM and apply a Markov chain to it?
Please help me.....
How did he calculate P²? Can someone tell me? How did he calculate the final resulting matrix?
Sir, please tell: can it be asked to find P(X1=X2=X3=X4=2)?
Thanks... Yes, it can be asked. It is basically the Markov chain probability P(X1=2, X2=2, X3=2, X4=2). To see how to solve such problems, watch Lecture #4 on Markov chains, available in the playlist.
@@DrHarishGarg thank u very much sir
Sir, kindly explain the applications of stochastic processes, sir...
Sir, please explain where we can use this... in general...
There is wide applicability of stochastic processes, such as in decision theory, Bayesian analysis, forecasting, optimization, etc.
@@DrHarishGarg thanking you sir
@@DrHarishGarg can you send me your email id sir
Amazing tutorial, sir. Please teach more topics!!!
Thank you. They are all already available in the playlist... otherwise, let me know the topic.
Very useful video. Thank you, Dr. Harish. Do you have any Stata code to work on a Markov model?
Sorry... I don't have such Stata code.
@@DrHarishGarg Thank you a lot anyway.
Very informative lecture
I'm stuck on the multiplication; how do I calculate the probabilities together?
What is the role of the transition probability in the transition from one state to another? If we want a transition from state i to state j only, is it required to be high or low?
Namaste ji, can you give an explanation of the Chapman-Kolmogorov equation?
OK... I will explain soon.
The Chapman-Kolmogorov video lecture is uploaded... You may watch it, please.
@@DrHarishGarg Sir, how do we calculate the state probability if the initial probability is not given?
Can you give the solution with RStudio?
Sir, please suggest some books related to the topic. 🙏
For Book: See the link amzn.to/2NirzXT
@@DrHarishGarg thank you so much sir for your response.
great Sir.
So nice of you. Keep watching and sharing other videos also
@@DrHarishGarg certainly Sir
Can you explain Example 2 (ii) in the video (time: 11:32)?
Sir, if the initial probabilities are not given, then how do we solve the same problem by another method?
If the initial probability is not given, then you can find it by using a distribution function such as the Binomial, Poisson, Geometric, etc. Watch Lecture 3 and Lecture 4 for the same.
Ok thank you so much
Your teaching is superb, but your voice is very low.
Thanks for watching and the feedback
I have a doubt: when P² is computed, how does the answer come to 0.24? I don't understand.
Sir, what are examples of state probability and transition probability? Please reply, and write the step-by-step process for n jobs through two and three machines.
Watch (Part 2 of 3) and (Part 3 of 3) for the examples of state probability and transition probability.
You are not practically showing how you are multiplying P · P.
Do we need to add and then multiply?! We didn't get it.
P · P is standard matrix multiplication... Use a calculator, or simply multiply the row elements by the column elements one by one, as is done in matrix multiplication.
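The row-by-column multiplication described in the reply above can be sketched in a few lines of Python. The TPM below is hypothetical; the matrix from the video is not reproduced here.

```python
# Hypothetical 3x3 transition probability matrix (TPM); replace with your own.
P = [[0.4, 0.3, 0.3],
     [0.2, 0.5, 0.3],
     [0.1, 0.2, 0.7]]

def mat_mul(A, B):
    """Entry (i, j) is the dot product of row i of A with column j of B."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = mat_mul(P, P)  # the 2-step transition matrix P^2
for row in P2:
    print([round(x, 4) for x in row])
```

A quick sanity check after multiplying: each row of P² must again sum to 1, because P² is itself a transition matrix.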
In Example 1, if I solve the problem using the normal approach, I get a different answer. Here is my approach:
We need to find the probability that the person chooses the train on the second day; this can be obtained as:
P(T on day 2 and C on day 1) + P(T on day 2 and B on day 1) + P(T on day 2 and T on day 1)
= (0.7)(0.4) + (0.2)(0.2) + (0.1)(0.3) = 0.35
Could anyone figure out where I am wrong?
Any help is appreciated.
Have you got the answer? ... Your doubt makes me question my understanding of probability now.
Sir, I want to learn this topic for competitive exams (lecturer cadre, MCQ-type questions).
Sir, how do you write q_n = q_0 · P^n?
Kindly watch Lecture No. 1 - the TPM and the n-step probability concept.
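The relation q_n = q_0 · P^n asked about above amounts to repeatedly multiplying a row vector by the TPM. All numbers below are hypothetical, not the video's.

```python
# Hypothetical initial state probabilities q0 and TPM P.
q0 = [0.5, 0.3, 0.2]
P  = [[0.7, 0.2, 0.1],
      [0.4, 0.4, 0.2],
      [0.3, 0.3, 0.4]]

def step(q, P):
    """One step of the chain: q_next[j] = sum_i q[i] * P[i][j] (row vector times matrix)."""
    n = len(q)
    return [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]

qn = q0
for _ in range(2):        # applying P twice gives q2 = q0 * P^2
    qn = step(qn, P)
print([round(p, 4) for p in qn])
```

Applying `step` n times gives q_n; the entries of q_n always sum to 1, which is an easy check on the arithmetic.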
❤
Sir, how do we calculate the probability P(X3=2, X2=3, X1=3, X0=2)? Please explain.
For it, you need the initial probabilities, i.e., p(0), and the TPM P. Once you have these, the required probability is
p_0(2) × p_23 × p_33 × p_32.
Here p_23 means the element of matrix P at the position of the 2nd row and 3rd column. Similar meaning for p_33 and p_32.
Meanwhile, p_0(2) means the initial probability of state 2.
I hope this makes it clear to you.
@@DrHarishGarg biggg efforts. ☺☺
@@DrHarishGarg Thank you so much sir very nice explanation........
@@DrHarishGarg If I have historical data of random events for three states A, B, and C, how can I establish the initial probabilities p(0) and the TPM P?
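One standard way to estimate both from historical data is to count observed transitions and normalize each row (this gives the maximum-likelihood estimate of the TPM). The state sequence below is made up purely for illustration, with A, B, C coded as 0, 1, 2.

```python
# Hypothetical observed state sequence (A=0, B=1, C=2); use your real data here.
history = [0, 1, 1, 2, 0, 2, 2, 1, 0, 0, 2, 1]
n_states = 3

# Count transitions i -> j over consecutive observations.
counts = [[0] * n_states for _ in range(n_states)]
for a, b in zip(history, history[1:]):
    counts[a][b] += 1

# Normalize each row of counts to get the estimated TPM P_hat.
P_hat = []
for row in counts:
    total = sum(row)
    P_hat.append([c / total if total else 0.0 for c in row])

# Estimate the initial distribution p(0) from the observed state frequencies.
p0_hat = [history.count(s) / len(history) for s in range(n_states)]
```

With more data, the row estimates stabilize; states that were never visited get a zero row, which you would need to handle (e.g., by smoothing) in practice.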
I thought we just multiply the matrices in order to find the probability.
Sir, how can I calculate E[X2]?
E(X^2) = Σ x^2 · p(x) for a discrete variable, and
∫ x^2 f(x) dx for a continuous variable.
@@DrHarishGarg Sir, can you please explain how to calculate E[X2] in Example 0 of this lecture of yours?
This explanation of yours would be really helpful.
This is actually the expectation at time step 2.
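Under the "expectation at time step 2" reading in the last comment, E[X_2] is the mean of the distribution q_2 = q_0 · P². The sketch below uses hypothetical state values and probabilities; Example 0's actual numbers are not reproduced here.

```python
# Hypothetical setup: the states take the numeric values 1, 2, 3.
states = [1, 2, 3]
q0 = [0.2, 0.5, 0.3]
P  = [[0.5, 0.3, 0.2],
      [0.2, 0.6, 0.2],
      [0.1, 0.4, 0.5]]

def step(q, P):
    """One step: q_next[j] = sum_i q[i] * P[i][j]."""
    n = len(q)
    return [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]

q2 = step(step(q0, P), P)   # distribution of X_2, i.e. q0 * P^2

E_X2 = sum(x * p for x, p in zip(states, q2))               # E[X_2]
second_moment = sum(x * x * p for x, p in zip(states, q2))  # E[X_2^2], if the square is meant
print(round(E_X2, 4))
```

If instead E[X²] at time 2 is meant (as in the earlier reply), use `second_moment`; both are computed against the same distribution q_2.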
Sir, please make a video on the classification of states of a Markov chain, with examples.
Sure... I will upload