EduFlair KTU CS
India
Joined 12 Sep 2020
Simple, easy-to-learn KTU CSE lectures and notes on various subjects
Machine Learning CS467
Network Security
Programming in C
Divide and Conquer approach - Merge sort
#divideandconquer #mergesort #python #algorithmicthinking
Views: 47
Videos
Brute Force approach to problem solving
99 views · 21 days ago
#bruteforce #algorithmicthinking #python
Echelon form and Inverse of a Matrix using Gaussian Elimination
1.4K views · 2 years ago
#inverseofmatrix #gaussianelimination #echelonform This video discusses the row echelon form and the reduced row echelon form of a matrix, with examples. It also covers a numerical problem: finding the inverse of a matrix using the Gaussian elimination method. The process of transforming a given matrix into its reduced row echelon form is called the Gaussian elimination method. This video is based on the syl...
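The Gauss-Jordan reduction and matrix-inverse procedure the description refers to can be sketched in Python (a minimal illustration using plain nested lists; the pivot search and the numerical tolerance are assumptions, and this is not the code used in the video):

```python
def rref(mat):
    """Reduce a matrix to reduced row echelon form (Gauss-Jordan) on a copy."""
    a = [row[:] for row in mat]
    rows, cols = len(a), len(a[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Find a row with a nonzero entry in this column to use as the pivot.
        pivot = next((r for r in range(pivot_row, rows)
                      if abs(a[r][col]) > 1e-12), None)
        if pivot is None:
            continue
        a[pivot_row], a[pivot] = a[pivot], a[pivot_row]
        # Scale the pivot row so the pivot entry becomes 1.
        p = a[pivot_row][col]
        a[pivot_row] = [x / p for x in a[pivot_row]]
        # Eliminate this column's entry in every other row.
        for r in range(rows):
            if r != pivot_row and abs(a[r][col]) > 1e-12:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[pivot_row])]
        pivot_row += 1
    return a

def inverse(mat):
    """Invert a square matrix by row-reducing the augmented matrix [A | I]."""
    n = len(mat)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(mat)]
    reduced = rref(aug)
    return [row[n:] for row in reduced]

# Example: the inverse of [[2, 1], [1, 1]] is [[1, -1], [-1, 2]].
inv = inverse([[2.0, 1.0], [1.0, 1.0]])
```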
Decision Tree problem / Finding Best Split using Information Gain
24K views · 2 years ago
#decisiontree #id3 #splittingattribute Decision tree problem: this video gives you an idea of how to find the best splitting attribute of a decision tree. The method uses entropy and information gain as the measures.
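The entropy and information-gain computation mentioned above can be sketched in Python (a toy illustration; the dataset and attributes are made up, and this is not the worked example from the video):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy after splitting on the attribute at attr_index."""
    total = len(labels)
    # Group the labels by the value the attribute takes in each row.
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(label)
    weighted = sum(len(g) / total * entropy(g) for g in groups.values())
    return entropy(labels) - weighted

# Toy dataset: (outlook, windy) -> play; outlook separates the classes perfectly.
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
labels = ["no", "no", "yes", "yes"]
# The best splitting attribute is the one with the highest information gain.
best = max(range(2), key=lambda i: information_gain(rows, labels, i))
```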
Issues in Decision tree learning / KTU Machine Learning
6K views · 3 years ago
#overfittingofdata #missingattribute #reducederrorpruning
Storage classes in C - part 2 (Static and Extern)
193 views · 3 years ago
#storageclassesinc #programminginc
Matrix multiplication in C programming EST102
5K views · 3 years ago
#cprogramming #programminginc #matrixmultiplication
Program for Inserting element into array in C
7K views · 3 years ago
Program for deleting element from array in C
10K views · 3 years ago
Decision tree - Entropy and Information gain with Example
67K views · 3 years ago
Expectation maximization algorithm / KTU Machine Learning
32K views · 3 years ago
Soft Margin Hyperplane or Soft SVM / KTU Machine learning
15K views · 3 years ago
Support vector machine (SVM) - part 2 / Mathematical formulation of SVM / KTU Machine learning
22K views · 4 years ago
Basics of Support Vector Machine - part 1 / KTU Machine Learning
19K views · 4 years ago
Hierarchical Clustering - Problem / Complete linkage / KTU Machine learning
35K views · 4 years ago
Hierarchical clustering - Agglomerative and Divisive method/ KTU / Machine learning
44K views · 4 years ago
K-means Clustering Example / KTU Machine learning
174K views · 4 years ago
K-means Clustering algorithm / KTU Machine learning
33K views · 4 years ago
ML estimate of Poisson and Geometric distribution
15K views · 4 years ago
Maximum likelihood estimation (MLE) / Parameter estimation of Bernoulli / KTU Machine learning
83K views · 4 years ago
Naive bayes or bayesian classifier problem / KTU Machine learning
25K views · 4 years ago
Bayesian classifier / Naive bayes algorithm / KTU Machine learning
20K views · 4 years ago
Ma'am, I was looking for exactly this type of content, i.e. the concept of information gain and entropy. After wasting 1-2 hours on different channels, I finally landed here. Thanks a lot ❤.
Notes please
How do you get v1 and v2? I didn't understand that part.
You are great; please try to make the whole lesson in English.
The best PCA class ever on YouTube.
The quadratic formula is not right; you forgot the -b at the start.
nice explanation
Thank you soo much Mam
4 years ago; thanks, ma'am 😊
🔥
Ma'am, is there a chance this will be asked in the exam? The derivation?
Thankyyyyy
Thanks
You are an amazing teacher... which book are you following, ma'am?
Can you show the output of this? After deletion, do we need to enter the number?
This is a very good class 👍👍
Are these steps also applicable for 3 centroids? Help me!
It's really fantastic ❤
Sister, I didn't understand what you said at 0:57-1:07.
Thank you so much
Good
Madam, which language do you sometimes speak in the middle of your sentences? Second, you explain very well, but while your quadratic-equation values are correct, the formula you mentioned at 12:36 is not.
Thank you so much ❤
Best way!!! I learnt the theory, but it simply wasn't enough… she just explained it so clearly!!
Thank you, ma'am; super voice.
Beautiful
😊😊
nice teaching i liked it very much
Thanks, tomorrow I have a machine learning exam. I understood the concept. The Anna University syllabus covers only CV and k-fold CV.
Thanks
Formula is wrong
Your explanation is super, but please translate it into English or Telugu.
My friend Osama said he loves you.
😂😂
Well explained.
Nice video and excellent explanation! (Thank you very much for the example!) It is a good exercise to teach in a class in conjunction with the video. Here is R code (with Spanish identifiers) that checks the result obtained in the video.
# Step 1: Create the data
xx <- c(4, 8, 13, 7)
yy <- c(11, 4, 5, 14)
datos <- data.frame(xx = xx, yy = yy)
# Step 2: Compute the means and center the data
colMeans(datos)
datos_centrados <- scale(datos, center = T, scale = F)
# Step 3: Find the covariance matrix
matriz_de_covarianzas <- cov(datos_centrados)
# Step 4: Calculate the eigenvalues and the eigenvectors
autovalores <- eigen(matriz_de_covarianzas)$values
autovectores <- eigen(matriz_de_covarianzas)$vectors  # (as columns)
autovectores <- -autovectores  # The minus sign is arbitrary
# Step 5: Derive the new dataset
matriz_de_loadings <- t(autovectores)
matriz_de_loadings  # e1 = first row, e2 = second row
# PCA1 = matriz_de_loadings[1, 1]*xx + matriz_de_loadings[1, 2]*yy
# PCA2 = matriz_de_loadings[2, 1]*xx + matriz_de_loadings[2, 2]*yy
datos_transformados <- t(matriz_de_loadings %*% t(datos_centrados))
datos_transformados
Is it a derivation
Please teach in English.
Good, ma'am; please teach only in English.
nicezzyy
Thanks, ma'am, the video was very useful ✨
*What is this language in between aare ooo aaddaa* 🤮🤮🤮😂😂😂
She's a Malayali; she is speaking Malayalam, so there might be some mistakes in the way she speaks, as she's not fluent in English. But please try to be respectful at least.
Thank you so much mam...this was really helpful 🌟
Thank you, madam. I don't understand Malayalam, but I understood the concept. Nandi 🙏💝
I'm from Andhra Pradesh, but I understood this problem very well ❤🩹 Thank you, ma'am.
You need to come to my college to teach statistics.....so nice explanation....
Thanks mam
The distance formula is wrong, ma'am.
Is the normalization part correct? 11/27.3848 and -16.3848/27.3848 is what I thought was correct. Also, for the eigenvector, taking the second equation would have yielded a different eigenvector, so how do you decide which equation to take?
Informative class, and thanks a lot.
The quadratic formula is not sqrt(b^2 - 4ac)/(2a); the right formula is (-b ± sqrt(b^2 - 4ac))/(2a).