Computer Vision | Lecture 7 | Image Classification Metrics and Overfitting

  • Published on Feb 10, 2025

Comments • 2

  • @rohitdhankar360  2 years ago  +1

    09:00 - Linear model with an example of a non-linear feature
    12:00 - A new engineered feature - independent variable - h(x)
    15:00 - SIFT features
    16:00 - Deep learning algorithms - learning to do the feature engineering
    20:30 - Given data that is linearly separable, Logistic Regression (classification) is a good choice
    25:30 - Logistic Regression --> is a ONE CELL neural network - it's just ONE NEURON
    26:30 - Loss function --> optimizing for w(hat) - by picking the RIGHT w(hat), we ensure that y(hat)_i is close to most y_i
    28:40 - ENTROPY - both classes, 0 and 1, have equal probability 0.5 (50%), which is the highest uncertainty, i.e. the highest entropy (see the sketch after this list)
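
    A minimal sketch (not from the lecture) of the ideas in the 25:30-28:40 timestamps, assuming a plain NumPy setting: logistic regression as a single neuron, the cross-entropy loss that w(hat) is chosen to minimize, and binary entropy peaking when both classes have probability 0.5.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def predict(w_hat, x):
        # Logistic regression = one neuron: a linear score w_hat . x squashed
        # through a sigmoid into a probability y_hat in (0, 1).
        return sigmoid(np.dot(w_hat, x))

    def cross_entropy_loss(y, y_hat, eps=1e-12):
        # The loss being minimized: small when each y_hat_i is close to y_i.
        y_hat = np.clip(y_hat, eps, 1.0 - eps)
        return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    def binary_entropy(p, eps=1e-12):
        # Entropy of a Bernoulli(p) label, in bits.
        p = np.clip(p, eps, 1.0 - eps)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    # Illustrative weights and data (hypothetical, just to exercise the functions).
    w_hat = np.array([0.7, -1.2])
    X = np.array([[1.0, 0.5], [0.2, 2.0]])
    y = np.array([1.0, 0.0])
    y_hat = np.array([predict(w_hat, x) for x in X])
    print(cross_entropy_loss(y, y_hat))

    print(binary_entropy(0.5))  # 1.0 bit   -> both classes equally likely, highest uncertainty
    print(binary_entropy(0.9))  # ~0.47 bits -> one class dominates, lower entropy
    ```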

    • @bytesizeml119  2 years ago  +1

      Thanks for the additional time stamps