Designing a neural network | Text Classification Tutorial Pt. 2 (Coding TensorFlow)

  • Published on 18 Oct 2024

Comments • 25

  • @DavidMartin-gr7ji
    @DavidMartin-gr7ji 6 years ago +14

    Laurence, these videos are great! Thank you for making them! Some feedback: the text titles of the videos aren't easy to read, and that makes it difficult to know the order in which to watch related videos. Consider changing the sequence of the items in the title from "Title, Subject, Part" to "Subject, Part, Title". For example, this video would be: "Text Classification, Pt. 2 | Designing a neural network".

  • @gusthema
    @gusthema 6 years ago +3

    This was very good! The embedding example was great!

  • @oumski7529
    @oumski7529 a year ago

    My sincere greetings from Algeria. Thank you very much!

  • @TheDavidlloydjones
    @TheDavidlloydjones 6 years ago +7

    Thanks for giving Part 1 a title beginning with P and Part 2 a title beginning with D. This means they will alphabetise themselves in a list about as far apart as possible, and maximise the amount of searching the human has to do "manually," by eye-ball.
    Thank goodness the machines are coming!

    • @laurencemoroney655
      @laurencemoroney655 6 years ago +1

      Heheh. Nice point. I should make playlists with them ordered to make it easier to figure out which comes after which. I'll speak with the social folks. :)

  • @pandarzzz
    @pandarzzz 6 years ago

    Thank you for sharing this informative video! 😺🖐

  • @darrengovoni6184
    @darrengovoni6184 6 years ago +1

    Had a question about this code. In cell 22 you separate out a subset of the original training data and call it the validation set, and you fit only on this data. Yet in cell 23 you evaluate on test_data, which is already returned as a separate split. So why the need for this extra validation set? If you train on the entire train_data and validate on the test_data, aren't the two datasets already distinct?
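A hedged note on the question above: the validation set is consulted repeatedly while tuning (choosing epochs, layer sizes, etc.), so the model indirectly "sees" it; the test set must stay untouched until one final evaluation to give an unbiased score. A pure-Python sketch of the three-way split (the sizes match the IMDB tutorial's notebook, but treat the exact figures and cell numbers as assumptions):

```python
# Stand-ins for the 25,000-review IMDB train and test splits.
train_data = list(range(25000))
test_data = list(range(25000))

# Carve a validation set out of the *training* data only (as in cell 22):
x_val = train_data[:10000]            # peeked at after every epoch via model.fit(...)
partial_x_train = train_data[10000:]  # what the model is actually fitted on

# The test set is used once, at the very end (as in cell 23), so the final
# model.evaluate(...) score is not biased by any tuning decisions.
print(len(partial_x_train), len(x_val), len(test_data))  # 15000 10000 25000
```

If you tuned against test_data instead, every hyperparameter choice would leak information from it, and the final score would be optimistic.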

  • @AbhishekKumar-mq1tt
    @AbhishekKumar-mq1tt 6 years ago

    Thank you for this awesome video and series

  • @iwantandjaya3028
    @iwantandjaya3028 4 years ago

    Hi, I have a question:
    Are lemmatization and stemming applied to the dataset?
    If not, is there a built-in tf/keras feature that can pre-process the data?
    Thanks.
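On the question above: the tutorial does not lemmatize or stem; it only maps words to integer indices. Later TF versions ship a Keras TextVectorization layer that handles standardization (lowercasing, punctuation stripping) but still not lemmatization, which you would get from a library such as NLTK or spaCy. A minimal pure-Python sketch of the kind of default standardization that layer performs (the helper name is hypothetical):

```python
import re
import string

def standardize(text: str) -> str:
    """Lowercase and strip punctuation -- roughly what Keras's
    TextVectorization layer does by default. It does NOT lemmatize
    or stem ("movies" stays "movies", not "movie")."""
    text = text.lower()
    return re.sub(f"[{re.escape(string.punctuation)}]", "", text)

print(standardize("The movie was GREAT!!!"))  # the movie was great
```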

  • @PeterMancini
    @PeterMancini 5 years ago +8

    Why 16 dimensions? Why not 3 or 160?

    • @segev1824
      @segev1824 4 years ago +2

      Hi! Have you found the answer to this? I'd greatly appreciate any insight!
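On the "why 16 dimensions?" question: the embedding dimension is a tunable hyperparameter trading representational capacity against parameter count and overfitting risk; 16 is a reasonable middle ground, not a derived value, and the right number is found empirically on validation performance. The cost side is easy to quantify, since an Embedding layer has vocab_size x embedding_dim trainable parameters (the vocab_size of 10,000 below follows the tutorial; treat it as an assumption):

```python
vocab_size = 10000  # the tutorial's vocabulary size

# An Embedding layer learns one dense vector per word in the vocabulary,
# so its parameter count is vocab_size * embedding_dim.
for dim in (3, 16, 160):
    print(f"dim={dim:>3}: {vocab_size * dim:,} trainable parameters")

# dim=3 gives each word very little room to encode meaning; dim=160 costs
# 10x the parameters of dim=16 and overfits more easily on only 25,000
# training reviews.
```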

  • @rosaria3932
    @rosaria3932 3 years ago

    Thank you so much for the video! The link to the workbook is not working, could you please upload it again?

  • @intelligenttrends8935
    @intelligenttrends8935 6 years ago

    AWESOME

  • @mohansaisingamsetti7456
    @mohansaisingamsetti7456 5 years ago

    Nice

  • @ProfessorElectronic
    @ProfessorElectronic 5 years ago +1

    Hi, how do I export the trained model from Codelab to my computer?

    • @suraj-ram7488
      @suraj-ram7488 5 years ago

      I am not fully sure, but you can use the SavedModel feature of TF to create files that contain your trained model. You can then download these files and run them on your computer. BTW, this same YouTube channel has more information about this feature.
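The reply above is pointing at Keras's model-saving support. A minimal sketch, assuming TF 2.x and the native `.keras` file format (the tiny untrained model and file name are placeholders, not the tutorial's model):

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in model; in the tutorial you would save the trained one.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.save("tiny_model.keras")  # write the whole model (weights + architecture)

# In Colab you could then download the file to your computer, e.g.:
#   from google.colab import files; files.download("tiny_model.keras")

# On your own machine, load it back and use it:
loaded = tf.keras.models.load_model("tiny_model.keras")
x = np.ones((2, 4), dtype="float32")
assert np.allclose(model.predict(x), loaded.predict(x))
```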

  • @jdbrrocks
    @jdbrrocks 4 years ago

    How can we use model.predict() in this case?
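On the question above: to call model.predict() on a new review, the text must first be encoded exactly like the training data, i.e. words mapped to integer indices via the tutorial's word_index and the sequence padded to the training length. A pure-Python sketch of that encoding step (the tiny vocabulary and maxlen are stand-ins; the <PAD>=0 / <START>=1 / <UNK>=2 offsets follow the IMDB tutorial's convention):

```python
# Stand-in for the real 10,000-word index from the tutorial.
word_index = {"<PAD>": 0, "<START>": 1, "<UNK>": 2,
              "the": 4, "movie": 5, "was": 6, "great": 7}

def encode_review(text, maxlen=10):
    # Prepend <START>, map unknown words to <UNK>, pad with <PAD>.
    ids = [1] + [word_index.get(w, 2) for w in text.lower().split()]
    ids = ids[:maxlen]
    return ids + [0] * (maxlen - len(ids))

encoded = encode_review("the movie was great")
print(encoded)  # [1, 4, 5, 6, 7, 0, 0, 0, 0, 0]

# With the trained model you would then call (not run here):
#   model.predict(np.array([encoded]))   # -> probability the review is positive
```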

  • @EvenYan-home
    @EvenYan-home 5 years ago

    Oh no, the 14th video can't be shown. Could you tell me why?

  • @farhadshadmand5785
    @farhadshadmand5785 5 years ago

    I tried this code with my own data from tweets, but I get the error below. Can someone help me?
    InvalidArgumentError: indices[0,39] = -1 is not in [0, 2000)
    [[Node: embedding_23/embedding_lookup = ResourceGather[Tindices=DT_INT32, _class=["loc:@training_12/Adam/gradients/embedding_23/embedding_lookup_grad/Reshape"], dtype=DT_FLOAT, validate_indices=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](embedding_23/embeddings, embedding_23/Cast)]]
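On the error above: an Embedding layer with input_dim=2000 only accepts indices in [0, 2000), and the traceback says an index of -1 reached it. A common cause is preprocessing that marks out-of-vocabulary tokens with -1; one fix is to reserve a real index (e.g. 0) for unknowns instead. A pure-Python sketch of detecting and remapping the bad indices (the sequence is hypothetical data):

```python
vocab_size = 2000                # matches the [0, 2000) range in the error
sequence = [5, 1999, -1, 42]     # the -1 would crash the embedding lookup

# Find indices the Embedding layer would reject:
bad = [i for i in sequence if not 0 <= i < vocab_size]
print("out-of-range indices:", bad)   # out-of-range indices: [-1]

# Remap anything out of range to a dedicated <UNK> index:
UNK = 0
fixed = [i if 0 <= i < vocab_size else UNK for i in sequence]
print(fixed)  # [5, 1999, 0, 42]
```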

  • @jojivarghese2105
    @jojivarghese2105 2 years ago

    Pokémon