If you've only trained for 4 epochs, perhaps you need more training loops... I'd be interested to know: if you used softmax instead of argmax, and the highest value in your output was still an 8, would the second-highest value be a 3? 3 and 8 share a lot of common pixels...
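A quick sketch of that check, in plain Python with made-up logits (the actual values would come from your network's output layer, so the numbers here are only illustrative):

```python
import math

def softmax(logits):
    # Convert raw scores to probabilities that sum to 1;
    # softmax is monotonic, so the ranking matches argmax on raw logits.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for digits 0-9 where the net favours 8, then 3
logits = [0.1, 0.2, 0.0, 2.5, 0.3, 0.1, 0.0, 0.2, 3.1, 0.4]
probs = softmax(logits)

# Sort class indices by probability, highest first
ranked = sorted(range(10), key=lambda i: probs[i], reverse=True)
print(ranked[0], ranked[1])  # -> 8 3
```

Looking at the second entry of `ranked` (or `torch.topk(output, 2)` in PyTorch) would answer exactly that "is 3 the runner-up?" question.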
nice video! especially the self-data part. still a little confused about cross-entropy — it says in PyTorch that if you use cross-entropy, there's no need to add a softmax layer?
Saw the insta ad, good luck with YT. Also nice quality tutorial
Appreciate it!
Thanks man. You are the best
You're welcome
Hope you could make a video explaining PyTorch for NLP: BERT multi-label classification with transformers.
In the last line of the code, shouldn't `result` be passed in instead of `output`? i.e. print(torch.argmax(result))
Yes, I'm sure this will give the right answer (3) and not 8. It worked for me with my own image.
That's what I needed, thank you :)
Glad it helped.
nice video bro keep it up
Thank you so much! Hope you have enjoyed the new content as well.