🎉 Unlock AI for $10 per month: ntn.so/BorisMeinardus
🎯 If you can think it, you can make it happen with Notion. Try for free at: ntn.so/boris
Your topics and the way you present them are really fascinating!!
Excellent approach, Boris. There's some similarity with my rather agricultural tactic of using the tools I already have without reaching for another wrench. FWIW, I'll share it with you.
I use email, specifically Mozilla Thunderbird, to email myself with tags in the subject line. It doesn't matter where I am, at my desktop computer or on a walk with my mobile phone. When I think of something, off goes the email to myself, and filters set up in Thunderbird move the inbound email to a folder or subfolder matching the subject-line tag(s).
Super simple, minimal UI interaction, and all records of my thoughts, planning, and actions are automagically filed in the appropriate buckets.
Typically I'll think about and plan a development project off-screen for weeks at a time, so that by the time I get around to coding I've got a framework for writing the least amount of code possible.
I love coding, and love concise coding even more, so, indeed, your approach resonates well with me!
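For anyone who'd rather script the same tag-to-folder idea instead of clicking through Thunderbird's filter dialogs, here's a rough Python sketch using the standard imaplib module; the server, credentials, tags, and folder names are of course made up:
```python
import email
import imaplib

# Hypothetical server, credentials, and tag-to-folder mapping; Thunderbird
# filters do the same job through the UI, this is just the scripted version.
HOST, USER, PASSWORD = "imap.example.com", "me@example.com", "app-password"
TAG_TO_FOLDER = {"[ml-project]": "Projects/ML", "[ideas]": "Ideas"}

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")  # new self-emails
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        subject = email.message_from_bytes(msg_data[0][1]).get("Subject", "").lower()
        for tag, folder in TAG_TO_FOLDER.items():
            if tag in subject:
                imap.copy(num, folder)                  # file it in the tag's folder
                imap.store(num, "+FLAGS", "\\Deleted")  # and flag the inbox copy
                break
    imap.expunge()
```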
Wow, that sounds smart and minimal as well! As long as it works for you, and you are happy with it, there is nothing to complain about! 😊
As you said yourself, the underlying approach is very similar 😊
Keep it up!
I am really stuck in a situation where I can't decide what kind of projects I can make, how much knowledge I really have, and how much I need. Is there any suggestion to follow, some kind of roadmap or video, to know which things need some time? I am more interested in practical ML knowledge. I really like the work you have been putting into your videos.
We have all been there. I myself have no idea how "good" my knowledge is. We will never know everything, and that is okay :)
It takes time. As long as you continue learning and practising, you will get better and better. If you look back at old projects and cringe at them because they are so poorly implemented, you are making progress haha (I feel the same way).
Once you have the fundamentals (there are many online courses and lectures on them), you want to get to working on projects. Just continue learning new concepts along the way. At some point, there is no perfect roadmap anymore. It's just learning new stuff here and there, and if you do it for long enough, it will slowly all come together and make more and more sense, while opening up more and more questions at the same time.
That said, Kaggle is a great resource to get practical ML knowledge!
I hope this somewhat helps :)
Keep it up and happy learning!
Hi, do you have a video with advice on breaking into ML research?
I'm a newbie looking to transition into a tech career. I've been studying Python on LeetCode and learning ML from Coursera and Andrew Ng, but I'm eager to dive deeper into ML. Any advice or opportunities for a beginner like me would be fantastic. I'm excited to hear from awesome folks like you. Thanks!
You are on a good track!
If you feel comfortable with the exercises in the courses, you can e.g. move to Kaggle and start working on beginner projects!
If you are struggling, you can look at other solutions and learn from them!
Keep it up 🚀☺️
@@borismeinardus Thanks ❤
I built my best ML project without going crazy by focusing on clear goals, breaking the project into manageable tasks, using robust tools, and maintaining a structured workflow. Regularly reviewing progress and seeking support from the community also helped keep things on track.
Excellent video, Boris! Keep it up 👍. By the way, I have a question for you:
What do you think about using AutoML tools like PyCaret, H2O, AutoKeras, and other such tools for automating ML workflows? Do professionals use them? If yes, how much of a role do they play at work, and to what extent do they use them?
Thank you!
AutoML tools are very often used in applied ML. A lot of data scientists working on business problems use them, e.g. (afaik) IBM has its own AutoML tool and uses it a lot when building ML projects for business customers. When it comes to such real-world business problems, the more challenging part is the data gathering, cleaning, and analysis. The data is often in tabular format and does not require the most complex deep learning algorithms, but rather simpler, classical ML models that can simply be trained and evaluated using AutoML.
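Just to make it a bit more concrete, a minimal AutoML run on such a tabular problem can be a handful of lines. Here is a rough PyCaret-style sketch; the CSV file and its "churned" target column are made up for illustration:
```python
import pandas as pd
from pycaret.classification import setup, compare_models, predict_model

# Hypothetical tabular business dataset with a binary "churned" target column.
df = pd.read_csv("customer_data.csv")

# setup() handles the usual preprocessing (imputation, encoding, train/test split).
setup(data=df, target="churned", session_id=42)

# compare_models() trains and cross-validates a set of classical ML models
# and returns the best one according to the default metric.
best_model = compare_models()

# Evaluate the winner on the held-out split.
predictions = predict_model(best_model)
print(predictions.head())
```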
I hope this somewhat makes sense and is helpful :)
Hey, I would be really grateful if you could share your full Notion page, as it's really amazing, and specifically the dataset part is just great!
You can find the link to my Notion template in the video description 😊
It doesn't contain all my dataset and paper entries, but it is indeed a cool idea to somehow share the list of datasets I have aggregated! I will keep that in mind, and perhaps do some sort of simple GitHub page containing this list :)
Thank you and happy learning! ❤️
Hi Boris, great custom dashboard! I would also recommend Notion's Project Management template as a free alternative.
Yeah! great recommendation!
Can you suggest some places to look at what problems other people are solving and where they get these research papers? If there is any problem with the question, forgive me, I am not good at English.
You can try to google something you find interesting and add „paper“ to it (e.g. „face detection paper“). You can also specifically search in „Google Scholar“ - it's Google, but for research papers :)
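If you want to search a bit more systematically, arXiv also has a free public API you can query from a small script. A minimal sketch (the search term is just an example):
```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Query the public arXiv API for papers matching a search term.
query = urllib.parse.quote("face detection")
url = (
    "http://export.arxiv.org/api/query"
    f"?search_query=all:{query}&start=0&max_results=5"
)

with urllib.request.urlopen(url) as resp:
    feed = resp.read()

# The API returns an Atom feed; pull out the paper titles and links.
ns = {"atom": "http://www.w3.org/2005/Atom"}
root = ET.fromstring(feed)
for entry in root.findall("atom:entry", ns):
    title = entry.find("atom:title", ns).text.strip()
    link = entry.find("atom:id", ns).text.strip()
    print(f"{title}\n  {link}\n")
```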
I want to make a chatbot. Can you tell me what I should learn and where I should start? And one more thing: can you name some topics for my semester projects?
What's the point of asking a question one can ask ChatGPT or simply google?
@@lazydart4117 But ChatGPT doesn't have experience like he does
There are different aspects to wanting to build a ChatBot.
If you want to train an LLM from scratch, that will very likely not be possible haha
If you still want to somewhat train an LLM, then finetuning with e.g. LoRA makes sense.
But if you really only care about the application level, I would recommend you have a look at existing open source LLMs, how to leverage RAG, prompt engineering best practices, and how to make the interface look nice.
A company called "Cohere" does pretty cool stuff on that front, so perhaps you might want to have a look at what they do and have to offer :)
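If it helps, the core idea behind RAG is just "find the most relevant chunks of your data and paste them into the prompt". Here is a minimal sketch of that retrieval step using sentence-transformers; the toy documents and the model name are just placeholders:
```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Toy knowledge base; in a real chatbot these would be chunks of your documents.
docs = [
    "Our store is open Monday to Friday, 9am to 6pm.",
    "Refunds are possible within 30 days of purchase.",
    "We ship internationally, but shipping times vary by country.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_emb = model.encode(docs, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q_emb = model.encode([question], normalize_embeddings=True)
    scores = (doc_emb @ q_emb.T).squeeze()  # cosine similarity (embeddings are normalized)
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

question = "Can I get my money back?"
context = "\n".join(retrieve(question))

# This prompt would then go to whatever (open source) LLM you choose.
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
print(prompt)
```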
I hope this makes sense and somewhat helps :)
Hi Boris! Can you please share the resources you use to find good research papers? A beginner here..
Alright! Time to listen to this bro!
🤩
What is your advice for those who want to start learning to become an ML researcher? Should they start by coding?
Coding will be very important even for researchers. But your coding will mostly be on the model development side, not deployment. Coding will simply be a tool for realizing your ideas, though. So you need to have a good understanding of different ML techniques/papers and build up a feeling for how to implement your ideas. This happens through coding a decent amount :)
I hope this somewhat helps!
Only 35 seconds into your video and I think you have me hooked to watch & listen to the end :-)
Thank you! I hope the content holds up to your expectations haha ❤️
@@borismeinardus It did, thank you, as you will see in my comment after watching thru to the end.
Genuinely happy to hear that! 😊
Can you recommend some resources for NLP?
Here is one that I think is very useful for more of a practical introduction to it.
huggingface.co/learn/nlp-course/chapter1/1?fw=pt
And otherwise I can very highly recommend watching Andrej Karpathy's videos on YouTube, be it his Stanford lectures or, even better, his own YT videos!
www.youtube.com/@AndrejKarpathy
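And just to show how approachable the Hugging Face side is, the course essentially starts from something like this (the example sentence is mine):
```python
from transformers import pipeline

# The pipeline API downloads a small default model and handles tokenization for you.
classifier = pipeline("sentiment-analysis")

result = classifier("I finally understand how attention works, this course is great!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```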
So I tried to answer several times with my comment, but it always got instantly deleted by YouTube and I don't know why. I'll try to leave out the links now and just mention the titles. I think if you google them, you'll find them without problems. I just copied my comment attempt since I'm lazy :D: Here are a few of the resources which helped me in the past weeks. I haven't done the NLP course, just the Deep Learning Specialization taught by Andrew Ng, but since it's made by DeepLearning.AI, I'm confident it's great.
Work through the resources and do the examples yourself, don't stick to just reading. Like building code, we also build up our knowledge base, at least if you are like me. Good luck and enjoy!
This one is a notebook and gives some nice insights about language and its dependencies. I'm sure some of the info there will also help you understand, just as it helped me get into specific NLP things.
„Introduction to Cultural Analytics & Python - Designed by Melanie Walsh“
This is the course I haven't done, but I did the Deep Learning course by DeepLearning.AI and it was a great learning experience.
„Natural Language Processing Specialization - by DeepLearning AI“
This one is a written guide, also by DeepLearning.AI.
„A COMPLETE GUIDE TO Natural Language Processing - by DeepLearning AI“
spaCy is great to start with, and their docs are very good. If you feel overwhelmed at the beginning, that's totally fine, just keep on rolling slowly.
Just google something like „spacy usage linguistic features“ to get to their docs and the overview there; a tiny example is sketched right below.
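To give a small taste of what those linguistic features look like in code (the example sentence is mine):
```python
import spacy

# First: python -m spacy download en_core_web_sm
# Small English pipeline: tokenization, POS tagging, dependency parsing, NER.
nlp = spacy.load("en_core_web_sm")
doc = nlp("Boris is planning his next machine learning project in Berlin.")

for token in doc:
    # token text, part-of-speech tag, syntactic dependency, and its head word
    print(f"{token.text:<10} {token.pos_:<6} {token.dep_:<10} {token.head.text}")

# Named entities found by the model
print([(ent.text, ent.label_) for ent in doc.ents])
```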
There are plenty of free Medium articles with nice examples like:
„Text Classification for Beginners in NLP with codes - by Mehul Gupta“
It's an easy-to-follow one, nicely written to help you understand the basics of classification in NLP.
And something more complex, but pretty much mandatory to at least know a bit about:
„Tutorial: Build your own Skip-gram Embeddings and use them in a Neural Network“ by Cambridge Spark
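If you want to see skip-gram embeddings in action before building them yourself, a library like gensim trains them in a few lines with sg=1; the toy corpus below is obviously far too small for useful vectors:
```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. Real training needs far more text.
sentences = [
    ["natural", "language", "processing", "is", "fun"],
    ["word", "embeddings", "capture", "word", "meaning"],
    ["skip", "gram", "predicts", "context", "words"],
]

# sg=1 selects the skip-gram architecture (sg=0 would be CBOW).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["word"][:5])           # first few dimensions of one embedding
print(model.wv.most_similar("word"))  # nearest neighbours in the toy space
```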
Liebe Grüße / kind regards
Multimodality is the next AGI milestone. I would never start approaching it; the only way to crack it, as far as I can see, is through brute-force compute. I don't see any possibilities for algorithmic improvements in this area for us peasants with a few GPUs at most.
Yeah, multimodality will be a very important part of building intelligent systems that can understand the world in some way or another.
We as mere mortals don't need to develop the next Attention-Is-All-You-Need paper. There are so many other open problems that require far less compute, I promise! There are thousands of papers accepted at top conferences each year; most of them are not going to change the world, but they will help you learn more for yourself, and help you get a job you want :)
People often think they need to instantly develop the next revolutionary technology. But that is not the case. In fact, those developments often happen by chance. The authors of the Transformer paper had no idea it would revolutionize the world of AI. They just thought of a smart way to do natural language translation and generation.
You just need to figure out how to find an idea that is interesting, ideally somewhat relevant, and feasible. That's what I had to go through, and why I wanted to make this video :)
I hope this somewhat helps 😊
@@borismeinardus I have a different mindset. I'm lucky not to be heavily bothered by money, but interestingly you mentioned "Attention Is All You Need", which actually doesn't need a lot of compute. I think serious improvements can be made without huge computational requirements. For example, I'm researching new learning rules beyond backprop and Adam, or doing neural architecture search, and I don't need to train GPT-4o for that. I actually think I am unbelievably lucky to be alive at a very special time in the history of humanity, and that you actually should work on the next "Attention Is All You Need". It is important enough to work on even if the chances of success are low, especially in your 20s; this is the age at which most scientific breakthroughs are made. I understand my mindset won't work for everybody, but you actually can make a huge breakthrough in the field of intelligence, which is one of the few remaining great unsolved mysteries of humanity, on par with "where did we come from" and "how did the universe appear". You can work on this great mystery that the smartest humans have tackled over the last centuries and failed to solve. You can work on it at home with your computer, and it makes my life very exciting. Even though I understand I will probably fail, a small chance is more than enough for me; this is how huge breakthroughs are really made.
Hi, I'm from Germany. I wanna ask if it's possible to get a machine learning engineer job with a bachelor's in computer science?
If you have really cool personal projects or job/ internship experience, probably, yes!
But that will likely be quite hard to realize. You would need to cram in an effort similar to what you would put into the two years of a master's, on top of what you already have to do during your bachelor's.
Quality > Quantity
In general, don't rush things. Take your time to learn the skills properly. :)
Happy learning!
You look like Antoine Griezmann
Haha, I guess I'll take that as a compliment :)
I disagree
Hey Boris, very good video. I can do better editing on your videos, which can help you get more engagement. Pls lmk what you think?
Thank you for the offer! I am currently very happy with my editor :)
Where did you download the papers? Was it free or paid?
All the papers I use are free. You will rarely find (AI) papers you have to pay for :)
great 💯
Thanks 😊
Bro, I want to start learning game development, so how can I start it?
I mean, what should I do first?
More like a Notion demo
yeah, I did show quite a bit of Notion haha
I somehow wanted to share these 3 phases and what went through my mind but also make it somewhat more tangible and practical. So, even without the sponsorship, I would very likely have added these demos to show how I realize this three phase system in practice and how I went about coming up with and organizing my cool ML project :)
I hope this makes sense ☺️