Man, I cannot find the words to describe how happy I am to have found your channel. You have made my life about 2x better by making such nice videos. Keep on rocking.
i'm happy too
Mr. Coding Train, you are one wonderful teacher. I was going through an online course where they were implementing a Markov chain for random text generation to explain some design principles, but I could not understand it. This tutorial of yours saved me.
This is so cool! Most educational 26 minutes I've spent in a while.
Thanks so much for doing this video! I had to adapt your process to create a Markov chain in MATLAB, but I was able to successfully hack it. I used MCs for a bioinformatics project where I had to generate DNA sequences and see if randomly generated sequences could maintain their integrity based on the order associated with the n-gram. Good stuff!
"The erroneously beast with the unicorn is of natured." Great stuff on fascinating subject matter. I'm gonna try this with whole words instead of n-grams.
You're my favorite content maker on YouTube!
This video is awesome. I was able to use this and part 2 to build a name generator, using words from a conlang developed for the project I am working on as source text. It was so clearly described and stepped through that I was able to put it together on the first try. Thank you sincerely.
I regret that I found your channel too late. Not even a moment was boring, because it's not done in a traditional way. The unorthodox always demands attention, and we give it gladly here :)
"this followed by dot" :D Also, you are an excellent teacher!
I have written an implementation of this Markov chain example using Processing, but I encountered an error that you don't seem to be facing, and I don't understand why. Basically it is about the end of the input text: what happens for the last n-gram of the input, since you cannot get the character after it? It would be out of range, or simply put, it does not exist.
My current understanding is that the Markov chain comes to a premature end before reaching the desired number of iterations. I have worked around this issue by simply jumping out of the loop when this case happens, but I would be interested in how you dealt with it. From what I have seen at 22:20, your algorithm seems to do the same: the expected output should be strings of length order + 10, but you are getting a string like "theremin." as well, which in fact contains the problematic end-of-input n-gram and is therefore only 9 characters long.
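A minimal sketch of this edge case, using hypothetical helper names (`buildNgrams`, `generate`) rather than the video's exact code: the table-building loop stops before the final n-gram so that every key has at least one successor, and generation bails out if it still lands on an n-gram with no options, which is exactly what truncates output like "theremin."

```javascript
// Build a map from each n-gram to the characters that follow it.
function buildNgrams(text, order) {
  const ngrams = {};
  // Stop at text.length - order - 1 so every key has a next character;
  // the final n-gram of the corpus is deliberately left out as a key.
  for (let i = 0; i <= text.length - order - 1; i++) {
    const gram = text.substring(i, i + order);
    if (!ngrams[gram]) ngrams[gram] = [];
    ngrams[gram].push(text.charAt(i + order));
  }
  return ngrams;
}

function generate(text, order, length) {
  const ngrams = buildNgrams(text, order);
  let current = text.substring(0, order);
  let result = current;
  for (let i = 0; i < length; i++) {
    const options = ngrams[current];
    if (!options) break; // landed on an n-gram with no successors: stop early
    const next = options[Math.floor(Math.random() * options.length)];
    result += next;
    current = result.substring(result.length - order);
  }
  return result;
}
```

With this guard, output can legitimately be shorter than order + length whenever generation walks into the corpus's final n-gram.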
Your videos are pure gold, so I switch my adblocker off before watching them.
That's the nicest thing anyone has ever said to me ❤
@@TheCodingTrain Keep up the great work!
0:43 that was a really clean transition, blink and you miss it! Well done
This and the corresponding part 2 vid were what got me into making my very first "Markov text generator".
Wow. I built this whole project myself one day, analyzing a novel's worth of text for the statistics of what words come after other words, and then producing random text with those same probabilities. I didn't even know there was a name for it. I was just bored one day.
The text was mostly gibberish in the end, but it was pretty cool gibberish.
r/thathappened
Wow. I was probabilities. I didn't even pretty come after worth of text worth of what was just bored one day. The text for ther worth there was a novel's what was mostly gibberish.
It always happens for me
@@FLooper I was I even didn't. Wow. Just bored text was worth was.
Wow. Amazing education for free. Thanks man. Subscribed.
Your YouTube subscribers are growing; remember me, I was here from the start.
You always seem to be so bubbly and happy. LOL. Thanks for your videos :).
I love this man.
Back off, he’s mine
This channel is amazing, thank you so much!
Thanks for watching!
many, many thanks to you, sir.
Awesome video! Really interesting.
you should use youtube's automatically generated subtitles of your videos as source text
"The Unicorn is a virgin" I guess that is why they died out. (Text generated in video)
I've been writing a Markov chain in Python. What I'm doing is creating a list of all words I've seen, as well as all 2-word pairs, 3-word pairs, etc., for each order we want to go up to. I also store which words come after, in an array whose indexes correspond to the word they follow in the list of seen words. I then use an algorithm that chooses the highest-order Markov chain based on the current output and adds that word (a word can be a single character). That keeps going for however long I want the output to be.
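The word-level idea above can be sketched in JavaScript; this is a simplified single-order version with hypothetical names (`buildWordModel`, `generateWords`), not the commenter's full variable-order scheme with fallback:

```javascript
// Map each sequence of `order` words to the words observed after it.
function buildWordModel(text, order) {
  const words = text.split(/\s+/);
  const model = {};
  for (let i = 0; i <= words.length - order - 1; i++) {
    const key = words.slice(i, i + order).join(' ');
    if (!model[key]) model[key] = [];
    model[key].push(words[i + order]);
  }
  return model;
}

function generateWords(text, order, count) {
  const model = buildWordModel(text, order);
  const words = text.split(/\s+/);
  let current = words.slice(0, order);
  const output = current.slice();
  for (let i = 0; i < count; i++) {
    const key = current.join(' ');
    const options = model[key];
    if (!options) break; // no observed successor for this word sequence
    const next = options[Math.floor(Math.random() * options.length)];
    output.push(next);
    current = output.slice(output.length - order);
  }
  return output.join(' ');
}
```

A variable-order version would build one such model per order and, at each step, try the longest key that exists in the models before falling back to shorter ones.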
Daniel, first of all, thank you very much for your videos.
I'm trying to find out, so far with no success, whether there is a way to use the idea of the Markov chain to read and generate music instead of text! I'm trying to use a RAW audio format and convert it to a byte array, but I'm not sure I'm doing it right, and it does not work. I'm doing this in Processing instead of p5.js, just for better performance. Do you know if that is even possible? Generating some sort of music directly from the bytes of another audio file? There's a new coding challenge for you!
Can you try working with MIDI files, instead? Probably much simpler than trying to make sense of raw digital audio bits.
Thank you for this video! I was coding along and noticed that if you capitalize the 'The' in the theremin sentence as your corpus ('The theremin is theirs, ok? yes, it is. this is a theremin.'), it actually only repeats "The" when you generate the text (like what Daniel is doing at 19:33). Does anyone have any insight into this? The only explanation I could even remotely think of is that the ASCII of the capital 'T' is getting in the way of the random function, but I'm probably completely off. Thanks again!
i love your videos !
Someone please take Trump's tweets and create a Twitter bot that tweets randomly generated Trump tweets.
@ullschwitz jib so?
great idea. give me a few hours
@@AnkushKun I'll make this in Python, will update in 14 days
Bro, I made that without reading this comment. Epic!
Great videos!
fantastic pronunciation on that final paragraph! :]
i could watch your videos forever and never program anything myself. got any advice on how to separate the time spent?
Watch one video a day, spend the rest of the time programming and being outside with people and friends?
Daniel, you're doing a great job. I want to learn more about machine learning, specifically unsupervised learning, and I want to do something with that. Thanks once again. If you can, please make more videos on machine learning.
Thanks for the feedback!
Can you please make a video on how we can apply MCMC / the Metropolis-Hastings algorithm in Processing?
I actually built a similar thing to this once. I took a bunch of text, usually 20-30 words, and found what came after each letter: e might be 3/8 t, 2/8 r, 1/8 c, 1/8 h, and 1/8 l. I would find random generations of words forming sentences that sound like the language the words came from. (With Spanish training data, the result sounded Spanish)
I actually built a bunch of would forming took a similar thing sentence. I words found rand 1/8 r, 1/8 t, 2/8 l. I to the result sound what came from. (With Spanish)
Dan please help! I'm new to programming and was looking for a place to start. I've played around with Processing and p5.js, and I'm extremely interested in the Nature of Code, and now I see your Programming from A to Z videos. I seem to be confusing myself with all the options. I have no coding experience, so I know I can't start with Nature of Code, but I'd like to get to that point! Where should I start?
I used your comment when following along with this tutorial :)
"Dan please help! I'd like to that point! Where of Code, and no coding from A to be coding"
It got me confused, so I had to watch it twice :P
what font and syntax highlighting do you use?
Hi,
what IDE is he using?
My p5 doesn't look like that, and if it's Atom, is there any video where he explains how to set it up?
Thanks for your help!
Replying just in case anyone is viewing this video and having the same problem now. He is using Atom, but I used editor.p5js.org/ with no problems.
Thanks for making videos, I hate reading
Hi Daniel, I started watching your content only recently, and find it very enjoyable and informative. Just a quick question: in line 10, shouldn't that 3 be changed to "order"? I think you hardcoded the 3 somewhere else in the code too, but I don't have time to re-watch the whole thing now... Anyway, keep up the good work, you are most likely the most cheerful programmer I've ever seen :^)
The philosophy of math is more interesting than the robotic stuff taught in school... Markov was also a philosopher...
15:14 Can someone please make a song of this "r or r or i"
I used Markov chains for a rock-paper-scissors AI to predict the user's next move
Can this solution be classed under the heading of machine learning?
"I could see this followed by dot" 💀
(Remembers this dot song)
Great video!
Can we use that to generate music?
unknown unknown Of course, a lot of composers do that.
Damn, you read and speak Markovese so fluently.........Rosetta Stone or were you born there ?
Lol markovese
Which coding language is this?
He got the compile error because var result was declared already. The scope of result also included the for loop, so having var result in the for loop was redundant because result was already defined. Am I right? That's my assumption based on my knowledge of Java, which I'm fluent in. I'm starting to learn HTML and JavaScript, so if anything I said was wrong, can anyone explain why? That is my basic understanding, though; I could be wrong.
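For what it's worth, JavaScript's scoping differs from Java's here: `var` is function-scoped, so a second `var result` inside a loop names the same variable and is not itself an error; it's `let`/`const` that are block-scoped and forbid redeclaration in the same scope. A small sketch of the difference:

```javascript
// `var` is function-scoped: both declarations below name the same variable,
// so the inner one just reassigns it.
function withVar() {
  var result = 'outer';
  for (var i = 0; i < 1; i++) {
    var result = 'inner'; // same variable as above, no new binding
  }
  return result; // 'inner'
}

// `let` is block-scoped: the inner declaration shadows the outer one
// inside the loop body only.
function withLet() {
  let result = 'outer';
  for (let i = 0; i < 1; i++) {
    let result = 'inner'; // a separate variable, scoped to the loop body
  }
  return result; // 'outer'
}
```

So a duplicate-declaration error in the video would point at `let`/`const` (or a linter rule), not `var` semantics.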
goaturojectimetin and goatimetiquits
Daniel! It would be interesting to hear your thoughts about the Library of Babel.
The Library of Babel by Jonathan Basile. "If completed, it would contain every possible combination of 1,312,000 characters, including lowercase letters, space, comma, and period. Thus, it would contain every book that ever has been written, and every book that ever could be, including every play, every song, every scientific paper, every legal decision, every constitution, every piece of scripture, and so on. At present it contains all possible pages of 3200 characters, about 10^4677 books." Really interesting algorithm.
you are sososo good
Thank you so much!
Skip to 8:41 for rap freestyle
Thanks for the video, would you be able to make a video about word embeddings?
I've started a series here: th-cam.com/play/PLRqwX-V7Uu6aQ0oh9nH8c6U1j9gCg-GdF.html (but it's not complete!)
10:13 😂😂
First of all, thanks for the video. Could you help me with computing the n-grams and then creating a hash value for the grams individually? All this in Python.
is that Java or c++ ?
I believe it's JavaScript
10:24 :D
man, can I be your friend?
JavaScript ? Really?
antidisestablishmentarianism coding
Predicted ChatGPT 🙂