this video was just excellent in every aspect, I watched several other videos but this finally helped me get it
The first video that clearly explained the multibyte codepoint. Thanks!
You are welcome.
Thank you so much for the video! I really enjoyed it
Well explained video! Back when I did my first Win32 program, I just did what you said and went with ASCII.
How many years ago are we talking about?
Quite an interesting video. Thanks for the explanation.
Great video! Thanks a lot for your work!
You are welcome.
ascii my beloved
OK, the explanation is amazing. Not too in-depth, but enough for me to understand the basics of how UTF-8 came about. Wish you could say more about why UTF-8 is better than UTF-16 :)
It is because UTF-16 wastes memory. UTF-16 will always use either 2 or 4 bytes even when 1 is enough. UTF-8 uses just one byte for the old 7-bit ASCII.
@@GreggInkCodes I see. Thank you.
I love your channel!
Thank you.
I'm trying to get å ä ö into my program. I didn't really understand why, at 12:57, line 86 uses uint32_t. Is this just how UTF characters are declared?
A codepoint can have a value of up to about 1.1 million (0x10FFFF). Using a uint32_t just makes sure I have enough space.
Great video! Wondering why you have so few subscribers ^^
Thanks, I appreciate that.
Hey bro, can you help me? How do I open a UTF-8 file?
You open it just like any other file. Of course, if you want to display its contents, you need an editor which supports unicode.
Sorry for being a year late, Mr. Professor, may I enter?
Always welcome.
@@GreggInkCodes Thank you, sir! (proceeds to enter class in sandals)
A very good channel!!
I have a question, please.
Yes?
@@GreggInkCodes Do you have an account on a social media website?
@@md_ez I have a reddit account. Every time I post a video, I notify people of the C_Programming subreddit.
(www.reddit.com/user/gregg_ink)
There are 2 main projects I currently work on. 1) This YouTube channel. 2) I am writing a sci-fi novel.
I have an account on Instagram but I use that primarily to stay in touch with fellow writers. (@greggink)
I have a Twitter account but I haven't posted in quite a while (@Gregg_Ink).
@@GreggInkCodes I have messaged you.
Your code is not working.
Can you elaborate? All code is tested before I put it in the video. You can see that I run and demonstrate the code in the video so it is clearly working. If you are going to claim something is not working, it obviously needs a bit of explaining.
@@GreggInkCodes I use Dev C++. I tried to use this code but it's not working.
@@lihuseynzad6983 Well, I don't know. Are you running it on Windows or on Linux? The code is designed for Linux; I don't think the DOS prompt supports emojis. It could be any number of things, for example maybe a setting within Dev C++. Have you tried using gcc like I demonstrate in the video?
@@GreggInkCodes Now I understand why your code is not working: because I use this code on Windows.
@@GreggInkCodes Can you help me? I want to print a Unicode character but I can't find a library for C++.