Midjourney V6.1 - Photorealistic Cinematic AI Photography Style Guide:
cihanunur.gumroad.com/l/midjourneyV6-style-guide
How do I find a Runway ML promo code? Can anyone kindly help me, please?
AWESOME! THANK YOU SO MUCH!
Damn my guy you managed to do what Runway hasn't even done with their tutorials. Great work here!
Thanks!
Tip: Please refrain from putting your text prompts at the bottom or top of the video. When pausing or rewinding to study the text prompts, TH-cam covers the text with its own overlays. Make the video full screen, turn on subtitles, pause it, and take note of where the overlays are covering your text.
Your tutorials on AI images and video production are the best I’ve seen so far. Keep up the good work; you deserve a wider audience.
This was the most inspirational opening to a tutorial video I've ever seen. I was hooked before you even taught anything. :)
💚
You're awesome bro. Love your tutorials. Easy to follow, well explained.
So important to cover the cinematic camera movements and how to apply them in prompting. Thank you again. I am still struggling with face distortions in Runway and character consistency in image to video. Will go through all your tutorials to see if I can get tips for this 🙏
Is there a user-friendly website that can generate an exact, consistent cinematic character from one reference image?
Take a look at this video: FLUX + LORA and Kling AI (Consistent Characters & AI Videos with Your Face)
th-cam.com/video/mUR8CUmDbo0/w-d-xo.html
This is EXACTLY the type of video tutorial I have been yearning for.....lol. Everything was explained in a detailed, easy to understand manner. You earned yourself a subscribe buddy! Keep up the good work.
This is amazing! I'm just getting started, but I'll be following this for sure!
We use Gen-3 Alpha all the time in our videos, and now we also finally have access to Adobe Firefly! ✨
They look wack as hell :D maybe choose another hobby
Thank you for this video, great and very, very useful info. Big fan of what you bring to the masses. Thanks!
💚
Thank you so much for this video! I’m experimenting at the moment but I’m planning to create a cartoon based on a story I wrote when I was younger ☺️. I will keep following your videos ❤
An extremely detailed guide that helps viewers fully utilize Runway Gen-3 to create professional cinematic works
This is fantastic, the best tutorial I've seen. ❤
Fantastic thank you for this video.
Very helpful. Another great video from you, thanks!
Such a cool video. Awesome work!
Thank you, you made me check out Runway ML after being away for a year or longer. This stuff you showcased is mindblowing!
good message at the end.
this was amazing!
And the script was written in part by AI too. It’s interesting how one can pick up on AI-generated phrasing, just as a certain quality to an image tells you it’s AI-generated. I think those who are pre-AI will always detect these nuances.
The best of its kind, and the last words of the video are inspiring.
💚
Excellently presented and beautiful, thank you for sharing
💚
Great work, congratulations, and thank you very much. It helped me experiment, and I liked the result.
First 🎉 Please make more of these! Thank you 🙏
AWESOME! THANK YOU SO MUCH!
You have gained a follower from the way you explain things. I want to learn more, thank you. ❤
💚
Good job!
Love your work 💖 Thank you
💚
bro your tutorials are amazing! can't thank you enough for the work you do!
💚
Inspiring video. Great Job.
Thanks! Great work!!!
Great video! One thing is not clear to me:
what are the steps before using this prompt?
Do you always start by generating an image in Midjourney and then animate it via Runway?
Yes, I always start with image generation because it lets me set the initial frame of the video, which is very important, and keep the style and character consistent. These days I’m using FLUX more than Midjourney.
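If anyone wants to script that image-first workflow instead of clicking through the web app, here is a minimal sketch against Runway's developer API using the official runwayml Python SDK. This is only my assumption of how the same two-step idea could be automated, not the setup shown in the video (the video uses FLUX/Midjourney plus the web editor, and the API is billed separately from the Unlimited plan); the model name, image URL, and prompt are placeholders.

```python
# Rough sketch: image-to-video through Runway's developer API.
# Assumes `pip install runwayml` and a RUNWAYML_API_SECRET environment variable;
# exact parameter names may differ between SDK versions.
import time

from runwayml import RunwayML

client = RunwayML()  # reads RUNWAYML_API_SECRET from the environment

# Step 1 happens elsewhere: generate the starting frame with FLUX/Midjourney
# and host it at a reachable URL (placeholder below).
start_frame_url = "https://example.com/starting-frame.png"

# Step 2: animate that frame with a cinematic camera-movement prompt.
task = client.image_to_video.create(
    model="gen3a_turbo",
    prompt_image=start_frame_url,
    prompt_text=(
        "Slow dolly-in on the subject, shallow depth of field, "
        "cinematic lighting, natural movement"
    ),
)

# Poll the task until it finishes, then print the result.
while True:
    task = client.tasks.retrieve(task.id)
    if task.status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(10)

print(task.status, getattr(task, "output", None))
```

Either way, the principle is the same as the reply above: lock the first frame with an image model, then let the video model handle only the motion.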
@@cyberjungle what pricing plan do you use in Runway? Thanks, awesome channel.
@adrianadi2779 currently using the Unlimited plan 👍🏼
Thank you for your efforts !!!
19:01 what would you call this shot where the camera flies over the walkers and focuses on the building in the distance? That was a beautiful shot!
What a wonderful lesson... WOW!!!
This is a video to review many times. Thank you 🎉
💚
You really have nailed many things, but I want to know how I can create a consistent character for my music video... I am struggling a lot with that...
th-cam.com/video/mUR8CUmDbo0/w-d-xo.htmlsi=3HHZM2Cff5kEdnkJ
Stunning, mate. Really great tutorial 👏👏👏
Wow, I need this, thank you.
Nice one. To use some of these tools, one needs to learn some basic filmmaking techniques.
Marvellous... always to be kept close on the fourth monitor!!! (+1)
AMAZING TEACHER! Great work on this bro 👏 💪 👌
Subscribed❤
💚
amazing! thanks !!!!
These videos have given me an interesting idea for my next project. Thank you.
Great video, very interesting and clear, thank you very much. A question: in Runway Gen-3, can you modify shots taken with your own camera by importing them into the program, or ask it to use them as a model for developing other scenes? And if so, is it useful to spell out the shooting parameters so that the new scenes stay consistent with the initial footage? Thanks!
Thank you ! Great work !
you just gained one sub here.
Man! I'm in love with the pirate girl... I'd give anything for her prompt!
Thank you!
Hi. Thanks for the video. If I'm making a 5-minute-long video, how do I connect the clips? Same characters, same atmosphere and surroundings?
Thanks for sharing. Awesome work!
💚
Amazing!! Thank you so much!!!!!! wouaaaaaaaaaa
I am happy to find this channel
💚
More of these please
Thank you for these tutorials... most people won't share their methods!
One question: when you create a prompt for a scene, do you specify the camera shot in the prompt for the image generation? Or do you describe the scene for the image generation and then, by looking at that image, decide which camera motion to use?
Thanks for the great tutorial!
It's impossible to do a simple STATIC shot.
Great video! Thanks. Have you created any videos that suggest the best Runway prompts to get characters to move, dance, walk, turn head, etc.? This is the biggest problem I have with Runway. I can't get my characters to move as I need them to. Do you have a guide for character movements? Suggestions? Thanks.
For character movement I would highly recommend Minimax Hailuo. I really hope that Runway Gen-4 will allow better character action and motion in the future, because it’s lacking at the moment.
What is a good plan to get on Runway and Midjourney for creating films? 🎥
All SO Amazing!! 🎉✨👏🏼💯
💚
Very instructive video, thanks! I think this might help me improve the prompts generated by Melies AI !
Thanks!
I have SO MANY questions...
So, are these shots being achieved through a specific subscription package?
Are there establishing images to aid the animation or just text to video?
I know there's a method for prompt input but how are these images being accomplished? These are some of the best I've ever seen!
Now if someone can figure out how to do a dolly zoom shot in Runway, made famous in films like Vertigo and Jaws where the background appears to zoom away from the subject, that would be great. I've tried and tried with no luck. My other "wish list" item would be to control the time it takes for Runway to zoom in on something rather than zooming continuously to the end of the clip. If there's a way to do that, I haven't found it.
Minimax can do the vertigo effect pretty well.
I was just thinking of this!!
You are great bro thanks
Make a video course on AI commercials and AI music videos, like this one. It would be of tremendous value.
The unlimited plan includes unlimited generations of Gen-1, Gen-2, Gen-3 Alpha, and Gen-3 Alpha Turbo in Explore Mode at a relaxed rate. What does that mean? How long will I have to wait for my completed video clips?
Great 👍 .
thanks - great video:)
How do you view this tutorial now that Runway has added camera control to the interface? Is it still useful to write these camera instructions, or does using Runway with the camera controls require different prompting?
Hey, Camera Controls don't work with Gen-3 (they only work with Gen-3 Turbo, which is their inferior model) and they're pretty basic. Their official recommendation is also to write a prompt if you want a complex camera motion 👍🏻
@@cyberjungle well, from all the testing I've seen, Turbo is just as good as Alpha, and sometimes better.
And those basic camera control sliders provide the exact same camera movements you instructed on in this video (pan, dolly, crane, tilt, etc.). By combining them you can almost create complex movements like an arc.
I guess that some complex movements need to be prompted (and then you need to pray that Runway will actually understand and follow the prompt), but 95% of the camera movements described in your video, and used in videos generally, can be made far more reliably using the camera controls.
My actual concern, and I need to test this further, is whether using the camera controls affects how well the characters inside the scene move. I thought you might have more experience than me on this, but I'm not sure you do. Which is fine. The camera controls are pretty new to everybody.
@ yeah, if you think “Turbo is as good as Alpha, sometimes even better” then I guess we have a huge gap between our quality standards. I just repeated what the Runway team officially wrote in the user interface of the Camera Controls page. They say if you want a complex scene, please write a prompt. Camera Controls are for simple actions. This isn’t my personal opinion. It’s the official recommendation of the Runway team. Good luck 👍🏼
great video thx
Can I use my own video and have it add things like flowers blooming or grass growing, without changing the subjects in my video?
Great images! Did you make all of these with text-to-video in Runway, or did you generate the images first? What did you use for the images?
Thank you! These are all image to video. I generated all images using Flux Realism model on Freepik 👍🏼
Thank you very much!
You're the best, bro. You're the man.
Thank you very much
I love the video; however, I am more interested in how to generate these videos, prompts and all.
thanks brava!
I need help using AI to generate a 6-episode limited series (each episode a half hour or a bit longer). Scripts are completed.
Question: can I upload a completed script to Minimax or any other AI generator? I have fully developed characters, dialogue, and recognizable foreign locations.
Am I being premature? Is AI that advanced?
I couldn’t access the Runway Gen-3 webpage on my Mac… it doesn’t open. Please, how do I access it?
app.runwayml.com/login
Sir, I've been in this space for 2 or 3 weeks now, starting with motion on Leonardo, and I'm now highly interested in starting to work on cinematics and filmmaking with AI. What is your current favorite of all the competitors? I tend toward Runway, but I'm pretty unsure, as every tool is quite costly. Which one is your current recommendation?
These days I mostly use Runway and Minimax, but it’s difficult to predict the future. Maybe I will use a different tool in 6 months because the landscape is changing so fast.
I have only one question: do you initially animate Midjourney images (image to video), or do you work from a text prompt to video? Which of these ways is used in all those beautiful examples in your video?
Image to Video (animated images)
@@cyberjungle thank you so much!!!
Thank you for such a detailed tutorial. I am just curious how many failed video generations it took to create the beautiful video scenes we see in this video?
Good question! I averaged 2-3 tries per shot.
@@cyberjungle Your precise answer is even better than my initial question 👍 Thank you 🙂 Now I know what to expect.
I constantly get flagged by runway as if I am trying to produce objectionable content for normal prompts.
You can use an editor to do the zoom... it means less complicated prompts for the AI.
Thank you!
Hello! I can't make my character blink in Runway Gen-3 Turbo. Any ideas? Thank you so much.
“Static Shot: Natural Movement” tends to give the character a natural blink within a few tries.
@@cyberjungle Thank you so much, I'll try this
Like every video, this one misses an explanation of the most important point: how does one create clips that last longer than a few seconds?
I.e., how can the last frame of Clip A become the input for the first frame of Clip B?
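One workaround, not covered in the video, so treat it as a sketch rather than the creator's method: download Clip A, grab its final frame, and upload that frame as the starting image for Clip B's image-to-video generation. A minimal Python example with OpenCV, using placeholder file names:

```python
# Sketch: extract the last frame of Clip A so it can be reused as the
# starting image for Clip B (assumes `pip install opencv-python`).
import cv2

def save_last_frame(video_path: str, image_path: str) -> None:
    cap = cv2.VideoCapture(video_path)
    if not cap.isOpened():
        raise IOError(f"Could not open {video_path}")

    # Try to jump straight to the final frame.
    frame_count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    cap.set(cv2.CAP_PROP_POS_FRAMES, max(frame_count - 1, 0))
    ok, frame = cap.read()

    if not ok:
        # Seeking is unreliable with some codecs: fall back to reading through.
        cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
        frame = None
        while True:
            ok, candidate = cap.read()
            if not ok:
                break
            frame = candidate

    cap.release()
    if frame is None:
        raise RuntimeError(f"Could not read a frame from {video_path}")
    cv2.imwrite(image_path, frame)

# Placeholder names; "last_frame.png" then becomes the image input for Clip B.
save_last_frame("clip_a.mp4", "last_frame.png")
```

Expect a visible seam even so; the model re-interprets the frame, so repeating the same camera movement, lighting, and style keywords in the Clip B prompt usually helps hide the cut.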
Masterpiece
💚
Thanks for all your hard work ❤
I don't get it. How do you guys get this much motion in your Runway videos?
Please give me an example of using Runway.
Hello, is this only possible with images, or also with videos?
Both images and videos.
@ thank you, do you have a course on Udemy?
@@adem19 Nope, only TH-cam and free content for my community.
Nice video.
In your opinion, which is better for image to video in terms of facial consistency: Kling AI or Runway?
Runway
I second that. Kling is really good for action scenes, but Runway is much better for facial consistency and all-around clarity of video.
@@cyberjungle Sorry to say, I subscribed to Runway yesterday, unlimited. My conclusion: Runway is very bad for image to video. The face comes out really different, and the scene I want never happens. I hate it because everything is in slow motion, which is not interesting. Very uninteresting. I tried again with the same prompt in Kling, and Kling gave much better results. Runway is very overrated. I am very disappointed and regret subscribing.
@@TukoCcc Yeah, I felt the same way as you did, until I actually figured out how to use it. If you follow this guy's tutorials on Runway and actually implement them, you will get MUCH, MUCH better results than anything you would have expected.
I also use Minimax and Kling. Although they are great for fast action scenes, they warp and distort more than I'd like, plus they take a long time to render. Runway is VERY fast, making your workload 10X faster.
Believe me, if you just started with Runway, you haven't even scratched the surface. Its clarity and speed still make it better than anything out there. You just have to know how to prompt and utilize camera angles and motion prompts correctly.
Are the prompts written down somewhere?
All you need is a good storyboard, and the film can be good.
Well, where can we get these stories that people are telling... I wanna check 'em out.
Google Gen:48 or any other AI film festival showcase. You will see plenty of great personal stories (and a lot of sci-fi stuff)
I have a short film script. Can someone help with AI text-to-video?