Indie Virtual Production | Real Time Key in UE4

  • Published Jan 20, 2025

Comments • 190

  • @ClassyDogFilms
    @ClassyDogFilms 4 years ago +10

    It's exciting to see your tests with virtual production, Matt. This sort of thing would have cost a fortune 10 years ago. Imagine how cool it will be when you can get one of those big LED walls!

  • @erictko85
    @erictko85 4 years ago +3

    Matt this is AWESOME. You are getting close and I can see that inspiration you were feeling. It's getting close to the point where you can make something that looks pretty damn amazing with Virtual Production and UE.

  • @darthgzuz
    @darthgzuz 4 years ago +2

    Wow ... Tech has come a long way
    Brilliant ... Now keep that AVR (Augmented Virtual Reality) as your standard background
    👏👏👏

  • @Gleebi
    @Gleebi 4 years ago +3

    Well done m8. Always difficult to try something new when there are no YouTube tutorials on the subject. Looks great.

  • @alesis_
    @alesis_ 4 years ago +1

    Great result, Matt. Thank you for your videos!

  • @airplaneian
    @airplaneian 4 years ago +3

    This is looking great! So cool to see you spin this up so quickly.

  • @latentspacex
    @latentspacex 4 years ago

    OMG this is so insane!!! This changes cg so much

  • @Dante02d12
    @Dante02d12 3 years ago

    This looks great! It makes green screens much more believable!
    If any French speakers pass through here: the JDG team most likely used this tutorial for their new green-screen system ^^. Same equipment, same software, and same level of rendering.

  • @CONTORART
    @CONTORART 4 years ago +1

    So awesome to follow your journey and see it all coming together now!

  • @racerschin
    @racerschin 2 years ago

    a good workflow would be to record the composite and the virtual camera + green screen separately, edit with the composite, and re-composite after all editing is done. So: shooting and editing with an excellent preview, while maintaining full compositing for output.
    this workflow would require rec/stop/save commands to be sent to Unreal through SDI, and auto-naming of both Unreal sequences and camera files, to help with camera assistance.
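
The auto-naming part of this workflow is easy to prototype outside the engine. A minimal sketch, assuming a hypothetical `Scene_takeNNN` naming scheme (this helper is illustrative, not part of any Unreal API):

```python
import re

def next_take_name(scene: str, existing: list[str]) -> str:
    """Return the next zero-padded take name for a scene, e.g.
    'SceneA_take003' when takes 001 and 002 already exist, so the
    camera clip and the Unreal sequence can share one name."""
    pattern = re.compile(rf"^{re.escape(scene)}_take(\d+)$")
    numbers = [int(m.group(1)) for n in existing if (m := pattern.match(n))]
    return f"{scene}_take{max(numbers, default=0) + 1:03d}"
```

The same generated name would be written to both recorders whenever the rec command goes out over SDI.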

  • @mmtv_au
    @mmtv_au 3 years ago

    Bruh. Game changer. Even now.

  • @debbiedeerproductions8204
    @debbiedeerproductions8204 3 years ago

    So awesome!! Thank you...I really loved your explanation and demo of the composite:)

  • @JLOFlix
    @JLOFlix 4 years ago +1

    FANTASTIC! Thanks SO MUCH, MATT!! TRULY motivating!

  • @leomaverick
    @leomaverick 4 years ago

    Bro, thanks for sharing your rig, it's look really amazing!! Keep posting videos about it please!

  • @JheredStern
    @JheredStern 4 years ago

    LOL, your equipment is godlike compared to mine. I am using a Canon T5I and a T6S, but I am getting similar quality results. UE4 Production is the future man. I feel like a pioneer on a new frontier.

  • @SceneOfAction
    @SceneOfAction 4 years ago +4

    This is amazing. Thank you so much for sharing your process.

  • @Jsfilmz
    @Jsfilmz 4 years ago

    Did you have to track these? I saw you didn't have any tracking markers on the green screen but had camera movements.

  • @stickwithit
    @stickwithit 4 years ago +2

    This is some next level stuff Matt! Would love to set up a rig like this at home for a static camera!

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      For static cameras it's really straightforward and really great quality. Adding camera moves makes it a bit more of a setup with the Vive etc.

    • @stickwithit
      @stickwithit 4 years ago

      @@CinematographyDatabase gonna recap your previous vids and hop on those UE tutorials when I'm back from tour! Super excited to see what I can pull off. More excited to see what you can pull off 😁

  • @VfxBlender
    @VfxBlender 4 years ago +1

    Nice you got the live key to get good results.

  • @robchenhere
    @robchenhere 3 years ago

    This is so dope! Thank you for making these videos!!

  • @peterxyz3541
    @peterxyz3541 4 years ago

    I need this in my life 👍🏼👍🏼👍🏼👍🏼👍🏼👍🏼. One person micro studio to create micro scifi short form story

  • @minakovstudio
    @minakovstudio 4 years ago +2

    Looks great. You are inspiring me to shoot a short with your techniques. Thanks!

  • @SanneBerkhuizen
    @SanneBerkhuizen 4 years ago

    Ooh. Wow.
    Maybe Ian Hubert wants to look at this! Because this is Soo cool!

  • @ChrisBraibant
    @ChrisBraibant 4 years ago +1

    Hi Matt. You do an amazing job. I am trying to do the same with a Vive and a GH5. Could you please recommend some resources? Thanks.

  • @SolomonJagwe
    @SolomonJagwe 4 years ago

    Well done!! 👏🏽👏🏽👏🏽

  • @j005u
    @j005u 4 years ago

    Hey Matt, with green screen (not a video wall obviously) if you recorded the raw camera feed and tracking data you could easily tweak the UE scene after the fact and re-render. That way, if the lighting wasn't perfect, you missed something glaring in the background during the shoot, the key wasn't perfect or whatnot, you have way more flexibility in post. Doesn't stop you from also recording from the monitor.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      Yup, a lot of people go this approach and just use the live comp as reference but then do a traditional post FX finish. I’m all about the live comp 😈

    • @j005u
      @j005u 4 years ago +1

      @@CinematographyDatabase Well yeah, isn't that basically what they did on Avatar? But what I really meant is, you can still do all your minor post adjustments to the scene and rendering in UE while keeping everything else exactly as you had it in camera. Also, you'd get to throw everything UE has in its bag of tricks at the final render by not necessarily doing it in real time. I'm talking things you'd probably never notice on the monitor but will show up on a 55" or whatnot. Anyway, I'm sure you've already thought of this as well. Just got me thinking, when you said you like recording onto the monitor...

    • @j005u
      @j005u 4 years ago +1

      So I guess what I'm trying to say is, if you also capped the raw feed and tracking data and just did your UE programming in a smart enough way, it should be trivial to go back and fix mistakes you didn't catch on set.
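
The record-everything idea in this thread boils down to logging the tracked camera transform every frame alongside the raw feed. A rough sketch of such a logger (the CSV layout is an assumption for illustration; nothing here is Unreal API):

```python
import csv

def write_tracking_csv(path: str, samples) -> None:
    """Write one row per captured frame: frame number, camera position
    (x, y, z) and rotation (pitch, yaw, roll). Replaying these rows
    through the virtual camera lets you re-render the background in
    post with the exact on-set move."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "x", "y", "z", "pitch", "yaw", "roll"])
        writer.writerows(samples)
```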

  • @cire30a
    @cire30a 4 years ago +3

    So how does it work with post? Are you recording on the URSA also, and then giving the Unreal Engine scene, tracking data, and raw footage to the editor for posting the feature film?

  • @DiegoAli.
    @DiegoAli. 4 years ago

    You are awesome Matt!!

  • @estudiorgb
    @estudiorgb 3 years ago

    Can you share the BP? I have the HTC camera tracker, but I can't get the same result as you at 9:15.

  • @JOKERSTUDIO9
    @JOKERSTUDIO9 4 years ago

    I love your work so much!! Looking forward to a new enhanced version of CineTracer with a character blocking and staging system, marking their movement.

  • @AmmonEhrisman
    @AmmonEhrisman 4 years ago +1

    SUGGESTION: If you just get the LUT you want loaded onto the Blackmagic you can send that over the SDI, you would not have to add a grade in Unreal (saving you some processing power on that computer)

  • @sijigs
    @sijigs 2 years ago

    I’m a bit late but thanks Matt. You’ve earned a sub

  • @ario6819
    @ario6819 4 years ago

    awesome as always, if you can do a vlog on tracking would be very good, I am so inspired like you, I am a screenwriter now learning to make movies virtually

  • @TerenceKearns
    @TerenceKearns 4 years ago

    An idea for lighting: get a large TV screen, place it close to the model, and send another feed of the virtual environment (taken from the opposite direction of the camera). Tune the lighting and camera for low light so that the effect of the TV is more pronounced. Now the reflection of the light from the TV on your model makes them look more immersed. It should help to counteract all that green light pollution. It's all about proximity.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      yeah we are working towards indie LED wall virtual production. I shot this demo last year, basically a mini Mando LED set - th-cam.com/video/kFBha5BE38k/w-d-xo.html

  • @racerschin
    @racerschin 2 years ago

    interesting stuff. is there a lightwrap effect in the unreal compositor? that would help a lot.

  • @brendanblondeau8631
    @brendanblondeau8631 3 years ago

    Brilliant !

  • @VJNirav
    @VJNirav 4 years ago

    amazing bro, keep it up !!

  • @zoombapup
    @zoombapup 4 years ago +2

    Question for you: Where did you get the greenscreen background? I've been searching on Amazon but couldn't find a decent one. I'm going to try a virtual setup like this (I'm an AI programmer, so want to do some realtime virtual characters with live actors interacting).

    • @zoombapup
      @zoombapup 4 years ago

      Balls, so you even had it in the damn title. Sorry for the dumbass question :)

  • @MrTegid
    @MrTegid 4 years ago

    This is incredible

  • @caferoast
    @caferoast 3 years ago +1

    Hey, really enjoyed your video and the in-depth, clear breakdown of a VP setup. I was wondering, however, if you could share how to set up just the Vive tracker + base stations without the HMD. I don't have an HMD and am having an extremely hard time setting those up with Unreal. I keep hearing they are super easy to configure but can't find any easy setup videos for that. I was hoping you could help explain that.
    Also, I tried following along with the configurations you showed in Unreal here and can't find most of the options you mentioned 🙏🏽

  • @vivektyagi6848
    @vivektyagi6848 4 years ago +1

    Really Cool

  • @AndreLLMedia
    @AndreLLMedia 4 years ago +1

    this is awesome

  • @GregCorson
    @GregCorson 4 years ago +1

    @Cinematography Database where did you get that spaceship model from? I'd like to give it a try. The Epic virtual studio models are all very bright, I'd like to try something that looks good in dim light.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      easily one of the best on the marketplace - www.unrealengine.com/marketplace/en-US/product/sci-fi-modular-environment

  • @zoombapup
    @zoombapup 4 years ago

    I was thinking of trying the new intel realsense tracking camera as a way of doing the tracking on the camera, rather than using the vive method. I've seen bigger production sets using inside-out camera tracking where they markup the room with tracking markers and then just have a camera pointed at the ceiling with a bit of software to position and orient the camera (which is basically what the intel tracking camera is doing, but doesn't use markers). Ideally I guess you'd have a rear projection screen behind the cam for environment lighting.
    Ah man, if only I had the funds I could play all day with this kind of thing.

  • @tyldarprod1399
    @tyldarprod1399 4 years ago +1

    Well done. And here we go!!! Off to HomeDepot to make that green screen frame. Big thanks to wify. :)

  • @FernandoQuevedo
    @FernandoQuevedo 4 years ago

    Thanks for sharing. Great job!

  • @iluminousvj
    @iluminousvj 4 years ago

    "Hit stop, give the media to the Downloader, who is me in my case, err"
    lol amazing work!

  • @AncLight188
    @AncLight188 1 year ago

    Hi Matt! How do you output your Vlog video from Composure? Is there a straight forward way to output the composited image from Composure to a .mp4 file? Thanks!

  • @darviniusb
    @darviniusb 4 years ago

    How is the video signal coming into UE4? Does UE work on the "raw" captured signal or does UE do fast compression? How good is the UE keyer? It looks a bit limited. Wouldn't it be better to work with an external hardware keyer that does a key on a 4:4:4 signal and then passes that to UE, or would that add too much latency? Sorry for so many questions, but I just started with this and I'm interested in having as little quality loss as possible from lens to output.

  • @hardcorerick8514
    @hardcorerick8514 2 years ago

    Is there a way to get this camera tracking but with just an Unreal camera? Composure + putting the composite plane in one place; wondering if this is possible? Love the work as always!

  • @ge2719
    @ge2719 4 years ago +7

    looks good, though a way to match the lighting in the scene better would be a big improvement.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +5

      if you check Instagram we did some testing with backlights. I'm definitely going to do videos just on matching the live action to the virtual sets. That is the fun part.

    • @HardRockChart
      @HardRockChart 4 years ago +2

      @@CinematographyDatabase I've been looking into this quite a lot... would it be possible to use a few rear projectors to build a poor man's version of the ARWALL and lighting your actors that way?

    • @GerfriedGuggi
      @GerfriedGuggi 4 years ago +2

      @@HardRockChart also looking into rear projection as a low budget solution :)

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +4

      Louis Buys yes, this is possible and I will hopefully demo it.

    • @HardRockChart
      @HardRockChart 4 years ago

      @@CinematographyDatabase excited to see your demo

  • @sinanarts
    @sinanarts 3 years ago

    Dear Matt, I have the exact setup from your early days. I need an answer, if you please 🙏🏻. I am connected to Unreal via a DeckLink card and I get a keyed preview in the viewport. I use Composure and also a Media Bundle projected on a plane; all are perfectly working. WHAT I CAN'T achieve is recording my editor viewport in UE. Is it mandatory to use an external recorder via DeckLink in order to record? I DO NOT WANT OBS or a similar screen capture. Is there a way to record inside Unreal Engine, like Take Recorder or Sequence Recorder? APPRECIATE your time and help.

  • @rotanncolyn4608
    @rotanncolyn4608 2 years ago

    Hi Matt! Hope you can help me. I have spent countless hours trying to get Composure to work, but something isn't right. The CG layer looks completely different compared to the actual scene. It is as if it doesn't read any of the post processing or lighting properly. Any advice on what I can do to fix this? Thanks

  • @SitinprettyProductions
    @SitinprettyProductions 2 years ago

    Hi Matt, love all your videos. Quick (newbie question) - how can I record the CG clean plate (or just get the tracking info) if I just want to use the live comp as an on set preview and do post later? I feel like there's a really simple answer for this, but I'm not sure what it is.

  • @marcogrob3899
    @marcogrob3899 2 years ago

    Thank YOU!! Can you make the background out of focus and play with focus in general? Best, Marco

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Yes you can, you need a way to track the real world camera focus distance however.
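
Tracking the real-world focus distance usually means calibrating an encoder on the focus ring against measured distances and interpolating between the marks. A hedged sketch of that lookup (the calibration points below are made up for illustration):

```python
def focus_distance(encoder_value: float, calibration: list[tuple[float, float]]) -> float:
    """Interpolate a focus distance (metres) from a lens-encoder
    reading, given (encoder_value, distance) pairs measured on set.
    Readings outside the calibrated range clamp to the end points."""
    pts = sorted(calibration)
    if encoder_value <= pts[0][0]:
        return pts[0][1]
    if encoder_value >= pts[-1][0]:
        return pts[-1][1]
    for (e0, d0), (e1, d1) in zip(pts, pts[1:]):
        if e0 <= encoder_value <= e1:
            t = (encoder_value - e0) / (e1 - e0)
            return d0 + t * (d1 - d0)
```

Real focus rings are quite nonlinear, so more calibration points give a better match than two.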

  • @beachcomberfilms8615
    @beachcomberfilms8615 4 years ago +1

    Gee Matt you’ve come a long way in a short time. That is unbelievably amazing. I would love to see this in person. Any plans for workshops once we are free from the pandemic? BTW I’m finally going to be able to get the RTX 2080 Ti so I can finally get to work with Cine tracer. Love your work man!!!

  • @deadmikehun
    @deadmikehun 2 years ago

    This video was the push I needed that made me want to learn virtual production! I can now send a live video feed to a level, but I'm struggling with two things. Is there a way to bind a hotkey that displays and hides an image or element relevant to what the actor is talking about in a live situation? Also, how can I record the final outcome of a live feed on PC? I don't have a URSA and SDI out :( Thanks!!!!

  • @Im_Derivative
    @Im_Derivative 3 years ago

    If I wanna do this live on Twitch or something, is there a way to do it without the live feed ending every time I try to play the level? I want to utilize some spider AI I coded, and their animations, but can't because this method only seems to work in the editor.

  • @DamianAronidisTV
    @DamianAronidisTV 3 years ago

    Hello Matt, thank you for the details on the video. I have a question. Right now I am shooting green screen on a much higher shutter speed than my frame rate requires for perfect motion blur, to be able to have a better keying in post. While live compositing is the way to go, what happens if I really want to add back the motion blur I sacrifice for the green screen? Match the shutter speeds of the real and virtual camera to match the green screen, record my scene and then add motion blur in post, or can it be done on the output comp from the engine? Thank you!

  • @HandsomeDragon
    @HandsomeDragon 2 years ago

    Hello, I'm working with Composure and the cg_layer color is way off; it's bad and not as good quality as the CineCameraActor. Have you come across this issue where Composure's view is different from your CineCameraActor?

  • @tomdchi12
    @tomdchi12 4 years ago +1

    Is UE calculating the lighting in real time, or is the set "baked" lighting? Either way, looks like a fantastic start for a setup based on mostly "consumer" gear! You are really opening the door for a lot of productions!

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      This scene has baked lighting. However with RTX and SSGI fully dynamic lighting has very high quality and you only need to hit 24 FPS in this case.

  • @Kaiyes_
    @Kaiyes_ 4 years ago

    Hi Matt, Looks great ! Can't wait to see how it connects with cine tracer. I have a request though. Could you please do a traditional newsroom setting that looks pro ?

  • @EanMartinTays
    @EanMartinTays 4 years ago +1

    I know it takes a little longer, but would it increase quality to record the foreground and background separately and then key in the background in post? I was even thinking that you could record the background on a 4K Blackmagic Video Assist so that it's all Blackmagic RAW.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      This is definitely possible, and you can record the CG background, matte, and 3D camera data all together for a traditional post finish. I primarily focus on the live workflow because it evolves into projectors and LED walls.

    • @EanMartinTays
      @EanMartinTays 4 years ago

      I'm working to move in the same direction, so I'm glad you told me that before I set up my studio with the wrong workflow in mind. Love your work and excited to join in with Virtual Production.

  • @ameenmo360
    @ameenmo360 2 years ago

    Hi Matt. If I want to buy a video camera compatible with Unreal Engine, to connect and shoot with chroma input directly after creating the 3D scene,
    what type of camera do you recommend that is compatible with the fifth version?

  • @Marc-hy8cf
    @Marc-hy8cf 4 years ago

    Great videos, love it!! Question for you though. I don't have an HTC Vive setup (yet... covid $$ issues lol), so I'm trying to achieve similar results by using the iPhone as a tracker. Even though I match my CineCameraActor settings to my DSLR, I find they don't quite match in Unreal. You don't seem to be having this issue with the Vive trackers. What "information" do they give back? Transform? World location? Distance between cam and tracker? Thx!
    Marc.

  • @deeeny_
    @deeeny_ 3 years ago

    Hey man, this is amazing, thank you very much. Just one question: once you have everything set up, how do you record it? I mean, to have the rendered video of what you filmed in real time. Thanks in advance!!

  • @TylerMatthewHarris
    @TylerMatthewHarris 4 years ago

    If you export tracking points from AE's lockdown plugin into Maya or blender you might be able to improve the lighting situation. Have each point drive a vertex on a 2d cutout in 3d space and use a 360 photo-sphere to light the mesh then re composite.

    • @TylerMatthewHarris
      @TylerMatthewHarris 4 years ago

      aescripts.com/lockdown/ , actually it looks like it'll generate a 3D mesh for you... I'd have killed for this a few years ago. PFTrack used to be the only thing that could do it... pretty sure this is just what you need

  • @rutchjohnson
    @rutchjohnson 4 years ago +1

    So this is awesome! But how do you think we can merge and sync the focus pulling between virtual and real cameras?

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      With high-end hardware/software systems (MoSys, Ncam, Stype) it's possible. At the indie level the community is looking for/inventing solutions.

  • @eliteartisan6733
    @eliteartisan6733 1 year ago

    Please could you tell me or point to a link where i can create a material "Andy's Tutorial"?

  • @seefoodeatrice4608
    @seefoodeatrice4608 4 years ago

    compmaterial Do you know how to make this?8:30

  • @antonpolinski
    @antonpolinski 4 years ago

    Great setup, I'm gonna try to build this in the future. Are you planning to connect your setup to Cine Designer?

  • @AdrianooElias
    @AdrianooElias 3 years ago

    Nice! Which graphics card was used?

  • @hunterjacksonmedia
    @hunterjacksonmedia 3 years ago

    Man wish there was a PDF somewhere on how to set this up for all cameras.

  • @ahmadsyauki
    @ahmadsyauki 4 years ago

    Is the Vive tracking ready for broadcast production? Does the Vive make tiny random movements while tracking?

  • @poyodiazmusic
    @poyodiazmusic 2 years ago

    This is great, but how can you record the video composited in Unreal, for example?

  • @fredremotion
    @fredremotion 4 years ago

    Amazing thanks so much!

  • @patrickbeery9405
    @patrickbeery9405 4 years ago

    Thanks for sharing this. Been working with the AJA and media bundles. They work, but I really want to use some of the more advanced keying features in Composure. I understand how Composure works but don't want the media plate to be attached to the camera. My question is: how can I take the media plate in Composure and add it to a plane that I place inside Unreal, like the Blackmagic or AJA bundle? Have read that 4.25 added some new features to Composure and am wondering if this would be one of them.

    • @Jsfilmz
      @Jsfilmz 4 years ago

      hahaha I asked him the same exact question. I found a tedious way to do it, but it's not live footage like his example here. Don't use Composure to key the footage. Just use a material to key your footage, then apply it to a plane.

    • @patrickbeery9405
      @patrickbeery9405 4 years ago

      @@Jsfilmz That's what we have been doing with the AJA bundle; read something about new Composure plates. AJA's keyer isn't the best and I'd love to use the Composure keyer. If you have a written link to the tedious way, I'd love to check it out.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      you use the Composite Plate / Image Plane plugin in UE4.25 to put the comp on a 2d plane to composite in world space.

  • @Danlandoni
    @Danlandoni 4 years ago

    Question! Is it possible to monitor the composite in real time, record the camera internally and UE separately, then do the composite later in an NLE? For example... record everything, export the recorded UE sequence, bring it into Resolve, insert the green screen camera file and key in Resolve, and match the timecode to overlay the camera on the UE sequence? This would keep my Alexa file at the highest quality (ProRes 4444), and give greater control over the captured image for grading.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      Yes. On top of that, you can just record the camera tracking data, add whatever you need in 3D later, render out the UE4 images at 4K etc., and do the comp later in whatever app you want.
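
Lining the camera clip up with the rendered UE sequence in the NLE comes down to matching timecode. A small sketch of the conversion, assuming non-drop-frame timecode:

```python
def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert a non-drop-frame timecode 'HH:MM:SS:FF' into an
    absolute frame count so two recordings can be aligned."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f
```

Two clips whose timecodes convert to the same frame count start at the same instant on the timeline.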

  • @andreic048
    @andreic048 4 years ago

    You mention 2.0 base stations in the description, but would this work with 1.0 base stations as well? The htc website says the vivetracker works with the 1.0 base stations, but not sure about this specific use case.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      I don’t know the specifics but I believe the 2.0 stations can be combined to make bigger tracked volumes easier than the 1.0. In the case of virtual production you want the best quality tracking volume as possible or the whole thing jitters. A perfectly setup space and perfect 1.0 setup would probably work, but most indie spaces have issues that are overcome with more stations, which means 2.0 stations.

    • @andreic048
      @andreic048 4 years ago

      @@CinematographyDatabase thanks

  • @imiy
    @imiy 3 years ago

    Been spending a few days on it already, but haven't found info on how to place already-keyed footage of a person in a UE 3D environment to make some simple camera moves and render it out. All the tutorials are for live real-time production with green. So frustrating...

  • @Musicientn
    @Musicientn 4 years ago

    Hi, great video, thanks! Which computer configuration is necessary to run such a setup?

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      At this current level, ideally a PC workstation with a modern high-end CPU, 64GB+ RAM, and a 2080 Ti. As this scales up, it requires more high-end PC hardware, which we will be covering.

  • @8088NET
    @8088NET 4 years ago

    Brutal!!!!! Thank you

  • @sada16dec
    @sada16dec 4 years ago

    I am trying to put a foreground plate over the media plate and added a simple cube to the foreground layer; it shows a black background along with the cube... I already changed "alpha channel" to "linear space color only". Any suggestions if I am missing anything?

  • @sultanaraboughly
    @sultanaraboughly 4 years ago

    Man...this is genius

  • @timdelatorre
    @timdelatorre 4 years ago

    Is there a way to sync camera aperture value and shutter angle w/ background blur and motion? I imagine this is another limitation of the green screen vs an LED wall. Is there a way to get accurate dof based off of lens focal length, aperture values, and focus points? I can't imagine that kind of data is output by cameras and available to Unreal engine to calculate.

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      It is possible to sync the real world focus/iris/zoom/fov/chromatic aberrations/distortion of real world lenses and match them in Unreal Engine. LED walls have their own challenges with matching, but they are different than green screen for sure.

  • @estevesbb
    @estevesbb 4 years ago

    How are you setting up the tracker distance to the lens? Are you just estimating the distance? Because there's parallax between the camera position and the tracker position and it's not looking good atm for me

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      lens calibration isn't setup yet in this demo. In the future it will be.

  • @lukout
    @lukout 4 years ago

    Pretty cool! Amazing! I am learning a little bit of Unreal. Can you use any digital camera, like for example a GH4?

  • @Sanjay_jit
    @Sanjay_jit 4 years ago +1

    Really cool. I just ordered a Roko because all of this virtual production is too next level. Have you had any luck trying to get some sort of lightwrap going for real-time keying?

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      I haven't tried light wrap yet, but I heard something about it. I'll show it if I get it up and running.

  • @minarimon3106
    @minarimon3106 1 year ago

    What's the minimum size of space required for a virtual production LED wall?

  • @ottogarza
    @ottogarza 4 years ago

    Matt, do you know if the BM Video Assist can record BRAW from the URSA Mini 4.6K? Not the Pro, but the first 4.6K. Hope you or your tech guy can answer; that would be really, really helpful.
    Thanks for your knowledge.

  • @felipeaguirrebengoa6191
    @felipeaguirrebengoa6191 4 years ago

    Hello there! Excellent work you do, and the tutorials and all your videos. Congratulations!
    I wanted to ask the following: what kind of PC hardware (video card, processor, RAM, etc.) is needed for this type of work? Thank you very much in advance, and I look forward to your reply.

  • @KekLuck
    @KekLuck 4 years ago

    Did you run into any lens distortion issues?
    Also how accurate would you say is the measurement of real space to unreal space. Has it been 100% accurate for you?

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      R3ZOfficial Currently I do no lens mapping or correction, which makes the tracking pretty indie/bad compared to a pro tracking solution. However, we will be looking at different techniques for this, both indie and high end.

    • @KekLuck
      @KekLuck 4 years ago

      @@CinematographyDatabase very cool to hear! Everything I found so far requires programming knowledge or is outdated. Excited to see how you continue!
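
For reference, the lens mapping discussed in this thread usually means estimating radial distortion coefficients in a calibration pass and applying a Brown-Conrady style model. A minimal sketch of the forward distortion step (the coefficients here are illustrative, not from any real lens):

```python
def distort(x: float, y: float, k1: float, k2: float) -> tuple[float, float]:
    """Apply Brown-Conrady radial distortion to a normalized image
    coordinate (x, y). k1 and k2 come from lens calibration; matching
    them in the virtual camera keeps CG and plate aligned toward the
    frame edges, where straight lines visibly bend."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```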

  • @jono0202
    @jono0202 4 years ago +8

    Matt, can you make a tutorial on live streaming with a chroma key, but with camera movement like the last shots, where you move the camera and the background moves too? Pleeeaaasssseeee

  • @Enzait
    @Enzait 4 years ago

    Nice!

  • @pawaalfilms
    @pawaalfilms 4 years ago +1

    How can we start a small virtual production? Any simple DSLR tutorial please

  • @DavidCrossIN2U
    @DavidCrossIN2U 3 years ago

    This is fantastic. I would love to learn how to do this. So, to be clear this is UE4 Composure that you are using?

  • @dyervisuals9963
    @dyervisuals9963 4 years ago +1

    Is it possible to have a static physical camera (so no tracking info) into unreal engine and move a virtual camera around in unreal engine?

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago +1

      yes, you put the video on a plane and then move the UE4 camera around. The illusion is pretty broken if you try to pan or tilt, but if the camera stays perpendicular it looks OK. Someone in the Facebook group is using this technique and fooled me at first into thinking it was a 3D track.
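
For the video-on-a-plane trick above, the plane has to be sized to fill the virtual camera's frame at its distance. The geometry is just width = 2 * d * tan(hfov / 2); a quick sketch (the helper name and 16:9 default are assumptions):

```python
import math

def card_size(distance: float, hfov_deg: float, aspect: float = 16 / 9) -> tuple[float, float]:
    """Width and height a video plane must have to exactly fill a
    camera with the given horizontal field of view at the given
    distance (result in the same units as the distance)."""
    width = 2.0 * distance * math.tan(math.radians(hfov_deg) / 2.0)
    return width, width / aspect
```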

  • @sinanarts
    @sinanarts 4 years ago

    Never got a reply, but let's try once more: I do not see the color difference material option. Any idea? Appreciate it.

  • @Shagrake1
    @Shagrake1 4 years ago

    Hey Matt, I've been following along with your experiments and it's very exciting to see your results. I have a question: as a Unity dev I'm not that familiar with Unreal, but is there a way to add a trash matte to a wide shot in this setup? The talent would still be within the key, but around that, possibly an RGB green layer to mask the rest of the studio? Thanks and keep up the great work!

    • @CinematographyDatabase
      @CinematographyDatabase 4 years ago

      I haven’t done it yet, but you can add 3D garbage mattes for that effect yes.

    • @Shagrake1
      @Shagrake1 4 years ago

      @@CinematographyDatabase Thanks! Cheers!

  • @arnoldwinford1096
    @arnoldwinford1096 3 years ago

    Hi, if I use Take Recorder to record the project, how do I set it up? Thank you so much