New Tech! How I made PRO Level VFX in a BARN!

  • Published on Nov 16, 2024

Comments • 97

  • @CompositingAcademy
    @CompositingAcademy  7 months ago +8

    Thanks for watching and sharing! Let me know what you guys thought about this as well - I'm curious to hear others' thoughts! If you want to learn compositing, check out our course below:
    www.compositingacademy.com/nuke-compositing-career-starter-bundle

  • @focusmedia2465
    @focusmedia2465 7 months ago +3

    This is really exciting and gives me hope of getting into compositing footage. I love that you can see your scene in real time - what an absolute game changer. Thank you for all of the work you put into this video. The scene you created was so amazing and detailed as well; very impressive!

  • @dakshpandya6559
    @dakshpandya6559 7 months ago +3

    I did the exact same setup last year. Connected my iPhone with Live Link camera and attached it over my DSLR. So my DSLR shot live action while the iPhone captured Unreal footage. Combined together, they gave an amazingly well-made product.

  • @TheDavidTurk
    @TheDavidTurk 7 months ago +9

    This may be one of the best VFX breakdowns I've ever seen!! So awesome to see how you made this using so many different techniques and tools! BRAVO!!!!

  • @JoshuaMKerr
    @JoshuaMKerr 7 months ago +5

    Great work on this breakdown, Alex. I'm so impressed by what you managed to achieve. It's an amazing system, isn't it?

  • @snehalkm
    @snehalkm 7 months ago +6

    This is really great Alex, stuff we really don't learn even while working in VFX. Super proud and happy to see you, from MPC, here on YouTube. Keep making. Just making this made me fall in love with movies again. Thank you so much and good luck for the rest of the videos. Looking forward to them.

  • @buddyfx7026
    @buddyfx7026 7 months ago +3

    Yea this was great, thanks so much for the thoughtful explanation of the entire process. Instant sub!

  • @lucywallace
    @lucywallace 7 months ago +4

    This is really amazing! Thanks so much as always Alex for sharing such interesting and exciting VFX techniques 😊 Would definitely love to try out this virtual production workflow.

  • @whypee07
    @whypee07 7 months ago +1

    Well, this video has motivated me once again after all the burnout inflicted by the writers' strike and the extra pressure of working and learning side by side. So good to see that with such a small team and so few assets we can create such a stunning shot.

    • @eliotmack
      @eliotmack 7 months ago +4

      This feeling of 'I can do this' is what I most wanted to make happen when building Jetset. It means a lot to me that it gave you that!

  • @FX_AnimationStudios
    @FX_AnimationStudios 7 months ago +3

    super awesome!! really nice seeing the BTS

  • @johnwoods9380
    @johnwoods9380 7 months ago +1

    Really great advice for people who have a friend who owns an ABANDONED BARN. Lack of physical space is the toughest obstacle to my plans. Can't even use my garage, because it's full of someone else's stuff. I'll call you back when I find my barn

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +2

      When there's a will there's a way! Worst-case scenario, hang up a greenscreen on a calm, non-windy night and do it outside somewhere. There's always a workaround; the barn was lucky, but we didn't even plan to use it originally. Also it was quite cold and batteries died a lot, so workarounds come with their own problems. My belief, though, is that constraints create creativity.

  • @VFXCentral
    @VFXCentral 7 months ago +2

    Absolutely amazing!!

  • @pietromaximoff4365
    @pietromaximoff4365 7 months ago +2

    Thank you - amazing video and tutorials!

  • @ChronicleMove
    @ChronicleMove 7 months ago +3

    Cool! It would be interesting to hear what difficulties and limitations your team encountered when using this pipeline.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +2

      Detailed in a few of the other responses. Mainly, if you want to refine one of the real-time tracks there's a workflow they've developed, and it ended up working pretty well. That was probably the biggest thing we worked together to figure out, but they're tooling up a gizmo that essentially does the refinement workflow in either Nuke or SynthEyes.
      Another hurdle was figuring out the lighting - the app can load in a model but not lighting (they're adding this feature very soon though, if you have a workstation on set). Mainly I just screenshotted the Blender scene from the various angles I knew I would be shooting, and moved the lights around accordingly. We were in a barn in cold temperatures so I wouldn't bring a workstation there - but I can imagine this workflow will be insane once you have Unreal Engine live-linked into your viewfinder. They're also adding the ability to stream an Eevee viewport into the phone if you want to use Blender instead of Unreal. It gives you a great idea of the lighting and how to match it for the composite.
      Another factor is that you need enough light for the iPhone to see features and stay stable, so I imagine this wouldn't work in pure darkness - but even this was shot back-lit, so I think you just need to plan accordingly. I was already stretching it here and it worked.
      They have some other features they're updating as well. Currently there's a feature called 'infinite greenscreen' which essentially garbage-mattes out the edges that aren't greenscreen. Right now it uses auto-detection, but on an uneven greenscreen it didn't work as well, so they're going to change the approach to just snapping corners and then garbage-matting the outside away. This is nice to have, but I still had no problem shooting the scene since probably 80% of it was in the greenscreen area.
      For orienting the scene they also have a printed marker you can use. This is really useful for flipping the scene around, etc., without having to mess around with positioning the origin by hand in the app. Basically you just aim the app at a piece of paper with a QR code, and it orients the scene to that marker.
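
      A minimal sketch of the corner-snapping garbage matte described above - not Jetset's actual code, which isn't public. Python with OpenCV; the file name and corner coordinates are placeholder assumptions:

        # Take the four tapped greenscreen corners, fill that quad, and
        # garbage-matte everything outside it. Placeholder inputs throughout.
        import cv2
        import numpy as np

        frame = cv2.imread("plate_frame.jpg")      # hypothetical plate frame
        h, w = frame.shape[:2]

        # The four screen corners in pixels (assumed values), clockwise.
        corners = np.array([[180, 90], [1750, 120], [1700, 1000], [210, 980]],
                           dtype=np.int32)

        mask = np.zeros((h, w), dtype=np.uint8)
        cv2.fillConvexPoly(mask, corners, 255)          # inside the quad = keep
        out = cv2.bitwise_and(frame, frame, mask=mask)  # outside goes black
        cv2.imwrite("garbage_matted.jpg", out)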

  • @VFXforfilm
    @VFXforfilm 7 months ago +1

    Looks cold in that barn.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      It was terrible - batteries kept dying, haha

  • @moisesdelcastillo6703
    @moisesdelcastillo6703 7 months ago +1

    Amazing, thanks! Looking forward to more.

  • @NirmalVfxArtist
    @NirmalVfxArtist 7 months ago +1

    Fantastic! Manipulating light to our advantage in terms of saving cost and time is something rare these days!

  • @JasonKey3D
    @JasonKey3D 7 months ago +1

    Nice EmberGen info @ 4:00 👍

  • @SeongGyu_
    @SeongGyu_ 5 months ago

    Hello, Alex!
    Can you tell me the name of the warm-colored light in the video? Or can you tell me the Kelvin number? (The light on the right at around 1:44!)

  • @JuJu-he8ck
    @JuJu-he8ck 7 months ago

    the shot works great. boom and bam.

  • @DEADIKATED
    @DEADIKATED 7 months ago

    Awesome! I'm actually trying to figure out a solution for a complicated green screen shot at the moment. Very inspiring. I subbed.

  • @GabrielMendezC
    @GabrielMendezC 7 months ago +1

    For an aspiring VFX artist such as myself, this is really awesome content to learn from. Thanks Alex! 🙌

  • @ocdvfx
    @ocdvfx 7 months ago +1

    Leveling up!

  • @ninjanolan6328
    @ninjanolan6328 7 months ago +1

    Ian Hubert has been doing things like this for over a decade

  • @SeongGyu_
    @SeongGyu_ 5 months ago

    Can you tell me the product name of the monitor you use? And could you upload a video on the monitor topic?

  • @malcolmfrench
    @malcolmfrench 7 months ago +2

    Game is changing! GREAT job Alex

  • @marktech2378
    @marktech2378 7 months ago +1

    Nice work 👌👌

  • @AlejandroGarciaMontionargonaut
    @AlejandroGarciaMontionargonaut 6 months ago

    Great workflow, quick question: why are green tracking points used on this green screen? Isn't it better to use a different colour, or is it because of this workflow?

    • @CompositingAcademy
      @CompositingAcademy  6 months ago +1

      In this case the green markers are used because if the character passes in front of them, they can still easily be keyed out - but at the same time there's enough contrast to be able to track the pattern.
      I believe the phone app tracks better if there are features as well - but mainly I put them there in case I wanted to track one of the shots in Nuke afterwards with a more refined track.
      In darker lighting conditions sometimes pink tape is used, because it's very bright and creates a lot of contrast against the green. However, this is not keyable, so if an actor walks past it you'll have to paint or rotoscope out the marker.
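
      To illustrate why the green markers fall out in the key (a minimal sketch, not the actual keyer used on this shoot): a basic green-difference matte, g - max(r, b), stays positive for any greenish pixel, so both the screen and darker green tape read as screen, while pink tape drops to zero and has to be painted or rotoed out:

        # Screen-difference matte: 1.0 where a pixel reads as greenscreen.
        import numpy as np

        def screen_matte(rgb):
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            return np.clip((g - np.maximum(r, b)) * 4.0, 0.0, 1.0)  # gain is arbitrary

        samples = {
            "lit greenscreen":   np.array([0.10, 0.60, 0.12]),
            "darker green tape": np.array([0.05, 0.35, 0.06]),  # trackable contrast
            "pink tape":         np.array([0.90, 0.20, 0.55]),  # bright, not keyable
        }
        for name, px in samples.items():
            print(name, screen_matte(px))  # greens -> 1.0, pink -> 0.0 (foreground)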

  • @Gireshahid33
    @Gireshahid33 7 months ago +1

    Amazing

  • @2artb
    @2artb 7 months ago

    Nice work n vid thanks!

  • @LFPAnimations
    @LFPAnimations 7 months ago

    How accurate is the track output of Jetset? Would you still have to do a normal 3D track in post, or can you use the app's track for final pixel?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      The tracks are pretty good. For the hips-up, out-of-focus shot, I ended up just using the real-time track out of the box.
      For the one where the feet are really prominently featured, I wanted sub-pixel accuracy and wanted to refine it.
      I worked with them to figure out a workflow that essentially “snaps” (orients/scales) any post track you do to the real-time camera.
      When you do a Nuke track normally it’s not at real-world scale and it’s not oriented to your CG set at all, so essentially the “refined” track workflow is: do your track in post, then hit a Python-script button to “snap” that camera to the real-time camera, where we know the scale and orientation are good in world space.
      They’re working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well. Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you’ll probably have some shots where the real-time track works as-is, so you can literally just start rendering / compositing.
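
      Lightcraft's snap script itself isn't public, but the core of such a "snap" is a standard similarity alignment (Umeyama/Horn): solve for the scale, rotation, and translation that best map the post-solve camera path onto the real-time path, then apply them. A minimal NumPy sketch with hypothetical input files:

        import numpy as np

        def similarity_align(src, dst):
            """Least-squares s, R, t with dst ~= s * R @ src_i + t (Umeyama)."""
            mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
            X, Y = src - mu_s, dst - mu_d
            U, S, Vt = np.linalg.svd(Y.T @ X / len(src))
            D = np.eye(3)
            if np.linalg.det(U) * np.linalg.det(Vt) < 0:   # avoid a mirrored solve
                D[2, 2] = -1.0
            R = U @ D @ Vt
            s = np.trace(np.diag(S) @ D) / ((X ** 2).sum() / len(src))
            t = mu_d - s * (R @ mu_s)
            return s, R, t

        # Hypothetical per-frame camera positions exported from each track:
        post = np.loadtxt("nuke_solve_positions.txt")  # arbitrary scale/orientation
        live = np.loadtxt("jetset_positions.txt")      # world scale, scene-aligned
        s, R, t = similarity_align(post, live)
        snapped = s * (post @ R.T) + t   # post path, now in world space
        # The same R also re-orients the solved camera's rotation per frame.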

  • @Ricoxemani
    @Ricoxemani 7 months ago

    This is really cool. Only thing holding me back from being able to do this is not having a huge barn to shoot in.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      Outdoors is a good workaround too! Hang up a greenscreen on the back of a garage or any wall, and shoot at night if you need to control the lighting.

  • @otisfuse
    @otisfuse 7 months ago +2

    WOW, the revolution has begun.

  • @ryanansen
    @ryanansen 7 months ago

    How accurate/usable is the track that you get from this workflow? Is it something just good enough for look dev or do you find it good enough/comparable to a track you might be able to solve out of something like Syntheyes?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      From the tests I did, sometimes it was good enough for a final track; in other cases I wanted to refine it, especially when the feet were very prominent.
      They have a refined-tracking workflow if you want to use SynthEyes or Nuke - it snaps the post track to the real-time track. In this way it saves you time on orienting / scaling / positioning the new camera in world space.

  • @WhereInTheWorldIsGinaVee
    @WhereInTheWorldIsGinaVee 7 months ago

    It looks like you were using a gimbal with the camera... did that cause any problems with Lightcraft Jetset? Could you use it with a DJI Ronin?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      Nope, no problem - this combination was awesome! I balanced the gimbal with the iPhone and attachment on top. This setup works great with the gimbal. This is with the Ronin RS 3.
      You basically check the iPhone to see your CG, but your final image comes out of the Ninja V. Obviously it's two different cameras so the view is slightly different, but Jetset gives you a super clear idea of how you're framed up against your CG with your actor.

  • @jimmahbee
    @jimmahbee 7 months ago +1

    Awesome

  • @JungNguyen09
    @JungNguyen09 7 months ago

    Are courses n101 - 104 currently discounted? I really want to buy this course 😌

  • @tomcattermole1844
    @tomcattermole1844 7 months ago +8

    I swear this tech didn’t exist a couple of months ago; I was trawling the internet for solutions and couldn’t find anything. Ended up having to compile 3 After Effects camera tracks across one 2-minute clip :I

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +5

      Oh yeah, for long camera tracks this is going to be really interesting - I didn’t even think about that. It was also interesting that I could use the real-time track on an out-of-focus background shot; those shots are usually much harder to track.

    • @eliotmack
      @eliotmack 7 months ago +2

      You're correct! We only introduced this in February at the British Society of Cinematographers' show, so it's very new. We like long camera tracks!

    • @tomcattermole1844
      @tomcattermole1844 7 months ago +2

      @@eliotmack Kudos. I was scratching my head trying to figure out what piece of the puzzle was missing to make something like this possible because it felt like all the hardware you'd need can be found in a modern phone anyway. As someone who wants to push concepts as much as possible with smaller crews/budgets this is going to be nothing short of a life saver.

    • @AstroMelodyTV
      @AstroMelodyTV 7 months ago

      @@tomcattermole1844 I’m actually working on a grad project right now where I’m going to have to do a lot of camera tracking. Do you have any pointers on how I could use this method with just the iPhone? Or should I try to get a cine camera as well 😅

  • @SHVWNCOOPER
    @SHVWNCOOPER 7 months ago

    This is exactly what I need. Now if you have Unreal Engine tutorials, I'm subbing lol

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      Later this year Unreal will come into the picture.

  • @themightyflog
    @themightyflog 5 months ago

    Which Accsoon SeeMo can I use?

  • @WastedTalentStudios
    @WastedTalentStudios 5 months ago

    So do you create the world first and then see it on your iPhone, or how does that work?

    • @CompositingAcademy
      @CompositingAcademy  5 months ago

      Yes, exactly - you can buy a kitbash set from something like Big Medium Small, and you place those objects in a 3D scene. Then you essentially load that into the app and it will appear on top of the real world.

  • @StudioWerkz
    @StudioWerkz 7 months ago

    Which Accsoon SeeMo can be used? I see a regular, a Pro, and a 4K version.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      All of them. The standard SeeMo is HDMI and the Pro is both HDMI and SDI, but they work the same.

  • @iAmBradfordHill
    @iAmBradfordHill 7 months ago +1

    Great video! The final shots look awesome! Love that the toxic goo was practical. How solid was the tracking data out of JetSet? Did you have to clean up or re-track or was it sufficiently accurate?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      The tracks are pretty good. For the hips-up, out-of-focus shot, I ended up just using the real-time track out of the box.
      For the one where the feet are really prominently featured, I wanted sub-pixel accuracy and wanted to refine it.
      I worked with them to figure out a workflow that essentially “snaps” (orients/scales) any post track you do to the real-time camera.
      When you do a Nuke track normally it’s not at real-world scale and it’s not oriented to your CG set at all, so essentially the “refined” track workflow is: do your track in post, then hit a Python-script button to “snap” that camera to the real-time camera, where we know the scale and orientation are good in world space.
      They’re working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well. Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you’ll probably have some shots where the real-time track works as-is, so you can literally just start rendering / compositing.

    • @iAmBradfordHill
      @iAmBradfordHill 7 months ago

      @@CompositingAcademy Thanks for the insight into this! That's great to hear that you could use the real-time tracking for several shots. I was curious about the idea of somehow using the real-time track to refine or orient the post track. When you mentioned scanning the set to have a model of it in post, my mind went to the SynthEyes tool that uses photogrammetry to make a model to improve tracking. Sounds like this workflow is something similar. Very cool! I can't wait to use this app and workflow myself. Hoping to shoot a project with it sometime this year.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      Very similar to that! It's especially useful if you have more takes. Imagine if you had 10 camera angles pointing at a CG scene from different positions - aligning and scaling all of those would normally be a painstaking process.
      Interestingly, they also export some extra stuff for post workflows, like an AI matte that you can use to garbage out the post tracker if you want to (basically to roto out the moving person, etc.).
      What's also interesting is this can be used for CG filmmaking, or for CG objects placed into real scenes. I didn't go into it on this project, but those are also possible here.

    • @eliotmack
      @eliotmack 7 months ago +2

      @@iAmBradfordHill Those are exactly the methods we're setting up for Nuke and SynthEyes. For Nuke we can re-orient a solved track to match the original scale & location of the Jetset track, and solve the usual alignment & scale problems encountered with single-camera post tracking.
      SynthEyes will be extremely interesting, as we can 'seed' the track and then tell SynthEyes to refine it with a variety of tools (soft and hard axis locks, survey points to 3D geo, etc.).
      The Jetset live tracks are good enough that we want to use them as a base for a final subpixel solve when the shot demands it.

    • @iAmBradfordHill
      @iAmBradfordHill 7 months ago +1

      @@eliotmack That sounds like the best workflow to me. Take your live track and refine it, instead of having to start all over. All that data is invaluable; even if it isn’t sub-pixel, I would think it has to be helpful when refining to get sub-pixel accuracy. I’ll keep an eye out for that SynthEyes update. I really want to get out and play with Jetset myself!

  • @rossdanielart
    @rossdanielart 7 months ago

    Is the ProRes RAW any good? Does it keep more data that is useful for VFX?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      Super useful for keying, and it also gives more flexibility with grading. You basically transcode the ProRes RAW into ProRes 4444 or directly into EXRs; it really helps. Also it's just less compressed overall, so everything has really crisp detail.

  • @unspecialist
    @unspecialist 6 months ago +4

    VFX supervisor here - nice video and nice concepts. But in no way is this called virtual production (an outdated term for volume production); the whole point is to avoid the green screen, as you can see by those 2 unjustified specular lights on the white bucket the actor is carrying. This is just traditional green screen work with on-set tracking data. You need the LED panels to create your light environment and volume correctly.

    • @CompositingAcademy
      @CompositingAcademy  6 months ago +4

      I would disagree that this isn’t virtual production - traditional greenscreen doesn’t allow you to see what you’re filming. Personally I think it helps clients understand what we’re doing (shooting, and seeing the result). This app / company also markets itself under that term for that reason. I don’t think LED volume companies can claim the term virtual production - although they would like to, and have poured millions into doing so.
      Also, sure, there might be a spec highlight, but these shots are impossible to do on an LED stage without compositing. There are foreground elements, a replaced floor, rack focus, a virtual camera-move extension, etc. This is why I chose this environment: it plays to the strengths of fully CG scenes in a contained space, while also costing 100x less, with arguably a better result than an LED volume.
      Personally I think LED stages are a cross-over technology to something much better - most likely virtual production seen through VR headsets on set, while the greenscreen (if you’ll even need one) is replaced live or in pre-vis.
      Smaller, more cost-effective panels would definitely be interesting for reflections. I think that’s cool. But from a first-principles and even physics standpoint, there are a lot of limitations of LED virtual production that aren’t often honestly discussed.
      Also, you can get a lot of realistic lighting without needing LED panels, which has been done for years; the only time you need LED panels is if you have many obviously reflective objects. That includes greenscreen outside - you can’t get realistic direct sunlight from LED panels.
      The best mix is actually using greenscreen projected on an LED stage, with a virtual environment around the edges, but this still makes your cost ridiculously high for an arguably diminishing return, unless you’re filming chrome characters.

    • @monarchfilmspx0955
      @monarchfilmspx0955 2 months ago +1

      Hard agree with the actual VFX supervisor here... This is just green screen work. You really think no one could see what they shot on green screen before Unreal? 😂😂😂😂

  • @violentpixelation5486
    @violentpixelation5486 7 months ago

    ❤👌💯🔥 please more #UnrealEngine #VirtualProduction

  • @DannyshRosli
    @DannyshRosli 7 months ago

    Better than LED screens, I must say

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +2

      It all depends on the lighting - you can get really bad results on greenscreen if it's lit wrong too.
      I think LED stages could be good for very reflective scenes or characters (like the Mandalorian - he's chrome), but they're limiting (and very expensive) in a number of ways that aren't often discussed.

  • @SeanAriez
    @SeanAriez 7 months ago +1

    Epic

  • @trizvfx
    @trizvfx 7 months ago

    Fuck yes Alex! Great work.

  • @aidenzacharywessley3808
    @aidenzacharywessley3808 7 months ago

    @compositingAcademy How can I become a VFX artist and a compositing artist? Can you make a video about that?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      Hey Aiden,
      The best way is to learn the fundamentals and build a demo reel to prove you have the skills to employers. If you're interested in the beginner series, we've built a really good path for beginners who want to go professional here. There's a bunch of projects included, and the footage can be used for reels as well.
      www.compositingacademy.com/nuke-compositing-career-starter-bundle
      All the best!

  • @themightyflog
    @themightyflog 7 months ago

    Any tutorials on Jetset?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +2

      Possibly! If more people are asking for it, it's something I might do.

    • @themightyflog
      @themightyflog 7 months ago +1

      @@CompositingAcademy I would love to see your lighting tutorials. For real, everyone but you looks like they're still on a green screen.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +3

      Good idea. I'll probably talk more about that in upcoming tutorials on these shots. Lighting & compositing is 100% the reason people aren't getting the results they want.
      It's also the same reason a lot of CG environments look super video-gamey: mainly, people don't know how to control contrast & light.

    • @orbytal1758
      @orbytal1758 7 months ago

      @@CompositingAcademy A tutorial on that would be amazing. It’s the one setback I have when doing anything virtual. Even in UE with Megascans it still looks like a video game, plus my compositing skills need some work.

  • @santhirabalasinthujan9170
    @santhirabalasinthujan9170 7 months ago

    Accsoon SeeMo or Accsoon SeeMo Pro - which one is good to use?

  • @SortOfEggish
    @SortOfEggish 7 months ago +1

    This is all impressive until the client says the barn limits their idea and they want to shoot something in Times Square lol

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      Yeah, you would need a bigger stage. This barn space is equivalent to a smaller greenscreen stage; it’s about the same size as a few you can rent. This workflow would still work on a wrap-around stage, though, if you need to pan more.

  • @roxanehamel1753
    @roxanehamel1753 7 months ago +1

    I have an FX3 too, but I use it with a PortKeys LH5P II and a Zhiyun Crane 4 - will that work? Do I absolutely need ProRes RAW to do virtual production like that? And can I do it with Unreal Engine and DaVinci Resolve?

    • @CompositingAcademy
      @CompositingAcademy  7 months ago +1

      You don't necessarily need ProRes RAW - it does help with getting super clean keys and small details, but if you use one of the better codecs in the internal recording on the FX3, those will still work pretty well.
      Make sure to shoot 10-bit for sure, though.
      You can use Unreal as well. Right now you can import any geometry, but they're also making a live link to see your Unreal scene live if you hook up wired to a workstation. Either way you choose to work, you can get the camera data + plate data into Unreal afterwards and it will align with your scene.

    • @roxanehamel1753
      @roxanehamel1753 7 months ago

      @@CompositingAcademy Thanks for the info. And you're using an iPhone - but can a Google Pixel 8 work too?

    • @eliotmack
      @eliotmack 7 months ago +1

      @@roxanehamel1753 Right now Jetset uses iOS devices, but even a used iPhone 12/13/14 Pro will work great. The onboard LiDAR improves the 3D tracking, and makes 3D scene scanning possible.

  • @humangameryt4729
    @humangameryt4729 2 months ago

    Any app for Android? 😢

  • @lFunGuyl
    @lFunGuyl 6 months ago +1

    You iPhone people are so annoying, but you sure get all the cool toys these days 😅

  • @kanohane
    @kanohane 7 months ago

    iPhone is trash... is there Android software? And I wish UE would add MP4 support... 😢

  • @keithtam8859
    @keithtam8859 7 months ago

    Don't know. It's subscription... and I am cheap LOL

  • @jinchoung
    @jinchoung 7 months ago

    Meh. I mean, cool results, but I don't see the value-add of Jetset.

    • @CompositingAcademy
      @CompositingAcademy  7 months ago

      It helps a lot when you're filming - framing up to things that don't exist is pretty unique. Personally, I used to do a lot of photography and I liked moving around and finding interesting angles; you can't do that traditionally, which is why I think a lot of greenscreen stuff in the past is only background / distant stuff.

  • @travelstories2529
    @travelstories2529 7 months ago +1

    How much are you earning with this setup?

  • @HTOP1982
    @HTOP1982 7 months ago +1

    Dude shoots GS and thinks this is VP.
    It's well done, but not correct...

    • @billwarner4641
      @billwarner4641 7 months ago +3

      The term "virtual production" came well before LED walls. But now LED walls has "taken over" the term virtual production. We're not ready to give up that fight. We think the Lightcraft Jetset approach is going to be so accessible to a much wider audience, that in the fullness of time, "virtual production" will go back to its original meaning which is any way to combine live action with a synthetic background.