I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :)
Can't wait to test Jetset once I finish my new green screen studio (cyclorama).
Amazing job guys!
Thanks! Post some shots when you can!
@@eliotmack Sure 👍
Amazing
Refinement tracking: do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused because I thought Lightcraft and the iPhone's accelerometer already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor the CG set in real time with live video? Thanks
The standard Jetset tracking is easily good enough for shots that don't have visible ground contact. You can see some videos done by Alden Peters on YouTube that are all straight Jetset tracking. The shots with highly visible ground contact require an additional level of precision; that's what the SynthEyes pipeline is designed to handle.
You should make more tutorials about SynthEyes.
How do you bring them over to Unreal or Blender? I tried FBX and USD and it doesn't work; it fails hard, and Unreal sometimes has no camera.
Watch closely at 17:32 -- it goes into detail on the Blender import process.
Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the LightCraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits of using the newer models with your application?
We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what is in the new hardware coming up soon.
@@eliotmack Thank you for the clear answer.
Where can I get the Overscan addon that you are using for Blender?
In this case, we're not using an overscan add-on, but manually entering the overscan sensor size calculated in SynthEyes. Much simpler.
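For anyone wondering what that manual entry amounts to: the overscanned sensor width is just the base sensor width scaled by the same ratio as the pixel dimensions. A minimal sketch (the function name and the example numbers are illustrative, not from the video or SynthEyes itself):

```python
def overscan_sensor_width(base_sensor_mm, base_width_px, overscan_width_px):
    """Scale the sensor width by the same ratio as the render pixel widths.

    Hypothetical helper: overscan rendering widens the frame, so the virtual
    sensor must widen by the identical proportion to keep the focal length
    and field of view consistent.
    """
    return base_sensor_mm * (overscan_width_px / base_width_px)

# Illustrative numbers: a 36 mm sensor rendered at 2112 px instead of 1920 px
# (10% overscan) needs a 39.6 mm sensor width.
print(overscan_sensor_width(36.0, 1920, 2112))  # → 39.6
```

In Blender you would then type that computed value into the camera's Sensor Width field, matching whatever overscan resolution you render at.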
Can you show how to do this in davinci?
I didn't see the ability in Fusion's 3D tracker to lock the solver to survey points. If you see that somewhere let me know!
@ I'm just talking about the compositing part. I use SynthEyes for my tracking as well.
@@locksleylennoxvfx Yes -- I have an upcoming tutorial on Unreal, Syntheyes, and Fusion coming to show how those pieces integrate, including the tricky overscan aspects.
@ I can't wait. The tracking being off is what made me pause pursuing your system when I was testing. This is so exciting.
Under what circumstances would you need to refine the live track?
Shots with visible CGI & live action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be high degrees of floor contact.
@@eliotmack Do you mean that on every project/shot we have to refine the 3D tracking with SynthEyes or an equivalent? I'm confused because I thought Lightcraft and the iPhone's accelerometer already did the job of 3D tracking. If that's the case, is the real benefit of Lightcraft the ability to monitor the CG set in real time with live video? Thanks
@@stephanec3436 For many shots the standard Jetset tracking is fine. All but one of the shots in th-cam.com/video/s2y2lcsL_Lk/w-d-xo.htmlsi=oOkG1VY3s8Q5wM5X are from the Jetset data. For certain shots with very visible ground contact, you may need to do tracking refinement. It's very shot-specific.
wow
✨😎😮😵😮😎👍✨