We get the Android question quite often. But we're not making a statement on what phone system you should use in your life. In fact, many customers find they don't want text notifications popping up on the screen of their virtual production system, and even if they have an iPhone, they buy a second phone for professional use. A Vive Mars requires $5,000 of dedicated hardware; that was the low end of the market before we entered. Compared to the Vive, we can handle a huge set, indoors or out, with no setup and no extra hardware. Think of the iPhone as asking customers to buy $1,200 of "dedicated" hardware. And it is amazing hardware, with LiDAR. Lightcraft started 20 years ago making dedicated tracker hardware similar to Stype and Mo-Sys; that hardware cost about $30,000 if unbundled. The iPhone matches that hardware tracker, and the LiDAR adds some amazing capability. So, keep your Android phone, and get started with virtual production for free, plus the cost of some amazing "dedicated" hardware that happens to have the Apple logo. Android phones cannot match iPhones for virtual production, so there are no plans for an Android version, I'm afraid.
Lidar is one thing, but it's the software and pipeline that make this into something useful. Cameras would need a lot more hardware capability to do this internally. The best you could hope for is for a camera's lidar functionality to output its data for use in external applications, which I don't see happening any time soon.
Hey there, great content. Thank you! Do you recommend the Accsoon SeeMo Pro over the regular SeeMo? Should I get one of these for a smoother workflow? Usually I'm using a Mars system for simple HDMI signal transmission, but for the UE Jetset solution with post-rendering the SeeMo seems necessary... Would I need anything else apart from an iPhone/iPad (I have some old devices somewhere, an iPad 8th generation and an iPhone 12 Mini)? Have you used the Tentacle system for timecode or something similar? The Lightcraft website mentions Tentacle support; that's why I'm asking. Sorry for my many questions ;-) Best wishes and keep up the great work!!!
Hi, I'm about to test the Tentacle for software genlock, so yes, it is supported. The SeeMo Pro is what you need for lens calibration. Otherwise, you just need an iPhone or iPad. I'm not totally sure which iPhones/iPads work best for Lightcraft, so I'd start there.
@@JoshuaMKerr I don't like subscriptions, but it still seems like a steal compared to what you had to spend on equipment and studio space before this app, let alone the pipeline with UE and Blender. This is a godsend.
@@JoshuaMKerr I've recently started exploring the realms of virtual production, specifically through short-throw projection and LED wall techniques, and it's been an eye-opening experience. After experimenting with static shots against large OLEDs, I've realized the vast potential these technologies hold for storytelling and scene creation. Being able to work within Unreal Engine to adjust and manipulate scenes has been particularly revelatory for me. While I'm aware that you're well-versed in the incredible capabilities and the transformative impact of virtual production, I wanted to share my enthusiasm as someone just beginning to navigate this exciting frontier. It's thrilling to envision how these tools can enhance and streamline the filmmaking process, offering unprecedented flexibility and creative freedom. I'm eagerly looking forward to deepening my understanding and harnessing these technologies to their full potential. Also, I just wanted to point out that we share the same last name 😊 My name is Brett Alexander Kerr!
This is epic, Joshua. Can I ask: can the app stream just the tracked camera data out, so we could drive a UE5 or Aximmetry tracked camera? Could that happen in almost real time?
You need way more technical info about how this works in this video, rather than a lifestyle vlog. Which camera sensors does it use on the iPhone? How do you account for the perspective shift, etc.?
I wouldn't expect that from this type of video, you're right. I may do deeper dives as I continue to produce content but, in the meantime, a lot of the info you're after is on their website.
Interesting. That could be very useful for VFX ads in sports events like F1.
Are you able to zoom in and out on the camcorder? Will the app encode the zoom data automatically, or do you need another device to encode it? And instead of a Camlink, can you use NDI as your video feed?
Well, the Camlink 4K was the physical device that brought the feed into the computer, and Unreal recognises that video capture device. I think this problem might be to do with how Unreal brings in the feed, but NDI might be worth a test. As for lens zoom and focus, you would need at least a follow focus device like a Tilta, but I believe that Jetset will support focus and zoom very soon... if not already. I will have to double check that.
I had a lot of problems with the Camlink 4K... if any HDMI cable or peripheral in the chain doesn't support 4K 60Hz, the Camlink becomes unstable... even at 1080p 24p.
Yes, it calibrates with your camera so it knows your focal length, then uses a digital slate that both the iPhone and the cine camera see. This is used to sync the footage when you bring it into the computer. The software that does the sync is called Autoshot.
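The slate idea above boils down to simple arithmetic: once you know which frame the slate event lands on in each recording, you can compute the time offset between the two clips. A minimal sketch of that general technique (illustrative only, not Autoshot's actual code; the function and parameter names are made up):

```python
def sync_offset(slate_frame_cam, slate_frame_phone, cam_fps, phone_fps):
    """Seconds to shift the phone track so the slate events line up.

    slate_frame_*: frame index where the slate flash appears in each clip.
    """
    t_cam = slate_frame_cam / cam_fps        # slate time in the cine clip
    t_phone = slate_frame_phone / phone_fps  # slate time in the phone clip
    return t_cam - t_phone

# Slate flash at frame 96 of a 24 fps cine clip and frame 60 of a
# 30 fps phone clip: shift the phone track by 2 seconds.
print(sync_offset(96, 60, 24.0, 30.0))  # -> 2.0
```

The flickering QR codes serve the same purpose as a clapperboard: a visually unambiguous event both cameras can see, so the offset can be found automatically.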
How do you genlock an iPhone? @@JoshuaMKerr Is it an easy process to calibrate and handle lens distortion, or is it similar to all the others, with OpenCV libraries and taking images of a QR/ChArUco board from multiple angles? That's what all the more expensive trackers do, and from my experience none of them are easy, and getting to the 1mm accuracy needed for front-plate/AR almost always involves a technician from the tracking company. This, to me, is the biggest hold-up in VP. Tracking with an iPhone is cool and it seems really stable, but that's 10% of the puzzle.
Just checked out the website; it seems like there is no actual lens profile. It's using ARKit feature points to match the camera sensor and the iPhone sensor, which is great for tracking and getting accurate transform data, but it will definitely not be enough to get intrinsic lens data, which is needed to do AR/front-plate at cinema-quality final-pixel in-camera VFX. Super intrigued to see where this goes! And any democratizing tech or app for virtual production is massively welcome!! @@JoshuaMKerr
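To make the intrinsics point concrete, here's a minimal pinhole projection with the standard Brown radial-distortion terms (the model OpenCV-style calibration estimates). It shows why tracking-grade transforms alone aren't enough for comping: a modest uncorrected k1 term moves an off-axis point by many pixels. All numbers are illustrative.

```python
def project(point, fx, fy, cx, cy, k1=0.0, k2=0.0):
    """Project a 3D camera-space point with a pinhole model
    and Brown radial distortion coefficients k1, k2."""
    X, Y, Z = point
    u, v = X / Z, Y / Z               # normalized image coordinates
    r2 = u * u + v * v
    d = 1.0 + k1 * r2 + k2 * r2 * r2  # radial distortion factor
    return fx * u * d + cx, fy * v * d + cy

# Same point with and without a modest k1: it lands ~19 px apart,
# an error that is invisible to tracking but obvious in a comp.
ideal = project((1.0, 0.0, 2.0), 1500, 1500, 960, 540)
real = project((1.0, 0.0, 2.0), 1500, 1500, 960, 540, k1=-0.1)
print(ideal[0] - real[0])  # -> 18.75
```

This is why the expensive trackers the comment mentions make you photograph a ChArUco board from many angles: that procedure solves for fx, fy, cx, cy, k1, k2 per lens, not just the camera pose.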
Absolutely. We're working with Beeble/Switchlight to connect the dots. It's a lot easier to hang a couple greenscreens than rent a giant volume, and Switchlight's relighting solves a huge piece of the background lighting match question.
It will probably never happen, as the iPhone has some hardware which Android phones will never have due to Apple patents, and it's required for this to work. It's the same with the MetaHuman live face animation: it will never be available on Android because of the Apple-patented depth camera that is required. Yes, patents are a curse for sure.
Thanks man for the quick reply, but I watched your other video, th-cam.com/video/_tW2WlTF6Zw/w-d-xo.htmlsi=EsxkrIK-m_yJGEfk, and I tried to find the CineView transmitter attached to the camera but couldn't. The live phone screen is showing on the TV, though. Can you please tell me how that works?
@madhavchandekar6998 Ah, I didn't set up the TV at NAB. I imagine it's being cast via AirPlay to a MacBook and then sent to the screen via HDMI, but I'm not totally sure.
I have been watching your channel for a while now and I love the whole "make it accessible for individuals" ethos; from the ground up, we are the ones who keep VP exciting... I downloaded the app within the first 20 seconds of watching this, so no more SteamVR and HTC Vive.
@@JoshuaMKerr It's a reasonable question. I'd look at this like adding a piece of gear to your system. If you're going into production you likely don't want your personal phone on the rig. Even older iPhones with LiDAR work great. I test with a 12 Pro Max myself.
That should work; let us know if you try it! You can remote control the Jetset app from a browser over wifi to see video and trigger recording start/stop.
That's a very interesting question. In principle, yes, but there are things to watch out for. For instance, is the anamorphic lens producing a sharp enough image to do keying? You might also find flares and bloom interfering with the key.
I've never used an anamorphic lens, but I think the image gets de-squeezed in post for a lot of cameras, which might be a step that can't be added to the workflow; I don't know. @@JoshuaMKerr
We are looking at anamorphic lenses. It should be doable. We'd like to handle the full Creator rig, which is a Sony FX3 and Kowa 75mm anamorphic (rehoused by P+S Technik). Stay tuned.
For people interested in learning more about Jetset and Jetset Pro, we have a dedicated channel with tutorial videos. For people who are interested in virtual production that is 100% produced on an iPhone, this channel is dedicated to that! th-cam.com/channels/YMv1LnKofxafC2KYyLzbGg.html
I don't think the app will have that super-insane-crazy-omg impact on filmmaking, as filmmaking is not just about tracking shots, but it sure as hell will be a helpful tool for many and will find its regular place in indie productions and corporate film! Great work and good luck!
We write to the iCloud storage because it's the only storage on the phone that won't be deleted if the app is removed. Now that the iPhone 15 Pro allows writes to external hard drives, that limitation may be lifting.
A 9-minute clip and 10 seconds of footage of what it does? I'd prefer you showed me more about the app, but all I get is you talking nonchalantly for over 8 minutes.
@@JoshuaMKerr TBH you should spend more time on it, since the title is about the app, not your journey at the expo. I clicked because I'm curious about the app but learned basically nothing about it. It feels like clickbait... I left frustrated.
@Chijn Good to know. The video was intended to be a journey, as it's essentially a vlog that talks about the app's capabilities and people's reactions. But there are, and will be, more in-depth "how it works" videos coming... I promise you that.
Monthly payment!!?... noooo. Put out a free 1080p tier, or record-only instead of live, and a full Pro tier (something like that). It would be good to separate the big companies from me, a mortal who doesn't earn in dollars.
Yes it is sponsored. But we didn't give Josh any talking points or direction. And we didn't review a final before it went out. He's made two videos so far and I think they are great. Hopefully the videos give you valuable information. And wow, if there is any channel that we'd like to support, it's Josh's
What are you talking about? You make no sense. 95% of the industry uses Apple products on set. People use their iPhone Pro Max all the time for the LiDAR scanner, to quickly grab a 3D scan in case the set has to be rebuilt later, and hell, I've even had a few times where I used the 3D scans in a final shot, done on an iPhone. If you ever see a blockbuster set with no one using a MacBook, Mac Pro or iPhone, please send pictures, because that will be a rare sight.
@@JoshuaMKerr I was referring to his comment in which he states that anyone who uses any Apple product is not a professional, even though most DITs run Macs for grading and file transfers, VFX supervisors use iPhones for things like 3D scanning, and now we have this product, which will undoubtedly be very useful.
I don't think they are overreacting Josh. Jetset, in a very humble and powerful way, is the beginning of the democratization of "large-scale" indie film productions. In upcoming versions and updates, the app will help creatives, who until now have been left out due to budget constraints, to capture their visions and to get so much closer to epic results. I'm very excited about this and will adopt and promote its use to all my colleagues. Thanks for the time and effort, and keep pushing forward.
The best comment. I would have written one, but I have too much integrity!
Thanks Gadriel, I think exactly the same as you. I'm very excited for the future of this tool.
Your genuine enthusiasm for this app and what the app does are both badass! I'm a digital art teacher trying to set up a virtual production course for my high school seniors. The trackers were the one component that I could not sort out for us. This potentially solves that issue, so I can keep students interested in the arts ahead of the curve and on top of the most recent filmmaking technology. I could go on forever about how excited I am about this. So cool! 🎉🎉🎉
Thanks so much for the comment. I'm certain this would be ideal for your students. I'm thrilled you enjoyed my video; we do live in exciting times.
This is huge. I am definitely following the development of your system. Subscribed.
Well... it's Lightcraft's system. I'm just a filmmaker who loves it. I'm certainly going to be covering it a bunch more.
Philip Bloom!!!! What a gangster of indie cinema. Cheers Josh on your Unreal app. As a fellow filmmaker/coder, this is trailblazing.
Oh haha, I didn't make the app. I'm just a filmmaker who is lucky enough to use and endorse it.
Hey Josh, you did it! It seems it was an amazing presentation! Gotta say I felt the same way when Jonathan Nolan got on stage at my very own conference lol. Would love to see an in-depth breakdown of the workflow you've used here; it seems it's the most accessible indie filmmaking VP workflow we've seen so far.
Yes, it was an incredible experience. I'll definitely be covering the workflow on the channel, and new developments as they arise. Nice work getting Jonathan Nolan to your conference, by the way.
Wow. That is....ridiculously impressive. Putting a basic version of virtual production into literally anyone's hands is game-changing. Combine it with mocap and it's basically a basic Avatar setup. I'm blown away.
Thanks, I totally agree. The fact that it scales to different needs is so cool. I can't imagine going back to messing around with Unreal for calibration stuff.
What a spectacularly fun video to watch Josh!
Thank you! I'm glad you enjoyed it.
Two things Josh… 1) it is insane that this video has so few views, 2) it is incredible what you guys have done. This is absolutely the future and I want to use it as soon as possible. "I run an M2 Pro Mac mini upstairs and a Puget system with 2 RTX 4090s and a 64-core Threadripper downstairs. I also own a full Rokoko body rig." And 3) I feel so BLESSED that absolutely nobody has watched this video and I hope it remains that way 😂😂😂😂 I'm extremely selfish and don't want anybody else to know what you guys have created here LOL. In fact, I hope this video is somehow a breach of contract with YouTube and gets removed almost immediately now that I've seen it 😂😂😂. Congratulations to you guys, this is amazing 🎉
I'm moving my entire studio over to unreal engine. It's phenomenal.
I had the exact same response. Selfishly, I want this to remain an insider secret for as long as humanly possible. 😂
@@TheOneTrueJack LMFAO like seriously, I don't want ANYBODY ELSE finding out lolol.
It's a double-edged sword: if they don't make money, they can't keep supporting this great tool.
So I hope it goes viral, so much so that they can actually lower the price 😊
Sorry to disappoint you haha. It started small but we appear to have blown up.
Amazing that you did not throw in the towel and went full problem-solver until you had it working again!
I never even thought about giving up. That would've been much easier...damn
Thanks!
Hey Bill! Thanks to you!
Wow! Congratulations! Truly remarkable.
Cheers, it was a lot of work but I think it paid off
That's pretty amazing! Oh, how I felt your frustration and then relief figuring out your technical issue. Worst part of working a trade show... not being able to see the exhibits. Glad to see they're recycling the Canadarm into something useful terrestrially. 😁
Have long wished I'd gotten into cinematography much earlier in life, but that's how the winds blow. Looking forward to watching more from this channel.
Glad you enjoyed it. Yes, that was quite tense. Good job we got it fixed.
This is simply amazing and a real game changer for indie filmmakers like myself. Thanks for sharing this video. I can’t wait to try out Jetset Cinema!!! ❤️☺️👏🏾👏🏾👏🏾👏🏾👏🏾🙌🏾
Incredible! Amazing video Josh! You really knocked it out of the park with this one!
Thanks man, it was quite a big effort. Glad it's being enjoyed.
Love your genuine enthusiasm, it really shines through. Also, this platform certainly has the potential to break the film industry… absolutely wild tech. I’ll definitely be looking into it.
Thanks so much. I'm just a nerd, nerding out about nerdy tech. Glad you liked the video. You'll enjoy the app, I think.
This is amazing!!! Can't wait to see this in the future!!
The future is now
Great video! Really interesting. I was about 30 seconds in and then realised “holy sh!t, that’s Josh!” We worked together back at the Sage on Thinking Digital and various projects. Glad to see you’re flying mate! Keep it up!
No way! Hi Ben! How are you doing?
All great here thank you sir. I went freelance a few years ago. Photography and film making, been non stop since setting up by myself, love it.
Nothing ever goes smoothly my friend. It all adds to the experience, learning, and guides us towards perfection.
All in all, you guys did well.
Thanks! we'll get there
Glad you were able to overcome the issue on day 2. That must have been so disheartening. The product itself is intriguing, and really impressive. I was at the show but didn't get long at the Accsoon booth to see it in action. Now I'm disappointed I didn't get to see it live. I did see the Canon booth, and to be honest yours looks more interesting in terms of VP. Your floor neighbours Godox had a nifty lighting app too, able to match lighting effects to a virtual background.
Hope it goes well for you guys. I know a few people here in Ireland who would love to do some virtual production work but were scared off by the costs; with products like Lightcraft Jetset, it's looking more and more likely that they can at least try it out without breaking the bank.
Ah, thanks for the comment. What were Canon showcasing? I didn't get much time away from the booth.
Congrats Joshua. This is amazing!
I understood half of it, but still I'm very excited. Congrats on this amazing work!
Thanks!
With the choppy video, my guess is it's some form of Windows trying to minimize resource usage, so it allocated less CPU time, and re-initializing the capture device refreshed its priority, or something like that.
Interesting, I didn't think of that. Probably a byproduct of using a consumer laptop for the demo. Unfortunately it was completely necessary to do so.
@@JoshuaMKerr If your laptop has an RTX 4090 in it: I've been having issues where it will run fine and then all of a sudden the video starts to lag. So it might be a video issue.
@@JoshuaMKerr If it's running Windows 11, I think it's a Windows issue; maybe try Windows 10 if the issue persists.
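If the Windows-deprioritization theory above is right, one blunt way to test it is to pin the capture/Unreal process at a higher priority class and see whether the stutter stops. A hedged sketch using only the documented Win32 SetPriorityClass call (safely a no-op on other platforms; how much it helps is an open question):

```python
import ctypes
import sys

HIGH_PRIORITY_CLASS = 0x00000080  # documented Win32 priority-class constant

def boost_priority():
    """Raise the current process's priority class on Windows.

    Returns True on success, False on failure or non-Windows platforms.
    """
    if sys.platform != "win32":
        return False
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.GetCurrentProcess()
    return bool(kernel32.SetPriorityClass(handle, HIGH_PRIORITY_CLASS))
```

Task Manager's "Set priority" on the Unreal process tests the same hypothesis without code, as does switching the laptop off its power-saver plan.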
Wow 🤩 This thing looks absolutely mind-blowing 🤯 Good luck 👍
Thanks 👍
I need this thing. Great video. I am 100% using this for a project this month. If I can figure it all out in time
Let me know if you have questions :)
@@JoshuaMKerr Thanks, I absolutely will... I just had a very quick run out with it yesterday and it is really amazing... I think I've only scratched the surface of what it can do.
@critchee I know the feeling. I just discovered a whole bunch of new stuff about how it works today.
Great to hear it. We just switched on the new Lightcraft site with forums so feel free to ask questions on anything you run into!
You are a real problem solver Joshua - Well done!
Haha, thanks. It would be nice if virtual production wasn't packed with problems. At least Jetset solves a bunch of them.
With the stutter, consider the venue. The signal noise alone, let alone cell phone signals, makes it almost a no-go. I wouldn't put it past other vendors to have installed signal boosters that could interfere, either.
Oh god, totally. The place was a mess of electromagnetic interference. But the video wasn't stuttering on its way into the laptop, which helped me locate the problem.
That’s supremely cool!
So is this comment
Bro, could you please tell me how you import all the stuff into Blender?
You need to use Autoshot. It's free to download from lightcraft.pro. There are detailed tutorials on their website.
This is so inspiring man! well done!
Thanks mate. How are you getting on?
@@JoshuaMKerr I'm OK, just too caught up with client work to turn my attention back to Unreal, but someday I'll get the chance.
It's a great creation. I have to say the tracking on the final comp was shaky, but hopefully future iterations can get that solid. I would have totally suggested this on a quick-turnaround job last week had I known about it.
There was a minor jitter, but I've had worse from much more expensive trackers.
There are definitely tracking refinements in the works. That's a certainty.
👍 Brilliant Work, Joshua, thank you very much, my Bro!! Instant sub. 😎
Thanks for the sub! Glad to have you onboard
5:11 Was that Philip Bloom testing out your rig??? Cheers all the way from Brazil, Josh 🇧🇷
It was! Couldn't believe it myself.
Great video! What is the digital slate app that you are using on the laptop? It went really fast, with flickering QR codes and the slate. I'm not sure how this syncs Unreal with the camera. Can someone guide me to the tool? Thanks!
The slate is part of Lightcraft Jetset and is accessed via Autoshot.
I love you bro. I know exactly the starstruck feelings you're describing. So cool
👐 Haha, it was quite odd, I never thought I'd be that way. I kept my cool though.
You are doing a great job! Thank you for putting in the work. I was checking out Lightcraft because of you and will be testing it for a future project. What is really missing in the Lightcraft documentation is a finished result, like the one you posted here, but maybe with a bit more complex movement. In the tutorials there sometimes seems to be a lot of drift in the origin, and that made me a bit hesitant.
Thanks, I'm glad you're enjoying my work. You might be interested to know they're introducing a post-processing feature for tracking data to bring it to sub-pixel accuracy.
You need to link up with an LED panel company for lighting. I don't mean the volume, but light panels that react to the Unreal lighting; that will push your system. Gobos, point, flat, and reactive lighting can all be done, and that will push this product over the edge. Keep it up, and do the community a favour: do not let any of the big corporates buy this system or merge, as indie makers like us will be priced out of the market again.
I like the idea of DMX lighting. It'll produce great colour reproduction and can be pixel-mapped to Unreal Engine. I need to get my hands on some Falcon Eyes panels or similar.
Really cool! Love this tech!
Me too!
Nice video. Only problem is after watching I have no idea what your product does.
Sorry if you feel I didn't cover it in enough detail. I'd be happy to clarify things if you have questions.
@@JoshuaMKerr Camera tracking yes, but in what sense? I did not see the game changing utility of the app. Maybe if I were there at the show it would have been more obvious. OTOH, if everyone else gets it, then don't worry about me.
Cool. About the Cam Link issue: why not just go straight Ethernet or USB-C from the Z CAM to the laptop?
The demo was about showing what can be achieved with low-end hardware. But beyond that, you need to use a capture device that Unreal recognises. So far that's AJA, DeckLink, or Cam Link.
Keep up the good work! Will we have a tutorial anytime soon?
Working on it;)
Superb video and amazing tech mate, you definitely didn’t oversell it!!!
Glad you enjoyed it mate!
I just bought a Vive setup for a VP sci-fi film I am planning, but this looks so much simpler to set up. Looks like I'll have to sell it and buy an iPhone.
I'd definitely recommend this over a Vive.
When's the android version of the app coming?
I'm afraid there won't be an Android version, as far as I'm aware.
We get the Android question quite often. But we're not making a statement on what phone system you should use in your life. In fact, many customers find they don't want text notifications popping up on the screen of their virtual production system, and even if they have an iPhone, they buy a second phone for professional use. A Vive Mars requires $5,000 of dedicated hardware; that was the low end of the market before we entered. Compared to the Vive, we can handle a huge set, indoors or out, with no setup and no hardware. Think of the iPhone as asking customers to buy $1,200 of "dedicated" hardware. And it is amazing hardware, with LiDAR. Lightcraft started 20 years ago and made dedicated tracker hardware similar to Stype and Mo-Sys. That hardware cost about $30,000 if unbundled. The iPhone matches that hardware tracker, and the LiDAR adds some amazing capability. So, keep your Android phone, and get started with virtual production for free, for the cost of some amazing "dedicated" hardware that has the Apple logo. Android phones cannot match iPhones for virtual production, so there are no plans for an Android version, I'm afraid.
Every flagship needs to have LiDAR.
LiDAR is one thing, but it's the software and pipeline that make this into something useful. Cameras would need a lot more hardware capability to do this internally. The best you could hope for is a camera's LiDAR functionality outputting its data for use in external applications, which I don't see happening any time soon.
Brilliant job . We should def talk about a key production in pre production- tools of the future 🇮🇹🎬🎥💪🏼
Sounds good
Hey there, great content. Thank you! Do you recommend the Accsoon SeeMo Pro over the regular SeeMo? Should I get one of these for a smoother workflow? Usually I'm using a Mars system for simple HDMI signal transmission. But for the UE Jetset solution with post-rendering the SeeMo seems necessary ...
Would I need anything else apart from an iPhone/iPad (I have some old devices, an iPad 8th generation and an iPhone 12 Mini, somewhere...)? Have you used the Tentacle system for timecode or something similar? The Lightcraft website mentions Tentacle support. That's why I'm asking.
Sorry for my many questions ;-)
Best wishes and keep up the great work!!!
Hi, I'm about to test the Tentacle for software genlock, so yes, it is supported.
The SeeMo Pro is what you need for lens calibration. Otherwise, you just need an iPhone or iPad. I'm not totally sure which iPhones/iPads work best with Lightcraft, so I'd start there.
@@JoshuaMKerr Much appreciation for the quick response and your input. Best wishes
This app looks epic, just downloading now!
Awesome
Superb, congratulations.
Thanks
Good thing it's going public, not like those in-house apps.
Such a great video 🙏
Glad you enjoyed it!
This is crazy. In a very good way
Thank you. It was lots of fun
@@JoshuaMKerr I don't like subscriptions, but it still seems like a steal compared to what you had to spend on equipment/studio space before this app. Let alone the pipeline with UE and Blender. This is a godsend.
Very exciting stuff!! Been looking to get into virtual production for a while!
You absolutely should :)
@@JoshuaMKerr I've recently started exploring the realms of virtual production, specifically through short-throw projection and LED wall techniques, and it's been an eye-opening experience. After experimenting with static shots against large OLEDs, I've realized the vast potential these technologies hold for storytelling and scene creation. Being able to work within Unreal Engine to adjust and manipulate scenes has been particularly revelatory for me. While I'm aware that you're well-versed in the incredible capabilities and the transformative impact of virtual production, I wanted to share my enthusiasm as someone just beginning to navigate this exciting frontier. It's thrilling to envision how these tools can enhance and streamline the filmmaking process, offering unprecedented flexibility and creative freedom. I'm eagerly looking forward to deepening my understanding and harnessing these technologies to their full potential.
Also just wanted to point out that we share the same last name 😊 my name is Brett Alexander Kerr!
So dope!
please put down a detailed demo ...
I will be
Great video, happy to have discovered it, thanks!
May I ask which digital slate you‘re using ?
It's the slate that comes with the app.
This is epic, Joshua. Can I ask: can the app stream just the tracked camera data out, so we could drive a UE5 or Aximmetry tracked camera? Could that happen in near real time?
Yes, this is completely possible. I believe these capabilities are unlocked on the Jetset Cine tier.
You need way more technical info in this video about how this works, rather than a lifestyle vlog. Which of the iPhone's camera sensors does it use? How do you account for the perspective shift, etc.?
I wouldn't expect that from this type of video, you're right. I may do deeper dives as I continue to produce content, but in the meantime a lot of the info you're after is on their website.
Interesting. That could be very useful for VFX ads in sports events like F1.
That would be cool!
@@JoshuaMKerr Yeah, right now they are using a planar tracker, which sucks a lot.
Are you able to zoom in and out on the camcorder? Will the app encode the zooming commands automatically or do you need another device to encode the zooming data?
Instead of the Cam Link, can you use NDI as your video feed?
Well, the Cam Link 4K was the physical device that brought the feed into the computer, and Unreal recognises that video capture device. I think this problem might be to do with how Unreal brings in the feed, but NDI might be worth a test.
As far as lens zooming and focus, you would need at least a follow-focus device like a Tilta, but I believe that Jetset will support focus and zoom very soon, if not already. I will have to double-check that.
UMMMM... YES!
I had a lot of problems with the Cam Link 4K... if any HDMI cable or peripheral in the chain doesn't support 4K 60Hz, the Cam Link was unstable, even at 1080p 24p.
That's good to know. Cheers
@@JoshuaMKerr For sure! Hope it helps you :)
When I changed all the cables to support 4K60, even 4:2:2 4K 10-bit was working, linked through a Feelworld LUT6...
Exciting stuff! Do you plan on being at NAB in Las Vegas?
I will be there! Look for me at the Accsoon booth
I will! Looking forward to it.
@@JoshuaMKerr Great meeting you in person today! You’ve got a fantastic product on your hands. Kevin VanHook
Hey Josh... an amazing presentation! So the iPhone is used here as a tracker and syncs with the camera footage. Am I right?
Yes. It calibrates with your camera so it knows your focal length, then uses a digital slate that both the iPhone and the cine camera see. This is used to sync when you bring the footage into the computer. The software that does the sync is called Autoshot.
That's cool @@JoshuaMKerr
Looks cool!
How does it deal with genlock? And how does it deal with lens distortion/lens calibration?
I'll be showing these in an upcoming tutorial... but yes, all of those can be done.
How do you genlock an iPhone? @@JoshuaMKerr
Is it an easy process to calibrate and handle lens distortion? Or is it similar to all the others, with OpenCV libraries and taking images of a QR/ChArUco board from multiple angles?
This is what all the more expensive trackers do, and from my experience none of them are easy, and getting to the 1mm accuracy needed for front plate/AR almost always involves a technician from the tracking company.
This to me is the biggest hold-up in VP. Tracking with an iPhone is cool and it seems really stable, but that's 10% of the puzzle.
Just checked out the website. It seems like there is no actual lens profile; it's using ARKit to match feature points between the camera sensor and the iPhone sensor, which is great for tracking and getting accurate transform data, but it will definitely not be enough to get intrinsic lens data, which is needed to do AR/front plate at cinema-quality final pixels for in-camera VFX. Super intrigued to see where this goes!
And any democratizing tech or app for virtual production is massively welcome!!
Jetset + SwitchLight = break the industry
Oh trust me, that's happening
Absolutely. We're working with Beeble/Switchlight to connect the dots. It's a lot easier to hang a couple greenscreens than rent a giant volume, and Switchlight's relighting solves a huge piece of the background lighting match question.
Is there an Android version anywhere on the horizon? The possibilities here are endless! Great app.
Will probably never happen, as the iPhone has some hardware that Android phones will never have due to Apple patents, and it's required for this to work. It's the same with the MetaHuman live face animation: it will never be available on Android because of the Apple-patented depth camera that is required. Yes, patents are a curse for sure.
Wow this is insane
Glad you like it. We had fun with this
Hey same transmitter!
Could you please tell me how the phone's screen is output into the transmitter?
USB-C to HDMI
Thanks man for the quick reply but I watched your other video,
th-cam.com/video/_tW2WlTF6Zw/w-d-xo.htmlsi=EsxkrIK-m_yJGEfk
In it, I tried to find the CineView transmitter attached to the camera, but I couldn't. However, the live phone screen is showing on the TV. Can you please tell me how it works?
@madhavchandekar6998 Ah, I didn't set up the TV at NAB. I imagine it's being cast via AirPlay to a MacBook and then sent to the screen via HDMI, but I'm not totally sure.
I see Philip Bloom... I like.
I have been watching your channel for a while now. I love the whole "make it accessible for individuals" approach; from the ground up, we are the ones who keep VP exciting... I just downloaded the app within the first 20 seconds of watching this, so no more SteamVR and HTC Vive.
Yes, I can't be bothered with SteamVR anymore. I've been using this on a shoot today and it made the whole production so smooth.
Waiting for the android version 🔥😎🔥
You'll be waiting a long time
@@JoshuaMKerr It's a reasonable question. I'd look at this like adding a piece of gear to your system. If you're going into production you likely don't want your personal phone on the rig. Even older iPhones with LiDAR work great. I test with a 12 Pro Max myself.
That's amazing! If you can put it on a drone... oh boy...
I don't see why not :)
That should work; let us know if you try it! You can remote control the Jetset app from a browser over wifi to see video and trigger recording start/stop.
Hi, can it work on iPads with lidar?
We had iPads running the app at the expo. Works great.
Random question, can this work with an anamorphic lens?
That's a very interesting question. In principle, yes, but there are things to watch out for. For instance, is the anamorphic lens producing a sharp enough image for keying? You might also find flares and bloom interfering with the key.
I've never used an anamorphic lens, but I think the image gets vertically expanded in post for a lot of cameras, which might be a step that couldn't be added to the workflow. But I don't know. @@JoshuaMKerr
We are looking into anamorphic lenses. It should be doable. We'd like to handle the full Creator rig, which is a Sony FX3 and a Kowa 75mm anamorphic (rehoused by P+S Technik). Stay tuned.
This is fucking amazing!
We think so too. We've actually been doing virtual production for 20 years. And this product started in 2019 and took 5 years! There is a lot to it!
When you take the cake out of the oven before it's baked😹
What is the cake in this metaphor?
Wow Philip bloom on the humble
There needs to be an Android version ASAP.
Android phones need a LiDAR sensor before that will happen.
Do you offer paid consultation ?
I will be doing so soon. You can drop me an email at info@joshuamkerr.com
iPhone only?
Afraid so.
For people interested in learning more about Jetset and Jetset Pro, we have a dedicated channel with tutorial videos. For people who are interested in virtual production that is 100% produced on an iPhone, this channel is dedicated to that! th-cam.com/channels/YMv1LnKofxafC2KYyLzbGg.html
Amazing mate.
Thank you! Cheers!
Wow, her reflection can be seen in the virtual environment 🤔
That was done in compositing.
Sold
I don't think that the app will have that super-insane-crazy-omg impact on filmmaking, as filmmaking is not about tracking shots, but it sure as hell will be a helpful tool for many and find its regular place in indie productions and corporate film! Great work and good luck!
It's certainly going to speed up a lot of vfx work if you can get shots tracked live on set instead of during post.
Whoa
What came first, the chicken or the matrix?
Ask Colonel Morpheus
What a shame it's only for iPhone.
Unfortunately, I don't think Android phones are equipped for the task yet. And I say this as an Android user.
Requiring iCloud storage is a non-starter.
Interesting, why is icloud storage a deal breaker?
We write to the iCloud storage because it's the only storage on the phone that won't be deleted if the app is removed. Now that the iPhone 15 Pro allows writes to external hard drives, that limitation may be lifting.
Can we test the app with external storage anytime soon before buying? When will it be lifted? Cheers...@@eliotmack
Avatar for indie filmmakers!
Absolutely
A 9-minute clip and 10 seconds of footage of what it does? I'd prefer you show me more about the app, but all I get is you talking nonchalantly for over 8 minutes.
The video explains what the app does, but if you're looking for more details, I have other videos on the channel that do that.
@@JoshuaMKerr TBH you should spend more time on it, since the title is about the app, not your journey at the expo. I clicked because I'm curious about the app but learned basically nothing about it. Feels like clickbait... I left frustrated.
@Chijn Good to know. The video was intended to be a journey; it's essentially a vlog that talks about the app's capabilities and people's reactions. But there are, and will be, more in-depth "how it works" videos coming... promise you that.
Monthly payment!!?... Noooo. Make 1080p free, or just recording (not live), and Pro full, something like that. It would be good to split the big companies from me, a mortal who doesn't earn in dollars.
It's free to shoot on iPhone. The higher tiers that include 4K and lens calibration are paid subscriptions.
A 9-minute ad
Welcome to my sponsored video :)
Yes it is sponsored. But we didn't give Josh any talking points or direction. And we didn't review a final before it went out. He's made two videos so far and I think they are great. Hopefully the videos give you valuable information. And wow, if there is any channel that we'd like to support, it's Josh's
Anyone who uses any Apple product is definitely not professional; those are part of the bitten-apple cult. That's it, I'm unsubscribing!
Sorry to see you go.
What are you talking about? You make no sense. 95% of the industry uses Apple products on set.
People use their iPhone Pro Max all the time with the LiDAR scanner to quickly grab a 3D scan, in case they have to rebuild the set later. Hell, I've even had a few times where I used the 3D scans, done on an iPhone, in a final shot.
If you ever see a blockbuster set with no one using a MacBook, Mac Pro, or iPhone, please send pictures, because that will be a rare sight.
@annekedebruyn7797 Except this video is not about 3D scanning.
@@JoshuaMKerr I was referring to his comment, in which he states that anyone who uses any Apple product is not a professional, even though most DITs run Macs for grading and file transferring, VFX supervisors use iPhones for things like 3D scanning, and now we have this product, which will undoubtedly be very useful.
Ah, pardon me, I didn't catch that you were responding to the earlier comment.
Mind if I ask what you do? Are you in vfx? Just curious.
Have You tried turning it on and off again? 😂😁
Should've called you for tech support.
No. I don't think James Cameron or Steven Spielberg were stunned. Fucking ridiculous statement.
Haha yes...they definitely weren't present. Come to think of it, neither was Scorsese or Fincher. They won't return my calls either.
When do we get the full tutorial?! Im dying!! 😂🖤🦾
I'm sure that's not far off. Stay tuned.
That feeling you must have had when people were interacting with your creation 🤌
Very satisfying for sure ;)