Hmmm, perhaps I missed something, but how do you apply it to another camera that used a completely different lens? For example, if the Insta is super wide, how do you apply the solve to a 50mm lens with all its little imperfections?
That's what the program is for. It takes the offset you measure from image center to image center in all three directions so that, when you track the real footage into your CG render with the two cameras also offset virtually, it can match the tracking points to the in-camera depth of field and field of view perfectly. That's the genius of it.
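Editor's note: the geometry behind that offset can be sketched in a few lines. This is an illustrative sketch only, not the tool's actual code, and the function names are hypothetical: given the witness camera's solved world position and rotation, plus the offset you measured on the rig (in the witness camera's local space), the hero camera's position is the witness position plus the rotated offset.

```python
def mat_vec(rot, vec):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(rot[i][j] * vec[j] for j in range(3)) for i in range(3)]

def hero_position(witness_pos, witness_rot, offset_local):
    """Place the hero camera from a solved witness pose plus the measured rig offset.

    witness_rot: 3x3 world-from-camera rotation from the witness solve.
    offset_local: hero-minus-witness offset measured on the rig, in witness camera space.
    """
    rotated = mat_vec(witness_rot, offset_local)
    return [witness_pos[i] + rotated[i] for i in range(3)]
```

Because the offset is rotated by the witness camera's per-frame orientation, even a few centimetres of rig offset produces a visibly different parallax path for the hero camera, which is why the measurement needs to be accurate.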
I had a theory a few years ago about mounting three wide-angle witness cameras (arranged in an equilateral triangle) around the hero lens, with the centre of the hero lens directly in the centre of the equilateral triangle. The idea of using a triangle would be you would have two cameras that are side-to-side relative to each other, creating stereo horizontal parallax; and the third camera would then provide vertical parallax, potentially increasing tracking accuracy.
Hey there, great video! I tested the technique a bit the other day and got okay results, but I have a couple of questions. Even when I apply the exact focal length and sensor size of my shot cam to the offset virtual camera, it's a bit off. After quite a bit of fiddling I got it close, but still not a usable matchmove for placing 3D objects. Are you calibrating the shot cam footage to the 360 somehow (other than the offset measurements) to get a solid match? Possibly locking a few tracking points of the 360 track to points in the shot cam footage somehow? Or do you only use this for long, out-of-focus shots? With my tests, achieving the results you have for the shot at TC 10:35 would work; at TC 6:30 you show some mannequins tracked in, and I'm struggling to achieve similar results with a great matchmove. Another thing I did, which could be helpful to anyone reading, is shooting the 360 cam at 48 fps so there is a smaller margin of error when you are matching up the start frames. You can make your ins and outs in the Insta360 software, then render out at 24 fps. You can dial this in even more with a subframe time offset on your camera in your 3D package, if it's still slightly off. Excited to play with this more. Thanks for the great start!
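Editor's note: the 48 fps tip at the end quantifies nicely. A back-of-the-envelope sketch (the function names are my own, not from any tracking package): snapping the sync point to the nearest witness frame can be off by up to half a witness frame period, so doubling the witness frame rate halves the worst-case error, and any residual can be expressed as the fractional hero-frame offset you would dial in.

```python
def max_sync_error_ms(witness_fps):
    """Worst-case error (ms) from snapping sync to the nearest witness frame."""
    return 1000.0 / witness_fps / 2.0

def subframe_offset(residual_seconds, hero_fps):
    """Express a residual time error as a fraction of a hero frame,
    for setting a subframe time offset on the virtual camera."""
    return residual_seconds * hero_fps
```

At 24 fps the snap error can reach about 20.8 ms; shooting the witness cam at 48 fps cuts that to about 10.4 ms.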
I would love to know how to use the Camera Tracker in Davinci Resolve to do this rather than using a separate app for that. I don't see anything on YT that shows that. When I tried, I only got 1/2 of the 360 camera to track.
I used to think VFX artists only used 360 cameras for the HDRI environment lighting reference thing. Well, this is new! Surprised that it tracks so well even with the distortions of a flat 360 video.
08:33 Is the manual calibration really necessary? Can't you get software to do the calibration with just a few synchronized stills from both cameras in an easily trackable environment?
Can you realign the camera inside the SynthEyes software and export directly to After Effects? Or export the camera metadata from Blender to After Effects?
Pretty impressive! Now, not being anywhere near your level, I might not have learned as much as I should have from this. And, lacking a genuine need, not going to pursue this further. Still, it’s extremely interesting. Thank you!
I’ve got 0 intentions to use this on any project but its always interesting what unusual filming techniques you show to us. Stand out from other camera people that just reviews cameras and thats it
I wonder how this would work to replace something like the Jetset Autoshot workflow: tracking a camera, bringing it into Blender like you did, and combining it with a Polycam lidar scan. Also, do you think DJI will allow recording the lidar data used for focus, to get scans for Blender meshes?
I thought this was really clever. I'm going to do some tests to see if we can pull it into the Autoshot workflow and automate the manual lens alignment pieces. Having the 360 frames at 24p or 25p (or whatever the cine camera frame rate is) makes all the difference in getting multiple footage streams to correctly align in SynthEyes.
I just bought the X4 for my real estate photography business. I didn't know they updated their system for it! 🎉 Is it possible to make a video on it? Or point me in the right direction. Thank you!
You need to shoot the photos on the X4 in timer mode (3 or 5 seconds). Then you need to offload all the photos to your computer and, using the Insta360 app, convert them into regular generic 360 files (basically not the .insv file Insta360 gives you). From there you AirDrop them back to your phone, open the Matterport app, and import them from the camera roll as a 3D scan, and it should compile the rest. Currently the X4 does not work with the Matterport app for auto linking like its predecessors do, so this is a workaround until they update the app. I sort of mentioned it in the video, but the wording was not as specific.
Very cool to see the behind the scenes of videos, and new use cases of current technology. Now how the da heck do i apply this knowledge to making a normal video?
Hey guys, thanks for staying with me through the hard times. I'm back now and have a few videos slated for this week and next. Also wanted to mention: Insta360 is offering a free $100 selfie stick for anyone who orders through my link (limited to the first 30 people), so if you like what I do and are thinking about this camera, support the channel! geni.us/INSTA360_X4_8K
sending my best. i lost my sister a little over a month ago, and it hasn't been easy getting back into the swing of things.
@@TylerEdwardssheesh. I'm very sorry to hear that. 😢 sending love right back your way.
We're here for you as a peer professional; my services are free, so if you ever wanna talk to someone and build good rapport, hit me up. I'm 51, I'm married, and I have a 23-year-old daughter. I grew up poor in Santa Ana, California, and had trauma growing up, but I use my lived experience to help people with mental health and substance use disorder. I'm also in school to become a substance-abuse counselor, and I just started another job as a substance-abuse counselor.
I'm invested. Once I get my hands on the a7c, I'll slap a 360 on it.
What's the name of the TV show you are working on? Would love to see it and see your work.
This is such a big aha moment for my VFX workflow. I’ve always wondered how everyone else is able to get a perfect track from untrackable footage and it makes so much more sense now.
There are multiple ways to do it, but people really don't use auto-track since many shots aren't easy to track. Two things you can do if you don't have a 360 camera: turn the shutter angle/speed down or up, which controls how much motion blur is present in the image, and place tracking markers in the physical scene, so you know for certain that you have tracking points. In post, to get the motion blur back, use something like ReelSmart Motion Blur.
Another thing you can do, before you start tracking, is apply some sharpening. This makes corners and other points more contrasted. Remember you need at least 8 tracking points in the scene at all times. If you can have a VFX supervisor on set, they will make sure your scene is VFX-ready.
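Editor's note: the sharpening tip can be illustrated with a basic unsharp mask, which is what most "sharpen" filters do under the hood — blur the image, then push each pixel away from its blurred value, exaggerating contrast at corners and edges so the tracker has more to grab. A minimal pure-Python sketch on a grayscale image (a real pipeline would use an image library):

```python
def box_blur(img):
    """3x3 box blur with edge clamping on a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    total += img[yy][xx]
            out[y][x] = total / 9.0
    return out

def unsharp(img, amount=1.5):
    """Unsharp mask: original + amount * (original - blurred)."""
    blurred = box_blur(img)
    return [[img[y][x] + amount * (img[y][x] - blurred[y][x])
             for x in range(len(img[0]))] for y in range(len(img))]
```

On a hard edge (a row going 0, 0, 100, 100) the dark side gets pushed darker and the bright side brighter, which is exactly the extra contrast a feature tracker benefits from.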
@@AnimeZone247 If you can have a VFX supe xD not all of us are that rich lol
I work in VFX and even I didn't know the value of a witness cam. Sure, you can track the footage you already have but it's always going to be a terrible time and worse when you can't get that perfect solve. Those effects are just going to be put in post and it's never going to look right unless you're a gifted comper.
An X4 is an investment but if it means you can start getting creative? Why the hell not. All I need is this rig and I'll have everything I need. Put in CG cars? Done. Change all the store signs with cards? Easy. Removing things with paintfixes? Re-projecting will take no time.
I'm just advocating to move things forward and keeping this simple for the simple vfx creators.
@@1BrknHrtdRomeo I'm not saying don't use a 360 cam; use everything at your disposal. But these are tips if you don't have a witness cam and you still want to match move 😂
Before witness cams we took lots of reference photos of the scene and used those to assist the solve. Shooting lens grids for the main and reference cameras helps a lot too, so you can accurately dial in the lens distortion first. Main camera distortion will vary a bit due to focus variations, but that's not a problem if you have an accurate solve and point cloud from the reference photos.
That said, one of these would be an incredibly useful addition to anyone's VFX kit. Great for shooting sky domes and lighting HDRs too.
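Editor's note on why lens grids matter: solvers typically fit a radial distortion model to the grid, then undistort the footage before tracking. A minimal sketch of the common Brown-Conrady radial terms (illustrative only; SynthEyes and friends use their own, more complete models):

```python
def distort(x, y, k1, k2):
    """Apply radial distortion to normalized image coordinates (center = 0, 0).

    Negative k1 pulls points toward the center (barrel distortion);
    positive k1 pushes them outward (pincushion).
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

A lens grid gives you many known straight lines, so k1/k2 can be fit until the grid's lines come out straight after undistortion.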
Funny, I bought this camera months ago and just used it for the first time yesterday. I had no idea it could be used in this way. My mind is blown. Also sending love to you and your family. Sharing what you're going through is helpful for all of us. ❤
Great technique. As a professional VFX artist with over a decade of experience on Hollywood blockbusters, I just want to point out a few things to look out for if you are planning to do this in a professional environment.
1. You need to genlock both cameras so that both are capturing the frame at the same time.
2. This will only work when shooting digitally. When shooting on film, the film itself is moving, so you even need to track static shots shot on sticks.
Regardless, your VFX department will probably not complain about having more reference material :) If you are planning on adding additional CG to your shot, your VFX team will be even happier if you use your 360 cam to capture a few HDRI brackets.
How do they genlock digital cameras
How would you genlock these cameras? I don't think either has it built in, from my cursory googling.
You have a tremendous knack for making me feel really, really dumb, but I'm here for it!
One of the greatest creators, always busy creating so we don’t get tons of content but it’s always 🍒
ayeeee
You have NO IDEA how good the timing of this video is for a project I'm starting completely unrelated to filmmaking. Thanks a bunch !
Staggering. I love seeing this come to fruition; I'd been seeing the foundations of technology like this in episodes of Two Minute Papers and such for a couple of years now, and now here it is for people like us to use and be amazed by.
what a time to be alive!
I remember using the X2 on a desert race track shoot at request of the client.
All we used was the wide field of view.
I had no clue the sheer versatility of this product - it's insane what can be accomplished with it.
FYI, I succeeded in grabbing GoPro tracking and applying it to an FX6 with a 50mm f/1.4 wide open on a close subject, handheld, all inside of After Effects. That involved a lot of guessing: since I could not enter the exact FOV of the GoPro, After Effects failed to track it at the right FOV, and automatic tracking came out way too narrow (35mm), so I applied approximately 105mm to the second camera instead of 50mm to compensate. I then moved the second camera to the right position (a rough calculation using After Effects' pixel distance for one point in my scene, and estimating how far it should be in pixels), and used a null with a modified anchor point linked to the main camera. It took me an hour to work out the process after some frustration, but now it works! And I will need it so much in 3 weeks. It's definitely not perfect, but it's simpler for me since I don't use the software you've got. Thank you for the inspiration.
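Editor's note: the focal-length guessing described above comes down to the standard pinhole FOV/focal relationship. A quick sketch (the function names are mine): for a given sensor width, focal length and horizontal FOV are interchangeable, which is how you can hunt for an equivalent focal length when a tracker misreads the FOV.

```python
import math

def hfov_from_focal(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) of a focal length on a given sensor width."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

def focal_from_hfov(hfov_deg, sensor_width_mm=36.0):
    """Equivalent focal length (mm) for a horizontal field of view."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))
```

For example, a 50mm on a full-frame 36mm-wide sensor covers roughly a 39.6° horizontal FOV; feeding a tracker the wrong sensor width (or letting it assume one) shifts the focal length it reports by the same ratio.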
Hi. I'm trying to achieve somehow what you are describing here. I don't have an Insta360, but a GoPro. I'd love to resolve some doubts I have about this, because instead of a 3D program like in this video, I want to do it in AE.
@contactojaime Then try to contact me: search my name and you'll find my website, and there you'll find my email address. I made a private tutorial; I can send it to you, but it's in French.
I love my 360x3...but since I'm an idiot it's like giving a toddler a rally car and saying "Now go, win."
Awesome idea for behind the scenes :)
Can’t wait to try that matterport 3d modeling. So much fun viewing spaces in VR
How does this man NOT have a million subs yet? His films are beautiful and his product videos are funny, complex, adventurous and downright experimental all at the same time…
I swear it's that "junk drawer" storytelling going around that gets all the attention! I don't know how it works so well but it's unfortunate… Moving in on 500k though! Congrats!
Man the golden age of tech just keeps getting better and better!
I strapped my 360 camera to the main camera on a shoot a while ago for behind the scenes, and I was thinking of this exact workflow. I am blown away it actually works!
I never saw the need for the Insta360, but all the cool tricks in the intro just sold me.
Even though I might never use this myself it is such a delight to watch every single of your videos!
very cool. Almost didn’t click on this cos of the piggyback whatever terminology. Fascinating video, and nicely explained for a VFX noob like me
Glad it still had some value for you, Simon!
he's back and the wardrobe has not changed... all is good in the world 🌎
The fact that you carefully explain how it's done is cool! I've just subscribed 😁
I just experienced a Love Job, and not only do I not know what to think or feel about it, but I'll have to watch it many times over. Lotsa info. Wow! Thank you!
As an FX artist I am forever asking our talented camera tracker if the track could get closer, and he gives loads of valid reasons: no lens data, it's anamorphic, the person's head we need tracked has no 3D scan. I think that was one shot's excuse list. So I appreciate having the track we do have. Then, after he is done on the shot, the client starts delivering the data we need for it.
Extremely Inspirational! I am just preparing to do my first VFX heavy movie that needs camera tracking and this is going to be a life saver!
Bro! You deserve all the success you have! The amount of information you share with each video is absolute gold. Although I will probably never even get to 10% of your level, I still appreciate all you do!
Damn! Been shooting with this setup just to have a super-wide at weddings. What a great additional way to use it, thanks man!
I love Gerald's expression as he soaks in the masterclass.
Every single video u make, it's a gold mine.
Knowledge and creativity.
Insane as always Josh, keep grinding 🤟🏻🔥
ayeee. thanks man
You could basically call it spatial timecode
I fuggn like that 😂
@@MAKEARTNOWCHANNEL spacecode?
I've watched dozens of 360 cam reviews, but this is the only one that really got me to buy one.
USE MY LINK BROOOOOOOO. I NEED THAT $$$$$$. Bahahah.
A casual way to flex the Canon dream lens!
Such an awesome episode! That workflow is lit!!! Thanks for showing us!
You bet
This is so incredible! I am so overjoyed that I chose to buy the Insta360 X4. I had no idea it could be this helpful when hiring out VFX work! My mind is blown!! ❤
Holy frock, this is the smartest thing I've ever seen haha. Being an unemployed VFX dude, this is perfect for anyone that wants to create their own shots and become a generalist.
That Canon dream lens though. So fun.
Man, you continue to inspire. Thank you.
I was never a VFX guy but this changes everything
Here we go, my best dopamine source for my creativity 😃
Make Art Nooooow 💪
Oh wow, I stumbled upon an idea video that I'm uniquely familiar with. I had this same idea a few years ago, but using a GoPro. I wrote to Russ Andersson about the idea, because I wanted him to validate it (and hopefully incorporate it into his magnificent SynthEyes program). Imagine my surprise and confusion when he shot down the idea. At the time, he felt his current system/method was sufficient, even though I explained the very common issues that you've brought up in this video. My biggest gripe was low light scenarios, where the hero cam is exposed such that the background is practically black, leaving you no usable tracking features. Meanwhile, I could place a ton of glowing LEDs down in front of the camera, where my GoPro can see them, but they won't interfere with the composed hero shot. Hopefully Russ can come around on the idea.
We've been piggybacking like this for FPV drones for years. Try gyroflow. You can calibrate a profile to your lens and camera combo first and then put the gyro data in.
360 cameras are not supported by GyroFlow
@@dadbeatdead The camera file, no, but the gyro data, yes. I can use my Insta360 ONE R with the 360 module on an URSA Mini and stabilize the URSA video in Gyroflow with the Insta360 data.
I sorta jumped to the conclusion that that's what this video was about based on the thumbnail, but it was more just a 360 camera review, so I guess very different applications.
@@Shauny_D Yeah, it was for tracking data that's extrapolated from a 360 camera. On Gyroflow's website they list that they don't support any of the Insta360 X cameras. I was referring to the application presented for the Insta360 X4 that he's using.
This is probably the most important video I've seen about VFX in a while now. Why did no one tell me about this before? Haha..
Yes!!! Thank you so much for this amazing video!
I think in 2018 we used one of the few 3D camera phones attached to the main camera to 3D track and allow us to overlay all the positions of pre-rendered 3D assets on a scene so the director could easily align eye lines to all the pre-determined positions of the 3D world beyond the set. It was a little complicated as the set would move but the background would be static so you had to track that and solve for that in real time. It worked pretty well though.
Using it for VFX match-moving is such a fantastic idea!
I wish I was smart enough to do all of this myself... but the fact that I can just use my 360 cam in preparation to hire out the VFX work, and not have to stress too much about it, is dope. Granted, I understand this only covers certain levels of effects haha.
I recently bought a Red Raptor and all week I’ve been thinking about strapping my BMPCC6K to the top of the raptor to somehow use the 6K’s gyro data in Davinci for stabilization. I did later realize that the raptor records gyro data and I just need a third party plugin to read it but that’s not as convenient. Either way, it’s cool to see somehow having a similar idea and going much further with it!
YO YO JOSH YEO - Fuck this was awesome - amazing video , as always Quality over Quantity !! Never stop making, take as long as you need when the quality is this good.
Absolutely legendary - would love to see more of your Luxe Real estate BTS + little tips / tricks some day :)
Never in a million years did I think I would witness a man shooting selfie video through a DSLR with an Insta360 strapped to the DSLR, through another camera strapped to the DSLR.
Love the staircase and the drone context shots. VFX use case makes sense. Not sure I have enough use cases for that workflow though.
I work as an engineering surveyor and have recently been learning how to use a Mavic 3E drone to do photogrammetry surveys (insanely detailed 3D models). I feel like it would make a great addition to your work (think Matterport for the whole external property)! I’m moving to California next month if you wanted to play with drones 🤙🏻
Been waiting for this one. Bang up job.
360 degree behind the scenes featurettes would be so rad. It would be like being on set with a famous director
Pretty legit use case for these things. Good stuff bro
Ayeeeee❤❤❤❤
That staircase shot was SICK!
It's good to see you back. Can't wait to see what you come up with. Maybe Anamorphia 3...
So amazing, thank you so much for sharing.
Hooray for Syntheyes !!!!!!!!!!
I'm actually amazed by this workflow, good job! I own an Insta360 X3, I'll definitely try it!! The only thing that still seems less than perfect is calculating the offset between the two cameras to get that pixel-perfect track on the main cam... I'll for sure try to find a good workflow for this.
All best ❤️
Yeah let me know if you do. I could see a use case for a specific mount (cold shoe) that people can get on amazon, and working with a specific model camera, and notating the EXACT offset for everyone to use. Mine was close, but by no means super dialed in.
How do you get the 360 cam to line up to the main cam's framing and point in the same direction? (in syntheyes or blender)
Thanks for this amazing video and for sharing this valuable knowledge
Really nice video! I have been using the insta 360 for a while now but never tried using it as a piggyback witness cam! Definitely gonna try that thank you!!
This is awesome, I was not aware of this trick.
This was a great episode.
Always pushing the limits 🍻
I was having a hard time justifying this purchase for just HDRIs for our shots, but this is truly a game changer for tracking. It may reduce the need for the cumbersome workflow of something like the Jetset software.
This is a great highly informative youtube video
That's bawesome, I enjoyed this
Great video! Almost have this working. Got the camera doubled in Blender, solve down below 1, but how were you able to line up the cine footage? I see the Camera01screenshape, but no video. Lastly, did you do those composites in Blender or bring it into AE once lined up? Sending to AE would be ideal for me. Thanks!
did you ever figure this one out?
Lovejob goes from "I suppose there could be a use case" to "this is great, pixel-perfect solve" in 4 seconds. I don't need one, but now I want one.
Josh.
You my boy blue, you my boy!
Cheers 🍻
This is great! I would love to see similar content for the Osmo Pocket 3.
Great idea. Totally makes sense. Thanks!
Hmmm, perhaps I missed something, but how do you apply it to another camera that used a completely different lens? For example, if the Insta is super wide, how do you apply the solve onto a 50mm lens with all its little imperfections?
That's what the program is for. It takes the offset you measure from image center to image center in all three directions so that when you track the real footage into your CG render with the two cameras also offset virtually, it can match the tracking points onto the in-camera depth of field and field of view perfectly. That's the genius of it.
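To make the geometry concrete, here's a minimal pure-Python sketch of the idea (no SynthEyes or Blender API, just a rigid-transform illustration; the function names, angle, and offset numbers are all hypothetical): the virtual hero camera is placed by taking the solved witness-camera pose and applying the rig offset you measured, expressed in the witness cam's local space.

```python
import math

def rot_y(deg):
    """3x3 rotation matrix for a rotation about the Y axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply_rig_offset(witness_rot, witness_pos, rig_offset):
    """Hero position = witness position + witness rotation applied to the
    offset measured lens-centre to lens-centre in the witness cam's space."""
    return [witness_pos[i] + sum(witness_rot[i][j] * rig_offset[j] for j in range(3))
            for i in range(3)]

# Hypothetical numbers: witness cam solved at the origin, panned 90 degrees,
# hero lens centre measured 10 cm along the witness cam's local Z axis.
hero_pos = apply_rig_offset(rot_y(90.0), [0.0, 0.0, 0.0], [0.0, 0.0, 0.1])
```

Because the offset rides along with the solved rotation every frame, the hero camera inherits the full track rather than just a shifted copy of the path.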
ABSOLUTE AWESOME...LOVE IT...
I don’t understand everything about this video, but it looks really cool!
First thing that comes to my mind is using that tracking stability for better photogrammetry scans.
I had a theory a few years ago about mounting three wide-angle witness cameras (arranged in an equilateral triangle) around the hero lens, with the centre of the hero lens directly in the centre of the equilateral triangle. The idea of using a triangle would be you would have two cameras that are side-to-side relative to each other, creating stereo horizontal parallax; and the third camera would then provide vertical parallax, potentially increasing tracking accuracy.
Now you just need two 360 cameras on your hero cam. Boom.
Hey there, great video! I tested the technique a bit the other day and got okay results. But I have a couple questions. Even when I apply the exact focal length and sensor size of my shot cam to the offset virtual camera, it's a bit off. After quite a bit of fiddling, I got it close, but still not a usable matchmove for placing 3d objects.. Are you calibrating the shot cam footage to the 360 somehow(other than offset measurements) to get a solid match? Possibly locking a few tracking points of the 360 track to points on the shot cam footage somehow? Or do you only use this for long out of focus shots? With my tests, achieving the results you have for the shot at TC 10:35 would work. At TC 6:30, you show some mannequins tracked in. I'm struggling to achieve similar results with a great matchmove.
Another thing I did, which could be helpful to anyone reading, is shooting the 360 cam at 48 fps so there is a smaller margin of error when you are matching up the start frames. You can make your in and outs in the insta360 software, then render out as 24fps. You can dial this in even more with a subframe time offset on your camera in your 3d package, if it's still slightly off.
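The arithmetic behind that subframe offset is simple; here's a tiny sketch (function name and example numbers are just illustrative) that converts a sync offset counted in witness-cam frames into a whole-frame shift plus a subframe remainder on the hero cam's timeline:

```python
def to_hero_frames(witness_fps, hero_fps, witness_frame_delta):
    """Convert a sync offset measured in witness-cam frames into
    hero-cam frames: (whole-frame shift, subframe remainder)."""
    hero_frames = witness_frame_delta * hero_fps / witness_fps
    whole = int(hero_frames)
    return whole, hero_frames - whole

# A 13-frame offset at 48 fps lands between frames at 24 fps:
shift, subframe = to_hero_frames(48, 24, 13)  # 6 whole frames + 0.5 subframe
```

This is why shooting the witness cam at a multiple of the cine frame rate helps: the worst-case sync error shrinks to half a hero frame, and the leftover fraction is exactly what you dial in as the subframe time offset.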
Excited to play with this more. Thanks for the great start!
I would love to know how to use the Camera Tracker in Davinci Resolve to do this rather than using a separate app for that. I don't see anything on YT that shows that.
When I tried, I only got 1/2 of the 360 camera to track.
I used to think VFX artists only use 360 cameras for the HDRI environment lighting reference thingy, well this is new! Surprised that it tracks so well even with the distortions of a flattened 360 video.
My favourite camera! 🤗
08:33 Is the manual calibration really necessary? You can't get software to do the calibration with just a few synchronized stills from both cameras in an easily trackable environment?
Can you realign the camera inside the SynthEyes software and export directly to After Effects? Or export the camera metadata from Blender to After Effects?
Pretty impressive! Now, not being anywhere near your level, I might not have learned as much as I should have from this. And, lacking a genuine need, not going to pursue this further. Still, it’s extremely interesting. Thank you!
Yeah!!! I'll go and make some art right now!👍👍
That Matterport model reminded me of the 3d models of Wildfire in "The Andromeda Strain". If only Doug Trumbull was still around to see it.
Joblove, what an amazing name!
Good sponsored video to deflect the attention from the new killer Qoocam 3 Ultra :D
I’ve got zero intentions to use this on any project, but it's always interesting what unusual filming techniques you show us. You stand out from other camera people that just review cameras and that's it.
thanks dude!
best video ever
Joblove coming back..... Insta360 is my new cam. VFX and Josh dropping a video.... wow. It is time for the next gen.
we need more LOVEJOB in our lives
Yo Josh yo… I love the in-depth detail. I do Matterport in my work and on my channel. If you ever want to collab I can bring the Pro3.
How do you match slight differences in rotation? I’m trying to do this but can’t get the track to line up correctly…
This is awesome! Next frontier: Gaussian splatting for fully re-frameable shots. I'd love to see what you can do with that.
I wonder how this would work to replace something like the Jetset Autoshot workflow: tracking a camera, bringing it into Blender like you did, and combining it with a Polycam lidar scan.
Also, do you think DJI will allow recording the lidar data used for focus, to have scans for Blender meshes?
I thought this was really clever. I'm going to do some tests to see if we can pull it into the Autoshot workflow and automate the manual lens alignment pieces. Having the 360 frames at 24p or 25p (or whatever the cine camera frame rate is) makes all the difference in getting multiple footage streams to correctly align in SynthEyes.
How many snap blue jean shirts do you have?! You are the Steve Jobs of the Canadian Tuxedo...and I'm here for it!!!
I could use a few more
@@MAKEARTNOWCHANNEL Jay Leno might have some you can borrow.
Oh my soul is ready!!!
I just bought the X4 for my real estate photography business. I didn't know they updated their system for it! 🎉 Is it possible to make a video on it? Or point me in the right direction. Thank you!
You need to shoot the photos on the X4 in timer mode (3-second or 5-second).....then you need to offload all the photos to your computer and, using the Insta360 app, convert them into regular generic 360 files (basically not the .insv file Insta360 gives you)...... From there you AirDrop them back to your phone, open the Matterport app, and import them from the camera roll as a 3D scan, and it should compile the rest. Currently the X4 does not work with the Matterport app for auto-linking like its predecessors do, so this is a workaround until they update the app. I sorta mentioned it in the video, but the wording was not as specific.
@@MAKEARTNOWCHANNEL Okay great, thank you so much!!
Very cool to see the behind the scenes of videos, and new use cases for current technology. Now how the heck do I apply this knowledge to making a normal video?
Soo is this how the 3D images are taken that you see in the Apple Vision Pro?
Pretty significant VFX efficiency gains. I think the industry should call that VFX workaround "the lovejob".