Thanks for all of the great tutorials on how the system works. I'm interested in Jetset Cine for applications where there are few usable tracking points visible in the view of the main camera (e.g. the frame is full of moving people or moving elements like foliage or water). From your tutorials, it appears that Jetset is designed to have the iPhone camera facing the same direction as the main camera, so that they share a similar view of the scene. Is it possible to set the system up so that the iPhone faces in a different direction from the main camera - for example, to the left of the camera or behind the camera - where there may be a richer set of fixed points to track?
As you said, the system is designed around the phone and the camera pointing in the same direction. The iOS camera is quite wide angle (19-20mm S35 equivalent), so it can pick up more points than the cine camera usually can.
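For intuition, here's a hedged sketch of the field-of-view math (it assumes a Super 35 sensor width of roughly 24.9 mm, and the 50 mm focal length is just an illustrative cine prime, not anything Jetset requires):

import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=24.9):
    # Horizontal angle of view from focal length and sensor width.
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

print(horizontal_fov_deg(19))  # ~66 degrees for the wide iOS camera
print(horizontal_fov_deg(50))  # ~28 degrees for a typical cine prime

Roughly twice the angular coverage means the phone sees far more of the set, and therefore far more candidate tracking points.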
Great video! But I'm still waiting for the SynthEyes pipeline tutorial.
Ensure your camera is focused on the object you're scanning to get more tracking points.
Hey. Can we get a video on the cine camera pipeline to Blender? We only have the iPhone one, and that leaves me confused...
That's a good request. The basic behavior is the same, but it's a good idea to do a full tutorial.
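In the meantime, here's a minimal sketch of the step that's identical in both pipelines: attaching the plate to the solved camera as a background image via Blender's Python API. The object name and file path are placeholders, not Jetset's actual export layout.

import bpy

# Placeholder names: adjust to match your imported scene.
cam = bpy.data.objects["JetsetCamera"].data
plate = bpy.data.images.load("/path/to/cine_plate.mov")

cam.show_background_images = True
bg = cam.background_images.new()
bg.image = plate
bg.image_user.frame_duration = plate.frame_duration  # play the full clip
bg.alpha = 1.0  # fully opaque plate behind the CG

As noted above, the wiring is the same whether the plate came from the iPhone or the cine camera; what differs is which plate and lens data you bring in.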
I'm looking at the Atomos Ninja iPhone, as it can do 10-bit - the SeeMo 4K only does 8-bit. Is it possible to use this instead, or is there a particular reason you recommend the SeeMo unit?
At present we only support the SeeMo line of devices. They have been extremely reliable and inexpensive, and 10-bit isn't needed for the lens calibration process.
Hey, thanks for all these videos. In the engine, once I paste into the command line and the sequencer loads, I only get a black image on my image plate. I can see the video from my cinema camera in the engine separately, but it stays black as an image plane, and in the exported render it isn't even included.
OK -- can you post your question on the forums at forums.lightcraft.pro/? Then we can get a link to the take and test the behavior.
Perhaps this question has been already answered in another video; I'm confused which Accsoon SeeMo should I get - the SeeMo 4K HDMI Smartphone Adapter or the SeeMo Pro SDI/HDMI to USB-C Video Capture Adapter? Also, would I need the CineView transmitter?
Here's a good link: lightcraft.pro/docs/which-accsoon-seemo/
@lightcrafttechnology Thank you!
Do you need Tentacle Sync? I noticed it's off by default; is there a huge difference in the track without it?
Tracking will be the same with or without the Tentacle. It's very useful for automatic take matching in post and for synchronized real-time data. It's such a good device that we're now recommending it as standard.
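For the curious, the take-matching side is conceptually just timecode arithmetic. A minimal sketch (plain non-drop-frame SMPTE math; the timecodes are made-up values, not Jetset's internal code):

def tc_to_frames(tc, fps):
    # Convert "HH:MM:SS:FF" to an absolute frame count.
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

phone_start = tc_to_frames("14:02:10:00", 24)
cine_start = tc_to_frames("14:02:10:05", 24)
print(cine_start - phone_start)  # 5-frame offset to align the two takes

With a Tentacle feeding both devices the same timecode, that offset falls out automatically; without it you're eyeballing slates or waveforms.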
Can I use the Atomos HDMI output to sync the phone?
Are you using the Atomos device to record ProRes RAW? If it can output an HDMI signal that the Accsoon SeeMo can read, it should be fine.
Has anyone had issues with the proxy step? I'm not sure if it's the number of clips, but it doesn't even load for me.
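If the built-in step won't load, one generic workaround while you troubleshoot (standard practice, not the official Jetset path) is to transcode proxies yourself in batches and see whether one clip is the culprit. A hedged sketch calling ffmpeg from Python, assuming ProRes Proxy at half resolution:

import subprocess
from pathlib import Path

src = Path("clips")    # placeholder source folder
dst = Path("proxies")
dst.mkdir(exist_ok=True)

for clip in sorted(src.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "0",      # ProRes Proxy
        "-vf", "scale=trunc(iw/4)*2:trunc(ih/4)*2",  # half size, even dims
        "-c:a", "copy",
        str(dst / clip.name),
    ], check=True)

If one clip consistently fails here, it's likely the same one stalling the proxy step.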