Some guys use the term "indie virtual production" when it still costs thousands of dollars...
but this video is the real indie... thank you, bro!!
It really helps someone from a third-world country where resources are very limited.
It's quite unbelievable, because many people would keep this knowledge to themselves, and you decided to share it. Big, BIG THANKS, MAN!
Can confirm that this also works with an iPhone 7. Thanks for making this tutorial!
Green bits are caused by lighting on the green screen. Fix it by altering the vignette value in the colour keyer transform pass.
Thanks, this helped with the edges!!!
Finally. After watching all kinds of virtual setups with trackers and VR kits for tracking, and making a virtual cam with the iPhone, I kept thinking "can't we just... use an iPhone to do this?" But no one ever did, though I was sure there was a way. Sure enough, your video finally showed up in my recommendations after watching all those other ones (the algorithm finally pays off), and you delivered what I assumed could already be done. Thanks
You should consider that this wasn’t far from the limit of where I got with this approach, and I don’t know that it is really viable for production of any kind. In some of the comments, I’ve listed some of the companies making plugins to make the phone tracking approach a little more reliable, but I don’t know that any of those are production-ready over a year later.
@@hyperbolicfilms Yeah, no doubt. I was just looking for simple stuff around the house, mainly to have the knowledge of it. I also didn't feel like spending $600+ on sensors and such just to mess with it a bit when I have an iPhone, an iPad, and a DSLR lying around. It's a good start to bounce off of.
I bought HitFilm Pro yesterday and am getting an iPhone 12 today. I have been working on a large-scale Unreal scene and have been looking into getting into virtual production. Thanks for posting this. It's motivating that others are shooting for the same or similar goals.
Be sure to check out CamTrackAR from Hit Film.
Very cool. Definitely not an easy setup. Hope it gets more fluid the more people get used to filming this way. Thanks for making the video!
Not arf!!!
Great video, man! I have a GH4; my question is, what cable should I use? Should it be the same as for the GH5? Could you please give more info? Thanks a lot!!🎥📸
Virtual hand claps for you sir! Bravo, well done, and thanks x1million.
I reached 13:20 and I don't have "VirtualCameraGameMode" in my list in GameMode Override. I followed all your steps before that. What did I do wrong?
Hey man, same here. Did you figure out any solution?
@@deeeny_ No. I enabled all the plugins and nothing happens.
@@1978rnio It's the Unreal version. I downgraded to 4.23 and it worked.
@@deeeny_ Thanks, I'll try downgrading.
At 8:18, my Video tab isn't showing my Cam Link. It's just empty. My Cam Link is installed correctly on my computer, since it's working in other apps. Any help?
Wow, this was such an epic tutorial, completely revolutionary!!!
As I’ve said in some other comments here, this method might not be the best approach going forward. It seems like there’s at least one app being developed to handle the tracking of the camera with a phone much better (filmengineers dot com had something in beta which seems like it could be the perfect solution).
@@hyperbolicfilms There is no video or anything else about that filmengineers dot com solution in this field; as far as I've searched, there is nothing but their page.
@@hyperbolicfilms Thanks a lot for this. It's been one year now; do you know if any newer tracking apps have been released for this yet?
@@anirudhsays1534 I have not been following closely. There were a few promising apps like byOwls and Filmengineers, but I have not seen anyone's results using those.
I have been thinking of trying this again a year later to see what’s possible now. Hopefully I’ll have time before Christmas.
Thanks so much for the video tutorial; it really helps me as an entry-level user.
A lot of this information may not apply in more recent versions of Unreal Engine, as a lot has changed in almost 2 years.
Great video man. Helped me out a lot. I am doing similar work with my Canon T5I and T6S and getting decent results.
Hello, at 13:23, the VirtualCameraGameMode option is not visible in the World Settings dropdown menu. Please help!
Hi, my apologies, I didn't see this update. I think that mode was removed from Unreal Engine 4.25 and later.
In my experience, the best way to do some kind of virtual production now with an iPhone is to do it with the CamTrackAR app using the iPhone camera too. The method I described in this video has been broken by successive updates.
@@hyperbolicfilms thank you!!
Hi, very nice tutorial.
How did you get access to the "VirtualCameraGameMode" in the World Settings? :)
I can't find it in Unreal 4.27 or 5.1.
Same issue.
You are amazing… technology is amazing… this is amazing… UE5 is amazing… YouTube is amazing. Thank you, man. Please do more green screen UE5 tutorials.
I need a good camera and a good green background, but THANK YOU SO MUCH, SIR!
I'm from Germany and I must say thank you for your help. Now I will make livestreams with this.
Blind Gamer Good luck!
This is such a great video and has saved me probably a year or more's worth of research! Please try outputting the Unreal feed to a data projector to skip the green screen. Also, in the Unreal video today, Matt Workman said he had to genlock the camera to the video feed; I didn't quite follow why, and then YouTube suggested this video!
I'm not sure I'll get to the projection or LED wall stage of things, just based on the limitation of the space I have in my apartment. I just moved to a larger space, but still there's no real good space for a projection based on the layout. If I can get into someone's studio for a while, I'd definitely be interested in playing with it, but I've moved more towards animation in Unreal rather than live action/CG background integration because it's doable for less money.
I'd suggest looking into Ian Fursa's work (th-cam.com/video/_FC1lvhoPRc/w-d-xo.html). He posts a lot of his experiments on Matt Workman's Facebook group and he's definitely shown that you can have some success with a low cost projector and a Vive tracker, so probably around $1000 for the hardware.
@@hyperbolicfilms Thank you for the link, and I'm not sure either way will be easy on our pockets!
Thanks for sharing this video. Much appreciated!
Excellent for beginners. One question: how did you connect the iPhone camera with Unreal?
Hi, I'm currently working on implementing a setup like this at my university; is there an updated video I should use as a reference instead?
This does seem like something that would work, but it’s also 2 years old :)
Thanks for the great video!
Hi, a teacher at the college where I took a virtual production course this summer tried to recreate it with the latest version of the software, and he determined that there was no good way to get sync.
I still haven't found a good, reliable, affordable way to do virtual production. I had tried Assimilate LiveFX, which is a very good live compositing program that says it supports tracking from phones or the affordable Intel RealSense camera, but even using two fairly powerful computers with multiple SDI I/O cards and timecode, it was still out of sync. I could record the Unreal background and the video in the camera and sync the two and use LiveFX for a video preview though.
There is a new, very affordable tracker on the market now (it has similar packaging/marketing images to the ReTracker Bliss, but I can't recall its name). It might be the game-changing device, but it still doesn't have a lot of software support.
The fastest/most affordable way to do any of this is using the CamTrackAR app on iOS, but it has its limitations and isn't real time.
@@hyperbolicfilms thanks for the very detailed answer! I really appreciate it. 🙏🏼
Great tutorial! This is basically all the kit I've got at home; I will give this a try :) Thanks!
You just opened my mind... THANK YOU!
I think there is a frame-rate offset... if you type in how many frames you want to offset it by, 4-6 usually fixes the drift. Don't quote me, but it's something along those lines.
Thanks! I haven't been able to find that, but I have heard that there should be something to compensate.
Can't wait to get home from work and watch this
This is so valuable. Many Thanks.
Sorry to bother you again with so many questions, but do you get latency with the Cam Link? I can see at 4:25 that you move the tripod and there is some latency on the iPhone. I just want to know whether to go for the Elgato Cam Link or a Blackmagic DeckLink, which is about four times more expensive.
I never solved the latency problem with this approach. I am sure there is a way to do it within Unreal, but I did not come up with a way.
From my experience doing corporate streaming for the last year and using both Blackmagic and Elgato setups, the Cam Link is very reliable, and I've never noticed much difference in latency compared to my Blackmagic Intensity card. The $5 to $10 no-name HDMI capture cards do seem to add more delay.
@@hyperbolicfilms Oh, so you are using a Blackmagic DeckLink and getting delay anyway? Perhaps it might also be the computer itself; maybe a computer with a faster CPU, an expensive RTX video card, tons of memory, I don't know. There's a guy, Matt Workman, from the Cinematography Database YouTube channel; I saw a long video interview with three Unreal Engine guys and him, and he said repeatedly that the only setup he has found with no delay is the Blackmagic URSA Mini 4.6K G2 connected via SDI to a Blackmagic 8K DeckLink card. He stressed that so many times.
@@ivancuetowicq At the time I made this video, I was using a 5-year-old tower with an Nvidia 960. I now have a 3080, so I'm hoping to get some free time soon to retry this whole thing on that new system with the latest version of Unreal to see if it's any better now.
@@hyperbolicfilms Cool, man, thanks. I'll wait for it.
Thanks for illustrating this overall concept, and in such good detail. Very inspiring!
Thanks for such a detailed video, but I still have a question: does it track dolly movement too, or just fixed tripod movement? Please answer.
It should record any movement, except Dutch angles. Because of the iPhone’s inability to lock into landscape mode, tilting into a Dutch angle causes the Unreal Remote app some problems.
@@hyperbolicfilms thanks for this valuable information.
Very nice tutorial
This is everything I've been looking for!!!! Oh please, this is gold for filmmakers
It's probably not good for production. Just an experiment.
I think the green flickering is due to the auto vignetting in the chroma keying module. And the syncing issue is something to do with genlock on the camera. Please correct me if I am wrong.
Awesome video! Did you actually move the video camera itself while rendering the 15-second video? I didn't understand that part. If I had a green screen that was infinite all around, could I orbit around the subject with my camera and have the Sequencer record my movement? Thank you!
In theory yes, but this method has been broken since Epic changed things in more recent versions of UE. Their new mobile app doesn't work the same way and doesn't seem to work with many iPhones. I've tested it on my iPhone 11 and a new iPad, and they won't connect with Unreal.
@@hyperbolicfilms that's sad. Thank you for your reply!
Loud and clear!! It was so inspiring, thanks!!
This was awesome! Thanks for the detail!
Awesome. This is by far the most clearly explained video I have watched about virtual production. I only have one question: is it possible to connect live video from my camera to Unreal Engine wirelessly using a 7D Mark II? The 7D Mark II has a built-in Wi-Fi feature which works when you purchase a card that goes in the camera's SD slot. Would that work without having to connect the camera via a Cam Link device?
I followed this video for virtual production, thank u faculty at GBC
This is a nice tutorial!!!! Thank you... I will try to follow the steps.
This is a great tutorial. Thanks! Any chance you have an HD version so we can see the drop down selections better?
I uploaded a 4K file, so I'm not sure if YouTube is still crunching that or what.
Great tutorial! Thank you!
Thank you for this tutorial, but I want to ask: can I output from this to NDI?
What’s the app that you use for tracking on iPhone?
Thanks for this very helpful introduction. Answered a lot of my questions!
Great to hear! If you experiment with this, I'd love to hear your findings.
Thanks for sharing; it's actually not an easy setup.
Very helpful video, thanks for this!
Wow I have never seen this before. Pretty awesome
very nice and quick info, thanks
Thanks! Got it working with my iPhone 11 Pro.
Using the iPhone 8 instead of the HTC seems to be much easier and works perfectly... What are your thoughts on it?
What are the details on the iPhone mount? It looks like you're using an adjustable mount and an adapter to the camera body? Links or details about both would be appreciated!
John Gabriel The mount is a simple tension-based phone holder attached to the camera by a cold shoe to 1/4"-20 mount. This one is a Manfrotto phone holder, so it was about $20, but I have a few $2 ones that would do just as well.
Looks like the offset between the phone and the camera? At this close range, the angle changes a lot relative to the field of view?
Thank you for the video. Is it necessary to have a camera? Can I control it with only an iPhone 12?
I have not seen a way to do it live entirely on an iPhone. But the app CamTrackAR may be a better way for an iPhone-only approach.
@@hyperbolicfilms Well, thank you very much!!
By the way, you might get different delays between the tracker (the phone in this case) and the render. Remember that there's a delay in capturing the image, and this might well differ from the frame rate of the transform captured from the phone. So usually, with a Vive tracker, we add a buffer of N values, where N is a tweaked value that is eyeballed to give the approximate correct delay between the tracker update and the video display update. Anyway, keep at it; it might be a useful thing to be able to do!
zoombapup I think there are too many limitations to this approach. Shortly after I recorded this, I realized that doing any sort of severe Dutch angle with the camera would cause the iPhone to switch into portrait mode, which makes the tracking unreliable.
My best results came from capturing the camera movement data, recording a take of the live-action plate in camera, and combining the live-action and CG elements in After Effects. It's definitely not nearly as accurate as a Vive tracker would be, but I think it can work for some shots.
@@hyperbolicfilms Good point. No way to lock the camera in landscape mode? I'm using a Vive tracker and a single base station, and that approach works pretty well. I think these virtual cameras are probably best for doing entirely virtual shots, though. I've hooked up a Vive-tracker-based shoulder rig with a virtual camera setup and that works well (I made some custom controls for an Xbox PC controller). I'm also doing some work on controlling Take Recorder via an app on my Surface Pro so I can do different virtual setups, pull focus, and play with the virtual cameras.
@@zoombapup I've mostly moved on to playing with mocap and animated characters for my own work. I think Epic could make the iPhone do what I was hoping it could with some adjustments to the Unreal Remote app, but considering the situation now with Apple, I doubt it would happen. It's frustrating because it is close to being a good solution for mixed live/virtual productions, but just not enough in the app.
hyperbolic films The virtual camera spawns a new camera instead of using yours.
How should I go about adding the buffer? I'm just using an iPhone, but the tracking comes in later than my video feed.
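For anyone wondering what that buffer might look like in practice, here is a minimal sketch of the delay idea described above, in plain C++. The PoseSample struct, the buffer length, and the usage line are illustrative assumptions, not Unreal API; inside the engine you would buffer FTransform samples the same way and apply the oldest one each tick.

```cpp
// Minimal sketch of the tracker delay buffer: keep the last N tracker poses
// and drive the CG camera with the oldest one, so the tracking lags by
// roughly the same amount as the captured video. N (e.g. 4-6 frames) is
// eyeballed per setup.
#include <array>
#include <cstddef>

struct PoseSample {            // illustrative stand-in for a tracked transform
    float px, py, pz;          // position
    float qx, qy, qz, qw;      // rotation (quaternion)
};

template <std::size_t N>       // N = frames of delay
class PoseDelayBuffer {
public:
    // Push the newest sample; get back the sample from N frames ago.
    PoseSample push(const PoseSample& latest) {
        PoseSample delayed = ring_[head_];              // oldest slot, about to be reused
        ring_[head_] = latest;
        head_ = (head_ + 1) % N;
        if (filled_ < N) { ++filled_; return latest; }  // warm-up: no delay yet
        return delayed;
    }
private:
    std::array<PoseSample, N> ring_{};
    std::size_t head_ = 0;
    std::size_t filled_ = 0;
};

// Hypothetical per-tick usage: feed the phone pose in, apply the delayed
// pose that comes out.
//   virtualCamera.setPose(delayBuffer.push(phoneTrackerPose));
```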
Great tutorial!!! Any idea how to turn your TV into a makeshift LED wall? Not for an official production, but for learning and practicing without spending thousands of dollars.
I'm trying to figure that out myself; if I get it going in the next few weeks, I'll post a step-by-step as well. I assume it still has something to do with nDisplay. I was thinking that a green screen would be good for wide shots, but maybe an 80-inch TV would work for close-ups and toy-commercial/product-shot type things.
thanks for this! definitely wanna give this a try.
this is blowing my mind
Hey there, great video. It has been massively helpful. Can I ask, can you still use the URemote camera control fully this way? I've found I can control the camera with the URemote, a keyboard, or a controller right up to the point where I change the composite output to the player viewport; then I seem to lose the touch controls and mouse/keyboard inputs once it plays the comp. Any idea how to keep both the inputs and the comp?
I haven't had much of a chance to revisit this workflow since the summer, as I've been back at work the last few months, so I don't have a good answer for you. The URemote does seem to pick its own starting point, and there's no way to change much. It's a very blunt tool, and I hope they add more control at some point.
@@hyperbolicfilms thanks. No worries. If I work it out I’ll let u know. Cheers
Thanks a lot for the great tutorial; I've been looking everywhere for this. Do you have any updates on tweaking this?
Improved lighting, 25 fps, RTX 2080, and a better camera will fix this.
Hello, I already set up the video camera feed and the iPhone as a tracker. I haven't done the Composure part yet, but I was wondering: after setting all this up, then what? I mean, do you know how to record this and export a video? Can the camera movements tracked by the iPhone be recorded and possibly exported as an FBX? I know you will tell me yes, with Sequencer, but how? By the way, for now I'm talking about just the camera tracked with the iPhone; no video feed necessary. I'm doing a lot of tests focused mainly on tracking the camera.
Excellent! Is the camera track exportable??
I don’t know. The approach that I was taking after this was to record the green screen in camera, render my CG background in Unreal, and composite in After Effects.
As of three days ago, there is an app in beta testing for Android and iOS that makes this approach moot. It apparently does the phone tracking and allows you to offset the position data to account for where your camera is. I'm not a beta tester, unfortunately, so I can't say how well it works, but it seems like it'll be a great tool for indie filmmakers if it does.
Great tutorial ! 👍👍
Especially inspiring because I just bought a new computer, a Lumix G7, a Zhiyun Crane, and a new phone, because my Moto Z2 had lost its mic to sanitizing liquid.
You can get rid of the green corners if you change the material to "M_SinglePassDiffColorKeyer". Question: do you know of better methods or plugins for very good green-screen keying? Or would you stick with the built-in tools? Thanks
Thanks for the info. I have not had the opportunity to dive into this much since I made this tutorial. I’m going to start using Aximmetry in the near future to see how well that does, partially because of the keyer and because it already has the input and output side of things handled.
@@hyperbolicfilms Would be great to share your experience again :)
Incredible work!
Thanks a lot!
Hello, my iPhone XR always shows "Connecting to *****". Unreal Remote 2 was not found in the phone's Settings, so there was no way to set permissions.
I’m not sure. You may want to contact Epic Games support
Same problem. Could not connect...
This video is old now but still impressive! I've tried to do the same but outdoors and offline using BlendARtrack (I'm on Android; I haven't tried on iPhone yet).
I was close to a good result, but even with the stabilization turned off on my different cameras, it's not perfect :(
I think the problem is that even with IBIS off, the sensor still moves a bit and throws the tracking off slightly... Did you try doing offline/outdoor tracking? On iPhone there is also the CamTrackAR app.
See you!
Great video! One thing though... my camera shows up in Media Player, but no video shows... yet when I open the camera in Windows, the video works fine... any solution? Thank you!
Is there an Android way yet... or iPad...?
Sir, I don't have an iPhone; how can I do this with my Android phone? Please tell me.
Does it matter if the capture card is 1080p or 4k?
This is awesome, I will try this!
At 12:29, how do you copy and paste?
??????
When I connect with Unreal Remote 2, I can see the scene on the phone but get no camera control?
I haven't tested the new version of Unreal Remote, so I can't say for sure.
Amazing tutorial
If we use the iPhone's sensors for camera movement in Unreal Engine, is it only for rotational movement, or can we also use it for positional movement? Please answer; sorry for my bad English.
That’s been answered a few times already, it does both 👌🏼
This is a very nice tutorial! But I have a few questions:
1. Do you need to match your real camera's lens info to the Cine Camera in UE? I saw that when you move the camera, the two layers of the image are not aligned.
2. I didn't see you make any offset in Unreal to compensate for the difference between your iPhone's and your real camera's physical positions.
These are the places where I'm stuck, and I wonder if it's possible to get past that hurdle. Seeing others' work with the Vive tracker (like Greg Corson's great project and tutorials on his YouTube channel), there are specific steps to follow to do the alignment and get everything accurate. I don't know that it's possible to get perfect alignment of lens and offset with this iPhone method using the virtual camera. There are a lot of aspects of how the virtual camera works that are a mystery to me, such as what direction it will start facing when I begin my scene. It will start in the position of the Cine Camera in the scene, but not in the direction that the Cine Camera is pointed. I don't know if this is a limitation of my knowledge, or a limitation of the virtual camera and remote control app.
There's a fair bit of documentation on how to use the virtual camera, but not a lot on how it works.
Also sorry for the slow reply, I haven't been seeing notifications for new comments!
hyperbolic films Did you ever work out how to solve the direction the iPhone/iPad view starts at? Even if I set waypoints in the app, my camera goes to the point, but again at an annoying offset angle.
@@tmaintv No, it seems to be a limitation of the app. I think it creates a camera specifically for the virtual camera.
I really think the Byowls Android app will surpass all this because they are actively developing it and adding new features. Epic hasn't done much to update the app, and the documentation is so minimal that I was guessing at a lot of things when figuring this tutorial out.
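On the physical-offset question raised earlier in this thread (the iPhone sits some distance from the camera's lens), the general fix is to compose a fixed mounting offset with the tracked pose every frame. Here is a minimal sketch in plain C++ with illustrative math types; in Unreal the same composition would be a single FTransform multiply, with the offset measured by hand once.

```cpp
// Sketch of compensating for the phone-to-camera mounting offset: the phone
// reports its own world-space pose, but the lens sits at a fixed offset from
// the phone, so camera pose = phone pose composed with that offset.
struct Vec3 { float x, y, z; };

struct Quat {                                   // unit quaternion rotation
    float w, x, y, z;
    Vec3 rotate(const Vec3& v) const {          // v' = q * v * q^-1, expanded form
        Vec3 t{2*(y*v.z - z*v.y), 2*(z*v.x - x*v.z), 2*(x*v.y - y*v.x)};
        return { v.x + w*t.x + (y*t.z - z*t.y),
                 v.y + w*t.y + (z*t.x - x*t.z),
                 v.z + w*t.z + (x*t.y - y*t.x) };
    }
    Quat operator*(const Quat& r) const {       // Hamilton product: apply r, then this
        return { w*r.w - x*r.x - y*r.y - z*r.z,
                 w*r.x + x*r.w + y*r.z - z*r.y,
                 w*r.y - x*r.z + y*r.w + z*r.x,
                 w*r.z + x*r.y - y*r.x + z*r.w };
    }
};

struct Pose { Vec3 position; Quat rotation; };

// lensOffset/lensRotation describe where the lens sits relative to the phone,
// measured by hand (e.g. a few cm forward and down). Returns the lens's world pose.
Pose cameraFromPhone(const Pose& phone, const Vec3& lensOffset, const Quat& lensRotation) {
    Vec3 off = phone.rotation.rotate(lensOffset);        // offset into world space
    return { { phone.position.x + off.x,
               phone.position.y + off.y,
               phone.position.z + off.z },
             phone.rotation * lensRotation };            // compose the local rotation offset
}
```

Note this only handles position and orientation; matching the lens's focal length/FOV on the Cine Camera is a separate calibration step.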
Any chance of redoing with 5.4? Hopefully they’ve simplified the process. There are like 75 steps to put a thing in front of a camera. I’ll never get Unreal.
I haven't yet tried the new Vcam app to see if it has potential in this old style way of doing things.
I did just release a new tutorial using Aximmetry, with which you can now very easily use an iPhone for both the camera and the tracking. I'm still testing using a DSLR/mirrorless camera with the iPhone as a tracker, but my initial tests looked promising. I'll have a tutorial on that as soon as I am sure of the workflow. Aximmetry uses Unreal Engine under the hood, but really takes care of the camera input and keying aspects.
There is also a new subscription-based app called JetSet from Lightcraft Pro that lets you use an iPhone for virtual production.
@@hyperbolicfilms Thanks for the detailed reply. I saw the JetSet video and it looks pretty promising although I don't know if the price is right. May be worth a look though. I'd tried their older app which was basically Facetime AR and wasn't very good but hopefully they've ironed it out. I hope that some day there's a simplified method and also maybe some Unreal slimming down, i.e. a "beginner mode" that works but maybe offers only a small percentage of the options because it's pretty overwhelming.
Thanks for the excellent explanation
I am looking to combine live action from a Cam Link 4K (DSLR) with a matching virtual camera background from Unreal, using the built-in chroma keyer in OBS and skipping the Composure elements altogether. I want to use an iPad or iPhone for the tracking, but I can't tell from the video whether you get world translation with the iPhone or just rotational movement of the camera. I.e., can I move freely in the virtual environment?
Will Huff You do get both translation and rotation with this technique. There are a number of accuracy problems, though, that I haven't sorted out. The Unreal Remote app seems to be adding some smoothing that can't be removed, so it's not a perfect match the way a Vive tracker can be. That is definitely a more tried-and-true method, with far more documentation than this approach (which only a handful of people have ever told me they've tried).
hyperbolic films ahh ok I see. Thanks for that. I’ll prolly do a test then get to ordering the vive trackers and go thru those videos on setting up. Thanks!
When I export the sequence, the rendered video or image sequence is four times faster than it should be.
How would you go about syncing the tracking and the video? At the moment, my tracking data seems to come in a little after the video feed, and I would like to somehow delay the video feed to get them in sync.
Is there a way to stabilize footage in post if I record video on an iPhone while shooting video on my camera in a rig similar to what is shown in the video? I guess a better question would be: is there a way to utilize the iPhone's sensor suite to stabilize footage the same way GoPros or iPhones do? I'm thinking of how Sony released post-processing software that can stabilize footage using sensor data, so I feel like this is possible.
No idea. I think this would take Epic making a big update to their app, which seems unlikely.
Does this need a Vive Tracker or HTC gear? Or only the plugins and an iPhone to connect and track the camera?
Guys, I just tested this with a person on a green screen. The playback after export will be in super speed, since UE4 only likes image sequences. Just a warning: this method is not reliable at the moment. Save your money and get a DeckLink so you can output to a recorder. :)
I think that has to do with the way Unreal Engine processes the two elements. It's taking more than real time to render the CG element, but the video is recorded in real time, so the final video comes out fast-forwarded.
In my later experiments, I recorded a take to capture the virtual camera motion, rendered out the CG element, recorded the live action in the physical camera, and then composited in After Effects.
I think there's a good chance that the Byowls Android plugin will eventually be better for this, because they are still actively developing features. Epic put out the Remote and hasn't added anything new since the initial release.
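To put illustrative numbers on that explanation: the apparent speed-up is just the ratio of the sequence's stamped frame rate to the rate the comp actually rendered at. The 7.5 fps figure below is an assumption for illustration, not a measured value.

```cpp
// Back-of-the-envelope for the "super speed" export: footage captured in real
// time played back on a timeline whose frames took longer than real time to
// render appears sped up by sequenceFps / renderFps.
#include <iostream>

int main() {
    const double sequenceFps = 30.0;  // frame rate the exported sequence is stamped with
    const double renderFps   = 7.5;   // hypothetical real-time render rate of the comp
    std::cout << "apparent speed-up: " << sequenceFps / renderFps << "x\n";  // prints 4x
    return 0;
}
```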
Hello. So, with this method, can the composite (iPhone and video camera feed) be recorded? I have tried everything and it just can't record anything. When trying with Sequencer and Take Recorder, it creates another camera that doesn't record anything, because it's a "ghost" camera that disappears as soon as you stop playing. Can you please tell me how to record something after doing all this setup?
Hey man, thanks for this amazing tutorial. Just one question: I don't know why the VirtualCameraGameMode option doesn't appear in the GameMode Override options, even after installing all the plugins. Thanks very much C:
I think things may have changed in more recent versions of Unreal. This was made in 4.23 I believe.
There is a new version of the virtual camera app, but I have not been able to get Unreal LiveLink to see my phone with it.
Thanks very much again for this tutorial!
Spoiler alert. Deckard is alive! :) Thank you for a great tutorial!
Glad you like it!
I have done almost the same :-) I have an iPhone 12. It's connected and I can see the screen, but the only interaction I get is that the actor jumps if I touch the screen. The camera movement is not working :-( (YET).
Hi, great idea. What if you use an NDI video stream?
Presumably you could follow these steps and just use the NDI in place of the HDMI input, but I have not tried it.
Did anyone figure out why the camera feed keeps disconnecting?
Legend 👍🏼
You can now do this much more easily than what I was attempting here: in Aximmetry with their new Eye app.
Do you know if this would work with an iPad? I think an iPad would be better, just to have a bigger display to see better.
I have only tested it with an old iPad Air, which did not work. Newer generation iPads may work.
@@hyperbolicfilms What about an iPad 10.2, also called the 8th-generation iPad? I see in its specs that it doesn't have a proximity sensor. Would this lack of a sensor prevent walking around?
@@ivancuetowicq The best way to check is to download the Unreal Virtual Camera app from the App store and see if you can move around in your Unreal scene. Walk around with the iPad and see what translates.
ARKit didn't respond with the location and rotation in UE 4.26; does anyone know what's wrong?
Also, whenever I try to capture video, it captures the virtual camera control HUD as well (even though I otherwise don't see it, even on the phone), and the quality is really terrible compared to the previews. Any ideas, anyone?
What is the name of the app for iOS?
The sync issue is a common problem. Most people I know delay the tracker data, because its frame rate is far higher than the video source's.
My iPhone connected but did not drive the camera, and I wonder why. I did not use a ThirdPersonCharacter. Why didn't you use the VirtualCamera2Actor as in the official information? (I also tested a VirtualCamera2Actor with Live Link, without success.)
I made this process up based on the documentation and information I could find at the time about a year ago. I never saw anything in regards to VirtualCamera2Actor.
@@hyperbolicfilms Thanks for the reply. It's under the link you posted, right here on the second page:
docs.unrealengine.com/4.26/en-US/AnimatingObjects/VirtualCamera/VirtualCameraActorQuickStart/
@@hyperbolicfilms After many tests I found out it only works up to 4.25. In 4.26 it connects but does not work properly. Unbelievable that there is no information provided by UE.
Do you think it's possible to use an Android smartphone as the tracker instead of an iPhone??? Please kindly answer my question; thanks.
Search for something called ByOwls. They have an Android app for Unreal.