So finally I've got a Patreon page, and to kick things off I've released a Level Transition Blueprint that you just drop into your project to get it working. I will upload a tutorial next week, but if you want the blueprint right away it's on my Patreon: www.patreon.com/richardfrantzen
And if you get the VIP tier you get access to our secret Discord channel!
Thank you for not cutting out moments where things don't work. This has helped me not get frustrated. I don't like watching vids where the host makes everything look so easy, because working with new tech... IT'S NEVER THAT EASY
Jay Trumbull haha thanks! Yeah, this journey has been more than frustrating 😂 but still so much fun!
Yeah, a lot of the time when I see someone in a tutorial have a problem, I then know how to fix it myself if I hit the same one.
Thank you so much for posting this. There isn't a lot of good information available on this topic and you are doing an amazing job of bridging the gap
Hey Richard, thank you so much for pushing me to learn Unreal. I'm a Nuke compositor and I'm fascinated with all these new possibilities. I watched the Unreal showcase a few days ago and I thought: maybe I can do "the same" with the Vive trackers. I was wondering whether someone was already looking in the same direction, and I found you! Please let's continue growing and learning about this topic, it's amazing!
I would like to thank you very much for your work and effort in sharing this material.
I have been studying for months to use Unreal as a new film composer, and it is amazing how there is no adequate material or literature for this... Everything is super hard to find and there is no compilation of information... a real puzzle.
Many, many, but many thanks for the video. I don't know you but I already owe you one.
leon malatesta thanks mate! Same here! That’s why I started doing these videos. So everyone can evolve faster and collect info together!
Richard - thank you so much for sharing your knowledge on this! Great video. I've been wanting to build a "poor man's" VizRT for years and I think this tech gets very close to that goal. Looking forward to seeing more of your work and tutorials.
Good educational video, Richard. I would really like to see more on how to composite in virtual production with Unreal Engine.
Richard Frantzén, thanks for the great tutorial. A couple of suggestions that may work (unfortunately I can't test them myself). A: instead of Event Tick you can use the timecode (I don't remember the node name). Event Tick depends on your workstation, so the results may be inconsistent. B: theoretically, if you key your camera feed before sending it to Unreal, you can place a media actor in your camera blueprint and be independent of the media plane in the scene. So you could walk around freely, limited only by the base stations.
Ar. S. Gonna have a look at that timecode instead! Check my latest video, I've been experimenting with composite planes :) thanks
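For anyone exploring suggestion A, here is a minimal C++ sketch, assuming the stock SteamVR plugin and UE4's timecode support (FApp::GetTimecode needs a timecode provider configured); the class name and device ID are placeholders:

```cpp
#include "SteamVRFunctionLibrary.h"
#include "Misc/App.h"

void ATrackedCameraActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    FVector Position;
    FRotator Orientation;
    const int32 TrackerId = 1; // placeholder; enumerate IDs to find yours

    if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
            TrackerId, Position, Orientation))
    {
        // Stamp each sample with the engine timecode rather than relying on
        // tick cadence, so downstream sync doesn't depend on the workstation.
        const FTimecode Stamp = FApp::GetTimecode();
        SetActorLocationAndRotation(Position, Orientation);
        // ...store (Stamp, Position, Orientation) for later re-sync if needed
    }
}
```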
Thanks for sharing this video Richard! Lots of good info here to dive into Virtual Production. Keep up the great work, and the innovation!
This is amazing work! I am experimenting with some stuff, this is incredible! Good job!
This is an incredible process!!! Can't wait to see what else you do with it
Excellent information Richard. Thanks so much!
This is truly inspiring stuff. I can't thank you enough for taking us through your process.
Thanks. Great tut. One suggestion to reduce jittering is to bake lights which are not moving. That will take off a huge amount of load.
Cheers
Yes! Will look into that! Thanks mate :)
This is AMAZING! Exactly what I was looking for. Thanks so much for creating this walk-through!
Dude, you are doing great work!! Experiments are like that, don't feel bad about the green screen!!
It's amazing... an incredible amount of information. Thank you very much.
Hi Richard,
I just watched the whole video... it's amazing... an incredible amount of information.
In the next video, could you make it clear how to apply the chroma video layer? By not having a green background at the moment of explanation, you kind of reversed the process, putting a 3D element in the real environment... I was a little confused. I didn't understand the relationship.
Thank you once again.
leon malatesta yes of course! But if you put the green screen layer in OBS, it's just a matter of adding a key filter in OBS and putting the CG layer underneath! Thanks mate!
Thank you so much for taking the time to share your great technical work!
25:43 Check Show Engine Content and the axis guide shows up.
Thanks so much for this unique tutorial... no one else went so in depth. It is the only video I found that explains in such detail how to set up virtual production equipment... BUT I have to admit I already got a headache... too complex, man... I wish it were easier... Hope you create more tutorials.
08:30 Yay that's me!
Good that my huge muscles were put to good use!
Charlie Bandoola haha nice!!!!
Hi Richard. Thanks for the video.
About the hardware, do I need the whole HTC Vive kit, or would just the base stations and the tracker do?
19:34 - this is where the setup of the virtual production in Unreal Engine starts
Waiting for your other video... thanks mate for the info
Man, this is really good, and it saves a lot of money compared to using Stype or similar gear. Is there any chance of a tutorial just for the gear setup, please, and exactly what equipment you need to use?
Hi. Thanks for sharing your experience. Why do you need the HTC Vive Pro headset? Could I use only base stations and a tracker? If not, is it possible to use the older HTC Vive (not Pro) bundle?
You can set up with only two base stations and one tracker, even without the HMD. Base stations v1 work perfectly. Version 2 is only needed for large areas.
@@ablevi5774 do you have a link to set this up?
Probably a stupid question, but:
Since the 2 base stations are relatively behind you, maybe that is why you had a hard time making the Blueprint detect the device? Is that a possibility? (The problem you had around 25:15.)
Hello Richard, thanks for this tutorial. I have another question: did you try two or more cameras? Can we switch between multiple real-time signals from the real camera and UE4?
I am also wondering if more than 1 camera can be connected and how much more demanding it is for the PC
Is it possible to use the Vive tracker and base stations without the Vive Pro display?
It is but I don’t know how!
@@richard.frantzen Thank you for the reply and the tutorial, it really helped me out a lot, and I got as far as adding the axis on the camera. I figured out how to do it with only a tracker and base stations, but when I hit "simulate" the axis spins out of control; have you run into that problem before?
@@OscarHernandez-js3pk I'm struggling to make it work like you did.
How did you achieve it without the HMD?
If anyone has trouble exporting the camera movement sequence: if your resulting movie just shows a static camera, it's because your camera is still controlled by the tracker. When exporting a movie, a new Play in Editor session starts that attempts to recreate all the recorded "moves". But since your camera location and rotation are set each tick by the vive tracker, the engine isn't able to replicate the recorded sequence. So when exporting the movie, make sure to disable your tick function and re-enable it after the export is done.
Thank you so much!
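A minimal sketch of that toggle, assuming a C++ camera actor (in Blueprints, the equivalent node is Set Actor Tick Enabled); the actor and function names are illustrative:

```cpp
// Disable the tracker-driven update before a Sequencer/movie export and
// re-enable it afterwards, so the recorded keys aren't overwritten each tick.
void ATrackedCameraActor::SetTrackerDriven(bool bEnabled)
{
    SetActorTickEnabled(bEnabled); // while false, Tick() stops writing the camera transform
}
```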
Sorry for my English. With this flowchart you need a complete system for every trackable camera; wouldn't it be possible, for live use, to have a single HTC Vive system with a tracker for each camera and use the chroma key inside Unreal Engine?
Marco De Nuzzo hi!! You could use several trackers with the same system!
Thanks Richard!! One question: can I use the Blackmagic UltraStudio Recorder 3G? I was searching for the Mini Recorder, but the official page only shows the UltraStudio Recorder 3G, not the Mini version... I think it's the newer version, but I need to be sure. Thanks for the knowledge, keep going please!
Hey Richard, thanks for the comprehensive tutorial. I have 2 questions: 1) Do I need the Vive setup if I can track via iPhone X? If yes, are ONE base station and ONE tracker sufficient or not?
Hey Richard, great video but I have a quick question. When you send the video out to be used in OBS, can you explain how you are capturing it in OBS? Are you simply taking one SDI out on the card and connecting it to an SDI in on the card, or is there a way to internally capture the BM output that I don't know about? Thanks.
Christian Morrison hi! In this video I'm using two UltraStudio Mini Recorders on my laptop. Cheers
Thank you so much for this video!!!
Hey Richard! Thank you for this tutorial, it gave a lot of tips that are not obvious and not explained anywhere! I am about to build a similar setup, and this video gave me a few points that I didn't think about before. 1) Why use an additional computer for comping and not Composure? Is that necessary? 2) Why can't you sync using the timecode? I've done the same fast-tilt trick for syncing when I tried the virtual camera in UE with the iPad as a tracking device, but there it was necessary because there were no other sync tools. And by the way, have you considered mixing the Vive tracker with the iPad/iPhone, so it covers you in cases where the Vive loses tracking? I've seen people combining devices for motion capture, so I'm pretty sure you can combine Vive and iPhone. For my tests with the iPhone only, it gave pretty good results (it just doesn't have exact positional tracking, it uses AR for that; only the rotation is precise).
First off, awesome video, thank you! I'm fighting with the axes of the tracker, flipped on X/Z. I've been adjusting the rotation but I can't figure it out. Ideas? The tracker flies upward instead of forward.
Hi mate! I solved that by flipping the cinecamera rotation. And since the last few UE updates you need to have the Vive tracker's green LED indicator at the back, not pointing forwards as I say in the video. I will try to do an updated video.
I spent literally an entire day trying to figure out how to flip the orientation and nothing seemed to work!... and then it dawned on me that the solution is simple... I'll just rotate the puck itself, using a ball mount to turn the puck 90 degrees. Problem solved.
I know right 😂
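For the software-side fix mentioned in the reply above, a sketch of one way to implement the flip in C++: compose a fixed mounting offset onto the tracked rotation before applying it (the 180-degree yaw is illustrative; adjust to your mount):

```cpp
// Compose a fixed mounting offset onto the raw tracker rotation.
// Orientation is the FRotator read from the tracker each tick.
const FRotator MountOffset(0.f, 180.f, 0.f);          // illustrative value
const FQuat Corrected = Orientation.Quaternion() * MountOffset.Quaternion();
SetActorRotation(Corrected.Rotator());
```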
Hello, tell me please, do you need a complete HTC Vive set, or just 2 base stations with a controller (tracker)?
Hello, does anyone know if this would work with the Sony a7 III, and do you need just the Vive tracker or the whole setup with the headset?
Cameron Nichols It may work, but not as well, since you need SDI to pass the timecode and video feed without delay. Had the same question since I have an a7S lying around. Luckily the cheapest option is the Z CAM E2C, which is not that expensive. As for the Vive, you can try the URemote app if you have an ARKit-enabled iOS device (iPhone X and newer), but it'll involve a slightly different setup.
Please let me know how to load live footage from a BMPCC 6K Pro into UE5. I tried using Media Bundle and a DeckLink 8K capture card with the recommended settings. Thanks.
Hi Richard, thank you for the super detailed tutorial. I notice you have iClone installed. I'm new to the whole live streaming CG arena and am at the moment looking at the best and simplest approach to combining mo-cap and facial capture, and iClone seems to be the way to go. Is it not possible to do what you're doing in UE4 all within iClone?
Hi! UE4 has way better graphics rendering than iClone, so no :)
@@richard.frantzen I see, although from the latest release of iClone it looks to me that it's not that far off from UE4, in that it supports HDR and subsurface. From the test renders I've seen they seem pretty close.
Hi Richard Frantzén, I'm following your way of using OBS for the key and comp. I'm using the RED with a Blackmagic card, but it cannot connect with OBS for live action. I wonder, did I miss something? Can you help me? Many thanks!
Good, good, big job. Please continue, it is an interesting topic.
Marco De Nuzzo I will! Thanks mate!
Thanks, this information is a treasure!
It is really a great video. I would request a detailed video of the real-time hardware setup.
Great video! Do you know if the Vive Cosmos would work for this setup?
I don't know! But it could probably be a bit harder.
Hey, great video. I have a question though. I have everything working, but it seems that I only get TX, TY and TZ from the Vive when I hit play. I don't seem to be getting the rotation values. Any advice?
So enlightening, Richard!
Hi! How have you started your game from the VCam instead of the character?
Hi Richard, I have a Blackmagic Pocket Cinema Camera, would that be compatible with this workflow?
Why not use that tracking info in 3D software like Blender for better CG quality instead of Unreal Engine? And also, why not use the tracker for body tracking, like more professional equipment?
Btw, this was a very interesting video.
I can't seem to find the right ID of my sensor. I turned it on first, so I thought it would be 0 or 1, but it's not working. Any tips on how to see which device has which ID?
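One way to list the IDs, sketched in C++ with the stock SteamVR plugin API (the Blueprint equivalent is the Get Valid Tracked Device Ids node):

```cpp
#include "SteamVRFunctionLibrary.h"

// Log every tracked-device ID SteamVR currently reports for generic
// trackers ("Other" covers Vive pucks; controllers/HMDs have their own types).
void LogTrackerIds()
{
    TArray<int32> Ids;
    USteamVRFunctionLibrary::GetValidTrackedDeviceIds(
        ESteamVRTrackedDeviceType::Other, Ids);
    for (const int32 Id : Ids)
    {
        UE_LOG(LogTemp, Log, TEXT("Found tracker with device ID %d"), Id);
    }
}
```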
So far the best overview! You are the best for sharing your knowledge. Do you recommend two base stations or 4?
Hi! Thank you very much! I've had great success using 4, mostly because it makes the coverage better and larger. th-cam.com/video/oxw5THZlijQ/w-d-xo.html here is a 20 min long take with the Vive that we did a week or two back!
This clip saved my day, thank you!!
Did you manage to link the Vive tracker with the MotionController pawn in Unreal? If yes, please tell me how.
Hi, would it be possible to skip the DeckLink and use the BM converters to link to the MacBook? So basically 2 converters: one for the camera feed in and one for the compositing output.
Hi everyone! I created a Discord channel for us to collaborate in. If you want to join, it's over here: discord.gg/AzDc8vf
Hi Richard, great video. Can you share a link to the PDF checklist or the final document that you generate in this video?
Great tutorial. I recently got a Vive tracker (I had this setup working pretty well before with my controller), and I wonder if you have any idea about why the axes are all messed up in the program. For example, I tilt up the VCam tilts down, I pan right, the VCam rolls counter-clockwise. Any idea why this might be happening? It seems like it might just be a calibration thing but surely it should know which way around it is since it's being tracked...
Richard, a question!: there's so little info available about how to make the complete VSP set-up, how on earth did you manage to get all these elements to work together last year?
My axes for the Vive tracker are all upside down for some reason.
Like, when I'm moving the tracker around, keeping it straight, going left to right, the camera rotates upside down.
Any suggestions?
What is the secret for smooth movement of camera in UE4 for virtual production?
Dearest Richard,
if I start to compliment you I won't even finish in a month, so I can only thank you.
Having said that, I have been dealing with this topic for a long time. In the past I have tried tracking with the Nintendo and then with the Kinect and more, but the HTC Vive is certainly really good. My project is aimed at creating virtual sets for small stations that cannot afford expensive systems. At the moment, with my first-version HTC Vive, I have tracked with the controller; I have not yet purchased the Vive tracker. It seems to work, even if I believe there will be a need to delay the 3D background model, because the live cameras will be slower, even more so if they come over HDMI. Another thing that would be convenient, though I still don't know if the quality is good, is the internal chroma key of Unreal. It would also be convenient to use OBS, but I would have to send 3 layers for each single camera, and two chroma keys for two cameras, and I don't know if this is possible with OBS.
Hey Richard, thank you very much for making this tutorial! I'm new to VP and really learned a lot. However, after this tutorial I'm wondering if it is also possible to do all the recording inside Unreal, without exporting anything to OBS. Isn't it possible to composite the video feed going into Unreal with the recorded movement of the virtual camera directly in Unreal, and then just render out a final image?
Yes, absolutely!
@@richard.frantzen Thank you!
Great tutorial!
Is it possible to do this with just the Tracker or do I need the whole kit?
You at least need the HTC Vive Pro headset and two Lighthouse 2.0 base stations.
@@youcantrhymeorange thanks!
There is another question I would like to ask you: the position of the Vive tracker is different from the position of the virtual camera, so there will be a certain gap between the picture taken by the real camera and the picture taken by the virtual camera. Does this need to be corrected?
thank you very much!
Yes! You need to add that in Unreal so it matches the location IRL!
@@richard.frantzen Thanks!
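A hedged sketch of that correction in C++: treat the measured puck-to-lens offset as a fixed local transform applied to each tracked pose (the numbers are placeholders; measure your own rig):

```cpp
// Shift the tracked puck pose by the measured offset to the lens's
// entrance pupil, expressed in the puck's local axes (values illustrative).
const FVector PupilOffsetCm(12.f, 0.f, -8.f);
const FTransform TrackerPose(Orientation, Position);
const FVector CameraLocation = TrackerPose.TransformPosition(PupilOffsetCm);
SetActorLocationAndRotation(CameraLocation, Orientation);
```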
Thank you so much Richard!
Hi Richard, I really like your videos! Keep it up! I have a question: to connect the Vive tracker to a camera with Unreal and only capture the movement, is it necessary to connect it with the HTC base stations, or does the Vive tracker do it alone, without another accessory? I need to capture only camera movement in Unreal. Thanks!!
A3 Vision, my understanding is that you do need a base station for tracking. The question is whether you need the HMD, since you can buy a base station and tracker pretty cheap.
Would you please explain how I can control the "SceneMover" actor with a joystick or the keyboard W, A, S, D keys, please?!
Thank you so much
Mohammed Idrei you can set up a first-person character controller, or just copy it. YouTube has plenty of tutorials for that.
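A minimal sketch of the WASD part on a pawn, assuming "MoveForward"/"MoveRight" axis mappings (W/S, A/D) exist under Project Settings > Input; ASceneMoverPawn and the speed are illustrative:

```cpp
void ASceneMoverPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    PlayerInputComponent->BindAxis("MoveForward", this, &ASceneMoverPawn::MoveForward);
    PlayerInputComponent->BindAxis("MoveRight", this, &ASceneMoverPawn::MoveRight);
}

void ASceneMoverPawn::MoveForward(float Value)
{
    // Slide the whole virtual set along the pawn's local X axis.
    AddActorLocalOffset(FVector(Value * 5.f, 0.f, 0.f)); // 5 cm/tick, illustrative
}

void ASceneMoverPawn::MoveRight(float Value)
{
    AddActorLocalOffset(FVector(0.f, Value * 5.f, 0.f));
}
```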
I am setting up at the moment and got it all to work except one thing I couldn't accomplish: how to control the focal distance. I need to focus live with a second Vive tracker's yaw rotation. I got the tracker in my scene tracked well, but couldn't connect it to the cinecam focal distance... Any information please.
Thank you so much
Mohammed Idrei I think Cinematography Database did something similar, and had a pretty in-depth explanation of his blueprint on his channel; unfortunately I don't remember the name of the video (something like "2 vive setup" or "2 tracker setup"). I'm trying to figure out a MIDI solution with an Arduino at the moment.
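A hedged C++ sketch of wiring a second tracker's yaw to the CineCamera focus, using the stock FocusSettings on UCineCameraComponent; the 50 cm to 10 m mapping range and the names are placeholders:

```cpp
#include "CineCameraComponent.h"

// Map the focus tracker's yaw (-180..180) onto a manual focus distance.
// CineCamera is an assumed UCineCameraComponent* member on this actor.
void AFocusRigActor::UpdateFocus(const FRotator& FocusTrackerRotation)
{
    const float Alpha = (FocusTrackerRotation.Yaw + 180.f) / 360.f;  // 0..1
    const float FocusDistanceCm = FMath::Lerp(50.f, 1000.f, Alpha);  // placeholder range

    CineCamera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
    CineCamera->FocusSettings.ManualFocusDistance = FocusDistanceCm;
}
```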
Wondering if you could do a tutorial on how to do this if you just have a regular capture card. For example, I have an Elgato Cam Link 4K and I have the input going into Unreal Engine.
Also, I don't really understand how to set up the SceneMover.
Love these videos! Just in the future, please say what you are clicking on. It's hard to find the windows or where to click, as the mouse is not visible either.
I just found it hard to follow. Still, thank you so much for this! It helps a lot!
Thanks for the tip! Yeah I think maybe the mouse disappeared in the recording! Sorry about that!
Thanks for your tutorial, it is so interesting. I am trying to do the same thing in Unreal. I followed your steps, but at 39:42 of your video your work is very smooth, while my media capture only produces one frame. When I hit play, my axis guide and camera sync with my tracker and all looks OK, but the capture only gives me the first frame. Any idea what is happening? I send media out via Blackmagic media output through SDI to a monitor. Also, can you explain or show us how to render or output these captured videos? Thanks again, my brother!
Great tutorial, thank you. I would love to see the export from Unreal and using the camera data in After Effects or Cinema 4D.
pls!!
Hi! I'm working on that!
@@richard.frantzen That would be lovely and super helpful! Thank you in advance!
Hi Richard,
Thank you for sharing. How do you calculate the relative transform between the Vive and Unreal Engine? I cannot map my studio camera to the virtual camera.
Hi! That's a cumbersome process, and you actually need to find the entrance pupil. Check out Greg Corson here on YouTube. He has some great videos on that!
Thank you, so amazing!
Is there any possible way to connect the Vive tracker rotation to the cinecam "focal distance", so I can focus manually by hand?
Hi Richard, I'm really curious to know why you use OBS instead of doing the compositing inside Unreal?
BTW, great job creating the Discord channel, making the videos and putting in all the work and time. Thanks man.
Hello friend, is the Blackmagic DeckLink Mini Recorder compatible with UE4 for live tracking? I tried a lot and it doesn't work for me... Help please!!
Thank you Richard, very useful! :)
Please make more tutorials with real actors on a green screen and a digital background in Unreal Engine.
I keep clicking on his videos 'cos his thumbnail had it, lol
Fantastic video! I'm curious why you are using OBS instead of composure in the engine to do the live composite? Cheers! :)
Glass Hand Studios Because I needed to compensate for latency, so the live feed and CG could match up. But the great Greg Corson has solved that, so now I've incorporated his solution :) I've linked his video in our Discord channel that I've pinned here in the comments! Cheers mate!
@@richard.frantzen Oh awesome! Thank you so much for the heads up!
Great, thanks for your sharing.
Hi! Thanks for the tutorial Richard, it's been very helpful. I've actually set up the Vive tracker the same way as in the video, but every time I close the project and SteamVR (like turning the PC off and on), the coordinates of the Vive tracker change, like 5 to 10 cm randomly (sometimes the tracker on the floor is at 12 cm, other times at 3 cm...). So although I place the trackers relative to the position of the scene I want to see, every time I start the project the trackers are not in the exact same position. How can I get this solved? Thanks :)
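One possible workaround to sketch, under the assumption that you can leave one tracker on a fixed physical mark: sample it once at startup and subtract the difference from every pose afterwards (class, member, and function names are placeholders):

```cpp
#include "SteamVRFunctionLibrary.h"

// Re-zero each session against a tracker sitting on a known floor mark.
void ATrackerCalibrator::CalibrateAtStartup(int32 ReferenceTrackerId)
{
    FVector Measured;
    FRotator Unused;
    if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
            ReferenceTrackerId, Measured, Unused))
    {
        // KnownReferenceLocation: where that mark should sit in Unreal units.
        SessionOffset = KnownReferenceLocation - Measured;
    }
}

// Apply to every raw tracker location read during the session.
FVector ATrackerCalibrator::Correct(const FVector& RawLocation) const
{
    return RawLocation + SessionOffset;
}
```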
Hi~ My camera is not a professional cinema camera. How am I supposed to correct the camera's field of view? Thank you.
Hey Richard, you are so good! But my tracker orientation is in the wrong direction. How do I fix it?
Turn the tracker 180 degrees.
@@richard.frantzen Oh my god, it's fixed. Thank you so much, my savior!
Thank you, Richard Frantzén, for such great tutorials.
I am new to UE. Is it possible to record the media plate?
Hi!! Thanks! Not something I have figured out yet :(
Thank you so much for this!
I have a question for you. I want to buy the "Blackmagic Design DeckLink Duo 2", but I have a camera that doesn't have an SDI connector. How could I connect the camera to the card? Could I buy an HDMI-to-SDI converter? Any help is appreciated, thanks.
I'm running my T2i through an HDMI-to-SDI adapter into a DeckLink 8K Pro. Works great. I just had to remember that the T2i's video output is 59.94i!! Old Canons! lol. Once I made sure to set UE4 to 59.94i as well, I was good to go.
Thank you for the tutorial. I appreciate it. Unfortunately though as a viewer I got lost pretty quickly. I'd really recommend you to work on your structure. Like for example divide 1. Introduction 2. Use Cases 3. Setup 4. Blueprints and so on. This is sooo important when making such a long video. I hope you do take this comment as constructive criticism.
Hi Sir, I am a studio owner in South Africa and we really want to build our studio for virtual production. Please can you tell me what tracking system you are using, and are you aware of the Stype tracking system?
Hello Richard. How many DeckLinks do I need for outputs?
Can you tell me how to synchronize the focus? I think it's very cool!
Hi, my timecode isn't syncing with the camera, it's red. Any tutorial for that?
Hi Richard, can a Nikon D750 be connected? It has no SDI interface.
Looking forward to your next video.
Hey Richard, do I just need the Vive Tracker and 2 base stations to make it working? No Vive HMD or linkbox is required? Thanks in advance! Best Pascal
I don't know that, sorry!
Hi. Can I send the signal from 5 cameras to do a live stream with Unreal? How can I do it? Thanks!!!!
It's possible, but it would probably need more computers.
10:08 Can someone help me with where I can buy those lights?
would love to see the finished product
th-cam.com/video/KluwXzYykZE/w-d-xo.html It isn't my cup of tea, but their name was on the scene he showed. I was curious about what it ended up like too.
Just curious (as a beginner), what is crappy about the green screen?
Wrinkles all over the place and uneven lighting, which results in several different shades of green, which makes it hard to pull a clean key.
@@richard.frantzen OK, so I guess you want your green screen to be as matte as possible!?
Thanks Man!
The camera signal is not synchronized with the video card signal and there is a delay. I solved it with a time delay in OBS, but otherwise there is a problem where the camera cannot be matched when it moves sideways, because the signals are not synchronized. How do you solve it?
roland tung yeah I know! This tutorial is a little old. Check my latest blueprint here with added camera tracking delay, and check my other videos and live streams :) www.vphub.net/forums/unreal-engine/latest-blueprint-for-camera-tracking-with-htc-vive-including-delay/
Thank you very much!
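For reference, a minimal C++ sketch of the tracking-delay idea from the linked blueprint: buffer recent tracker transforms and apply the one from N frames ago, so the CG matches the delayed camera feed. The class name and frame count are placeholders you tune to your pipeline:

```cpp
// PoseBuffer is a TArray<FTransform> member; DelayFrames is tuned by eye
// until the CG and the delayed video line up during a fast pan.
void ADelayedTrackedCamera::PushPose(const FTransform& Pose)
{
    PoseBuffer.Add(Pose);
    const int32 DelayFrames = 4; // placeholder; depends on capture latency
    if (PoseBuffer.Num() > DelayFrames)
    {
        SetActorTransform(PoseBuffer[0]); // apply the oldest buffered pose
        PoseBuffer.RemoveAt(0);
    }
}
```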