Perfect Nodal Offset for Virtual Production using RETracker Bliss. No More Sliding Feet!

  • Published Sep 11, 2024

Comments • 49

  • @adorablemarwan
    @adorablemarwan 1 year ago +1

    Masterpiece!!! The final piece of the puzzle for perfect tracking. Many thanks.

  • @jimvfx1000
    @jimvfx1000 10 hours ago

    Hi Greg,

    I am having an issue with this process. It’s been a few years since I’ve needed to do the nodal offset procedure in UE, but I recently bought a RETracker Bliss (I love it!) so I thought I’d try the procedure again. (I do recall the procedure being a bit finicky, but I was using Vives and OptiTrack at the time.)

    Very briefly, I did these steps:
    - performed lens distortion calc
    - imported april tag asset
    - added Bliss as LL source (april tag tracks)
    - performed observations in lens file for nodal offset

    Upon clicking “Add to Nodal Offset Calibration”, I get this error:
    “Failed to resolve a camera position given the data in the calibration rows. Please retry the calibration.”

    Any thoughts on this?

    Thanks in advance.
    PS - I should mention I’m on UE 5.4.4

  • @Ryoma0z
    @Ryoma0z 1 year ago

    Holy stromboli Greg perfect solution for Bliss!! Can't wait to get on this.

    • @GregCorson
      @GregCorson 1 year ago +1

      Thanks! Oh great, now you've got me wanting a Stromboli sandwich, haven't had one in ages.

  • @jimbawb
    @jimbawb 11 months ago +1

    Hi Greg, very informative video! Could I use this workflow with the Xvisio DS80 tracker? I've heard people say it's the same as the Bliss? Thanks!

    • @GregCorson
      @GregCorson 11 months ago

      To use this nodal offset method, you have to have something in the scene that can be tracked. The Bliss software adds the ability to track an apriltag, which is what we use here. I don't think the XVISIO camera includes software for this. The Bliss software also does other things to clean up the tracking and send it to my LiveLink plugin. If you use some other camera you would have to implement this yourself. The source code for my LiveLink plugin is on GitHub if you want to use it.

  • @otegadamagic
    @otegadamagic 1 year ago

    Wow this is amazing. Thanks for sharing

  • @zippyholland3001
    @zippyholland3001 1 year ago

    Thanks 🙏 very informative

  • @jiaxinli8811
    @jiaxinli8811 5 months ago

    Great video! Is the nodal offset meant to fix the problem that most camera rigs' pan and tilt axes are not aligned with the lens's nodal point? There must already be a solution for this, because although we can rig the camera so the nodal point is aligned while it is on a tripod or jib, the nodal point will never stay aligned when the camera is handheld.

    • @GregCorson
      @GregCorson 5 months ago

      For virtual production it isn't important to know the offset from the pan/tilt axis of the tripod. What you need is the offset from the lens nodal point to the center of tracking on your tracker device, for example the tripod screw on a VIVE. In a VP project, Unreal gets the position of your tracker; the offset to the lens nodal point allows UE to place the nodal point of its camera in the same place as your real-world camera's. Without the offset, the UE camera would be where the tracker is, which would be wrong and cause slipping. The correct lens-nodal-to-tracker offset works for any kind of camera mount, including handheld.
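
The chain described here (tracker pose from Live Link, then a fixed tracker-to-nodal offset) can be sketched as a transform composition. All the numbers below are invented for illustration, and the yaw-only rotation is a simplification; this is not code from Unreal or the Bliss plugin:

```python
import math

def compose(a, b):
    """Multiply two 4x4 transforms given as nested lists (row-major)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def pose(x, y, z, yaw_deg=0.0):
    """A 4x4 world transform from a position and a yaw rotation.
    (Real trackers report full 6-DoF poses; yaw-only keeps the sketch short.)"""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c,  -s,  0.0, x],
            [s,   c,  0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

# Pose of the tracker in world space, as it might arrive over Live Link.
tracker_world = pose(100.0, 50.0, 170.0, yaw_deg=90.0)

# Fixed offset from the tracker origin to the lens nodal point,
# expressed in the tracker's local frame (made-up values).
nodal_offset_local = pose(0.0, -4.0, -12.5)

# The virtual camera is placed at the tracker pose composed with the
# offset, so it sits at the lens nodal point, not at the tracker.
camera_world = compose(tracker_world, nodal_offset_local)

x, y, z = camera_world[0][3], camera_world[1][3], camera_world[2][3]
print(round(x, 3), round(y, 3), round(z, 3))  # → 104.0 50.0 157.5
```

The point of the composition order is that the offset is rotated along with the tracker, which is why the same calibrated offset keeps working handheld.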

  • @user-vg3tb9ut4m
    @user-vg3tb9ut4m 1 year ago

    You're really amazing!

  • @AdamSmith-pn5hk
    @AdamSmith-pn5hk 1 year ago +1

    Would this work with the RealSense T265? Scratch that, just ordered my RETracker Bliss!

    • @GregCorson
      @GregCorson 1 year ago +1

      To use this method, the camera tracker also needs to be able to track something else in the scene, in this case it's tracking the apriltag. I don't think the T265 has software to do that right now.

    • @AdamSmith-pn5hk
      @AdamSmith-pn5hk 1 year ago

      @@GregCorson Any possibility of a tutorial in regards to using the bliss and being able to adjust or color correct the actual camera footage in post? Being able to render the ue footage and then applying it with the camera footage in After Effects is what I’m planning on. Cheers and thanks again for all of the great videos and information.

    • @GregCorson
      @GregCorson 1 year ago

      I haven't done a lot of work with color correction and off-line compositing (yet), but others on YouTube have; check around and you might find some solutions you can use. People frequently use the output from Unreal as a preview, record the tracking in Unreal and then non-realtime render it out in higher quality to be composited in some external software. It's also quite common to use an external keyer like an Ultimatte when you need real-time output with higher keying quality than Unreal supplies. There are also some products like Aximmetry (based on Unreal) that have higher quality built-in keyers.

  • @garthmccarthy4911
    @garthmccarthy4911 1 year ago

    Hi Greg, can this be applied to the Vive Pro tracking system? We are fine doing pan and dolly but we can't seem to lock down the tilt.

    • @GregCorson
      @GregCorson 1 year ago

      Yes, it does work with the Vive, see my earlier tutorial for how: th-cam.com/video/H5t2BwUGqkI/w-d-xo.html It doesn't solve problems like the Vive's jitter, but it will get you a good nodal offset.

  • @AdamSmith-pn5hk
    @AdamSmith-pn5hk 1 year ago

    Hey Greg, so is this a “set it and forget it” process? I’m assuming this would have to be done every time if you move the camera from the initial setup? I’ve run through your lens calibration video several times for my Sigma 18-35 lens, once each for 18mm, 24mm, and 35mm, and I figured once I go through that process and I’m happy with it, I could apply those lens calibration files to any project and then set the ArUco tag at the beginning of each setup. I feel like using this apriltag method I would have to do this every time I start a new shoot? Cheers and thanks!

    • @GregCorson
      @GregCorson 1 year ago

      Once you have done the lens distortion for a lens you can just keep re-using it. The lens distortion normally will not change unless you attach the lens to a different camera. For some lenses, changing focus or f-stop can change the distortion a bit; you can check that by doing a lens distortion calibration at several focus settings. If you have lens encoders and zoom lenses, you want to calibrate the zoom at at least three points (both ends of the range and the middle), as the nodal point and distortion frequently change when zooming. How much they change depends on the lens; some don't change, others change a lot.
      The nodal offset information will be good as long as you don't change the way the camera and tracker are attached to each other. If you move either one, you need to re-do the nodal offset calibration. I redo this occasionally because the camera or tracker can get bumped, changing the offset between them. Usually I just check to make sure real/virtual objects are staying aligned and, if they are not, re-do the nodal calibration.
      Once you have a good calibration, it's a good idea to save the values. If you ever have to re-do that calibration you can compare the new to the old to see if they are in the same range or if something is badly off.

  • @xujerry7791
    @xujerry7791 1 year ago

    Hi Greg. Thanks for your great tutorial! Could you please help me find any possible cause of my errors? 1. I find that when my camera applies the lens file, the Focus value is not the lens calibration result and can't be modified manually. I tried several times and the results are mostly wrong; it seems to be a bug that my colleague has also found. 2. I tried your nodal offset method but the offset result is off by 60-100 centimeters (RMS 0.01). I found that when I move the apriltag in the real world, the apriltag in UE does not move and the camera moves instead. My understanding is the apriltag in UE should follow the real-world apriltag's movement, so I added a LiveLinkController to the apriltag in UE with camera 101; then in UE the apriltag and camera both move. So I tried not moving the apriltag and only rotating my camera in the real world, but the offset result still deviates by about 20 centimeters. 3. With no nodal offset result, using the check-points method with the apriltag applied as a parent, the two apriltags do not overlap.

    • @GregCorson
      @GregCorson 1 year ago

      First off, before you try to do nodal offset you MUST have a good lens distortion calibration. If this is not high quality, nothing you do will get a good nodal offset measurement. I'm not entirely sure what your issue is, but when you set things up as in this tutorial, you should be able to see BOTH the camera and apriltag moving in the editor. That is, if you move the camera you will see the camera move and the apriltag stay still. Move the apriltag and it should move and the camera stays still. If your setup does not work this way, then something is wrong. The most common problem related to Bliss is that it might lose track of the apriltag due to glare or poor lighting. Be sure to check the RETracker app to see if you have a good track of the apriltag.
      When doing the nodal calibration you need to shoot apriltag images at a variety of distances and orientations. The algorithm Unreal uses has problems if all the images are at the same depth or orientation. I always shoot at least 2 with the tag nearly full frame, 2 more about half frame and 2 more at about 1/3 frame. Do some images facing the camera and some with the tag at about 45 degrees. Some with the tag in the center of the frame and some with it at the edges. This should give you a good result.
      Please feel free to ask more questions, I may have misunderstood your issue.
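
      The rule of thumb above about varying distance and orientation can be written down as a quick self-check. This is my own helper with guessed thresholds, not part of Unreal's calibrator; each observation records the tag's apparent size as a fraction of the frame and its rotation relative to the camera:

```python
def diverse_enough(observations, min_scale_spread=0.3, min_angle_spread=30.0):
    """observations: list of (frame_fraction, angle_deg) tuples."""
    scales = [obs[0] for obs in observations]
    angles = [obs[1] for obs in observations]
    scale_spread = max(scales) - min(scales)   # near vs. far coverage
    angle_spread = max(angles) - min(angles)   # facing vs. tilted coverage
    return scale_spread >= min_scale_spread and angle_spread >= min_angle_spread

# Six captures as suggested: two near full frame, two about half frame,
# two about a third of the frame, with a mix of facing and ~45-degree tilts.
captures = [(0.9, 0.0), (0.85, 45.0), (0.5, 0.0), (0.5, 45.0),
            (0.33, 0.0), (0.35, 45.0)]
print(diverse_enough(captures))                    # True
print(diverse_enough([(0.5, 0.0), (0.52, 5.0)]))   # False: one depth, one angle
```

      The second call mimics the failure mode described above, where all images sit at roughly the same depth and orientation and the solver has too little variation to work with.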

    • @xujerry7791
      @xujerry7791 1 year ago

      @@GregCorson Thanks a lot, Greg, for your great help and tutorial!

  • @MattHandy-zm2uj
    @MattHandy-zm2uj 1 year ago

    Hi Greg. Thank you for all the amazing videos and insight you've shared. Can you please help me understand what image dimensions I should enter in the lens file? I have a RED Dragon that shoots in 6K (@16:9) but the SDI output is 1080p. I expect that the sensor size should be the sensor size of the camera (at that resolution - 27.84x15.66 mm) but what would the image dimensions be? Is it the 5568x3132 being recorded on the camera or the 1920x1080 that's being transmitted from the camera to the Blackmagic card that's feeding Unreal Engine? How might this impact calibration and tracking?

    • @GregCorson
      @GregCorson 1 year ago

      As long as your sensor is 16:9 native, then the sensor size should be what the camera specs say. However, some cameras shoot with a cropped sensor at some resolutions and don't use the whole thing; I'm not sure if the RED does this or not. As far as the image dimensions go, I usually give it whatever resolution is going into Unreal by way of your video input card. Again, I don't think it will matter as long as the aspect ratio is the same.
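
      A quick way to confirm the aspect ratios line up, using the numbers from this thread (a small helper of my own, not an Unreal API):

```python
from math import isclose

def same_aspect(w1, h1, w2, h2, tol=0.01):
    """True when two width/height pairs share the same aspect ratio."""
    return isclose(w1 / h1, w2 / h2, rel_tol=tol)

sensor_mm = (27.84, 15.66)   # RED Dragon 6K 16:9 crop, from the thread
recorded_px = (5568, 3132)   # in-camera recording resolution
sdi_px = (1920, 1080)        # SDI feed into the Blackmagic card / Unreal

print(same_aspect(*sensor_mm, *recorded_px))  # True
print(same_aspect(*sensor_mm, *sdi_px))       # True
```

      Since all three are 16:9, entering the 1080p dimensions with the 6K-crop sensor size stays consistent.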

    • @MattHandy-zm2uj
      @MattHandy-zm2uj 1 year ago

      Thanks Greg. RED does crop, but fortunately they provide a very comprehensive cropping data calculator. I’ve been struggling with getting good nodal point numbers and you’ve eliminated one of the variables. I really appreciate it.

  • @GregJones-xw9sg
    @GregJones-xw9sg 1 year ago

    Thanks for the help. I just purchased the Bliss and it's much smoother than the Vive Tracker. I'm having an issue, like others, where I get a 0.3 (sometimes higher) error and my nodal offset sometimes comes up 60 cm off. It seems like maybe I'm missing a step or something, but I've tried 4 or 5 times and get widely varying results. Do you have a checklist of things that must be done to get an accurate result? Some people have said an incorrect sensor size could be to blame, but I got an accurate nodal point when I was using a Vive Tracker. What are all the places where I need to enter the sensor size? Maybe I've missed one.

    • @GregCorson
      @GregCorson 1 year ago

      Unfortunately, Unreal's nodal offset calibration doesn't always give accurate results. A 0.3 result is not necessarily bad for distortion calibration; that means accurate to 0.3 of a pixel. As for the nodal point with the apriltag, this is dependent on your lens distortion being right. Also, you must measure the size of your apriltag as accurately as possible, from the edge of the black on one side to the other; it is important to get this right and to set the tag size in both Bliss and inside Unreal. When it is all set up and running, look at the position of the tag relative to the camera in your Unreal scene and make sure it looks similar to what's in your studio. When clicking on the points on the tag, be sure you are clicking on the right spots. In the tutorial I just captured a few images for the nodal offset; I think I mentioned that it is usually good to have at least 6 with different positions. I usually do 1 or 2 that nearly fill the frame, a couple more at about half-frame size and a few at quarter frame. It helps if some of the images have the tag rotated left or right by about 45 degrees.
      To get accurate nodal numbers, having the tag at different distances and angles is important. Also be sure after moving the tag/camera that bliss says the tag is still detected as it can lose tracking. Make sure the tag is well lit without glare. Also avoid adjusting the lens during the nodal calibration. Changing the focus can throw things off. I usually turn off auto focus and set the focus and F-stop so the tags will be in good focus at all their positions. Some lenses "breath" or change FOV a bit when you refocus which can really mess things up.
      Also make sure that if your camera/lens has any "steadyshot" or stabilization features they should be turned off, they will prevent the tracking from staying in sync with the image.

    • @GregJones-xw9sg
      @GregJones-xw9sg 1 year ago

      @@GregCorson Thanks for the tips. I've had the Bliss now for 3 weeks. I love how smoothly it moves. Unfortunately I still haven't been able to get an accurate nodal offset using your method above. I even used your project template that came with the Bliss plugin. I have a Blackmagic URSA Mini Pro G2 and a Blackmagic input card. My nodal offset readings vary wildly, from 15-20cm to 300-400cm. I've gotten close by measuring and adding my own nodal offset, but it still isn't 100% perfect. I'd love to be able to get the nodal offset using your method. Any gotchas that I might be missing?

    • @GregCorson
      @GregCorson 1 year ago

      Not really sure, when I get wrong results it is usually because of a bad lens calibration or because the nodal offset images weren't taken at enough different distances and rotations. Also as I mentioned, steadyshot and autofocus should be disabled.
      Sometimes I have found unwanted offsets have snuck into some part of the calibration actors. Before you start calibration, make sure the only things in the apriltag and camera actors that aren't set to 0,0,0 are the ones actually receiving live link tracking.

  • @kennytan5570
    @kennytan5570 1 year ago

    Hey Greg. Can we make this work with an antilatency tracker?

    • @GregCorson
      @GregCorson 1 year ago

      I don't have an antilatency system, but this approach will work with any object you can track at the same time as the camera. If you have a second antilatency tracker you could do it. Do you have one?

  • @Hub-Images
    @Hub-Images 1 year ago

    Hi Greg, I followed the procedure for the nodal point offset. It does not work for me: the RMS reprojection error is around 0.3, and the offset result is absurd, on the order of several meters. Have you got a solution for that? Thanks

    • @GregCorson
      @GregCorson 1 year ago

      Normally it comes up zero, 0.3 probably means you are not clicking on the right spots on the tag, or that the camera might be moving while clicking on the 5 points. Important things to check. Make sure the tag is mounted to something that will keep it flat, no folds or wrinkles. Measure the size of the tag as accurately as you can, just the black part (not the whole paper) from edge-to-edge in centimeters to the nearest tenth. Be sure you have entered the size of the tag in BOTH the bliss app and in the apriltag asset in the sample project. Make sure the bliss is actively tracking the tag through the whole process (the livelink panel will show two green lights). Be as accurate as you can when clicking on the corners and center of the tag. To get more accurate you can make the window bigger or use the mouse wheel to zoom in on the tag. Don't put the tag too far away, you need samples from near, middle and farther ranges. When it's near it can almost fill the frame, at farther distances you want it still to be about a quarter frame high. Larger Apriltags will be more accurate at a distance. I use a 21 cm wide one, some people print them bigger. If none of these things seem to help, please write again.

    • @Hub-Images
      @Hub-Images 1 year ago

      Hi Greg, it still doesn't work. I made a short video (3 min) to show you my work and config. I sent you a private message via Facebook.

    • @adorablemarwan
      @adorablemarwan 1 year ago +1

      One good method is to move only the camera while the tag is kept static the whole time, especially in a marker-less environment. Do not move the tag in any way during the process.

    • @Hub-Images
      @Hub-Images 1 year ago

      @@Virtuals.tv-Productions The problem is not solved, and I don't know what to do

    • @Hub-Images
      @Hub-Images 1 year ago

      My setup is a Blackmagic 4K with a DeckLink Mini Recorder 4K

  • @original9vp
    @original9vp 1 year ago

    Realsense T265 please @greg

    • @GregCorson
      @GregCorson 1 year ago

      The realsense T265 can't use this method without additional software development. It needs to be able to track a tag or something similar for this method to work.

    • @original9vp
      @original9vp 1 year ago

      @@GregCorson so Aximmetry would work with this process with the T265?

    • @GregCorson
      @GregCorson 1 year ago +1

      Aximmetry has its own calibrator, which is different from Unreal's; it should be possible to use that calibrator with the T265. I haven't used it so I can't advise, but it works with nothing but a large checkerboard for calibration.