Windows does not support MV-HEVC - it is not a version problem, it is a technology problem. Apple owns the Spatial Video format license, and Windows does not.
Where will we see the final output, and how long do we have to wait - or is it immediate? I haven't tried with 200 bitrate, but unfortunately I can't see the output on my desktop.
Hi. I'm trying to convert footage from a 2-GoPro VR setup to be injected with the spatial metadata so it can be previewed on the Apple Vision Pro. However, I encountered some errors. By any chance, would you know how to fix this error message when injecting the metadata? Error: "Input video has an unspecified color transfer function." My current workflow is: align the timecode of both GoPro clips in Premiere - bring it into Mistika Boutique to VR stitch - export as ProRes LT - input to Mac for MV-HEVC spatial metadata conversion.
So it can be missing color space tags. Try bringing it into Resolve and rendering with a different color space tag (on the Deliver page, uncheck "Same as Project") - it's a known problem in the spatial code, per Mike. I don't know which color space to tag exactly, since you run through so many software packages and it's GoPro footage. But try force-tagging it to test?
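If you want to confirm what the file is actually missing before re-rendering, FFmpeg's ffprobe can print the color tags on the video stream. A diagnostic sketch, assuming FFmpeg is installed and `input.mov` stands in for your Mistika export; an empty or `unknown` value matches the "unspecified color transfer function" error:

```shell
# Print the color tags of the first video stream; blank/unknown = untagged
ffprobe -v error -select_streams v:0 \
  -show_entries stream=color_space,color_transfer,color_primaries \
  -of default=noprint_wrappers=1 input.mov
```

Once you see which field is missing, re-rendering in Resolve with an explicit color space tag (as suggested in the reply) is the cleanest fix.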
The program didn't work for me. I'm using the Insta360 1-inch edition. I've followed the steps over and over again but still no success. Are there any other recommendations? Thanks, Matt
Hey man, greetings from Turkey. I follow you on every platform. I'm curious about this: we render CRM files in Adobe, starting from EOS VR Utility, and upscale high-bitrate (200-250 Mbps) footage in Topaz AI. We take 16K scenes and encode them from Topaz AI at the end. Do you think this system will help us, and if so, at which stage should we use it? Also, are you throwing in the high-GB file you get from CRM here, or the render you get from Adobe and Resolve?
This workflow does not depend on your NLE or post workflow. In general, you convert CRM to ProRes and edit in your NLE, or use AI with Topaz; at the end, you need to make your VR180 video into Spatial Immersive 180 - the format Apple supports. Then you run it through this free app to get the correct MV-HEVC metadata. Then you play it back on Apple Vision Pro.
Yes. Apple MV-HEVC (spatial video) is a Mac-only thing. Even on Meta Quest 3, you will require a Mac to do spatial video (assuming you have an iPhone). We hope Apple opens it up soon, but there is a hack to do it on Windows as well. I will leave that for my next tutorial, as it is a little bit complicated - trying to keep this one short and sweet.
@@hughhou Thanks Hugh, from a long-time viewer who still uses the Evo360 VR180 and Qoocam... even though the OG Qoocam no longer works with VR180 on TH-cam :(
Just installed it and got it running. There is a bug with number format: you have to switch to 123,456.00 instead of the German format 123.456,00 to prevent a syntax error when the arguments are parsed for spatial. The result of 8192x4096 through spatial is strange - it looks in the Vision Pro like a small window instead of immersive, similar to the AVP recordings. PS: This was an error from the photo library of visionOS 2 beta 3. In Kandao Player it looks fine.
Thank you for pointing this out. For VR180, you need the SpatialGen or Kandao XR app to see fully immersive 180. The AVP native player does not understand immersive video yet. Both SpatialGen and Kandao XR are free apps on visionOS. Native support should come soon from Apple.
You actually cannot do it on Windows right now, as Spatial Video is really an Apple thing and Apple MV-HEVC needs the Apple encoder. I am looking into a workaround right now.
Oh okay @hughhou, it would be nice if you could test this in your review of the Qoocam 3 Ultra. :) Also, some tests with difficult lighting like trees and overexposed skies, which seem to be a problem with the Qoocam 3, would be really important for the Qoocam 3 Ultra review. :)
After some tests... sadly it's not working for me. I have an 8K VR360 3D over/under video. I had already uploaded it to TH-cam and wanted to use the injector on my M3 Max MacBook Pro to set the metadata for TH-cam. But no matter what I try, after encoding I only get a 2D 360 video out - no 3D. After encoding you can already see that the file no longer contains the second eye. Please fix it - you have the only Apple Silicon friendly solution with a UI.
Unfortunately, the second line "brew install spatial" is not accepted by my Terminal app. I get: "% brew install spatial - zsh: command not found: brew". It seems the first word "brew" is not recognized as a terminal command... do you have a suggestion on how to proceed?
Oh, did you run the first command? You need to install Brew first. Brew is a basic package manager for everyone on Mac - it is very common in the developer community. Try searching "how to install brew on Mac" - one extra line installs Brew and you are good to go.
@@hughhou Of course I did: now I can see the Spatial Media Metadata Injector app installed in my Applications folder, but nothing else... could my macOS version (Monterey 12.6.2) be the problem, in your opinion? And as always, thanks for your efforts...
@@GoldWind420 I updated macOS to Ventura, though I'm not sure it depended on the system... The answer I got from Terminal seemed to mean "it's already installed on your computer." If you see the same, I can only suggest verifying that... sorry I can't help more.
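For anyone hitting the same "command not found: brew" wall: the usual causes are that Homebrew was never installed, or that it is installed but not on the shell's PATH. A sketch of the full sequence on an Apple Silicon Mac - the install URL is Homebrew's official one, and the `/opt/homebrew` path applies to M-series Macs only:

```shell
# 1. Install Homebrew (official installer from brew.sh)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# 2. Put brew on the PATH for this shell and future ones (Apple Silicon location)
echo 'eval "$(/opt/homebrew/bin/brew shellenv)"' >> ~/.zprofile
eval "$(/opt/homebrew/bin/brew shellenv)"

# 3. Now the cask from the video should resolve
brew install spatial
```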
An example Spatial Video MV-HEVC created with this method can now be downloaded for free here: bit.ly/46rSmWd. If you have an Apple Vision Pro, go check it out. We teach by example - showing you what is possible to inspire you to push boundaries! Support us by joining our Patreon and inner circle!
Hi Hugh, I would like to thank you for all your content. It's been really helpful. I have recently experienced immersive video - 180 and 360 view videos - and I am just so excited that I now want to change my career to making this kind of video. If you could guide me, it would be really helpful, as I want to learn from scratch.
Thank you so much! Yes, this channel is all about helping filmmakers and creators like you in VR180 or 360 immersive production. We will even go into the business side of things very soon.
First thank you for all the amazing tutorials.
Do you have a tutorial for the Canon R7 with the RF-S 3.5mm edit workflow in Premiere Pro? That would be a lifesaver, thanks!
Canon R7 with the new dual fisheye editing is exactly the same as all other VR180 editing - just render as VR180 in EOS VR Utility and follow this tutorial - th-cam.com/video/iBa6JNwcOyE/w-d-xo.htmlsi=ZTcqHcNzoQorQbzf - I know it is 6 years old, but it is still the same workflow.
Such a good and easy explanation. It's always so good!
Glad you think so!
Hi! Quick question: is it possible to render out a VR360 mono 12K file in the Spatial Video Metadata Injector as well? We have the META Three camera for reference. I'm only able to render out an 8K file, and I have to render it as top/bottom with the same video over and under - as if it were a stereo video - for it to work. SBS VR180 stereo works fine. Happy to hear your experiences with it :)
Hi Hugh, please advise - whichever .mp4 file I put into Spatial M GUI, I get double images at the end. As below, I used the settings for the Canon R5 C with the dual fisheye lens. The source is also VR180, exported from DaVinci (using your video manual). Thank you in advance.
What do you mean double image? Did you choose VR180 and Left and right?
Yes, as below, someone wrote about doubled images for everything farther than 2m. Each eye separately looks fine. The source file also works fine if I watch it via Skybox. But the main reason for converting is that when you watch through the Vision Pro gallery, the image is not so stretched, and because of that the end quality looks better. So maybe there are some special settings for video from the Canon R5 C & fisheye lens.
I see a Hugh Hou video, I press like. Never been disappointed :)
Btw: Has Kandao fixed their problems (e.g. stitching) you found out in your Qoocam 3 Review?
Oh yes! The stitching is so good now
@@hughhou Uhh great! :) thanks!
Great video! So, are the VR180 videos that can be uploaded to TH-cam now all edited using DaVinci Resolve? I've been using Canon VR Utility to convert and then editing the videos with PR, but they haven't been uploading to TH-cam.
I still use VR Utility lol, and PR as well. This workflow is for Spatial Immersive 180 film on Apple Vision Pro - not for TH-cam VR. But I know why TH-cam VR is having massive problems with Premiere renders. Yes, one way to bypass it is to use DaVinci Resolve - I color grade there and do the final render there as well, so I can bypass it. But I still have not finished that tutorial - and really, TH-cam should fix that problem on their end, as it is not the end user's fault. So I am still hopeful. We support workflows in both Adobe Premiere and Resolve here.
@@hughhou I want to ask if it's possible to use DaVinci Resolve to edit and export videos that were converted using Canon VR Utility, and then upload them to TH-cam?
@@BroBro-ww6ro I'm trying this too, but Resolve doesn't seem to inject the necessary VR metadata like Premiere does... looking for a solution.
Thank you for an excellent GUI! Question: how can you play 3D 180 VR full screen once the clip is finished? The clip opens in the Finder app in the small spatial window, but I want to view it full screen. When playing the file in Moon Player and Reality Player, the video playback skips around. I would like native playback.
Yes, it is designed for players that use the native Apple Vision Pro codec - which are currently SpatialGen or Kandao XR (both are free, btw). The Apple native player should support immersive 180 and 360 soon, but it is still not out yet. The format is correct; you just need to wait for Apple to release the update. But try it in SpatialGen or Kandao XR - they both have hardware-accelerated playback, so there should not be playback issues.
@@hughhou Thank you! Will try :) PS: your videos are fire! Keep up the good work! You always inspire me to keep up the VR fight!
The genius does it again!!!
Awwww thank you!!
Big thanks to Andrew and Mike! And Hugh! Thank you!
Our pleasure!
Hello Hugh! You are filled with energy and passion, it must be contagious. :-) Bravo!
Awww thank you thank you!!
Thanks for the tutorial. Would this work on Quest 3 too? I can see the demo spatial video on Q3 but cannot view a custom/downloaded one on Q3... Could you do a tutorial? ❤
Thank you for the great video! It's always really helpful. I couldn't solve the problem myself, so I'd like to ask a question. I was able to install Homebrew, but when I run brew install spatial, the following error appears.
Error: Cask spatial depends on hardware architecture being one of [{:type=>:arm, :bits=>64}], but you are running {:type=>:intel, :bits=>64}.
- - -
I'm using a Mac Studio 2023 (macOS Sonoma 14.5), so I don't think the architecture is Intel.
Yes, Spatial Video is only supported on Apple Silicon, not Intel-based Macs. Check your Mac's About page; if it says Intel, it won't work. This is just how Apple works for Vision Pro - it requires at least an M1 to process and encode.
@@hughhou Thank you for your kind reply! 🤩 I'm using an M2 Ultra processor. I don't understand why it is considered an Intel Mac. 🤔 If you have any tips, please let me know. Thank you as always! 🙇♂
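A likely explanation for an M2 Ultra being reported as Intel: Homebrew itself was installed under Rosetta (the x86_64 build in /usr/local), so it downloads Intel casks. A quick diagnostic sketch you can paste into Terminal - note that the `sysctl.proc_translated` key only exists on Apple Silicon Macs:

```shell
# arm64 = native Apple Silicon shell; x86_64 = Intel, or a shell under Rosetta
uname -m
# Prints 1 if this shell is being translated by Rosetta (key absent elsewhere)
sysctl -n sysctl.proc_translated 2>/dev/null || true
# /opt/homebrew/bin/brew = arm64 Homebrew; /usr/local/bin/brew = Intel build
command -v brew || echo "brew not on PATH"
```

If brew resolves to /usr/local, reinstalling Homebrew natively into /opt/homebrew should let the arm64-only cask install.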
Thank you Hugh: I got the Spatial M GUI working. I am using the Canon R5 with the 5.2mm Dual Lens. I had the settings FishEye, SidebySide, Right, 144 and 60. But anything less than 2m from camera appears as a double image. And the Fisheye effect also spoils the vision.
1: Is there a way of flattening the image and
2: any suggested settings to eliminate the double vision.
I also tried changing the Horiz Disparity to 0.3 but maybe the answer lies in some magic number for this?
I have DaVinci - will that help in any way?
Thanks In Advance.
OK - I then figured I would use Canon's VR Utility to flatten the image to a VR180 file, then brought it into SMG. But SMG converted it to 4K - are there plans to encode in 8K if the original was shot in 8K? Then back into DVR for stretching the file vertically and applying IS.
@@JM-xl3ij Hi, please advise, as I have the same issue with double images. How did you figure it out? I use an .mp4 file from DaVinci, but the images are still doubled at the end.
@@Vik-b3f The problem is, you need to use the Canon VR Utility, which is still very poorly developed - I cancelled my subscription. Once you process in VRU, you can just view the file on the AVP, but there is considerable distortion. The whole thing is badly implemented. But it was interesting to see the 3D effect. Also, although you shoot at 4K, the end result on the AVP is at best only 2K.
@@JM-xl3ij Thanks for the reply. I figured out how to process the file in DaVinci first (somehow my Canon VR Utility doesn't see any files from either the R5 C or the card reader), then in the Spatial Metadata GUI. But the thing is, only 8K 60fps gives more or less good quality. When I watch the end result in Skybox, it's kind of stretched. When via the native AVP gallery, the picture is too small. It seems we need to wait for some customizable players. Also, did you figure out how to convert pictures to watch via AVP?
Hi, amazing video. I just got the Qoocam EGO 3D - can I use the footage from that camera too? If so, do you have any settings tips? Thanks.
Your explanations are very clear to us. I like them!
07:34 I can't find the script (Spatial Metadata) in DaVinci Resolve. How can I get it?
I will upload it soon - still working on the settings and modifying it.
Your videos are always so helpful. I think I've gotten 90% of the way there with this one, but whatever I do, I can't seem to get the final output correct to upload to TH-cam. I can get a 3D video, but it won't recognize it as 180. I'm using the RF-S Fisheye lens. Using spatial, I can get a spatial video, and that seems to work, but I would also like to be able to upload a 180 to TH-cam. I think I'm missing the secret step in the script you're running on export. :)
Oh yes. It's in one of my old R5 C tutorials, using the Google VR Creator tool to inject metadata before upload. I am working on a render trigger script for Resolve to do this task automatically. Will release the script soon.
I’ll wait for the script. I did try the Google tool, but TH-cam gives an error that it can’t process the video. Same if I try running it through Premiere or Media Composer. Hoping the script solves it. Thanks again for everything you do for the community!
@@hughhou I’d love this too! VR Creator isn’t opening for me anymore and I’m not sure why.
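While waiting for the script: the Google tool discussed above also exists as an open-source command-line injector (the spatial-media repo), which sometimes works when the old GUI no longer opens. A sketch, assuming a left-right side-by-side file; the filenames are placeholders. One caveat: the stock injector tags full 360 spherical metadata, and TH-cam's VR180 ingest historically wanted the separate VR180 Creator tool, so treat this as a starting point rather than a guaranteed fix:

```shell
# Google's Spatial Media Metadata Injector, CLI form
git clone https://github.com/google/spatial-media.git
cd spatial-media
# -i injects the spherical metadata; --stereo declares the frame layout
python spatialmedia -i --stereo=left-right my_vr180.mp4 my_vr180_injected.mp4
```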
Still an early version. I tried converting VR180 videos rendered from my old Vuze XR to spatial, and they either didn't appear or were rendered as something unreadable, or at least unlike other spatial videos. When I converted spatial videos that had already been converted to FSBS back to spatial, the thumbnail on Meta Quest 3 was not like the others (although the video worked fine).
Did you already use your Vuze app to convert whatever Vuze format into a standard VR180 MP4 file before throwing it into the encoder? The encoder only works on standard video files - MP4 H.264/H.265, ProRes, or other supported formats. I have no idea what a Vuze file is, and since it is a discontinued camera, you will need to convert it to a standard format before any app can read it :(
@@hughhou Yes, the Vuze software (still around and works) converts VR180 (and VR360 2D) videos to MP4 - H.264 (not H.265). The VR180 videos show up normally in the preview window of the injector app. There's evidently more metadata involved, or something obstructing it. BTW, your TH-cam videos are the best - the most complete, with no fluff.
Hey Hugh, your videos have been an indispensable resource for me as I dove into the world of spatial video creation a few months ago for the AVP! This update has been a game changer and a huge time saver in the final edit process! Getting my edits to play spatial on my AVP works perfectly! But a question for you: after using the Spatial Metadata GUI or the KartaVR Reactor spatial metadata injector directly in DaVinci Resolve Studio, I am having issues sending that video to a friend who also uses the AVP. When I AirDrop the final edited files to my AVP from my Mac, it works perfectly! But when we send it via email link, shared iCloud folder, etc., it seems the spatial metadata goes away and my buddy cannot watch it in spatial? Any ideas here?
You're a godsend, Hugh! Do you know if there's a workaround for Intel Macs? :(
I don't have an Intel Mac, but I thought Intel Macs should work, as they are just Macs. Hmmm... let me do some research.
Oh, and do you know when the Acer camera is being released? I've been checking daily haha (hoping to take it away on holiday) :)
@@hughhou No worries, just wondering, because a message appeared in Terminal stating it is only for another architecture 👍
Is this also helpful for getting TH-cam to recognize 360 3D content?
We will work on the trigger script to do that as well. Sorry, I got busy and have not finished the last part of this tutorial - hold tight!
Really helpful thanks a lot! :)
Btw: Have you tried the Qoocam 3 when it is hot outside? I would really like to know how long it lasts without overheating...
Not long lol... In extreme weather, Insta360 is better.
Another question, Hugh - and thanks again for sharing all your knowledge: have you ever tried the Canon RF 3.9mm Dual Lens on the Canon C70 or RED Komodo to avoid the cropping effect? I have been waiting for an answer from Canon for over a week.
Does AVP spatial video support ambisonic audio? I am editing a VR180 video. I captured the audio with a Zoom H3-VR, so I can choose ambisonic, stereo, or binaural as output.
It supports Apple's version of Dolby Atmos - no one knows the exact workflow for this yet. So if you are releasing there, just mix down to stereo for now until we get more information from Apple.
@@hughhou Thanks Hugh. I'll stay tuned for any news.
I keep getting this error on the GUI: Error: The value '64,' is invalid for '--cdist '
I tried various numbers and it is not working; it always adds a comma when I exit the field.
Sounds like a bug. Can you try 64.3? I just used the latest version and it works fine for me!
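The pattern behind this bug (also reported above for German locales) is that the field turns `64.3` into `64,3` or appends a trailing comma, which the spatial CLI then rejects. A small illustrative sketch of the kind of cleanup the GUI could apply before passing the value on - this is my own example, not the app's actual code:

```shell
# Workaround sketch for the locale bug: strip a dangling separator and
# turn a German decimal comma into a dot before the value reaches spatial.
normalize_number() {
  local t="${1%[,.]}"        # drop one trailing ',' or '.'
  printf '%s\n' "${t//,/.}"  # comma -> decimal point
}

normalize_number "64,"    # -> 64
normalize_number "64,3"   # -> 64.3
```

Deliberately minimal: thousands separators like `1.234,5` are not handled.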
Is there a way to inject VR180 metadata into videos exported from DaVinci Resolve?
Yes! Use post-script injection on the Render page. I will make the tutorial soon - I keep forgetting about it.
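Resolve Studio's Deliver page can trigger a script after a render job completes. A hedged sketch of what such a hook could do - derive an output name and shell out to Google's open-source spatial-media injector. How Resolve passes the rendered path, and the injector being cloned and on hand, are assumptions, not the script from the video:

```shell
# Build the injected-file name next to the rendered file
vr180_output_name() {
  local src="$1"
  printf '%s\n' "${src%.*}_vr180.${src##*.}"
}

# Post-render hook sketch: inject VR metadata via Google's spatial-media CLI
inject_vr180() {
  local src="$1"
  local dst
  dst="$(vr180_output_name "$src")"
  python spatialmedia -i --stereo=left-right "$src" "$dst"
}

vr180_output_name "MyEdit.mp4"   # -> MyEdit_vr180.mp4
```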
Ooooh this is exciting. Will it work with dual GoPros edited together or do we need some actual stereo-camera metadata to use this?
It will, assuming you followed my last tutorial to calibrate the 3D correctly with your dual GoPros. Then this is your final step: inject MV-HEVC so your dual-GoPro 3D video can be viewed on Vision Pro. 8K 60fps, baby!!
@@hughhou Thanks for the response!
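For anyone who prefers Terminal over the GUI: the GUI is a front end for Mike Swanson's `spatial` CLI, and a dual-GoPro side-by-side conversion might look roughly like the line below. Every flag name and value here is illustrative - confirm the exact options with `spatial make --help` and the docs at blog.mikeswanson.com/spatial/:

```shell
# Side-by-side 8K60 stereo in, MV-HEVC spatial video out (values illustrative)
spatial make -i gopro_sbs_8k60.mov -f sbs \
  --cdist 64.3 --hfov 180 --hadjust 0.03 \
  --primary right -o gopro_spatial.mov
```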
Love it! Thank you for your explanation!
As usual those of us without a Mac are screwed.
Well, it's up to Apple whether they want to open-source the "spatial" standards. You need Apple to read Spatial, so it makes sense that only Macs can make it. But when they publish the standard, it will be a lot easier for us to figure out a PC workflow. It's a matter of time now.
Does this also work for 360 v3D videos and the Quest 3?
Hehe, you should see the answer in the video :) But 3D 360 video works natively on the Quest 3 - Spatial 360 injection is for Vision Pro only. The idea is to save bandwidth so people can see 8K to 16K 3D 360 video - no camera captures those specs though, except maybe the Pro 2 or Titan!
@@hughhou Uhh I'm hyped! :)
I'm trying to inject spatial metadata into upscaled 16K Canon R5C dual fisheye footage, but after 3 hours of injection, a 20-second clip became 9 seconds and only contains 2 still frames. Any idea why this would happen?
First... what?! 16K? How even? Second, this is above the Apple encoder limit and GPU processing capability. Do you have an M3 Max like I do? You might have a bottleneck in your hardware acceleration. 16K is a lot to process, and Apple has not opened the pipeline for non-Apple ways to do it. 16K needs an optimized workflow, and that workflow is not public yet. Most claims of 16K video online are not real MV-HEVC 16K, or not even 16K at all - as far as I know, but you might know things that I don't, lol. According to Mike, 16K hits hardware limits, and if you do not have the latest chips, it might not be possible.
I ran the Homebrew install in the terminal, but when I pasted the next line, "brew install spatial", the terminal reported "command not found". Also, if you are running Sequoia, you cannot just override the security check by right-clicking. You need to go into the Privacy & Security settings and allow non-Apple software to be installed.
Yes, that is new in 15.1. Here is the detailed website for spatial, where you can install it manually without Homebrew: blog.mikeswanson.com/spatial/
It sucks that Homebrew is not working - let me reach out to Mike; maybe he can fix it.
@@hughhou I actually got it working - somehow it was how I was pasting the app (the second line) into the terminal. The app works really great with the GUI front end - I was able to create spatial video out of my Canon R7 stereo footage. It looked great in Vision Pro, but of course it's presented as a circle.
When will a Windows version be ready?
Windows does not support MV-HEVC - it is not a version problem, it is a technology problem. Apple owns the Spatial Video format license and Windows does not.
Where will we see the final output, and how long do we have to wait, or does it appear immediately? I haven't tried with a 200 bitrate, but unfortunately I can't see the output on my desktop.
It outputs to the same folder as your source video.
Hi. I'm trying to convert footage from a 2-GoPro VR setup to be injected with the spatial metadata so it can be previewed on the Apple Vision Pro. However, I encountered some errors. By any chance, would you know how to fix this error message when injecting the metadata? Error: "Input video has an unspecified color transfer function."
My current workflow is:
Align the timecode of both GoPro clips in Premiere - bring it into Mistika Boutique for the VR stitch - export as ProRes LT - bring it into the Mac for MV-HEVC spatial metadata conversion.
So it could be missing color space tags. Try bringing it into Resolve and rendering with a different color space tag (on the Deliver page, uncheck "Same as project") - it's a known problem in Mike's spatial code. I don't know exactly which color space to tag, since you run it through so many software packages and it's GoPro footage, but try force-tagging it to test?
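As an alternative to re-rendering in Resolve, a quick test might be re-tagging the existing ProRes file's color metadata with ffmpeg. This is a hedged sketch: Rec.709 is only a guess for a GoPro-to-Mistika ProRes LT export, so verify the values against your actual pipeline before trusting the result.

```shell
# Re-tag color metadata without re-encoding (-c copy rewrites only
# container/stream tags, so this takes seconds even on large files).
# bt709 for primaries/transfer/matrix is an assumption for this footage.
ffmpeg -i stitched_prores.mov \
  -c copy \
  -color_primaries bt709 \
  -color_trc bt709 \
  -colorspace bt709 \
  tagged_prores.mov
```

If spatial still reports an unspecified transfer function after this, a full re-encode with the tags set (dropping `-c copy`) would be the next thing to try.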
The program didn't work for me, using the Insta360 1-inch edition. I've followed the steps over and over again but still no success. Are there any other recommendations? Thanks, Matt
Doesn't seem to work on videos shot with a 360°/180° camera (Insta360-like, but not an actual Insta360).
Yes, 360 support on AVP is limited, but the metadata is correct.
Hey man, greetings from Turkey. I follow you on every platform. I'm curious about this: starting from EOS VR Utility, we render CRM files through Adobe, upscale in Topaz AI at a high bitrate (200-250 Mbps), take the 16K scenes, and encode them from Topaz AI at the end. Do you think this system will help us, and if so, at which stage should we use it?
Also, are you feeding it the huge file you get from the CRM, or the render you get from Adobe and Resolve?
This workflow does not care about your NLE or post workflow. In general, you convert the CRM to ProRes and edit in your NLE or with Topaz AI; at the end, you need to turn your VR180 video into Spatial/Immersive 180 - the format Apple supports. Then you run it through this free app to get the correct MV-HEVC metadata. Then you play it back on Apple Vision Pro.
Mac only?
Yes. Apple MV-HEVC (spatial video) is a Mac-only thing. Even on the Meta Quest 3, you will need a Mac to do spatial video (assuming you have an iPhone). We hope Apple opens it up soon, but there is a hack to do it on Windows as well. I will leave that for my next tutorial, as it is a little bit complicated. Trying to keep this one short and sweet.
@@hughhou Thanks Hugh, from a long-time viewer who still uses the Evo360 VR180 and Qoocam... even though the OG Qoocam no longer works with VR180 on TH-cam :(
Glad to hear it. Kandao has something new coming that supports HLG and Apple's color space.
Just installed it and got it running. There is a bug with the number format: you have to switch to the 123,456.00 format instead of the German format 123.456,00 to prevent a syntax error when the arguments are parsed and passed to spatial.
The result of converting 8192x4096 to spatial is strange. In the Vision Pro it looks like a small window instead of immersive, similar to AVP recordings.
PS: This was an error in the photo library of visionOS 2 beta 3. In Kandao Player it looks fine.
Thank you for pointing this out. For VR180, you need SpatialGen or the Kandao XR app to see fully immersive 180. The AVP native player does not understand immersive video yet. Both SpatialGen and Kandao XR are free apps on visionOS. Native support should come soon from Apple.
Exactly. You are great 👍🏼
So would we run the same process on Windows, but using the Windows command line?
You actually cannot do it on Windows right now, as Spatial Video is really an Apple thing and Apple MV-HEVC needs the Apple encoder. I am looking into a workaround right now.
Oh okay @hughhou, it would be nice if you could test this in your review of the Qoocam 3 Ultra. :) Also, some tests with difficult lighting like trees and overexposed skies, which seem to be a problem with the Qoocam 3, would be really important for the Qoocam 3 Ultra review. :)
They are ALL coming!!! But under NDA, so I cannot talk about it.
Nice :) @@hughhou
After some tests... sadly it's not working for me. I have an 8K VR 360 3D over/under video. I already finished uploading it to TH-cam and wanted to use the injector on my M3 Max MacBook Pro to set the metadata for TH-cam. But no matter what I try, after encoding I only get a 2D 360 video out. No 3D. After encoding you can already see that the file no longer contains the second eye. Please fix it. You have the only Apple-silicon-friendly solution with a UI.
Unfortunately, the second line, "brew install spatial", is not accepted by my terminal app. I get: % brew install spatial -> zsh: command not found: brew
It seems the first word, "brew", is not recognized as a terminal command... do you have a suggestion on how to proceed?
Oh, did you run the first command? You need to install Brew first. Brew is a basic package manager for the Mac; it is very common in the developer community. Try searching "how to install brew on Mac" - one extra line to install brew and you are good to go.
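For anyone hitting "command not found: brew", the sequence looks roughly like this. The install URL is Homebrew's official script as I know it; double-check it at brew.sh before pasting, and note that the PATH line applies to Apple silicon Macs specifically.

```shell
# 1. Install Homebrew itself (official one-liner from brew.sh):
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# 2. On Apple silicon, brew lives in /opt/homebrew and may not be on
#    your PATH yet -- this adds it for the current shell session:
eval "$(/opt/homebrew/bin/brew shellenv)"

# 3. Now the second line from the video should work:
brew install spatial
```

If `brew` is still not found after step 2, opening a fresh terminal window usually picks up the new PATH.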
@@hughhou Of course I did: now I can see the Spatial Media Metadata Injector app installed in my Applications folder, but nothing else... Could my macOS version (Monterey 12.6.2) be the problem, in your opinion?
And thanks, as always, for your efforts...
@@robertomancuso2421 have you been able to work this out? I am in the same spot!
@@GoldWind420 I updated macOS to Ventura; by the way, I'm not sure it depended on the system…
It seems the meaning of the answer I got from the terminal was: "it's already installed on your computer."
So if it's the same for you, I can only suggest verifying that... sorry I can't help more.
@@robertomancuso2421 I am not up to date, so let me try. Thank you.
thanks