Amazing step by step, thank you for sharing. Like many I bought a Samsung Gear 360 (£20!!!!) confident that I’d find an app to convert the footage, I was so wrong, but this tutorial makes the camera the bargain of the century, thank you! I’m sharing the shit out of this!
No problem. I still have this issue years on with other 360 cameras, so this workaround is still valid even if it does sometimes add extra compression.
Thank you so much for making this. I too have a Samsung 360 and it is now impossible to install the software so this clip is a lifesaver.
Thank you very much. When my file was corrupted and the metadata was lost, I was able to solve the problem by watching your video.
I am grateful!
This tutorial was a life saver, thanks a million!
THANK YOU SO MUCH!! Lifesaver for using my old MiSphere after having it collect dust for many years 🤦♂
I'm fascinated by your deep understanding of the VR tools in AE!
A lot has been learnt since then. Without going into detail, I've been working on some 360 film projects that have gone beyond what workflows typically highlight and so had many challenges to overcome, the biggest being processing and optimising files that were, in our case, hundreds of GBs in size per clip.
Man, you saved my bacon, thank you a lot!
This video helped me so much! You are a lifesaver
That's what I was looking for. Thanks man.
Thanks for this, a big help as I'd worked on a shoot and then found my 360 camera's software no longer worked. It's a nice way of doing it. I ended up outputting the inner 180° image and the outer 180° image to separate image files, then used the warp tool a little plus some careful layer-mask blending and brightness/colour work in Photoshop to hide the joins really finely, with lots of control.
Hey man, thank you for this.
Just got a 2nd-hand Gear 360 for cheap for taking on-site virtual tours, and man, has Samsung dropped the ball with software support. Luckily there are ways to get around the official software...
I was about to open up Blender, project each fisheye onto the inside of half a sphere, then stick an equirectangular camera in the centre to render... this is probably faster lol
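For anyone curious what that hemisphere trick boils down to, here is a minimal sketch of the equivalent direct remap in Python with NumPy and OpenCV. It assumes an idealised equidistant fisheye model, exactly 180° per lens, lens 1 filling the left half of the frame and lens 2 the right half, and lenses mounted perfectly back to back; the function name and layout are illustrative only, and real footage will need per-camera offsets, rotation and a feathered seam on top of this.

```python
import numpy as np
import cv2

def dual_fisheye_to_equirect(frame, out_h=1440, fov_deg=180.0):
    """Remap a side-by-side dual-fisheye frame to an equirectangular one (idealised)."""
    out_w = out_h * 2
    src_h, src_w = frame.shape[:2]
    lens_w = src_w // 2                     # each fisheye assumed to occupy half the width
    radius = min(lens_w, src_h) / 2.0       # assumed image-circle radius
    half_fov = np.radians(fov_deg) / 2.0

    # Longitude/latitude of every output pixel
    x = (np.arange(out_w) + 0.5) / out_w
    y = (np.arange(out_h) + 0.5) / out_h
    lon, lat = np.meshgrid(x * 2 * np.pi - np.pi, np.pi / 2 - y * np.pi)

    # Unit direction vectors; +Z is the front lens' optical axis
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)

    front = dz >= 0                                          # which lens sees this direction
    theta = np.arccos(np.clip(np.where(front, dz, -dz), -1.0, 1.0))
    r = radius * theta / half_fov                            # equidistant model: r proportional to theta
    phi = np.arctan2(-dy, np.where(front, dx, -dx))          # back lens mirrored horizontally

    cx = np.where(front, lens_w / 2.0, lens_w * 1.5)         # centres of the two image circles
    cy = src_h / 2.0
    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy + r * np.sin(phi)).astype(np.float32)
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)

# e.g. equi = dual_fisheye_to_equirect(cv2.imread("dual_fisheye_still.jpg"))
```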
Glad I could help!
You saved my life!
Thank you so much, you are a lifesaver. I was frustrated with FCP, they don't have this VR converter tool.
GREAT INFO, thanks.
I have watched it a few times to get a better understanding...
However, I do have a question ...
When you increased the FOV for lens 2 from 180° to 201° (for the seam stitching) isn't that stretching/enlarging that lens' video by a small amount?
Thus making the lens 2 view just slightly zoomed in compared to the lens 1 view?
If yes, could you have (instead) adjusted the lens 2 FOV by about half as much and then done a similar feathering setup and FOV adjustment for lens 1 of about the same amount?
Perhaps doing so would even remove that faint dark line at the stitch?
I am thinking that the faint dark line exists because there was no feathering for lens 1, creating a 'thicker' merge across that 30-pixel range, where a single feathered edge overlaps a solid edge rather than lens 2's feathered edge overlapping lens 1's feathered edge.
Or perhaps I am not grasping something?
Thanks for the great video.
Yes, you could increase both by an even amount; it depends on how the video was captured. In my example increasing one lens was fine, but in principle, yes, increasing both to blend the edges would be better in many cases. This video was made many moons ago and is due a remake/update to cover further things missed.
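To make that feathered-overlap idea concrete, here is a rough NumPy sketch of cross-fading both edges across the overlap band, which is what blending both lenses amounts to once the widened FOV gives you shared pixels at the seam. The feather_seam name and the 30-pixel band are illustrative only, not taken from the After Effects setup in the video.

```python
import numpy as np

def feather_seam(left, right, overlap=30):
    """Blend two colour (H, W, 3) halves that share `overlap` columns at the seam."""
    ramp = np.linspace(0.0, 1.0, overlap)[None, :, None]       # 0 -> 1 across the band
    seam_l = left[:, -overlap:].astype(np.float32)
    seam_r = right[:, :overlap].astype(np.float32)
    blended = seam_l * (1.0 - ramp) + seam_r * ramp             # both edges feathered, not just one
    return np.concatenate(
        [left[:, :-overlap], blended.astype(left.dtype), right[:, overlap:]], axis=1)
```

Because both contributions taper to zero at opposite ends of the band, neither hard edge survives, which is why feathering both lenses tends to hide the faint line a single-sided feather can leave.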
Very useful, thanks man!!
Hi, I have a few questions. I have the Theta V, and I am wondering what the difference is on my camera between taking photos with the dual fisheye plugin and the bracketing mode. In the first I can also do bracketing, but I don't have PTGui to make the stitching process easier, and in the second I can do it without stitching. Is there actually a difference between using one mode or the other? I know that with the Theta Z1's DNG pictures you can get a better image, but what do you think about the Theta V?
Bracketing is more for HDRI, where shots at different exposures are taken and CAN create better images in variable lighting conditions; the fisheye distortion is already applied by the nature of how the lenses work.
I am not a huge fan of the Theta V storage- and battery-wise (not upgradable), but picture quality is great for the price. The biggest differences tend to be in per-lens pixel density and expandability tbh; if a 360 camera says it's 5K, per lens it's closer to 2.5K (if it has 2 lenses).
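Rough arithmetic behind that "closer to 2.5K" point. The 5120×2560 output below is just an example of an advertised "5K" stitched frame, not any particular camera's spec.

```python
stitched_width = 5120                  # example "5K" equirectangular frame covering 360 degrees
per_lens_width = stitched_width / 2    # each lens only covers roughly 180 of those degrees
px_per_degree = stitched_width / 360   # about 14 px of detail per degree of scene
viewed_fov_deg = 100                   # a typical headset / flat-reframe field of view
print(per_lens_width)                  # 2560.0 -> the "closer to 2.5K" per lens
print(px_per_degree * viewed_fov_deg)  # ~1422 px actually on screen at any moment
```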
VR Converter is missing from my After Effects? Any ideas how to get it?
Please, how can I convert a fisheye movie into a spherical mirror?
Is there a way for Final Cut Pro X?
Unfortunately I don't have access to Final Cut Pro, but if there are similar tools it should be possible... sorry
This is mind blowing! Would anyone be able to help me convert a file? I don't have After Effects (or the skills!!) - I just received a fisheye mp4 recording of my son's first flying lesson and as a fisheye file it seems next to useless (the software for the camera no longer exists). I would happily pay someone to convert it for me. Thanks
very helpful. Thank you
Glad it was helpful!