Xlip Dev
Estonia
Joined 15 Oct 2015
Software engineer, playing around with Unreal Engine, I try to create useful tutorials and content ^^
Face Depth Frame Mancer 2025.2.0 - Coming Soon! #metahuman #performance #motioncapture #ue5 #editor
Ok.. iPhone real depth data performances, are you ready for a rematch now...
I just wanted to quickly show what's coming next... Let me know if you have any questions or different ideas!
Trying to learn video editing in the meantime.. 😆
The music in this video is from:
@soundridemusic
th-cam.com/video/rIRNljkLTKM/w-d-xo.html
th-cam.com/video/__7aqBRkYpo/w-d-xo.html
Views: 856
Videos
UE 5.4 & 5.5 MetaHuman Animator Android VS iPhone - Face Depth Frame Mancer 2025.1.7 #motioncapture
1.2K views · 21 days ago
Check out the cool new features in Face Depth Frame Mancer 2025.1.7. I compared performance on the iPhone 14 Pro Max and Xiaomi Redmi Note 13 Pro with Unreal Engine 5.4 and 5.5. I'll talk about what works, what's different, and what's coming next to make it even more amazing. Stick around, it's gonna be fun! Grab the Plugin (Coming to Fab soon, hopefully!) ko-fi.com/s/486013e215 That was the ...
Unreal Engine 5.5 Metahuman Facial Animation Using Any Video! No iPhone, HMC, or Depth Cam Needed!
6K views · 1 month ago
Metahuman facial performances with any video? Yes, let's do it! No need for an iPhone or fancy equipment! Grab the Plugin (Coming to Fab soon, hopefully!) ko-fi.com/s/486013e215 Huge shoutout to @tiedtkeio for showcasing the plugin with a webcam! Watch it here: th-cam.com/video/mu9q0EknDt4/w-d-xo.html Here are the 2025.1.7 version features along with the iPhone 14 Pro Max - Xiaomi Redmi Note 13 Pro comp...
Unreal Engine Holding Hands System - Bonus! AI Companion
496 views · 2 months ago
In this tutorial, learn how to create a companion AI character that follows a player to hold hands in Unreal Engine. We’ll start by setting up a custom companion character blueprint that extends from an existing player character. Then, we’ll build an AI controller, set up a Blackboard and Behavior Tree, and define essential tasks for smooth following. Finally, we’ll cover how to fine-tune follo...
Unreal Engine 5.4 Holding Hands System Full Tutorial #6 - (Last touches, instance editable params)
334 views · 2 months ago
The last video focuses on fine-tuning control rig variables to achieve precise control over follower animation. It also demonstrates how to create an instance-editable structure, allowing you to specify unique behaviour for each instance of the follower. Also includes setting up a small AI move-to behavior for demonstration purposes.
Unreal Engine 5.4 Holding Hands System Full Tutorial #5 -(Follower IK Control Rig, Animation states)
419 views · 2 months ago
The 5th video dives into creating an IK control rig, demonstrating how to set up inverse kinematics for smoother and more dynamic animations. It also covers the setup of follower animation states, ensuring that the followers respond appropriately to the leader’s movements and interactions.
Unreal Engine 5.4 Holding Hands System Full Tutorial #4 - (Leader updates context variables)
303 views · 2 months ago
The 4th video explains how the leader will set context variables for the follower, while also exploring considerations for handling world, control rig, and actor transforms, discussing the differences between them and what to use depending on the scenario.
Unreal Engine 5.4 Holding Hands System Full Tutorial #3 - (Leader Animation and ABP State Machine)
671 views · 2 months ago
The 3rd video covers creating a leader's right arm animation, along with setting up the animation blueprint and state machine logic.
Unreal Engine 5.4 Holding Hands System Full Tutorial #2 - (Leader/Follower decide logic, collision)
332 views · 2 months ago
The 2nd video focuses on implementing follower/leader logic, along with setting up sphere collision and overlap mechanics.
Unreal Engine 5.4 Holding Hands System Full Tutorial #1 - (Introduction, design and player creation)
1.6K views · 2 months ago
The 1st video covers system design and achieving the desired outcome, explaining what tools to use, how to use them, and guiding through the initial player creation process. If you want to download it as an asset pack: www.fab.com/listings/abc24af2-7e72-4ae2-98f9-4c4464835929
Unreal Engine 5.4: Smooth Camera Zoom - Should You Use Lerp with Timelines or Timers? With examples
517 views · 2 months ago
Introduction & Define Zoom action: 00:00 - 08:34
Timeline Lerp: 08:34 - 18:41
Time Dilation & Timeline: 18:41 - 20:16
Timer Lerp: 21:22 - 33:25
Bonus Improvement with timers: 33:30 - 47:08
Time Dilation & Timer: 47:08 - 48:15
Conclusion: 48:36 - 50:41
I demonstrate how to create smooth camera zoom effects in Unreal Engine 5.4 using Lerp with both timelines and timers. I provide detailed example...
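The video builds the timer-based zoom in Blueprints; purely as an illustration of the same idea, here is a minimal Python sketch (function and variable names are mine, not from the video) of interpolating a camera FOV toward a target over a fixed duration. Driving the alpha from elapsed time keeps the zoom frame-rate independent, which is the core point of the timer-vs-timeline comparison:

```python
def lerp(a, b, alpha):
    """Linear interpolation between a and b, with alpha clamped to [0, 1]."""
    alpha = max(0.0, min(1.0, alpha))
    return a + (b - a) * alpha

def zoom_steps(start_fov, target_fov, duration, dt):
    """Yield one FOV value per timer tick until the zoom completes.

    Computing alpha as elapsed/duration (instead of nudging the FOV a
    fixed amount each tick) means the zoom always takes `duration`
    seconds regardless of the tick interval.
    """
    elapsed = 0.0
    while elapsed < duration:
        elapsed += dt
        yield lerp(start_fov, target_fov, elapsed / duration)

# Example: zoom from 90 to 60 degrees over 0.5 s with a 60 Hz timer.
fovs = list(zoom_steps(90.0, 60.0, 0.5, 1.0 / 60.0))
```

Because the final alpha is clamped, the sequence always lands exactly on the target FOV even when the last tick overshoots the duration.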
Unreal Engine 5.4 Start-Up Crashes - How to Fix - Clear Shader Cache
707 views · 3 months ago
Go to C:\Users\{YOURUSER}\AppData\Local\UnrealEngine\Common\DerivedDataCache and delete everything inside, then go to C:\Users\{YOURUSER}\AppData\Local\UnrealEngine\Common\Zen and delete the "Data" folder.
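The same cleanup can be scripted. Below is a minimal Python sketch of the two manual steps (the helper name is my own; double-check the paths on your machine and close the editor before deleting anything):

```python
import os
import shutil

def clear_unreal_caches(common_dir):
    """Empty the shared Unreal caches under <common_dir>.

    Mirrors the manual steps from the video: wipe everything inside
    DerivedDataCache, and remove the Zen/Data folder. Unreal rebuilds
    both on the next launch.
    """
    ddc = os.path.join(common_dir, "DerivedDataCache")
    if os.path.isdir(ddc):
        for entry in os.listdir(ddc):
            path = os.path.join(ddc, entry)
            if os.path.isdir(path):
                shutil.rmtree(path)
            else:
                os.remove(path)
    zen_data = os.path.join(common_dir, "Zen", "Data")
    if os.path.isdir(zen_data):
        shutil.rmtree(zen_data)

# On Windows the real location would be something like:
# clear_unreal_caches(os.path.expandvars(r"%LOCALAPPDATA%\UnrealEngine\Common"))
```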
Transparent/Fading Camera Blocking Object Material - Unreal Engine 5.4
957 views · 3 months ago
Transparent/Fading Camera Blocking Object Material - Unreal Engine 5.4
How to Rig Your Character to Unreal 5 Skeleton In Blender without any Blender Plugin - Add on!
761 views · 3 months ago
How to Rig Your Character to Unreal 5 Skeleton In Blender without any Blender Plugin - Add on!
Facial Mocap - Unreal Engine Metahuman Performance With Any Video for Free!! No iPhone or HMC
9K views · 3 months ago
Facial Mocap - Unreal Engine Metahuman Performance With Any Video for Free!! No iPhone or HMC
Unreal Engine 5.4 Baking Animation into Modular Rig - Shoulder, Arm Issue How to Fix
909 views · 4 months ago
Unreal Engine 5.4 Baking Animation into Modular Rig - Shoulder, Arm Issue How to Fix
Unreal Engine 5.4.3 Modular Rig - Leg, Arm Issue (Pole Vector) How To Fix
1.1K views · 4 months ago
Unreal Engine 5.4.3 Modular Rig - Leg, Arm Issue (Pole Vector) How To Fix
Blender 4+ Rigify Rig to Unreal Engine 5+ Retarget animations - Without Plugins!
375 views · 5 months ago
Blender 4 Rigify Rig to Unreal Engine 5 Retarget animations - Without Plugins!
Unreal Engine - Animated Text Widgets Demo
79 views · 6 months ago
Unreal Engine - Animated Text Widgets Demo
Unreal Engine - Animated UI Number Renderer Widgets Demo and Usage
94 views · 6 months ago
Unreal Engine - Animated UI Number Renderer Widgets Demo and Usage
LudumDare #36 SoundCarrier-Phone Inventor
298 views · 8 years ago
LudumDare #36 SoundCarrier-Phone Inventor
LudumDare #36 SoundCarrier the Phone Inventor..
115 views · 8 years ago
LudumDare #36 SoundCarrier the Phone Inventor..
I have followed your previous videos about this concept ever since I searched for how to use Android instead of iPhone, and only your solution made sense. I am glad you have now finished the plugin. Great work, will surely get it. I have a question though: after creating an animation using your plugin, is it somehow possible to apply it to CC4 (Character Creator) characters?
Great thanks! I'm happy to hear you like it ^^ I've never used CC4 personally, but afaik CC4 uses a similar skeleton hierarchy and morph targets as UE MetaHuman. My plugin helps you create accurate depth maps from your video, which can then be turned into depth footage and processed through the UE MetaHuman pipeline. So, if you're able to use any MetaHuman animation with CC4, it means you should be able to use the performance output/result with CC4 as well.
Bookmarking this.
I have been trying to import a character into UE5 forever now. This will not work, because the head needs to be separate due to the morph targets for expressions. It is still very useful. Thank you!
@@johnnien2 you are right 👍 if you want to rig the character to a metahuman skeleton then yes, you have more things to do 😄
Can you help me please? I imported the video into UE5.5, clicked Extract Video Frames and got this error: "It seems there's an issue with the video. Please ensure the video exists and check the logs for more details." Captured with a Samsung Galaxy S24, tried different resolutions, H264, tried converting it with ffmpeg, still the same error.
@@RobertKleinSk Can you jump into our Discord server? Let's check; it would be nice if you could share the error details from the Output Log.
@@xlipdev
LogScript: Error: Script Msg: Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Program Files/Epic Games/UE_5.5/Engine/Plugins/Marketplace/FDFMancer/Content/Python\video_info.py", line 2, in <module>
    import imageio_ffmpeg as ffmpeg
ModuleNotFoundError: No module named 'imageio_ffmpeg'
LogScript: Error: Script call stack:
  /FDFMancer/ActionUtilities/EUB-FileMediaSourceUtilityAction.EUB-FileMediaSourceUtilityAction_C.ExtractVideoFramesWithManager
  <--- /FDFMancer/ActionUtilities/EUB-FileMediaSourceUtilityAction.EUB-FileMediaSourceUtilityAction_C.Extract Video Frames
Yes, I will join the Discord.
I'm happy we fixed that ^^ Let me know if you encounter any other issue
@@xlipdev this was absolutely instant support 😀I appreciate it! Thank you!!
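For anyone hitting the same error: the traceback above shows Unreal's bundled Python is missing the imageio_ffmpeg package, so one likely fix (my assumption, not the confirmed Discord solution) is to pip-install it into the engine's own interpreter. A sketch; the helper and the default install path are illustrative, so verify the interpreter location on your machine:

```python
import os

def ue_pip_install_cmd(engine_root, package):
    """Build a pip command targeting Unreal's bundled Python.

    The interpreter location below matches a stock UE 5.x install on
    Windows, but treat it as an assumption and check it locally.
    """
    python_exe = os.path.join(
        engine_root, "Engine", "Binaries", "ThirdParty",
        "Python3", "Win64", "python.exe",
    )
    return [python_exe, "-m", "pip", "install", package]

cmd = ue_pip_install_cmd(r"C:\Program Files\Epic Games\UE_5.5", "imageio-ffmpeg")
# To actually install: import subprocess; subprocess.run(cmd, check=True)
```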
This is awesome! I have a question: can the plugin work using a video like th-cam.com/video/vtT78TfDfXU/w-d-xo.html as input?
Very good point! And yes! I actually shared a quick example on our Discord server that you can check out. I'll share a tutorial on it later! The results look pretty nice, but keep in mind that during MH Identity creation, you need natural poses of the actor (preferably front-facing and slightly to the left and right, without any facial expression). If your video provides these frames, you can achieve decent performance outputs!
@@xlipdev thanks, looking forward to your tutorials
This is so awesome! i can't wait! Very cool!!!
Great thanks! Happy to hear you like it ^^
Worst tutorial I have ever watched in my entire life. All modules have problems, nothing worked at all, and the new plugin you made is 20 dollars, that's too much; only professionals can afford it, students can't even experiment. I have wasted a whole day on this without a single outcome. So bad.
This repository requires coding experience. If you are not sure what you are doing with your Python env and your system, then of course expecting a magical outcome doesn't make much sense. So many people have achieved very good results with this. You could ask questions in our Discord channel and I'd be happy to help, as I have for many others; even checking the channel messages would help. Instead, you chose to express your frustration with this comment 😅
@@xlipdev I made this possible for free and without your repo. I can't share it with you, but I got something way better and easier; it takes around 5 minutes.
@@NoishiXzenTM That's great I'm happy for you ^^
Thank you so much! This "zen" error made me crazy! Do you know if there is a way to fix it permanently? The zen error is still there after restarting my PC...
🤔 What is that "zen" error? You can try deleting the Zen/Install folder (in AppData/Local, as I showed in the vid); Unreal Engine will try to create these exe files again, which may resolve your issue.
@@xlipdev I meant to say that in my case, the crash report diagnostic is only about the zen folder. If I delete it, it works, and I can launch Unreal without crashing. But when I reboot my computer, UE crashes again and I have to delete the zen data folder again. I'm still investigating because I noticed no crash when executing UE with admin-rights, but doing that, I can't drag'n'drop files into my content browser or import animations into my project.
The problem is that I'm still totally unable to create animations with that rig. It's a bit messy. I can't find any tutorial about creating animations by myself with that Rigify rig.
You should rely on Unreal Engine's animation pipeline to create animations with the Unreal mannequin skeleton, as it provides access to control rigs, blend shapes, and more, making the process much easier. However, some tutorials still use Rigify and import Blender-created animations into Unreal. Keep in mind, though, that the bone/joint systems in these two programs are different, which makes the process quite challenging 🥲
@@xlipdev Thanks, I already tried that. Unfortunately I cannot mirror animations that way, can I?
@@nazariobiscotti8011 Nope, animation keyframes cannot be mirrored as I did for the skeleton in this video I believe. However, if you have a character rigged to the UE5 skeleton, you can use animation retargeting and various Unreal animation features, which might be more helpful than trying to import animations created in Blender
I am very happy that you continue in this field of animation; it is the Achilles' heel of 3D production... On the other hand, I don't know if you would dare to develop the same thing but for full-body animation. And like Betamax, VHS and the DVD player, motion capture equipment goes to the landfill 😁 Thank you very much, friend, for sharing and also for selling your product so cheap.
Thanks to you for your kind words ^^ I actually started developing the plugin after your comment and didn’t expect this much attention tbh 😆There are many apps and AIs out there for creating body animations, and I’ve seen some great results! Currently, I’m not planning to dive into it, but let’s see how it goes. Thanks again for the appreciation it means a lot!
thank you
You are very welcome ^^
That's too impressive!
Saw my comment too 🎉🎉❤❤
You were the first one who commented ^^ ❤
@xlipdev Love your work 😊
Thank you very much. I always wanted to make cinematics with human characters, but even if I animate the body it would be useless without facial animation. This might make my dream come true. Btw, will it work with Samsung mobile cameras? I know that's within the goal of this project, but just making sure! Thanks again!
You’re very welcome! ^^ Yeah, facial animation is key to bringing a character to life, and that’s exactly the idea! Any camera will work with any resolution!
AFTER 3 MONTHS YOU ONLY SOLD 75 CAUSE ITS GOOD BUT NOT 20$ GOOD IT WORTHS MORE LIKE 2$ LOWER YOUR PRICE OR REALESE IT FOR FREE THATS WHY GITHUB WAS MADE TO MAKE THINGS FREE AND EASY FOR DEVS NOT TO MILK THEM IF PPL HAD 20$ TO SPEND IN THIS THEY MIGHT ASWELL BUY AN IPHONE AND FLIP YOU OFF CAUSE THOSE WHO WANT THIS METHOD ARE HERE CAUSE THEY HAVE NO IPHONE
Have you ever checked other facial capture related software or equipment prices...
0:10 WELL LOOKS LIKE IT NEEDS TO BE FREE FOR EVERYONE NOT FOR MAKING 20 FREAKING $ PER PURCHASE ITS MORE LIKE 2 THAN 20
Bro no one forces you to buy it, why are you so triggered with this
I SUGESST TO YOU MAKE IT 2 $ ONLY OR FREE AT ALL
Great plugin man. But I'm looking for something like Quickmagic but for facial animation. Your plugin has a potential for someone that didn't buy Rokoko Facial head rig
Thank you so much! From what I’ve seen, the Rokoko Facial Head Rig doesn’t quite match the performance capabilities of Unreal’s MetaHuman system. However, I’ve heard great feedback from users who utilize a head rig with this plugin and are satisfied with the results. Just to clarify, the plugin generates depth frames that can be used with the MetaHuman Animator in Unreal, so even raw video from an iPhone can be utilized effectively. That said, using an HMC (Head-Mounted Camera) makes the process much smoother and more efficient ^^
Yeah! I saw my comment in the video! This trailer got me so excited. I’ve been debating whether to get an iPhone during the Christmas sale for motion and facial capture, but after watching this video, I’ve completely given up on the idea. I absolutely don’t want to use an iPhone anymore!
Yea it was there ^^ I use my MacBook to make sure the window in my room isn't opened all the way. So, there are definitely some good use cases for Apple products 😅
Dude you BEAT APPLE !!
I will make another comparison video for sure ^^
that's phenomenal!!!!
Thanks a lot!
good boi
It is learning ^^
So excited! 😍
Me too 😆
Dude, you continue to amaze me:D
Thanks for the kind words ^^ I'll keep going 😆
Haha amazing vid 😂😂
Great thanks ^^
is it free or does it have a cost?
@@inteligenciafutura The editor plugin I showed in this video is a paid version, but I will update the repository I shared earlier once the release is out 👍
wow nice edit :D
Thank you! It's good to hear that ^^
You went battle mode with this trailer! lol I can't wait! For videos other than faces and metahumans, will it be possible to generate depth frames to be used as displacement maps?
Great to hear you like it ^^ I wasn't too happy about the last comparison video with the iPhone😆 Very interesting question! Honestly, Frame Mancer might not be the best for creating depth maps for non-face videos, as it heavily relies on facial landmarks. I believe there are many other apps specifically designed for generating displacement maps that would do a better job ^^ But good idea! I will definitely check..
@@xlipdev Ok. Cool. The reason I bring this up is how you are able to generate the depth frames locally within Unreal. Could there be a way to use an AI model (Depth Anything v2, for example) to batch-create a depth frame sequence using your setup? Anyway, just spitballing; might be worth a try ... 🙂
@@damiakinbode Actually, that's a very nice idea! I also initially considered using an AI model, but most of them are licensed (I mean the models specifically trained with faces; other models don't give very good output) and cannot be used in commercial applications. However, it is possible to get a trainable model and train it with your own data locally! I'm still searching for some AI models that provide better results, but let's see how it goes.
Ok ok, now I need a MacBook to calibrate my webcam! Sounds cheap!!! Keep up the good work man!
😆 No real calibration needed btw! You just tweak the focal length adjustment, and the plugin takes care of the rest ^^ My MacBook already did its job for everyone 😂
Can we extract any video footage data which is not calibrated through meta human animator (like as in the process we do) ?
Currently, extracted frames are scaled to 720p during extraction to align with the default calibration data for plugin processing. However, the new version will expose all settings, including FPS and aspect ratio, allowing you to customize resolution and other parameters for extraction and processing while creating custom calibration assets with the plugin ^^
Great 👍🏻👍🏻
Can we extract any video clip data through this plugin, which is not calibrated through metahuman animator?
Yes, but currently, extracted frames are scaled to 720p during extraction to match the default calibration data for plugin processing. However, in the new version, all settings will be exposed, enabling you to specify any aspect ratio (resolution) for extraction and processing as you create custom calibration assets with the plugin 👍
@@xlipdev Cool 👍🏻👍🏻👍🏻
@@xlipdev But what about the initial calibration process? The front view, side views and teeth view. How can that be done with a custom video clip???
@@malikgillani Initially, you'll set your video and depth frame resolution/fps etc. before extraction (there's no longer a restriction to the 9:16 aspect ratio!). Before creating the Footage Capture Data asset, you'll need a Camera Calibration asset. With the plugin, you simply adjust the focal length, and it will automatically create and assign a Camera Calibration asset to your footage. From there, you can proceed with the regular MetaHuman pipeline for identity creation etc.
@@xlipdev please do make a tutorial video for this too. Thank you for responding 😇
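On the 720p scaling mentioned in this thread: the plugin's internals aren't shown, but an aspect-preserving resize to a fixed target height typically looks like the sketch below. The helper is my own illustration, not the plugin's actual code; the even-number rounding is there because most video codecs require even frame dimensions:

```python
def scaled_size(width, height, target_height=720):
    """Scale (width, height) to target_height, preserving aspect ratio.

    The width is rounded to the nearest even number, since video
    encoders generally reject odd frame dimensions.
    """
    scale = target_height / height
    new_width = int(round(width * scale / 2)) * 2
    return new_width, target_height
```

For example, landscape 1920x1080 footage maps to 1280x720, while portrait phone footage keeps its aspect ratio with the width rounded to an even value.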
I've tested the plugin, it works nicely. Where can I adjust it to capture at 60 fps? Right now it's all 30 fps. Thanks.
Yes, in the new version, options for video/depth frame resolutions, FPS, and all other calibration data settings will be fully available👍
Thanks!!! May I ask a question about the skeleton? I don't know why there is always an issue with the MetaHuman skeleton when I import a rigged character into UE5. It says I need to combine skeletons, but I didn't change anything about the structure of the skeleton. Thanks again!
Hey, are you sure the exported file contains only one armature? Unreal might get confused if there are multiple armatures and won’t know which one to use for creating a skeletal mesh 🤔
@@xlipdev Thank you for your nice reply! I will check it again. It seems to happen quite often that the "root" name of the skeleton in Blender gets changed.
Question... at 10:18 I see on the Xiaomi Redmi there's some shadowy glitching on the face during smiling, what is that or what is going on there? Is that normal? Will it show up if there are textures or? I noticed it doesn't seem to do that as much on the iPhone.
Good point! I believe the reason was that the 2025.1.7 version of the plugin still uses a default camera calibration asset retrieved from an iPhone (which has lens distortions), so it wasn’t matching 100% with the footage which causes that. I can happily say that it’s not happening anymore and won’t be there in the new version ^^ Additionally, when I applied these animations to a MetaHuman, I didn’t see these glitches🤔
@@xlipdev not a problem brother good to know. I appreciate the honesty. That means a lot and I hope to continue to see this plugin grow and develop further. :)
@@deygus Great thanks for the kind words and support man ^^ Sure I'll make it better and better..
Very interesting!
Thanks! Good to hear it is not boring😆
Amazing Plugin! Are you still doing game dev work as well?
@@AecelotGrimmWyrdStudios Great to hear you like it ^^ Right now, I'm dedicating all my time to developing this plugin. Once I'm satisfied with the results and release the new version(s), I plan to continue with game dev 👍
@@xlipdev Let me know when you do! You seem like you know what you're doing and we're always looking for collaborations!
Thanks for the video. I am getting an error at 12:26: ValueError: string is not a file: face_model_with_iris.obj
Thanks ^^ You need to be in the face_mesh directory before running that script. Simply go with 'cd ./face_mesh' first and then try running it.
@@xlipdev I'm sorry if this is obvious, but I'm really new to this stuff. I did cd C:\Users\Any\OneDrive\Desktop\faceDepthAI-master\faceDepthAI-master\face_mesh> then I ran the command python create_single_sample_and_display.py, still the same issue.
@@One1ye19 No worries, but yea this repo requires some knowledge about coding and development, you can jump here discord.gg/r4xj4hsk I will try to help ^^
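The ValueError in this thread happens because the script loads face_model_with_iris.obj via a path relative to the current working directory. A common cwd-proof pattern (a generic sketch, not the repo's actual code) resolves data files relative to the script itself:

```python
import os

def resource_path(script_file, filename):
    """Absolute path to `filename` sitting next to the given script,
    independent of the directory the user ran Python from."""
    return os.path.join(os.path.dirname(os.path.abspath(script_file)), filename)

# Inside the repo's script one would write something like:
# model = resource_path(__file__, "face_model_with_iris.obj")
```

With that pattern, the script works from any directory and the 'cd ./face_mesh' step becomes unnecessary.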
Thank you TH-cam for recommending me this video! Can't wait to try this out!
@@itsMBWAAA Great to hear you like it ^^
HiHi, I bought the plugin, but when I want to process the depth frames, this is what I get: "The process was not completed :( Please check the logs for more details. If the process ended unexpectedly midway, the already processed depth frames have been saved. Consider running the process again and remaining in the Unreal Engine editor until it completes to prevent the engine from unexpectedly terminating background tasks."
Could you try lowering the values for "Parallel Process Count" and "Batch Size"? If that doesn't resolve the issue, join the Discord channel so we can take a closer look together.
Hi, I'm having some trouble. The Metahumun Identity panel is blank when I open it, even though I followed all the steps, but it doesn't function. Is there a specific issue? If so, please let me know how to fix it.
You're not providing much detail, my friend 😅. There could be many things that went wrong. Start by checking if the RGB and depth frames exist in the folders using a file explorer. Then, review your image sequences to ensure they open correctly, have the right dimensions, and display the expected FPS values. Finally, inspect your footage thoroughly.
A true genius, both the plug and your process are fascinating. Well done and thank you very much. You are my hero for the coming year, that's for sure.
Very kind of you ^^ Great to hear it helps 🤗There's more exciting stuff on the way!
I bought the addon yesterday! great work!
Great thanks for support 🤗I hope you enjoy it ^^ This is just the beginning, and it will get even better ^^
This is awesome.. A lot of people were waiting for something like this! May I suggest you change the name to something easy to type and remember? (Android Face Link or something)
Tbh I didn’t expect that much attention😅I’ve never been great at naming things 😄Android Face Link hmm I was thinking about an android app for real time tracking later on and actually not a bad name, let's see🤗
Great! I also considered doing something similar (using MiDaS to generate depth frames) but then came across your solution. Honestly, looking at the depth frames from the iPhone, I have doubts about whether this depth data is actually used in any meaningful way during animation generation. I can't quite grasp how such noisy data could help differentiate emotions. Have you tried feeding in blank, solid-color depth frames instead of the generated ones? I suspect the results might not differ much.
That's cool! I actually started with MiDaS too, but since the models aren’t trained for faces, I couldn’t get accurate depth maps. I ended up finding other models specifically trained on faces that produced much better high-quality maps. However, each frame is generated individually, which makes it incompatible with the entire sequence when using AI. And yeah, I also noticed that depth frames don’t significantly affect facial features in most cases, so I decided to stick with my current implementation. That said, there are still scenarios where they can have an impact, mainly due to camera calibration (like head distance, rotation, etc.). From my testing, it seems impossible to process the MH performance pipeline with empty or solid depth maps, it just crashes without providing any clear info. I guess the Epic team wants to keep the magic under wraps for now! 😆 Or maybe I just haven’t dug deep enough or found the right sources to investigate further.
Thank you brother. I spent days trying to recover my project. 😭 You just saved it. 😭
@@LestZA That's great to hear it helped 🤗I lost many projects already so decided to make a quick video 😁
Awesome man, you did it! haha. I saw Marc's demo on LinkedIn and tried myself to generate some depth maps in ComfyUI and stuff, but work got in the way and I didn't have time. But here you are, you fucking did it! haha. The iPhone is better, but here in Brazil your plugin costs around 160, while an iPhone 12 is almost 3k, so I see it as "is the iPhone 10x better?" N O W A Y. Again thank you, love you, keep up the dev!
@@PowerfullUs Thanks a lot! ^^ I hope you enjoy it ^^ Round 2 is gonna be completely different 😎
INSANE!! Amazing job! I'm glad I don't need to buy an iPhone :D
@@OverJumpRally Thanks a lot! Hope you enjoy it ^^ By the way, I've surpassed iPhone performance internally 😆 I'm working on the new version, so keep an eye on it, I'll be sharing the details soon ^^
The lips are slightly off, but it is a promising plugin that will help budget-friendly equipment compete with iPhone quality!! Keep it up, you are the best!
@@massinissa8697 Massive thanks for the support man 🥹 I'm almost there I will share the details very soon ^^