MalcolmMHAX

  • Published Dec 26, 2024
  • MetaM2.0 coming soon... a more straightforward way to replace the original MetaHuman, without needing to create a DNA file for each new character... for now, just a new MHA test

Comments • 8

  • @Joshsmith-h3t · 1 year ago

    This is awesome! What tool/repo do you use to extract the depth data from the video source?

  • @TinNguyen-dg4zv · 1 year ago · +1

    Nice work!

  • @Babakkhoramdin · 1 year ago

    Very nice! Can you explain the left-side footage, please?

    • @marc1137 · 1 year ago

      This is using a tool called MetaHuman Animator... the left side is the footage being tracked.

    • @Babakkhoramdin · 1 year ago

      Thank you very much for your reply. What software do you use to track that footage? Can you explain the workflow in general? @marc1137

    • @Babakkhoramdin · 1 year ago

      I don't have the money to buy an iPhone. I've always wanted to be able to transfer my facial movements to my MetaHuman from a video. Your assistance would be much appreciated. Thanks in advance. @marc1137

  • @Ateruber · 1 year ago · +1

    Combined with deepfakes, it would be very cool.

    • @marc1137 · 1 year ago · +1

      You could do a deepfake using a MetaHuman with hundreds of poses as the source, but having the result in 3D means there are no limits on what you can do with it.