Thanks for watching! Join us every Tuesday in April when Phil takes a look at what’s going on with AI.
To watch more of our robot revolution coverage, check out our AI playlist here: th-cam.com/play/PLJ8cMiYb3G5ek1Ux66aJ_qWf6CfBaAkGG.html
@DavidLimReport yeah, as in next week. Commenting on what 'AI' cannot do is only about today, the moment being discussed. That's it. The public and media haven't yet grasped how fast AI is evolving now, and still call it a pattern machine. They're talking about what the system was generations ago, when it started.
I think greenscreened footage will always be possible to spot until the technology can also match the lighting
AI relighting is in the works! I think Two Minute Papers has a video about it.
Clipdrop has an AI relight that’s pretty decent tbf
Ray tracing is doing just that
That's why they created The Volume
@@MatthewHoworko Hold on to your papers.
The Filmmakers who are choosing to still replace backgrounds in The Volume and other LED rooms are part of the Hollywood VFX problem. They aren't making enough of their decisions before filming, instead throwing more and more of the responsibility on VFX teams to solve their issues.
Yup, and they give the VFX team zero time to do it. They just scream "Fix it in post!" without having any idea what the words *really* mean. If all directors, producers and cinematographers tried to do what the VFX houses do for a month _(or six months would be perfect),_ I think movies would get better again instead of this decline of quality we've seen this last decade or so.
VFX artists are very overworked. It's a very stressful job, sitting in front of an editing screen for hours every day.
@@angelgjr1999 Honestly as a VFX worker, sitting in front of the screen is the easiest part. The hard part is being able to turn around something high quality in short time frames. Some productions are good and allot enough time for VFX, but others are nuts and often exploitative or don't have necessary assets prepared. VFX artists can do anything 2d and 3d as long as they have enough time and money.
Exactly!
Change hair, replace hand, replace character with digital double, add weapon, etc.
All requests coming up in the editing bay, en masse.
just spent a day filming with a ton of rear projection... like Hitchcock would do in the 40s. But it looked amazing BECAUSE the lighting was accurate and the background image was meticulously crafted by an artist. My director made a huge point to make as many decisions before filming as possible and because of that the frames turned out so well. Greenscreen, AI, or even simple rear projection is a creative choice and it must be made in coordination with the story, the themes, the lighting, the sound, the cinematography in order to work well as an element of media art.
Why did they stop asking people to step out of the frame at the beginning of the call, like iChat used to do back in the day? Easy way to tell the AI what the background is
UX geeks thought users want filtering to happen instantly and this extra step was too disruptive?
That kind of approach doesn't work if someone moves their phone or laptop, either on purpose or accident. Then everything breaks down.
Thank you Phil! You always bring the most interesting videos on subjects I didn't even know I *needed* to know! And, _of course,_ a *big* thanks to Vox, *all* of you are amazing
how do i like this comment 100 times
The lighting is the thing that makes it stand out to me
Phil's videos are the best, the mix of information and humor makes them really stand out.
In DaVinci Resolve you can basically combine the Magic Mask with keying on colors and skin to get a near-perfect key without a greenscreen. It takes some practice, but I can remove most backgrounds fairly quickly using this technique. (A rough sketch of this kind of no-greenscreen matting follows this thread.)
Solid tip. Resolve is moving in leaps and bounds with these features. It's what prompted me to switch from Adobe Ae - They rested on their laurels and are now being left in the dust.
Any videos or guides on this?
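For anyone curious what this kind of AI matting looks like under the hood, here's a minimal Python sketch using MediaPipe's selfie-segmentation model - the same general idea, not what Resolve actually ships, and the file names are made up:

```python
import cv2
import mediapipe as mp
import numpy as np

# One frame in, one composite out. Model weights ship with mediapipe.
seg = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)

frame = cv2.imread("talking_head.png")          # hypothetical input frame
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)    # mediapipe expects RGB
mask = seg.process(rgb).segmentation_mask       # float matte, values in [0, 1]

alpha = mask[..., None]                         # HxW -> HxWx1 for broadcasting
background = np.full_like(frame, (40, 30, 20))  # flat replacement background
comp = (alpha * frame + (1.0 - alpha) * background).astype(np.uint8)
cv2.imwrite("composite.png", comp)
```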
I recently discovered AI background removal in Filmora and I couldn't believe the results. It helped me fix a poorly lit background that was obvious during jump cuts, and I was able to replace it with a still image
Who else here is amazed/shocked at how fast AI is coming together in our world 😲
it's been developed for a long time
The real unappreciated change is the ability to store, share and evaluate millions, even billions, of gigabytes of data.
YouTube alone is something like 2,500,000,000 gigabytes of data.
It's always been there - the YouTube algorithm, the Insta feed, the advertisements on Facebook - it's just more visible now
@@BarnabyTheEpicDoggo you think those are AI?
I’m terrified
There is a key difference that makes the comparison insufficient: a traditional green screen has a permanent but static potential, while AI can be continually trained until the perfect result is achieved - and then you can add features a green screen can't produce, like changing the painted-out parts into different things.
This was great. Rip to Sheila Jr.
Sheila Junior had it coming.
Why hasn't AI replaced GoT S8 already?
Timestamps & key points by Tammy AI 🔥🔥🔥Enjoy!
0:51: 🤖 AI's ability to separate foreground from background is improving through segmentation and learning from manually created datasets.
3:46: 🎥 AI green screens have limitations and are optimized for speed rather than Hollywood-level quality, but can be improved with the use of masks.
7:56: 🎥 Visual effects artists often replace LED walls with meticulously traced backgrounds for better quality.
Why do you need timestamps on a fairly short video?
@@tomaccino playing with new technology. 😄😄
@@lindadawson902 Thanks for sharing! Tammy AI is very useful!
Phil is such a great host. Always love his segments!
Green screen work has its challenges, but if some simple rules are followed - things like making sure your key is evenly lit, working with your stage lighting to match your digital environments/set extensions, proper tracking markers if you're doing a 3D solve - then "fixing" green screen is a lot lower on my priority list than having to roto when chroma key is not available. Tools like Mocha and Silhouette make things easier, but it's still very labor-intensive to get good results, especially when manually tracking and doing roto work on things like hair and the edges of highly detailed objects or areas with semi-opaque boundaries. If AI could aid with those problems I'd be very happy…
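To make the "evenly lit key" point concrete, here's a bare-bones chroma key in Python/OpenCV - a toy sketch, not a production keyer: the HSV thresholds are illustrative, the file names are made up, and it assumes the background plate matches the frame size:

```python
import cv2
import numpy as np

frame = cv2.imread("greenscreen_plate.png")      # hypothetical plate
background = cv2.imread("background_plate.png")  # must match frame size
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Pixels inside this HSV range count as "screen" - hence the evenly lit key.
screen = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
alpha = cv2.GaussianBlur(255 - screen, (5, 5), 0) / 255.0  # softened matte

# Crude spill suppression: clamp green to the max of red and blue.
b, g, r = cv2.split(frame.astype(np.float32))
g = np.minimum(g, np.maximum(b, r))
despilled = cv2.merge([b, g, r])

a = alpha[..., None]
comp = a * despilled + (1.0 - a) * background.astype(np.float32)
cv2.imwrite("comp.png", comp.astype(np.uint8))
```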
The Sheila definitely knows what it did!
The GOD box, Unreal, and LED walls are nice, but there is a difference between the picture quality of what's recorded and the quality of the wall's image. Still, they seem to be what's in, and the future.
0:24 This is a masterpiece?!
Loved this video series! Phil is a gem
3:46 It IS a hard problem. I bet sometimes even Ronnie has trouble separating what is Ronnie and what is office, causing him untold problems at home.
that very last sentence about artists having a lot of time on their hands has gotta be wishful thinking. as any VFX artist will tell you, more often than not the deadlines they're given are borderline unfeasible.
Apple just released new AI tech called "FaceLit: Neural 3D Relightable Faces" which adds one more level of WOW to the green screen effect.
Loved the little fact at the end - that was really interesting!
Sheila Jr. will be missed.
Loving this series!
There is another detail to consider, and that is that the color palette used in the cuts must match the background lighting.
I missed this episode of Vox until now
Phil's line "we can get rid of Sheila Jr, she knows what she did" made me laugh so hard I woke my wife from her sleep thinking I was having some kind of deadly problem
I'm so glad nurses run on love of their job and *mostly* caffeine, because that joke was worth buying her an extra energy drink in 3 hours
"She knows what she did." (We used to lose our allowance. Now Sheila Jr. gets deleted from the simulation!)
this is the kind of stuff AI should be doing. Not making entirely finished works
Why shouldn't it do both?
Burning a lot of processor for zero value add? Why bother?
@@switchdeck9164 one is trying to help an artist finish their work quicker and easier, the other is trying to replace the artist altogether. Try and figure out which is which
Man, people can do whatever they want with technology as long as it's not murdering people.
@cverse why should artists be exempt from their jobs being replaced by computers when no other profession is?
Great video! I can't wait to see how much these AI technologies improve over the next few years
Man, it's really interesting to see how the whole traditional green screen process compares with an AI green screen, and all the differences between them.
So no one is commenting on: “Must destroy Sheila Junior”?
Funny how the arts were once the riskiest fields to get into and now they’re the safest
*if you're a specialist *and* know how to readily and frequently get work :\
Great video. As someone who has been mulling over using a green screen or AI versus a natural background, it was both amusing and enlightening to note that the experts featured in this video seemed comfortable using their laptop cameras with just their natural messy backgrounds and poor lighting 😂
0:25 Masterpieces. Mhm, horrifying.
yup, rotoing on things like The Mandalorian, but oftentimes you have really good info on what's in the background, because that's a separate video that can be reconstructed from the game engine itself! (worked on a few of these volumes myself)
Best solution.
Amazing editing and explanatory powers!
No, there is so much QC work we do just to greenscreen. Also don't forget how the shot is filmed, too - having a perfect green screen background is rare
I've been doing motion graphics and VFX for 20 years and I have not seen a seamless quick solution yet for high quality keying and compositing. As mentioned in other comments, the variables of lighting, cameras, files, and color grades will always throw this off.
Sheila Junior deserves no sympathy. She knows what she did
On the point at the end about the LED screens: even the "accurate lighting" isn't always correct. The screens cannot push a brightness equivalent to natural sunlight, meaning if the sun's the light source, it's probably far dimmer than required. They not only have to roto to clean up the background but also to create mattes for colourists to boost brightness, especially on reflective surfaces like Mando's armour.
The screens also come with massive visual issues like moiré patterns that are extremely hard to avoid. An awesome concept, but still years away from what it is currently being marketed as!
Rotoscoping with the rear screen for tweaking is still a win, as you'll be less likely to have "fringes"; the composition is in the can, and reflections, speculars and lighting are a done deal.
I see a possible solution in integrating the MetaHuman technology from Unreal Engine 5 with a real person in the frame. For example, cut out the outline of a person at a certain distance and replace it with the outline of a MetaHuman 3D model created from the person in the video. The same goes for hair: replace the edges of the head with 3D hair that follows the movement of the person's head in the video. And the most interesting question is lighting. If you reconstruct the light from the background and illuminate the 3D model of the person, you can take just the lighting layer and overlay it on the 2D person from the video so they appear to be lit that way. In short, if you create a 3D layer of the person, you can solve the green background issue perfectly.
I think killing the boring aspects of a process kills the process itself
OMG! The Sobeys & Tim Horton’s at County Fair Mall in New Minas, Nova Scotia! Go Nova Scotia! 🎉
Magic Mask in DaVinci Resolve is really really good.
The AI field is moving way too fast. It needs to slow down a bit, until we can somewhat adjust to the consequences.
LOOL nonsense, as if we could even control that. It's inevitable, like computers getting better
Why do people keep saying this. My life hasn't changed one bit 🤣🤣
@@SoooooooooooonicableYet. Your life will probably look very different in 5-10 years due to AI.
I've used RunwayML. As good as it is with basic mattes, it's terrible with hair edges
If I used Runway for mattes, how could I go in afterwards and manually pick out the hairs? For me, I think in Photoshop terms, where I would take a tiny brush over a mask, plus slight feathering. Don't know how it works in video.
Very interesting video as an AI enthusiast, thank you!
Loved this video. ❤
You should look up the Lytro cinema camera that was abandoned when their employees went to work for Google. They made light field cameras that could calculate an object’s distance from the lens and remove background by thresholding distance. No green screen needed. I hope google does something with this technology but it seems to have disappeared.
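A sketch of why depth keying is so attractive: once you have a per-pixel depth map, the matte is just a threshold plus feathering. The numbers and arrays below are hypothetical stand-ins:

```python
import numpy as np

def depth_matte(depth: np.ndarray, near: float, far: float,
                feather: float = 0.1) -> np.ndarray:
    """Alpha is 1 inside [near, far] metres, feathering to 0 at the edges."""
    edge_dist = np.minimum(depth - near, far - depth)  # signed distance to band
    return np.clip(edge_dist / feather + 0.5, 0.0, 1.0)

depth = np.random.uniform(0.5, 10.0, size=(1080, 1920))  # stand-in depth map
alpha = depth_matte(depth, near=1.0, far=3.0)  # keep whatever is 1-3 m away
```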
AI roto means artists can crank out more shots. Not go home early or spend time with family.
Love me a good Phil Edwards video, great as always!
I remember the green screen tech in the early days had a similar problem where you'd see a green or white outline around the subject. I don't know what improved to make the outline disappear.
Back then, in the early ages of digital cameras, the colour information was sampled at fewer pixels than the luminance - basically a sharp black-and-white image overlaying a slightly blurry colour image. And since the greenscreen effect depended on the blurry colour information, not the sharp luminance values, it was less precise. Modern (cinema) cameras sample as much colour as luminance, so that problem is gone. A good/bad example of these greenscreen fringes is in the Star Wars prequels, since they were shot with some of the first professional digital cinema cameras. (A toy demo of this subsampling effect follows this thread.)
@elfrjz anti-aliasing isn't a property of real footage; sometimes it's better to have a basic grasp of the tech before trying to educate others
@elfrjz I'm a high-end VFX supervisor with 15 years of experience; I've used many programs and I'm well known in the industry. Don't be stubborn and get your terms right. AA has nothing to do with alpha channels. It's a reformat filter.
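That subsampling effect is easy to demo. A toy numpy sketch (rough BT.601 math, assuming even image dimensions) of why a key pulled from 4:2:0-style footage has softer edges than the luma suggests:

```python
import numpy as np

def simulate_420_chroma(rgb: np.ndarray) -> np.ndarray:
    """rgb: float HxWx3 in 0-255, with even H and W."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b      # sharp luma, full resolution
    cb = 0.564 * (b - y)
    cr = 0.713 * (r - y)
    # Store chroma at half resolution, then nearest-neighbour upsample:
    cb = cb[::2, ::2].repeat(2, axis=0).repeat(2, axis=1)
    cr = cr[::2, ::2].repeat(2, axis=0).repeat(2, axis=1)
    # Rebuild RGB: luma stays crisp, but colour (what a keyer reads) is blocky.
    r2 = y + 1.403 * cr
    g2 = y - 0.344 * cb - 0.714 * cr
    b2 = y + 1.773 * cb
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 255.0)
```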
AI has grown so much, it's hard to keep track of it all.
and it's gonna grow even more and faster
This host is so much fun to watch
Because it just started getting good and will only become better. It's already being used and will see more use in the future.
Hey it’s New Minas Nova Scotia!!
Yeah didn’t expect to see that in this video haha
Exactly as you mentioned, the future for roto/alpha depends on learning.
The process is soft key, hard key... these are what AI-based tools can already perform in industry apps like Nuke and Silhouette, which have tons of data already.
What we surely need to keep in mind: some finaling always needs to happen
As long as profit and growth are the primary motives in an enterprise, technology that helps a worker - an artist, in this case - do their work more efficiently and with less tedium will NOT free them up for more leisure time, but rather raise the standards by which their productivity is judged.
The same matte algorithm could be applied to enlarged portions, limited by the clarity/resolution (DPI) of the target video output (not the master) for real-time performance.
Focal length is a clue.
Zoom and movement from previous frames are vectors.
A stereoscopic "view" from a second LCD panel could serve as a quick analog guide.
After what people have experienced from news and actors, they will be even more sceptical and cynical about the usual suspects and will demand proof for every sentence. The pathological liars are toast
SHEILA JR™ HAD IT COMING!!!
OBS is pretty good
I feel like the best method would be to have an LED wall with a really fast refresh rate and a global-shutter camera recording at high speed, so that some frames capture the subject in silhouette. Of course this would require interpolating the frames to perfectly line up with the non-silhouette images, but ideally it'd be so quick that it's only visible to the camera, so that epilepsy is ruled out. (A rough sketch of the matte math follows this thread.)
I know they've done similar things with lighting on The Matrix and recently on Thor 4, but imagine being able to dynamically keyframe changes in lighting in post.
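For what it's worth, the matte extraction for that interleaved idea could be sketched as a simple frame difference. This ignores subject motion between the paired frames (exactly the hard part the interpolation would have to solve), and the `wall_swing` value is an assumption:

```python
import numpy as np

def matte_from_pair(lit: np.ndarray, dark: np.ndarray,
                    wall_swing: float = 0.5) -> np.ndarray:
    """lit/dark: consecutive float frames, HxWx3 in [0, 1].

    The wall changes a lot between the two frames; the constantly lit
    subject barely changes. Small change -> high alpha."""
    change = np.abs(lit - dark).mean(axis=-1)
    return 1.0 - np.clip(change / wall_swing, 0.0, 1.0)

lit = np.random.rand(1080, 1920, 3).astype(np.float32)   # stand-in frames
dark = np.random.rand(1080, 1920, 3).astype(np.float32)
alpha = matte_from_pair(lit, dark)
```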
Loving all the AI videos, Vox!
Sweet Jesus, that background with the Sobeys is in New Minas, Nova Scotia 😳. How on earth did you come to select that!?
Address?
AI is so young yet already so good. It hasn't even been a year since it went mainstream! Give it time - it's the worst it'll ever be.
AI has been around for 70 years...
@pronto ‘mainstream’
@@thePronto but the AI boom just started. There's never been this much dedicated research to AI. you are completely missing the point
@@bj124u14 look up 5th Generation project from the 80's... Then look up 'AI winter'...
@@thePronto I'm ignorant and this is the internet so I don't care
Need this for VIDEO!
On a more general note, one other thing I was just thinking about was the movie Avatar: The Way of Water. 100% computer-generated, so no post-production, right? NO! My nephew is a VFX artist and he was really busy just prior to the release of the film "fixing things". There are some things that only the human eye notices, and only a person with a really good eye for that (an artist) can fix.
uhh what do you mean no post-production lol it would basically be all post-production
@@henri-julien I guess the main question is "what counts as production for an animated movie". I think of production as producing the main video content - which, for an animated movie, is the animation.
One thing people forget is the lensing around an object's outline.
If you worked in VFX, then you know... you'll still need to rotoscope everything 🤣
Awesome video! Why is a Verge video on Vox though?
Working on a hybrid green screen/projection technique. The flexibility in post of a green screen with the lighting and look of an LED wall. 1/100th the cost.
Oh, and it's 3D.
That last part - even tokusatsu has started using it these days, most noticeably in KingOhger
I am curious about the shadow and reflection effects with a green screen. I recall seeing a reflection of the people on the floor in a talk show while, in reality, the background studio was all green, including the floor. How did they do it?
If the floor is reflecting the green, it can be replaced just like the actual green screen
@@DanielRieger I see. Thanks 😁
The fact that we get free videos on YouTube by Vox is truly a gift. 👏👏👏
Well.... YouTube has been offering such services for 18 years.
Low budget films can become top tier with the right amount of effort at this rate.
These problems seem very minor and could be overcome with a few tweaks to AI.
AI is progressing faster than Moore's law.
Saw this on my school board and it looked cool so I’m gonna watch it later
AI is getting so smart so rapidly it's hard to keep up
You can keep up with me. Don't be late!
don't be fooled, AI is not AGI, not even close.
@elfrjz doesn't AGI have the capability to ... rewire itself to become ASI soon?
@Zaydan Alfariz I'll give it a decade before most jobs start disappearing. I wonder what will replace capitalism?
It is smart like a bench full of tools is smart. Is it the tools, or is it the craftsman?
County Fair Mall in New Minas, NS! Represent!
I love all the recent videos 🤗💛
I need to know what Sheila Jr. did 😂
Now what did Shiela Jr. do? 😰
I see that everyone is saying the AI tools will improve efficiency so that people can enjoy their lives more with the extra time. I'm pretty certain that companies will be the ones benefiting most, with worker counts decreased and the workload increased for the remaining people...
Well written Abhishek!
Loving these AI videos
Such a Beautiful Picture of a Wife and her sister. ❤
What a great example to look at in order to learn about how machine learning works!
They should try using switchable polarized lenses with those LCD screen walls.
Real smart video, detective Gordon.
Great video!
The subtle display of what a "normal happy family" means. The art of training the human brain by continuous reinforcement is displayed.
I remember a PS3 game called Movie Scene or something like that that let you remove the background without a green screen.
Good job Phil!
On Dune they used sand colored screens instead of green screens because they knew they’d have to roto everything anyway and that way there wouldn’t be green spill in the shot.
But wouldn't that make it harder to detect what is skin tone and what is the "green" screen?
It will change ❤
AI can't replace Vox editors though
there is NO COOLER background than the New Minas Sobeys