Great work, Ben! Super interesting video!
Please continue making these types of videos!
I love this topic! Deep learning is going to change everything.
Ah, Kinect. Sometimes I feel like the only person who wanted one with their Xbox. Glad to see someone get some use out of the device. Hope MS doesn't give up on it.
Great video! Keep creating great content.
This is a nice, interesting video, guys. Hope to see more like this in the future. Do something on Alphabet's X labs.
5:58 Who was the idiot at Google who, after watching 2001: A Space Odyssey, decided to teach computers to read lips?
No one. They decided for themselves.
Wow, someone actually found a use for Kinects
If we are able to break through this elaborate code of body language, its first breakthrough won't be cracking humanity's body language in its entirety, but American body language first, because the machine learning system is exposed mostly to American citizens training it and its research facility is in the US. I'm pretty sure, despite having no strong evidence or background in psychology, that culture and environment play a big part in how we perform body language. So there's still a long way to go after all.
Dope video!🤘🏼😎
1:16 The only reason why *Kinect* exists! XD
very rad, +The Verge
I consciously choose my body language. So, I feel half the things you said were false. I tilt my head because I'd rather not look at you, or I scratch because something is bothering me, perhaps I'm having trouble understanding, and I lean back to make myself comfortable. It doesn't take a genius. Here's some more information to propel the tech:
I put my hand to my chin because my head is heavy; I'm probably slouching.
I don't do hand motions when I speak because that seems a little dumb, but they add emotional energy to the words, and the emotional wavelength depends on how high or low the hands are: shrugging usually has the hands at the bottom, and facepalms are at the head. I'm teetering on insults here, but it comes from a hatred of inaccuracy and falsity.
It should be taken into account that the human brain is very good at spatial reasoning and spatial improvisation. Research games like Foldit have shown that the human brain is better at finding more efficient spatial "sinks" for proteins when they need to be folded into the lowest energy well possible. So in layman's terms: our brains understand the "tone" a gesture is sending. They may even know how that tone changes based on 3D coordinates that change over time, and on the relative speed and direction, or vectoring, of those gestures. We humans just can't transliterate the meaning in real time yet, mainly because our wetware doesn't generate a rigid linguistic analog for these gestures and their relative coordinates in space.
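As a rough illustration of the "coordinates that change over time, plus speed and direction" idea above, here's a minimal Python sketch; the joint trajectory, sampling rate, and NumPy representation are assumptions made for the example, not anything taken from the video:

```python
# Hypothetical sketch: treat a gesture as a time series of 3D joint positions
# (as a Kinect-style tracker might report) and derive the speed and direction
# the comment describes. The joint, units, and frame rate are assumptions.
import numpy as np

def gesture_kinematics(positions: np.ndarray, fps: float = 30.0):
    """positions: (n_frames, 3) array of x, y, z coordinates for one joint."""
    velocity = np.diff(positions, axis=0) * fps               # units per second
    speed = np.linalg.norm(velocity, axis=1)                  # scalar speed per frame
    direction = velocity / np.maximum(speed[:, None], 1e-9)   # unit direction vectors
    return velocity, speed, direction

# Example: a hand sweeping upward over half a second.
hand = np.linspace([0.0, 1.0, 0.5], [0.1, 1.4, 0.5], num=15)
_, speed, direction = gesture_kinematics(hand)
print(speed.mean(), direction[0])
```

Even this toy version makes the point concrete: the numbers are easy to compute, but mapping them to a "tone" is the part for which we have no rigid linguistic analog.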
robots won't have human-like body language without human-like intelligence.
Very interesting. Great job.
Amazing! Endless possibilities!
This is so cool.
Hello Westworld!
Great video, bad title. It should be more like this: here's a video of a few guys at a university talking about tracking body language while exaggerating their own, even though they describe it as genuine and not at all on purpose. Also, nice socks. The genuine one is the right-leg movement after the compliment at 5:04.
What's more worrying than a Terminator apocalypse is the replacement of traditionally human-exclusive service jobs with avatars and automata.
/r/cableporn is gonna be hella upset
So for AI, why don't we let it listen to millions of hours of human conversation and generate potential responses, then tell it which responses are more appropriate? Except the audio can't be from movies or shows, because that's not a good representation of real human interaction.
What you'll get is a robot that imitates rather than thinks.
That's not how AI works.
Yes it is. That's how they learn. That's the whole concept behind Google DeepMind or whatever.
Telling it which responses are good or bad would be a slow manual process. A better approach would be to have the AI look at millions of hours of conversation and study the participants' body language, tone, subject, changes of subject, etc., then determine through deep learning what about the conversation caused it to derail, turn negative, turn positive, and so on. I wouldn't be surprised if someone is working on that right now.
Like Tay, the Microsoft bot?
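As a rough sketch of the approach described two comments up (learning from the conversation itself which signals precede a derailment), something like the following could work; the per-turn features, labels, and the PyTorch model here are assumptions for illustration, not a description of any system mentioned in the video:

```python
# Hypothetical sketch: learn which conversational signals precede a derailment.
# Each conversation is a sequence of per-turn feature vectors (e.g. tone score,
# body-language cues, a topic-shift indicator); the label says whether it derailed.
import torch
import torch.nn as nn

class DerailmentClassifier(nn.Module):
    def __init__(self, n_features: int = 8, hidden: int = 32):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # probability the conversation derails

    def forward(self, turns: torch.Tensor) -> torch.Tensor:
        # turns: (batch, n_turns, n_features)
        _, (h, _) = self.rnn(turns)
        return torch.sigmoid(self.head(h[-1])).squeeze(-1)

# Toy training loop on random stand-in data (real data would come from
# annotated conversation corpora, not movies or scripted shows).
model = DerailmentClassifier()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for _ in range(100):
    turns = torch.randn(16, 20, 8)                 # 16 conversations, 20 turns each
    derailed = torch.randint(0, 2, (16,)).float()  # placeholder labels
    optim.zero_grad()
    loss = loss_fn(model(turns), derailed)
    loss.backward()
    optim.step()
```

The sequence model is the point of the design: whether a conversation derails usually depends on the order of the cues, not just their totals.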
Honestly thought Lok Cheung was in the video at 1:00 min.
Nice!
nice topic
damn, that Einstein robot freaks me out
Put some Italians inside!
WESTWORLD !!!
4:05
SciShow cameo?
Dat Xbox Kinect tho..
omg 3:14 it's Charlie Sheen 20 years ago
Let's discuss something: do you guys think what we saw on Westworld is just imagination, or a distant future we'll surely get to?
iRobot is becoming a real thing lol
a human face should never be used on a robot, creepy af
5:33 somebody punched this dude and broke his nose... or maybe he crashed against a glass wall :/
Damn
The things machines don't understand come from the limits of human understanding, because we don't understand them either. Why does this person sound like he's blaming the machines?
he is not
I love you
11th comment bois
Aseem Doriwala Well done
Alexander Charters thanks thanks 😚
I'm the 2nd one to view this video
WHY is it always Chinese???
Because Asians are smarter than you.
third