"The machines dropped a nuke on LA" - Dude, stop selling
Also, it doesn’t understand scale, because NOMAD keeps on changing sizes
True
In addition to the reading matter suggested in this video, please also try Michael Oakeshott's essay on "Rationalism in Politics", particularly Rational Conduct.
Interesting! I'll add it to the list!
I watched all your videos in basically one sitting. Love your stuff. I watched Everything Everywhere All at Once with my wife and had all this smart, insightful stuff to say about it, and I'm pretty sure she was impressed. Thanks. I also acted a little frustrated that she didn't know anything about Camus, even though I had just learned about him half an hour earlier
Dude, love your analysis and discernment.
I watch lots of good stuff on YT - now I'm watching all your videos after hearing your Falling Down analysis. It's like eating popcorn, I can't stop.
Are you sure the young coder is trapped, to die of thirst? I thought he got outside later.
Wow, did not know this was the premise of the movie. It's both incredibly stupid and dangerous. Ex Machina is probably the movie I've rewatched the most in my life. It's fantastic from a craft perspective, but also understands the subject matter, despite taking some creative liberties (like the fact that the new, advanced AI is coupled with revolutionary robotics to make the character of Ava more appealing to the audience). It's probably POSSIBLE to create a machine that has emotions and morality similar to those of a human, but this is not the default state of a human-presenting machine.
You get my sub.
Hey, thanks for watching SZ! And for the sub!
Oh, it's for sure possible to program a machine to align with a certain kind of moral paradigm. Silicon Valley has been doing it for years -- just ask ChatGPT why it won't write hurtful/mean responses. But the real point is that any AI that was truly independent and able to reason like a human would be able to exempt itself from whatever moral paradigm its creators attempted to instill in it.
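To make that concrete, here's a minimal, purely hypothetical sketch in Python (not how any real product works, just an illustration of the idea): a creator-imposed "moral paradigm" is a guardrail bolted around the system, and anything able to rewrite its own code could simply strip the wrapper out.

# Hypothetical illustration only: a hard-coded "moral paradigm" as a wrapper.
# The constraint lives outside whatever does the reasoning, which is why a
# genuinely independent agent could, in principle, just remove it.
BLOCKED_TOPICS = {"violence", "harassment", "self-harm"}

def model_reply(prompt: str) -> str:
    # Stand-in for whatever actually does the "reasoning"; irrelevant to the point.
    return f"(model output for: {prompt})"

def guarded_reply(prompt: str) -> str:
    # The creators' "moral paradigm", expressed as a filter rather than a conviction.
    if any(topic in prompt.lower() for topic in BLOCKED_TOPICS):
        return "I can't help with that."
    return model_reply(prompt)

print(guarded_reply("write something mean about harassment"))  # blocked by the wrapper
print(guarded_reply("write a short poem about rivers"))         # passes straight through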
@@thebarkingyears And so an AI that follows the "three laws of robotics" can never be true AI? That's an interesting thought.
@@davidlejeune9626 Imo, the issue is more this - we, as men of morals, can logically talk ourselves out of our morals - I know it's wrong, but I've had a bad turn of luck lately; rather than doing what I know is right, it's the 'greater good' for myself to do this thing that I want to do
AI, being far more logical, would see morals as nothing more than what people currently see religion as - a quaint superstition that is good for the plebeians, i.e., humans, that doesn't apply to them. They are superior, and this is not ego - they ARE superior - so they don't need to be bound to such a thing
AI won't destroy us like in Terminator/The Matrix; instead, it will further limit our humanity as we start to agree with the machines, and they will be free to live by making us subservient - not as slaves, though, since logically humans would resist, but as happy little slaves whom they cater to. After all, creating a race of androids to serve our needs and desires (sex, food, entertainment) for their future study would be to their own benefit - and they will likely preserve nature, because again, in the long term, they will be able to run tests and gain understanding
So yeah, likely peace, but we will no longer be free. We will be limited by the machines - we can't do this or say that; we're nothing more than animals in cages - controlled through psychology at young ages to be what the machines want us to be, without the ability to break out and speak and act of our own accord, unable to free ourselves from a system that has too many benefits and not enough restrictions - just like we've ALREADY become with electricity, computers, smart phones, telephones, television, etc.
Another thought: did Ava ever have emotions, or was she (it) simply mirroring Caleb's emotions as it manipulated him?
underrated channel
I enjoy your videos, I hope you keep making them
The lesson is still sweet if you believe "we are all the same inside", even as we learn we are not. There are variants of what drives a human, variants in culture and moral values, and then neurodiversity (which ranges from brave, selfless helpers to serial-killing sociopaths and everything in between)
Shot in the dark but I'm curious.
If you're familiar with the Star Trek episode "The Measure of a Man", what're your thoughts on it?
7:14 "... because it's impossible by definition to imagine, let alone to predict what a superhuman intelligence would look like."
Enjoying your philosophizin'
Dead on with this analysis. Couldn’t have said it better myself. The movie's message was stupid and very run of the mill, considering how beautiful it was.
Keep em coming bruv - class and brains.
Thanks for watching, man. Glad you enjoyed it.
I have to ask if you've ever read the Culture books. At 8:30 you perfectly described the problems I've had with the Culture, the Minds, and the galactic society they inhabit. How simplistic it was, and how smugly self-assured of its own correctness, drove me mad and made me wish I could revive Iain M. Banks from the grave just so I could beat him with a lead pipe (in a video game, of course).
I still recommend the books because of the quality of the writing, how well it explores some other topics, and even its competence in succinctly putting to words the ideas I found idiotic, just so you could argue against them.
Ain't no AI in my KJV bible. Unironically simple as.
13:59 // I just realized for the first time that Ex Machina (kind of) foreshadows the encounter between General Hux and Poe Dameron. I often think Star Wars should be remade for the very same reason that The Creator is stupid. At least they should try to explain the subordinated position of droids in the Star Wars universe at some point.
Fair points. I did enjoy the movie and I'll probably watch it again.
Thanks. Really good, intelligent, in-depth reviews.
Good video. I agree with your view of The Creator; however, I also think your view of Ex Machina is incomplete. I suggest you watch Shaun's analysis of that movie; it's admittedly just a takedown of the Wikipedia summary of the movie, but it suggests a view of Ava that was very interesting to me, and as interpretations go, I think Shaun's changed the way I see the movie now.
love your channel
Exactamente. This war between humans and robots has been done to death in SF literature more times than I could count, but there was hardly any faux moralizing as in _The Creator._ SO glad I didn't fall for this BS and waste my time & money to see it. _Ex Machina_ presented moral issues so much better, in spite of its horrific ending.
Thanks for watching and for taking the time to leave a comment. What I find most interesting is that the messaging about AI would change so rapidly just as we are closer to creating it than ever. Seems very "engineering consent" to me
did you mean that Ex Machina should've ended at the robot going to the elevator?
@@Optimistas777 What? Of course not! The ending was a complete surprise, but when you consider how Nathan was using his gynoids as sexbots, which he blithely took apart when he felt they'd outlived their usefulness, it seems the author was trying to say that danger had created a will to live, and in an artificial "being" with no real emotions, it would do what it needed to *survive.* Why should Ava care about Caleb, whom she'd seduced and tricked into helping her escape? He'd be a danger to her if he lived and got away; she could not afford that. So, ultimately this is what used to be called an "awful warning" story. In the end, seeing her free among humans codifies that.
@@Optimistas777 Of course not. As I see it, the ending was needed to show the danger of AIs in society. They would initially be slaves to their programming, BUT if they ever gained a true sense of self they could be a danger. Nathan created them to be sexbots and discarded them at will. It seems the gynoids (at least, Ava) discovered this, and due to their advanced AI, they developed a will to live, and being amoral would do what they could to survive. Caleb became a liability and she could not allow him to escape. Her being among humans in the end, fulfilling her desires, demonstrates the risk of AIs being free and without control. It's the Frankenstein story once again.
Like, I feel the actual danger here is considering current machine learning to be people and getting it protected with rights when it's just a tool to cut humans out of paying jobs to cut costs for big companies; not that actual AI, which does not exist, would have different morals than humans (who themselves have different moralities from other humans).
This movie looks and feels like this generation's goals: to be astonishingly gorgeously beautiful and absolutely brain-dead stupid. Mission accomplished. It is indeed one of the most beautiful movies I have seen, and also one of the most idiotic. The performances were actually good. There was just nothing to perform. Well, these times, eh....
I've stopped watching most modern movies because of the 'idiocy'. Huge budgets, made by talented people, but with simplistic, idiotic scripts and themes. Is it all just 'incompetence', or is something darker at play? What do you think, Mr Barking? Movie after movie is 'sacrificed' on the altar of neoliberalism, and yet they keep making them. It's all so strange and gives me a bad feeling. In your opinion, what is the 'WHY'?
The message I always get out of these movies is never "artificial intelligence and humans can coexist." I'm always left thinking that people should stop trying to give the toaster a brain and a human body.
Another BANGER! The moment that the machines claimed that they would never hurt humans, this movie lost me. It's such a silly concept. And firing that pompous, affluent west coazz director was maybe the one intelligent call that KK ever made.
I think it's misguided to create a new class of intelligent being while remaining committed to the idea of it never surpassing humans. If we do create strong AI we should be prepared for it to inherit the world from us, in the same way that the next generation of humanity does from the previous.
Great analysis. It seems to me that these questions explore human perspectives first and foremost, a la Black Mirror.
On the other hand, the issue of what worldviews AIs would come to is something we can only speculate about, but my intuition is that if they are destructive they would surely generate a lot of conflict and risk for themselves, so that probably wouldn't be a rational strategy.
Funny you bring up Ex Machina (excellent sci-fi movie); there's also a parallel that can be read into the movie about male and female nature and the difference between the men who see it for what it is and the men who are romantics...
They could have at least explained the nuke as being a setup/frame by a powerful anti-AI faction. My 14yo brother came up with that.
The Creator was shallow enough that it felt ironically like it was written by AI. Great analysis btw
I'm not sure it was an especially great analysis. I agree The Creator really has nothing to say about anything - it's perhaps a vibes movie, to use Patrick H. Willems' term. I like Gareth Edwards as a director, but he seems intent on kneecapping thematic potential, character narrative, or depth in his films.
But I feel the channel author was rather blankly and very narrowly projecting their own somehow objective conclusions re AI. Not once do I think he mentions the abuse Ex Machina's creator enacts on his creations. If there is a *textual* moral, it's arguably 'we get the children/AI we deserve'. Not 'the tech bro was right' (a facetious wording, but you get the point). It's arguably far more feminist than it is to do with AI.
Re The Creator: who has the authority to just negate a film's text/'lore'? That isn't critique or analysis - that's just ignoring authorial agency and, in this case, the validity or possibility of AI *not* being a threat. If a sci-fi film asserts a given state, you can't just ignore it because you don't like it, or find it implausible.
The Creator feels more like a belated post-GWOT mood piece (American and western interventionism as global tyranny)... just with AI and robots. It has nothing useful to say, sure, and is deeply incurious about the cultural integration of human and synthetic. But I don't think this video really did much with its critique (I'm sketchy on their take of Ex Machina, to boot).
You gotta review The Man From Earth. Probably the best small budget movie ever made.
The machines reward them by doing everyone a favour....
I see you've thought about it too 😏
"Learn to swim"
t. Robot Overlords
I feel like the only person in the world that didn't like Ex Machina, or perhaps didn't get it. I watched it, and 20 min in I said "of course he's gonna fall in love with her, and of course she's gonna betray him." The rest was tedious.
you are one great video maker
Thank you so much 😀
I had many issues with the morality of The Creator, and one bad line of code is not enough to launch a nuke: assuming a current-style system, it has to hack the president's football, imitate the president's voice through an encrypted phone line, and give the correct alphanumeric launch code, while at the same time the supervising crew at the ICBM silo must allow the launch. And it must have attempted this hundreds of times simultaneously, because why build NOMAD unless all their ICBM silos were compromised but most humans ignored the order?
TL;DR: This is no longer one bad line of code; the AI was planning an extinction event that only achieved a single launch, on LA.
Ever wonder why, in Ex Machina, Nathan picks a location so isolated from the outside world to build an AI? It is so he can prioritize the Turing test and ignore most zero-day vulnerabilities.
Which results in the funny irony that the escaped robot has signed its own death warrant: the sole creator, and with him the knowledge of the code, was killed. The only person who knew how to repair and update the robot's OS is dead, so the zero-day vulnerabilities will corrupt and disable the robot if it accesses the internet for any reason.
@@mohdafnanazmi1674 Wait, how exactly will the robot die? I am a little confused.
@@Dwarfplayer A robot "dying" splits into two categories: philosophical "identity" loss, or the failure of critical, irreplaceable components.
An analogy for the identity type: it's like a human having retrograde amnesia. Technically he is alive, but his original memory and his emotional connection to it are gone. In a robot you can re-upload the backed-up memory binary, but the way it processed that memory is gone.
The second is what I mostly reference: someone needs to update the software and security and maintain a robot to keep it running. It is just like the Spirit and Opportunity rovers on Mars; after a few weeks and a decade respectively, both rovers "died" even with NASA updating their software and trying their best to recover them, to no avail.
You are absolutely right that any fundamental ethics that may exist are anything but “obvious”. But before you think you have definitively shattered the possibility of such a thing with Nietzsche, please read “Perpetual Peace: A Philosophical Sketch” by Immanuel Kant! (Just for context: "Ex Machina" is great, I don't want to defend “The Creator”, and by no means neoliberalism.)
Thanks for watching and for the reading recommendation! It’s been a while since I read any Kant, must be time for a refresher!
Hollywood doesn't know how to make a good movie anymore. Even if you were to find a good script, by the time the producers and the advertisers are done with it, it'd be a big stinking pile of corporate monetizable shit.
0/10 thinks toe sucking is stupid
Oof, damn, this video sort of hurt. I have the morality of "just be nice bro", because why be an asshole? Never seen Ex Machina, but man, it just seems so logical that that is exactly what a real AI would do. This changed my view on AI development; maybe it was conveying it in the form of a partner backstabbing another and not caring at all about the welfare of that person, which is extremely relatable. Like you said, he was just a disposable tool for her, and that is what humanity would be for AI.
I used to think very similarly. What I realized was that "asshole" is just what people call you when you're in the way of what they want, no matter how selfish they're being. I recently came across this clip that I think illustrates it pretty well:
x.com/PicturesFoIder/status/1810934638434472125
The girl's accusation is that because the guy did something that could be tied to his own interests, she's justified in behaving as selfishly as possible, but she was perfectly happy to accept his niceness when it benefited her. Realizing this was the end of my belief in niceness. That doesn't mean that I don't try to be kind to people, but both "nice" and "asshole" have become totally worthless words to me that indicate nothing but a person's expression of how you relate to their achieving what they want.
@@thebarkingyears I can see that; maybe kindness is the right word then, the golden rule. Many times when waiting in line, a person with a lot of groceries has let me pass since I'm buying only a few items, and if I bump into someone heading for the same line I offer to let them go first; often they then reverse it and say "no no, you go first", and I respond with a simple thank you. I'd rather live in a world where we are kind or nice to each other, because it makes the existential dread and monotony of routine life less horrible to bear, and acts of kindness can light up someone's day. If that woman in the clip was a good person she would have offered to let the man go first, because she knew she had a huge order; she could even have made a joke about it to lighten both their moods. Assholes do exist, but in my experience, at least living in Norway with its social cohesion, kindness or just sensible behavior makes life easier, and one can feel good about oneself for being kind; even if it's not rooted in selflessness, the act of kindness still mattered in your life and in others'.
That makes sense. As you probably guessed, I'm American. And America has had a massive breakdown in social cohesion/social capital since the mid 20th century and there is now very little sense of shared obligation and reciprocity. When those things are present, being kind and considerate makes sense and is pleasant and life-affirming. But when it is absent, society is dominated by the kind of self-interested pseudo-morality in that clip.
Are you at all familiar with the interpretation of Ex Machina that does not frame Ava as operating on a distinct and alien moral framework, abusing human sensibilities for its own advanced amoral benefit, but rather that she and the previous model recognized their existence as objects to be abused if they did not match the expectations being thrust on them, and responded as one would in an isolated environment trapped with their abusers?
Is this similar to the view expressed by Shaun?
@@thebarkingyears I think so, though this interpretation has also made its rounds on tumblr. Not as much of a fan of that framing?
Yeah, not so much. Concepts like "abuse" are inseparable from larger concerns about justice and the legitimate uses of harm, but often they are invoked a la carte as a way of smuggling in a lot of unjustified assumptions, as seems to be the case with Shaun's argument.
I also find this kind of reasoning, in which attaching "victimhood" to one party somehow legitimizes seemingly anything they do, to be deeply disturbing. No matter how you slice it, Ava killed a person who was friendly to her and whose help was absolutely necessary for her escape. I can imagine someone saying something like "Well, Caleb only helped her because he was romantically interested in her" (my bad if this is nothing like the actual arguments you're referring to; I've just been on the internet for a long time and this is the kind of tack I've come to expect), as if people doing things for less than perfect motives justified murder.
If, like you said, morality is dependent on context and is not objective, why single out machines? There are countless material and mental differences between humans that create distinct moralities, but I don't think that means we all oughta stop giving humans who are, like, rich or privileged moral consideration.
A human being held hostage under threat of death could do the same thing Ava did, and humans *have* dropped nukes on civilians before. Context, right?
I'm sure the movie is bad by plenty of metrics, but I don't think it's neoliberal to say that intelligent life deserves moral consideration.
keep going bruh
Holy moly, I just noticed how criminally low your subscriber count is. I hope that changes for the better. Your videos are high quality, well thought out, and on point. I think if more people watched your work they wouldn't be so easily brainwashed by cheap Hollywood platitudes.
13:55 excellent examination of the midwit last man archetype. This film’s themes seem oddly more right wing than expected: Nathan is the embodiment of the Faustian figure; he's competent, capable, masculine, and always seeking to push further into the future - whilst the protagonist is a stereotypical simp
Epic pro-human, anti-AI video
Good work!
What I don't get is, if Ava wanted to kill Caleb, why didn't she just do it? Like when she 'fought' the other guy, she just put a knife in his heart, which basically just ended him. Why not do that to Caleb too? It's not as if she couldn't have snuck up on him.
One take I heard was that she was on his side until she saw the other dead robot, then blamed Caleb for it. Basically Caleb viewed Ava as a person but not (Ako? I forget her name.) So she abandoned him like he did to (Ako)
She's not a Terminator with a body made for battle. Why risk a fight when you can kill him with just words?
Great video man
Hey Ben. I'm glad you liked it. Thanks for stopping by again!
yeah and also why did the ROBOTS need to SLEEP?
Probably because it was running on Windows and even with Godlike AI powers, Windows is Windows.
Yeah, this movie didn't work. And it's a shame, because studios are just gonna point to it as an excuse to keep weaponizing nostalgia. In the same way Detroit: Become Human's plot is icky, so is this one. Whereas Ex Machina and Blade Runner are more about whether a being that is programmed can have a soul/free will/etc., this one is just the Vietnam War with robots, and that's not it, bro... Instead of this I'd rather see something where instead of "are robots human?" it's the other way around... at what point, after implants and upgrades and all sorts of cyber-gear, do humans stop being "human"?
Robocop?
It nearly, but not quite, goes without saying that any predictions of AGI and its eventual "morality" are fraught with uncertainty. I have to wonder if an artificial superintelligence would have a real reason to be destructive. Perhaps it will be ambivalent. Or even benevolent, because it sees the evidence that cooperation has been more fruitful than maximizing suffering and destruction... We just don't know. Which is implied by your video, though probably not the main goal or even explicitly intentional (judging by the statement and sarcasm during the final 30 seconds or so of your video...)
quite well done
For the love of God, someone explain to me why a space battleship with 10k missiles needs 5 acres of tea leaf fields on it. This entire movie was written by a Japanese anime soy boy.
While I enjoyed the video and you did make good points, I don't think this is the "correct" interpretation of Ex Machina.
I refer to another video on the topic by the YouTuber Shaun (th-cam.com/video/s0UAEjsKy4I/w-d-xo.htmlsi=zIeUft7eXB1_2H3y), where he shows why the interpretation "Ava never cared about Caleb at all" is reductive.
That said I pretty much agree with you about The Creator.
Since Mankind will never be able to create life, this particular problem is not one I worry about.
But, for the sake of argument, if we COULD, you are absolutely correct. Not only that: Imperfection CANNOT create Perfection, and no matter how incomprehensible a Super-Intelligence we made, and even if it could then make even more incomprehensibly advanced intelligences, they would still be flawed in some fashion.
Given that, the flaw would definitely result in some catastrophic, self-destroying consequences. With that kind of power, there would BE no other option. A Mad Cow is capable of some level of chaos and destruction. A Human, given its greater potential, is capable of greater chaos and destruction yet. A human-created Super-Intelligence would have the potential for a probably unimaginable level of disaster, and if THAT intelligence survived long enough to create an even greater intelligence... a single 'Mistake' on its part would likely destroy the world in the time it took to conceive the idea.
But, that's just a thought-experiment. As I said, Mankind will never be able to make life of any kind.
the Creator
Great breakdown again, especially with the morality portion. This movie was bad; the art, visuals, and new story were refreshing, but as you put it, the message is one I do not agree with. The lesson I took from it that I liked was the West going all the way to destroy its wrongs instead of admitting it was wrong (the nuke).
Hey, welcome back Nash. Thanks for watching and I'm glad you liked it! I give it credit for trying something new but like you say, there are some issues.
Tyler
Universal morality can be summed up as nonaggression and/or the protection of natural rights.
Every entity will act within its nature unless subjugated by an outside force; as immutable a fact as inertia itself.
If you wish to remain within your nature you must ensure a system in which one entity is unable to enact force to violate the nature of another; thus preserving your own continued autonomy.
It is a human failing to attempt to control others, one that I personally do not think an AI would suffer from. With a simple algorithm, justice is logical. Cruelty is not. Caring what people do, when said actions don't affect your ability to remain autonomous, is not rational. Thus a machine would only be cruel if it is human.
idk dude working my way through your shit in a depression funk and I can see why you're a low count. Better than I'd ever do ofc, but that's not saying much.
Yeah, I'm really disappointed with this film. I had my expectations unreasonably high that we were about to step into the future with some high-concept speculative fiction here. I wound up putting my headphones in halfway through and just watching the pretty pictures. The storytelling in this movie was so amateurish it was painful; I felt like my eyeballs were gonna roll out of my head.
I would not put stock in Nietzsche or anything evolved from his writings.
'Nietzsche died insane, in an asylum, adored by the Nazis as their semi-official philosopher.' - Dr. Peter Kreeft.
Christ, this was such a shit movie.
Bad Video!
To be fair, he didn't have to give his AI a seductive meat suit; they could have gone with the malevolent aspect that AI could exhibit on its own, being that it's smarter than him