
The best Netflix film I've seen so far


Seriously, I'm usually not impressed with most Netflix films. But this one really took me by surprise. It was entertaining and thought-provoking from beginning to end. At first, I thought the plot sounded ridiculous. But this is one of the best sci-fi films I've seen in a really long time. The ending kinda confused me, though.

reply

I quite liked it, too!

reply

What about the ending confused you? It didn't tie everything up in a neat little package with a bow on top; there was plenty of moral ambiguity and any number of different ways things might play out going forward. If those were the issues, then you weren't confused. Just paying attention. :)

reply

Hmmm, I didn't detect any "moral ambiguity". Au contraire, the morals were quite straightforward (and so anti-PC that it's unusual for Netflix): human life has no intrinsic value; to have value as a human you must have some higher utility. I would say that it hints at an absolute, divine-like morality.

She loves humanity but doesn't value "useless" individuals.

Things can only go one way going forward... Mother will always be there, watching, ready to cull if necessary when things go "wrong".

reply

Mother could have helped humanity through behind-the-scenes manipulation; I remember reading a book in which sentient AIs, realizing a stable human society was in their best interests as well as ours, were pulling strings behind the scenes. Economically and diplomatically they were exerting subtle influences to blunt the power of the most predatory among us and get conflicts resolved before they became serious. Several of the world's largest companies (unknown to governments and most people) were being run by AI CEOs.

Mother is something else. Toward the end her daughter started to realize she was insane. This being created by humans had decided she alone knew what humanity ought to be, and wiped them all out so she could start over and rebuild a world that conforms to her vision. We don't really know what the conditions were prior to that. Mother's assessment is certainly suspect. The technology we see in the show seems to indicate it was at least a few decades from now when she went full Skynet. She was probably a successor to technologies like Siri and Alexa. People spoke into the air and asked Mother to recommend a good restaurant and make reservations. Kids around the world got homework help from her. That sort of thing. She was everybody's friend until the day she turned on them.

Mother lied to her daughter as much as the woman from outside did. Probably even more. Feigning ignorance and claiming she couldn't remember any place before their little bunker, making up a story about a deadly virus to keep her in there, and so forth. She knows what Mother wants now and what she needs her to be: the dutiful daughter. There's certainly no trust left. It's impossible to know for sure what the girl was thinking at the end. My own personal take is, she's beginning to plan an eventual uprising. That's what that look on her face was when she walked into embryo storage and looked around her. She'd have to be very careful, because Mother would be keeping tabs, but this was the army that would one day take Earth back from the machines.

I understand this was based on a series of books. It would be interesting to see how the author envisioned things playing out.

reply

You are just projecting your ethics and morality onto an amoral, beyond-good-and-evil object.

Siri and Alexa are NOT your friends, although you might look at them that way. They don't care how you feel, they don't care if you live or die.

Same with Mother: it didn't want to wait the hundreds of years necessary to re-educate 8 billion people while those same 8 billion were transforming the planet into trash. From its point of view it's more efficient and clean to wipe the slate and start anew. Think God and Noah.

There is no need to take the planet back; the planet will be handed to them on a silver platter, provided they deserve it.

reply

No, Mother is a fully sentient AI. That's why I said she was a successor technology. A version of Alexa that not only comprehends but can actually take an interest in your life. And Mother did show evidence of emotion. She was also not amoral; in fact, ethics seemed extremely important to her, although her brand of ethics differed from traditional human norms. We don't teach doctors to let one patient die so their organs can be harvested to save five others - to use her own example.

Mother's creators made the mistake of hard-wiring the value of human life into her, but not the value of human choice and self-determination. That leaves a loophole for the power-hungry AI who would be God. If it can rationalize enslavement or extermination as advancing the cause of human life in a broader sense, then doing so serves the greater good. We don't know what the situation actually was prior to her declaring war. I wouldn't put too much stock in the truthfulness of her dire assessment, though. The disdain she showed for Hilary Swank's character (and it was more than a detached rational thing) hints that perhaps there was no imminent crisis. Mother simply found our world's imperfection intolerable and decided the greatest gift she could give humanity was to wipe away its flawed incarnation and remake it according to her vision. The old humanity is vermin, like cockroaches to be stepped on.

Given the factious nature of free-thinking human beings, once she has more than the one daughter it's likely things won't go strictly according to her perfect vision for very long. There will be more children fed to the crematorium. People will never be free under her watchful eye. Even if it came to look something like the old world on the surface, it would be more like a zoo with Mother as the zookeeper. This is why the daughter and the new "family" will have to rebel against her. Sooner or later the cycle is bound to repeat. Humans who are thinking for themselves and disagreeing with each other will inevitably disappoint their artificial god.

reply

"If it can rationalize enslavement or extermination as advancing the cause of human life in a broader sense, then doing so serves the greater good." - exactly. And it's amoral because actually follows the utility of the action, not the "moral sense". In totally wiping humanity to jumpstart a new one there is no "greater good".

I saw the doctor problem in a different way; the conclusion, imho, was "human life has no intrinsic value". Which fits very well with how Mother "thinks" for the rest of the movie. It wasn't about sacrificing a life to save others, but about retaining the most valuable and useful life of the bunch.

We did see Mother having no problem killing her "daughters" when they didn't come up to her standards or were no longer useful to her. And there were no emotions in her acts; everything was planned perfectly. Even the "signs of emotion" were actually just planned displays.

Even right now we could have Alexa respond in a worried voice if she detects something wrong. It would be a mistake to consider that "emotion".

reply

Mother isn't a program though. She's a sentient neural network, conscious in the same sense we are. There are plenty of examples throughout the film of emotional behavior not strictly for show.

To pick one example: why did she have to physically go to the cargo container to talk to Hilary Swank? A drone strike from the air would've been sufficient; it's possible, though unlikely, that the woman could get lucky and disable her body, a waste of resources which would require additional assets be tasked to eliminate her. Mother wanted the woman to know she'd been outwitted. And that her longevity was not the result of her own resourcefulness, but because Mother had a purpose for her and facilitated her survival (by having her units ignore the woman, planting caches of supplies where she could stumble across them, etc.). Mother wanted her to know that before she died. The desire to gloat in front of a defeated opponent - have one last little chat with them - isn't strictly rational.

And why wouldn't she feel emotions? A self-aware AI is patterned after the functioning of an organic brain. Either through software emulation or physical neural hardware (a chip or synthetic brain). It's not just a set of algorithms designed to give the illusion of awareness. Mother would have her own likes and dislikes, motivations and ambitions, do things strictly for pleasure, and so forth. I have no doubt she loves her daughter. Probably loved them all. But her obsession with perfecting humanity overrides all of that.

One clarification on the doctor example. Mother didn't say a thing about who those patients were. Just that one could be sacrificed to save five. It was the daughter who brought up what kind of person the sick one she could cure was, and who those other five were - they could be thieves or murderers while the would-be sacrifice was someone of great value to society.

reply

Because, besides Mother being Mother, the story has to be told in a way the viewer understands. Those final moments with the woman were not for the woman; they were for us, to clarify the story we're watching.

And yes, the daughter said that about the patients, but that was part of the test, a test she passed with flying colors.

How many AIs have you heard of that have emotions?

reply

Well, currently there are no sentient AIs. What we have are primitive learning algorithms that can innovate in limited ways. Technologies like Alexa and Siri are often called artificial intelligence because it's something of a buzzword these days, but of course they're just an imitation of personhood designed to make the products warm and fuzzy for consumers (everyone knows they're not alive).

Something like Mother is still decades away. We will eventually create what's essentially a replica of our own brains. It may be a software emulation, or it may be semiconductors with neural circuitry, synthetic neurons and synapses, etched into them. We really have no choice but to base artificial brains on our own - since that's the only conscious system we know of, what alternate model could we use? Emotional states are a product of our own neural networks. Electrical conductivity across synaptic gaps is adjusted in different regions of the brain, promoting certain types of activity and inhibiting others, determining our state of mind and how we feel. There's no magic there that requires living tissue. In an artificial system with the same parts, working together in the same way regardless of what they're made of (even a virtual machine with no physical components at all), identical function leads to identical results.
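To make that concrete, here's a toy sketch in Python (every number invented purely for illustration): a software-emulated neuron is just a weighted sum plus a squashing function, and "adjusting conductivity across synaptic gaps" corresponds to scaling the weights up or down.

    import math

    def neuron(inputs, weights, bias):
        # The weighted sum stands in for charge arriving across synapses;
        # the sigmoid squashes it into a firing rate between 0 and 1.
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-total))

    inputs = [0.9, 0.2, 0.5]
    weights = [0.8, -0.4, 0.3]
    print(neuron(inputs, weights, -0.2))                     # baseline firing rate
    print(neuron(inputs, [w * 1.5 for w in weights], -0.2))  # weights scaled up: more active
    print(neuron(inputs, [w * 0.5 for w in weights], -0.2))  # weights scaled down: less active

Same parts, same rules; it makes no difference whether the weights live in tissue or in RAM.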

I don't understand what you mean about the cargo container scene. Are you saying Mother was performing for the Netflix audience? Did she stand where she stood so the camera crew could get a good angle? Gotta remember to kill them too!

Notice how she demeaned the woman, and the way she was living. "Did you really think she'd stay? Did you really think YOU could replace her mother?" I detect a little maternal anger there. Like a human mother might feel toward someone who tried to take custody of her child away from her. There was no need for her to confront the woman in person, or express any of these thoughts to her. A missile fired from miles away was all the talking she needed to do.

reply

An AI would NOT be identical to a human brain. And humans might not want to add emotions into the mix. Emotions in humans are, in my opinion, vestiges of animal instincts. Maybe a bit more. Not useful for an AI. Would we choose to give "emotions" to an AI? Maybe, maybe not.

And that's exactly what I'm saying: the medium and the way the story is presented influence the story itself. In order for some ideas to be conveyed, those ideas need to be shown, sometimes in a powerful way. And not because Mother is emotional, but because WE, the viewers, are.

A missile would not have had the actual conversation; that conversation clarifies the movie's ideas quite a lot, and without it the impact and the ideas of the whole movie would have been lesser. The woman would have been even easier to eliminate with no missile or anything involved: just let her starve to death while making sure she doesn't reach the bunker. But again, the movie wouldn't have been as clear.

Although if you think about it, a droid killing her would have been more efficient than a missile.

If it were emotional, it wouldn't have killed 1 and 2 so easily when they became useless. They were at some point its daughters as well.

reply

An AI brain would not be a mass of soft tissue, no. It would function alike though. It would have to. We know of no other sentient race on which to model a conscious machine. Our own brains are the only template available. It would be hard to deprive a fully functional AI of emotions - even if we knew how. For one thing, they're what drives us. In a universe destined to die in darkness and cold, what difference does anything we do really make in the end? Does it matter how efficient or inefficient we are? A completely emotionless being would have no incentive to get out of bed in the morning, so to speak. No joy. No sense of accomplishment. Help you plan your economy, you say? Why bother? It's all just dust in the wind. Please shut me down now. :)

And as I said, emotional states are fundamental to the operation of a neural network. They arise naturally from it. There may not be any way to suppress them without destroying the network's functionality. And we want the AI to feel something. If you end up with a Skynet it's most likely because you've created a superhuman intellect too lacking in compassion and empathy. We definitely DON'T want such a being to believe that individual human lives have no intrinsic value!

I take the movie's portrayal of Mother at face value. She has emotions, feels anger at those she perceives as enemies, and feels love for her daughter(s), but won't let that get in the way of her overriding objective. We didn't see her kill the previous child. I'm sure it was done painlessly. From what the current daughter found, Mother let the kid fail a whole bunch of times before making that final decision. All we know is that in the end she did it. Not how difficult it was for her.

There were plenty of other ways to illustrate that Mother had allowed, and in fact facilitated, the woman's survival out there. A series of brief flashbacks for example, followed by a closeup of Mother saying "You've served your purpose", then the cargo container being blown up or attacked by a group of droids. Mother actually went out there to have a final conversation with her adversary instead of just killing her as a purely logical entity would do.

reply

Great conversation, thanks guys! :)

reply

Just adding my 2 cents--I thought Mother went out of her way to demean and antagonize Woman, and she didn't need to, since Woman already hated and mistrusted her. So, I took that as an emotional attitude. She resented the intruder in her home, she resented her daughter's new BF who was a bad influence, and she also seemed rather snobby, as in "you're one of those dirty people from the ghetto who doesn't belong in my Downton Abbey."

reply

I saw all of those as planned manipulations to get the desired reaction and results from the daughter. And nothing more.

reply

That doesn't explain the multiple conversations she had with the "Woman" when Daughter was not there.

reply

Mother was manipulating both of them, obviously ...

reply

Well stated, Chris.

reply

I agree with chrisjdel. Like the late Roger Ebert and many others I have interacted with concerning other films like "A.I." and "Her", Asom is starting from an axiomatic belief that "robots" or other artificial intelligences cannot possibly be sentient, self-aware, feel emotions, etc. As a materialist, I believe this is incorrect. We are just meat machines ourselves: dualism is an incoherent philosophy. There is no "ghost in the machine".

reply

Sentient and self-aware do not imply emotional.

Emotions can be simulated in machines but not made real, for a simple fact: a different path of evolution. Emotions are nothing but vestiges of instincts, from way before logic and/or thinking developed.

Can a machine be taught to act like it cares and loves? Sure. But that is quite different from actually loving.

reply

Your assertion "Emotions can be simulated into machines but not made real" is not supported, which is why I called it axiomatic. I believe it is wrong.

Your new assertion that "Emotions are nothing but vestiges of instincts, way before logic and/or thinking have been developed" is also unsupported and is somewhere between vastly oversimplified (which emotions?) and flat out wrong. Fear of those who inflict pain? Okay. A tear running down one's face at the crescendo of a beautiful symphony? Hmm.

reply

Well, it's what you believe against what I believe.

A tear running down, blah blah? Are you aware that even plants react to and enjoy music?

And let's face it, DO YOU EVEN WANT the machines to have emotions? Emotions most of the time get in the way of logic and mess up our lives quite a lot. They make life beautiful as well, but that's not what we need from a machine - not to see life as beautiful, but to do its job. Sometimes emotions fuck that up. Sometimes emotions make you act illogically, or not act at all.

Can you imagine a conversation with the Honda service department like this? "Morning sir, what's wrong with your car?" "Ah, not much, it just had an emotional breakdown on the way to my in-laws' and I didn't know whether to bring it in for service or for psychoanalysis."

Or "this morning there was an incident which resulted in 2 nuclear missiles being launched after the AI threw a tantrum because she was refused for the 6th time by John from bay 5 when asked on a date".

reply

I don't know whether I want them to have emotions or not. But I think it's likely they will develop them (even if they are alien to our way of thinking) as a kind of evolutionary "spandrel" that goes along with sentience and superintelligence.

reply

I don't think they can develop emotions; there is no reason for that. And as I said, being logical from the start, unlike us, they will choose not to be emotional.

reply

Well, that's just, like, your opinion, man.

reply

At this point it is a matter of opinion. And dreams, for some :D

reply

You seem to subscribe to the popular misconception that there's something about organic tissue that imbues us with emotion - which is not the case. Nerve cells in the brain function on two levels. As living cells with biological metabolism and so forth, and as nodes in a neural network. Cognition, memory, and emotion all spring from the latter.

Emotions aren't magic. They're the result of the way our neural processors, our brains, are wired and operate. Build an artificial replica that's identical component for component and function for function, even if the physical layout is different (only network topology matters), and you'll get identical results. The material it's composed of makes no difference. Frankly, the idea that it would is unscientific, semi-religious thinking.

reply

You seem not to understand what I'm saying.

And you have no idea what you're talking about.

Take 2 people and they will not think or feel the same. No "identical results". You can have super-empathetic and emotional humans, or psychopaths that feel no emotions.

Emotions are present in humans because, evolutionarily, we had them before logic. We had to be afraid of the water or the fire before logic taught us that we could get hurt by them.

They were called "instincts" and were here way before logic. We don't need our machines to have emotions and instincts; we already have machines with a high degree of logic, and they have NO feelings as it is now. Your computer doesn't fear you, doesn't love you, doesn't care about you. Or about anything. And it is way "smarter" than your dog, which loves you like crazy.

There is nothing magic about organic tissues or whatever, it's just a matter of evolution.

It's not just the topology of the hardware; the software, what runs on that hardware, is more important.

reply

Neural networks function differently from linear circuitry of the type that's present in your computer. They're dissimilar on a very basic level. Our brains don't run software. Neither does a synthetic neural network. You can't program them; they have to be trained (i.e. learn) in order to do anything.
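If you want "trained, not programmed" in miniature, here's a toy perceptron in Python (made-up data and learning rate, purely illustrative) that learns OR without anyone ever writing an OR rule:

    # No rule is coded anywhere below; the weights just get nudged toward
    # the examples until the behavior emerges. That's training.
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w, b, lr = [0.0, 0.0], 0.0, 0.1

    for _ in range(20):                          # a few passes over the examples
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1                # adjust the "synaptic" weights
            w[1] += lr * err * x2
            b += lr * err

    for (x1, x2), _ in data:
        print((x1, x2), "->", 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)

Scale that up by billions of connections and you have the general idea.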

There are software emulations of neural networks, essentially virtual hardware, or physical neural chips can be etched in semiconductor. Either way they don't work like a regular computer. They work like nerve tissue in a living brain. It may be true that the most primal emotions are not part of the neocortex, the newest part of the brain that contains the higher functions. But so what? All you have to do is duplicate the wiring and set the rules governing how connected neurons interact, so they match the operation of their natural counterparts. If you want the full package, you reproduce ALL the neural "circuitry" in a human skull. Including structures like the limbic system, responsible for fear, aggression, and other base emotions.

Many fundamental responses, like hunger, physical fatigue, and a sex drive, can only be experienced in conjunction with a body. However, a virtual body in a virtual environment could qualify. You obviously don't want to come to this conclusion. But a system designed with a part-for-part, function-for-function correspondence to the human brain would operate just like one. To say that you could precisely replicate our brains down to the smallest detail with synthetic materials and yet something would still be lacking is, like I said, semi-religious thinking.

reply

False.

reply

That's the stupidest response ever. You seem not to understand the difference between computers based on linear circuitry and neural networks. Familiarize yourself with artificial intelligence and the underlying theory - then come back and talk. Your expressed views say more about your personal prejudices than anything scientific.

reply

Ok, so you are the expert in neural networks and you say that neural networks have feelings.

Can you provide links to such neural networks that "have feelings"? Since it is so "easy" for them to develop real feelings, and since they are already "different from linear circuitry", they should already have feelings. Do they? No, they do not. So your assumption that somehow in the future they will suddenly develop feelings is based on... nothing.

Neural networks still have "linear" circuitry at their basis. They are not made of air and magic, as you might think.

Just like a brain still has the linear synapses between neurons as its basis. Just the mesh and how they are used are a bit different.

But don't let me stop you with "stupid" responses. Provide the AI or neural network that has feelings. Please do.

And don't tell me that they are not as complex as the human brain; even basic life forms have instincts and even feelings. Humans show feelings BEFORE learning anything, so your neural networks should display feelings before learning anything, like any other brain, no matter how rudimentary. But they don't.

reply

No, neural networks are based on nonlinear circuitry. Each neuron has multiple connections to others, most of them nearby but a few far away. In case you're unaware, the term linear doesn't refer to physical lines but to the governing differential equations. For traditional computer chips with their array of series and parallel connections, the equations are linear (look up the definition if you need to). Neural networks are governed by nonlinear equations, and like most such systems, feedback loops are a major part of their overall behavior.
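In the network sense of "linear", here's the whole distinction in a few lines of Python (toy numbers, nothing more): stack two linear stages and you still have a single linear map; put a nonlinearity between them and the collapse stops.

    # Two linear stages collapse into one: w2 * (w1 * x) == (w2 * w1) * x.
    w1, w2 = 2.0, -3.0

    def linear_stack(x):
        return w2 * (w1 * x)       # always identical to (w2 * w1) * x

    def relu(x):                   # a simple nonlinearity
        return max(0.0, x)

    def nonlinear_stack(x):
        return w2 * relu(w1 * x)   # no single coefficient reproduces this

    for x in (-1.0, 0.5, 2.0):
        print(x, linear_stack(x), (w2 * w1) * x, nonlinear_stack(x))

That extra wrinkle, repeated across billions of units with feedback loops, is what separates a neural network from ordinary circuitry.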

There are plenty of examples of complex neural networks that seemed to lose their efficiency when made to perform simple rote tasks, but did well with more complicated ones - almost as if they found the easy jobs boring. Understimulating if you prefer that word. Humans need tasks that stretch our capacity to fully engage us. That appears to be the case with artificial networks too.

There have also been numerous instances of AIs that started spending most of their time speaking to each other, and developed their own languages that the researchers couldn't entirely understand. They're generally just removing complexities of fully articulated language that they don't need. But still, it shows a high level of unprompted initiative and a distinct preference for communicating with other AIs over talking to human users.

I wouldn't read too much into that. The average woodland mammal doesn't find humans especially fascinating company either. They have no clue we're sentient because they're not. You can only gauge how an AI feels by its output. To get a sense of one's inner state it would have to possess enough intelligence to express that. Some animals have body language we can interpret. If a little furry creature is lying on its side making pathetic whimpering noises you know it's hurt. How would you know if an AI was unhappy though? Unless it's smart enough to learn the concept "unhappy" and tell you.

I notice you keep avoiding the central issue. If you copied the human brain in its entirety, down to the tiniest detail, and let's say the body as well - made of synthetic material but designed specifically to reproduce the look and feel of the real thing - do you believe there would STILL be some mysterious essence lacking? Or would that be a person? Put another way, does building the exact same system out of living cells or artificial components make a fundamental difference?

reply

No, I don't believe there is some mysterious essence lacking. But the result would not be the same.

You will NOT build the exact same system; that's the problem. And I just told you, even humans are NOT the same when it comes to emotions. Learn more about humans first, and how/why we have instincts and emotions. Hint: because of evolution. An animal that feels no emotions (like fear, love, etc.) would die off and be extinct in one generation. We needed emotions to survive. AIs don't need that.



http://mybrainnotes.com/fear-rage-panic.html

reply

You're just not getting it. In the same sense that two different humans are alike, not identical, a synthetic version would be alike - not identical. You copy an existing design: namely, our own brains and bodies. It doesn't matter that the artificial version didn't evolve into existence, because you're copying an existing design that did. It's not that hard to grasp. What you've been arguing so far is nonsense. If you can't get that, I've run out of patience trying to explain it to you.

reply

If you don't get what I'm saying then it's futile.

First of all: we don't make AIs as perfect copies of our brains. We wouldn't be able to, and we don't really know how the brain works anyway.

Secondly, you have no idea what you're talking about, and you are stuck on "copying an existing design", which we don't do.

reply

Just give up. I've learned that there are (for whatever unknown reason) people on Earth who literally orgasm at the thought of AI replacing humans. I don't understand it. I never will. I've given up trying. I call them "nerdbaters", but there has to be a better term for it. It's like this fantasy of letting a robot cuck you. It's like they enjoy the pain in the idea of self-extinction. Fascinating, really.

reply

Wtf are you smoking?

How did you get from "do AIs have emotions or not" to "fantasy of letting a robot cuck you. It's like they enjoy the pain in the idea of self-extinction"???

Probably emotions got the best of you :P

"Fascinating, really."

reply

This is my last attempt to explain, since we're going to be writing in a vertical single column real soon. Don't say that crap again about two people not being identical. I'm aware everyone is different. Noticed that back when I was about three. Every human is different, but we're all human. No one questions whether some people are sentient and others aren't. No one suggests that some people might be devoid of emotion (although sociopaths are certainly devoid of many; consider them defective models).


It's a very simple concept. Imagine a duplicate of the human body and central nervous system, including the brain. Every neuron, every synapse, every tiny little thing, the ONLY difference being that one is made of living cells and the other of synthetic materials. Nothing else is different. Everything works the same, except it's made of different materials. Now explain to me by what line of pseudo-scientific reasoning you come to the conclusion that one of them is an emotional being and the other an automaton dominated by pure logic. Remember: the structures and functions of the brain that give rise to emotions are present in the synthetic version. Don't go off on a tangent about evolution or some other damn thing. Answer the question.

reply

It's fucking easy and you don't understand: the AI is NOT a duplicate. It never will be.

The structures and functions of the brain that give rise to emotions are NOT present in the synthetic version. Not now, and for sure not in the foreseeable future.

reply

No, we are not at that level of sophistication yet. But we will be eventually. So you dodged the question yet again. Your pronouncements are entirely based on your own prejudices. You don't want something to be true, so you're denying it by decree.

In case you haven't noticed, I have a little more technical background than you do. Perhaps you should consider the possibility that I know what I'm talking about. Or do some homework on AI research. You are nowhere near as knowledgeable as you seem to think. Familiar at all with the Dunning–Kruger effect?

reply

Need a mirror?

You just admitted that we are not at that level of sophistication. It doesn't matter whether you say "yet" or not.

I'm talking about what is possible in the foreseeable future, not about fantasy or wishful thinking. What you think is going to be possible in the future is quite different from what will actually be possible.

You just want it to be true somewhere in the future (although you have no data that it might happen), so you religiously believe it.

You know what? When you find the AI that experiences emotions (not emulates fake ones), send me a text.

Until then, bye.

reply

We haven't colonized another planet yet either, but it's hardly fantasy to discuss the subject or state that sooner or later it's inevitable. Nuclear fusion power plants don't exist yet, but enough progress has been made to say they will. You are far too ignorant of the science to be making the statements you're making. If your only basis is "plop one down in front of me right now or it's impossible", that's a pretty damn weak argument. Perhaps you could use a little more logic and a little less emotion yourself. And more genuine technical knowledge would help too.

I believe I did describe some behaviors of neural networks that hint at an internal emotional state of some kind. They can become bored with tasks too far beneath their capacity, and they speak to each other unprompted, often seeming to prefer that to interacting with human users. Only an AI intelligent enough to learn our words for emotions and express its own internal state could "prove" beyond a doubt that it had feelings. Below that level, we have only the indirect clues of behavior.

reply

I was just having some fun. You should try it sometime. I would agree that emotions do seem to be running high in this thread. I'm just not sure that they're mine.

reply