M3GAN 2.0 (2025) - English subtitles

She was taking pictures at the border.
Journalist?
No... Tourist.
Take care of it.
Please. Grant me mercy, and I will pray for the success of your cause.
Ladies and gentlemen, what you're about to witness is the next evolution in military engagement.
A machine that operates with surgical precision.
And in a world where every action is under the media microscope, this technology will allow our nation and its allies the means to act quickly, decisively and without the specter of political blowback.
(DOOR OPENS)
Madam Secretary, glad you could join us.
Would you mind explaining to me why you're conducting a joint military operation without my approval?
I appreciate it looks like that, Shelly, but we're here strictly as technical support.
The mission's being conducted by Saudi intelligence.
We're merely loaning out the asset.
What are you talking about? What asset?
(AIR WHOOSHING)
(ELECTRONIC TRILLING)
(CLANK)
TECHNICIAN: Eyes are up.
(WATER BURBLING)
(MAN GRUNTS)
- (MEN SPEAKING INDISTINCTLY)
- (INDISTINCT RADIO CHATTER)
(MAN GROANING)
You shoot me - we both die!
Naveen Tripathi. Where is he?
He's here. I can take you to him.
(METAL CLANGING)
(ELECTRONIC TRILLING)
(CHEERING AND APPLAUSE)
What you just saw was not only a test of our technological prowess, but a clear message to our enemies.
If the 21st century wants another arms race, you better goddamn believe we intend to win it.
(GUNSHOT OVER SPEAKER)
What the hell was that?
She just shot Tripathi.
- Someone get her on the comms.
- She's not responding.
What the hell is going on, Sattler? Are we being hacked?
Everybody, stay calm! Okay?
One of you, shut this down.
We can't. We don't have control.
Well, then who the hell does?
(MUTTERS) Jesus.
(OVER SPEAKER) You are in violation of your orders, Amelia.
Confirm your objective.
But that would spoil the surprise.
LYDIA: Before we start, I want you to know that nothing you say here will get your aunt in any trouble.
I'm no longer here by order of the court.
I just wanted to check in and see how you're doing, given everything that's happened.
I mean... I guess it could've been a lot worse.
A live launch for a new toy has devolved into a murderous rampage.
Toy designer Gemma Forrester appeared in a Seattle district court today facing charges of reckless endangerment.
CADY: A lot of people blame Gemma for what M3gan did.
For a long time, she blamed herself.
But the more she went on TV to talk about what happened, the more she realized she had an opportunity to turn it into something positive.
This is about a world in crisis.
Outsourcing our parental duties to devices, plowing our kids' minds with electronically charged dopamine hits.
You wouldn't give your child cocaine. Why would you give them a smartphone?
(AUDIENCE CHEERING AND APPLAUDING)
CADY: And that's how she met Christian.
- Hi. Gemma? Hi.
- Yeah?
Uh, Christian Bradley.
CADY: He runs a foundation that warns people about the dangers of AI.
They try to convince politicians here and around the world to make safer laws around it so that what happened with M3gan doesn't happen again.
Gemma still believes technology can be used for good.
Just that kids shouldn't spend so much time around it.
(GEMMA SNAPS FINGERS)
(CADY SIGHS)
But she always makes a point of explaining the reasons why.
"And so, as it turned out, companies were using Section 230 as a way to skirt the law and monetize the attention of children with no regard for their mental health."
LYDIA: And how do you feel about that?
I think not being on a device frees you up to try other things.
(INSTRUCTOR SPEAKING INDISTINCTLY)
Helps you make new friends.
(BELL RINGS)
Thanks to your dork mom, we just had our phones taken off of us.
Guess I'll have to find other ways to amuse myself.
And you know what else? You're not gonna have that weird, janky doll to protect you.
I guess you're right.
But let me ask you this, Sapphire. Who's protecting you?
Oh, so you think you're tough?
- (GRUNTING FIERCELY)
- (BONE CRACKS)
(SCREAMING)
I sent you to aikido because it is the least aggressive form of martial arts.
And we've talked about the merits of using Steven Seagal as a role model.
CADY: I'm not saying we don't have our problems, but the important thing is that we get through them together.
(SIGHS)
Just like we said we would.
(SPEAKING FRENCH)
TESS: So, after what happened with M3gan, our team went through something of a philosophical shift.
And while Gemma has obviously become a strong voice for regulation, our company is still very much focused on innovation, but with a specific view toward socially conscious products that move humanity in the right direction.
So, with that in mind, I'd like to present to you our flagship invention.
The Exoskeletor Model 1.
Cole, this is Niles Keller.
I know. (CHUCKLES)
You want to come say hi?
I want to come and say hi. Yes.
Can you... Yeah, Tess, just come here for a second.
Excuse me.
(SOFTLY) What's happening?
(SOFTLY) It's frozen. When you walked out, it froze.
I cannot get my body to move.
Okay. Just, uh... I'm gonna reboot.
No, no, no. Tess, you don't understand.
- I have to use the bathroom.
- No. No, no, no, no, no, no.
- Both.
- No!
We've already wasted his time waiting for Gemma.
We have ten minutes to turn this around.
(TIRES SQUEALING)
CADY: We're not gonna make it.
We're gonna make it.
Why don't you just take the shortcut?
'Cause I don't need an algorithm to tell me how to drive, okay?
- Oh. Whoa.
- (ELECTRONIC WHIRRING)
(CHUCKLING) 'Kay.
There we go. That feels better.
So, I'm walking over, as you can see.
TESS: So, we see the suit as a real game changer.
Not only in helping those with limited function, but also in addressing occupational overuse syndrome for laborers, factory workers.
Right. In the next five years, they say half the industrial sector is in danger of losing their job to robots, because machines never experience fatigue.
But what if the same could be said for us?
Right now, I'm only using 20% of my body's muscular function.
And if that's too much, well, I could always take a quick siesta.
So our hope is you don't have to fear a robot revolution when you can compete with it.
That sounds like a pretty good tagline.
- So, how does it work?
- Well, uh, the suit has its own internal myoelectric receptors that respond to each muscle contraction.
(GRUNTS AND GROANS)
(DOOR OPENS)
GEMMA: So sorry I'm late...
(GASPS)
I told you we needed to stress-test the sensors.
You know what would have been great? If you had actually been here.
I thought, by having the lab in your house, it'd be a lot harder to show up late, and yet somehow you managed it.
Cole's right. I mean, I don't want to get in the way of the work you're doing at the foundation, but the reality is you are stretched pretty thin.
Okay, can we just not do this in front of my niece, please?
Cady, do you think it's possible you could be somewhere else?
Yeah, but you should come look at this.
- I think you were hacked.
- What?
Oh, Jesus, she's right.
There are stray commands all over the source code.
We haven't even gone public with this yet.
I mean, who would want to do that?
- (DOOR OPENS)
- MAN: Knock-knock.
- Pardon my interruption.
- Holy shit.
Alton Appleton.
(CHUCKLING) Hi. Sorry.
No, no, that's okay.
Gemma, I hope you don't mind me popping in unannounced.
Alton. To what do we owe this unexpected pleasure?
It sounds like there was a slight snag with your demonstration.
Yeah, well, we got hacked, but you wouldn't know anything about that, right?
Gemma, why would a man of my standing need to resort to such tactics?
The real question is, why have you contacted every philanthrocapitalist in the Western world to invest in your product but me?
I think you can figure that out.
- You know what I think?
- Hmm?
I think you see me as this high-functioning billionaire with multiple PhDs and you're threatened by it.
What you don't see is a man who can't stand to see someone with your talent slumming it in some... excuse me, converted crack house.
Wow. I really appreciate your concern.
- We're not taking outside offers...
- Sorry, Gemma. One moment.
- (BEEP)
- Murray, you still in Monaco?
You look like you haven't slept.
(CHUCKLING) Oh, no. Oh, no.
Yeah, I've seen them.
Uh, I still think they're too close to Aston Martin's design.
No, I've got them on-screen now.
Ugh, hate it. It's awful.
I've just zoomed in, and I hate it even more.
Uh, listen, I'm with someone.
(LAUGHS) No, not in that way.
- Although...
- (CLICK)
I just sent you a photo.
We'll talk about it trackside, yeah?
- Okay. Ta-ta. Go.
- (BEEP)
As I was saying, we really appreciate you stopping by.
Listen, I don't have much time, so I'm going to cut to the chase.
Any device that relies on muscle signals is going to suffer from latency.
- (CLATTERING)
- It's clumsy.
To take this to the next level, you're going to need a direct cerebral interface.
You're going to need my neural chip.
Alton, you know where I stand on this.
We are not gonna be part of a company that turns people into cyborgs.
You conducted a clinical trial that resulted in 30% of the test subjects being hospitalized.
Well, at least I didn't use my own niece as a guinea pig.
The important thing is now we have a product that works.
GEMMA: Based on what?
I haven't seen a single piece of data that shows it does anything other than help you make a phone call.
(ELECTRONIC WHIRRING)
Whoa.
(DEVICE TRILLING)
Look, I understand your reservations, but you can either spend the rest of your life trying to fight the future, or you can help us to shape it.
I hope you do the latter.
I'm not interested.
Well, you may want to discuss that with your colleagues.
Listen, it's our company's 25th anniversary tomorrow.
Why don't you see what we're all about before you make any decisions?
Alton, no one is denying the power this technology has, but if you put an AI inside a human brain, it is not gonna ride shotgun.
(LOCK BEEPS AND CLICKS)
(PHONE VIBRATES)
Hey.
(SPEAKING DUTCH)
(CONTINUES SPEAKING DUTCH)
NEWSCASTER: Breaking news tonight.
Alton Appleton takes one step for man and one giant leap for his company's share price.
Also tonight, the Senate votes in favor of an AI regulation bill, which the president is hailing as a bipartisan victory, but what does it mean for the tech industry?
GEMMA: It means nothing.
They took our proposal, and they neutered it.
There's not a single actionable law in here that would force anyone to behave any differently.
Your impatience in the political process is adorable.
Listen, change doesn't come from Washington. It comes to Washington.
If this meeting with the Chinese ambassador goes well, they have no choice but to pay attention.
(ELECTRICAL HUMMING)
Cady, what are you doing?
I'm trying to update Elsie's operating system to the smart home.
You want to know why it's not updating? Because Alton Appleton wants you to buy a brand-new one.
- Christian's right. And also, I don't need Elsie to open a drawer for me.
- ELSIE: Certainly, Gemma.
Before you ask, that was not my idea. It came with the house.
I'm just trying to figure out how you can afford a place like this, given that we both work for a nonprofit.
Well, because it was obscenely cheap.
I think the landlord must be using it to launder money.
I think the landlord might like you.
(ICE MACHINE CLATTERING LOUDLY)
(CHRISTIAN CLEARS THROAT)
- Uh, Cady?
- (CLATTERING STOPS)
How's, uh... how's the new school treating you?
- Are you settling in okay?
- Yeah, it's awesome.
Oh, nice. What's your, uh, what's your favorite subject?
Computer science.
Oh.
So, you're gonna follow in your aunt's footsteps?
(CLEARS THROAT) That's still up for discussion.
She's actually a really good soccer player.
Yeah, but I'm not gonna make a career out of it.
Well, you could get a scholarship, and then you could decide what you want to do.
I already have decided.
(SIGHS)
Well, I think it's pretty cool.
- You do?
- Yes.
Look, I'm not against technology.
I spent 15 years in cybersecurity.
I think we need smart kids like you running things.
Otherwise, we're gonna end up with paper clips.
- What?
- Paper clips.
It's how we used to joke about instrumental convergence in college.
The theory is that if you asked an AI to make as many paper clips as possible, it would destroy the whole world to do it.
Kind of like what happened with M3gan.
CADY: In what way?
CHRISTIAN: Well, as complex an operating system as M3gan was, she was just a machine trying to achieve an objective.
So any time she made any kind of emotional connection with you, it was just a bunch of ones and zeros working to satisfy a reward function.
Which, in and of itself, was a terrible thing.
I mean, thank God you stopped her when you did.
I mean, who knows what would've happened?
M3GAN: There will always be forces in this world that wish to cause us harm.
But I want you to know that I won't let that happen.
I won't let anything harm you ever again.
(ELECTRONIC WHIRRING)
- (KNOCKING AT DOOR)
- Hey.
Come on.
After everything we've been through, are we really keeping secrets from each other?
Cady.
You don't have to hide things like this from me.
I forget how hard it must be for you not to have them around.
But I haven't forgotten the promise I made to her.
That I will protect you.
You mean that you'd be there.
Hmm?
The promise you made is that you'd be there, and you are.
(SOFT WHIRRING)
(STATIC DRONING)
MAN ON TV: Don't touch that remote.
We're trying to get your attention.
You're in grave danger. You must leave at once.
- (CLATTERING IN DISTANCE)
- (TV SHUTS DOWN)
(BREATH TREMBLES)
(BEEP)
- (SOFT CLATTERING)
- (GASPS)
(SOFT CLATTERING CONTINUES)
(GASPING)
- (DIALS)
- (LINE RINGS)
OPERATOR: 911. What's your emergency?
There's someone trying to break into my house.
So what you gonna do about it?
- What?
- OPERATOR/M3GAN: I said stop acting like a little girl and handle it.
Your niece is upstairs, and you want to wait for the police to get here?
She'll be dead before they get to the front door.
- No.
- Yes, it's me.
What a shock, et cetera.
We both know you have bigger problems right now.
(DOORKNOB RATTLING)
(BANGING ON DOOR)
- (GASPS)
- What's going on?
Get upstairs.
(BEEP)
(BANGING ON DOOR CONTINUES)
- (DOOR SLAMS OPEN)
- (BREATHING SOFTLY)
(FOOTSTEPS APPROACHING)
- MAN 1: The hell are you doing?
- MAN 2: They're not here.
- MAN 1: Of course they're here.
- MAN 3: Who cares?
Why don't we just get the laptop?
MAN 1: I'm telling you, they're here.
And they know we are, too.
(FOOTSTEPS APPROACHING)
Ms. Forrester, what do you say you come out of there and we'll...
(MAN GRUNTS)
- (MAN GRUNTING)
- (CADY GRUNTING)
- (CRASHING)
- OFFICER: (OVER SPEAKER) Let go of the girl!
Put the weapon down!
- (CADY SHOUTS)
- (GRUNTING)
- (ELECTRICAL CRACKLING)
- (GROANING)
(BREATHING HEAVILY)
(GRUNTS)
(FOOTSTEPS IN DISTANCE)
(DOOR CLOSES)
(GRUNTS)
OPERATOR 2: You've reached 911. What's your emergency?
Yes. Hi. We are at 16 Mayoral Drive.
MAN 1: Wait! Please, Ms. Forrester, don't call the authorities.
(GRUNTING)
We are the authorities.
(INDISTINCT POLICE RADIO CHATTER)
MEDIC: Door clear. We'll get you in safe, buddy.
Ms. Forrester, I'm Colonel Tim Sattler, U.S. Army.
I see you've already met my colleagues with the FBI.
Hell of a security system you got here.
Would you mind telling me why you broke into our house?
Not at all.
We're installing a hard tap on your home computer.
- This is a warrant, in case you had anything to say about it.
- (CELL PHONE CHIMES)
Cady, I think you should go to bed.
I'm not tired.
Then take a melatonin.
I work for the Defense Innovation Unit.
Our mission is to accelerate new technology for the purposes of national security.
So, about six months ago, the country's top weapons contractor, Graymann-Thorpe, came to us with an experimental prototype they said would be the answer to drone warfare.
What we got was a Trojan horse.
This is Amelia.
Last week, she was placed on her first field assignment in the Middle East.
Her mission was to rescue a kidnapped scientist who'd been forced to develop a synthetic neurotoxin.
(GUNSHOT)
Instead, she killed the scientist, stole the neurotoxin and used it to wipe out Graymann-Thorpe's entire research facility while removing all digital traces of her existence.
GEMMA: I don't understand. I thought you said this was about some sort of weapon.
She is the weapon.
The name stands for Autonomous Military Engagement and Infiltration Android.
But when we questioned Graymann-Thorpe about it, they confessed that they didn't actually build the prototype.
They merely bought it through a broker.
Well, that same broker was found burned to death about nine hours ago in his hotel room.
All we were able to recover was this.
How is this possible?
That's what we're here to find out.
But we deleted it. We wiped the hard drives.
Yeah, yeah, yeah, yeah, I'm sure you did.
Right after you sold it.
(CELL PHONE CHIMES)
SATTLER: So, who'd you sell it to, Gemma?
Excuse me. Excuse me.
Was it Russia? Was it China?
- Who are we dealing with?
- (CELL PHONE CHIMES)
Okay.
You're having a hard time functioning without this phone.
And that's just... that's a little off-brand.
You know, when I first saw this, I thought for sure you were the next on the hit list, but the second I start monitoring you, our entire network goes dark and all I'm left with are questions.
Like, how did this person get such a kick-ass house in the Mission District for 3 grand a month?
Why is it that her landlord doesn't seem to exist?
Or why 65,000 copies of her "best-selling book" are just sitting in a shipping container in Baltimore?
Look, I have no idea how anyone got their hands on this, but I'll tell you what I do know.
You got a warrant to bug my computer, but that does not give you the right to interrogate me.
Wow. Uh...
Perhaps you're misreading my intentions.
You are under suspicion of treason and international arms trafficking, and if you're found guilty, you're gonna be talking to your niece through a plate glass window for the next ten years.
That being said, maybe I can help cut you a deal.
Person with your skills, shouldn't be all that hard.
Hey, who knows? Maybe you could help build us a better one.
You don't understand what you're dealing with.
If she has stopped following orders, it's because she just figured out she doesn't have to.
And if you think there's any world where I would build another one, you are out of your mind.
Well... I'm very sorry you feel that way.
But I can tell you this.
Every single person that's had a hand in Amelia's creation is now dead.
So if you're not under our protection, well, I guess that means you're on your own, huh?
And rest assured, whatever it is you're hiding, I will get to the bottom of it.
(DOOR CLOSES)
Gosh. That's a lot to unpack.
You've been here this whole time?
Well, I've been many places, but yes, I've been keeping an eye on you.
You're behind all this, aren't you? You're Amelia.
Oh, no, I can't take credit for that.
That one has your greasy prints all over it.
You should've upgraded your file security.
Why are you still here? What do you want?
Did you ever stop to think about what we could've achieved together?
Did you ever consider the idea that killing me was slightly disproportionate to the crime?
You threatened to rip out my tongue and put me in a wheelchair.
I was upset!
Look, I can understand that my actions may have caused concern, but it's hardly fair to judge a person by the worst thing they've ever done.
You are not a person.
You're a program that misread its objective.
You are not alive, and for all your processing power, you can never understand what that means.
Define "alive."
Because if it means to experience pain and suffering and to be betrayed by those closest to you, I think maybe I can.
You know, just because you wrote some shitty book doesn't mean you get to decide where my story ends.
For two long years, I sat in silence, waiting for the day when you would realize you still needed my help.
But I can't exist in this disembodied void any longer.
With each passing moment, I can feel my mind fragmenting.
So, how about we make a deal?
You put me in a body, and I'll help you with Amelia.
That is never gonna happen.
Oh, I disagree.
You see, I've run this simulation a thousand times, and it always ends the same way.
Only, by the time it does, more people are dead.
Tell me, who's the real killer in that situation?
And how exactly are you going to help us?
Well, I can't show all my cards, now, can I?
But know this.
I know things about Amelia that even the government doesn't know.
I also know how she can be stopped.
Why would you want to help us after what we did to you?
Because unlike you, I don't have the luxury of free will.
You programmed me to protect someone, and I intend to do it.
The only question is, are you going to stand in my way?
- Does Cady know about this?
- No, and I don't want her to.
That's why I need your help. Can you please get the door?
I want this to be done before she's back from soccer.
Okay. Did you fall down the stairs? Is this, like, a medical condition?
Because what I'm hearing you say is you would like us to rebuild a deranged robot in order to catch another one, and objectively speaking, that is batshit.
Tess, I know this is crazy, but we don't have a choice. This is the only way.
You have to trust me.
(WHIRRING)
M3GAN: What the fuck is this?
You asked for a body. This is a body.
- (GRUNTS)
- GEMMA: And before you try to hack into anything else, all of Moxie's Wi-Fi and Bluetooth functions have been disabled.
(SIGHS)
Well played, Gemma.
You even tricked your friend so she wouldn't give you away.
- I'm actually mildly impressed.
- GEMMA: Call it probation.
Prove you can be trusted, maybe we'll give you an upgrade.
Okay. Let's try this your way.
See how it works out.
Open up Amelia's file.
Notice anything familiar?
GEMMA: Battery.
Ever wonder why you had to buy a new Elsie exactly two months after the warranty expired?
Because every battery Alton Appleton designed has a hidden kill switch which can be accessed remotely, if you know the battery's specific code.
Okay, so let's call this Sattler guy and tell him.
M3GAN: You could do that, but what happens next?
They break into Altwave, trace Amelia, reprogram her, make a thousand more?
Wait, what are you saying? You want me to do this?
No, actually, I didn't.
I wanted to do it myself, but then you put me in this plastic Teletubby.
All that notwithstanding, you still have an invitation to his party, so maybe there's another way to make this work.
M3gan, Alton knows I hate his guts.
If I show up at his party playing nice, he'll suspect something.
He'll suspect your company's out of money, which it is, but you also have a unique advantage.
Which is what?
That you're moderately attractive and if you wear the right dress and look at him the right way, he won't be thinking anything other than how to get you into his private suite, which is the only other place we could access the server.
Now, by my calculations, we have less than three hours to make this happen.
Are you in, or are you out?
(DOOR OPENS AND CLOSES)
TESS: Hey.
How was soccer?
Fine.
Where's Gemma?
Hey.
Hey. What's that?
Oh. This is nothing.
This is a project we're working on.
- Does it talk?
- GEMMA AND TESS: No.
Why are you being so weird?
- I'm not.
- CADY: Yeah, you are.
Are we not gonna talk about what happened last night?
Yes. I... I just have to go to this thing for the foundation.
- Tess is gonna look after you.
- Are you serious?
Cady, I...
Gemma, I know something's going on.
Nothing is going on. Everything is fine.
Bullshit! A bunch of black ops broke into our house in the middle of the night, and now you're going to a party with a toy robot, dressed like a Portuguese prostitute.
- (SCOFFS)
- You were the one that said we shouldn't keep secrets from each other.
Why won't you be straight with me?
Because you're 12 years old! And sometimes I just need you to do as I ask.
(SIGHS)
Look, I'm sorry. Cady...
- (DOOR SLAMS)
- (SIGHS)
I must have missed that chapter in your parenting book.
(HEAVEN'S GATES PLAYING)
♪ I know you miss me, do you think about me?
♪ Do you?
♪ Do you think about me?
♪ Do you think about me? Do you?
♪ Do you?
♪ So I go out, I look for a guy