1
00:00:18,757 --> 00:00:21,673
It was devastating.
It was very hard.
2
00:00:21,717 --> 00:00:27,375
It was hard for us
to understand and believe
that this could happen
3
00:00:27,418 --> 00:00:29,203
in a developed country
like the United States.
4
00:00:33,859 --> 00:00:37,472
Both my husband, Pat, and my
son, Cal, experienced
5
00:00:37,515 --> 00:00:40,866
what I would say
classic diagnostic errors.
6
00:00:40,910 --> 00:00:45,219
Cal suffered brain damage from
his newborn jaundice when
it was misdiagnosed
7
00:00:45,262 --> 00:00:47,525
and it was never tested or
treated appropriately,
8
00:00:47,569 --> 00:00:50,137
and today he has
significant cerebral palsy.
9
00:00:52,052 --> 00:00:55,533
At about 16 hours, the nurse
charted that he was yellow,
10
00:00:55,577 --> 00:00:59,146
it was no big deal. We were
basically discharged
11
00:00:59,189 --> 00:01:02,236
with a very sick baby, but we
were told he was a well baby.
12
00:01:02,279 --> 00:01:06,370
I was familiar with jaundice
and it was communicated to us
13
00:01:06,414 --> 00:01:08,198
that it was no big deal
and not to worry about it.
14
00:01:08,242 --> 00:01:10,374
And they asked me if
I was a first-time mom.
15
00:01:10,418 --> 00:01:14,857
I said I was, and they reminded
me that first-time moms are
16
00:01:14,900 --> 00:01:19,905
often over-reactive, and they
didn't seem worried at all.
17
00:01:19,949 --> 00:01:23,126
I didn't really know
at the time,
18
00:01:23,170 --> 00:01:27,739
but I learned later on that Cal
was in the process of dying.
19
00:01:27,783 --> 00:01:34,398
We actually watched our son
suffer brain damage in the
hospital before our eyes,
20
00:01:34,442 --> 00:01:40,056
and... Quite honestly,
that will haunt me forever.
21
00:01:41,318 --> 00:01:45,975
And Pat, my husband, died when
he was 45 from cancer,
22
00:01:46,018 --> 00:01:48,804
a cancer that was
appropriately diagnosed,
23
00:01:48,847 --> 00:01:54,114
but the pathology failed
to get communicated
to the doctor or Pat.
24
00:01:54,157 --> 00:01:56,768
They did an MRI and they
discovered that there was a mass
25
00:01:56,812 --> 00:02:00,381
in his neck at the base of the
skull, and so Pat had surgery.
26
00:02:00,424 --> 00:02:04,080
Six months later, the pain
returned in Pat's neck.
27
00:02:04,124 --> 00:02:06,996
A whole series of doctors
came through asking Pat
28
00:02:07,039 --> 00:02:10,608
why he never got treatment
after his first surgery,
29
00:02:10,652 --> 00:02:13,568
and I had all the documents,
and I said, well,
because it was benign.
30
00:02:13,611 --> 00:02:18,181
And then by the time
the third doctor came through,
I said, wait a second,
31
00:02:18,225 --> 00:02:21,315
what was his pathology on
the first surgery?
32
00:02:21,358 --> 00:02:25,275
And the final pathology was
a high-grade malignant
synovial cell sarcoma.
33
00:02:25,319 --> 00:02:32,195
And that document either never
arrived or was placed in his
chart without the doctor seeing.
34
00:02:32,239 --> 00:02:37,418
And I remember showing it to
Pat, and I remember Pat crying.
35
00:02:37,461 --> 00:02:42,249
You know, to think that
another error had taken place,
and this time with him,
36
00:02:42,292 --> 00:02:45,904
that was difficult for us
to witness in our
healthcare system.
37
00:02:45,948 --> 00:02:48,168
[somber music]
38
00:02:48,211 --> 00:02:51,562
Stories like Sue Sheridan
and what happened to her,
39
00:02:51,606 --> 00:02:57,655
where a small mistake can
really be a life-altering event
that remind us of the human cost
40
00:02:57,699 --> 00:03:00,441
of what we're talking about.
These are not
theoretical events.
41
00:03:00,484 --> 00:03:03,705
These are not just things
that happen to other people.
They happen to us.
42
00:03:03,748 --> 00:03:07,404
They happen to our families.
And they are things
that we need to work on.
43
00:03:07,448 --> 00:03:08,318
[tense music]
44
00:03:14,106 --> 00:03:18,198
[narrator] In 1999, the first
significant report on
medical mistakes
45
00:03:18,241 --> 00:03:23,507
was released by the Institute of Medicine. They called it
To Err Is Human.
46
00:03:23,551 --> 00:03:30,210
This report claimed that as many as 98,000 people die every year as a result of medical mistakes.
47
00:03:31,123 --> 00:03:35,476
Over the next 15 years,
efforts to better understand
this number increased,
48
00:03:35,519 --> 00:03:37,304
but so did the number itself.
49
00:03:39,828 --> 00:03:43,701
Recent studies have raised
the projected number of deaths
to as high as 440,000.
50
00:03:44,746 --> 00:03:50,142
To put this in perspective,
that's more than the number of
graves in Arlington Cemetery.
51
00:03:50,186 --> 00:03:55,060
It's the equivalent of 2-3 jumbo jets crashing every single day.
52
00:03:55,104 --> 00:03:58,716
So, where does that rank
medical mistakes on
the leading causes of death?
53
00:03:59,717 --> 00:04:04,113
Number three. Right behind
cancer and heart disease.
54
00:04:06,333 --> 00:04:11,033
Now suddenly, whoa. This isn't
just some egg-headed study.
55
00:04:11,076 --> 00:04:13,775
This is a big deal. This
could be you, and they're right.
56
00:04:13,818 --> 00:04:18,040
Wait a second, you mean those
hospitals, my local hospital
was killing people?
57
00:04:18,083 --> 00:04:19,998
Is that what you're
really saying?
58
00:04:20,042 --> 00:04:24,829
We could prevent many, many,
many of these deaths immediately
59
00:04:24,873 --> 00:04:27,005
if we just put in the effort.
60
00:04:27,049 --> 00:04:29,791
Things are happening.
Let's take a look at this.
61
00:04:29,834 --> 00:04:33,795
I just think this is like
a massive epidemic
that we have underestimated,
62
00:04:33,838 --> 00:04:39,409
and the reason is because
it's happening to people
who are already sick.
63
00:04:39,453 --> 00:04:42,020
But, like, they were sick,
that doesn't mean
they were going to die.
64
00:04:42,064 --> 00:04:47,025
And their death is no less
of a tragedy because they
already had a medical problem.
65
00:04:47,069 --> 00:04:50,159
Every time you get on a plane,
you don't expect
that plane to crash.
66
00:04:50,202 --> 00:04:53,336
And everybody who dies in
a plane crash, you say, "Well,
those people were healthy.
67
00:04:53,380 --> 00:04:56,774
They were going to do fine
otherwise." I think the problem
with patient safety
68
00:04:56,818 --> 00:04:58,863
is you say, "Oh, well, these
people were sick anyway."
69
00:04:58,907 --> 00:05:01,736
And I think it's
a very problematic way
to look at the world.
70
00:05:01,779 --> 00:05:05,609
Maybe they didn't die, but they
spend the rest of their life in
a wheelchair or a nursing home
71
00:05:05,653 --> 00:05:08,873
and that accelerates their death
and obviously harms
their quality of life.
72
00:05:08,917 --> 00:05:12,181
So, the numbers about
deaths are a big deal,
73
00:05:12,224 --> 00:05:15,793
but in some ways they
underestimate the overall toll
of preventable harm.
74
00:05:15,837 --> 00:05:21,538
We don't have a stable,
agreed way to measure
safety or injuries.
75
00:05:21,582 --> 00:05:24,672
Actually, the number you get
depends on how you look.
76
00:05:24,715 --> 00:05:27,283
One rule is the harder you look,
the more you find.
77
00:05:27,327 --> 00:05:31,243
So, when you really throw the
book at it and you do everything
78
00:05:31,287 --> 00:05:33,855
you can to look for injuries,
you're going to find
a ton of them.
79
00:05:33,898 --> 00:05:39,469
When people start debating,
you know, is it 40,000
or 90,000 or 100,000?
80
00:05:39,513 --> 00:05:45,040
Uh, it's a lot. It's a ton.
And our job is to make it zero.
81
00:05:45,083 --> 00:05:48,826
This is urgent.
It's a public health emergency.
82
00:05:50,698 --> 00:05:54,397
[narrator] While the number of deaths related to medical error is staggering,
83
00:05:54,441 --> 00:05:58,358
the number of patients who
experience non-fatal errors
is even bigger.
84
00:05:59,446 --> 00:06:04,842
Recent studies suggest one-third of all hospital admissions
experience a medical mistake,
85
00:06:04,886 --> 00:06:09,064
and 1.7 million
hospital-acquired infections
occur every year.
86
00:06:10,152 --> 00:06:13,416
69% of those infections
could have been prevented
87
00:06:13,460 --> 00:06:17,072
through methods that already
exist, like hand washing.
88
00:06:17,115 --> 00:06:21,381
But healthcare workers wash
their hands less than
50% of the time,
89
00:06:21,424 --> 00:06:24,471
with some research suggesting
it's as low as 30%.
90
00:06:25,689 --> 00:06:29,780
There are even more dramatic
examples. In a five-year span,
91
00:06:29,824 --> 00:06:34,263
surgeons operated on the wrong
body part over 2,000 times,
92
00:06:35,133 --> 00:06:37,919
left nearly 5,000 tools
inside patients,
93
00:06:38,876 --> 00:06:42,967
and in 27 cases operated
on the wrong patient entirely.
94
00:06:44,534 --> 00:06:49,191
But diagnostic errors, like the ones that left Cal Sheridan
with cerebral palsy
95
00:06:49,234 --> 00:06:55,371
and delayed the detection of Pat Sheridan's cancer contribute to 1 in 10 patient deaths.
96
00:06:55,415 --> 00:07:00,376
But whether it's a diagnostic
error or any other preventable
harm, the only way to fix it
97
00:07:00,420 --> 00:07:03,248
is to first understand
what causes it.
98
00:07:03,292 --> 00:07:08,340
[Boaz Keysar] When we study
communication in my lab, we look
at how people communicate
99
00:07:08,384 --> 00:07:11,561
and what are the reasons
for miscommunication.
100
00:07:11,605 --> 00:07:17,611
In very simple experiments, when
we ask people to communicate
something to somebody else,
101
00:07:18,786 --> 00:07:22,877
about 50% of the time when
they thought the other person
understood them,
102
00:07:22,920 --> 00:07:27,751
they were wrong.
Now, I don't know the extent of
miscommunication in medicine,
103
00:07:27,795 --> 00:07:34,149
but I am sure it is more than,
uh, physicians think.
104
00:07:34,192 --> 00:07:38,588
Part of the problem is that
when you, when, when,
say a doctor miscommunicates,
105
00:07:39,894 --> 00:07:43,941
he or she might not know.
That's the core of
the problem, right?
106
00:07:43,985 --> 00:07:47,292
They might not get
immediate feedback that
they miscommunicated.
107
00:07:47,336 --> 00:07:52,080
And if that happens,
then that error
108
00:07:52,123 --> 00:07:57,215
could amplify without anybody
realizing that the source was
109
00:07:57,259 --> 00:08:00,175
just a minor miscommunication.
110
00:08:00,218 --> 00:08:04,353
Now I know how-- what happened
to my husband. Now I understand
how it happened,
111
00:08:04,396 --> 00:08:10,577
that there's been no
system-based intervention
to ensure
112
00:08:10,620 --> 00:08:13,275
that lab tests are
followed up on,
113
00:08:13,318 --> 00:08:16,278
that pathology and radiology
reports are followed up on.
114
00:08:16,321 --> 00:08:21,022
To know that this happens in our
country, that's unacceptable.
115
00:08:23,590 --> 00:08:24,460
[siren wailing]
116
00:08:26,157 --> 00:08:27,028
[tense music]
117
00:08:32,424 --> 00:08:34,731
[narrator 2] Most of us think of
a hospital as a place
118
00:08:34,775 --> 00:08:37,865
where people go after they have an accident,
119
00:08:37,908 --> 00:08:40,911
not as a place where people go
to have accidents.
120
00:08:41,912 --> 00:08:46,656
However, like just about
any place, there are safety
hazards in a hospital.
121
00:08:47,744 --> 00:08:52,183
Some are unique to the hospital environment, and some are not.
122
00:08:52,227 --> 00:08:57,188
Generally, the hospital staff
is very aware of medical
safety practices,
123
00:08:57,232 --> 00:09:00,583
such as the proper handling
of infectious cases,
124
00:09:00,627 --> 00:09:05,893
careful checking
of patient ID before
administering any medication,
125
00:09:05,936 --> 00:09:08,678
keeping things sanitary
and disinfected.
126
00:09:08,722 --> 00:09:13,901
Yet, all of us at times tend to overlook some potential hazards that we are around every day.
127
00:09:15,206 --> 00:09:19,341
We must try to learn to think
safety in everything we do.
128
00:09:19,384 --> 00:09:24,607
But safety doesn't come just
by learning a lot of rules.
It comes from an attitude.
129
00:09:26,174 --> 00:09:31,658
For everyone who works in
a hospital, safety has to be
a full time job.
130
00:09:35,096 --> 00:09:38,795
[Albert Wu] This is a problem
that's, you know,
hiding in plain sight.
131
00:09:38,839 --> 00:09:43,626
And I think that no one is
really surprised when they
think about it for a minute.
132
00:09:43,670 --> 00:09:47,717
If we think the amount
of harm that is currently
existing is just fine,
133
00:09:47,761 --> 00:09:52,853
then maybe it's not a
crisis, it's not a problem.
If that's okay, then we're done.
134
00:09:52,896 --> 00:09:57,118
Most of us in medicine just
said, "Well, that's
the way it is, you know.
135
00:09:57,161 --> 00:10:00,730
Things go wrong.
People make mistakes. There's
nothing you can do about it."
136
00:10:00,774 --> 00:10:05,822
It's pretty obvious that safety
is not number one priority
in most hospitals.
137
00:10:05,866 --> 00:10:08,651
When it is,
wonderful things happen.
138
00:10:08,695 --> 00:10:11,785
What is the problem you're
trying to solve?
139
00:10:11,828 --> 00:10:15,223
And the answer is, for most
hospital administrators,
140
00:10:15,266 --> 00:10:18,792
life is too short to get
the doctors angry at you.
141
00:10:18,835 --> 00:10:22,534
Building a new cancer center,
your oncologists love you,
142
00:10:22,578 --> 00:10:25,886
the other doctors love you,
it brings in revenue,
the community loves you.
143
00:10:25,929 --> 00:10:29,541
If you reduce medical error,
you can't advertise it because
144
00:10:29,585 --> 00:10:33,502
the patients all think that
everything's safe anyway.
Nobody knows the problem exists.
145
00:10:33,545 --> 00:10:36,940
The doctors are angry because
you start to talk about
medical error.
146
00:10:36,984 --> 00:10:39,769
So, that's why you have
an invisible problem.
147
00:10:39,813 --> 00:10:46,036
Every human being will make
mistakes, and will-- so the goal
cannot be zero errors.
148
00:10:46,080 --> 00:10:50,084
Our goal needs to be zero harm,
because we know
errors will occur.
149
00:10:50,127 --> 00:10:55,393
So, how do we make sure
those errors don't actually lead
to harm and are caught early?
150
00:10:55,437 --> 00:11:00,137
10 or 15 years ago, we thought
central line infections were not
preventable.
151
00:11:00,181 --> 00:11:04,185
We thought that was part of
kind of doing business
in healthcare that, okay,
152
00:11:04,228 --> 00:11:07,144
people have central lines,
occasionally they'll
get infections,
153
00:11:07,188 --> 00:11:11,758
and that's just-- Now we know
infections can go down to zero.
154
00:11:11,801 --> 00:11:19,417
Preventing preventable harm is
a skill and a commitment
and a technology
155
00:11:19,461 --> 00:11:24,161
all of its own.
It's not glamorous, but it's
what keeps all of us safe.
156
00:11:24,205 --> 00:11:27,295
If you believe,
"First, do no harm",
157
00:11:27,338 --> 00:11:33,170
there is no excuse for
not investing in things
which will prevent harm.
158
00:11:33,214 --> 00:11:36,217
Health care nowadays is
incredibly complicated.
159
00:11:36,260 --> 00:11:40,830
A patient has literally hundreds
of things done to them,
160
00:11:40,874 --> 00:11:44,921
having blood drawn for a test
or getting an x-ray or whatever.
161
00:11:44,965 --> 00:11:49,360
And so, there are many, many,
many opportunities for things
to go wrong.
162
00:11:49,404 --> 00:11:53,625
So, even when nurses and doctors
and technicians and radiologists
163
00:11:53,669 --> 00:11:58,761
are functioning at a 99% level,
which is, you know, pretty good
for human activity,
164
00:11:58,805 --> 00:12:03,113
that still means a lot of
opportunity for things
to go wrong.
165
00:12:03,157 --> 00:12:08,249
I think this is a general
problem that you have when
you deal with people.
166
00:12:08,292 --> 00:12:15,256
We are not built to not make
mistakes. We are not built
to be perfect.
167
00:12:15,299 --> 00:12:20,217
Are you going to try and change
the person or are you going to
try and change the situation?
168
00:12:20,261 --> 00:12:25,657
One way to do it is to design,
say the work environment,
169
00:12:25,701 --> 00:12:32,969
in a way that would not
necessarily prevent the error,
170
00:12:33,013 --> 00:12:36,059
but would assume the error.
171
00:12:36,103 --> 00:12:40,629
We have to acknowledge that
to err is human, and then
to figure out
172
00:12:40,672 --> 00:12:45,460
what do we do with that fact
in terms of building a system
that's safe for patients.
173
00:12:45,503 --> 00:12:46,374
[tense music]
174
00:12:51,814 --> 00:12:56,297
[Sue Sheridan] Between Cal's
patient safety event and Pat's
patient safety event,
175
00:12:56,340 --> 00:13:00,518
we had Mackenzie in the middle
there. Exactly at 16 hours,
just like Cal,
176
00:13:00,562 --> 00:13:06,873
she also had a very high
bilirubin, which the hospital
took action on.
177
00:13:06,916 --> 00:13:10,790
They tested it and they treated
it. I took a shower.
178
00:13:10,833 --> 00:13:13,880
And it was the first shower
after delivery and I remember
179
00:13:13,923 --> 00:13:18,188
I stayed in the shower for
an hour and they sent a female
chaplain in, and I was crying.
180
00:13:18,232 --> 00:13:24,412
And the chaplain thought I was
crying because my daughter was
getting treated for her jaundice
181
00:13:24,455 --> 00:13:28,285
and I explained to them I was
not crying because of that.
I was crying because
182
00:13:28,329 --> 00:13:31,593
I witnessed what the only thing
they had to do with my son,
183
00:13:31,636 --> 00:13:35,379
that it was so easy to prevent
what happened to my son.
184
00:13:37,381 --> 00:13:38,426
[tense music]
185
00:13:40,558 --> 00:13:43,866
[Mackenzie Sheridan]
When I got into about first
grade, people started asking me,
186
00:13:43,910 --> 00:13:47,565
"What's wrong with your brother?
Why, like, can't he move like
the rest of us?"
187
00:13:47,609 --> 00:13:52,092
I didn't really get it, because
I was never told necessarily,
188
00:13:52,135 --> 00:13:55,095
you know, your brother
has cerebral palsy,
your brother has kernicterus.
189
00:13:55,138 --> 00:13:56,966
You know, to me,
he was just my brother.
190
00:14:06,280 --> 00:14:07,150
[ambient music]
191
00:14:42,664 --> 00:14:47,974
[Mackenzie Sheridan] Recently,
I became more interested in
the case, my brother's case,
192
00:14:48,017 --> 00:14:54,067
because I knew, before
looking it up, I knew that he
wasn't given a bilirubin test
193
00:14:54,110 --> 00:14:57,766
and because of that he got
cerebral palsy and kernicterus.
194
00:14:57,809 --> 00:15:03,815
And I got frustrated and I got
angry and confused and
195
00:15:03,859 --> 00:15:08,690
my mom has taught me that I can
do something positive with that
kind of anger and fervor.
196
00:15:08,733 --> 00:15:12,085
I can, you know, go out
and make sure that those
kind of things don't happen.
197
00:15:21,572 --> 00:15:24,662
So, I used to be a little scared
hearing all of the things
198
00:15:24,706 --> 00:15:27,665
that could go wrong in
the health system.
199
00:15:27,709 --> 00:15:32,714
I just learned to be cautious
and to ask questions
and to, you know,
200
00:15:32,757 --> 00:15:36,022
ask the doctors, "What are you
doing? Have you washed your
hands? Have you done this?"
201
00:15:36,065 --> 00:15:38,763
I look at doctors in
a different sense than, I think,
202
00:15:38,807 --> 00:15:42,071
a lot of people do
and as a child I looked at
doctors differently as well.
203
00:15:42,115 --> 00:15:46,554
I know why kids would think like
a doctor doesn't make mistakes,
but I knew from a very young age
204
00:15:46,597 --> 00:15:50,471
that they do, and that their
mistakes could cost a life.
205
00:15:50,514 --> 00:15:52,821
The first thing that we wanted
was to tell somebody.
206
00:15:52,864 --> 00:15:57,565
Some kind of high authority that
could tell all of the hospitals
207
00:15:57,608 --> 00:16:01,438
about what happened,
so all hospitals
could implement change.
208
00:16:01,482 --> 00:16:05,965
And I thought somebody was
in charge of patient safety
in the United States,
209
00:16:06,008 --> 00:16:08,619
and I learned that
that simply does not exist.
210
00:16:14,321 --> 00:16:19,761
When people think about science
in healthcare, they think
about genes and cells
211
00:16:19,804 --> 00:16:24,461
and drugs and chemistry.
Yeah, that's science.
That's one science.
212
00:16:24,505 --> 00:16:28,509
But, there's another science,
which is the science of
organizing care,
213
00:16:28,552 --> 00:16:34,167
which is how do you actually get
the help, what are the flows
like, how do you do surgery.
214
00:16:34,210 --> 00:16:38,519
How do you take care of
a chronic illness.
There's science there too,
215
00:16:38,562 --> 00:16:42,392
and luckily this country began
investing in that really
in the past few decades.
216
00:16:42,436 --> 00:16:45,700
The Agency for Healthcare
Research and Quality, for
example, it's an American
217
00:16:45,743 --> 00:16:50,357
investment in developing
the sciences for delivering
better care.
218
00:16:50,400 --> 00:16:54,274
[narrator] In 2000, after
speaking with leaders in
healthcare,
219
00:16:54,317 --> 00:16:57,059
President Bill Clinton made a
bold statement regarding
220
00:16:57,103 --> 00:17:00,323
the country's new efforts in
managing medical errors.
221
00:17:00,367 --> 00:17:07,113
Just think about it, we can
cut preventable medical errors
in half in five years.
222
00:17:07,156 --> 00:17:12,292
[narrator] The Agency for Healthcare Research and Quality took on this task.
223
00:17:12,335 --> 00:17:16,078
Today, AHRQ remains focused on
improving the quality
224
00:17:16,122 --> 00:17:18,559
and safety of healthcare
for Americans.
225
00:17:20,082 --> 00:17:23,955
It does so by funding research, developing tools and training,
226
00:17:23,999 --> 00:17:27,176
and collecting measures and
data on the healthcare system
as a whole.
227
00:17:28,177 --> 00:17:33,530
In 2016, a report was released
on the recent progress
in patient safety efforts.
228
00:17:33,574 --> 00:17:38,100
The report showed that from 2010-2015, there were 3 million
229
00:17:38,144 --> 00:17:43,105
fewer hospital-acquired
conditions, showing
a 21% reduction.
230
00:17:43,149 --> 00:17:49,329
125,000 deaths were prevented,
saving $28 billion
in healthcare costs.
231
00:17:49,372 --> 00:17:55,074
All with a budget that
annually hovered between
$400-$450 million.
232
00:17:55,117 --> 00:17:59,339
But it's part of a healthcare
system that spends over
$3 trillion,
233
00:17:59,382 --> 00:18:05,301
and has more than 5,000
hospitals, with over 800,000
physicians, 4 million nurses,
234
00:18:05,345 --> 00:18:11,655
and 330 million patients. That means the agency is working with 1/100th of a percent
235
00:18:11,699 --> 00:18:17,226
of national health spending
and is tasked with improving
the other 99.99%.
236
00:18:18,575 --> 00:18:23,276
It is such an underinvestment
that, you know, a doubling
of the amount
237
00:18:23,319 --> 00:18:28,019
for the agency would be
a vast improvement, but
it still is not nearly enough.
238
00:18:28,063 --> 00:18:32,720
We need this information for us
to take care of our patients
properly, for health plans,
239
00:18:32,763 --> 00:18:37,986
for leaders of large clinics
to say, "Actually, no, I need
to better understand
240
00:18:38,029 --> 00:18:42,164
the choices I make,
how it impacts our ability
to deliver safe care."
241
00:18:42,208 --> 00:18:45,863
It has funded some of the
seminal studies that have had
massive improvements
242
00:18:45,907 --> 00:18:50,433
in patient safety. So, it funded
the studies that led us to
create the checklists for
243
00:18:50,477 --> 00:18:55,177
central line infections.
That alone has saved the
American healthcare system
244
00:18:55,221 --> 00:18:57,614
hundreds of millions of dollars,
if not billions of dollars,
245
00:18:57,658 --> 00:19:00,487
but more importantly,
has probably saved tens
of thousands of lives.
246
00:19:00,530 --> 00:19:04,099
[Ashish Jha] There are tens
of thousands of Americans
walking around today
247
00:19:04,143 --> 00:19:09,887
who would be dead if it had not been for some of the work
that AHRQ has funded.
248
00:19:09,931 --> 00:19:15,850
It's really about how we apply
the best of science
249
00:19:15,893 --> 00:19:19,114
to your individual needs
and preferences.
250
00:19:19,158 --> 00:19:23,074
To some extent I do know
some systems that are doing
a terrific job,
251
00:19:23,118 --> 00:19:26,339
and when I learn from them
about how they are doing it,
a lot of them
252
00:19:26,382 --> 00:19:31,039
are using the tools and
methods pioneered by AHRQ.
253
00:19:36,175 --> 00:19:37,176
[ambient music]
254
00:19:39,830 --> 00:19:43,356
Much of the work that we use to
train around patient safety
255
00:19:43,399 --> 00:19:49,144
and how to make healthcare safer
is actually derived from AHRQ
research and tools.
256
00:19:49,188 --> 00:19:54,193
When they put out a toolkit
or research tools, I know
that they've been vetted
257
00:19:54,236 --> 00:19:58,284
and they've been tried
and investigated and shown
to be of benefit.
258
00:19:58,327 --> 00:20:04,028
So, a big problem that we face
in safety in hospitals is really
improving handoffs,
259
00:20:04,072 --> 00:20:07,336
which is when a patient moves
from one area to another
260
00:20:07,380 --> 00:20:10,426
or when their doctors
or nurses change shifts.
261
00:20:10,470 --> 00:20:15,257
Handoffs are somewhat invisible
to patients, but they actually
have a huge impact on them.
262
00:20:15,301 --> 00:20:18,086
Like, if an average patient got
hospitalized tomorrow,
263
00:20:18,129 --> 00:20:20,523
they would face upwards
of 15 handovers.
264
00:20:20,567 --> 00:20:24,179
And we know from AHRQ-funded
research, it's got to be
more than just
265
00:20:24,223 --> 00:20:27,443
a passive listening where you're
like, uh-huh, okay, I got it,
266
00:20:27,487 --> 00:20:32,231
but really engage,
ask questions, because often
times you'll pick up things.
267
00:20:33,449 --> 00:20:38,237
Combining AHRQ TeamSTEPPS
with a standardized tool
to improve handoffs
268
00:20:38,280 --> 00:20:42,763
actually led to a 30% reduction
in preventable adverse events.
269
00:20:42,806 --> 00:20:47,376
We also develop our own
home-grown patient safety
teaching programs.
270
00:20:47,420 --> 00:20:49,987
One of my personal favorites
that we've actually
271
00:20:50,031 --> 00:20:53,208
developed here is called
the Room of Horrors.
272
00:20:53,252 --> 00:20:55,819
We take 10 patient
safety hazards
273
00:20:55,863 --> 00:20:58,953
and we embed them into a hospital
room, into a simulation.
274
00:20:58,996 --> 00:21:02,043
This is training where
you're walking into a room
275
00:21:02,086 --> 00:21:06,482
and you're actually seeing
with your own eyes,
can you spot what's wrong?
276
00:21:06,526 --> 00:21:12,662
[Trainee 1] Ammonia. C-diff
positive. So, probably should be
some kind of like precautions.
277
00:21:12,706 --> 00:21:14,490
[Trainee 2] Yeah, he should be
on contact precautions.
278
00:21:14,534 --> 00:21:18,146
Allergies, latex and penicillin.
That's fine.
279
00:21:18,189 --> 00:21:21,932
Umm. Let's see here.
Oh, those are gloves over there.
280
00:21:23,151 --> 00:21:27,547
[Trainee 2] Are these
latex gloves? Uh-oh, we got
latex gloves.
281
00:21:27,590 --> 00:21:29,853
So, it looks like he's got some
[unintelligible] hanging,
282
00:21:29,897 --> 00:21:34,597
and he's allergic to penicillin
so that's definitely not ideal.
283
00:21:34,641 --> 00:21:37,426
[Trainee 1] Yes, absolutely.
Why does he have magnesium?
284
00:21:37,470 --> 00:21:41,909
I don't know.
It's actually not for his name.
His name is Washington, right?
285
00:21:41,952 --> 00:21:44,738
[Trainee 1] Yeah.
Michael Johnson. Alright.
286
00:21:44,781 --> 00:21:50,134
-Different Michael. I'm also
going to put the stress ulcer.
-Okay. Good call.
287
00:21:50,178 --> 00:21:53,747
[Vinny Arora] They have
10 minutes to identify all
the hazards that they can,
288
00:21:53,790 --> 00:21:57,098
and then right after,
when they come out,
I actually debrief with them,
289
00:21:57,141 --> 00:22:01,320
so we go over how they did,
not only what they got right,
290
00:22:01,363 --> 00:22:04,758
where did they miss things,
and perhaps why did they miss
those things.
291
00:22:04,801 --> 00:22:08,718
If you train people this way,
this is the way their brain
is running in the background.
292
00:22:08,762 --> 00:22:13,070
Every time they enter a room
they can automatically spot it
from the corner of their eye.
293
00:22:13,114 --> 00:22:18,511
As an organization, we cannot
improve patient safety unless
we have front line personnel,
294
00:22:18,554 --> 00:22:21,601
including our residents and
nurses and everyone else that
works in healthcare
295
00:22:21,644 --> 00:22:24,386
raising their hand to say,
"Hey, I saw something wrong."
296
00:22:24,430 --> 00:22:28,521
And so that's why it's really
important to embed people into
a clinical situation
297
00:22:28,564 --> 00:22:33,003
where they are able
to recognize what types of
events they should report.
298
00:22:33,047 --> 00:22:34,091
[somber music]
299
00:22:40,315 --> 00:22:44,145
[Bob Wachter] Probably
the most important foundational
thinker in the field of
300
00:22:44,188 --> 00:22:47,235
patient safety is a gentleman by
the name of James Reason,
301
00:22:47,278 --> 00:22:50,804
who is now retired or
semi-retired psychologist in
Manchester, England.
302
00:22:50,847 --> 00:22:55,765
What Reason was doing was, as
a psychologist, studying what he
called organizational accidents.
303
00:22:55,809 --> 00:23:00,727
How did terrible errors
and accidents and harm happen
in industries,
304
00:23:00,770 --> 00:23:06,733
whether it was nuclear power or
space shuttles or intelligence
failures in the CIA?
305
00:23:06,776 --> 00:23:10,650
So, he studied a bunch of them,
and what he found was the same
pattern over and over again.
306
00:23:10,693 --> 00:23:15,785
What he found was if you look at
it superficially, you would see
a human being who screwed up.
307
00:23:15,829 --> 00:23:18,658
That was the superficial
understanding. It was easy
because it fit with
308
00:23:18,701 --> 00:23:21,312
the human model that
I need to blame somebody
309
00:23:21,356 --> 00:23:24,272
and if I can just point
a finger, you know, I have
solved a problem.
310
00:23:24,315 --> 00:23:28,102
What was really going on was
that in unsafe organizations,
311
00:23:28,145 --> 00:23:32,976
these organizational accidents
happen because of
a long chain of events
312
00:23:33,020 --> 00:23:38,199
that allowed that human error,
sometimes several human errors
to cause terrible harm.
313
00:23:38,242 --> 00:23:41,463
So, he came up with a model
that, to me, I remember the
first time I read this,
314
00:23:41,507 --> 00:23:44,335
it's called the Swiss Cheese
Model. A little lightbulb
went off and I said,
315
00:23:44,379 --> 00:23:47,948
"Aha! Oh, now I get it."
And now I look back on errors
316
00:23:47,991 --> 00:23:51,430
I have seen through my entire
career, and now it makes sense.
317
00:23:51,473 --> 00:23:54,824
Organizations build in
protections to block
318
00:23:54,868 --> 00:23:57,697
those simple human glitches
from causing harm.
319
00:23:57,740 --> 00:24:00,787
The problem is,
those layers of protections
320
00:24:00,830 --> 00:24:03,311
he likened to pieces of Swiss
cheese, they all have holes.
321
00:24:03,354 --> 00:24:06,140
If I kind of blow something one
day, I kind of forget something,
322
00:24:06,183 --> 00:24:11,493
or write something in the wrong
space, most days the first layer
of Swiss cheese blocks it.
323
00:24:11,537 --> 00:24:15,236
But, on a bad day,
the first layer misses.
It goes through the hole
324
00:24:15,279 --> 00:24:17,891
and it hits the second layer
and the second layer blocks it.
325
00:24:17,934 --> 00:24:22,635
When we kill someone in
medicine because we gave them
the wrong medicine or cut off
326
00:24:22,678 --> 00:24:28,031
the wrong leg or there's a
space shuttle crash or Three
Mile Island and you look back,
327
00:24:28,075 --> 00:24:32,775
you realize there were
a lot of layers, each one
of them had a lot of holes,
328
00:24:32,819 --> 00:24:36,649
and also that particular day
the karma was pretty terrible
329
00:24:36,692 --> 00:24:39,434
and it just happened to be
that all of the holes aligned.
330
00:24:39,478 --> 00:24:42,698
And that's how the error
made it through all of
these quote "protections"
331
00:24:42,742 --> 00:24:48,008
to cause terrible harm.
My instinct was no longer, "Let
me figure out who screwed up."
332
00:24:48,051 --> 00:24:51,620
My instinct was now Swiss
cheese. It became automatic.
333
00:24:51,664 --> 00:24:56,364
Here's a bad error, what's the
Swiss cheese? What are the
layers of protection that we had
334
00:24:56,407 --> 00:25:02,326
that failed, how do we shrink
the size of the holes, and how
do we create enough overlap
335
00:25:02,370 --> 00:25:07,854
in layers of cheese so an error
never makes it through all those
layers to cause terrible harm?
336
00:25:07,897 --> 00:25:09,029
[ambient music]
337
00:25:44,978 --> 00:25:50,505
[Sue Sheridan] With Pat,
I actually spoke to the
pathologist about why he didn't
338
00:25:50,549 --> 00:25:55,292
pick up the phone and call
the neurosurgeon when
they learned it was cancer,
339
00:25:55,336 --> 00:26:01,472
and it was a rare kind of
cancer, and his answer was,
"It's not my job."
340
00:26:05,738 --> 00:26:08,784
[Mackenzie Sheridan]
The doctor told our family,
you know, your dad is fine.
341
00:26:08,828 --> 00:26:12,222
He's benign, the tumor is
benign, everything's great,
go on and live your life.
342
00:26:12,266 --> 00:26:15,225
A few months later
my dad got very sick.
343
00:26:16,618 --> 00:26:21,449
[Sue Sheridan] And I got the
documents from the neurosurgeon
and it said that the pathology
344
00:26:21,492 --> 00:26:25,845
was an atypical spindle cell
neoplasm, which the doctor said
was benign.
345
00:26:26,933 --> 00:26:32,939
We expected the hospital
to fully describe to us
what had happened, to...
346
00:26:32,982 --> 00:26:39,119
you know, take care of us,
and we were discharged
without any explanation.
347
00:26:39,162 --> 00:26:41,469
So, we left there with all
the documents in our hands
348
00:26:41,512 --> 00:26:44,428
with absolutely no explanation
that this was an error.
349
00:26:45,386 --> 00:26:46,256
[sighs]
350
00:26:47,693 --> 00:26:51,914
I think our first reaction
was fear. We were scared.
351
00:26:51,958 --> 00:26:57,354
It scared us that a hospital,
a well-known hospital,
with professionals,
352
00:26:57,398 --> 00:27:00,880
would intentionally cover up
that kind of information.
353
00:27:00,923 --> 00:27:04,666
So, the first,
the first emotion was fear.
354
00:27:04,710 --> 00:27:08,539
One day, Pat woke up paralyzed
from his waist down,
355
00:27:08,583 --> 00:27:12,674
and we're at home in Boise,
Idaho, and we thought maybe
he had a stroke.
356
00:27:12,718 --> 00:27:16,983
We learned then that his cancer
had returned explosively.
357
00:27:18,506 --> 00:27:20,551
They estimated
he had about 10 days to live.
358
00:27:21,552 --> 00:27:25,382
[Mackenzie Sheridan]
And I remember my mom sitting
Cal and I down right before
359
00:27:25,426 --> 00:27:30,736
and she said, "You know,
your dad is sick, and he is
going to no longer be with us."
360
00:27:33,826 --> 00:27:38,091
[Sue Sheridan] I requested
a meeting with the doctor,
and with the CEO,
361
00:27:38,134 --> 00:27:43,618
and with the risk manager.
They agreed to it
and I flew down there
362
00:27:43,662 --> 00:27:47,056
and nobody showed up,
except the chaplain.
363
00:27:47,100 --> 00:27:52,627
I demanded that they implement
a disclosure procedure that
when there was an error at their
364
00:27:52,671 --> 00:27:57,980
hospital that they sit down with
the family, which, you know,
which they did not with us.
365
00:28:01,897 --> 00:28:04,944
[David Mayer]
Historically, you've probably
heard the term deny and defend.
366
00:28:04,987 --> 00:28:10,384
That was the model that still
exists today, unfortunately,
at many hospitals:
367
00:28:10,427 --> 00:28:16,477
That if we cause a preventable
medical harm, the goal has
always been to shut things down,
368
00:28:16,520 --> 00:28:20,568
let the lawyers handle it,
don't talk to the patients
and families,
369
00:28:20,611 --> 00:28:24,964
and then it turns into a legal
battle for 4, 5, 6 years where
370
00:28:25,007 --> 00:28:28,924
the hope is that the patient
and family will just give up
and go away
371
00:28:28,968 --> 00:28:33,668
and that's been the model.
And now we've moved to more open
and honest communication.
372
00:28:36,671 --> 00:28:38,412
[Heather Young]
We do a simulation on
373
00:28:38,455 --> 00:28:41,023
how to tell someone that
you've made an error,
374
00:28:41,067 --> 00:28:43,678
and that's a skill that's very
difficult to develop,
375
00:28:43,722 --> 00:28:48,422
to do in a way that
conveys that you care and
that you are concerned about
376
00:28:48,465 --> 00:28:51,642
the person's safety
and that you are going
to do something about it
377
00:28:51,686 --> 00:28:56,822
when you may face a family
member who is irate,
very upset by the news.
378
00:28:56,865 --> 00:29:01,261
And you know, as a new
clinician, you need to have
the skills to be open and
379
00:29:01,304 --> 00:29:06,222
transparent and talk honestly
and authentically with people.
380
00:29:06,266 --> 00:29:10,574
So, I'm about to go in and see
a standardized patient,
is what we call it.
381
00:29:10,618 --> 00:29:14,056
It's an actor that I have no
idea how he's going to react.
382
00:29:14,100 --> 00:29:17,799
We're going to break him some
bad news about a test result
that we missed 3 months ago.
383
00:29:17,843 --> 00:29:21,803
They are told to react
differently to each student.
384
00:29:21,847 --> 00:29:24,632
So I don't know what
I'm going to get when
I break him the news.
385
00:29:24,675 --> 00:29:29,637
He could be angry, frustrated,
or he could go easy on me.
I just don't know.
386
00:29:29,680 --> 00:29:37,036
One of the things that was
ordered a couple weeks ago
was a CT scan, uh, which,
387
00:29:37,079 --> 00:29:43,172
umm, indicated, umm, some
results that could indicate
colon cancer.
388
00:29:44,565 --> 00:29:49,962
[Doctor] Listen, I've got to cut
this. Um, you don't want to say
there was another
389
00:29:50,005 --> 00:29:55,445
test result that might indicate
colon cancer at this short intro
into it, right?
390
00:29:56,403 --> 00:29:57,578
-Ah?
-Oh.
391
00:29:59,101 --> 00:30:02,452
I mean you went right to:
"That could be colon cancer."
392
00:30:02,496 --> 00:30:07,066
His dad died of colon cancer.
You could have a patient
falling apart in moments.
393
00:30:07,109 --> 00:30:12,071
Do you want to look at
those pearls on effective
communication?
394
00:30:12,114 --> 00:30:17,859
Lay out the facts,
that you know them, and say
that 3 months ago on the CT--
395
00:30:17,903 --> 00:30:22,690
And then as he's like,
and I know your dad passed,
396
00:30:22,733 --> 00:30:26,128
it could be a cancer,
but we don't know that yet.
397
00:30:26,172 --> 00:30:28,739
You know, all that gingerly,
careful stuff.
398
00:30:30,089 --> 00:30:32,395
-Hey, Walt. How's it going?
-Hey, Jason. I'm alright.
399
00:30:32,439 --> 00:30:34,180
-It's good to see you again.
-Thank you.
400
00:30:34,223 --> 00:30:36,747
-How was the drive in?
-Uh, fine.
401
00:30:39,185 --> 00:30:41,665
-Three months ago, remember
you came in three months ago?
-I do.
402
00:30:41,709 --> 00:30:45,756
It showed that you had some
thickening of your colonic wall
403
00:30:47,193 --> 00:30:50,979
and some enlarged
mesenteric lymph nodes.
404
00:30:51,023 --> 00:30:53,895
We need to do
a colonoscopy immediately.
405
00:30:54,809 --> 00:30:57,507
We want to make sure, and I'm
not saying it's colon cancer,
406
00:30:57,551 --> 00:31:01,250
but we want to make sure
that it's not colon cancer
and rule it out.
407
00:31:01,294 --> 00:31:05,994
Why did it take 3 months to,
uh, that I know this?
408
00:31:06,038 --> 00:31:09,868
That was my mistake.
We were looking for structural
abnormalities on your kidneys
409
00:31:09,911 --> 00:31:12,653
and I overlooked that part of
the report 3 months ago.
410
00:31:13,872 --> 00:31:18,920
[sigh] I mean,
I would have been upset
411
00:31:18,964 --> 00:31:22,532
hearing it first
when the CT scan happened,
412
00:31:22,576 --> 00:31:27,363
but now I'm really pissed off
that it's been 3 months,
that it was delayed.
413
00:31:27,407 --> 00:31:33,282
Right, and, I mean,
I understand that you're angry,
I can see that you're frustrated
414
00:31:33,326 --> 00:31:37,156
and I can't, I can't do
anything to fix that
mistake 3 months ago.
415
00:31:37,199 --> 00:31:42,030
But, what I can do now is
make this a priority as
your primary care provider,
416
00:31:42,074 --> 00:31:46,513
and I can't even imagine
how you're feeling right now
with the mistake,
417
00:31:46,556 --> 00:31:52,911
but let's take it from here, and
we'll figure this out together.
I'll make this a priority, OK?
418
00:32:32,385 --> 00:32:33,908
-[Charlie] Good morning.
-Good morning. Hi Charlie.
419
00:32:33,952 --> 00:32:35,736
Hi Walt, nice to meet you.
420
00:32:35,779 --> 00:32:37,999
I'm sorry. Wait,
I've met you before.
421
00:32:38,043 --> 00:32:41,872
Yeah. We've known
each other for years.
422
00:32:41,916 --> 00:32:45,137
[Heather Young] The closer
you are to the error, the more
important it is that you have
423
00:32:45,180 --> 00:32:47,966
some accountability for it,
and that you communicate
424
00:32:48,009 --> 00:32:49,924
with the people who
might be harmed by it.
425
00:32:49,968 --> 00:32:53,797
And so all of us need to learn
the skills to be able to
426
00:32:53,841 --> 00:32:57,062
acknowledge what we've done wrong and what we're planning to do to fix it.
427
00:32:58,193 --> 00:33:02,110
[Don Berwick]
We built it completely wrong.
We were trained, I was trained,
428
00:33:02,154 --> 00:33:05,200
"No, you don't talk about
your mistakes with a patient,
429
00:33:05,244 --> 00:33:08,116
that's liability, the lawyers
will be all over us."
430
00:33:08,160 --> 00:33:13,904
This is a time for openness
and honesty, and so we can learn
and grow together.
431
00:33:13,948 --> 00:33:17,125
Healthcare is not like
a toaster where I make it
and I sell it to you,
432
00:33:17,169 --> 00:33:22,391
and you take it and plug it in.
No, it's always a cooperative
enterprise so that
433
00:33:22,435 --> 00:33:26,439
the patient and the family,
and the doctor and the nurse,
they're co-producing the care.
434
00:33:26,482 --> 00:33:28,789
And now that we're more
aware of that over time,
435
00:33:28,832 --> 00:33:31,487
there's a lot of possibility for
much more participation by both.
436
00:33:37,624 --> 00:33:41,802
[John Eisenberg] I recalled a
woman whom I took care of.
437
00:33:41,845 --> 00:33:45,066
We had had a pap test done to
screen her for cervical cancer.
438
00:33:46,415 --> 00:33:49,984
The result was suspicious,
but I never knew that,
439
00:33:50,854 --> 00:33:54,684
because I never got the report
back. And I didn't realize
440
00:33:54,728 --> 00:33:58,949
that I hadn't gotten the report
back until she called me and
asked about the report.
441
00:33:59,994 --> 00:34:05,521
I tracked it down. I found out
that it was suspicious. We
followed it up and fortunately
442
00:34:05,565 --> 00:34:10,439
it turned out not to be
anything serious.
But that was a near miss.
443
00:34:11,614 --> 00:34:17,707
It was a near miss that
could have been a tragedy
had she not called me.
444
00:34:17,751 --> 00:34:22,756
Senator, when I spoke at
three medical school
graduations last Spring,
445
00:34:22,799 --> 00:34:27,369
I asked all the students
who were graduating,
and I asked all of the faculty
446
00:34:28,283 --> 00:34:33,158
to raise their hands if they
had ever made a mistake in
taking care of a patient,
447
00:34:33,201 --> 00:34:39,164
and every single student raised
his or her hand, every faculty
member raised his or her hand.
448
00:34:39,207 --> 00:34:44,473
When I was a medical student on
one of my very first rotations,
449
00:34:44,517 --> 00:34:47,955
I inadvertently,
during a code,
450
00:34:48,956 --> 00:34:56,311
gave a full syringe of
morphine to a patient IV and
they had a respiratory arrest.
451
00:34:56,355 --> 00:35:01,490
Fortunately, the person was
intubated and resuscitated
and did just fine.
452
00:35:01,534 --> 00:35:06,147
That was
a shocking experience,
453
00:35:06,191 --> 00:35:11,065
and made me aware at a very
early point in my medical career
454
00:35:11,109 --> 00:35:15,330
that we have the potential
to do things wrong and
to potentially harm patients.
455
00:35:15,374 --> 00:35:18,507
No one ever heard about it
besides me and that nurse.
456
00:35:19,508 --> 00:35:22,294
So, it's not clear to me that
any changes were ever made
457
00:35:22,337 --> 00:35:24,339
as a result, and I don't think
the patient ever heard.
458
00:35:24,383 --> 00:35:27,734
I've made medical errors;
I have, uh,
459
00:35:27,777 --> 00:35:31,651
I prescribed the wrong
medication on a patient.
There were two patients of mine
460
00:35:31,694 --> 00:35:34,915
with very similar names
and I just prescribed it
on the wrong patient.
461
00:35:34,958 --> 00:35:40,050
I felt terrible.
I felt incompetent.
I felt a little ashamed.
462
00:35:40,094 --> 00:35:45,186
And I, my first instinct was not
just to fix the problem,
but then not to tell anybody.
463
00:35:45,230 --> 00:35:48,537
That's just a normal
human instinct.
464
00:35:48,581 --> 00:35:53,368
It is completely understandable
why people's first reaction is
465
00:35:53,412 --> 00:35:57,198
cover it up, don't talk about
it. It's a very human response.
466
00:35:57,242 --> 00:35:59,635
Doesn't make it the right thing,
it's actually clearly
467
00:35:59,679 --> 00:36:02,421
not the right thing, it's
clearly bad to do that.
468
00:36:02,464 --> 00:36:06,207
But I think we have to begin
by acknowledging that
it's a very human response.
469
00:36:06,251 --> 00:36:08,209
You can feel very
self-righteous. You can say,
470
00:36:08,253 --> 00:36:10,777
"Well, the patient got the wrong
drug, fire the nurse.
471
00:36:10,820 --> 00:36:14,694
There's a complication of the
surgery, bad surgeon."
472
00:36:14,737 --> 00:36:17,392
You're wrong.
You're almost always wrong.
473
00:36:17,436 --> 00:36:21,135
It feels good to blame someone.
You've got a culprit? Put them
in jail, fire them.
474
00:36:21,179 --> 00:36:25,008
Many things caused it.
So, who's responsible?
Everybody's responsible.
475
00:36:25,052 --> 00:36:29,317
Everybody can contribute to
the enterprise of closing
the vulnerabilities,
476
00:36:29,361 --> 00:36:33,060
of making the whole thing
less likely to go wrong.
477
00:36:33,103 --> 00:36:38,631
The most recent survey
I have seen is that nearly
50% of nurses in America
478
00:36:38,674 --> 00:36:43,636
still don't feel it is safe
to talk about a mistake
they've made.
479
00:36:43,679 --> 00:36:45,638
That's an absolute disgrace.
480
00:36:47,553 --> 00:36:51,252
If something bad is going to
happen to you when you speak up
481
00:36:51,296 --> 00:36:56,866
about something you've seen
or done that could help.
482
00:36:56,910 --> 00:37:01,262
If you're going to get punished
for that, why would you speak
up? You don't.
483
00:37:01,306 --> 00:37:06,006
You run and hide. You lie.
That's normal human behavior.
484
00:37:06,049 --> 00:37:09,531
We're not talking about bad
people; we're talking about
normal people who become frightened.
485
00:37:09,575 --> 00:37:13,753
And so leaders, you got a
choice: you can scare your
workforce and give up the hope
486
00:37:13,796 --> 00:37:19,802
for improvement, or you can
celebrate, invite,
work with your workforce,
487
00:37:19,846 --> 00:37:22,544
and have a chance of
learning together to get
to a better world.
488
00:37:22,588 --> 00:37:26,766
What we have learned from other
industries is that if you could
change the culture
489
00:37:26,809 --> 00:37:30,204
and reward people for
being open, reward people
for being honest,
490
00:37:30,248 --> 00:37:34,121
reward people for coming
forth and talking about
their errors,
491
00:37:34,164 --> 00:37:38,256
then you begin to counter
that kind of normal instinct
that we all have,
492
00:37:38,299 --> 00:37:41,911
and begin to create a culture
of patient safety where people
are much more open.
493
00:37:41,955 --> 00:37:46,089
And the system gets better
because it learns from mistakes
and doesn't hide them.
494
00:37:46,133 --> 00:37:50,877
And we found in the food
industry they were years ahead
of us. They had programs.
495
00:37:50,920 --> 00:37:55,185
For instance, Burger King had
a program if the employee
saw another one
496
00:37:55,229 --> 00:37:59,320
not washing their hands, they
went over and they tapped them
and said, "Got you",
497
00:37:59,364 --> 00:38:03,193
and then they got either two
hours compensation off
498
00:38:03,237 --> 00:38:06,893
or some other reward.
I mean, they're on board.
499
00:38:06,936 --> 00:38:11,854
Safety reporting is like
democracy. Democracy isn't about
having a free and fair election.
500
00:38:13,203 --> 00:38:18,296
We can do that. Democracy is
about having a second free and
fair election.
501
00:38:18,339 --> 00:38:20,646
The same thing is true
with safety reporting.
It's not about
502
00:38:20,689 --> 00:38:23,126
filing a safety report,
it's about filing a second.
503
00:38:23,170 --> 00:38:26,347
And where you see
an organization with
a high rate of reported error,
504
00:38:26,391 --> 00:38:30,786
what that tells you is it tells
you that they must be doing
something about those reports,
505
00:38:30,830 --> 00:38:34,312
because if they are just
sitting on them, people
will stop reporting.
506
00:38:34,355 --> 00:38:38,141
Because even if you
tell people they have to,
in the end it's all voluntary.
507
00:38:38,185 --> 00:38:41,449
I mean, you can't solve it if
you can't see it. We can see it.
508
00:38:41,493 --> 00:38:43,973
And more and more
people are aware of it.
That's the good news.
509
00:38:44,017 --> 00:38:47,107
Bad news is you're still at
risk, really at risk.
510
00:38:47,150 --> 00:38:54,114
I mean we haven't pervaded
healthcare with the designs
and approaches and cultures
511
00:38:54,157 --> 00:38:57,378
that actually make you super
safe and that's the task ahead.
512
00:38:57,422 --> 00:39:02,949
It's amazing how quickly
hospitals can completely
overhaul their safety
513
00:39:04,037 --> 00:39:06,909
when they know that it's
important to their patients.
514
00:39:06,953 --> 00:39:10,435
Hospitals had to hear the
message from their own patients
515
00:39:10,478 --> 00:39:15,657
that it matters that they wash
their hands, it matters that
they keep a safe environment,
516
00:39:15,701 --> 00:39:20,967
it matters that they put
the safety and protection
of their patients first
517
00:39:21,010 --> 00:39:25,101
every minute of every day.
The only way they're really
going to get that message
518
00:39:25,145 --> 00:39:27,800
is when the American public
gets involved and pushes.
519
00:39:34,459 --> 00:39:37,200
[narrator] One way to improve
the quality of hospitals
in America
520
00:39:37,244 --> 00:39:40,421
is to put a microscope
on the data they do
actually provide.
521
00:39:41,944 --> 00:39:45,687
[narrator] Leah Binder and her
team at the Leapfrog Group
in Washington, DC,
522
00:39:45,731 --> 00:39:48,821
worked with leaders in patient
safety to create a new way
523
00:39:48,864 --> 00:39:52,259
to rate the quality of hospitals in a way patients can understand.
524
00:39:53,216 --> 00:39:56,611
We worked with the foremost
experts in patient safety
525
00:39:56,655 --> 00:40:01,399
and we asked them to look at all
this data and decide which of
the data is most reliable,
526
00:40:01,442 --> 00:40:04,358
which gives us the best
information about the safety
of a hospital,
527
00:40:04,402 --> 00:40:08,667
and then help us figure out
reliable criteria to put it
all together.
528
00:40:08,710 --> 00:40:13,585
And then, we did something else.
We decided to issue
a letter grade.
529
00:40:13,628 --> 00:40:18,720
The letter grade would apply to
each hospital on how safe they
are relative to other hospitals.
530
00:40:18,764 --> 00:40:22,028
So, were they an
A, B, C, D, or F?
531
00:40:22,071 --> 00:40:26,206
When we first did it,
we got calls from some
hospital CEOs who said
532
00:40:26,249 --> 00:40:30,471
to me, memorably,
"I've decided I don't want
a letter grade from you."
533
00:40:30,515 --> 00:40:35,084
And I said, "Well, I've decided
you're getting one anyway,
because you serve the public,
534
00:40:35,128 --> 00:40:38,174
and the public you serve
deserves to know
how you're doing."
535
00:40:38,218 --> 00:40:41,439
It's very important to do these
kinds of ratings because
536
00:40:41,482 --> 00:40:45,834
who wants to work in a terrible
organization? And so if you can
make it very obvious to all the
537
00:40:45,878 --> 00:40:49,534
doctors and nurses in that
hospital that this is a highly
unsafe hospital,
538
00:40:49,577 --> 00:40:53,886
I think there is going to be
internal pressure to reform and
internal pressure to get better.
539
00:40:53,929 --> 00:40:56,758
But, certainly I think
it's true that, like,
if you're in an isolated area,
540
00:40:56,802 --> 00:41:00,066
there's one hospital
in town or you could be
in the middle of Chicago,
541
00:41:00,109 --> 00:41:03,852
but your insurance company
covers one hospital only,
542
00:41:03,896 --> 00:41:08,161
it's going to be a challenge of
choices. But that doesn't mean
you couldn't go to your doctor
543
00:41:08,204 --> 00:41:11,643
who works in that hospital
and be like, "Hey,
why are you guys a D hospital?"
544
00:41:11,686 --> 00:41:14,994
And I think if consumers
started talking to doctors
and nurses that way,
545
00:41:15,037 --> 00:41:19,172
it would actually begin
to change the conversation,
where doctors would say,
546
00:41:19,215 --> 00:41:22,044
"Why do I work at a hospital
that has such high
infection rates?"
547
00:41:22,088 --> 00:41:25,700
Virtually every other industry
in this country has their
products and services
548
00:41:25,744 --> 00:41:27,963
in a transparent market,
and people choose.
549
00:41:28,007 --> 00:41:32,751
So, if you're buying a car,
you can look up auto reviews
550
00:41:32,794 --> 00:41:35,580
and you can compare
among different cars
and different features.
551
00:41:35,623 --> 00:41:38,191
In New York, which I know
particularly well,
552
00:41:38,234 --> 00:41:41,977
restaurants had,
for many years,
been getting public ratings
553
00:41:42,021 --> 00:41:45,154
from the health department
on how safe they were;
554
00:41:45,198 --> 00:41:48,462
those were all public,
but nobody paid any
attention to them.
555
00:41:48,506 --> 00:41:51,596
So, the health department said,
from now on you're going
to get a grade
556
00:41:51,639 --> 00:41:54,642
on how safe you are and you have
to post it in your window.
557
00:41:54,686 --> 00:42:00,866
So, restaurants started posting
it, and within six months any
restaurant that didn't have an A
558
00:42:00,909 --> 00:42:05,261
was either out of business
or they were very quickly
getting to their A.
559
00:42:05,305 --> 00:42:08,177
So we said, "Well, let's do
the same thing with hospitals."
560
00:42:08,221 --> 00:42:12,138
I mean in our dream, hospitals
would put their letter grade on
561
00:42:12,181 --> 00:42:16,708
you know, their front door
and everyone would know that
this hospital was safe or not.
562
00:42:16,751 --> 00:42:17,796
[somber music]
563
00:42:24,585 --> 00:42:28,067
[Helen Burstin] John Eisenberg
used to tell a great story of
the drunk who lost his keys.
564
00:42:29,285 --> 00:42:32,114
And he's out in front of
the bar in the street
looking for his keys
565
00:42:32,158 --> 00:42:34,595
and some guy comes over and
goes, "What are you doing?"
566
00:42:34,639 --> 00:42:36,510
He says,
"I'm looking for my key."
567
00:42:36,554 --> 00:42:38,556
"Well, why are you only
looking right here?"
568
00:42:38,599 --> 00:42:40,427
He said, "Well, that's where the lamplight is."
569
00:42:40,470 --> 00:42:42,429
[clock ticking]
570
00:42:42,472 --> 00:42:44,562
[narrator] This is known
as the streetlight effect.
571
00:42:45,867 --> 00:42:49,436
Many in the patient safety field have been looking outside
572
00:42:49,479 --> 00:42:52,047
healthcare for solutions to
preventable errors.
573
00:42:52,091 --> 00:42:56,617
Industries like nuclear power,
aircraft carriers,
and commercial aviation
574
00:42:56,661 --> 00:42:59,664
have become known as
high-reliability organizations
575
00:42:59,707 --> 00:43:02,928
due to significant efforts
to improve safety.
576
00:43:02,971 --> 00:43:08,847
High reliability is different
in healthcare because it points
directly at examples
577
00:43:08,890 --> 00:43:13,286
of very hazardous industries,
organizations
578
00:43:13,329 --> 00:43:18,987
that have solved the problem
of getting to zero harm
that healthcare has not solved.
579
00:43:19,031 --> 00:43:23,296
Tools and methods
and lessons from that work
580
00:43:23,339 --> 00:43:27,779
are very directly
applicable to healthcare
and we're starting to see
581
00:43:27,822 --> 00:43:31,478
healthcare organizations
use them to make improvements
582
00:43:31,521 --> 00:43:34,133
at a level that
we have never seen before.
583
00:43:34,176 --> 00:43:38,920
So over here we have
the complex system of
the modern American hospital,
584
00:43:38,964 --> 00:43:44,056
and over here we have
other industries
that have learned to
585
00:43:44,099 --> 00:43:47,886
simplify and deal with
these complex systems.
586
00:43:47,929 --> 00:43:53,108
In the last calendar year there
has been no fatality worldwide
587
00:43:53,152 --> 00:43:56,155
in commercial aviation
due to an accident.
588
00:43:56,198 --> 00:44:01,290
Compare that to our business
where we have 20 wrong-site
surgeries every week.
589
00:44:02,814 --> 00:44:07,775
[David Mayer] Pilots make one
error per hour in the cockpit
every day they work
590
00:44:07,819 --> 00:44:11,692
and yet we wonder why planes
aren't falling out of the sky.
591
00:44:11,736 --> 00:44:16,654
If aviation had said, "Well, you
know what, to fly you 600 miles
an hour it's going to
592
00:44:16,697 --> 00:44:20,745
come with some mishap. And you
got to expect a plane or two to
fall out of the sky,"
593
00:44:20,788 --> 00:44:25,445
and thank god they didn't say
that and they said, "No, we can
drive it to zero.
594
00:44:25,488 --> 00:44:29,449
We can drive it down to
virtually no mishap,"
and they have.
595
00:44:29,492 --> 00:44:33,975
The aviation industry is the
safest it's ever been since the
invention of the jet engine.
596
00:44:34,019 --> 00:44:40,547
What we're really doing when we
go up in an airliner is pushing
a tube filled with people
597
00:44:40,590 --> 00:44:46,684
through the upper atmosphere,
7 or 8 miles above the earth,
at 80% the speed of sound,
598
00:44:46,727 --> 00:44:51,340
in a hostile environment with
outside air pressure one-quarter
that at the surface,
599
00:44:51,384 --> 00:44:56,606
and we must return it safely
to the surface every time,
and we do.
600
00:44:56,650 --> 00:45:02,700
In this country alone,
28,000 times a day,
10.2 million times a year.
601
00:45:02,743 --> 00:45:08,096
In a little over
100 years you've gone
from quite a dangerous
602
00:45:08,140 --> 00:45:13,754
industry to the first ultra-safe
mode of transport bar none.
603
00:45:13,798 --> 00:45:19,238
One of the reasons is because
it is studied so well,
and every single event
604
00:45:19,281 --> 00:45:23,111
is clearly understood
and is made public so
others can learn from them.
605
00:45:23,155 --> 00:45:28,900
I had been flying airplanes
for 42 years. I had 20,000 hours
in the air.
606
00:45:28,943 --> 00:45:32,294
And throughout that entire time,
I had never been so challenged
in an airplane that
607
00:45:32,338 --> 00:45:37,604
I doubted the outcome. I never
thought I would be. I was wrong.
608
00:45:37,647 --> 00:45:42,087
[narrator] In January of
2009, Captain Sullenberger's
training and instincts
609
00:45:42,130 --> 00:45:48,354
saved the lives of all
155 passengers aboard
US Airways Flight 1549
610
00:45:48,397 --> 00:45:52,532
after it struck a flock
of geese and lost
all engine power.
611
00:45:52,575 --> 00:45:55,491
The dramatic landing on the Hudson River reminded Americans
612
00:45:55,535 --> 00:45:58,320
of the importance of experience
in the cockpit.
613
00:46:00,627 --> 00:46:07,329
In an industry in which we work
very hard to make everything
easy and routine and safe,
614
00:46:07,373 --> 00:46:10,942
100 seconds after takeoff we
were suddenly confronted with an
615
00:46:10,985 --> 00:46:13,771
ultimate challenge of
a lifetime, to do something
we'd never done before
616
00:46:13,814 --> 00:46:16,425
and get it right the first time
never having practiced it.
617
00:46:16,469 --> 00:46:20,386
In a similar fashion
in medicine, there are
some things that just can't be
618
00:46:20,429 --> 00:46:23,868
practiced safely any other way
than in a simulation for the
first time.
619
00:46:23,911 --> 00:46:29,743
And it gives you a chance
to practice things over
and over and over again.
620
00:46:29,787 --> 00:46:33,355
And so it's important that
the simulations be done
not simply individually,
621
00:46:33,399 --> 00:46:37,533
but also collectively
as a whole team.
622
00:46:47,848 --> 00:46:52,331
[narrator] Flight simulators
have been used to train pilots
for nearly 100 years.
623
00:46:52,374 --> 00:46:56,335
And while medicine has used
cadavers to train doctors
for much longer,
624
00:46:56,378 --> 00:46:59,686
only recently have institutions begun using robotics
625
00:46:59,729 --> 00:47:03,298
to simulate any kind of
situation a care provider
may face.
626
00:47:04,952 --> 00:47:08,477
[Heather Young] Simulation is
a very big part of
our educational program here
627
00:47:08,521 --> 00:47:14,657
and it involves anything from
patients who come in as actors
and will work with a student,
628
00:47:14,701 --> 00:47:18,748
all the way up to very
high-fidelity robots,
and environments that
629
00:47:18,792 --> 00:47:22,361
are tricked out to look
exactly like a hospital
operating room
630
00:47:22,404 --> 00:47:24,972
or an emergency department
or hospital ward.
631
00:47:27,714 --> 00:47:30,848
The airline industry is the
prototype of using simulation
632
00:47:30,891 --> 00:47:36,070
where you can practice landing
in San Diego with a terrible
storm or a tsunami
633
00:47:36,114 --> 00:47:38,420
or on a very calm day
and you can practice
634
00:47:38,464 --> 00:47:41,859
all different kinds of failures
within the airplane.
635
00:47:41,902 --> 00:47:45,253
It's newer in healthcare,
but it's really something
that's catching on,
636
00:47:45,297 --> 00:47:51,912
and you can really put people
through the steps of handling
many important situations.
637
00:47:51,956 --> 00:47:55,568
[Ian Julie] So we're going to be
practicing our new simulated
protocol
638
00:47:55,611 --> 00:47:59,964
for our actual sepsis
patients. Sepsis care can be
very, very difficult.
639
00:48:00,007 --> 00:48:03,402
We know the science behind it,
we know what helps,
but we don't necessarily know
640
00:48:03,445 --> 00:48:06,361
how to do it in a way that's
organized and consistent.
641
00:48:06,405 --> 00:48:11,018
We'd rather practice on our
friend the mannequin here, who
is very hard to injure,
642
00:48:11,062 --> 00:48:15,196
rather than on real patients.
That way we can standardize
things within our hospital
643
00:48:15,240 --> 00:48:19,026
and give our nurses and doctors
a chance to practice what it is
they're doing,
644
00:48:19,070 --> 00:48:21,333
before they have to do it
on real patients.
645
00:48:21,376 --> 00:48:24,423
Hi Robert. My name is Emily.
I'm going to be your nurse today.
646
00:48:24,466 --> 00:48:28,296
I'm here to do your morning
assessment and take your vital
signs. How are you feeling?
647
00:48:28,340 --> 00:48:29,950
[mannequin] I'm not feeling
very well.
648
00:48:29,994 --> 00:48:31,996
[Nurse 1] You're not?
What's going on?
649
00:48:32,039 --> 00:48:34,172
[mannequin] I just can't catch
my breath this morning and
650
00:48:34,215 --> 00:48:36,000
I feel like my cough is worse.
651
00:48:36,043 --> 00:48:38,741
He's remaining stable.
Based on the alert, um,
652
00:48:38,785 --> 00:48:42,180
and the lactic acid, I think I'm
going to start some oxygen.
653
00:48:43,137 --> 00:48:47,446
[Nurse 2] Okay, are there signs
or symptoms of an infection?
654
00:48:47,489 --> 00:48:50,405
[Nurse 1] Well,
he's saying that he has an
increased work of breathing.
655
00:48:50,449 --> 00:48:52,625
He has a white count of 16.
656
00:48:52,668 --> 00:48:54,453
[Nurse 2] OK, sounds good.
I'll be right over.
657
00:48:54,496 --> 00:48:56,368
-OK, thank you!
-[Nurse 2] Alright.
658
00:48:58,022 --> 00:49:00,459
-[Nurse 2] Hi, Mr. Robert!
-[mannequin] Hi!
659
00:49:00,502 --> 00:49:03,853
-How are you feeling?
-[mannequin] I've had
better days.
660
00:49:03,897 --> 00:49:05,986
-Are you short of breath?
-[mannequin] Yeah.
661
00:49:06,030 --> 00:49:08,032
[Nurse 2] OK and when
did this start?
662
00:49:08,075 --> 00:49:13,341
Here you go. You've drawn
cultures already, correct?
663
00:49:13,385 --> 00:49:16,562
-[Nurse 2] This is Robert Doe?
-[Nurse 1] Yes, Robert Doe.
664
00:49:17,432 --> 00:49:20,653
[Ian Julie] We can make
the scenario more complex,
and we do on occasion.
665
00:49:20,696 --> 00:49:23,047
We could have the patient enter
a state of shock,
666
00:49:23,090 --> 00:49:26,311
or not respond properly to
the fluids or the antibiotics.
667
00:49:26,354 --> 00:49:30,010
So, much of what we've done is
related to the need to kind of
668
00:49:30,054 --> 00:49:32,926
fulfill the recommendations
that have been given.
669
00:49:32,970 --> 00:49:35,233
In addition to wanting to do
what's right for the patient
670
00:49:35,276 --> 00:49:38,627
and following through on
the best available
scientific evidence.
671
00:49:38,671 --> 00:49:42,022
When I graduated as a nurse, the
first time I ever had a chance
672
00:49:42,066 --> 00:49:44,720
to shock a person
whose heart had stopped
673
00:49:44,764 --> 00:49:47,288
was in the middle of
the night in a rural hospital
674
00:49:47,332 --> 00:49:50,944
and it was the first time
I had ever turned on
the paddles in my life.
675
00:49:50,988 --> 00:49:55,557
And someone's life depended on
that. That's not acceptable.
676
00:49:55,601 --> 00:50:00,606
We want our students to practice
and practice and practice
how to shock people
677
00:50:00,649 --> 00:50:04,697
in a simulated situation,
so that when someone is
really depending on them,
678
00:50:04,740 --> 00:50:07,221
they do it right
the first time.
679
00:50:07,265 --> 00:50:12,661
I shudder to remember how
I was trained as a doctor
to learn how to do stuff.
680
00:50:12,705 --> 00:50:17,710
Lumbar punctures, spinal taps,
put IVs in, or even chest tubes.
681
00:50:17,753 --> 00:50:21,801
You practiced on the patients.
I mean, that was
the only option.
682
00:50:21,844 --> 00:50:25,544
Some patient, some time,
was the first patient
I ever put a chest tube in,
683
00:50:25,587 --> 00:50:30,070
and that person paid the price.
They were paying for my tuition.
684
00:50:30,114 --> 00:50:32,812
You know, we don't do that
with pilots, we put them
in the simulator
685
00:50:32,855 --> 00:50:35,728
and they fly something that
isn't really a plane for
a while, first,
686
00:50:35,771 --> 00:50:38,296
with high fidelity. Now we know
how to do that in health care.
687
00:50:38,339 --> 00:50:42,039
The growth of simulation so that
the first chest tube doesn't go
into a human being,
688
00:50:42,082 --> 00:50:46,956
it goes in a mannequin that
looks like a human being,
that's great. And I think that
689
00:50:47,000 --> 00:50:50,656
it's one of the emerging ways
to help build skills
690
00:50:50,699 --> 00:50:54,790
hmm, in a work force without
the patients paying the tuition.
691
00:50:57,010 --> 00:51:00,535
[narrator] Many aspects of
the aviation industry have
been applied to medicine,
692
00:51:00,579 --> 00:51:04,452
from checklists before an
operation to monitoring
physicians for fatigue.
693
00:51:04,496 --> 00:51:09,022
But there are still elements
of safety in aviation that
have not been explored.
694
00:51:09,066 --> 00:51:13,722
One of the most well-known
improvements in airline
safety is the black box.
695
00:51:13,766 --> 00:51:17,335
A surgeon in Toronto
has been working with
a group of designers
696
00:51:17,378 --> 00:51:20,686
to create a similar
tool for the operating room.
697
00:51:20,729 --> 00:51:23,471
[Teodor Grantcharov]
I want my patients to feel
the same way when
698
00:51:23,515 --> 00:51:27,258
they enter the operating room
as I feel when I enter
a modern aircraft.
699
00:51:27,301 --> 00:51:31,740
Unless we create a system where
we understand, we tolerate,
700
00:51:31,784 --> 00:51:35,309
and we learn from our errors,
we will never be able
to improve.
701
00:51:35,353 --> 00:51:38,138
We've tried for many years
to create something
like the black box.
702
00:51:38,182 --> 00:51:43,622
Finally, in 2012 we were able
to create a technology
that allows us to capture
703
00:51:43,665 --> 00:51:49,149
video and audio and data from
everything that's happening in
an operating room.
704
00:51:49,193 --> 00:51:52,196
We've been developing and
implementing a number of sensors.
705
00:51:52,239 --> 00:51:55,024
So, we know how many times a
door opens and closes.
706
00:51:55,068 --> 00:51:58,637
We know how we wash our hands
prior to a surgical procedure.
707
00:51:58,680 --> 00:52:01,161
And all these data feeds
are combined
708
00:52:01,205 --> 00:52:03,685
and perfectly synchronized on
the same platform.
709
00:52:03,729 --> 00:52:07,254
When we talk with our patients
about the black box
and what we are doing here,
710
00:52:07,298 --> 00:52:11,215
the first reaction,
the most common reaction
in 90% of the patients is,
711
00:52:11,258 --> 00:52:13,826
"I can't believe
this hasn't been done before."
712
00:52:13,869 --> 00:52:16,698
From the point we started
recording our surgeries,
713
00:52:16,742 --> 00:52:19,092
we had a tremendous amount
of media attention.
714
00:52:22,008 --> 00:52:26,317
Everybody believed in the
"transparency doctor";
that's what he was nicknamed.
715
00:52:26,360 --> 00:52:29,537
He doesn't have anything
to hide. I'm definitely going
to go to him.
716
00:52:29,581 --> 00:52:32,540
Patients need to know that
when they walk into a hospital,
717
00:52:32,584 --> 00:52:36,979
everything is being done
to learn from mistakes and
possible risks that take place.
718
00:52:37,023 --> 00:52:39,721
This has to be
common standard practice.
719
00:52:39,765 --> 00:52:45,292
We've heard for too long that
healthcare is complex, that
our patients are not aircraft,
720
00:52:45,336 --> 00:52:50,079
that surgeons are not pilots.
I just want us to start doing
something and changing it.
721
00:52:50,123 --> 00:52:53,300
We're trying to create
a system that identifies
722
00:52:53,344 --> 00:52:56,477
performance deficiencies
and improves safety.
723
00:52:56,521 --> 00:53:00,612
A new gadget comes out from an
industry provider all the time.
724
00:53:00,655 --> 00:53:03,919
Usually it's a very
emotional attachment like,
"Oh, this looks sexy,"
725
00:53:03,963 --> 00:53:06,966
or "I like how this handle
feels when I'm using
it during surgery."
726
00:53:07,009 --> 00:53:09,229
You need something
deeper beyond that.
727
00:53:09,273 --> 00:53:12,406
So, this is what we call
a full surgical timeline
and you can see
728
00:53:12,450 --> 00:53:15,844
the entire procedure broken
down from beginning to end.
729
00:53:15,888 --> 00:53:21,241
As you scroll down
this timeline, you'll start
to see little beeps here,
730
00:53:21,285 --> 00:53:25,767
and that's where our surgical
expert analysts have coded
where they saw errors.
731
00:53:25,811 --> 00:53:29,510
This screenshot establishes
what one of the errors is.
So in this case
732
00:53:29,554 --> 00:53:33,427
an error took place during
the suturing task, and it was
inadequate visualization.
733
00:53:33,471 --> 00:53:38,650
As the surgeon was suturing with
the needle driver, he might
have gone off frame,
734
00:53:38,693 --> 00:53:42,175
which is incorrect because now
you have no idea where that
needle is.
735
00:53:42,219 --> 00:53:45,047
You even see issues in, say,
leadership or communication,
736
00:53:45,091 --> 00:53:48,050
and we have a whole toolset
determining exactly that.
737
00:53:48,094 --> 00:53:51,271
This points to a really interesting
storyline because
the blue bar establishes
738
00:53:51,315 --> 00:53:57,190
that the surgical resident, the
trainee under the main surgeon
was doing the actual case,
739
00:53:57,234 --> 00:53:59,497
and then when a cluster
of errors takes place,
740
00:53:59,540 --> 00:54:02,761
you can see the switch over
to the main surgeon,
to Dr. Grantcharov.
741
00:54:02,804 --> 00:54:07,592
This entire timeline
is the data quantified.
742
00:54:07,635 --> 00:54:11,857
We're breaking down
the entire set of errors
into tangible areas
743
00:54:11,900 --> 00:54:15,295
to provide further education on
it to essentially improve it.
744
00:54:15,339 --> 00:54:18,037
Analytics is at the heart
of what the black box does,
745
00:54:18,080 --> 00:54:20,431
but we're jumping
into different areas.
746
00:54:20,474 --> 00:54:23,564
Our engineers are working on
tools to improve handwashing
747
00:54:23,608 --> 00:54:27,133
to essentially create
a detector that lets you know,
748
00:54:27,176 --> 00:54:30,397
yes, you've spent the right
amount of time and used the right
technique to wash your hands.
749
00:54:30,441 --> 00:54:34,009
So over here, one of our
engineers, Kevin, has been
working on just that.
750
00:54:34,053 --> 00:54:38,013
It's a motion sensing tool
that will look at
how you wash your hands
751
00:54:38,057 --> 00:54:40,799
and look at the surface plane
you are working with,
the amount of time
752
00:54:40,842 --> 00:54:44,281
spent on washing your hands,
and give you real time feedback.
753
00:54:44,324 --> 00:54:47,458
The key here is the data. So, I
can go and tell any surgeon
754
00:54:47,501 --> 00:54:51,679
you have to wash your hands
this many times
in this fashion,
755
00:54:51,723 --> 00:54:55,640
but if I have hard data
showing... because of doing it
this particular way
756
00:54:55,683 --> 00:54:59,992
we have reduced site infections
by this much, it's irrefutable.
757
00:55:04,953 --> 00:55:05,998
[piano music]
758
00:59:09,154 --> 00:59:13,071
[Sue Sheridan] Cal has gone on
to become part of
a comedy community.
759
00:59:13,114 --> 00:59:16,074
He's producing.
He's produced two comedy shows.
760
00:59:16,117 --> 00:59:19,904
Cal uses his comedy
in really novel ways
761
00:59:19,947 --> 00:59:23,385
that help him deal
with losing a dad.
762
00:59:44,102 --> 00:59:48,236
Everybody develops their own
way to deal with death
or loss or grief
763
00:59:48,280 --> 00:59:52,240
and I think that comedy
is Cal's, uh, his outlet.
764
01:00:05,558 --> 01:00:09,214
[Sue Sheridan] He feels
like he doesn't suffer, but he
sometimes struggles
765
01:00:09,257 --> 01:00:15,046
to be understood because his
speech is impaired. He struggles
when he rides on airplanes
766
01:00:15,089 --> 01:00:18,310
because sometimes his scooters
or walkers are broken.
767
01:00:19,616 --> 01:00:23,663
He struggles in environments
where it's not easy
to get around.
768
01:00:23,707 --> 01:00:30,757
The first year, or year and
a half, we had 183 separate
medical visits
769
01:00:31,976 --> 01:00:36,023
for physical therapy, and ENT,
eyes and ears and teeth,
and neurology,
770
01:00:36,067 --> 01:00:39,940
and during that time,
in my heart, I knew something
was wrong with Cal.
771
01:00:39,984 --> 01:00:43,814
Our local doctors were
really not willing to offer
a diagnosis.
772
01:00:43,857 --> 01:00:49,646
We took-- I took Cal out of
state to a leading university
where a team of specialists
773
01:00:49,689 --> 01:00:52,344
reviewed Cal's charts
that I had never looked at.
774
01:00:52,387 --> 01:00:55,564
I didn't think that there was
any reason for me to look at my
birthing charts,
775
01:00:55,608 --> 01:00:58,698
and back then charts weren't
that available to patients.
776
01:00:58,742 --> 01:01:05,705
And they showed to me a report
from an MRI that they did on Cal
when he was 5 days old
777
01:01:05,749 --> 01:01:12,581
that clearly showed
abnormalities in his brain
from his jaundice.
778
01:01:12,625 --> 01:01:15,976
And our healthcare system
really didn't...
779
01:01:17,630 --> 01:01:21,678
Well, they covered up.
They covered up
Cal's injury and, umm...
780
01:01:24,942 --> 01:01:27,727
I wasn't empowered with
information and knowledge
781
01:01:27,771 --> 01:01:30,338
to challenge some of it or
ask the appropriate questions.
782
01:01:30,382 --> 01:01:33,341
You know, in healthcare they say
that patients, we need to ask
more questions,
783
01:01:33,385 --> 01:01:35,430
but sometimes we simply
don't know what to ask.
784
01:01:40,827 --> 01:01:45,745
[Michael Millenson]
Understand, before you go in
for any particular procedure,
785
01:01:45,789 --> 01:01:48,922
what are the questions you need
to ask to keep yourself safe?
786
01:01:48,966 --> 01:01:51,708
And if we all start asking
those questions,
787
01:01:51,751 --> 01:01:54,536
then pretty soon it will
become clear to any hospital
788
01:01:54,580 --> 01:01:58,976
that's not doing those things
that there is pressure
on them to do it.
789
01:01:59,019 --> 01:02:03,763
If you're in a hospital, by
definition today, you're seeing
a lot of different doctors,
790
01:02:03,807 --> 01:02:06,505
there's a lot of caregivers
coming in and out of the room,
791
01:02:06,548 --> 01:02:09,813
most of whom work to communicate
with each other, but sometimes
they miss.
792
01:02:09,856 --> 01:02:13,730
So, if you see something that
doesn't look right, or sound
right, you say,
793
01:02:13,773 --> 01:02:16,297
"Whoa, wait a minute,
that's not what they told me."
794
01:02:16,341 --> 01:02:19,387
Patient safety is a team sport.
And one of the ways
795
01:02:19,431 --> 01:02:22,042
to really make a difference
is you've got to get
patients engaged.
796
01:02:22,086 --> 01:02:25,872
So, if patients begin
walking into hospitals
with an expectation
797
01:02:25,916 --> 01:02:28,875
that they are not going
to get an infection
and they start saying,
798
01:02:28,919 --> 01:02:32,705
"Hey, have you washed your hands
before you come over to see me?"
That's how it happens.
799
01:02:32,749 --> 01:02:37,318
If they've done it outside
the room, or they've done it
at the nurse's station,
800
01:02:37,362 --> 01:02:40,844
on the way into the room
they're touching the door,
they're touching things,
801
01:02:40,887 --> 01:02:42,933
and then they are coming in,
so that doesn't count.
802
01:02:42,976 --> 01:02:45,892
It has to be before and
after patient contact.
803
01:02:45,936 --> 01:02:50,505
Here's what I look for
in a hospital that's really
outstanding on safety;
804
01:02:50,549 --> 01:02:56,120
the sink is placed in a way that
it is easy to walk into a room
and immediately wash your hands.
805
01:02:56,163 --> 01:03:01,908
You'll see charts on patient
floors, right there for anyone
to see, that will show
806
01:03:01,952 --> 01:03:06,826
how they are doing on patient
falls, for instance, or how they
are doing on infections.
807
01:03:06,870 --> 01:03:10,438
People have an attitude about
safety, you just can feel it.
808
01:03:10,482 --> 01:03:15,792
There's an apocryphal story of
President Kennedy visiting Cape
Canaveral during his presidency
809
01:03:15,835 --> 01:03:20,013
and he takes aside a custodian
and says, "What's your job?"
And the custodian says,
810
01:03:20,057 --> 01:03:25,062
"Mr. President, my job is to
help get a man to the moon and
return him to earth safely."
811
01:03:26,280 --> 01:03:32,460
Everybody has a job to do
to protect patients,
not just doctors.
812
01:03:32,504 --> 01:03:38,858
Every nurse, pharmacist,
physician, custodian,
has a role in safety.
813
01:03:38,902 --> 01:03:43,123
I think it's deeply unfair
to expect patients who are sick,
in the middle of an illness,
814
01:03:43,167 --> 01:03:46,561
to try and sort this out
on their own. Now,
it may be unfair,
815
01:03:46,605 --> 01:03:49,216
but the reality is
that's where we are.
816
01:03:49,260 --> 01:03:53,133
The best thing they can do
is have a family member or
a friend around, because again,
817
01:03:53,177 --> 01:03:56,049
when in the middle of
an illness it's very hard
for you to pay attention
818
01:03:56,093 --> 01:03:59,009
to know what's going on, but
your friend can, your family
member can.
819
01:03:59,052 --> 01:04:01,881
If somebody says you're going
to get medication X,
820
01:04:01,925 --> 01:04:04,666
is that the medication
that actually showed up?
821
01:04:04,710 --> 01:04:08,148
And asking in a very
friendly and respectful way,
822
01:04:08,192 --> 01:04:11,935
when a nurse comes by to hang
a medication or give you a pill,
823
01:04:11,978 --> 01:04:16,461
you know, what is this?
What am I getting? It's
a totally reasonable question.
824
01:04:16,504 --> 01:04:19,725
Patients should feel
comfortable doing it.
And if you have a provider
825
01:04:19,768 --> 01:04:24,164
that responds badly to that,
you should try to figure out
if you can switch providers.
826
01:04:24,208 --> 01:04:28,429
My father was a doctor in
a small town in Connecticut.
827
01:04:28,473 --> 01:04:33,478
For a lot of time he was
the only doctor there
and he was revered.
828
01:04:33,521 --> 01:04:38,048
You didn't question him.
It wasn't my father's fault
in any way.
829
01:04:38,091 --> 01:04:41,747
He was a proud and successful
professional honored by his
community.
830
01:04:41,790 --> 01:04:46,143
That's not actually adaptive
if we really want care to be
what it can be.
831
01:04:46,186 --> 01:04:52,192
I think, I understand
the hesitation people may feel
to ask the doctor,
832
01:04:52,236 --> 01:04:58,329
"What's going on here?"
But that's healthy, that's good,
and we need to train doctors to,
833
01:04:58,372 --> 01:05:02,289
not just to accept that,
but to absolutely welcome it.
It's better medicine.
834
01:05:02,333 --> 01:05:03,508
[tense music]
835
01:05:09,470 --> 01:05:14,867
[Sully Sullenberger] If,
as reports indicate, there are
as many as 440,000 preventable
836
01:05:14,911 --> 01:05:19,176
medical deaths in this
country alone every year,
that is the equivalent of
837
01:05:19,219 --> 01:05:23,615
7 or 8 airliners crashing every
day with no survivors.
838
01:05:23,658 --> 01:05:29,142
Before the first day of that
kind of carnage was complete,
airplanes would be grounded,
839
01:05:29,186 --> 01:05:33,886
airlines would stop operating,
airports would close,
no one would fly
840
01:05:33,930 --> 01:05:38,064
until some of the fundamental
issues had been resolved.
841
01:05:38,108 --> 01:05:41,459
But because aviation
accidents are dramatic,
842
01:05:41,502 --> 01:05:45,811
they receive the kind of
attention that they do,
and the public awareness.
843
01:05:45,854 --> 01:05:53,645
Medical deaths occur singly and
often behind the scenes, but in
aggregate the harm is huge.
844
01:05:53,688 --> 01:05:58,519
We need to change
the way we think about
these medical deaths.
845
01:05:58,563 --> 01:06:03,960
We need to think about
them not as unavoidable,
but as unthinkable.
846
01:06:04,917 --> 01:06:11,228
We've got to get better at
making sure whatever hospital
you go into in the U.S.
847
01:06:11,271 --> 01:06:14,318
you're getting the same quality
care, and we are not there.
848
01:06:14,361 --> 01:06:16,668
I mean, you're asking people
to do things differently.
849
01:06:16,711 --> 01:06:19,149
You're asking doctors
to think differently
and work differently.
850
01:06:19,192 --> 01:06:23,240
You're asking architects
to build different spaces,
nurses to work
851
01:06:23,283 --> 01:06:26,025
differently in teams, patients
to have a different role.
852
01:06:26,069 --> 01:06:30,029
To change patient safety, you
have to change everything. If
you look at preventable harm
853
01:06:30,073 --> 01:06:33,032
across American hospitals,
it has gone down considerably,
854
01:06:33,076 --> 01:06:35,208
you know, saving hundreds
of thousands of lives
855
01:06:35,252 --> 01:06:38,951
and billions of dollars.
That doesn't mean we fixed it.
856
01:06:38,995 --> 01:06:45,610
It is quietly, slowly,
but definitely becoming
the professional norm
857
01:06:45,653 --> 01:06:51,442
to take certain precautions,
to do things in a certain way
so that patients are safe.
858
01:06:51,485 --> 01:06:56,795
Because you can be satisfied if
you have very low expectations
and the reality is that all of
859
01:06:56,838 --> 01:07:01,104
our expectations should be
raised, that we all get very
high quality, safe healthcare.
860
01:07:01,147 --> 01:07:06,544
We all want to do well.
We all want to get better.
Nobody comes to work
861
01:07:06,587 --> 01:07:11,331
to harm a patient
or wanting to harm a patient
or to give bad care.
862
01:07:11,375 --> 01:07:15,161
We haven't made this
a public health issue
where the public is really
863
01:07:15,205 --> 01:07:19,209
thinking about this, and yet
when you talk to any person
864
01:07:19,252 --> 01:07:22,690
who's had a family member
or themselves in healthcare,
they all have a story.
865
01:07:22,734 --> 01:07:26,346
I've also talked to doctors
and nurses who have committed
a terrible error and they say,
866
01:07:26,390 --> 01:07:30,133
"I know I can't take that back,
but what will really give
that meaning is
867
01:07:30,176 --> 01:07:33,092
if I do something that makes
the system safer for
the next person."
868
01:07:33,136 --> 01:07:35,964
I think, in part,
the job of people like me
in leadership roles
869
01:07:36,008 --> 01:07:38,619
is to harness that passion,
harness that energy.
870
01:07:38,663 --> 01:07:42,449
All of the rest of these guys
are much more serious about
medical error reduction
871
01:07:42,493 --> 01:07:45,931
than they ever were.
Is it going as fast as it could?
No, of course not.
872
01:07:45,974 --> 01:07:49,413
It is not
"stuff happens" anymore.
873
01:07:49,456 --> 01:07:53,199
That's where we're going,
and that's the good future
that we're moving towards.
874
01:07:53,243 --> 01:07:56,463
It feels like we should be
further along than we are,
but actually
875
01:07:56,507 --> 01:07:59,858
I think we've made tremendous
progress in 15 years.
It is on the map.
876
01:07:59,901 --> 01:08:03,122
We have these examples in the
U.S. and around the world.
877
01:08:03,166 --> 01:08:08,388
It's not any longer
a question of possibility,
it's a question of will.
878
01:08:15,613 --> 01:08:19,834
Many of us go kind of through a
self-blame. Although we know it
wasn't our fault,
879
01:08:19,878 --> 01:08:25,710
we feel like we didn't...
protect our son.
880
01:08:25,753 --> 01:08:30,236
And so there was really, really
significant grieving,
881
01:08:30,280 --> 01:08:34,849
so the anger at first
was immeasurable.
882
01:08:34,893 --> 01:08:39,419
When we discovered Pat's error,
we both felt tremendous fear.
883
01:08:39,463 --> 01:08:43,510
I think at that point
it was just plain disbelief.
884
01:08:43,554 --> 01:08:48,036
He said: "Whatever you do, do
not give up on patient safety."
885
01:08:48,080 --> 01:08:51,779
So, that led me onto a journey
to... I wanted to make sure
our healthcare system,
886
01:08:51,823 --> 01:08:55,174
our government knew
what happened to Pat and Cal.
887
01:08:55,218 --> 01:08:59,657
So, it took us 8 years, but
we really did make some changes
in our healthcare system
888
01:08:59,700 --> 01:09:05,097
where babies being discharged
would have a bilirubin test
before they were discharged.
889
01:09:05,141 --> 01:09:08,927
[Mark Graber] Thanks to Sue
and the work that she's done,
there are now processes
890
01:09:08,970 --> 01:09:12,235
in place in every hospital to
screen for that condition.
891
01:09:12,278 --> 01:09:16,717
And the odds that that's going
to happen again are now
approaching zero,
892
01:09:16,761 --> 01:09:19,720
and that's what we'd like to see happen throughout medicine.
893
01:09:19,764 --> 01:09:23,550
And the work that Sue has done is our model for how to do that.
894
01:09:23,594 --> 01:09:29,556
She turned what she had gone
through into empowerment
and positivity,
895
01:09:29,600 --> 01:09:34,300
and if she can do it, so can I,
and so can a lot of people.
896
01:09:34,344 --> 01:09:39,392
I've obviously always idolized
my Mom and I understood
her job very well
897
01:09:39,436 --> 01:09:42,961
and people would ask me, "What
does your Mom do?" I would say,
"Well, she saves lives."
898
01:09:43,004 --> 01:09:46,878
Having witnessed these
tragic outcomes in
our healthcare system,
899
01:09:46,921 --> 01:09:52,971
the one place that we should
feel unquestionably safe.
900
01:09:53,014 --> 01:09:57,280
[Mackenzie Sheridan] And it kind
of ignited a fire inside me
that wanted to, you know,
901
01:09:57,323 --> 01:10:01,066
do what my Mom does, which is,
you know, talk with hospitals
and talk with doctors
902
01:10:01,109 --> 01:10:04,939
and figure out how we
can make those kind of things
not happen again.
903
01:10:04,983 --> 01:10:09,901
So, I went to Portland State
University and I chose to do
Public Health
904
01:10:09,944 --> 01:10:15,646
because I wanted to feel like
I was making a difference
and feel like I, you know,
905
01:10:15,689 --> 01:10:20,346
could help prevent things
that happened to my family,
happening to other people.
906
01:10:29,312 --> 01:10:33,403
I am unwilling to believe that
we have done all that we can do.
907
01:10:34,708 --> 01:10:39,626
My experience with diagnostic
errors and the healthcare system
has been without a doubt
908
01:10:39,670 --> 01:10:45,415
the most powerfully emotional
experience in my life.
909
01:10:45,458 --> 01:10:50,811
However, my family's story
is also a story of awakening,
910
01:10:50,855 --> 01:10:56,730
of passion, of change,
and hope for the future.
911
01:10:57,905 --> 01:11:02,475
I cannot change what happened
to Cal and Pat,
but I've always felt
912
01:11:02,519 --> 01:11:05,870
that I can somehow be part
of it and make a difference.
913
01:11:07,306 --> 01:11:11,441
My teacher in courage, in hope,
914
01:11:11,484 --> 01:11:14,487
in determination, in passion,
915
01:11:14,531 --> 01:11:19,536
of course he's my teacher
in sense of humor which he
believes his mother has none of,
916
01:11:19,579 --> 01:11:23,627
but he's the reason
for what's in me.
917
01:11:51,350 --> 01:11:52,917
[audience applauding]
918
01:12:06,104 --> 01:12:10,761
[Sue Sheridan] You know, when
Pat was dying he said, "Never
give up on patient safety."
919
01:12:10,804 --> 01:12:14,417
At that time, I did not envision
my whole family being engaged.
920
01:12:14,460 --> 01:12:18,682
Before we went on stage today,
I thought about Pat,
my daughter in the front row,
921
01:12:18,725 --> 01:12:22,033
my son on stage.
It was, umm, just surreal.
922
01:12:28,648 --> 01:12:31,347
[singing Happy Birthday]
923
01:12:43,663 --> 01:12:46,579
[Mackenzie Sheridan]
On March 8th, which is the day
that my dad passed away,
924
01:12:46,623 --> 01:12:49,843
we spread his ashes
on Table Rock.
925
01:12:53,020 --> 01:12:57,198
Whenever we go there
I always feel like a warm,
just like, presence.
926
01:12:57,242 --> 01:12:59,113
It's because it's such a
beautiful place,
927
01:12:59,157 --> 01:13:02,203
and it's beautiful that
he's there as well.
928
01:13:04,510 --> 01:13:07,948
[Sue Sheridan] Pat will always
be alive in our hearts
and in our memories,
929
01:13:07,992 --> 01:13:11,474
and it was very hard for them
to lose a Dad when
they were only 4 and 6.
930
01:13:13,693 --> 01:13:18,655
They will continue to honor and miss and wonder about their Dad.
931
01:13:21,962 --> 01:13:24,051
I've always had this hope:
932
01:13:26,619 --> 01:13:30,275
I will not believe that our
leadership in our country, in
our healthcare system,
933
01:13:30,318 --> 01:13:32,190
will continue to think
this is okay.
934
01:13:34,018 --> 01:13:35,976
Because it's not.
935
01:13:46,683 --> 01:13:48,119
[piano music]