07x15 - Facial recognition system

Episode transcripts for the TV show, "Last Week Tonight with John Oliver". Aired: April 27, 2014 – present.

American late-night talk and news satire television program hosted by comedian John Oliver.

LAST WEEK TONIGHT
WITH JOHN OLIVER

Hi there. Welcome to the show,
still taking place in this white,

antiseptic room filled
with one increasingly weak adult.

It's like a hospital, only sadder.

It's been a week of continued
Black Lives Matter protests,

which generated far-reaching
effects, both big and small.

For instance, "Cops" was canceled
after 33 years of being

"That Show Your Dad
Falls Asleep To."

And "Sesame Street"
and CNN held a town hall to explain

the current moment to children,
which didn't please all kids.

In fact, just wait
for one giant baby's reaction.

Across the country, people of color,
especially in the black community,

are being treated unfairly
because of how they look.

It's a children's show.
Got that, Bobby ?

America is a very bad place
and it's your fault.

So, no matter what they do
to you when you grow up,

you have no right to complain.

First: obviously, fuck off, Tucker,
you one-man homeowner's association.

And second: that unspecified "they"

in "what they do to you
when you grow up"

is doing a lot of heavy lifting there.

There's basically two options
for what that could mean.

One: that Tucker and his viewers
have benefitted from a racist system

that renders any specifications
of who "they" are unnecessary.

Or two: that his show
is a badly-written piece of garbage.

Which is it, Tucker ? Are you a racist
or are you a total fucking moron ?

The answer can be,
and indeed is, both.

But the momentum
wasn't just confined to television.

Minneapolis City Council has pledged
to dismantle that city's police force,

while New York lawmakers voted to
make police disciplinary records public

and passed a law criminalizing
the use of chokeholds by police.

Both of which seem like common sense
and the bare minimum,

and yet, police union leaders
have continued to freak out.

Stop treating us like animals
and thugs

and start treating us
with some respect.

That's what
we're here today to say.

We've been left out
of the conversation.

We've been vilified.

It's disgusting.

Yeah, you have been
left out of the conversation,

but I'll tell you why: because you've
been fucking terrible at conversing.

If a high school debate team
argued their rebuttals

by tear gassing the other team,

they probably wouldn't be
invited back to regionals.

Meanwhile, some have taken
dismantling symbols of white supremacy

into their own hands.

Statues across the world
are being torn down.

A slave trader in Bristol,
England was hurled into the harbor,

King Leopold II in Belgium was spray
painted and ultimately removed

and statues of Christopher Columbus
have been toppled, beheaded

and, in one case,
thrown into a lake,

where, and this is true, its location
was quickly updated on Google Maps.

Unlike Christopher Columbus himself,
Google knows how to read a map.

These protests have given momentum
to official campaigns,

like Virginia's effort
to remove a statue of Robert E. Lee,

an effort that some, like this
state senator, have been troubled by.

It's all about shoving
this down people's throats

and erasing the history
of the white people.

And I think that's wrong.

Setting aside her almost clinical
ability to miss the point,

you cannot erase
the history of white people.

It's like the skid marks on
the ass of your favorite shorts.

No matter how hard you try,
that shit's never coming out.

Statues aren't the only tributes to
a horrifying past being reconsidered.

NASCAR banned
the Confederate flag,

which prompted one driver to announce
he's quitting after this season.

Although, to be fair, that driver
had never won a race,

so you understand why a flag for losers
might have been important to him.

The country group Lady Antebellum
changed their name to just Lady A.

Which is a solid fix

as long as nobody ever asks
what the "A" stands for

or points out that that name's been
used by a black blues singer

for the last 20 years.

And then there was this.

"Gone with the Wind"
is gone from HBO.

It's one of the most popular
films of all time,

but it's also been condemned
for ignoring the horrors of slavery.

White House Press Secretary
Kayleigh McEnany tore into HBO,

making it clear she was speaking
on behalf of the president.

Where do you draw the line here ?
I'm told that no longer can you find

on HBO "Gone with the Wind" because
somehow, that is now offensive.

Where do you draw the line ?
Should George Washington,

Thomas Jefferson and James Madison
be erased from history ?

First: as I've said before, the answer
to: "Where do you draw the line ?"

is literally always: somewhere.

You draw it somewhere. HBO is not
permanently pulling the movie,

it's going back up
with additional context.

And finally: who gives a shit
if something's not on HBO Max ?

There may be no better way
to obliterate all evidence

of something's existence
than to put it on HBO Max,

the only ash heap of history
that costs $15 a month.

Obviously:
symbolic progress is progress.

And a lot of these changes
have been a long time coming.

But, this week also brought
stubborn reminders

of the institutional inertia that is
going to make real change so difficult,

like Joe Biden sticking by his plan
to invest an additional $300 million

into community policing efforts,
which is an example

of whatever the precise
opposite of reading the room is.

We saw yet more footage of police
handcuffing black teens for jaywalking,

as well as a newly released video
of Oklahoma officers

responding to a man saying:
"I can't breathe"

by saying: "I don't care."

That man, incidentally, Derrick Scott,

would go on to die at the hospital
of a collapsed lung.

Perhaps most infuriatingly of all,

as protestors continued to demand
justice for Breonna Taylor,

who was killed in her own home
by police executing a no-knock warrant,

the Louisville Police Department
responded with this.

Police in Louisville are releasing
the incident report

from the night officers shot
and killed Breonna Taylor.

As the "Courier-Journal" reports,
it's almost entirely blank.

The four-page report
says Taylor had no injuries,

even though police shot her
at least eight times.

Holy shit.
That is appalling.

When it comes to erasing history,
this seems a fuck of a lot worse

than leaving a bunch of statues
toppled, cracked and beheaded.

Or, as that would probably be described
on a Louisville police report:

"No injuries."

It's important that we deal honestly

with the uncomfortable aspects
of our past.

But, there's also hard,
necessary work to be done

in changing the unacceptable
conditions of our present.

And the only hope for that
is if,

to say something that has never been
said about the offerings on HBO Max,

people don't take
their f*cking eyes off this.

And now, this.

C-SPAN Callers Have Some Thoughts
on the Coronavirus

for the Second Most Patient Man
on Television.

I only have one question.
I wanted to know who was

the last politician or ex-politician
to visit China before this virus ?

- And why it matters ?
- I'm curious.

I've seen a guy walking down
through a grocery store

with a jockey strap on his face.

I practiced this physical distancing
long before anybody tried to tell me

I have to,
'cause I don't like people.

Sam is in Seattle, Washington.
Sam, go ahead.

- So, how you doing ?
- Fine, thanks. Go ahead.

What do you want to ask me ?

Richard in Maryland, hello.

I would just like to thank you
for taking my call, number one.

And number two, you
should go fuck yourself.

Who the hell is he ?
He's nobody.

He's just a man from Brooklyn, Fauci.
Who cares about what he thinks ?

He's the lead epidemiologist
on the White House task force team.

Who is he ? We've got somebody
at the door. Who's that ?

I hope you guys have a great day.
Stay safe and baba booey.

Moving on. Our main story
concerns facial recognition,

the thing that makes sure my iPhone
won't open unless it sees my face

or the face of any toucan,
but, that is it.

Facial recognition technology

has been showcased in TV shows
and movies for years.

Denzel Washington even discovered
a creative use for it

in the 2006 action movie "Déjà Vu".

- We have facial recognition software ?
- Yeah.

Let's use it on the bag. Cross-match it
to all the bags on the south side

in the 48 hours
leading up to the explosion !

Don't think
it's ever been used this way.

Look, same bag.

Bingo.

Bingo, indeed, Denzel !

With smart, believable plot
development like that,

it's no wonder "Déjà Vu" received
such glowing IMDb reviews as:

"An insult to anybody
who finished elementary school",

"Worst movie of all time",
"More like Deja Pooh."

And my personal favorite:
a one-star review that reads:

"Bruce Greenwood, as always,
is great and so sexy"

"and there's a cat who survives."

A review that was clearly written
either by Bruce Greenwood or that cat.

Technology behind facial recognition
has been around for years.

As it's grown more sophisticated,
its applications have expanded greatly.

For instance, it's no longer just
humans who can be the targets.

The iFarm sensor scans each fish

and uses automatic image processing
to uniquely identify each individual.

A number of symptoms are
recognized, including loser fish.

Yes, "loser fish",

which, by the way,
is an actual industry term.

That company says it can detect
which fish are losers by facial scan.

Which is important,
'cause, can you tell

which one of these fish is a loser
and which one is a winner ?

Are you sure about that ?
'Cause they're the same fish !

This is why you need a computer !

But the growth of facial recognition
and what it's capable of

brings with it a host of privacy
and civil liberties issues.

If you want a sense of how terrifying
this technology could be

if it becomes part of everyday life,

just watch as a Russian TV presenter
demonstrates an app called FindFace.

If you find yourself in a café
with an attractive girl

and you don't have
the guts to approach her, no problem.

All you need is a smartphone
and the application FindFace.

Find new friends,

take a picture

and wait for the result.

Now you're already looking
at her profile page.

Burn it all down.

Burn. Everything. Down.
I realize that this is a sentence

that no one involved in creating
that app ever once thought,

but imagine that
from a woman's perspective.

You're going about your day
when you get a random message

from a guy you don't know:
"Hello, I saw you in cafe earlier,"

"and used FindFace app to learn
your name and contact information."

"I'll pick you up from your place
at eight. I know where you live."

But one of the biggest users of
facial recognition is law enforcement.

Since 2011,

the FBI has logged more than


The databases law enforcement
are pulling from include

over 117 million American adults and
incorporate, among other things,

drivers' license photos
from residents of all of these states.

Roughly one in two of us have had
our photos searched this way.

And the police will argue
that this is all for the best.

An official with the London police
explaining why they use it there.

In London we've had
the London Bridge attack,

the Westminster Bridge attack.

The suspects involved, the people
who were guilty of those offenses,

were often known
by the authorities.

Had they been on some database,

had they been picked up
by cameras beforehand,

we may have been able
to prevent those atrocities

and that would definitely be a price
worth paying.

It's hard to come out against
the prevention of atrocities.

This show is, and always has been,
anti-atrocity.

But the key question there is,
what's the trade-off ?

If the police could guarantee
they could prevent all robberies,

but the only way to do that is
by having an officer stationed

in every bathroom watching you
every time you take a shit,

I'm not sure everyone
would agree that it's worth it.

The people who do

might want that for reasons
other than preventing crime.

Now is actually a very good time
to be looking at this issue.

Because there are serious concerns
that facial recognition

is being used to identify
Black Lives Matter protesters.

And if that's true, it wouldn't
actually be the first time,

as this senior scientist at Google,
Timnit Gebru, will tell you.

There was an example with Baltimore
police and the Freddie Gray marches

where they used face recognition
to identify protesters

and then they tried to link them up
with their social media profiles

and then target them for arrests.

Right now, a lot of people
are urging people

not to put images of protesters
on social media,

'cause there are people out there

whose job it is just to look up
these people and target them.

It's true.
During the Freddie Gray protests,

police officers used facial recognition
technology to look for people

with outstanding warrants
and arrest them.

Which is a pretty sinister way
to undermine the right to assemble.

So tonight, let's take a look
at facial recognition.

Let's start with the fact that even
as big companies like Microsoft,

Amazon and IBM
have been developing it,

and governments all over the world
have been happily rolling it out,

there haven't been rules or a framework
in place for how it is used.

In Britain, they've been experimenting
with "facial recognition zones",

putting signs up alerting
that you're about to enter one.

Which seems polite,
but watch what happens when one man

decided he didn't actually
want his face scanned.

This man didn't want to be
caught by the police cameras,

so he covered his face.

Police stopped him.
They photographed him anyway.

An argument followed.

What's your suspicion ?

The fact that he walked past
clearly marked "facial recognition"

and he covered his face.

So, I walked past like that.
It's a cold day as well.

I've just done that and the police
officers asked me to come with him.

I've got me back up.
I said to him: "Fuck off".

I've got now a 90 pound fine.
There you go. Look at that.

Thanks, lads. 90 pound.
Well done.

Yeah, that Guy Ritchie character
was rightly mad about that.

If you are not British
and you're looking at that man,

then at me, and wondering how
we both came from the same island,

let me explain.

British people
come in two variations:

so emotionally stunted that
they're practically comatose

and cheerfully telling
large groups of policemen to:

"Fuck off and do one if you're
gonna take a photo of me face !"

There's absolutely nothing
in between the two.

And the U.K. is by no means alone
in building out a system.

Australia is investing heavily
in a national facial biometric system

called "The Capability",

which sounds like the name
of a Netflix original movie.

Although that's actually perfect
if you want people to notice it,

think: "That seems interesting"
and then forget it ever existed.

And you don't have to imagine
what this technology would look like

in the hands
of an authoritarian government,

because China is unsurprisingly
embracing it in a big way.

We can match every face
with an ID card

and trace all your movements
back one week in time.

We can match your face with your car,
match you with your relatives

and the people
you're in touch with.

With enough cameras, we can
know who you frequently meet.

That is a terrifying
level of surveillance.

Imagine the Eye of Sauron,

but instead of scouring
Middle-earth for the one ring,

he was just really into knowing where
all of his Orcs like to go to dinner.

Some state-funded developers in China
seem weirdly oblivious

to just how sinister
their projects sound.

Skynet. What is that ?

"The Terminator" is
the favorite film of our founder.

So, they used the same name,

but they want to put
something good into this system.

In "The Terminator", Skynet is evil,
rains down death from the sky.

- But in China, Skynet is good.
- Yeah, that's the difference.

That's the difference, is it ?

It's not exactly reassuring
that you called your massive,

all-encompassing AI network
"Skynet, But a Good Version".

It'd be like if "The Today Show"
built a robot journalist

and called it
"Matt Lauer, But Good".

This one's completely different !

Sure, he does also have a button
under his office desk,

but all it does is release
lilac air freshener !

This is the good version.

This technology raises

troubling philosophical questions
about personal freedom.

And right now, there are also some
very immediate, practical issues.

Because even though
it is currently being used,

this technology is still
very much a work in progress.

Its error rate is particularly high

when it comes to matching faces
in real time.

In the U.K., when human rights
researchers watched police

put one such system to the test,

they found that only eight out
of 42 matches were verifiably correct.

That's even before we get into
the fact that these systems

can have some worrying blind spots,
as one MIT researcher found out

when testing out algorithms, including
Amazon's "Rekognition" system.

At first glance, MIT researcher
Joy Buolamwini says

the overall accuracy rate was high,
even though all companies

better detected and identified
men's faces than women's.

But the error rate
grew as she dug deeper.

Lighter male faces were
the easiest to guess the gender on

and darker female faces
were the hardest.

One system couldn't even detect
if she had a face

and the others
misidentified her gender.

White guy, no problem.

Yeah, "White guy, no problem."
The unofficial motto of history.

But it's not like what we needed
right now

was for computers to somehow
find a way to exacerbate the problem.

And it gets worse. In one test,

Amazon's system even failed
on the face of Oprah Winfrey,

someone so recognizable
her magazine only had to type

the first letter of her name
and your brain autocompleted the rest.

A federal study of more than a hundred
facial recognition algorithms

found that Asian
and African American people

were up to 100 times more likely
to be misidentified than white men.

So, that is clearly concerning.
And on top of all this,

some law enforcement agencies
have been using these systems

in ways they weren't
exactly designed to be used.

In 2017, police were looking
for this beer thief.

The surveillance image
wasn't clear enough

for facial recognition software
to identify him.

So instead,
police used a picture of a lookalike,

which happened to be
actor Woody Harrelson.

That produced names of several
possible suspects and led to an arrest.

They used a photo of Woody Harrelson
to catch a beer thief !

And how dare you drag
Woody Harrelson into this ?

This is the man that once got drunk
at Wimbledon in this magnificent hat,

made this facial expression
in the stands, and in doing so,

accidentally made tennis
interesting for a day.

He doesn't deserve prison for that,
he deserves the Wimbledon trophy.

And there have been multiple instances
where investigators

have had such confidence in a match,
they've made disastrous mistakes.

A few years back, Sri Lankan
authorities mistakenly targeted

this Brown University student
as a suspect in a heinous crime,

which made for a pretty awful
finals week.

On the morning of April 25th,
in the midst of finals season,

I woke up in my dorm room
to 35 missed calls,

all frantically informing me
that I had been falsely identified

as one of the terrorists involved
in the recent Easter attacks

in my beloved motherland, Sri Lanka.

That's terrible. Finals week
is already bad enough,

alternating shots of Five Hour Energy
and Java Monster Mean Bean

to remember the differences between
Baroque and Rococo architecture,

without waking up to find out that
you've also been accused of terrorism

because a computer sucks at faces.

On the one hand, these technical issues
could get smoothed out over time.

But, even if this technology
eventually becomes perfect,

we should really be asking ourselves
how much we're comfortable

with it being used, by police,
by governments, by companies,

or indeed by anyone.

We should be asking that right now,

because we're about
to cross a major line.

For years, tech companies approached
facial recognition with caution.

In 2011, the then-chairman of Google
said it was the one technology

the company had held back, because
it could be used "in a very bad way".

And think about that, it was too
Pandora's Box-y for Silicon Valley,

the world's most enthusiastic
Pandora's Box openers.

Even some of the big companies that
developed facial recognition algorithms

have designed it for use
on limited data sets,

like mug sh*ts
or drivers' license photos.

But now,
something important has changed.

And it is because of this guy,
Hoan Ton-That,

and his company, Clearview AI.

I'll let him describe what it does.

Quite simply, Clearview is basically
a search engine for faces.

Anyone in law enforcement
can upload a face to the system

and it finds other publicly available
material that matches that face.

So the key phrase there
is "publicly available material".

Because Clearview says it's collected
a database of three billion images,

that is larger than any other facial
recognition database in the world.

And it's done that by scraping them
from public-facing social media,

like Facebook, LinkedIn, Twitter
and Instagram.

So, for instance, Clearview's system
would theoretically include

this photo of Ton-That
at what appears to be Burning Man,

or this one, of him wearing a suit
from the exclusive

"Santa Claus After Dark Collection"
at Men's Wearhouse.

And this very real photo of him
shirtless and lighting a cigarette

with blood-covered hands,
his profile photo on Tidal,

because yes, of course
he's also a musician.

I can only assume that that's the cover
of an album called:

"Automatic Skip If This Ever
Comes Up On A Pandora Station".

Ton-That's willingness to do
what others have not been willing to do

and that is scrape
the whole internet for photos,

has made his company a genuine
game-changer in the worst way.

Watch as he impresses a journalist
by running a sample search.

So, here's the photo you uploaded
of me. A headshot from CNN.

So, first few images it's found,

it's found a few different versions
of that same picture.

But now as we scroll down,
we're starting to see pictures of me

that are not
from that original image.

My god.

So, this photograph is from my local
newspaper where I lived in Ireland.

And this photo would've been taken
when I was like 16.

That's crazy.

Yeah, it is.

If there is an embarrassing photo
of you from when you were a teenager,

don't run away from it.

Make it the center of your TV show's
promotional campaign and own it.

Use the fact that your teenage years
were a hormonal Stalingrad.

Harness the pain.

But the notion that someone
can take your picture

and immediately find out
everything about you is alarming enough

even before you discover
that over 600 law enforcement agencies

have been using
Clearview's service.

And you're probably in that database,
even if you don't know it.

If a photo of you has been
uploaded to the internet,

there is a decent chance
that Clearview has it.

Even if someone uploaded it
without your consent,

even if you untagged yourself or later
set your account to "private".

If you think:

"Isn't this against the terms
of service for internet companies ?"

Clearview actually received
cease-and-desist orders

from Twitter, YouTube
and Facebook earlier this year.

But it has refused to stop, arguing

that it has a First Amendment right
to harvest data from social media.

Which is just not at all
how the First Amendment works.

You might as well argue that you have
an Eighth Amendment right

to dress up rabbits
like John Lennon.

That amendment does not cover
what I think you think it does.

And yet, Ton-That insists
that this was all inevitable,

so we should all frankly be glad
that he's the one who did it.

I think the choice now

is not between, like, no facial
recognition and facial recognition.

It's between bad facial recognition
and responsible facial recognition.

And we want to be
in the responsible category.

Well, sure, you want to be.
But are you ?

Because there are
a lot of red flags here.

Apps he developed before this
included one called "Trump Hair",

which would just add Trump's hair
to a user's photo

and another called "ViddyHo"
that phished its own users,

tricking them into sharing access
to their Gmail accounts

and then spamming
their contacts.

I'm not sure that I would want
to trust my privacy to this guy.

If I was looking for someone to build
an app that let me

put Ron Swanson's mustache on my
face as my account was drained,

then he'd be the top of my list.

Despite Clearview's
repeated reassurances

that its product is intended
only for law enforcement,

as if that is inherently a good thing,

he's already put it in a lot
of other people's hands.

In addition to users
like the DEA and the FBI,

he's also made it available
to employees at Kohl's, Walmart

and Macy's, which alone has completed
more than 6,000 facial searches.

And it gets worse, they've reportedly
tried to pitch their service

to congressional candidate
and white supremacist Paul Nehlen,

suggesting they could help him
use "unconventional databases"

for "extreme opposition research",
which is a terrifying series of words

to share a sentence with
"white supremacist".

Clearview says that
that offer was "unauthorized",

but when questioned about who else
he might be willing to work with,

Ton-That's answer
hasn't been reassuring.

There's countries I would never sell to
that are very adverse to the U.S.

- For example ?
- Like China and Russia.

Iran, North Korea. Those are the things
that are definitely off the table.

What about countries that think

that being gay should be illegal,
it's a crime ?

Like I said, we want to make sure
that we do everything correctly,

mainly focus on the U.S.
and Canada.

And the interest has been
overwhelming.

Just so much interest,
we're taking it one day at a time.

Yeah,
that's not terribly comforting !

When you ask a farmer if he'd
let foxes into the hen house,

the answer you hope for
is "No",

not: "The interest from foxes
has been overwhelming,"

"just so much interest,
we're taking it one day at a time."

Reporters for Buzzfeed have found
that Clearview has offered its services

to entities in Saudi Arabia
and the United Arab Emirates,

countries that view human rights
laws with the same level of respect

that Clearview seems to have
for Facebook's terms of service.

So, facial recognition technology
is already here.

The question is,
what can we do about it ?

Some are trying to find ways
to thwart the cameras themselves.

Hi guys. It's me, Jillian, again,
with a new makeup tutorial.

Today's topic
is how to hide from cameras.

First, that's probably
not a scalable solution,

and second, I'm not sure if that
makes you less identifiable

or the most identifiable
person on earth.

Officers are on the lookout
for a young woman,

dark hair, medium build, looks like
a mime who went through a shredder.

What we really need to do

is to put limits on how
this technology can be used.

Some locations
have laws in place already.

San Francisco banned
facial recognition last year,

but the scope of that is limited
to city law enforcement.

It doesn't affect state and
federal use or private companies.

Illinois has a law requiring
companies to obtain written permission

before collecting fingerprints,
facial scans

or other identifying biological
characteristics and that is good.

We also need a comprehensive,
nation-wide policy. We need it now.

There are worries that it is being used
in the protests that we are seeing now.

And the good news
is that just this week,

thanks to those protests
and to years of work by activists,

some companies
did pull back from facial recognition.

IBM says they'll no longer
develop facial recognition.

Amazon said it was putting a one-year
hold on working with law enforcement.

Microsoft said it wouldn't sell
its technology to police

without federal regulation.

But, there is nothing to stop those
companies from changing their mind

if people's outrage dies down.

While Clearview says
it's canceling its private contracts,

it's also said it will keep working
with the police,

just as it will keep harvesting
your photos from the internet.

If Clearview is gonna keep grabbing
our photos, at the very least,

there may be a way to let them know
what you think about that.

So, the next time you feel
the need to upload a photo,

maybe throw in an extra one
for them to collect.

Maybe hold up a sign that says:
"These photos were taken unwillingly"

"and I'd rather
you not be looking at them."

Or, if that feels too complicated,
just: "Fuck Clearview".

That really does
get the message across.

These photos are often being searched
by law enforcement,

so take this opportunity

to talk to the investigators
looking through your photos.

Maybe something like:
"I don't look like Woody Harrelson,"

"but while I have your attention,
defund the police."

Whatever you feel is most important
to tell them, you should put on a sign.

That's our show. We'll see you
next week. Good night !

"I Am Not Steve Mnuchin".