What Most Alarms Portland City Officials About Facial Recognition Software Are the Faces It Can’t Recognize

The city’s top reason for issuing the ban: The software is racist.

In Portland, Big Brother can't recognize you. It's illegal.

On Sept. 9, Portland became the first city in the nation to ban corporate use of facial recognition software. The Portland City Council passed two bans: one stops city government from using such software, and the other bars private companies from scanning faces in public places.

As Hector Dominguez explains, the programs that analyze faces haven't been trained on enough examples of women and people of color to reliably distinguish them. In effect, the algorithm behaves like a white security guard who thinks Black people look alike. Racial bias was built into the software by withholding information from it.

Dominguez, the city's open data coordinator, says that problem—which would lead to false identifications of people of color—alarmed city officials enough to trigger an outright ban.

In an interview with WW editor and publisher Mark Zusman, Dominguez discusses the nature of that concern, how the ban will be enforced, and why Amazon stopped by City Hall to argue for less regulation.