Using Amazon's commercially available Rekognition software -- running on smartphones strapped to our heads -- our team ran 13,732 biometric face scans in Washington, DC. By comparing live footage against a database we had assembled, the system successfully identified a member of Congress in real time: Representative Mark DeSaulnier of California.
Amazon's facial recognition software also thought that it had identified 7 journalists and 25 Amazon lobbyists whom we had pre-loaded into the database. But all of those matches turned out to be incorrect. The software even thought it spotted singer Roy Orbison, who of course has been deceased since 1988 (RIP).
This underscores our message: facial recognition is invasive and dangerous when it works, but it's also dangerous when it doesn't work. In our case, it's easy to laugh when the software thinks a member of our team is an Amazon lobbyist, or when it thinks a random staffer is a prominent journalist. But law enforcement agencies are using flawed facial recognition software right now -- and the potential harm of a mismatch is staggering. It could land an innocent person in prison, or worse. And current facial recognition algorithms exhibit systemic racial bias, exacerbating existing forms of discrimination in our criminal justice system. [...]
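The numbers reported above can be sanity-checked with some back-of-the-envelope arithmetic. This sketch assumes all 13,732 scans were compared against the same pre-loaded watchlist, and that the 7 journalist hits, 25 lobbyist hits, and the Roy Orbison hit were the incorrect matches; the million-scan deployment at the end is purely hypothetical.

```python
# Observed false-match rate from the figures quoted in the post.
scans = 13_732
false_matches = 7 + 25 + 1  # journalists + lobbyists + Roy Orbison

false_match_rate = false_matches / scans
print(f"Observed false-match rate: {false_match_rate:.2%}")  # ~0.24%

# Even a small per-scan rate compounds at scale. A hypothetical
# city-wide deployment running 1 million scans a day at this rate:
daily_scans = 1_000_000
expected_daily_false_matches = daily_scans * false_match_rate
print(f"Expected false matches per day: {expected_daily_false_matches:.0f}")
```

A rate that sounds negligible per scan still produces thousands of wrong matches a day once the system is pointed at a whole city, which is the crux of the "dangerous when it doesn't work" argument.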
After several hours of scanning thousands of faces, our team of activists was approached by Capitol Hill police and threatened with arrest if they did not leave the Capitol grounds. They were thrown out not because they were using facial recognition surveillance -- that's perfectly legal until Congress gets off its butt and passes laws to ban it -- but because police claimed they were violating a law against blocking passageways. Of course, they weren't blocking passageways, and we have the whole thing on video to prove it. It seems that Congress thinks facial recognition surveillance is just fine as long as it's used on all of us, but not on them.
Previously, previously, previously, previously, previously, previously, previously.
I'd really be interested to see what the Roy Orbison false positive looked like.
I wish they had taken the Teletubbies resemblance just a little further. The eventual viral video with the cops would be that much better.
"It would be pretty easy to use it to get cars to drive straight into walls"
LOL, the systemic racial bias they cite is about how non-whites have higher odds of flying under the radar.
The hypothetical targeted marginalized population they're so worried about is literally white nationalists.
If by "flying under the radar" you mean "minorities are disproportionately likely to be misidentified, detained, questioned, and otherwise treated as criminal suspects," then your statement is factual, but is this news?
Many Facial-Recognition Systems Are Biased, Says U.S. Study, New York Times, Dec. 19, 2019
I hope they can finally find Elvis too!
When I first learned about facial recognition I thought it was a great idea: they get a match, they check it out, and they find out. A few people are inconvenienced for everyone's good. Then it occurred to me that the false positives are not spread out randomly. Some unlucky souls are going to have to show their ID 10 times a day. They'll have to leave an hour early for everywhere they go.
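The commenter's point, that false positives cluster on particular faces instead of spreading evenly, can be sketched with some simple expected-value arithmetic. Every number below is made up for illustration: the per-scan false-match probabilities and the scan counts are hypothetical, not from the post.

```python
# Expected daily stops = scans passed per day x per-scan false-match
# probability. All probabilities are hypothetical, for illustration only.
scans_per_day = 20        # checkpoints/cameras a commuter might pass daily
p_typical = 0.001         # assumed per-scan false-match odds for most faces
p_lookalike = 0.05        # assumed odds for a face resembling a watchlist entry

typical_daily = scans_per_day * p_typical      # 0.02 stops/day
lookalike_daily = scans_per_day * p_lookalike  # 1 stop/day, every day

print(f"Typical face:   ~{typical_daily * 365:.0f} false stops per year")
print(f"Lookalike face: ~{lookalike_daily * 365:.0f} false stops per year")
```

Under these toy numbers the average person is stopped a handful of times a year while the unlucky lookalike is stopped roughly daily, which is exactly the "show their ID 10 times a day" scenario the comment describes.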