SPYING ON AMERICANS BY THOSE WHO SHOULD
BE HELPING US
COMPILATION AND COMMENTARY
BY LUCY WARNER
JUNE 14, 2020
FACIAL RECOGNITION TECHNOLOGY SOUNDS EFFICIENT, BUT MORE THAN ONCE IT HAS LABELED IMAGES OF BLACK PEOPLE AS GORILLAS. ALSO, ACCORDING TO THE ARTICLES BELOW, WOMEN ARE NOT RECOGNIZED AS ACCURATELY AS MEN. WHETHER THE RACISM THERE IS INTENTIONAL OR NOT, IT IS STILL PRESENT, OR WAS AS LATE AS 2018. GOOGLE HAD THREE YEARS IN WHICH TO CORRECT ITS ALGORITHMS, AND EITHER COULDN’T DO IT OR SIMPLY DIDN’T.
THERE ARE AT LEAST TWO WEIGHTY REASONS WHY THIS IS A PROBLEM. THE FIRST IS THE RACIST POLICING THAT CAN OCCUR AS A RESULT. THE SECOND IS THAT, AS TECHNICALLY FANCY AS IT IS, THE TECHNOLOGY IS DANGEROUSLY INACCURATE AND DIRECTLY AFFECTS THE APPLICATION OF JUSTICE IN AMERICA, NOT TO MENTION THE MATTER OF EXPENSE. RETROFITTING A CITY WITH SURVEILLANCE CAMERAS HAS TO COST A HECK OF A LOT OF MONEY. FOR THAT MONEY YOU COULD HIRE MORE HUMAN BEINGS INSTEAD OF PUTTING THEM OUT OF A JOB.
THERE’S ONE MORE REASON IN MY BOOK TO BAN THESE CAMERAS. IT IS A VIOLATION OF PERSONAL FREEDOM TO BE CONSTANTLY AND UNOBTRUSIVELY SURVEILLED AS WE WALK DOWN THE STREET OR GO INTO A GROCERY STORE. EVEN AS A WHITE PERSON I’M VERY UNCOMFORTABLE WITH IT. I DON’T USUALLY PARTICIPATE IN POLITICAL DEMONSTRATIONS, BUT I HAVE, AND I AM LIKELY TO AGAIN. PROTEST IS AS IMPORTANT A CHECK ON OVERREACHING POLITICAL FORCES AS THE FREE PRESS IS. LIKE SO MANY “CONSERVATIVES,” PRESIDENT TRUMP HAS TREATED THE PRESS AS “THE ENEMY OF THE PEOPLE,” AND HAS SPOKEN OF THE RECENT DEMONSTRATIONS OVER THE TOTALLY UNNECESSARY DEATH OF GEORGE FLOYD AT THE HANDS OF POLICE AS “WAR.”
ON THE SUBJECT OF FACIAL RECOGNITION, SEE THE ARTICLES BELOW:
*https://www.cnet.com/news/why-facial-recognitions-racial-bias-problem-is-so-hard-to-crack/
[THE CNET ANALYSIS OF THE TECHNICAL PROBLEM IS PRESENTED SECOND, BELOW.]
HERE IS A REPORT BY COMMON DREAMS, ONE OF MY FAVORITE
SOURCES FOR BERNIE SANDERS INFORMATION BECAUSE THEY ARE BOTH ACCURATE AND
DISTINCTLY FRIENDLY TO PROGRESSIVE IDEAS. BERNIE, AS USUAL, HAS HIS FINGER ON
THE PULSE OF LIFE IN AMERICA IN YET ANOTHER IMPORTANT AREA OF CONCERN. WHETHER
OR NOT HE IS IN THE RUNNING FOR PRESIDENT IN 2020, I STILL “FEEL THE BERN.” I
DO WISH THE AVERAGE POLITICIAN WERE AS INTELLIGENT, ALERT AND CARING AS HE IS.
Published on Thursday, June 11, 2020 by Common Dreams
Bernie Sanders Calls on Congress to 'Ban Facial Recognition Technology for All Policing'
"Facial recognition technology violates the privacy and civil liberties of Americans, and deepens racial bias in policing."
by Jake Johnson, staff writer
PHOTOGRAPH -- A U.S. Customs and Border Protection officer instructs an international traveler to look into a camera as he uses facial recognition technology to screen travelers entering the United States on February 27, 2018 at Miami International Airport in Miami, Florida. (Photo: Joe Raedle/Getty Images)
Sen. Bernie Sanders on Thursday called on Congress to
enact a total ban on police use of facial recognition technology after
Microsoft's president said the company—following a sustained outside pressure
campaign—will not sell its surveillance software to law enforcement until
stricter privacy regulations are implemented.
"Facial recognition technology violates the privacy
and civil liberties of Americans, and deepens racial bias in policing,"
the Vermont senator tweeted. "Congress must ban facial recognition
technology for all policing."
It's not the first time Sanders has made the demand, as a
proposal to "ban the use of facial recognition software for policing"
was included in his 2020 presidential campaign's criminal justice reform
agenda.
Bernie Sanders (@SenSanders): "Facial recognition technology violates the privacy and civil liberties of Americans, and deepens racial bias in policing. Congress must ban facial recognition technology for all policing." https://twitter.com/ACLU/status/1271123787065503746
Quoted tweet from the ACLU (@ACLU): "BREAKING: Microsoft just announced it will not sell face recognition technology to the police."
1:42 PM - Jun 11, 2020
Sanders' demand came after Microsoft president Brad Smith
said during an event hosted by the Washington Post that his company "will
not sell facial-recognition technology to police departments in the United
States until we have a national law in place, grounded in human rights that
will govern this technology." Smith
clarified that Microsoft does not currently provide facial recognition
technology to U.S. police.
As Common Dreams reported Thursday, Amazon—which has
been selling its facial recognition product to law enforcement for years—also
committed to temporarily bar police from using its notoriously inaccurate
Rekognition software.
Matt Cagle, technology and civil liberties attorney with
the ACLU of Northern California, said in
a statement that "when even the makers of face recognition refuse to
sell this surveillance technology because it is so dangerous, lawmakers can no
longer deny the threats to our rights and liberties."
"Congress and legislatures nationwide must
swiftly stop law enforcement use of face recognition," said Cagle.
"It should not have taken the police killings of George Floyd, Breonna
Taylor, and far too many other black people, hundreds of thousands of people
taking to the streets, brutal law enforcement attacks against protesters and
journalists, and the deployment of military-grade surveillance equipment on
protests led by black activists for these companies to wake up to the
everyday realities of police surveillance for black and brown communities."
Our work is licensed under a Creative Commons
Attribution-Share Alike 3.0 License. Feel free to republish and share widely.
THIS TECHNICAL ARTICLE IS VERY HELPFUL IN UNDERSTANDING WHAT IS GOING ON WITH AI AND THE AMERICAN NORMALIZATION OF SURVEILLANCE, TWO SEPARATE BUT EQUALLY SERIOUS ISSUES FOR MODERN SOCIETY. IF YOU HAVEN’T READ GEORGE ORWELL’S TRULY GREAT, TRAGIC DYSTOPIAN THRILLER “1984,” GO CHECK IT OUT OF THE LIBRARY NOW.
ON THE DANGERS OF AUTOMATED THOUGHT, READ ON.
Why facial recognition's racial bias problem is so hard
to crack
Good luck if you're a woman or a darker-skinned person.
Queenie Wong
March 27, 2019 5:00 a.m. PT
IMAGES -- DIAGRAMMATIC IMAGES SHOWING HOW COMPUTERS MAKE THEIR FACIAL ANALYSES. Matthias Graben/Getty Images
Jimmy Gomez is a California Democrat, a Harvard graduate
and one of the few Hispanic lawmakers serving in the US House of
Representatives.
But to Amazon's facial recognition system, he looks like
a potential criminal.
Gomez was one of 28 US Congress members falsely matched
with mugshots of people who've been arrested, as part of a test the American
Civil Liberties Union ran last year of the Amazon Rekognition program.
Nearly 40 percent
of the false matches by Amazon's tool, which is being used by police, involved
people of color.
This is part of a CNET special report exploring the
benefits and pitfalls of facial recognition.
James Martin/CNET
The findings reinforce a growing concern among civil
liberties groups, lawmakers and even some tech firms that facial recognition
could harm minorities as the technology becomes more mainstream. A form of the
tech is already being used on iPhones and Android phones, and police,
retailers, airports and schools are slowly coming around to it too. But studies
have shown that facial recognition systems have a harder time identifying women
and darker-skinned people, which could lead to disastrous false positives.
"This is an example of how the application of
technology in the law enforcement space can cause harmful consequences for
communities who are already overpoliced," said Jacob Snow, technology and
civil liberties attorney for the ACLU of Northern California.
Facial recognition has its benefits. Police in Maryland
used the technology to identify a suspect in a mass shooting at the Capital
Gazette. In India, it's helped police identify nearly 3,000 missing children
within four days. Facebook uses the technology to identify people in photos for
the visually impaired. It's become a convenient way to unlock your smartphone.
But the technology isn't perfect, and there've been some
embarrassing public blunders. Google Photos once labeled two black people as
gorillas. In China, a woman claimed that her co-worker was able to unlock her
iPhone X using Face ID. The stakes of being misidentified are heightened when
law enforcement agencies use facial recognition to identify suspects in a crime
or unmask people in a protest.
"When you're selling [this technology] to law
enforcement to determine if that individual is wanted for a crime, that's a
whole different ball game," said Gomez. "Now you're creating a
situation where mistaken identity can lead to a deadly interaction between law
enforcement and that person."
The lawmaker wasn't shocked by the ACLU's findings,
noting that tech workers are often thinking more about how to make something
work and not enough about how the tools they build will impact minorities.
VIDEO (5:11) -- Watch this: Facial recognition: Get to know the tech that gets to...
Tech companies have responded to the criticism by
improving the data used to train their facial recognition systems, but like
civil rights activists, they're also calling for more government regulation to
help safeguard the technology from being abused. One in two American adults is
in a facial recognition network used by law enforcement, researchers at
Georgetown Law School estimate.
Amazon pushed back against the ACLU study, arguing that
the organization used the wrong settings when it ran the test.
"Machine learning is a very valuable tool to help
law enforcement agencies, and while being concerned it's applied correctly,
we should not throw away the oven because the temperature could be set wrong
and burn the pizza," Matt Wood, general manager of artificial
intelligence at Amazon Web Services, said in a blog post.
Recognition problem
There are various reasons why facial recognition services
might have a harder time identifying minorities and women compared with white
men.
PHOTOGRAPH -- [AN ASIAN] businessman using face recognition outdoors. Facial recognition technology has a harder time identifying people with darker skin, studies show. Getty Images
Public photos that tech workers use to train computers to recognize faces could include more
white people than minorities, said Clare Garvie, a senior associate at
Georgetown Law School's Center on Privacy and Technology. If a company uses
photos from a database of celebrities, for example, it would skew toward white
people because minorities are underrepresented in Hollywood.
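[AS AN ILLUSTRATION OF GARVIE'S POINT, HERE IS A MINIMAL PYTHON SKETCH, WITH INVENTED FILE NAMES AND LABELS, OF THE KIND OF SIMPLE AUDIT THAT SHOWS WHEN A TRAINING SET SKEWS TOWARD ONE GROUP.]

# Hypothetical audit: count how many labeled training photos fall in each
# demographic group; a lopsided count is one source of biased models.
from collections import Counter

training_photos = [
    {"file": "img_001.jpg", "group": "lighter-skinned"},
    {"file": "img_002.jpg", "group": "lighter-skinned"},
    {"file": "img_003.jpg", "group": "lighter-skinned"},
    {"file": "img_004.jpg", "group": "darker-skinned"},
]

counts = Counter(photo["group"] for photo in training_photos)
total = sum(counts.values())
for group, count in counts.most_common():
    print(f"{group}: {count} photos ({count / total:.0%} of the training set)")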
Engineers at tech companies, which are made up of mostly
white men, might also be unwittingly designing the facial recognition systems
to work better at identifying certain races, Garvie said. Studies have shown
that people have a harder time recognizing faces of another race and that
"cross-race bias" could be spilling into artificial intelligence.
Then there are challenges dealing with the lack of color contrast on darker
skin, or with women using makeup to hide wrinkles or wearing their hair
differently, she added.
Facial recognition systems made by Microsoft, IBM and
Face++ had a harder time identifying the
gender of dark-skinned women like African-Americans compared with white men,
according to a study conducted by researchers at the MIT Media Lab. The gender
of 35 percent of dark-skinned women was misidentified compared with 1 percent
of light-skinned men such as Caucasians.
Another study by MIT, released in January, showed that
Amazon's facial recognition technology had an even harder time than tools by
Microsoft or IBM identifying the gender of dark-skinned women.
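[TO MAKE THOSE PERCENTAGES CONCRETE, HERE IS A MINIMAL PYTHON SKETCH, NOT THE MIT RESEARCHERS' ACTUAL CODE, OF HOW GENDER-CLASSIFICATION ERRORS ARE BROKEN OUT BY DEMOGRAPHIC SUBGROUP. ALL RECORDS AND LABELS BELOW ARE INVENTED FOR ILLUSTRATION.]

# Hypothetical disaggregated evaluation: error rate per subgroup, the kind
# of breakdown behind figures like "35 percent vs. 1 percent misidentified."
from collections import defaultdict

# Each invented record: (subgroup, true_gender, predicted_gender)
results = [
    ("darker-skinned woman", "female", "male"),
    ("darker-skinned woman", "female", "female"),
    ("lighter-skinned man",  "male",   "male"),
    ("lighter-skinned man",  "male",   "male"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for subgroup, truth, predicted in results:
    totals[subgroup] += 1
    if predicted != truth:
        errors[subgroup] += 1

for subgroup in sorted(totals):
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: {rate:.0%} misclassified "
          f"({errors[subgroup]} of {totals[subgroup]})")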
The role of tech companies
Amazon disputed the results of the MIT study, and a
spokeswoman pointed to a blog post that called the research
"misleading." Researchers used "facial analysis" that
identifies characteristics of a face such as gender or a smile, not facial
recognition that matches a person's face to similar faces in photos or videos.
"Facial analysis and facial recognition are
completely different in terms of the underlying technology and the data used to
train them," Wood said in a blog post about the MIT study. "Trying to
use facial analysis to gauge the accuracy of facial recognition is ill-advised,
as it's not the intended algorithm for that purpose."
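[THE DISTINCTION WOOD IS DRAWING CAN BE SKETCHED IN A FEW LINES OF PYTHON. THE TOY EMBEDDINGS AND FUNCTION NAMES BELOW ARE INVENTED AND ARE NOT REKOGNITION'S API: FACIAL ANALYSIS ESTIMATES ATTRIBUTES OF A SINGLE FACE, WHILE FACIAL RECOGNITION COMPARES A FACE AGAINST A GALLERY OF KNOWN IDENTITIES.]

import math

def cosine_similarity(a, b):
    # Similarity of two face embeddings (toy 3-number vectors here).
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def analyze_face(embedding):
    # "Facial analysis": guess attributes of one face; no identity involved.
    # The smile rule here is a stand-in, keyed to one embedding dimension.
    return {"smiling": embedding[0] > 0.5}

def recognize_face(probe, gallery, threshold=0.9):
    # "Facial recognition": best match against a gallery of known faces,
    # or None if nothing clears the similarity threshold.
    best_name, best_score = None, threshold
    for name, known in gallery.items():
        score = cosine_similarity(probe, known)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

gallery = {"person_a": [0.9, 0.1, 0.3], "person_b": [0.2, 0.8, 0.5]}
probe = [0.88, 0.12, 0.31]  # made-up embedding of an unidentified face

print(analyze_face(probe))              # attribute estimate only
print(recognize_face(probe, gallery))   # identity match: person_a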
That's not to say the tech giants aren't thinking about
racial bias.
Microsoft, which offers a facial recognition tool through
Azure Cognitive Services, said last year that it reduced the error rates for
identifying women and darker-skinned men by up to 20 times.
IMAGE -- IBM made a million-face dataset to help reduce bias in facial recognition technology. IBM Research
A spokesperson for Facebook, which uses facial recognition
to tag users in photos, said that the company makes sure the data it uses is
"balanced and reflect the diversity of the population of Facebook."
Google pointed to principles it published about artificial intelligence, which
include a prohibition against "creating or reinforcing unfair bias."
Aiming to advance the study of fairness and accuracy in
facial recognition, IBM released a data set for researchers in January called
Diversity in Faces, which looks at more than just skin tone, age and gender.
The data includes 1 million images of human faces, annotated with tags such as
face symmetry, nose length and forehead height.
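[FOR A SENSE OF WHAT SUCH ANNOTATIONS MIGHT LOOK LIKE, HERE IS A SHORT PYTHON SKETCH. THE FIELD NAMES AND NUMBERS ARE INVENTED AND ARE NOT IBM'S ACTUAL SCHEMA.]

# Hypothetical per-image records shaped like the annotations described
# above (face symmetry, nose length, forehead height), plus one example of
# checking how evenly a measurement is covered across the dataset.
annotations = [
    {"image_id": "img_00001", "face_symmetry": 0.92,
     "nose_length": 48.1, "forehead_height": 61.3},
    {"image_id": "img_00002", "face_symmetry": 0.85,
     "nose_length": 52.7, "forehead_height": 55.0},
]

long_noses = sum(1 for a in annotations if a["nose_length"] >= 50)
print(f"{long_noses} of {len(annotations)} faces have nose_length >= 50")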
"We have all these subjective and loose notions of
what diversity means," said John Smith, lead scientist of Diversity in
Faces at IBM. "So the intention for IBM to create this data set was to dig
into the science of how can we really measure the diversity of faces."
The company, which collected the images from the photo
site Flickr, faced criticism this month from some photographers, experts and
activists for not informing people their images were being used to improve
facial recognition technology. In response, IBM said it takes privacy seriously
and users could opt out of the data set.
Amazon has said that it uses training data that reflects
diversity and that it's educating customers about best practices. In February,
it released guidelines it says lawmakers should take into account as they
consider regulation.
"There should be open, honest and earnest dialogue
among all parties involved to ensure that the technology is applied
appropriately and is continuously enhanced," Michael Punke, Amazon's vice
president of global public policy, said in a blog post.
Clear rules needed
Even as tech companies strive to improve the accuracy of
their facial recognition technology, concerns that the tools could be used
to discriminate against immigrants or minorities aren't going away. In part
that's because people still wrestle with bias in their personal lives.
Law enforcement and government could still use the
technology to identify political protestors or track immigrants,
putting their freedom at risk, civil rights groups and experts argue.
PHOTOGRAPH -- US Rep. Jimmy Gomez, a California Democrat,
was one of 28 US Congress members falsely matched by Amazon's facial
recognition system with mugshots of people who've been arrested, a study by the
ACLU showed. 2017 Los Angeles Times, Al Seib/Getty Images
"A perfectly accurate system also becomes an incredibly
powerful surveillance tool," Garvie said.
Civil rights groups and tech firms are calling for the
government to step in.
"The only effective way to manage the use of
technology by a government is for the government proactively to manage this use
itself," Microsoft President Brad Smith wrote in a blog post in July.
"And if there are concerns about how a technology will be deployed more
broadly across society, the only way to regulate this broad use is for the
government to do so."
The ACLU has called on lawmakers to temporarily prohibit
law enforcement from using facial recognition technology. Civil rights groups
have also sent a letter to Amazon asking that it stop providing Rekognition to
the government.
Some lawmakers and tech firms, such as Amazon, have asked
the National Institute of Standards and Technology, which evaluates facial
recognition technologies, to endorse industry standards and ethical best
practices for racial bias testing of facial recognition.
For lawmakers like Gomez, the work has only begun.
"I'm not against Amazon," he said. "But
when it comes to a new technology that can have a profound impact on people's
lives -- their privacy, their civil liberties -- it raises a lot of questions."
PHOTOGRAPHS OF REPRESENTATIVE EXAMPLES -- Security cameras with facial recognition tech inside
**** **** **** ****