The Think Species
Social and Cultural Essays
(CD) — Privacy advocates are responding with alarm to Amazon’s claim this week that the controversial cloud-based facial recognition system the company markets to law enforcement agencies can now detect “fear” in the people it targets.
“Amazon is going to get someone killed by recklessly marketing this dangerous and invasive surveillance technology to governments,” warned Evan Greer, deputy director of the digital rights group Fight for the Future, in a statement Wednesday.
Amazon Web Services detailed new updates to its system—called Rekognition—in an announcement Monday:
With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’, and ‘Confused’) and added a new emotion: ‘Fear’. Lastly, we have improved age range estimation accuracy; you also get narrower age ranges across most age groups.
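The announcement above refers to Rekognition’s face-analysis API, which returns a list of emotion labels with confidence scores for each detected face. The sketch below is purely illustrative: the sample response is hand-written to mirror the documented `DetectFaces` response shape, not real API output, and a live call would require AWS credentials and boto3 (`client.detect_faces(Image=..., Attributes=["ALL"])`).

```python
# Hand-written stand-in mirroring the shape of a Rekognition DetectFaces
# response; the numbers are invented for illustration.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "FEAR", "Confidence": 91.2},
                {"Type": "CALM", "Confidence": 5.1},
                {"Type": "CONFUSED", "Confidence": 3.7},
            ],
            "AgeRange": {"Low": 23, "High": 31},
        }
    ]
}

def top_emotion(face_detail):
    """Return the (label, confidence) pair the model is most confident in."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

for face in sample_response["FaceDetails"]:
    label, conf = top_emotion(face)
    print(label, conf)  # → FEAR 91.2
```

Note that the critics quoted in this piece are objecting precisely to treating a confidence score like the one above as evidence of a person’s actual emotional state.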
Pointing to research on the technology conducted by the ACLU and others, Fight for the Future’s Greer said that “facial recognition already automates and exacerbates police abuse, profiling, and discrimination.”
“Now Amazon is setting us on a path where armed government agents could make split second judgements based on a flawed algorithm’s cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid when being questioned by authorities,” she warned. “The dystopian surveillance state of our nightmares is being built in plain sight—by a profit-hungry corporation eager to cozy up to governments around the world.”
VICE reported that “despite Amazon’s bold claims, the efficacy of emotion recognition is in dispute. A recent study reviewing over 1,000 academic papers on emotion recognition found that the technique is deeply flawed—there just isn’t a strong enough correlation between facial expressions and actual human emotions, and common methods for training algorithms to spot emotions present a host of other problems.”
Amid mounting concerns over how police and other agencies may use and abuse facial recognition tools, Fight for the Future launched a national #BanFacialRecognition campaign last month. Highlighting that there are currently no nationwide standards for how agencies and officials can use the emerging technology, the group calls on federal lawmakers to ban the government from using it at all.
Fight for the Future reiterated that demand Wednesday in response to Amazon’s latest claims. Although there are not yet any federal regulations for the technology, city councils—from San Francisco to Somerville, Massachusetts—have recently taken steps to outlaw government use of such systems.
Activists are especially concerned about the technology in the hands of federal agencies such as U.S. Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP), whose implementation of the Trump administration’s immigration policies has spurred condemnation from human rights advocates the world over.
Civil and human rights advocates have strongly urged Amazon—as well as other developers including Google and Microsoft—to refuse to sell facial recognition technology to governments in the United States and around the world, emphasizing concerns about safety, civil liberties, and public trust.
However, documents obtained last year by the Project on Government Oversight revealed that in the summer of 2018, Amazon pitched its Rekognition system to the Department of Homeland Security—which oversees ICE and CBP—over the objections of Amazon employees. More recently, the corporation has been targeted by protesters of the Trump administration’s immigration agenda for Amazon Web Services’ cloud contracts with ICE.
In a July report on Amazon’s role in the administration’s immigration policies, Al Jazeera explained that “U.S. authorities manage their immigration caseload with Palantir software that facilitates tracking down would-be deportees. Amazon Web Services hosts these databases, while Palantir provides the computer program to organize the data.”
“Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities,” Audrey Sasson, executive director of Jews For Racial and Economic Justice, told VICE Tuesday. “[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps.”
“Just as IBM collaborated with the Nazis, Amazon and Palantir are collaborating with ICE today,” added Sasson. “They’ve chosen which side of history they want to be on.”
This article was originally published in 2019.
In mid-2020, following widespread demands in the wake of George Floyd’s killing and concerns about aggressive policing, Amazon announced a one-year halt on law enforcement use of its facial recognition platform.
By Aral Bereux.
In Hangzhou, China, a high school has implemented facial recognition technology in the classroom. The system scans each student’s face every 30 seconds and records an inferred emotion, determining whether they appear angry, fearful, confused, happy or upset. Hangzhou No. 11 High School uses these emotion categories to gauge how each student is progressing.
The government-run system also tracks students’ skills and concentration. According to a local Chinese website, monitoring behaviours such as students’ reading and writing, raising a hand to answer a question, and even sleeping at the desk is regarded as a good thing.
The new system is likely to be rolled out to other Chinese high schools. The system, called the “intelligence classroom behaviour management system,” also tracks attendance, and uses facial recognition to let students borrow library items or pay for canteen lunches, storing each student’s diet and book logs on a local server.
According to one school official, the “subtle facial expressions in the class” can help to “analyse the behaviour of the entire class. And, of course, this is a very efficient way to check attendance.”
For attendance, the system cross-checks its database against students’ faces, making roll call unnecessary; it can determine who is absent in less than a minute.
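The roll-call logic described above amounts to a set difference: the enrolled roster cross-checked against whichever student IDs the face-matcher reported seeing. A minimal sketch, with entirely made-up names and IDs standing in for the school’s database:

```python
# Hypothetical roster and face-match results; all names and IDs are invented.
enrolled = {"S001": "Li Wei", "S002": "Zhang Min", "S003": "Wang Fang"}
recognized_today = {"S001", "S003"}  # IDs the face-matcher reported present

def absentees(roster, present_ids):
    """Return the names of enrolled students with no face match today."""
    return sorted(name for sid, name in roster.items() if sid not in present_ids)

print(absentees(enrolled, recognized_today))  # → ['Zhang Min']
```

The speed claim in the article follows from this shape: once every camera frame has been matched, flagging absentees is a single pass over the roster.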
Ni Ziyuan, the school’s principal, addressed the privacy concerns that have been raised. According to Ni, the tracking technology saves and stores the faces on a local server rather than in the cloud, protecting students from data breaches like those suffered by the Chinese company Qihoo 360, whose surveillance live-streaming channels were shut down after several swimming pools and classrooms were streamed to the public.
Despite much criticism of putting students under constant surveillance, principal Ni Ziyuan maintains this is a positive education experience.
“With the aid of this management system, it is equivalent to having one additional teaching assistant for teachers, which can improve the pertinence of education and the effect of classroom teaching,” Ni Ziyuan says. He also explains that teachers are also being monitored to improve efficiency.
In the last few years, China has refined its Social Credit System, which monitors individuals via CCTV across Chinese provinces and is already proving socially destructive. Each citizen is awarded points and, depending on behaviours monitored in supermarkets, on the street and at work, given a positive or negative ranking. Those with high points have access to decent education and health care and can travel outside their province and beyond China; those with low points have these rights removed and can be forbidden from travelling or entering certain buildings. The points system carries on for generations, with some families marked as socially unacceptable – the black sheep of China.
As for surveillance in schools, China isn’t alone: in early 2019, authorities confirmed the use of surveillance cameras in all government schools in Delhi, India.
The Police State. Is it real or something we’ve conjured in our imaginations? Has it always been there, in the background, now coming to the forefront thanks to social media accounts and our smart phones? Is it more real for some groups than others? Do we really need to worry about it?
The images that accompany the Police State are found in writings across the 19th, 20th and 21st centuries, from Philip K. Dick’s Minority Report to Orwell’s 1984, the contemporary The Fat Years and The Hunger Games. The nightly news presents us with a screen full of nasty images of criminals, looters, and police shootings – by the cops and against the cops, and now the general civilian. Drones hover above us in the skies, politicians ramp up the cause of arming our police through hype, and the grip on citizens’ rights as independent, sovereign individuals ever tightens. But is it all hype, or is there some truth to it?
I’ve written about the Police State in my books, and since 2012, when I first locked my protagonist into a cell heavily guarded by police-cum-military forces, I’ve watched in dread as my more brutal plot twists have slowly bled into real life. Not unlike J Rae, who waits behind bars for a captor to reprogram her beliefs and convictions as her world crumbles into a tightly controlled population without freedoms, we seem to be on the precipice of similar circumstances.
Now, I’m not wanting to sound alarmist. I’m merely questioning the world around me with some urgency, and currently, it isn’t a pretty sight to behold.
I’ve reported on DAPL protesters being held in dog-like cages, with their arms penned in ink by authorities wanting to notate their identity, not by name but by a number so reminiscent of the camps in Germany. We've seen how cops revel in their new emergency powers to "just follow orders." Citizens rat out others for not donning the correct symbolism over their faces, and our DNA is now used to trace us at sewage plants. Snippets of history are returning with some of them on steroids, and by themselves they’re harmless enough. But like the plot line in my books, it’s the snippets you have to watch out for. What toes the line soon crosses it if we’re not watching, and once crossed, where does it stop?
A colleague of mine, an ex-NYPD cop turned journalist, wrote about the latest technologies leaching into police systems across Western – and sometimes Asian and European – nations. Reading it, I got chills. The handcuffs I hinted at under the guise of suspended reality have just been patented – ready to drug and taser you from an officer-held remote control. The drones haunting my characters in their dilapidated city are fully engaged. The sci-fi-esque Pain Ray Cannon microwave blaster designed specifically for protesters – one that should only exist in the movies – is now being used. One day, we’ll beg for the return of the gun and human error, but only after the newly developed ‘Robocop’ does a no-knock house raid on our falsely accused neighbor, leaving a pool of blood behind and no court to convict.
No. I’m not exaggerating. I wish I were. This isn’t the world I want for my children, nor is it the one I want for myself. I’d be happier writing about pixies and unicorns dancing together under a rainbow, but this is fast becoming our reality.
The military police; the drones; the electronic tattoos to identify each citizen and their 'vaccination' records (among other social status markers); RFID chips; digital currencies; internment camps; free thinkers arrested, missing, killed . . . mad men in power; these were all once just works of fiction belonging to the pages of my books. As the books go on, the scenario gets darker, grim . . . acutely aware of my surroundings, perhaps subconsciously I filtered what I was already seeing? I didn’t want this to happen. I really didn’t want to see this happen . . .
I understand the need to arm our police, and some argue that if you have nothing to hide you have nothing to worry about. But the faint line in the sand, the one that we’re currently toeing . . . the one that I fear is almost rubbed out . . . what happens when we cross it?