Does a Robot Get to Be the Boss of Me?

support request:

I’m disturbed by the fact that law enforcement agencies are increasingly using robots for neutralizing threats, surveillance, and hostage situations. Maybe I’ve just seen RoboCop too many times, but I’m wary of machines making crucial, life-or-death decisions—especially given how often actual human officers abuse their authority. Do I have any kind of moral obligation to obey a police robot? 

—SUSPECT

Dear Suspect—

Hollywood has not been particularly optimistic about robots in positions of authority. RoboCop is just one example of the broader sci-fi canon that has burned into our minds the tragic consequences of relinquishing critical tasks to inflexible machines—robots whose prime directives are honored with a literalism that can turn lethal, who can blast a person to death but are confounded by a set of stairs. The message of these films is clear: Rigid automatons are incapable of the improvised solutions and moral nuance so often required in moments of crisis.

It may have been this stereotype that led Boston Dynamics, some of whose robots are being incorporated into police departments, to release a video last December of its models dancing to the 1962 Contours hit “Do You Love Me.” Maybe you saw it? The robots included Atlas, an android that resembles a deconstructed stormtrooper, and Spot, which served as inspiration for the killer dogbots in the “Metalhead” episode of Black Mirror. Neither machine seems to have been designed to quell fears about a robot takeover, so what better way to endear them to the public than to showcase their agility? And what better test of said agility than a skill considered so uniquely human that we invented a move designed to mock an automaton’s inability to do it (the Robot)? Watching the machines shuffle, shimmy, and twirl, it’s difficult to avoid seeing them as vibrant, embodied creatures, capable of the same flexibilities and sensitivities as ourselves.

Never mind that Spot’s joints can slice off your finger or that police robots have already been used to exercise deadly force. One way to answer your question, Suspect, without any appeals to moral philosophy, might be in terms of pragmatic consequences. If you have plans, as most of us do, to remain alive and well, then yes, you should absolutely obey a police robot.

But I sense that your question is not merely practical. And I agree that it’s important to consider the trade-offs involved in handing policing duties over to machines. The Boston Dynamics video, incidentally, was posted at the tail end of 2020 as a way “to celebrate the start of what we hope will be a happier year.” One week later, insurgents stormed the Capitol, and images proliferated of police officers showing little resistance to the mob—photos that were strikingly juxtaposed, on social media, against the more severe responses to the Black Lives Matter protests last summer.

At a moment when many police departments are facing a crisis of authority due to racial violence, the most compelling argument for robotic policing is that machines have no intrinsic capacity for prejudice. To a robot, a person is a person, regardless of skin color, gender, or cause. As the White House noted in a 2016 report on algorithms and civil rights, new technologies have the potential to “help law enforcement make decisions based on factors and variables that empirically correlate with risk, rather than on flawed human instincts and prejudices.”

Of course, if current policing technology is any evidence, things are not that simple. Predictive policing algorithms, which are used to identify high-risk persons and neighborhoods, are very much prone to bias, which the roboticist Ayanna Howard has called the “original sin of AI.” Because these systems rely on historical data (past court cases, previous arrests), they end up singling out the same communities that have been unfairly targeted in the first place and reinforcing structural racism. Automated predictions can become self-fulfilling, locking certain quadrants into a pattern of overpolicing. (Officers who arrive at a location that has been flagged as ripe for crime are primed to discover one.) These tools, in other words, do not so much neutralize prejudice as formalize it, baking existing social inequities into systems that unconsciously and mechanically perpetuate them. As professor of digital ethics Kevin Macnish notes, the values of the algorithm’s makers “are frozen into the code, effectively institutionalizing those values.”
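If it helps to see that feedback loop in miniature, here is a toy simulation, sketched in Python purely for illustration. Every district, rate, and count in it is invented, and no real predictive-policing system works exactly this way; the point is only the mechanism: patrols are allocated in proportion to past recorded arrests, and arrests can only be recorded where patrols are sent, so an early skew in the data compounds on its own.

```python
import random

# Toy model of the self-fulfilling prediction described above.
# All districts, rates, and counts are hypothetical.

NUM_DISTRICTS = 4
TRUE_CRIME_RATE = [0.05] * NUM_DISTRICTS  # the real rate is identical everywhere
arrests = [10, 10, 10, 12]                # district 3 starts with a slight historical skew
PATROLS_PER_DAY = 20

random.seed(0)

for day in range(365):
    # "Predictive" step: allocate patrols in proportion to past recorded arrests.
    total = sum(arrests)
    patrols = [round(PATROLS_PER_DAY * a / total) for a in arrests]

    # Recording step: incidents are observed only where patrols actually go,
    # so the district with more patrols logs more arrests despite equal rates.
    for d in range(NUM_DISTRICTS):
        for _ in range(patrols[d]):
            if random.random() < TRUE_CRIME_RATE[d]:
                arrests[d] += 1

print(arrests)  # district 3's two-arrest head start has compounded in the record
```

In a typical run, the district that began with two extra arrests on its books ends the year with a gap many times larger, even though nothing about its actual crime rate ever differed. That is the sense in which the prediction manufactures its own evidence.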

At present, the officers who act on algorithmic recommendations are still human, but it’s easy to envision a not-so-distant future where policing decisions are not only informed but carried out by machines—a day when some Atlas-like robot will show up on a street that a predictive model has identified as high-risk and, with the help of its “fine motor-skills capabilities” and “28 degrees of freedom,” arrest the first likely candidate. Perhaps it’s a sign of the times that such dystopian scenarios are, if still undesirable, beginning to look not categorically worse than our current state of affairs. The actions of Derek Chauvin alone stand as a reminder that humans can be just as coldhearted and unfeeling as a machine.

Still, the fact that the officer was human is, in part, what provoked public outrage. We react viscerally to the sight of people abusing their power—far more so than we do to cases of machine malfunction, even when it can be traced back, through the shadowy byways of bureaucracy, to human error. As the criminal justice system automates more and more of its operations, its actions are becoming increasingly opaque, shellacked in a detached objectivity that risks obscuring acts of injustice. As the writer Jackie Wang points out in her book Carceral Capitalism, personification is a necessary component of moral indignation. “‘All police databases are bastards’ makes no sense,” she writes. Neither does “All police robots are bastards,” regardless of how human they appear or how well they can dance.

I would add to this that if personification is crucial to cultivating outrage, it’s also necessary for countering it. Among the many remarkable images caught on film during the George Floyd protests was a video showing members of the National Guard dancing with protesters in Atlanta, only days after the streets had been filled with tear gas. The dance they were doing—the Macarena—was somewhat mechanical, simpler than many of the moves the Boston Dynamics robots are capable of executing. And yet the moment itself demonstrated an agility that is not merely physical but spiritual. It was one of those outflashings of grace that sometimes appear when people let their guard down and improvise, breaking through the rigidity of choreographed social roles and long-standing tensions.

One protester, Amisha Harding, told reporters that the dancing opened up a space for dialogue with the officers. “In talking to them,” she said, “I realized that so many of them believe in what we’re fighting for as well.” Although the officers could not, being subject to their own prime directives (the oath of duty), publicly express their support for the movement, many revealed to the protesters that their hearts were at odds with the tasks they’d been asked to perform. In the movies, it’s precisely this dissonance that marks a robot’s acquisition of consciousness. The machine that develops a conscience and becomes troubled by the actions it’s been hardwired to carry out has transcended its status as a tool and has become, essentially, human.

Perhaps one day our machines will achieve that level of complexity. Until then, moral flexibility—the willingness to change, to break the rules, to abandon beliefs and practices that are no longer serving the public good—is something we alone can enact. I realize, Suspect, that it’s difficult these days to believe that people, let alone systems, are capable of change. But it’s also true that humans maintain more than 28 degrees of freedom, and at least some of those choices might be worth preserving.

Faithfully,

Cloud


Be advised that CLOUD SUPPORT is experiencing higher than normal wait times and appreciates your patience.
