George Floyd died at the hands of a Minneapolis police officer 16 days ago. Yesterday, his funeral was held in the city of Houston. Today, nationwide protests bringing awareness to the disproportionately high number of violent police acts against Black Americans continue, with no end in sight.
Things must change moving forward for America to become the more equitable, diverse, safe, harmonious country many hope and believe it can be. And part of that change must happen in a very specific area: how authorities utilize technology.
If you've read Ars for any extended period of time, you already know a universal tech truth: most technology is not inherently evil, but how a piece of technology is utilized most certainly can be. In recent years, as rapidly evolving tech like artificial intelligence has enabled insta-analysis of vast quantities of data or powered autonomous weaponry, there hasn't been enough transparency or scrutiny around how this kind of tech is used by those in authority.
For this week's Orbital Transmission, we're highlighting a few recent examples of this playing out in light of our country's protests. Hopefully, movements of this scale are simply too large, too society-sweeping to be ignored entirely, no matter what historical precedent indicates. That said, there will inevitably be pushback on many of the changes suggested moving forward. Let's all hope a re-examination of authorities' use of technology is one of the actions that sticks around.
A quick reminder: As one small action towards improving the state of diversity in the media and supporting current protests, Condé Nast is pledging $1,000,000 in advertising support across the company's platforms to help give voice to non-profit organizations combating racial injustice. If you or someone you know works with an organization that could benefit from such resources, reach out to communications [at] condenast [dot] com.
@NathanMattise
Orbital Transmission 06.10.2020
Hand-me-down military gear hasn't made policing better
That's a Miami cop, not a soldier, in the above picture. And that's in part a structural problem, as our pals at Wired recently wrote: "Created as part of 1997's National Defense Authorization Act, the 1033 program allows the Department of Defense to get rid of excess equipment by passing it off to local authorities, who only have to pay for the cost of shipping. (A precursor, the slightly more restrictive 1208 program, began in 1990.) According to the Law Enforcement Support Office (LESO), which oversees the process, over $7.4 billion of property has been transferred since the program's inception; more than 8,000 law enforcement agencies have enrolled. Much of that inventory is perfectly ordinary: office equipment, clothing, tools, radios, and so on. But the haul also includes some of the so-called controlled equipment (rifles, armored vehicles, and so on) that have helped create such a spectacle of disproportion." And as many past studies have shown, the more easily weapons can be acquired out in the world, the more violence of various types tends to follow.
IBM is walking away from facial recognition and urging Congress to limit its use
One of the ways protests can ultimately be effective is by making those in power uncomfortable: not necessarily in a physical sense, but by sparking someone or something with sway to reconsider their actions. In that sense, this current moment is already making waves, and IBM is a prime example. This week, CEO Arvind Krishna wrote to Congress urging lawmakers to act against police misconduct and to regulate the way technology can be used by law enforcement. In doing so, Krishna also revealed that IBM would be walking away from facial recognition as a business endeavor: "IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency."
Speaking of facial recognition, Clearview AI seems to be taking the opposite approach so far
You may have only recently heard of Clearview AI. The company operated largely without the public knowing about it until January, when The New York Times published a detailed report on its business. The Times described Clearview as a "groundbreaking" facial-recognition service that allowed a user to match an imported photo against a database of more than 3 billion images. The company claimed to have ~600 law enforcement customers at the time, though BuzzFeed reported more than 2,200 clients after a Clearview AI data breach. (O_o) But in light of nationwide protests calling out police violence against Black Americans, Senate Democrats led by Edward Markey (Massachusetts) are now demanding Clearview release that list of clients so action can be taken. "As demonstrators across the country exercise their First Amendment rights by protesting racial injustice, it is important that law enforcement does not use technological tools to stifle free speech or endanger the public," Markey wrote.
An old study, but insight that scaling back policing isn't automatically dangerous
In late 2014 and early 2015, escalating tensions in New York City led to the NYPD staging a slowdown in which the department performed only its most essential duties. That might be expected to lead to an increase in crime, but a 2017 analysis of official statistics showed the opposite: a significant drop in major crime for the period of the slowdown. During the slowdown (which occurred in response to the death of Eric Garner at the hands of NYC police), police continued to respond to calls, and the arrest rate for major crimes (murder, rape, robbery, felony assault, burglary, grand larceny, and grand theft auto) remained constant. But the arrest rate for non-major crimes and narcotics offenses dropped, as did the number of stop-and-frisk events. It became a rare opportunity to explore questions that couldn't be tested experimentally for practical or ethical reasons. The researchers called for more analysis of this data and these types of situations, but their takeaway? "The results imply that aggressively enforcing minor legal statutes incites more severe criminal acts." And rather than proactive policing deterring major crime, the authors concluded it's more likely that this kind of aggressive enforcement "disrupts communal life, which can drain social control of group-level violence." In other words, overly aggressive policing brings a level of social disruption that actually leads to more crime, and the reduced proactive policing during the slowdown produced a calming effect by contrast.
Copyright © 2020 Condé Nast, All rights reserved.