Main Idea Questions

Now that we have a better sense of what an argument is and the parts that compose it, let’s start refining our big picture strategy.

Previously, we focused on finding the sentence that captures the point, purpose, or perspective of each paragraph. Our focus isn’t going to change, but we will begin to integrate argumentative structure to improve the highlights we make. As we go through, look out for claims that capture the passage’s argument.

Additionally, we will want to be aware of the different viewpoints being presented. This is true for both argumentative and descriptive passages, but we will focus on argumentative passages for now. For example, an author might introduce someone else’s viewpoint in order to agree or disagree with it. While this may seem like a fairly specific bit of advice, the MCAT likes to trip you up by presenting these viewpoints as if they were the author’s. To see what I mean, and how we can start applying our big picture strategy to argumentative passages, let’s look at the excerpt below. Since we are still getting started with this strategy, I will recap the highlights at the end of each paragraph.

Drone Bomb Me

Drones, or “unmanned aerial vehicles,” as they are called in military jargon, have come to play an increasingly prominent role in global warfare and in security and policing operations. They are now starring actors in the new “theaters of war” that have sprung up around the globe. “Hunter-killer” drones used in such assassinations are partly operated via remote control by American soldiers stationed thousands of miles away from their targets, and partly by complex software systems that use data analysis and facial recognition technologies to select, target and kill specific individuals. Restructuring the very dynamics of warfare, drones have attracted the attention of journalists, politicians, activists, artists and academics who have voiced moral, ethical and legal concerns regarding their use. But as the French philosopher Gregoire Chamayou points out in the introduction to his book Drone Theory (2015), such debates surrounding this new technology tend to ring a bit hollow. Fundamental categories customarily employed in our discussions of warfare – concepts such as space and time, peace and war, or bravery and cowardice – seem to falter in the face of drones.

Recap: The author first introduces the topic, drones, then quickly moves to discussing Chamayou’s views on drones and how we should talk about them.

The drone, Chamayou holds, poses a challenge to understanding. Our received moral, political and legal lexicon, as well as the narratives we use to justify acts of warfare (stories about bravery, honor, courage, self-defense), seems to shatter to pieces when confronted with the drone. A similar intuition guides the work of Susan Schuppli, whose essays point out that the use of intelligent software complicates political and legal attempts to introduce a system of checks and balances to set boundaries on the deployment of drones. This problem is particularly urgent for “signature strikes,” when technologies of facial recognition, data analytics, and predictive software enable the identification of suspect patterns of behavior by anonymous individuals, who are subsequently targeted for killing.

When a drone executes software that is designed to kill, it is difficult to determine who is to be held accountable. There are now situations where it is unclear whether the decision to strike was made by a person, a machine, or some hybrid of the two. The idea that an actual human being, or ‘legal person’ stand behind the invention of every machine who might ultimately be found responsible when things go wrong, or even when they go right, is no longer tenable and obfuscates the fact that complex systems are rarely, if ever, the product of single authorship; nor do humans and machines operate in autonomous realms. Indeed, both are so thoroughly entangled with each other that the notion of a sovereign human agent functioning outside the realm of machinic mediation seems wholly improbable. (“Deadly Algorithms,” 5)

Schuppli concludes that we need a more radical rethinking of the legal and moral frameworks we use to understand “agency.” One possible road to take, she suggests, is to rethink “personhood” itself, and to perhaps add “algorithmic personhood” to the forms of legal personhood currently granted to other non-human actors such as states and corporations. As Schuppli points out, our current situation indicates that “personhood” and the vocabulary of accountability and responsibility can no longer be detached from the question of technology. 

Although I am sympathetic to Schuppli’s suggestion, I feel that the absurdity of “algorithmic personhood” as a notion unwittingly raises the question whether the received lexicon of personhood, and the logic of accountability and responsibility that it entails, aren’t precisely what drones undermine. Power, no longer (or not only) exercised in courts of law or within institutions such as schools, prisons and hospitals, is now exercised by complex techno-social hybrids that capture and manage datastreams in ever more complex systems that can barely be understood by those involved. Whereas the earlier regimes ruled by the giving or withholding of rights to persons, or sought to discipline and mold individual bodies, control societies are run by forces that exert a depersonalizing effect. They operate by using technologies that capture data that are aggregated in banks and are manipulated by complex algorithms. Therefore the attempts to articulate the implications of drone warfare in the received legal and political language of personhood – even if it involves the coining of new terms – may, despite Schuppli’s intentions, obfuscate precisely what is new and challenging about drones and may risk serving ultimately as a mere fetish enabling us to prolong our moment of non-understanding.