Opinion

When Kafka met Orwell: Arrest by algorithm

In April 2017, an extraordinary claim by the Israeli security services was published in Ha’aretz (Hebrew): over 400 Palestinians had been detained on suspicion that they might be involved in future terrorist attacks. They were detained not on the basis of evidence, but on the decision of an algorithm.

The practice grew out of a security problem for the Israeli authorities. The so-called “Intifada of the Individuals”, beginning in late 2015, presented the ISA (Israel Security Agency, AKA Shin Beth) with a quandary. The ISA had spent decades dismantling Palestinian society via a network of informers and intimidation, but those tools, while very useful against any sort of cell organization, proved helpless in the face of individuals who decided to carry out an attack on a whim.

It took the service a few months to recalibrate, and then – most likely with the assistance of Israel’s version of the National Security Agency, the vaunted Unit 8200 – it began analyzing the social media profiles of Palestinians and deriving from them a series of indicators which, when aggregated, produced a profile of a possible attacker.
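
Nothing about the system itself is public, so any description of its mechanics is guesswork. But the account above suggests the simplest possible shape: a weighted checklist, where each indicator adds to a score and a score above some cutoff flags a person. The sketch below is purely illustrative; every indicator name, weight, and threshold in it is invented.

    # Purely illustrative sketch: the real ISA system is secret, and every
    # indicator name, weight, and threshold below is invented for this example.

    PROFILE_INDICATORS = {
        "praised_recent_attacker": 3.0,     # hypothetical weight
        "changed_profile_picture": 1.0,     # hypothetical weight
        "young_and_single": 1.5,            # hypothetical weight
        "relative_recently_arrested": 2.5,  # hypothetical weight
    }

    DETENTION_THRESHOLD = 5.0  # hypothetical cutoff

    def risk_score(profile: dict) -> float:
        """Sum the weights of whichever indicators a profile triggers."""
        return sum(weight for name, weight in PROFILE_INDICATORS.items()
                   if profile.get(name, False))

    def flag_for_detention(profile: dict) -> bool:
        """A score at or above the cutoff flags the person, even though no
        individual indicator describes a criminal act."""
        return risk_score(profile) >= DETENTION_THRESHOLD

    # Two indicators of ordinary online behaviour plus one family circumstance
    # already push this hypothetical profile over the cutoff.
    person = {"praised_recent_attacker": True,
              "changed_profile_picture": True,
              "relative_recently_arrested": True}
    print(risk_score(person))          # 6.5
    print(flag_for_detention(person))  # True

Note what even this toy version makes visible: no single input is a criminal act, or even an unusual one, yet the aggregate is treated as grounds for detention.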

The past few years have seen algorithms used to predict the likelihood of a convict returning to crime, and those predictions have been used to determine whether that person is eligible for parole. Those systems, when audited, often show evidence of bias – for instance, against African Americans in the United States. But at least the systems used by the American justice system can be challenged: US courts are now dealing with several appeals by prisoners whose sentencing or parole denial was determined by algorithms.
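
To make “evidence of bias” concrete: the best-known audits of these recidivism tools compared error rates across demographic groups and found that defendants of one group who never went on to reoffend were flagged as high-risk far more often than non-reoffending defendants of another. A minimal sketch of that check, on invented data, might look like this (the groups, records, and resulting rates are all made up for illustration):

    # Illustrative version of the bias check applied to US recidivism tools:
    # compare false positive rates across demographic groups. All records
    # below are invented; each is a (flagged_high_risk, reoffended) pair.

    def false_positive_rate(records):
        """Share of people who did NOT reoffend but were flagged high-risk."""
        non_reoffenders = [flagged for flagged, reoffended in records
                           if not reoffended]
        return sum(non_reoffenders) / len(non_reoffenders)

    group_a = [(True, False), (True, False), (False, False), (True, True)]
    group_b = [(False, False), (True, False), (False, False), (False, True)]

    # A persistent gap between these two numbers – being wrongly flagged far
    # more often in one group – is the kind of disparity audits have reported.
    print(f"group A false positive rate: {false_positive_rate(group_a):.2f}")  # 0.67
    print(f"group B false positive rate: {false_positive_rate(group_b):.2f}")  # 0.33

Even this trivial audit requires knowing the model’s outputs and the eventual outcomes. When both are state secrets, no such check is possible.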

But when it comes to the military justice system as applied to Palestinians, the situation becomes much more twisted than in the U.S. Who precisely is going to oversee a system developed and used by the ISA?

As John Brown and Noam Rotem noted (Hebrew), the fact that someone fits such a profile – for instance, he praised attackers and changed his profile picture – does not in any way constitute evidence a court would accept. Basically, we are asked to believe that a system of which we know nothing can accurately predict the future actions of a specific person, and, on the verdict of that system, we may then detain that person – not for something he did, or even planned to do, but for something he may yet do.

So how do the security services detain people on evidence no court would accept? The first option is to get around the courts entirely: when a Palestinian is flagged by an algorithm predicting future actions, the security services simply put him under administrative detention. This means, in essence, no legal process: the military commander of the West Bank rubber-stamps an order actually issued by the ISA, and the person is thus sentenced to six months’ incarceration with no possibility of appeal. After six months, the general may rubber-stamp the detention order again, ad infinitum; people have served long years without ever seeing a court.

Administrative detention draws attention, however, and as Brown and Rotem noted, Israeli lawyers representing Palestinians have noticed a new pattern: when someone is arrested by algorithm, he is charged with “incitement”. The level of proof required is quite low; even supporting a Palestinian armed group online may suffice. The smoking gun is what happens in the rare cases when an algorithm detainee is acquitted: he is almost immediately placed in administrative detention, i.e. the track without any judicial oversight.

So, basically: A computer program whose biases may never be discovered, as it is a state secret, decides that a person may commit a crime; the person is then detained, but is not informed of the real evidence against him or her, as there is no valid evidence; he or she is then charged with the faux crime of “incitement”, and, should the judge refuse to be a cypher and acquit the person, the person is thrown into the maws of a technically lawless system of administrative detention.

All the while, the prisoner is repeatedly told to confess to the phantom crime of “incitement”: they are informed that should they confess, they will be given a relatively light sentence – but should they maintain their innocence, they will be held in detention until the process is over. That may take several years, and the sentence offered for confessing is shorter. So: our software says you’re guilty. Do you want to take it to court and go home after five years, or confess and go home after three?

While this particular legal twist may only be used against Palestinians, the algorithm itself is not so limited. Former Unit 8200 soldiers are highly sought-after programmers, and what they know is surveillance. The systems field-tested on Palestinians are often later sold to other countries. Your local police department may soon acquire one.

Do you have a documented history of protest? Do you dislike the president? The government in general? Israeli-made software may soon be secretly cataloguing it.

A major part of the problem, of course, is the fact that we have become used to sharing information on social media platforms that collect numerous data points about us. We have done so willingly, for ephemeral benefits: we have forged the bars of our own cell.

And yet, for all that, someone else is assembling those bars into a cage, and we had better stop it soon – because doing so from within will be infinitely more difficult. We should not accept the normalization of tools used by a military dictatorship against an occupied people.

Comments

Very “Minority Report” – Israel, the start-up nation, is well ahead of 2065:

http://www.imdb.com/video/imdb/vi3232936729


This appeared two days ago in The Independent:

“How UK police are turning to Israel for help stopping ‘lone wolf’ terror attacks”

http://www.independent.co.uk/news/world/middle-east/uk-anti-terror-israel-lone-wolf-attacks-help-islamists-advise-westminster-palestinians-london-bridge-a7817286.html

Titbits:

“But even if human intelligence begins to dry up, Israeli officials point out that they have built up a wide-ranging communication and surveillance system. In addition, no fewer than a thousand members of the 29,000-strong police force are involved in monitoring the Internet and social media sites.

Israel has rapidly expanding export sales in surveillance. The country’s leading defence electronics contractor, Elbit, showcases a nationwide database system called “WIT” (Wise Intelligence Technology) which can be used in conjunction with cameras, “Skyeye”, mounted on drones or helicopters. It draws in information from signals intelligence as well as open sources such as Facebook and Twitter, and can automatically alert emergency services if a terrorist attack is underway.”

Obviously, UK democracy is more democratic than Israeli “democracy”: preventive detention is not yet used for pre-crime, as in Israel or Turkey, although it was tried on Samina Malik in the UK in 2007.


When that Israeli general came to speak at Brown University, I remember he summarized the ‘research findings’ regarding the ‘profile of a terrorist’ — the profile that presumably underlies the algorithm. One criterion is that the ‘terrorist’ is likely to be young and single: he does not yet have family responsibilities. Another is that he is likely to be related to someone who has been arrested and tortured by the Israelis, especially a brother, father, or other close relative, so that he has a motive for revenge. This points in the direction of arresting entire families, as in North Korea.

Disturbing, at the very least. I am reminded of Brave New World and 1984. Modern Israel and Palestine will need much more than a truth and reconciliation commission.

I suspect the same algorithm is used in the dreadful rise in summary executions at checkpoints.