When Kafka met Orwell: Arrest by algorithm


In April 2017, an extraordinary claim by the Israeli security services was published in Ha’aretz (Hebrew): more than 400 Palestinians had been detained on suspicion that they might be involved in future terrorist attacks. They were detained not on the basis of evidence, but on a decision made by an algorithm.

The practice grew out of a security problem for the Israeli authorities. The so-called “Intifada of the Individuals”, beginning in late 2015, presented the ISA (Israel Security Agency, a.k.a. Shin Bet) with a quandary. The ISA had spent decades dismantling Palestinian society via a network of informers and intimidation, but those tools, while very effective against any sort of cell-based organization, proved useless in the face of individuals who decided to carry out an attack on a whim.

It took the service a few months to recalibrate, and then – most likely with the assistance of Israel’s version of the National Security Agency, the vaunted Unit 8200 – it began analyzing the social media profiles of Palestinians, and deriving from them a series of indicators which, when aggregated, produced a profile of a possible attacker.

The past few years have seen algorithms used to predict the likelihood of a convict returning to crime, with the results used to determine whether that person is eligible for parole. Those systems, when audited, often show evidence of bias – for instance, against African Americans in the United States. At least the systems used by the American justice system can be challenged; US courts are now dealing with several appeals by prisoners whose sentencing or parole denial was determined by algorithms.

But when it comes to the military justice system as applied to Palestinians, the situation becomes much more twisted than in the U.S. Who precisely is going to oversee a system developed and used by the ISA?

As John Brown and Noam Rotem noted (Hebrew), the fact that someone fits such a profile – for instance, he praised attackers and changed his profile picture – does not in any way constitute evidence a court will accept. Basically, we are asked to believe that a system of which we know nothing can accurately predict the future actions of a specific person, and that, on the verdict of said system, we may then detain that person – not for something he did, or even planned to do, but for something he may yet do.

The first option is to bypass the courts: when a Palestinian is detained by an algorithm predicting future actions, the security services simply put him under administrative detention. This basically means no legal process: the military commander of the West Bank rubber-stamps an order actually issued by the ISA, and the person is thus sentenced to six months’ incarceration with no possibility of appeal. After six months, the general may rubber-stamp the detention order again, ad infinitum; people have served long years without ever seeing a court.

This draws attention, however, and as Brown and Rotem noted, Israeli lawyers representing Palestinians have noticed a new pattern: when someone is arrested by algorithm, he is charged with “incitement.” The level of proof required is quite low; even supporting a Palestinian armed group online may suffice. The smoking gun is what happens in the rare cases when an algorithm detainee is acquitted: then he is almost immediately put into administrative detention, i.e. the one without any judicial oversight.

So, basically: A computer program whose biases may never be discovered, as it is a state secret, decides that a person may commit a crime; the person is then detained, but is not informed of the real evidence against him or her, as there is no valid evidence; he or she is then charged with the faux crime of “incitement”, and, should the judge refuse to be a cypher and acquit the person, the person is thrown into the maw of a technically lawless system of administrative detention.

All the while, the prisoner is repeatedly told to confess to the phantom crime of “incitement”: they are informed that should they confess, they will be given a relatively light sentence – but should they maintain their innocence, they will be held in detention until the process is over. That may take several years, longer than the offered sentence. So: our software says you’re guilty. Do you want to take it to court and go home after five years, or confess and go home after three?

While this particular legal twist may only be used against Palestinians, the algorithm itself is not so limited. Former Unit 8200 soldiers are highly sought-after programmers, and what they know is surveillance. The systems field-tested on Palestinians are often later sold to other countries. Your local police department may soon acquire one.

Do you have a documented history of protest? Do you dislike the president? The government in general? Israeli-made software may soon be secretly cataloguing it.

A major part of the problem, of course, is the fact that we have become used to sharing information on social media platforms that collect numerous data points about us. We have done so willingly, for ephemeral benefits: we have forged the bars ourselves.

And yet, for all that, someone else is shaping those bars into a cell, and we had better stop it soon – because doing so from within will be infinitely more difficult. We should not accept the normalization of tools used by a military dictatorship against an occupied people.

About Yossi Gurvitz

Yossi Gurvitz is a journalist and a blogger, and has covered the occupation extensively.




11 Responses

  1. Eva Smagacz
    July 3, 2017, 4:49 pm

    Very “Minority report” – Israel, start up nation, is well ahead of 2065:

    http://www.imdb.com/video/imdb/vi3232936729


    • jd65
      July 4, 2017, 12:19 am

      You beat me to it, Eva :) Kinda ironic that it’s a Spielberg film…

  2. Eva Smagacz
    July 3, 2017, 5:03 pm

This, from two days ago in the Independent:

    “How UK police are turning to Israel for help stopping ‘lone wolf’ terror attacks”

    http://www.independent.co.uk/news/world/middle-east/uk-anti-terror-israel-lone-wolf-attacks-help-islamists-advise-westminster-palestinians-london-bridge-a7817286.html

    Titbits:

    “But even if human intelligence begins to dry up, Israeli officials point out that they built up wide ranging communication and surveillance system. In addition, no fewer than a thousand members of the police force 29,000 strong are involved in monitoring the Internet and social media sites.

    Israel has rapidly expanding export sales in surveillance. The country’s leading defence electronic contractor, Elbit, showcases a nationwide database system called “WIT” (Wise Intelligence Technology) which can be used in conjunction with cameras, “Skyeye”, mounted on drones or helicopters. It draws in information from signals intelligence as well as open sources such as Facebook and Twitter and can automatically alert emergency services if a terrorist attack is underway. ”

Obviously, UK democracy is more democratic than Israeli “democracy”: preventive detention is not yet used for pre-crime, as in Israel or Turkey, although it was tried on Samina Malik in the UK in 2007.


  3. Stephen Shenfield
    July 3, 2017, 5:44 pm

    When that Israeli general came to speak at Brown University, I remember he summarized the ‘research findings’ regarding the ‘profile of a terrorist’ — the profile that presumably underlies the algorithm. One criterion is that the ‘terrorist’ is likely to be young and single: he does not yet have family responsibilities. Another is that he is likely to be related to someone who has been arrested and tortured by the Israelis, especially a brother, father, or other close relative, so that he has a motive for revenge. This points in the direction of arresting entire families, as in North Korea.

    • jd65
      July 4, 2017, 12:33 pm

      Another is that he is likely to be related to someone who has been arrested and tortured by the Israelis, especially a brother, father, or other close relative, so that he has a motive for revenge.

      In other words, the arrest and torture of Palestinians by Israel/IDF is its own justification for further arrests and torture of more Palestinians by Israel/IDF. It’s a fucking freak show…

      • marc b.
        July 8, 2017, 11:46 am

        Yes, the self licking ice cream cone.

        This system also shows the often overlooked 3rd use of detention and torture: in addition to ‘punishment, and ‘intelligence gathering’, torture produces the conversion event. See the Inquisition. You must come to love your captor and prove your love by becoming a quisling. That’s the real purpose of the various GWOT torture sites: conversion through trauma, catch and release.

  4. JosephA
    July 4, 2017, 12:33 am

    Disturbing, at the very least. I am reminded of Brave New World and 1984. Modern Israel and Palestine will need much more than a truth and reconciliation commission.

  5. Bob Trujillo
    July 4, 2017, 11:23 am

    Suspect same algorithm is used in the dreadfully increasing summary executions at checkpoints.

  6. Keith
    July 4, 2017, 5:19 pm

    “They were detained not on the basis of evidence, but on the decision made by an algorithm.”

    A further advancement in Israel’s matrix of control which it exports to other countries. We are at the end of an era, and as the world drifts into PLANNED chaos, control of crumbling societies during desperate times will be critically important to the elites who wish to maintain their power and privilege. There is more to the social media than meets the eye. And the US Constitution an increasingly worthless parchment barrier.

  7. JLewisDickerson
    July 4, 2017, 7:24 pm

    “The first option is to get around the courts: when a Palestinian is detained by algorithm predicting future actions, the security services simply put him under administrative detention. This basically means no legal process: the military commander of the West Bank rubber-stamps an order actually issued by the ISA, and the person is thus sentenced to six months incarceration . . . After six months, the general may rubber-stamp the detention order again, ad infinitum . . . This draws attention, however, and as Brown and Rotem noted, Israeli lawyers of Palestinians noticed a new pattern: when someone is arrested by algorithm, he is charged with ‘incitement’. The level of proof required is quite low. Even supporting a Palestinian armed group online may suffice. The smoking gun is what happens in the rare cases when an algorithm detainee is acquitted: then he is almost immediately put into administrative detention, i.e. the one without any judicial oversight.” ~ Gurvitz

    THEIR PREDICAMENT IS A BIT LIKE THAT OF JOSEF K IN ORSON WELLES’S FILM ADAPTATION OF FRANZ KAFKA’S “THE TRIAL”:
    After months of trial postponement, Josef K goes to court painter Titorelli to ask for advice. He is told to hope for little. He might get definite acquittal, ostensible acquittal, or indefinite postponement. No one is ever really acquitted, but sometimes cases can be extended indefinitely.

    Titorelli: “You see, in definite acquittal, all the documents are annulled. But with ostensible acquittal, your whole dossier continues to circulate. Up to the higher courts, down to the lower ones, up again, down. These oscillations and peregrinations, you just can’t figure ‘em.”
    Josef K: “No use in trying either, I suppose.”
    Titorelli: “Not a hope. Why, I’ve known cases of an acquitted man coming home from the court and finding the cops waiting there to arrest him all over again. But then, of course, theoretically it’s always possible to get another ostensible acquittal.”
    Josef K: “The second acquittal wouldn’t be final either.”
    Titorelli: “It’s automatically followed by the third arrest. The third acquittal, by the fourth arrest. The fourth…”

    SOURCE – http://www.imdb.com/title/tt0057427/quotes?qt=qt0135410

    The Trial (1962) Trailer

    Kafka: The Trial (M Studio Movement Theatre)

    P.S. Woe be unto anyone who dares to question a divinely inspired algorithm, for G-d will surely smite them with a heightened vengeance! To be on the safe side, it is best to just assume that all algorithms are sacred.

    • Mooser
      July 5, 2017, 1:15 pm

      “Woe be unto anyone who dares to question a divinely inspired algorithm.”

      I’m an errand girl for algorithm, send me.
