Hiding in plain sight: the ISC report on GCHQ surveillance

Yesterday the UK Parliament’s Intelligence and Security Committee published its report into the security services.  The thrust of this investigation was to look at the whole issue of the bulk interception of data – an issue dragged into the limelight by Edward Snowden – and determine whether this constitutes mass surveillance. (See this post for more detail on the difference, or not, between bulk interception of data and mass surveillance).

What the report has really done is flush out some important issues, only to allow them to remain hidden in plain sight, because the Committee has failed to grasp the implications of what it has uncovered.

The BBC summarises the key point thus: the Committee said the Government Communications Headquarters (GCHQ) requires access to internet traffic through "bulk interception" primarily in order to uncover threats by finding "patterns and associations, in order to generate initial leads", which the report described as an "essential first step".

And here is what is hiding in plain sight: the acknowledgment that GCHQ is using bulk data to "find patterns and associations in order to generate initial leads." What is wrong with that, you might say? Here is what is wrong with that. It means that information gained by swallowing (intercepting) large chunks of humanity's collective digital activity is being used to predict the probability that each and every one of us (not just those whose data might have been swallowed) is a … fill in the gap (potential terrorist, criminal, undesirable). We all now wear a badge, or can have such a badge put upon us, which labels us with this probability. It may well be that only those of us whose badge carries a high probability then go on to become 'initial leads' (whose emails will then be read). But we all still wear the badge, and any of us can become an initial lead at some point in the future, dependent on what specific area of investigation an algorithm is charged with pursuing.

Algorithmic surveillance is not about reading emails, as the Committee (and many privacy campaigners) seem to believe. That is an old-fashioned 'needles in haystacks' view of surveillance. Algorithmic surveillance is not about looking for needles in haystacks; it is about using data from the hay in order to predict where the needles are going to be. In this world the hay becomes the asset. The fact that GCHQ is not 'reading' all our emails neither legitimises the bulk interception of data nor provides assurance that a form of mass surveillance is not happening. As I said in the previous post: until we understand what algorithmic surveillance really means, until it is made transparent, society is not in a position to give its consent to this activity.
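To make the "hay predicts the needles" point concrete, here is a deliberately toy sketch of what pattern-and-association scoring over bulk metadata could look like. Nothing in it comes from the ISC report or describes GCHQ's actual systems; the field names, weights and threshold are all invented for illustration. The point it demonstrates is simply that no message content is read, yet every person in the dataset receives a score, and the same people can be re-scored against a different "pattern of interest" at any time.

```python
# Illustrative only: a hypothetical pattern-matching scorer over bulk metadata.
# All fields, weights and names are invented; this is not GCHQ's actual method.

from dataclasses import dataclass


@dataclass
class MetadataProfile:
    person_id: str
    night_time_activity: float   # fraction of messages sent between 00:00 and 05:00
    foreign_contacts: float      # fraction of contacts outside the home country
    encrypted_traffic: float     # fraction of traffic that is end-to-end encrypted


# A "pattern of interest" is just a set of weights chosen for one investigation.
# A different investigation would use different weights, and everyone in the
# bulk dataset would be re-scored against it.
PATTERN_OF_INTEREST = {
    "night_time_activity": 0.5,
    "foreign_contacts": 0.3,
    "encrypted_traffic": 0.2,
}


def score(profile: MetadataProfile, pattern: dict) -> float:
    """Weighted match of a person's metadata against the chosen pattern (0 to 1)."""
    return sum(getattr(profile, feature) * weight for feature, weight in pattern.items())


def generate_initial_leads(profiles, pattern, threshold=0.6):
    # Every profile gets a score (the "badge"); only those above the threshold
    # become "initial leads" whose communications might then be examined.
    badges = [(p.person_id, score(p, pattern)) for p in profiles]
    return [(pid, s) for pid, s in badges if s >= threshold]


if __name__ == "__main__":
    bulk_data = [
        MetadataProfile("alice", 0.1, 0.2, 0.3),
        MetadataProfile("bob", 0.8, 0.7, 0.9),
        MetadataProfile("carol", 0.4, 0.1, 0.5),
    ]
    print(generate_initial_leads(bulk_data, PATTERN_OF_INTEREST))
```

In this sketch only "bob" crosses the threshold and becomes a lead, but "alice" and "carol" have still been scored, and they would be scored again, against different weights, whenever the question being asked changes. That is the badge everyone wears.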
