2021v28, Friday: Not oil.

Getting the analogy wrong can ruin policy, as our approach to data has shown. And turning to real fossil fuels: two big, big events involving an oil company and a coal mine.

Short thought: Taking a short break from thingification: this week marks three years since the General Data Protection Regulation, or GDPR as most of us know it, came into force.

Many hate it. It’s caused a huge amount of work for organisations of all kinds. It’s clunky, imprecise, open to vast interpretation as to how its extensive obligations should be implemented, and therefore tends towards lengthy tickbox exercises rather than the “privacy by design and default” which is, to me, the heart of the whole exercise.

And, of course, it leads to all those dialogs every time you log into a website. And, more recently, the pointless and aggravating requests that you acknowledge the site’s legitimate interest in using your data any way it wants.

(Pointless because, if push came to legal shove, I can’t believe a court would waste more than a few seconds on any such factor. Your legitimate interests aren’t something you can just sign away – particularly not without genuine consideration. Yet another annoying figleaf. Aggravating for the same reason.)

But I still think the anniversary is worth celebrating. Because GDPR did something really important. It enshrined, far more strongly than its predecessor legislation, the core principle that your data is yours. It’s not some public resource that organisations can use however they want; some commons they can enclose at will.

Analogies are important here – and yes, I realise we’re back on stories again, because when the story’s wrong, our responses to it are wrong too. Here, the problem is the dominant analogy: “data is the new oil” is a phrase often bandied around, but as Matt Locke notes here, this is entirely the wrong categorisation:

The discussions around data policy still feel like they are framing data as oil – as a vast, passive resource that either needs to be exploited or protected. But this data isn’t dead fish from millions of years ago – it’s the thoughts, emotions and behaviours of over a third of the world’s population, the largest record of human thought and activity ever collected. It’s not oil, it’s history. It’s people. It’s us.

To indulge in a bit of shameless exaggeration, treating data as a common untapped resource from which anyone can make a buck is akin – in direction if not in scale – to treating Swift’s Modest Proposal as a sensible contribution to the argument on population control.

Think of data as a part of ourselves, and suddenly the priorities change. Stories like the UK government’s attempt – again! – to give relatively unfettered commercial access to health data become as vile as they seem. (On that, instructions to opt out are here – the deadline’s 23 June.)

It’s not oil. It’s us.

Ultra-short before-thoughts: While we’re on the subject of oil, a couple of interesting items which I haven’t had time properly to process yet:

  • A small investment outfit has managed to force directors onto the board of Exxon who actually care about climate change. My recent reading of The Ministry for the Future, by Kim Stanley Robinson, has been scaring me witless, and bringing me to the belated realisation of just how much harm climate change naysayers have done to my daughter’s future. About damn time.
  • This one I really want to read and consider: an Australian federal court has denied an application by several children for an injunction to stop a vast open-cast coal mine. But in doing so, it’s found something legally fascinating and with huge potential implications: that there is a duty of care on a government minister to consider what such a project will do to those children’s futures. To anyone with a nodding acquaintance with the common law jurisprudence of negligence, this is immense: new duties of care rarely emerge, and courts (at least in England and Wales) are highly reluctant to go beyond existing categories of duty – and then only incrementally, with small steps, based on analogy with existing duties. (For a really good explanation of how this works, see the case of Robinson in the UK Supreme Court.) I really, really need to see how the judge in Australia reached this conclusion (which is at [490-491] of the judgment). I’m on vacation next week, so I might have time to take it in.

Because I’m on vacation, no promises about posts next week. I’ll try to take thingification a bit further forward, and there’s so much to do on privacy. We’ll see where we get to.

(If you’d like to read more like this, and would prefer it simply landing in your inbox three or so times a week, please go ahead and subscribe at https://remoteaccessbar.substack.com/.)

2021v7, Friday: Finding family.

Why I welcome the fact that I ache. And a quick link to a writeup of one of the most interesting Supreme Court cases around: Lloyd v Google.

Short thought: “I ache, therefore I am,” as Marvin once put it. “Or perhaps I am, therefore I ache.”

I ache. And I’m happy that I do. Because it’s 48 hours or so since I went back to capoeira for the first time in months.

It’s not the exercise that I’ve missed – from time to time I’ve stopped mid-run and trained a little, solo, in the park.

No. It’s that even for an introvert like me, the community of training with others in this most organic and communicative of martial arts has been a painful thing to lose. That feeling as your mind, soul and body ease into the ginga, the music wraps itself around you, and techniques start to flow the one into the next. As you smile, full of malandro, at the person you’re playing with. As the physical conversation between you ducks and weaves, slow, fast, slow.

God, it’s glorious. Although God, it hurts a couple of days after. I’m 50. I don’t bend as well as once I did.

But every ache is a benção, a blessing.

Because I’m back with family. Or rather, back with one of them.

Here’s the thing. We all have multiple families, which sometimes – but not always – overlap. If we’re fortunate (and my heart breaks for all those for whom this is tragically, painfully, sometimes dangerously not true) our first is with blood.

Another comes from the person we choose to bond our life with: spouse, partner, name them what you will. (My good fortune on this front is boundless; a wife and daughter who are both beyond compare.)

And then there are all the other communities which you find. Or which find you. Some of which will themselves wrap you in love and care, and so will become found families in themselves.

For all but the most wholly solitary among us, these multiple families are the earth from which our lifelong learning, growth, evolution, even our ongoing ability to be human, springs.

My capoeira family is one such. I’m blessed to have so many families. Blessed.

So, yes. I ache. Therefore, I am. Thank goodness.

Someone is right on the internet: Despite my best intentions, I wholly failed to make time to watch the submissions in Lloyd v Google, which sees the Supreme Court wrestle with some fundamental ideas in privacy and data protection.

I’ll try to make the time, then I’ll probably write something. (A radical idea: digest the source material before opining. Good lord.) As usual, the SC has the video of the hearing up on its website at the above link. Open justice for the win.

In the meantime, the UKSC Blog does a great job of summarising the submissions: a preview here, then a rundown of Day 1 and Day 2.

If privacy is at all important to you, and goodness knows it ought to be – it (along with worker status) seems to me to be the critical question of how individual rights interact with contract law and business for the next few years – the upsums richly repay a read.

(If you’d like to read more like this, and would prefer it simply landing in your inbox three or so times a week, please go ahead and subscribe at https://remoteaccessbar.substack.com/.)

2021iv19, Monday: Privacy and the Supremes.

One of the most consequential cases on the law and privacy makes it to the Supreme Court next week. I’ll be watching. And some great stuff on gaming and moral panics.

Short thought: There’s no doubt that arguments about privacy are going to grow, and multiply, for years to come. On so many fronts, the question of what companies and governments can do with data about us affects us – literally – intimately. It’s going to be a central focus for so many areas of law – be it regulatory, public, commercial or otherwise – and we lawyers can’t and shouldn’t ignore it.

Which is why I’m blocking out next Wednesday and Thursday (28th and 29th) in the diary – at least as far as work will allow. Those are the days on which the Supreme Court will be hearing Lloyd v Google, probably the most important data protection and privacy case to make it all the way to the UK’s court of final appeal to date. 

As I’ve written before, the Court of Appeal fundamentally changed the landscape in 2019 when they decided that Richard Lloyd, a privacy campaigner, could issue proceedings against Google in relation to its workaround for Apple’s privacy protections. It’s no surprise that Google took the appeal all the way, since the CoA said (in very, very short) that a person’s control over their personal data had value in itself, and that no further harm – not even distress – need be proved for loss to exist. (There are other grounds of appeal too, but this to me is the most fascinating, and wide-ranging in potential effect.)

Next week is only the arguments, of course. Judgment will come – well, no idea. But Lord Leggatt is on the panel. I can’t wait to read what he has to say.

(I’ve had a piece on privacy brewing for some time. I just haven’t had the brainspace to let it out. Perhaps next week. I’ll try.)

Now hear this: I’ve always been rather allergic to team sports. Martial arts, on the other hand, have long been my thing. While I’ve dropped in and out, depending on levels of fitness and family commitments, there’s always been at least one at any given time which has given me joy like no other form of physical activity.

If one nosy trouble-maker had had their way, this would have been nipped in the bud. When I was doing karate in my teens, one clown wrote to my dad – then a canon at St Albans Abbey – claiming that my indulgence in this was Satanic and should stop immediately.

No, I don’t get the reasoning either. Needless to say, my dad treated it with the respect it deserved, and lobbed it into the wastebasket. And on I went, via aikido, tae kwon do and (these days) capoeira. No doubt this last, which I hope to keep doing with my current escola in Southend for as long as my ageing limbs can manage it, would have given the writer even greater conniptions, given that the music often name-checks saints and is thought in some quarters to have connections to candomblé.

But I think the writer missed a trick. Because back then, in the 80s, if he’d known I was a role-playing gamer he’d have been tapping totally into the zeitgeist.

By RPG I’m talking about pen-and-paper games, not video games. I loved these games; via an initial and very brief encounter with Dungeons & Dragons (2nd edition, for the cognoscenti – it was never really my thing), I found Traveller and Paranoia, and never looked back. It’s been a long while since I played, but my love of them, and my conviction that they’re good and valuable, hasn’t dimmed.

These days, these games are pretty mainstream. But in the 80s, particularly in the US, they were the subject of a significant – if, in retrospect, batshit insane – panic. This panic is beautifully explored by Tim Harford in his podcast, Cautionary Tales. I warmly recommend it. You don’t have to know or care about the games themselves for the story to be engaging and fascinating, as an analysis of how societal panics can grow, from even the most unpromising foundations, into something wholly unmoored from reality. And yes, the irony there is palpable.

(Tim’s a gamer himself of no little repute; I imagine a game GMed by him would be wonderful. But he’s fair on this, I think.)

The whole series is great (the one on Dunning-Kruger is particularly brilliant). Tim’s previous podcasts, in particular 50 Things that Made the Modern Economy, are just as good. And he always makes them relatively short, and scripts them properly. Not for him the 90-minute frustrating meander. Thank goodness.

Warmly recommended.

As an aside: A recent FT piece of Tim’s has just appeared on his own website (as usual, a month after FT publication). It’s superb. Lots of people have linked to it, but it’s good enough to do so again. 

It’s entitled: “What have we learnt from a year of Covid?” His last sentence is one with which I utterly concur:

I’ll remember to trust the competence of the government a little less, to trust mathematical models a little more and to have some respect for the decency of ordinary people.

Read the whole thing.

(If you’d like to read more like this, and would prefer it simply landing in your inbox three or so times a week, please go ahead and subscribe at https://remoteaccessbar.substack.com/.)

2021iii3, Wednesday: data, privacy and the Golden Rule.

If people talk about changing data protection laws, always ask for their philosophy; if they won’t say, be suspicious. And two great tales about the file format that makes remote working possible.

I wouldn’t worry if it were still like this.

Short thought: On Friday, I’m giving a webinar on privacy issues in employment. It’s part of a series of employment law webinars over 15 days (two a day); the second such series since lockdown, organised by my friend Daniel Barnett. With signups come big donations to the Free Representation Unit, a splendid charity which organises advocates for people who can’t afford lawyers in employment and social entitlement cases. (So yes – if employment law is important to you, sign up. There’s still 80% of the sessions left to run, all are recorded for later viewing at your leisure, and it only costs £65 plus VAT a head. Discounted significantly the more heads you book for.)

Anyhow: prepping for it has got me thinking. There’s a lot of noise, particularly post-Brexit, about what kind of law governing privacy and data protection we should have. GDPR comes in for a lot of stick: it’s cumbersome, it’s civil rather than common law, it’s inflexible, it makes life harder for small businesses and is easy for large ones. Scrap it, some say. Other countries get adequacy decisions (that’s the European Commission saying: yes, your data protection laws give sufficiently equivalent protection that we won’t treat you as a pure third country, with the significant restrictions on cross-border data transfer that entails) with different laws. Why shouldn’t we? (Incidentally, initial signs are we should get an adequacy ruling. Phew.)

All of this, I tend to feel, misses the point. The first step to working out what data protection architecture we should have isn’t common vs civil law. It’s identifying the essential philosophical and ethical – and, yes, moral – basis for why data protection is needed in the first place. When I hear people advocating for changing the onshored version of GDPR, I want to hear that philosophical basis. If I don’t, I’m going to start patting my pockets and checking my firewalls. Because the cynic in me is going to interpret that the same way I interpret – say – calls for restricting judicial review, or “updating” employment law: as a cover for a fundamental weakening of my protections as a citizen and an individual.

Here’s why. GDPR, for all its faults, did represent a root-and-branch shift, and it’s that shift rather than its shortcomings (and lord knows it has them) that has caused much of the outcry. The shift? The imposition, in clearer terms than ever before, of the idea that people’s data is theirs, unalienably so. And if you want to muck about with it, they get to tell you whether that’s OK or not.

I know this is a wild over-simplification. But in our data-rich, surveillance-capitalism world, as a citizen that’s what I want. Yes, it carries downsides. Some business models are rendered more difficult, or even impossible. But that’s a trade-off I’m happy with.

I’m aware of one case, for instance, where an employer is alleged to have accessed an ex-employee’s personal email and social media accounts (or tried to) using credentials left on their old work computer, because the credentials to a work system were missing and there might have been password re-use.

I’ll leave the reader to compile their own list of what potential problems this might give rise to. But it does bring into sharp relief what I think the core issue is in privacy and data protection, both generally and in the employment context.

And it boils down to this: don’t be (to use The Good Place’s euphemistic structure) an ash-hole.

Honestly, it’s that simple. So much of employment law (and here, as we saw in the Uber case and many others, it differs from “pure” contract law significantly) is about what’s fair and reasonable. (As employment silk Caspar Glyn QC put it in his webinar on Covid issues yesterday, he’d made a 30-year career out of the word “reasonable”.) And through the architecture of statutes and decisions and judgments, what the tribunals and courts are ultimately trying to do is apply the Golden Rule. Has this person, in the context of the inevitably unequal power relationship between them and their employer, been treated fairly?

Now, everyone’s definition of “fair” is going to differ. But that’s why we have laws and (in common law jurisdictions like ours) authority. So that we can have a common yardstick, refined over time as society evolves, by which to judge fairness.

What does this have to do with privacy in employment? Loads. For instance:

  • Can you record people on CCTV? Well, have you told them? Have you thought about whether it’s proportionate to the risk you’re confronting? Does it actually help with that risk more than other, less intrusive, means?
  • Can you record details of employees’ or visitors’ Covid test results? Well, why are you doing it? Do you really need it? If so, how are you keeping it safe – since this is highly personal and sensitive health data?

It’s difficult. But it’s difficult for a reason. Personal data is so-called for a reason. Its use and misuse have immense, and often incurable, effects. The power imbalance is significant.

We lawyers can and do advise on the letter of the law: what GDPR, the Data Protection Act, the e-Privacy Directive and so much more tell you about your obligations.

But a sensible starting point remains, always, to consider: if this was my sister, my brother, my child, working for someone else, how would this feel? How would their employer justify it to them? And if they came home, fuming about it, would I think it was fair?

I live here. Or so my family would probably say.

Someone is right on the internet: I haven’t been in my Chambers since September. My workplace is my home office. It’s evolved into a decent environment over the past year. Furniture, tech, habits: they’ll keep changing, but they’re in a pretty good place right now.

But in many senses, the single biggest enabler of my remote working isn’t a piece of kit, or even a piece of software. It’s a data format. The PDF.

PDFs have been around for decades. Adobe came up with the format and, in one of the smarter bits of corporate thinking, gave it away. Not entirely, of course. But anyone can write a PDF app (my favourite being PDF Expert) without paying Adobe a penny; while Adobe, rightly, still makes shedloads from selling its Acrobat software for manipulating what is now a de facto, rather than de jure, industry standard.

And I rely on PDF entirely. I convert almost everything to PDF. All my bundles. All my reading matter. Practitioner texts. Authorities. Everything. Even my own drafting gets converted to PDF for use in hearings. That way, I know I can read them on any platform. Add notes, whether typed or scribbled (thank you, Goodnotes, you wonderful iPad note-taking app), highlight, underline. And have those notes available, identically, everywhere, in the reasonable confidence that when I share them with someone else, they’ll see precisely what I see, on whatever platform and app they themselves have available. They’re now the required standard in the courts, with detailed and thoroughly sensible instructions on how PDF bundles are to be compiled and delivered. (Note that some courts and tribunals have their own rules, so follow them. But this is a good starting point.)

My utter reliance on, and devotion to, PDF means I’m interested in its history. And two excellent pieces tell that story well. One describes Adobe’s long game. The other describes PDF as “the world’s most important file format”. Neither is terribly short, but neither really qualifies as a long read. And given how much we now rely on this file format, they’re both well worth your time.

(If you’d like to read more like this, and would prefer it simply landing in your inbox three or so times a week, please go ahead and subscribe at https://remoteaccessbar.substack.com/.)

Algorithms, face recognition and rights. (And exams, too.)

The Court of Appeal’s decision to uphold an appeal against South Wales Police’s use of facial recognition software has all kinds of interesting facets. But the interplay between its findings on the equality implications of facial recognition, and the rights we all have under GDPR, may have significant repercussions. Including, possibly, for the A-level/GCSE fiasco.

Most nerd lawyers will, like me, have been fascinated by the Court of Appeal’s decision to uphold the appeal in R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058. The tl;dr version is that the Court said South Wales Police (“SWP”) had acted unlawfully in mining CCTV to scan the faces of thousands of attendees of large public events, compare them to a “watchlist” of persons of interest using a software tool called “AFR Locate”, and identify people for further police attention.

It’s worth noting that the Court did not find SWP to have acted wholly improperly. It’s clear from the narrative that they made at least some efforts to build safeguards into their procedures and their use of AFR Locate. Nor did the Court find that an activity like this was unlawful per se. However, the Court found that both in who SWP chose to look for, and where they did so, its procedures and practice fell short of what would be required to make them lawful. To that extent, Edward Bridges, the appellant, was right.

It goes without saying that for privacy activists and lawyers, this case will be pored over in graphic and lengthy detail by minds better than mine. But one aspect does rather fascinate me – and may, given the tension between commercial interests and human rights, prove a trigger for further investigation.

That aspect is Ground 5 of Mr Bridges’ appeal, in which the Court of Appeal found SWP to have breached the Public Sector Equality Duty (PSED). The PSED, for those who may not be intimately familiar with s149 of the Equality Act 2010 (EqA), requires all public authorities – and other bodies exercising public functions – to have due regard to the need to, among other things, eliminate the conduct the EqA prohibits, such as discrimination, and advance equality of opportunity between people with a protected characteristic (such as race or sex) and those without it. As the Court noted (at []), the duty is an ongoing one, requiring authorities actively, substantively, rigorously and with an open mind, to consider whether what they are doing satisfies the PSED. It’s a duty which applies not so much to outcomes, but to the processes by which those outcomes are achieved.

Bye-bye to black box algorithms?

In the context of the Bridges case, SWP had argued (and the Divisional Court had accepted) that there wasn’t evidence to support an allegation that the proprietary (and therefore undisclosed and uncheckable) algorithm at the heart of AFR Locate was trained on a biased dataset. (For the less nerdy: a commonly-identified concern with algorithms used in criminal justice and elsewhere is that the data used to train the algorithm – to evolve its decision-making to its final state – may have inbuilt bias. For instance, and extremely simplistically, if a facial recognition system is trained on a standard Silicon Valley working population, that training set is likely to include far fewer Black people and quite possibly far fewer women. And the system will thus be far less accurate in distinguishing them.)
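To make that training-skew point concrete, here’s a toy sketch – entirely invented numbers and a deliberately crude “matcher”, nothing whatever to do with AFR Locate’s actual algorithm – showing how a system tuned on a 95:5 skewed sample ends up markedly less accurate for the under-represented group:

```python
import random
import statistics

random.seed(0)

# Synthetic one-dimensional "feature" distributions for two demographic
# groups. Purely illustrative numbers - an assumption for the sketch.
def sample(group, n):
    mu = 0.0 if group == "A" else 1.5
    return [random.gauss(mu, 1.0) for _ in range(n)]

# Training set skewed 95:5 towards group A - the "Silicon Valley" problem.
train = sample("A", 950) + sample("B", 50)

# A crude matcher: treat anything far from the training mean as a non-match.
centre = statistics.mean(train)
spread = statistics.stdev(train)

def matches(x):
    return abs(x - centre) <= 2 * spread

# Evaluate on balanced test sets: group B is rejected far more often,
# because "normal" was defined almost entirely by group A.
test_a = sample("A", 1000)
test_b = sample("B", 1000)
rate_a = sum(matches(x) for x in test_a) / 1000
rate_b = sum(matches(x) for x in test_b) / 1000
print(f"match rate, group A: {rate_a:.2%}")
print(f"match rate, group B: {rate_b:.2%}")
```

The numbers are made up, but the mechanism is the real one: the matcher’s notion of “normal” is set almost entirely by the majority group, so members of the minority group are disproportionately flagged as non-matches – and nothing in the system’s outputs announces that this is happening.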

The Court of Appeal found this argument wholly unconvincing. The lack of evidence that the algorithm WAS biased wasn’t enough. There was no sign that SWP had even considered the possibility, let alone taken it seriously.

Most interestingly, and potentially of most far-reaching effect, the Court said at [199] that while it may be understandable that the company behind AFR Locate had refused to divulge the details of its algorithm, it “does not enable a public authority to discharge its own, non-delegable, duty under section 149”.

So – unless this can be distinguished – could it be the case that a black-box algorithm, by definition, can’t satisfy the PSED? Or that even an open one can’t, unless the public authority can show it’s looked into, and satisfied itself about, the training data?

If so, this is pretty big news. No algorithms without access. Wow. I have to say the implications of this are sufficiently wide-ranging to make me think I must be misreading or overthinking this. If so, please tell me.

Algorithms and data protection

There’s another key aspect of the lawfulness of algorithm use which SWP, given the design of their system, managed to avoid – but which could play a much bigger role in the ongoing, and shameful, exam fiasco.

GDPR is not fond of purely algorithmic decisions – what it calls at Recital 71 and Article 22 “solely automated processing”. (I’m using algorithm here in its broadest sense, as an automated system of rules applied to a dataset.) This applies with particular force to “profiling”, which Article 4 defines as automated processing which “evaluate[s] certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

In fact, Article 22 prohibits any such decision-making on matters which either affect someone’s legal rights or otherwise “similarly significantly affects” them – unless it is:

  • necessary for entering into or performing a contract between the data subject and the data controller;
  • authorised by EU or (in this case) UK law which incorporates safeguards to protect the data subject’s rights and freedoms; or
  • based on the data subject’s explicit consent.

Unlike a number of other GDPR provisions, Article 22 allows no exemptions.

Similarly, s14 of the 2018 Data Protection Act (“the DPA”) says such processing – even if authorised by law – must allow the data subject to ask for a decision to be made which is not “based solely on automated processing”. And that request must be honoured.

The key word here so far as Bridges is concerned is “solely”. The human agency at the end of SWP’s process, whether inadvertently or by design, takes this out of the realm of A22; so this didn’t form any part of the Court of Appeal’s reasoning, or of the grounds of appeal. Were there no human in the loop, this kind of processing might be in serious trouble, since there’s no contract, certainly no freely-given consent (which can only be given if it’s possible to withdraw it), and I don’t know of any law which explicitly authorises it, let alone building in safeguards. And using facial recognition to target individuals for police attention is a paradigm case of analysing someone’s “personal aspects, including… behaviour, location or movements”.

So what about exams?

[UPDATE: Unsurprisingly, the JR letters before action are coming out. And one in particular raises points similar to these, alongside others dealing with ultra vires and irrationality. The letter, from Leigh Day, can be found at Foxglove Law’s page for the exam situation.]

But even if A22 wasn’t an issue in Bridges, I suspect that the rapidly-accelerating disaster – no, that implies there’s no agency involved; let’s call it “fiasco” – involving A-levels and no doubt GCSE results will be a different story.

I won’t go into detail of the situation, except to say that an algorithm which marks anyone down from a predicted B/C to a U (a mark which is traditionally believed to denote someone who either doesn’t turn up, or can barely craft a coherent and on-point sentence or two) is an algorithm which is not only grossly unjust, but – given 18 months of pre-lockdown in-school work, even if it isn’t “official” coursework – is likely provably so.

But let’s look at it firstly through the PSED lens. The Court of Appeal in Bridges says that public authorities using algorithms have a duty to work out whether they could inherently discriminate. I haven’t read Ofqual’s materials as closely as the lawyers crafting the upcoming JRs have, but I’m not at all certain Ofqual can show it’s thought that through properly – particularly where its algorithm seems heavily to privilege small-group results (which are far more likely in private schools) and to disadvantage larger groups (comprehensives and academies in cities and large towns).

(I have to acknowledge I haven’t spent any time thinking about other EqA issues. Indirect discrimination is certainly conceivable. I’ll leave that reasoning to other minds.)

Now let’s switch to the GDPR issue. We know from A22 that decisions made solely by automated processing are unlawful unless one of the three conditions applies. I can’t see any legal basis for the processing specific enough to satisfy the A22 requirements – certainly none which sufficiently safeguarded the rights and freedoms of the data subjects – that is, the students at the heart of this injustice. Nor am I aware of any data protection impact assessment that’s been carried out – which, by the way, is another legal obligation under A35 where there’s a “high risk” to individuals – self-evidently the case for students here whose futures have been decided by the algorithm. And the fact that the government has thus far set its face against individual students being able to challenge their grades seems to fly in the face of DPA s14.

One final kicker here, by the way. Recital 71 of the GDPR forms the context in which A22 sits, discussing in further detail the kind of “measures” – that is, systems for processing data – with which A22 deals, and which are only permitted under narrow circumstances. It stresses that any automated measures have to “prevent… discriminatory effects”.

Its final words? “Such measure should not concern a child.”

Watch this space.

Adieu, Privacy Shield. Cat. Pigeons.

That’s torn it. The CJEU says Privacy Shield, the deal which lets EU companies send data to the US, is no good. Not only does this cause huge problems for everyone using Apple, Google, Microsoft and about 5,000 other firms – but it also foreshadows real problems for the UK come 2021.

As David Allen Green might put it: Well.

The tl;dr version: Max Schrems, the guy whose lawsuit did for the old Safe Harbour principles allowing trans-Atlantic data exchange, has done it again. He asked the Irish data protection commissioner to ban Facebook from sending his data to the US. The DPC asked the High Court to ask the Court of Justice of the EU. And now the CJEU says Privacy Shield, the mechanism which since 2016 has allowed the US to be seen as an “equivalent jurisdiction” for data protection purposes, isn’t good enough.

This isn’t entirely unexpected. Although the Advocate General’s opinion in December took the view that the Court didn’t have to examine the validity of Privacy Shield (while by no means giving it a clean bill of health), many in the data protection game pointed out that Privacy Shield didn’t resolve some core problems – in particular the fact that nothing in it stops US public authorities from accessing non-US citizens’ personal data well beyond the boundaries established (since 2018) by GDPR – any more effectively than Safe Harbour had. As such, so the argument went, there wasn’t any good reason why the CJEU would think differently this time around.

And that’s how it’s turned out. The Court (decision here) says Privacy Shield is no good.

To be clear: this isn’t farewell to all trans-Atlantic transfers. If they’re necessary, it’s OK – so this doesn’t stop you from logging into Gmail or looking at a US-hosted website. But EU companies sending their people’s or their customers’ personal data to US servers for processing solely in reliance on Privacy Shield will need to stop.

And most, on the whole, don’t rely on it alone. Instead, they rely on mechanisms such as standard contractual clauses, or SCCs (almost nine-tenths of firms according to some research), which use language approved by the European Commission in 2010.

Today’s ruling says SCCs can stay. But it puts an obligation on data controllers to assess whether the contractual language can really provide the necessary protection when set against the prevailing legal landscape in the receiving jurisdiction, and to suspend transfer if it can’t.

And there’s the cat among the pigeons. Hard-pressed and under-resourced data protection authorities may need to start looking more intently at other jurisdictions, so as to set the boundaries of what’s permissible. And data controllers can’t simply point to the SCCs and say: look, we’re covered.

In other words: Stand by, people. This one’s going to get rocky.

(Obligatory Brexit alert…)

Just as a final word: as my friend Felicity McMahon points out, this is really, really not good news for the UK. When the transition (sorry, implementation) period ends on 31 December 2020, and our sundering from the EU is complete, we will need an adequacy decision if EU firms are to keep sending personal data to the UK unhindered. One view is that since the Data Protection Act 2018 in effect onshores the GDPR wholesale, we should be fine. But there are significant undercurrents in the opposite direction, thanks to our membership of the Five Eyes group of intelligence-sharing nations (with the US, Australia, Canada and New Zealand) and the extent to which (under the Investigatory Powers Act 2016) our intelligence services can interfere with communications.

Since this kind of (relatively) unhindered access by the state is what’s just torpedoed Privacy Shield, I’m not feeling terribly comfortable about the UK’s equivalence prospects. I hope I’m wrong. But I fear I’m not.

As news stories used to say: Developing…

Privacy: one step forward, one step back.

A quick hit here to memorialise two privacy-related bits of news: a German court bans Facebook from tracking you elsewhere, but US Republicans try – again – to ban encryption that actually works.

Like many people, I barely use Facebook. And when I do, I only do so when using Incognito (Chrome) or Private Browsing (Safari). It’s annoying logging in each time (albeit less so with 1Password). But it stops Facebook from doing something I viscerally loathe: tracking everything else I do, everywhere else, thanks to tracking code and cookies.
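For the non-technical: here’s a toy Python sketch of the mechanism I’m trying to dodge. All the names are invented, and real adtech is vastly more elaborate – but at heart, a single third-party cookie stitching together visits to unrelated sites is this simple.

```python
# Toy model of cross-site tracking via a third-party cookie.
# All names (tracker, sites, pages) are hypothetical illustrations.

import uuid

class Tracker:
    """Simulates a third-party tracking server embedded on many sites."""
    def __init__(self):
        self.profiles = {}  # cookie id -> list of (site, page) visits

    def serve_pixel(self, cookie, site, page):
        # On first contact, the tracker sets a unique cookie in the browser.
        if cookie is None:
            cookie = str(uuid.uuid4())
            self.profiles[cookie] = []
        # Every later page embedding the tracker sends the same cookie back,
        # so visits on unrelated sites accumulate in one profile.
        self.profiles[cookie].append((site, page))
        return cookie

tracker = Tracker()
cookie = None  # a fresh browser with no tracker cookie yet
for site, page in [("news.example", "/politics"),
                   ("shop.example", "/running-shoes"),
                   ("health.example", "/symptoms")]:
    cookie = tracker.serve_pixel(cookie, site, page)

# One cookie now links browsing across three unrelated sites.
print(tracker.profiles[cookie])
```

Private browsing helps precisely because it throws that cookie away each session, so the profile never accumulates.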

I get that this may make me a paranoid tin-hat type. I’m OK with that. Just like I’m OK with blocking ads which rely on adtech, preventing videos from auto-playing, and generally trying to stop a simple text website from downloading an extra double-digit MB load of data so they can show me ads so intrusive that I never want to go back to the site in question. (I’m fine with ads. I like free stuff, paid for by advertising. But adtech-delivered ads are essentially a conman’s dream. And from a data protection/privacy perspective, I have grave doubts about whether adtech is lawful. So I’m very happy to screw with it.)

Which makes a German court’s decision to reinstate a ruling banning Facebook from combining its own data with that from other sites into so-called “super-profiles” very interesting. The ban was at the behest of Germany’s Cartel Office, and the judge’s ruling (press release in German here) said there wasn’t any serious doubt that Facebook (a) was dominant and (b) had abused that position – particularly by getting information from non-Facebook sources.

The ruling only applies to Germany, of course. But this does seem to be the first time that cross-site tracking and data collection has been seriously set back, which may make things legally hotter for adtech’s widespread consent-less collection of personal data. The dominance question won’t always arise, of course; but the ruling explicitly addresses, in resolutely negative terms, what Techcrunch calls “track-and-target” and what writers like Shoshana Zuboff and many others call surveillance capitalism. It does so by noting that a significant number of Facebook users would prefer not to be tracked and targeted, and that a properly functioning market would allow them that option. It’s hard to see how the same can’t be said of adtech in general.

Less encouraging, and far more predictable, is US Senate Republicans’ move to introduce legislation (the Lawful Access to Encrypted Data Act, or LAED – seriously, these acronyms…) to “end the use of warrant-proof encrypted technology by terrorists and other bad actors”. As almost any even slightly encryption-savvy person will know, this translates to “making encryption stop working securely”. Simply put, if – as this legislation would appear to require – a service provider keeps a key to your comms so it can give it to law enforcement, then end-to-end encryption is done and your comms aren’t secure any more. As Ars Technica puts it, “Encryption doesn’t work that way.” Anyone claiming it does is either ignorant or acting in bad faith. No real middle ground there.
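If the point seems abstract, here’s a toy Python sketch – emphatically not a real cipher, and not anyone’s actual system – of why a provider-held key copy defeats end-to-end encryption: with symmetric encryption, whoever holds the key reads everything.

```python
# Toy illustration (NOT a secure cipher): why a "lawful access" key copy
# defeats end-to-end encryption. Any party holding the key reads everything.

import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic stream of bytes from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the keystream; XORing again reverses it.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# Alice and Bob share a key; in true end-to-end encryption ONLY they hold it.
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"meet at noon")

# A key-escrow mandate means the service provider keeps a copy of the key.
# The provider - or anyone who compels or breaches it - now decrypts at will:
provider_copy_of_key = key
print(decrypt(provider_copy_of_key, ciphertext))  # b'meet at noon'
```

The maths doesn’t distinguish between a court order, a rogue employee or a hacker: once the copy exists, the secrecy doesn’t.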

John Gruber points out that describing the bill as “a balanced solution” as its proponents do because the key would only be handed over with a court order is hogwash. If a key exists, it becomes a target. “That’s how the law works today,” he writes. “What these fools are proposing is to make it illegal to build systems where even the company providing the service doesn’t hold the keys.”

Fools seems like a generous description. It presupposes good faith. I’m not sure I’d go that far.

Trust, trash and privacy notices.

Some time last week – and it may have been up for longer, but I haven’t checked – several people on Twitter started commenting on NHS England’s privacy notice for the Test and Trace programme. And oh sweet Jesus, it’s a fail.

What’s worse, in the current environment, that fail may have deadly consequences.

I don’t want to take too long over the details. Suffice it to say that this is a programme which fails properly to address whom personal data might be shared with; refers to “personally identifiable information”, a concept wholly absent from UK data protection and privacy law; says it will hang onto everything for 20 years; demands the provision of huge amounts of information about other people (OK, only for 5 years, but still); and entrusts it all to several private enterprises with, at best, dubious records on handling other people’s data (including inadvertently leaking the email addresses of 300 trainee tracers). It is not a programme for which the phrase “privacy by design” seems remotely appropriate.

Add in the stories which suggest that the training of those to be working on the Test and Trace programme is appalling in its inadequacy, and the government’s refusal to undertake a data protection impact assessment first, and this is carelessness, bordering on (gross) negligence.

I’m trying to be polite here. You may have noticed.

Because this is deadly serious. Literally so.

Lockdown lifts, partially, tomorrow. Looking at foot traffic on the street, and at pix of a crowded Clapham Common, and hearing from school-age kids how their friends are already acting as if it’s all over – visiting each other’s houses just as they did in February – it is, to all intents, over.

I can’t say how much of that is a reaction to the insouciant arrogance of the Cummings/BoJo double-act re Cummings’ wilful breach of regulations, and his wholly implausible explanation for it. (I describe it as such because, if the other side’s witness gave that kind of explanation in the box, I would happily shut up and let them keep talking, providing gold dust for my closing submissions.)

But this I’m sure of. The trust, which undoubtedly existed in late March and early April, is now gone, “trashed” as one behavioural expert put it – even among many of BoJo’s natural supporters. And without trust, Test and Trace won’t work. The privacy policy might have been acceptable if we trusted the powers that be not to be cavalier with things that matter to us.

But I don’t, not any more, not after they’ve shown us just how little respect they have for those they govern. And I’m certain I’m not alone. Matt Hancock’s “just trust me” approach wasn’t good enough for Harriet Harman, and it isn’t for the rest of us.

This, by the way, shows that Apple and Google were right to take the decentralised, privacy-first approach they did to building exposure notification into their mobile OSes. I don’t hold a brief for either. Both have immense faults (on privacy, Google in particular). But this was the wise approach. Give people control, and put trust in them, have faith in them, to listen to their better angels.
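Very roughly – and this is a deliberately simplified sketch, not the actual Apple/Google cryptography – the decentralised design works like this: phones broadcast short-lived rotating identifiers, and all the matching happens on your own device, never on a central server.

```python
# Greatly simplified sketch of decentralised exposure notification.
# The real Apple/Google protocol uses different cryptography; this only
# illustrates WHERE the matching happens: on the phone, not a server.

import hashlib
import secrets

def rolling_ids(daily_key: bytes, n: int = 4) -> list[str]:
    """Derive short-lived broadcast identifiers from a per-day key."""
    return [hashlib.sha256(daily_key + bytes([i])).hexdigest()[:16]
            for i in range(n)]

# Each phone keeps its own daily keys, plus a local log of IDs heard nearby.
alice_key = secrets.token_bytes(16)
bob_heard = []  # Bob's phone logs identifiers it hears over Bluetooth

for rid in rolling_ids(alice_key):
    bob_heard.append(rid)  # Alice and Bob spent time near each other

# Alice tests positive and consents to publish ONLY her daily keys -
# random bytes that say nothing about where she went or whom she met.
published_keys = [alice_key]

# Matching happens locally on Bob's phone: it re-derives the identifiers
# from the published keys and checks them against its own private log.
exposed = any(rid in set(bob_heard)
              for key in published_keys
              for rid in rolling_ids(key))
print(exposed)  # True
```

The design choice is the point: no central database of contacts ever exists, so there’s nothing for a government – or anyone else – to be cavalier with.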

This is something our government never did. Lockdown was slow because we couldn’t be trusted to obey. Yet we did, overwhelmingly. Until the rules were muddied and it became clear they only applied to the little people.

So where does that leave us? An under-trained Test and Trace workforce, run by private contractors proven to be untrustworthy, collecting data precious to us with minimal genuine controls, ignoring if not deliberately sidelining local authorities who both know their areas and know how to do this, properly, personally and professionally, in favour of a classic mass-outsourcing impersonal “pile it high” approach. Contrary, it won’t surprise you, to contact tracing best practice which has actually worked elsewhere.

On the basis of this, people are to be asked to self-isolate for 14 days with no guarantee of ongoing job or wage protection, by people who clearly don’t think this applies to them. And with lockdown being lifted just now, when our infection and death rates remain far, far higher than other countries who have lifted lockdown, but without masking in any material numbers? You don’t have to be a conspiracy theory-loving leftie to wonder whether the speed is, at least in part, a distraction from the Cummings fiasco.

I didn’t mean to sound angry. But I can’t help it. Like I said: this is deadly serious. I just can’t understand why those running the show don’t seem to be treating it that way. I really, really wish they did.

What’s your data worth?

In allowing Google to appeal against the Court of Appeal’s findings in Lloyd v Google LLC, the Supreme Court holds out the prospect that we’ll conclusively know whether a personal data breach is a loss in itself – or whether a pecuniary loss or some specific distress is required.

In 2019, the Court of Appeal did something special to personal data. It gave it a value in and of itself – such that a loss of control over personal data became an actionable loss in itself. No actual pecuniary loss or distress was necessary. Now the case in question is going to the Supreme Court, and the understandable controversy triggered by the Court of Appeal’s decision may (once the case is heard, late this year or more probably next) finally be resolved one way or the other.

The Court of Appeal’s decision came in Lloyd v Google LLC [2019] EWCA Civ 1599, a case involving one of the highest-profile tech firms in the world, and one of the foremost examples of either (depending on your perspective) finding an inventive solution to another firm’s (in this case Apple’s) unreasonable intransigence, or shamelessly evading that firm’s customers’ privacy protections for one’s own gain. The issue was Google’s use of what was generally termed the “Safari Workaround”, a means of tracking users of websites on which a Google subsidiary had placed ads even if a user’s Apple device was set up to stop this from happening.

Richard Lloyd, a privacy activist, was trying to initiate a class action on an opt-out basis, which could in principle encompass millions of users of Apple’s Safari web browser. This was in itself highly controversial, although I don’t propose to address that side of things. Of more direct interest from a privacy perspective was the claim made on Mr Lloyd’s behalf that the Safari Workaround was actionable in itself: that users didn’t have to prove they’d lost out emotionally or financially, but that the loss of control over their personal data which the Workaround caused was per se a loss sufficient to allow them to sue under the Data Protection Act (the 1998 version, which had been in force at the time).

The High Court had no truck with this argument, Warby J concluding that without some actual loss or distress, section 13(1) of the 1998 Act wasn’t engaged. The Court of Appeal disagreed, with Vos LJ finding (at [44-47]) that a person’s control over their personal data had an intrinsic value such that loss of that control must also have a value if privacy rights (including those arising from article 8 of the European Convention on Human Rights) were to be properly protected.

(The Court of Appeal also reversed Warby J’s findings on the other key point: that the potential claimants all had the same “interest” in the matter, such that a representative action under CPR r.19.6 could proceed. As such, it ruled that it could exercise its discretion to allow Mr Lloyd to serve proceedings on Google even though it was out of the jurisdiction.)

To no-one’s surprise, Google sought permission to appeal the matter to the Supreme Court. The Court has now given permission for the appeal to proceed – not just on the core question of whether loss of control over personal data is actionable in itself, but on the other points on which the Court of Appeal disagreed with Warby J.

The question of where the cut-off point lies in privacy breaches involving loss of control over personal data has been a live one ever since an earlier case involving Google, Vidal-Hall v Google Inc [2015] EWCA Civ 311, which (as Vos LJ put it in Lloyd) had analogous facts but one critical difference: it had been pleaded on the basis of distress caused by a personal data breach, rather than the idea that such a breach was intrinsically harmful and thus actionable in itself. The Court of Appeal in Lloyd went beyond Vidal-Hall in expanding the scope of actionable harm. Now, at last, we may conclusively get to identify the outer borders of that scope. Watch this space.

(Incidentally – I’m rather late to this party. The Supreme Court granted permission in March. I’d intended to write about it a little earlier, but the Bug got in the way. My apologies.)

I’m a data protection and privacy geek. And even I think this time is different.

Opinion klaxon: I’m in the minority who actually like GDPR – while of course acknowledging its faults. But in these strange times, I agree with one of the wisest tech essayists I know. We’ve already given the keys to the data kingdom away to surveillance capitalism. Time to put those keys to use to save lives.

tl;dr: read this. “We Need A Massive Surveillance Program.” Now.

Slightly longer version: I’ve been interested in data protection and privacy for years. I have a (nascent, admittedly) practice in it. (Advertising break: So if you need someone who combines data protection/privacy with investigative, white collar or regulatory problems, I’m your guy!)

And personally, I’m a cheerleader. For goodness’ sake, I’m one of the few people I know who actually applauds GDPR (while of course acknowledging its problems). And that’s even after I had to write corporate procedures to deal with it. And then try to get a sceptical North American audience to buy into them.

Equally, I’m on board with the critique of “surveillance capitalism” by such stars as Shoshana Zuboff and Jaron Lanier. We’ve given away lots, for a lot less return than we think.

So normally the idea of empowering the live tracking of everyone would set my Big Brother muscles twitching horribly. And let’s be clear. It does.

But if there’s anything that South Korea, Singapore and Taiwan have shown us, it’s that the only way through this thing without either killing a horrific number of people or destroying lives through penury (and to be clear, I recognise that we may not be able to evade either of these outcomes) involves testing, tracing, and telling. Testing widely. Tracing where those with the misfortune to be carriers have gone. Then telling: telling carriers so they can quarantine, and telling others so they can steer clear. And the only way to do that is by using our own self-surveillance devices. Our phones.

It’s a terrifying thought, isn’t it? Let ourselves be Tracked. Singled out. Isolated. Potentially ostracised. (This last is really scary, given our proven tendency as a species towards bigoted blame directed to out-groups of all kinds.)

But smarter people than me can’t see a sensible alternative. Even Maciej Cegłowski, who’s got a solid pro-privacy track record and (incidentally) runs Pinboard, the best bookmarking site ever. (Social bookmarking for introverts, as he calls it. Perfect.)

So no more blather. Read what Maciej has to say. Think about the safeguards he (and others) talk about. And also think about how – if we did this, with those safeguards – we’d be setting an example of how the broader surveillance capitalism issues might – just might – be reined in.

Please: read. Then think. That’s all I ask.