2021iv9, Friday: (Not) getting things done.

Has productivity fallen because we’re all doing things we’re just not that good at? And how tech perverts language.

Short thought: Once upon a time, the BBC had a specialist expenses unit in Cardiff. Unsurprisingly, journalists run up some fairly bizarre expenses from time to time, and the unit could handle all of them. Legend was you could send them a photo of a receipt written in Coptic and scratched in the whitewash on the wall of a house, and someone in the team would dredge up a memory of John Simpson having done something similar in the 80s and find a way to process it.

This, of course, cost money. So it was shut down about 20 years ago, outsourced and standardised, and therefore became entirely useless. The result: a lot of stressed reporters spending foolish amounts of time trying to work out how to file expenses instead of, you know, reporting.

This came to mind when reading a piece from Tim Harford earlier this week. Tim (whose praise I’ve sung before) wonders whether productivity has stopped rising because all of us “knowledge workers” are doing loads of stuff we’re not actually that good at, rather than the things we’re in fact meant to apply our finely-honed brains to; and because the organisations we work within tend not to have terribly good systems for managing workflow.

One culprit, of course, is email. I shudder when I see someone’s phone screen with a little red spot saying they’ve got 32,483 unread messages. As someone who – while a long way off Inbox Zero – nonetheless reads, acts upon, defers or deletes everything as soon as it comes in, it makes me physically ill. But I understand how it happens: the sheer amount of stuff hitting this single channel can rapidly become unmanageable. And I don’t think things like Slack are an answer: it’s just yet another inbox, for the most part.

If I’m honest, I think the main problem is that there’s an organisational equivalent of the Dunbar number. The Dunbar number is the idea that the practical optimum number of people in a community (whether physical or virtual) is about 150 – enough people to have a range of views, skills and experiences, but few enough that you can still know everyone. For work, though, I think it’s a lot smaller: maybe a dozen at most. Yet most people who work in anything but the smallest of businesses are blizzarded with information about a range of things that simply aren’t that important to them.

It’s one of the reasons I find the idea of one day going back to a paycheque far less appealing than I thought it would be, before I jumped into this world of self-employed advocacy. These days, I’m part of a Chambers which only just runs into three digits. But usually I’m working in teams which max out at half a dozen: solicitors, client and maybe one other barrister. Yes, I’ve got several such projects going at any one time; but still, my signal-to-noise ratio across all forms of communication is far, far higher than it used to be.

I’m sure some businesses get this right. Information that people need when they need it goes on intranets or elsewhere, rather than in emails. Announcements are corralled into groups, rather than put out willy-nilly. CC and BCC are ruthlessly suppressed. Meetings – and the vast page-counts that have to be read first – are pruned.

It’s just that I’ve never worked in one. One more reason why I’d hate to go back.


Someone is right on the internet: Thinking about the Dunbar number makes me recall something I read by John Naughton a few weeks ago. Back in the days when I was an investigator, it was always great to discover that a target was promiscuous when it came to Facebook friends. (Of course, these days Facebook isn’t anywhere near as useful a tool for social network mapping, but it still has its place – certainly for people over 25 or so.) The reason being, it was often possible to back into someone’s friend lists via one of their other friends, whose privacy settings might well be less rigorously enforced.

Despite being something of a geek, I’ve long loathed this use of the word “friend”, as I do the word “like”. (I got exceptionally pissed off when Twitter dropped stars in favour of likes. I used to star things as a way of denoting them as worth hanging onto. Much of that I didn’t “like” in any meaningful way.)

But John put it better than me:

Much of the Orwellian language that’s endemic in the tech business reminds me of Heidegger’s definition of ‘technology’ as “The art of arranging the world so that you don’t have to experience it.” Just think how Facebook has perverted the word ‘friend’, or how nearly every company has perverted ‘share’. As Sam Goldwyn might have said, in Silicon Valley if you can fake empathy you’ve got it made.

Spot on.


(If you’d like to read more like this, and would prefer it simply landing in your inbox three or so times a week, please go ahead and subscribe at https://remoteaccessbar.substack.com/.)

2021i5, Tuesday: Lifting the shell.

Again, it’s a busy week. So two quick hits: a potential AML game-changer, and catnip for Apple geeks.

Short thought: Those who are calling it “groundbreaking” aren’t wrong. One of the more frustrating loopholes (and when I say “loophole” I mean “something the radius of the Channel Tunnel”) in the global anti-money laundering and counter-fraud architecture has long been the ease with which anyone can set up anonymous shell companies in the US.

Congress’s override of Trump’s veto of the US defence bill in the waning days of last year – he’d blocked it ostensibly because it didn’t have anything tacked on to deal with Section 230, the bit of US statute which gives website publishers, including social media services, some immunity from liability for third-party content – also allowed through the Corporate Transparency Act. This, agreed after years of campaigning, makes it mandatory for anyone registering a new company anywhere in the US to disclose the name, address and date of birth of its beneficial (i.e. human) owners, as well as an ID number such as a driver’s licence or passport; and for existing companies to produce this info within two years.

No time for in-depth analysis, and obviously the proof is in the pudding – and (thinking of the widespread and largely unpunished abuse of Companies House requirements in the UK) in the enforcement. But campaigners and writers aren’t wrong to call this “the most sweeping counter-kleptocracy reforms in decades”. Big news.


Someone is right on the internet: For long-term Macheads like me (I’ve owned precisely two Windows devices – a Surface which I resold and a cheap Windows Phone just so I’d know what it was like – and two Androids, a Nexus 4 and a bargain Nokia I’ve since given to a relative, against several dozen Apple devices between me and the missus), Jason Snell’s series on 20 seminal Macs has been a joy. As he’s explained, it’s not necessarily the best machines, but the most notable. And the utterly deserved winner is the iMac G3, the machine that set Apple on the road from mess to megacorp. The whole series is pure comfort food for Apple nerds. Perfect for a New Year’s Day kickback and relax.

(I never owned an iMac G3, although I had one on loan from a client for a while. My personal fave on the list was also the first Mac I ever bought, the PowerBook Duo. I still have one in the house. Time, perhaps, to see if it still boots. Although if it doesn’t, as we found was the case with the – jawdroppingly-beautiful, even 20 years on – family Pismo the other day, it’ll hurt…)


(Don’t forget – if visiting a site doesn’t float your boat, you can get this stuff in your inbox. Subscribe at https://remoteaccessbar.substack.com/.)

2021i1, Friday: Start over.

A new year. Time to do better.

Short thought: Nothing much today. 2020, by common agreement, sucked. If we work together, and listen humbly to one another, we can make this year better. Let’s.


Someone is right on the internet: Hui Chen, former compliance counsel to the Department of Justice and now Chief Integrity Adviser to the Attorney-General of Hawaii, is an ex-boss and an old friend. She has a post talking about this past year, in which she lost her father – not to the Bug, but to an aneurysm. She writes wisely about what’s gone wrong, and where, even in the dark times, light can be found.

Hui became my boss in July 2014, as she became Standard Chartered Bank’s first ever global head of anti-bribery and corruption. I became her deputy. We hit it off in days. Less than three months later, I lost my dad. And she went from being a good boss to an amazing one. Her kindness and goodness to me – and my family – as we dealt with this were unparalleled, and her encouragement thereafter was a huge part of how I found the guts to take a risk and become a barrister. I can only hope, and pray, that she and her family have been surrounded by equal kindness and goodness as they’ve dealt with their loss.


Things I wrote: A new year is a chance to look back. So I thought it might be amusing to travel in time almost two decades to my reporting days at the BBC, and a piece I found in the archive describing my first experience with ADSL as opposed to dial-up, hooked up so far as I recall to my TiBook. Not sure it’s aged terribly well as a piece of writing, but oddly it brings back fond memories of the Series1 TiVo box I owned around the same time. Those were the days.


(Don’t forget – if visiting a site doesn’t float your boat, you can get this stuff in your inbox. Subscribe at https://remoteaccessbar.substack.com/.)

2020xii30, Wednesday: Soul food.

Without the sustenance of something we do for our souls – even if we’re bad at it – we lose something vital to being human.

Short thought: Everyone needs a hinterland. Something (or indeed somewhere) they can retreat into: as an escape, or for solace, or simply for the sake of sanity. It feeds a soul which can otherwise wither and die.

My soul food? Music. Always has been. I’m a (poor) piano player, helped somewhat over the past year by my 2019 birthday present to myself: a subscription to a wonderful jazz and improvisation teacher called Willie Myette (his site, Jazzedge, has been a haven).

(And I’m getting better. Very slowly. And re-learning the essential lesson: to get good at anything, you have to accept being pretty bad for a while.)

But honestly, I’ve realised it doesn’t matter what you do. Play something. Write. Build. Make. Walk, or run, or ride, whether with music/podcast/audiobook or in blissful silence. Just something that’s not passive consumption or work. Something that can become a habit of self-nurture.

(I’m not including reading in the above. Because – call me an elitist; please, go ahead – I regard reading long-form things, by which I mean anything long enough to have some structure and thought behind it, as something as fundamental as breathing. Not so much soul food as a basic necessity.)

If the past year of strange days that seem to stretch for weeks, and months that have fled by like days, has taught me anything, it’s that without regular intakes of soul food we lose something critical to being human.

So find your sustenance. Treasure it. Be bad at it for as long as it takes. Your soul will thank you.


Nothing wrong with a re-read: I’ve never understood those who say books aren’t worth reading twice. I love a good series, and there’s something special about re-reading the last one (or, for TV, re-watching it) before getting into something new. Adrian Tchaikovsky’s Children of Ruin is on my Kindle maybe-next-up list, and I’ve nearly finished a re-run through its predecessor, Children of Time. It’s awe-inspiring; a consideration of genuinely alien thought and culture in the grand tradition of CJ Cherryh’s Chanur and Foreigner books. (Again with the series…) It doesn’t hurt that its non-human species reminds me of my favourite gaming alien race of all time, Traveller’s Hivers. They were always such fun to play…


Someone is right on the internet: With thanks to Anne Helen Peterson, Anne Applebaum writes about collaborators in The Atlantic. A long read, talking about the US GOP and dealing with the use of strategic and voluminous lies. But worthwhile.


Things I wrote: some time ago I looked at bundling apps for barristers. A good one would be the holy grail. So no surprise there isn’t one. Not yet. I’d favoured one; but now I’m reconsidering. And I’ve committed hard cash too.


(Don’t forget – if visiting a site doesn’t float your boat, you can get this stuff in your inbox. Subscribe at https://remoteaccessbar.substack.com.)

2020xii29, Tuesday: Day One.

Short thought: I can’t say much about Dame Elizabeth Gloster’s report on the FCA’s handling of London Capital & Finance. The FCA is a client. But suffice it to say that the grey area that is currently the perimeter – the boundary between what’s regulated financial services activity and what isn’t – is going to be a point of serious contention for the foreseeable future. Even the FCA acknowledges (perhaps implicitly, but still) that it isn’t always clear what falls on either side of the line; it now publishes an annual perimeter report setting out the current state of play. If I wanted to be optimistic, I’d say that one of the very few dividends of Brexit might be to grasp the nettle and define it more clearly, or at least make a better stab at dealing with those who deliberately blur the distinction. But my optimism only goes so far.


I really need to get round to reading Being Mortal, by Atul Gawande. I was given it several years ago, after my dad died. Haven’t yet been able to read it. (I’m not sure if that’s causative or not.) Gawande is just marvellous; I really owe it to myself to get on with this. And it’s a sign of some basic competence in the incoming Biden administration that he’s been co-opted onto the US’s coronavirus taskforce.


Someone is right on the internet (with apologies to the stone-cold classic XKCD): Ars Technica with the history of the ARM architecture whose use by Apple is now overturning several decades of microprocessor orthodoxy. (I’m using an M1-powered MacBook Pro. It’s revelatory. Honestly.) Recall that this is arguably the most influential and game-changing UK tech company of the past several decades. So when you hear UK government spokespeople bloviating about national champions, or tech mastery, just recall that ARM was bought by Softbank in 2016, and Softbank agreed to sell it to Nvidia earlier this year.


Things I wrote: The Brexit deal is (nearly) done. What’s it mean for legal services, and data protection? Mostly a case of Watch This Space.


(Don’t forget – if visiting a site doesn’t float your boat, you can get this stuff in your inbox. Subscribe at https://remoteaccessbar.substack.com.)

Algorithms, face recognition and rights. (And exams, too.)

The Court of Appeal’s decision to uphold an appeal against South Wales Police’s use of facial recognition software has all kinds of interesting facets. But the interplay between its findings on the equality implications of facial recognition, and the rights we all have under GDPR, may have significant repercussions. Including, possibly, for the A-level/GCSE fiasco.

Most nerd lawyers will, like me, have been fascinated by the Court of Appeal’s decision to uphold the appeal in R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058. The tl;dr version is that the Court said South Wales Police (“SWP”) had acted unlawfully in mining CCTV to scan the faces of thousands of attendees at large public events, comparing them to a “watchlist” of persons of interest using a software tool called “AFR Locate”, and identifying people for further police attention.

It’s worth noting that the Court did not find SWP to have acted wholly improperly. It’s clear from the narrative that SWP made at least some efforts to build safeguards into its procedures and its use of AFR Locate. Nor did the Court find that an activity like this was unlawful per se. However, the Court found that both in whom SWP chose to look for, and in where it did so, its procedures and practice fell short of what would be required to make them lawful. To that extent, Edward Bridges, the appellant, was right.

It goes without saying that for privacy activists and lawyers, this case will be pored over in graphic and lengthy detail by minds better than mine. But one aspect does rather fascinate me – and may, given the tension between commercial interests and human rights, prove a trigger for further investigation.

That aspect is Ground 5 of Mr Bridges’ appeal, in which the Court of Appeal found SWP to have breached the Public Sector Equality Duty (PSED). The PSED, for those who may not be intimately familiar with s149 of the Equality Act 2010 (EqA), requires all public authorities – and other bodies exercising public functions – to have due regard to the need to, among other things, eliminate the conduct the EqA prohibits, such as discrimination, and advance equality of opportunity between people with a protected characteristic (such as race or sex) and those without it. As the Court noted (at []), the duty is an ongoing one, requiring authorities actively, substantively, rigorously and with an open mind, to consider whether what they are doing satisfies the PSED. It’s a duty which applies not so much to outcomes, but to the processes by which those outcomes are achieved.

Bye-bye to black box algorithms?

In the context of the Bridges case, SWP had argued (and the Divisional Court had accepted) that there wasn’t evidence to support an allegation that the proprietary (and therefore undisclosed and uncheckable) algorithm at the heart of AFR Locate was trained on a biased dataset. (For the less nerdy: a commonly-identified concern with algorithms used in criminal justice and elsewhere is that the data used to train an algorithm – to evolve its decision-making to its final state – may have inbuilt bias. For instance, and extremely simplistically, if a facial recognition system is trained on a standard Silicon Valley working population, its training set is likely to include far fewer Black people and quite possibly far fewer women – and the system will thus be far less accurate in distinguishing them. The sketch below shows one way that skew surfaces in practice.)
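To make that concern concrete, here’s a minimal sketch of the kind of check a deploying authority could run: measure a matcher’s false-match rate separately for each demographic group. Everything here is hypothetical – the `matcher` function, the data layout – and mirrors no real system, least of all AFR Locate.

```python
# A minimal sketch, assuming a hypothetical matcher(face_a, face_b) that
# returns a similarity score in [0, 1]. It simply shows how training-set
# skew surfaces as unequal error rates between demographic groups.
from collections import defaultdict

def false_match_rate_by_group(labelled_pairs, matcher, threshold=0.6):
    """labelled_pairs: iterable of (face_a, face_b, same_person, group).
    Returns, per group, the rate at which the matcher wrongly declares
    two *different* people to be the same person."""
    false_matches = defaultdict(int)
    totals = defaultdict(int)
    for face_a, face_b, same_person, group in labelled_pairs:
        if same_person:
            continue  # only different-person pairs can yield false matches
        totals[group] += 1
        if matcher(face_a, face_b) >= threshold:
            false_matches[group] += 1
    return {g: false_matches[g] / totals[g] for g in totals}
```

If those rates diverge sharply between groups, that’s precisely the kind of finding the PSED obliges a public authority to go looking for before deployment – which is hard to do if the vendor won’t open the box.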

The Court of Appeal found this argument wholly unconvincing. The lack of evidence that the algorithm WAS biased wasn’t enough. There was no sign that SWP had even considered the possibility, let alone taken it seriously.

Most interestingly, and potentially of most far-reaching effect, the Court said at [199] that while it may be understandable that the company behind AFR Locate had refused to divulge the details of its algorithm, it “does not enable a public authority to discharge its own, non-delegable, duty under section 149”.

So – unless this can be distinguished – could it be the case that a black-box algorithm, by definition, can’t satisfy the PSED? Or that even an open one can’t, unless the public authority can show it’s looked into, and satisfied itself about, the training data?

If so, this is pretty big news. No algorithms without access. Wow. I have to say the implications of this are sufficiently wide-ranging to make me think I must be misreading or overthinking this. If so, please tell me.

Algorithms and data protection

There’s another key aspect of the lawfulness of algorithm use which SWP, given the design of their system, managed to avoid – but which could play a much bigger role in the ongoing, and shameful, exam fiasco.

GDPR is not fond of purely algorithmic decisions – what it calls at Recital 71 and Article 22 “solely automated processing”. (I’m using algorithm here in its broadest sense, as an automated system of rules applied to a dataset.) This applies with particular force to “profiling”, which Article 4 defines as automated processing which “evaluate[s] certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

In fact, Article 22 prohibits any such decision-making on matters which either affect someone’s legal rights or otherwise “similarly significantly affects” them – unless it is:

  • necessary for entering into or performing a contract between the data subject and the data controller;
  • authorised by EU or (in this case) UK law which incorporates safeguards to protect the data subject’s rights and freedoms; or
  • based on the data subject’s explicit consent.

Unlike a number of other GDPR provisions, no exemptions are allowed.

Similarly, s14 of the 2018 Data Protection Act (“the DPA”) says such processing – even if authorised by law – must allow the data subject to ask for a decision to be made which is not “based solely on automated processing”. And that request must be honoured.

The key word here so far as Bridges is concerned is “solely”. The human agency at the end of SWP’s process, whether inadvertently or by design, takes this out of the realm of A22; so this didn’t form any part of the Court of Appeal’s reasoning, or of the grounds of appeal. Were there no human in the loop, this kind of processing might be in serious trouble, since there’s no contract, certainly no freely-given consent (which can only be given if it’s possible to withdraw it), and I don’t know of any law which explicitly authorises it, let alone building in safeguards. And using facial recognition to target individuals for police attention is a paradigm case of analysing someone’s “personal aspects, including… behaviour, location or movements”.
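For what it’s worth, the “solely” distinction is easy to picture. Here’s a minimal sketch – every name hypothetical, reflecting no real deployment, least of all SWP’s – in which the automated stage only flags, and a person decides.

```python
# A minimal sketch of the Article 22 "solely automated" distinction.
# All names are hypothetical; this reflects no real deployment.
from dataclasses import dataclass

@dataclass
class Match:
    person_id: str
    score: float  # similarity between a live face and a watchlist entry

def flag_candidates(matches, threshold=0.8):
    """Automated step: produces recommendations, decides nothing."""
    return [m for m in matches if m.score >= threshold]

def act_on_flags(candidates, officer_confirms):
    """Human step: an officer reviews each flag, and only confirmed
    matches lead to action. That human agency is what arguably keeps a
    decision from being "based solely on automated processing"."""
    return [m for m in candidates if officer_confirms(m)]
```

Take the second function away, and every decision is the algorithm’s alone – and Article 22 bites.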

So what about exams?

[UPDATE: Unsurprisingly, the JR letters before action are coming out. And one in particular raises points similar to these, alongside others dealing with ultra vires and irrationality. The letter, from Leigh Day, can be found at Foxglove Law’s page for the exam situation.]

But even if A22 wasn’t an issue in Bridges, I suspect that the rapidly-accelerating disaster – no, that implies there’s no agency involved; let’s call it “fiasco” – involving A-levels and no doubt GCSE results will be a different story.

I won’t go into detail of the situation, except to say that an algorithm which marks anyone down from a predicted B/C to a U (a mark which is traditionally believed to denote someone who either doesn’t turn up, or can barely craft a coherent and on-point sentence or two) is an algorithm which is not only grossly unjust, but – given 18 months of pre-lockdown in-school work, even if it isn’t “official” coursework – is likely provably so.

But let’s look at it first through the PSED lens. The Court of Appeal in Bridges says that public authorities using algorithms have a duty to work out whether those algorithms could inherently discriminate. I haven’t read Ofqual’s materials as closely as the lawyers crafting the upcoming JRs, but I’m not at all certain Ofqual can show it’s thought that through properly – particularly where its algorithm seems heavily to privilege small-group results (which are far more likely in private schools) and to disadvantage larger groups (comprehensives and academies in cities and large towns), as the toy sketch below illustrates.
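Here’s that sketch – emphatically not Ofqual’s actual model; every name, threshold and number is invented – of a standardisation rule that trusts teacher assessments for small cohorts but imposes a school’s historical grade distribution on large ones.

```python
# A toy sketch of the small-cohort/large-cohort mechanism described above.
# It is NOT Ofqual's actual model: all names and thresholds are invented.
GRADES = ["U", "E", "D", "C", "B", "A"]  # worst to best

def award(teacher_grades, school_history, small_cohort=15):
    """teacher_grades: predicted grades, strongest candidate first.
    school_history: fraction of the school's past entrants achieving
    each grade, e.g. {"A": 0.1, "B": 0.2, ...}."""
    n = len(teacher_grades)
    if n <= small_cohort:
        # Small class (far likelier at a private school): predictions stand.
        return list(teacher_grades)
    # Large class: impose the school's historical distribution on this
    # year's students, whatever their individual predictions said.
    awarded = []
    for grade in reversed(GRADES):  # hand out the best grades first
        awarded += [grade] * round(school_history.get(grade, 0.0) * n)
    return awarded[:n] + ["U"] * max(0, n - len(awarded))
```

On a rule like that, a strong candidate predicted a B in a large comprehensive cohort gets whatever the tail of the school’s past distribution dictates, while the same candidate in a class of ten keeps the B.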

(I have to acknowledge I haven’t spent any time thinking about other EqA issues. Indirect discrimination is certainly conceivable. I’ll leave that reasoning to other minds.)

Now let’s switch to the GDPR issue. We know from A22 that decisions made solely by automated processing are unlawful unless one of the three conditions applies. I can’t see any legal basis for the processing specific enough to satisfy the A22 requirements – certainly none which sufficiently safeguarded the rights and freedoms of the data subjects – that is, the students at the heart of this injustice. Nor am I aware of any data protection impact assessment that’s been carried out – which, by the way, is another legal obligation under A35 where there’s a “high risk” to individuals – self-evidently the case for students here whose futures have been decided by the algorithm. And the fact that the government has thus far set its face against individual students being able to challenge their grades seems to fly in the face of DPA s14.

One final kicker here, by the way. Recital 71 of the GDPR forms the context in which A22 sits, discussing in further detail the kind of “measures” – that is, systems for processing data – with which A22 deals, and which are only permitted under narrow circumstances. It stresses that any automated measures have to “prevent… discriminatory effects”.

Its final words? “Such measure should not concern a child.”

Watch this space.