Designing Software to Minimise Harm Whilst Complying With Legal Obligations

Under GDPR, data controllers are expected to assess the legal basis for their collection and processing of data and declare it in their privacy policies.

The regulations enumerate the various legal bases that data controllers can rely upon:

(a) the data subject has given consent to the processing of his or her 
    personal data for one or more specific purposes;

(b) processing is necessary for the performance of a contract to which
    the data subject is party or in order to take steps at the request 
    of the data subject prior to entering into a contract;

(c) processing is necessary for compliance with a legal obligation to 
    which the controller is subject;

(d) processing is necessary in order to protect the vital interests 
    of the data subject or of another natural person;

(e) processing is necessary for the performance of a task carried 
    out in the public interest or in the exercise of official 
    authority vested in the controller;

(f) processing is necessary for the purposes of the legitimate 
    interests pursued by the controller or by a third party, 
    except where such interests are overridden by the interests 
    or fundamental rights and freedoms of the data subject 
    which require protection of personal data, in particular 
    where the data subject is a child.

In the years since GDPR came into force, there's been a lot of focus on how to properly obtain consent ((a)), as well as when and why Legitimate Interest ((f)) can reasonably be used.

However, (to my knowledge) there's been much less focus on clause (c):

(c) processing is necessary for compliance with a legal obligation to 
    which the controller is subject;

This clause is often taken at face value: the law says I must collect x, so I collect x.

But, it's not always that clear-cut, because the law isn't always specific about what needs to be collected (or how).

In this post I'm going to explore an example that I believe highlights the implications of GDPR on how we design software and processes that need to comply with some form of legal obligation.

As is obligatory for these sorts of posts: I am not a lawyer, I'm just a grumbly git who enjoys thought exercises.


Background

Earlier this year, Ebay deployed a change that requires users to verify both their mobile phone number and an email address in order to be able to log in to their accounts.

It's no longer possible to log in without providing this information (although my understanding is this is in early rollout, so may not yet apply to everyone in the UK), so as a user you're not just prevented from making new purchases, but also from checking order status, correcting your personal details or retrieving details of past purchases.

Once you've verified a phone number, you cannot delete it. You can replace it with another number only if you're able to verify the replacement.


My Objection

At a personal level, I object quite strongly to unnecessary requirements involving my phone number.

Amongst the various methods that we use to communicate, a phone number is quite unique: it's the only medium which consistently acts as an interrupter.

If you're concentrating on something, receiving an email won't generally disturb you (you'll just see it later), but your phone ringing will always break your chain of thought and interrupt what you were doing, even (especially?) if you don't answer the call.

It's an extremely intrusive way to communicate, especially when it's used for marketing purposes.

Companies marketing to phone numbers collected for security purposes is not without precedent, and even where the company acts honestly, a data breach (sorry, Twitter, you again) can lead to your phone being hammered with marketing and phishing messages/calls.

Because I've had issues in the past, I try to strictly control who has my number, routinely entering a false number into forms where it's mandatory but (IMO) unjustified. That, of course, doesn't work where the number needs to be verified.

GDPR grants data subjects various rights, predicated on the principle that we've a right of control over

  1. what information is collected
  2. why it's collected (i.e. what it's used for)
  3. how long it's retained

Ebay's implementation interferes with points 1 & 3: I could not refuse to provide the information, nor could I remove my number from their system once I'd gained access to my account.

It's long been accepted that identifiers such as phone numbers are considered Personal Data under GDPR, so we'll take that as read.


The Legal Obligation

Of course, Ebay haven't implemented this stuff just for the fun of it.

The Financial Conduct Authority (FCA) expects that payment providers implement Strong Customer Authentication (SCA) as defined in the Payment Services Regulations 2017. When I spoke to one of Ebay's advisors, they pointed the finger toward this requirement, so we're going to take their word that this applies to them.

The definition of SCA is, for all intents and purposes, the same as we might use when defining two-factor authentication (2FA, also sometimes known as 2SV or MFA):

“strong customer authentication” means authentication based 
on the use of two or more elements that are independent, 
in that the breach of one element does not compromise the 
reliability of any other element, and designed in such a 
way as to protect the confidentiality of the authentication 
data, with the elements falling into two or more of the 
following categories—

(a) something known only by the payment service user (“knowledge”); 
(b) something held only by the payment service user (“possession”); 
(c) something inherent to the payment service user (“inherence”);

Ebay added the SMS requirement as a way to provide the possession factor, with the account's password providing the knowledge factor. The only real distinction between SCA and 2FA does seem to be the name.
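
To make the parallel concrete, here's a minimal sketch (in Python) of what an SCA-style check boils down to: two independent elements from different categories. The credential store and the possession check are deliberately simplified stand-ins of my own invention, not anyone's real implementation.

import hashlib
import hmac

# Deliberately simplified: a real system would use a proper password
# KDF (bcrypt, argon2, etc.) rather than a bare hash
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

def verify_knowledge(user: str, password: str) -> bool:
    # "knowledge": something known only by the user
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(USERS.get(user, ""), supplied)

def verify_possession(user: str, otp: str) -> bool:
    # "possession": proof of access to something the user holds;
    # stubbed here, but this is where an SMS code, a TOTP value or
    # a U2F assertion would be checked
    return otp == "123456"

def authenticate(user: str, password: str, otp: str) -> bool:
    # SCA/2FA: two or more independent elements from different
    # categories, where breaching one doesn't compromise the other
    return verify_knowledge(user, password) and verify_possession(user, otp)

Note that nothing in the definition cares *how* the possession element is proven, which is exactly why the choice of mechanism is a design decision rather than a legal inevitability.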


Balancing Interests

At the start of this post, I suggested that this case was less clear than you may initially think.

Surely, though, it's an open-and-shut case? Ebay are complying with a legal obligation, so that's their legal basis and my argument fails?

But, it's not that simple in practice: Ebay may be subject to a legal obligation, but they still have a duty to consider and balance the impact of their implementation.

The Information Commissioner's Office (ICO) publishes guidance on use of the Legal Obligation basis, which notes:

Although the processing need not be essential for you to comply with the legal obligation, it must be a reasonable and proportionate way of achieving compliance. You cannot rely on this lawful basis if you have discretion over whether to process the personal data, or if there is another reasonable way to comply.

So, "compliance with a legal obligation" doesn't give carte blanche and as a data controller

  • you must consider whether you can reasonably meet the same obligation in a less impactful manner
  • you must ensure that any impact is proportionate
  • you cannot rely on the compliance legal basis if you could reasonably have achieved compliance without collecting/processing the data at issue

In fact, I would go so far as to say that the assessment requirements are almost identical to those required when relying on the "Legitimate Interests" basis.


Alternative Implementations

The first thing we need to consider is whether Ebay could have implemented their system differently: if there's no other way to implement, then the question of reasonableness is pretty moot.

But, as we noted earlier, the definition of SCA is almost indistinguishable from that of 2FA, which means that there's a huge market of options available.

Taking the three best known standards:

  • TOTP (RFC 6238): time-based one-time passwords, generated by an authenticator app or hardware token
  • HOTP (RFC 4226): counter-based one-time passwords, as used by many hardware tokens
  • U2F (FIDO): challenge-response authentication using a hardware security key

There are, of course, also proprietary options like those offered by RSA (beloved by banks), but these three options provide broad compatibility across user needs and devices.
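
As an illustration of how little a phone number has to do with the possession factor, here's a rough sketch of TOTP (per RFCs 4226 and 6238) using only the Python standard library. A real deployment would use a maintained library, but the mechanism itself never touches SMS:

import base64
import hashlib
import hmac
import struct
import time

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    # RFC 4226: HMAC-SHA1 over a big-endian counter, dynamically truncated
    key = base64.b32decode(secret_b32, casefold=True)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    snippet = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(snippet % (10 ** digits)).zfill(digits)

def totp(secret_b32: str, step: int = 30) -> str:
    # RFC 6238: HOTP with the counter derived from the current time
    return hotp(secret_b32, int(time.time()) // step)

# The secret is shared with the user's authenticator app at enrolment;
# both sides then derive matching codes with no SMS (or phone) involved
print(totp("JBSWY3DPEHPK3PXP"))  # example secret commonly used in documentation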

It seems worth noting, too, that SMS-2FA is considered fundamentally flawed, and the UK Government's National Cyber Security Centre (NCSC) recommends against its use where better options are available:

SMS is not the most secure type of MFA, but still offers 
a huge advantage over not using any MFA. Any multi-factor 
authentication is better than not having it at all. 
However, if there are alternatives available that will 
work for your use case, we recommend you use these 
instead of SMS.

We can confidently say, then, that not only are there other ways that Ebay could have met their obligation, but the advice to companies from NCSC is not to use the very mechanism that Ebay have adopted.


Is It Reasonable?

The test in the guidance, though, isn't just "is there another method?"; we also have to consider whether those methods can reasonably be used.

Reasonableness, as a measure, is fairly subjective and depends on what is being argued for.

It's almost certainly not reasonable (in the context of this discussion) to argue that Ebay cannot use SMS-2FA at all.

It might be reasonable to argue that Ebay need to offer a non-SMS based mechanism in addition to their SMS-2FA offering.

The following observations bolster that argument (a sketch of what supporting multiple mechanisms might look like follows the list):

  • Inclusion of support for another mechanism would mitigate the privacy impact for customers unwilling (or unable) to provide a phone number
  • Supporting additional mechanisms is a reasonable and common means of helping ensure continuity of access (customers could configure both - no signal? use TOTP instead)
  • U2F as a standard supports a wide range of authentication devices, allowing it to better cater to users with accessibility needs.
  • Ebay apparently already support app-based 2FA, just not in the UK
  • Ebay also require a verified email address: email-OTP-2FA is used as an option by some in the industry, so proof of access to a verified email address should be sufficient to meet the obligation (and if it isn't, why was it required and collected?)
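
For illustration, here's a sketch of what offering that choice might look like; the verifier functions are hypothetical stand-ins for real TOTP, U2F, email-OTP and SMS-OTP implementations:

from typing import Callable, Dict, Set

# Hypothetical stubs - in a real system each would wrap an actual verifier
def verify_totp(user: str, response: str) -> bool:
    return False  # authenticator app: no phone number needed

def verify_u2f(user: str, response: str) -> bool:
    return False  # hardware security key: no phone number needed

def verify_email_otp(user: str, response: str) -> bool:
    return False  # one-time code to the already-verified email address

def verify_sms_otp(user: str, response: str) -> bool:
    return False  # still available for those who prefer it

SECOND_FACTORS: Dict[str, Callable[[str, str], bool]] = {
    "totp": verify_totp,
    "u2f": verify_u2f,
    "email": verify_email_otp,
    "sms": verify_sms_otp,
}

def verify_second_factor(user: str, enrolled: Set[str],
                         method: str, response: str) -> bool:
    # Only methods the user chose to enrol are accepted, making SMS
    # (and therefore the phone number) an option rather than a requirement
    if method not in enrolled or method not in SECOND_FACTORS:
        return False
    return SECOND_FACTORS[method](user, response)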

Is It Compliant?

At a stretch, you might even try to argue that SMS-2FA's widely acknowledged drawbacks mean that it is not a reasonable way of achieving compliance.

That, though, is more a matter for the FCA given that it's tantamount to arguing that SMS-2FA doesn't meet the obligation in the first place.

But, the FCA's guidance is also rather helpful:

We expect firms to develop SCA solutions that work for all groups of consumers. This means that you may need to provide several different methods of authentication for your customers. This includes methods that don’t rely on mobile phones, to cater for consumers who don’t have, or don’t want to use, a mobile phone.

This is fairly explicit and significantly strengthens the argument that offering SMS-2FA as the only option is not reasonable for data-protection purposes: it doesn't meet the expectations of the authority to whom the obligation is owed (and therefore shouldn't benefit from the protection of that legal basis).

Whether the compliance basis could be used to defend past non-compliant collection is (as far as I know) unsettled law. But, it feels like a reasonable interpretation to say that it could not be used to defend ongoing non-compliant collection.

The Other Side

By implementing the system, Ebay are arguing that it is both reasonable and proportionate (we can infer this, because if they thought otherwise, they'd have knowingly released an implementation that they believed unlawful).

Obviously I can't speak for Ebay, but it's quite possible that they'd argue reasonableness on the basis that

  • They need a mechanism that all users can use, and SMS is incredibly accessible (but not universal) in that regard
  • They would suffer undue financial burden implementing support for some other mechanism
  • The FCA's guidance may set certain expectations, but their implementation is compliant with the wording of the regulations

Whilst I (obviously) consider the argument against reasonableness to be stronger, there's no firm guarantee that a regulator, or a court, would share this view.


Software and Service Design

Moving on from Ebay for a minute then.

Part of the reason that I found this interesting enough to write about is that it also serves to highlight some of the obligations that we should all be paying heed to when designing software, services and processes.

The legal basis of compliance with a legal obligation seems straightforward and broad, but the principles of Data protection by design and by default still apply.

That means that we cannot just take the simplest/cheapest route to compliance and instead need to consider

  • Do we need to do this?
  • What are the risks involved in the chosen method?
  • What are the impacts on consumers/users?
  • Is there another approach with less impact?
  • If not, is there a way we can mitigate the impact?

If some of this sounds familiar, it's because a similar set of questions is often applied when considering the principles of Defence in Depth.

Like any important design decision, the assessment (and outcome) should be recorded somewhere so that it can be referred back to when making changes to the system.
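
As a sketch of what such a record might contain (the structure and every field name here are illustrative, not a prescribed format), even a simple dataclass kept under version control beats an undocumented decision:

from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class DesignAssessment:
    feature: str
    legal_basis: str
    data_collected: List[str]
    alternatives_considered: List[str]
    mitigations: List[str]
    decided: date

assessment = DesignAssessment(
    feature="Strong Customer Authentication",
    legal_basis="Legal obligation (Payment Services Regulations 2017)",
    data_collected=["phone number (only where the user enrols SMS-OTP)"],
    alternatives_considered=["TOTP", "U2F", "email OTP", "SMS OTP"],
    mitigations=["SMS optional rather than mandatory",
                 "number removable once another method is enrolled"],
    decided=date.today(),
)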

The costs of a data protection issue can potentially be very high, so if the worst does happen, it's important to be able to demonstrate that appropriate consideration was given to the impact of our design choices. Hopefully, that consideration led to choices that mitigated the harm to those directly affected by whatever breach occurred; after all, that's the entire point of balancing exercises.


Phrasing of Legislation

Although outside of the remit of anyone likely to read this post, it's not just software and service developers who need to pay attention to this.

There's no concrete answer in Ebay's case: I disagree with their implementation choices, but that alone does not mean that they are acting unlawfully.

It's also unlikely to ever progress to the point of being settled by a court, at least outside of extreme circumstances like Ebay suffering a breach leading to a regulator asking questions about whether the data they lost was held lawfully in the first place.

The regulator may have an opinion on the matter, but that opinion is meaningless unless they're also willing to enforce it.

Which leaves us without much clarity.

This shows that there is a tangible risk that lack of clarity in legislation might give cover to bad data protection practices under the guise of "compliance with a legal obligation".

For example, during the most recent attempt to implement mandatory Age Verification, concerns were (again) raised about the privacy and data-security implications of the scheme.

A common response to these concerns was that Age Verification Providers are subject to the Data Protection Act (which implements GDPR in the UK) and that there was therefore nothing to be concerned about: providers' activities are already bound by the existing body of law.

Whilst technically true, there's a risk that a vicious circle could develop

  1. Government doesn't specify safeguards in legislation because GDPR/DPA has supremacy
  2. A data controller implements $bad_thing using the legal basis of being "necessary for compliance with a legal obligation" (complying with the legislation in 1.)
  3. Data subjects' rights are adversely impacted with little recourse (because of the hurdles we've demonstrated in defining and showing reasonableness)

In software development, we sometimes refer to keeping things DRY (Don't Repeat Yourself) and it's clear that legislators lean the same way, allowing certain definitions to remain in other legislation.

However, DRY principles do not always work when treated as absolutes: just as excessive abstraction can lead to confusion when maintaining code, abstraction in legislation can lead to confusion amongst those trying to operate within its framework.

Anyone who's refactored a large codebase to remove technical debt knows how time-consuming and costly it can turn out to be. Trying to "refactor" and add missing regulatory controls is no different: it takes someone (usually a regulator, but sometimes an individual like Max Schrems) to be willing to fight through the time and cost of a court case to obtain formal clarification. Until a decision is issued, data subjects' rights often continue to be impacted with no real recourse.

It should be clear then that where certain protections are considered necessary and predictable, they need to be clearly defined in regulation to help prevent misinterpretation (deliberate or otherwise) - have a generalised class by all means, but use more specific definitions where necessary.


Conclusion

I don't agree with Ebay's approach: not only have they chosen one of the worst two-factor authentication mechanisms available, but they've also imposed it as a non-negotiable burden to be carried by their users.

For a software project delivered in 2022 to have SMS-2FA as its only supported mechanism, frankly, is borderline offensive.

Have Ebay taken a reasonable approach at a technical level? I would argue not: there are a multitude of less flawed 2FA solutions out there, many (if not all) of which would meet the definition of Strong Customer Authentication without incurring the privacy and security cost inherent in provision of a phone number.

But, it's possible that it may be reasonable at a legal level.

Factors like cost of implementation are legitimate considerations when balancing design decisions under GDPR, and so could be used in Ebay's favour (though cost is usually not sufficient on its own).

The fact that the implementation isn't in line with the FCA's guidance, however, doesn't really help their case. It implies that Ebay's implementation isn't compliant with the FCA's expectations and so gives rise to an argument that they should not be able to rely on the protection offered by that legal basis.

Realistically, there is no way to say with confidence whether Ebay's implementation is lawful or not: the assessment hinges not just on interpretation of the word "reasonable", but also on how two very different regulators (the ICO and the FCA) would choose to enforce their respective interpretations.

Software Design Principles

What this highlights, though, is the importance of going through the balancing process, whether via discussion or impact assessments.

When designing software, services or features, we really need to have conversations about whether we can lessen the impact of our design.

This is particularly important for things that we feel we're obliged to implement, because it's incredibly easy for a "just get it done" mentality to creep in.

The juxtaposition between Ebay's implementation and the FCA's guidance highlights the importance of ensuring that discussion covers not just the privacy implications but what the expectations around the obligation itself actually are.

None of this should actually feel particularly new, because the principle of "Data protection by design and by default" is easily likened to "code defensively": think about what you're doing, what might go wrong and how to mitigate failure.

Future answers, maybe

I raised this topic with Ebay's Data Protection Office when they enforced the requirement on my account, but have had no response (at least, beyond an automated acknowledgement of receipt). ICO guidance states that Controllers have one month to respond, so I've now escalated my complaint to the ICO.

So, it's possible the ICO might (in the coming months) provide some insight into the interpretation of reasonableness that they apply.