90-day re-auth is gone, data access is solved - but what next?

Written by Edward Maslaveckas | Nov 30, 2021 5:01:49 PM

On Monday, the FCA released updated rules around the use of strong customer authentication in the context of Open Banking connections. The upshot is that the requirement for users to re-authenticate with their bank every 90 days is finally a thing of the past. Instead, it will be the responsibility of TPPs like Bud to ensure that users re-confirm consent after 90 days if they want to continue using services that rely on the shared data.

Removing a key barrier to adoption

This is good news. 90-day re-authentication may have been well-intentioned, but it wasn’t fit for purpose. Users shouldn’t share data unwittingly, but the current consent flows are uniquely unintuitive. There is no other product in the world where active users must keep providing consent just to keep using it. The new changes significantly reduce this barrier, and we expect drop-offs at 90 days to fall from the reported 20-40% range to around 5-15% (based on a user study we conducted earlier in the year).

Despite the good news, there are still opportunities to improve the customer experience. We argued at the time that continued use of a service powered by open banking data should itself constitute consent. We still believe that, and we’ll continue to argue for it. I can see how, after 90 days of inactivity, you might argue that continued data sharing is not in the user’s interest, but in reality it’s more complicated than that.

Regulating consent in the context of Open Finance

Financial products exist over a variety of time periods, and it makes sense for consent frameworks to reflect that. Imagine a mortgage provider that wants to help people use windfalls to pay back the loan early: that’s a once- or twice-a-year login. In that context, over a 25-year term, 90-day re-authentication is just pestering. Users should be able to choose the period for which they consent at the point of giving it. Once that’s possible, and I think we’ll get there as Open Finance regulation progresses, we’ll have a consent regime that’s ready to power the next generation of apps and services.

In the meantime, it’s on TPPs to make re-consenting as pain-free as possible. There’s good news in the rule changes on this front too, with the FCA confirming that TPPs can renew multiple consents in a single click. This means users can go through one consent flow and renew consent for all of their connected accounts at once. For frequently used services, it’s now a relatively trivial barrier to overcome and, paired with good UX design, it will help many more ideas get off the ground.

What next?

Like any new concept, Open Banking has taken time to establish itself. Providers have needed time to work out how to add value, and users have needed time to get comfortable with concepts that can feel alien at first, but we’re past the early stages of growth now. Steps like this are about how fast we’ll see mass adoption, not whether it will happen. There’s proof of that in markets where open banking concepts are being adopted and delivered by the private sector without the regulatory kick-start we’ve seen in the UK, and it’s my belief that the private sector will drive the next big jump in the adoption of data-enabled propositions.

Access to transaction data is now a more or less solved problem. At Bud, we’re agnostic about the source of the transaction data we work with: we’re just as happy using clients’ own data as we are using Open Banking data, or data sourced via other aggregators. In fact, most of the billions of transactions that pass through our platform come from first-party sources. With data access becoming a commodity, the next step is to recognise that access to data only matters if you can understand and act on it.
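To make the source-agnostic point concrete, here is a minimal sketch of a source-neutral transaction record that first-party, Open Banking, and aggregator feeds could all be normalised into before any enrichment happens. The field names and the normalise_open_banking helper are hypothetical and purely illustrative; they are not Bud’s actual schema or API.

```python
from dataclasses import dataclass
from datetime import date
from decimal import Decimal

@dataclass
class Transaction:
    """A source-neutral transaction record (hypothetical illustration)."""
    account_id: str    # opaque account identifier
    booked_on: date    # date the transaction was booked
    amount: Decimal    # signed amount, e.g. Decimal("-12.50")
    currency: str      # ISO 4217 code, e.g. "GBP"
    description: str   # raw merchant / narrative string from the source
    source: str        # "first_party", "open_banking" or "aggregator"

def normalise_open_banking(raw: dict) -> Transaction:
    """Map one hypothetical Open Banking-style payload into the common shape."""
    return Transaction(
        account_id=raw["AccountId"],
        booked_on=date.fromisoformat(raw["BookingDateTime"][:10]),
        amount=Decimal(raw["Amount"]["Amount"]),
        currency=raw["Amount"]["Currency"],
        description=raw.get("TransactionInformation", ""),
        source="open_banking",
    )
```

A first-party or aggregator feed would get its own small mapper into the same shape, so everything downstream of ingestion only ever sees one record format, regardless of where the data came from.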

Data intelligence is not a solved problem, not even close. Turning transaction streams into useful insight requires nuanced AI that performs consistently across hugely varied data. We’ve invested heavily in this problem over the last year, and it’s something we’ll be talking a lot more about in 2022. See you then.
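To give a sense of why this is hard, here is a deliberately naive keyword categoriser. It looks plausible on a handful of clean examples, but the long tail of merchant strings, abbreviations, and bank-specific formats is exactly where this approach breaks down. The rules and sample strings are purely illustrative; this is not how Bud’s enrichment works.

```python
# A deliberately naive keyword categoriser (illustration only).
RULES = {
    "groceries": ["tesco", "sainsbury", "aldi"],
    "transport": ["uber", "tfl", "trainline"],
    "subscriptions": ["netflix", "spotify"],
}

def categorise(description: str) -> str:
    """Return the first category whose keyword appears in the description."""
    text = description.lower()
    for category, keywords in RULES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "uncategorised"

if __name__ == "__main__":
    examples = [
        "TESCO STORES 3297 LONDON",      # matches "tesco" -> groceries
        "PAYPAL *NETFLIX",               # matches "netflix", but only because the payment-rail string happens to contain it
        "DD MORTGAGE PAYMENT REF 0042",  # no rule covers it -> uncategorised
    ]
    for desc in examples:
        print(desc, "->", categorise(desc))
```

Keeping rules like this accurate across millions of merchants, dozens of banks’ description formats, and multiple markets is where hand-written logic gives out, and that gap is what the paragraph above is describing.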