James Harvey, Head of Design at Yoti

Speaking at Trust & Design #1, 18th July 2017

Transcript: OK, right. Hello everyone, good evening. So yeah, I'm Head of Design UX at Yoti. A few other little things about myself: I particularly love coffee, I love eating out and a really good restaurant, I love live music, I'm very close to my friends and family, and I have a particular interest in trainers and denim shirts. I think that's something all designers just get initiated into when they leave university or graduate.

So I've just painted a little bit of a picture of who I am, apart from the fact that I'm Head of Design UX. I've specifically and freely told you a number of pieces of information about myself; I've freely handed over that information. So, Yoti. Ultimately it's your ID on your phone, and what does that mean in terms of us as a business? Our vision is that we believe anyone should be able to prove who they are simply by being themselves. And the way that we're going to do this is by building the world's trusted identity platform. So a fairly tall order, very ambitious as a business, and not something that anybody else is trying to do at the moment.

And another way that we're going to do this is by being guided by a set of principles. Pretty much every company will have a set of brand principles; ours specifically focus on privacy, transparency and security, while encouraging personal data ownership and keeping our community safe.

So why are we here today? Specifically, I'm going to talk about how you ask people for consent when sharing data. A lot of people might say "Well, why is this a challenge?" We had a bit of an introduction from Harry, luckily, to cover that bit off, but I thought I'd cover a couple of different angles. From our point of view as Yoti, we're a new brand and we're trying to build this identity platform, and with new brands there's often no trust there. People don't know who you are, unless you can say "I'm working with a specific partner or a trusted brand". We've done a lot of research over the last couple of years, and the first thing people want to know is "Where can I use it?", in other words, how is this actually going to solve a real problem for me? But we see this as an opportunity to try and set the path, to introduce some new design patterns, thinking and principles, and essentially lead the way, if we can, on privacy by design.

Now this is a quote about everybody else that's out there, and it follows on quite nicely from that screenshot that Harry showed around Terms and Conditions. I think it's quite apt; it's from a company called MyCustomer. It says "Until now, consent has been primarily about Terms and Conditions rather than explaining a wish to have a valued conversation." And I think that's really interesting personally, because if we can find better patterns to have that conversation with our customers, and they actually understand what it is they're signing up to, because essentially it is a contract, then trust is inherent in that and it can ring true.

But actually, according to the DPA, the Data Protection Act 1998, as the law currently stands, organisations are covered from a data and privacy point of view when somebody signs up for a service under the performance of a contract. So although, yes, you're absolutely right, that pattern is broken, and I completely agree with you on that, essentially, as the law is currently written, it's not essential for companies to explain any more detail than they currently do. So I just thought I would highlight that.

And in terms of some further research that I did, one of the natural first steps is just to look up what consent actually means: permission for something to happen, or agreement to do something. So the patterns that currently exist do do that, if somebody Googles it, but that's not to say that it's right.

Now, at Yoti we have iOS and Android apps that we're developing, and a lot of you will have accounts with Google and Apple already. But if for whatever reason you didn't, and you went there to try and sign up, this is what you're presented with. There is no optionality here, there is no choice. It's selling this dream of "one account is all you need", but they're asking for an incredible amount of information and they're not actually telling you why in any particular detail. If you do want to find out more, you've got this 'Learn More' at the bottom which is almost hidden, so it's almost as if it's a bit of an afterthought; they just want you to get on with it. If you do drill down into the detail a little bit more, you're presented with this screen, and it very vaguely says "we ask for some personal information when you create your account in order to keep it secure". But that was a lot of information that is supposedly keeping my account secure, and it doesn't really go into any detail about how that information's going to be used or how they're storing it. So, like I say, you have to take a bit of a leap of faith if you are signing up to this service.

Apple similarly: this form here doesn't actually look as if any of it is mandatory, but in fact they require all of it to create an account. What they've started doing, though, is these just-in-time notices. I don't really know why they've only just done this one; I don't know whether they're testing it, or whether they're moving towards a slightly more progressive model, potentially? I doubt it. So they say "Date of Birth may be used to help verify your identity". There's a bit of a hint there, but again there's nothing further; you can't go and find out any more information about why they actually need this. And on searching for that, again very similar to Google, it's very much "we need all this stuff in order for the service to work". So they are abiding by the law from a data privacy point of view, but as a user that doesn't really give me any comfort about what's actually happening to my information.

So at Yoti in particular, looking at all the patterns out there that exist, how are we designing with privacy and consent in mind? Now, I mentioned our brand principles earlier on. As a design team we have distilled these into some core values, values that help guide us in decision making and in how we design in terms of a user-centric process. So if we look at integrity: this is one of the first touchpoints at which a user will experience our brand. They might have heard about the app through a recommendation, hopefully other people will start using it in the future, or they might have seen a piece of marketing. So this is all about first impressions, being up front and honest with the people that are going to engage with our brand, and initially engaging in that conversation: clearly spelling out what the product is, what you need in order to sign up, and essentially how it's going to help you, which is why you've come here in the first place. That's the first touchpoint where a user actually has a choice. At the second touchpoint, if they've been recommended the app or the service, there's a good chance that they won't even actually look at this, they'll probably skip straight through. But at every single touchpoint we're trying to give as much information to the user as possible to make that informed choice, being very specific with the language that we're using, so they have the opportunity, like I say, to make that choice. So although they're going to enter into a contract if they want to proceed, it's still about giving them the opportunity, even before that, to make that choice.

So, with simplicity, rather ironically I've decided to talk about Terms and Conditions. I think from our point of view, thinking about transparency as well, what we tried to do is this idea of having a promise up-front, which is essentially a top-level introduction to what our terms and conditions are all about. Because, as you say, the way terms and conditions are written and designed, nobody wants to read them, and they feel like just a prerequisite for signing up to a service. So this is something we introduced a long time ago. Coupled with this idea of transparency, the way the Terms and Conditions were originally presented was that we would encourage the user to scroll all the way through them. To our detriment, that ended up meaning that people actually didn't really trust the service, because they thought either we were trying to highlight bits we didn't want them to see, or they found it a real struggle, and it was almost a bit of an anti-pattern in a way, which is a shame because, like I say, we were trying to be transparent. But familiarity is something that comes across again and again: that's what people trust, familiar patterns and familiar sign-up processes. So we'll always have the promise, that's something that's really important to us and key to our brand. Here it talks a little bit more about analytics, because from user research we felt that was something really important that people wanted to understand up front. And then we're providing clearer links to the privacy policy and terms and conditions.
Maybe not as clear as they could be, maybe they could be much bigger, and I'd definitely be up for discussion about that at some point. But the point is, again, it's trying to create a pattern here whereby people have the option and have that choice at the point they're entering into the contract.

And then, security. The current process for creating your account is essentially to take a photo, add a mobile number, and then add a PIN. Now, the photo is a little bit contentious, and from a lot of testing and research there's a good chance that we might think about changing that a little, or potentially even adding optionality and letting people skip it. But whilst it's still there, and whilst it's a fundamental part of the account creation process, it's important that we explain to users exactly why it matters. And it's important from a security point of view first and foremost, because that's the piece that sets it apart from other applications that just add a PIN. As you can see here, what we've started to do is add a kind of layered privacy notice. One big thing from looking at analytics and research is that people ask "Why do I need to do this now?". So that was the first introduction. It seems very obvious and it's quite a small point, but essentially it's "Why do I need it?", so let's educate them a little on why it's important. And the second bit is that at this point they still might not be sure, and they still might not necessarily have enough information to make that choice. So let's point them to the place in the privacy policy where they have all the information they need to make that choice at that point. Yes, they don't have a choice about whether to proceed within the app, that's unfortunately the way the system has been designed, but like I say, we're trying to inform and educate them along the journey.
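One way to picture the layered notice described above is as three levels of detail per sign-up step: an inline snippet, a fuller explanation on tap, and a deep link into the relevant privacy-policy section. This is a minimal illustrative sketch, not Yoti's actual code; all names and wording here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PrivacyNotice:
    """One step in a sign-up journey, with three layers of detail."""
    step: str            # e.g. "selfie", "mobile number", "PIN"
    snippet: str         # layer 1: one line shown inline ("why do I need it?")
    explanation: str     # layer 2: fuller reasoning, revealed on tap
    policy_anchor: str   # layer 3: link into the relevant privacy-policy section

    def layers(self) -> list:
        """Return the layers in the order a user would drill into them."""
        return [self.snippet, self.explanation, self.policy_anchor]

# Hypothetical notice for the photo step of account creation
selfie_notice = PrivacyNotice(
    step="selfie",
    snippet="Your photo keeps your account secure.",
    explanation="The photo ties the account to you, so a PIN alone is not enough.",
    policy_anchor="privacy-policy#your-photo",
)
```

The point of the structure is that no single layer has to carry everything: the snippet answers "why now?", and the deeper layers are there only if the user wants them.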

And then the second part from the security point of view is adding your mobile number, which isn't considered a personal piece of data at this point in time, although it might be in the future, and then adding a PIN. Going forward we will follow a similar pattern with the mobile number, so it'll have that extra link to inform people about how and where the number's being stored. When it comes to transparency: asking somebody, for a new service they've never heard of before, to add a document, well, there has to be a pretty big motivation. The patterns that you're seeing here are essentially to try and ensure that, again, the user has every piece of information at their disposal in order to make a choice. Now, it could be seen as over-engineering; you could say "Well, maybe there's too much here, maybe they're going to be overwhelmed and they won't do it", but surely that's better than just assuming that they want to crack on and use it. So the very first bit is almost like a snippet, and it says why, within the application, this is actually beneficial. Then, what happens to all the information that I'm adding. And then further, like I say, if you really want to dig into the detail around how we as a company use that information, it's there if you need it. And this pattern follows, not just in terms of adding details or information to the service, but in terms of specific features as well. So when you want to actually start using a feature, we've introduced education screens across the board to encourage this idea that, yes, the user should be fully informed at every point in the process.

And then, when they're actually within the application itself and they want to start sending information to verify somebody else's identity, again, this is all about choice. I have all of the information that I've added to the application at my disposal; I want to choose specific pieces; I have a preview, a second point within the journey where I can check that I'm actually willing to continue; and then a number of different sharing methods. So again, at every point in the journey there is a choice for the user.

And then the final piece of this journey, well, there are many others, but in particular in reference to what we're discussing today: somebody asks me for some information. Say they want to check who they're dealing with, for example if I was selling something on Gumtree and somebody sent me a request to check who I was, or a business, for example, where I was signing up for a new service and they wanted to ask me for some information. We have this very clear choice again, a Yes or a No: am I willing to share that information? Now, the next piece of work that we're hoping to continue is actually that optionality. So as a user at this point, it's not a case of "We need all of this and it's either a yes or a no"; it's "We'd like all of this, and it's up to you which bits you send back", and then, depending on that, you start that conversation that I mentioned earlier on. And probably most importantly, something that I think doesn't really happen at the moment, though it's starting to a little bit, is this idea of a receipt of that transaction. In terms of signing up to our Terms and Conditions, in the future I will have a record within the application to say "Yes, I've signed this contract with this company to say that I want to use it", but then any time that I actually share or verify information with a business or another person, I get a receipt for it.
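The optional-sharing idea above, a request is a wish-list rather than a demand, and every exchange produces a receipt, can be sketched roughly like this. The names and shape of the data are hypothetical illustrations, not Yoti's real protocol.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SharingRequest:
    """A requester asks for attributes; none of them are mandatory."""
    requester: str
    wanted: list  # attributes the requester would *like* to receive

def respond(request: SharingRequest, shared: list) -> dict:
    """The user picks a subset to share; the exchange produces a receipt."""
    # Only attributes that were actually asked for can be shared
    shared = [a for a in shared if a in request.wanted]
    return {
        "requester": request.requester,
        "shared": shared,
        "declined": [a for a in request.wanted if a not in shared],
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# e.g. a Gumtree buyer asks for three attributes; the user withholds one
req = SharingRequest(requester="Gumtree buyer",
                     wanted=["name", "photo", "address"])
receipt = respond(req, shared=["name", "photo"])  # address is declined
```

The receipt records both what was shared and what was declined, which is what would let the app show the user a history of every transaction later.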
So that's something where I can clearly see what has been received. This is just a mock-up, so it might be anything from contact information to an address, something like that; maybe the company's willing to disclose some interesting information about themselves, not just the standard stuff; but then a record of all the details that I've sent as well.

So essentially what this all boils down to is being very specific with our language, and having an informed choice at every step for the user. That's the pattern and guideline that we're trying to follow in everything we're doing. And essentially it's guided by, like I say, the current law. Under GDPR, consent has to be specific, informed, and freely given. There is a bit of a blurred line there at the moment with respect to the Data Protection Act, but we're not just ignoring that, and we're doing everything we can when people sign up for the application. Where it starts to get a little more interesting for us is that, with the implementation of the GDPR, biometrics will be considered sensitive data. So for us, as an application that initially asks you to take a photo, that's going to be a bit of a challenge. We are considering our options, but essentially we're going to have to ask for consent when this law comes into play. And we don't want to make the app any less secure than it already is, but there are things we can do based on some of the guidelines and patterns we're starting to look at, in terms of the layered privacy approach: always giving the user the option to exit at the point of the journey that they're in. 'Skip' can be considered as 'No', as in "no, I'm not interested in doing this right now", and providing them with enough information at the point where we're trying to educate them about that part of the journey, so that they again have an informed choice about whether they want to proceed or not.
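The GDPR framing above, consent must be specific, informed and freely given, and skipping counts as a refusal, suggests a simple shape for a consent record. This is a hypothetical sketch of that idea, not any real implementation.

```python
from datetime import datetime, timezone

def record_consent(purpose: str, informed_via: str, decision: str) -> dict:
    """Record one consent decision; 'skip' is treated the same as 'no'."""
    return {
        "purpose": purpose,            # specific: one purpose per record
        "informed_via": informed_via,  # informed: which notice the user saw
        "granted": decision == "yes",  # anything other than an explicit yes
                                       # (including 'skip') means no consent
        "freely_given": True,          # the user could exit at any point
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# e.g. the user skips the photo step rather than explicitly agreeing
biometric = record_consent(
    purpose="use your photo (biometric data) to secure your account",
    informed_via="privacy-policy#your-photo",
    decision="skip",
)
```

Because consent is per-purpose and timestamped, a record like this could also back the in-app receipts mentioned earlier.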

So, going back to that original problem of how we ask for consent to share data, these are four points that guide us and ensure we're always designing and building with user choice in mind. We're very lucky: it goes without saying that our vision, mission and values are very specific in terms of what we're trying to achieve as a business. But we're also very lucky that we employed a privacy expert about a year ago, and she's essentially been able to guide us on this journey. The last thing we would have wanted is to suddenly be having these conversations right now and not be thinking about the types of patterns we'd want to introduce. And I suppose the last one's probably a bit of a cop-out, in a way, because it's easier said than done, but that's partly why we're here: to talk about how we can offer choice within those journeys.

And last but not least, this quote that I've used in the past is actually quite relevant to what we're talking about today. It essentially asks: can this credibly be represented as being to the user's benefit? And I think it's all very well saying, well, terms and conditions are the way they are at the moment, we can just agree with that and work with it; or we can try and do something about it, and really think about, like I say, when I'm signing up, is it actually going to be to my benefit? So, so yeah. That's it.