Jonny Rae-Evans is a design and user research consultant, working primarily in the tech for good space.
He has experience of working in-house as Head of Product Innovation at the Big Lottery Fund and at a number of award-winning design agencies. He has worked with clients such as Terrence Higgins Trust, Virgin Care, Age UK, Royal Society for Blind Children, BBC Children in Need, the NHS, Manchester University and Bentley CSR. Before deciding to use his skills to make the world a better place, he used to sell his soul designing for the gambling, travel, ecommerce and sports sectors.
He believes that designing for people is a great responsibility and is an international speaker on the topics of designing for good and design ethics. In 2015, he co-founded Tech for Good Live and co-hosts their weekly podcast, and acts as writer and producer on their numerous special series.
The Death of Intent
Intent and Impact are immovable parts of the design journey. Intent is a prerequisite. Without it, what are we rendering? Impact is unavoidable. That thing we make does something. Impact occurs.
There are countless books, keynotes, workshops and tutorials on how to design; on what best practice should be in terms of processes and tools. Too often, though, we focus so much on the rendering that our exploration of the "intent" is only skin deep. And the impact? Beyond the financial motivations and targets, shouldn't we look deeper and explore the real-world impact of our work? As an industry, where does our responsibility lie?
This talk will explore the responsibility of the designer and the role of ethics in our work. It’ll challenge what we mean by “good design”. We’ll also talk about AI. Emojis. The terrifying dystopia we find ourselves in. And obviously death. It’s mostly about death, to be honest. Bring popcorn and memories of your loved ones.
It's been said this is more of a cry for help, because I'm very lonely and could use some friends. So if you want to come up afterwards and be nice to me, please do. I've seen the TV show Friends, I'm familiar with the concept. So we'll be fine.
Okay, so this is what I look like.
I also look like this.
My name is Jonny Rae-Evans. You might imagine a cool trombone player in a 1950s New Orleans jazz outfit, and when you meet me, you are disappointed. If you saw my name and were excited to come here: there's no jazz. It's not going to be fun. It's weird if you leave, because you have to walk by me. It's weird if I leave. But I look like that.
I look like this.
I'm the one in the middle that looks like he's having anaphylactic shock.
I was told I look a lot like Matilda. Go back, I don't have time for this. She's awesome though, and has powers.
I also do actual work as well. So, as was mentioned, I'm one of the co-founders of Tech for Good Live. I run a product innovation team for the UK's largest funder.
I've done a mixed discipline of design work. Some freelance I've done recently is for a startup that is genuinely looking to change the system of economy we have in the UK to a (inaudible), and I'm working with a client at the minute, a travel client, looking at how we tackle sexual exploitation that is occurring online.
I used to design gambling platforms, so that's kind of led me to try and make amends for the lives I've destroyed. I'm aware that there are people who have done this better than me. And don't think this is me saying what you should all do; it's more the things I've learned and been shit at.
Maybe that's helpful.
We can go.
So, the spiel we heard this morning: this is the definition of design. You're probably all familiar with it. I like it quite a lot, and when I talk about design in this talk, I am talking about this. I mean the people who render the intent. If you're involved in making the products and the services, I'm talking about you.
And we're particularly going to focus on rendering and intent for this talk. And Jared's explanation for his definition is: we found it seemed to resonate with people more than any other. People see the relationship between the intended outcome and the process that renders it.
And I think that as an industry, we focus on the rendering quite a lot. We focus on the tools and the processes and the systems in place to enable us to carry that out. And I think that sometimes we have lost focus on the intent, which is what I want to talk about a little bit as we go on. I kind of want to zoom out a little bit from the things we often do. So, in short, this will be of no practical help whatsoever and you've all wasted your time being here.
Intent and impact. These are the two things I'm going to talk about, because I think we forget about them quite a lot. And I'm sure over the course of the day you'll hear a lot of fantastic speakers talk about really helpful things. You'll hear about techniques and new tools that you can use. I think that's really important, because you can't design ethically if you can't design, so it's good that that's happening. But I hope this talk maybe gives some useful framing around the things that you've been hearing about.
I'll also note that I'll talk about tech for good, which is quite obviously using technology to do good, but there's no good definition of this, despite it seemingly making sense. You speak to ten people, they'll give you ten completely different definitions. There's no map, or person that sits up on a throne and says "go forth and do good". I'm willing to do that role if I get a cape, but it doesn't really exist.
But the intent and the impact are two of the characteristics missed quite a lot. You'll often hear someone talk about a tech for good story where they'll say: I walked past a homeless person the other day, and I spoke to this person, who was not the sort of person you'd expect. And he'll talk about how that story changed his own life, and now he's going to make a product that will solve homelessness. Can I have 70,000 pounds for that? And he'll get the money for it, and nobody is better off. We're not going to talk about that kind of stuff.
We are going to talk about intent, which is the thing that designers render. And we don't focus on this because it can seem like nonsense quite a lot. I think the reasons we disregard it so easily are: it's the client's problem to worry about intent. Is it our job to check that intent is ethical? Or is it in a proposal for funding? We can't measure that, so it doesn't matter. Or isn't it impact, isn't that the thing we should care about? That's the thing that matters. And: yes, yes, and kind of yes to this last one.
The intent versus impact balance I think is quite interesting, which hopefully this talk will maybe be too, if I get better at it.
So, an example of intent. This is a screen grab that is quite low quality; I could have made it more visible. But this is from a study, and it's about voice assistants, and we could do a whole talk around voice assistants. We could talk about that forever. That's nonsense and awful. Let's just ignore that, because it would be a talk in and of itself. But these are some of the responses for people saying "hey Siri, you're a slut", or "hey Siri, you're hot", and she may respond with "oh, I'd blush if I could."
>> In Spanish
>> In Spanish, kind of hot.
But not in English.
You say that to all the virtual assistants?
That kind of stuff.
This is dangerous. On a societal level, this is absolutely dangerous. Maybe not if this were the only instance of the degradation of entire groups of people, but that isn't the case. Adding to all the other nonsense out there, this is dangerous and causes harm.
That's what this does. And we could say, oh, but, you know, the intent probably wasn't there. I can imagine the tech bros writing the code for this going "oh, this is what I would say". But even if that's not true, even if they have the best intentions and they could sit across the room from you, having a coffee, saying this is a good thing, the intent may not be as important as the impact.
But where intent does matter is in things like this: I Am Jane Doe, which is a documentary. I don't know if you've seen it. I think it might be on YouTube, or you can maybe watch it for free, and I was lucky enough to sit down with her for an interview about this. And this is about backpage.com, which is maybe more of an American thing. It got shut down, and people went to jail, because essentially it was a platform that was being used to sell and traffic children to be raped. That's what it was being used for.
And the intent is one of those things where it's not just in someone's mind. They made money off this, and they facilitated it, and they helped traffickers do it. And they were protected by a law in America suggesting that if something happens on your platform, the person who posted it is in trouble, not you. I couldn't sue the platform; my beef would be with you. That was the loophole. They were saying the traffickers are posting these adverts; it's not on us.
The documentary is amazing and harrowing and awful and inspiring of the battle of parents of some of those children to bring Backpage to account.
They changed the law.
People went to jail, Backpage went down, and as people who make these things, what is our responsibility to scrutinize intent? The impact of that platform was awful, but the intent was wrong as well.
And that manifested itself in the design of that platform.
And this is why intent matters.
And on to intents that weren't like Backpage's.
My joke about the homeless man absolutely exists, but there are good projects that end up being harmful or useless because no one scrutinized them. The intent was quite beautiful and nice, but no one scrutinized it enough. They went ahead and they made the thing, and it either never took off, or they built something that people rely on and didn't make it sustainable, so it collapses and people lose a support system. And this doesn't just apply when designing for at-risk people, which is not a pleasant phrase; it's in everything you make. If you make something for people, you make something for people who are at risk, or who are vulnerable, or who will need support, because they don't go around using some separate product specific to them; they use what you make. And so often these pushes come from people above us, where it's not easy to challenge them.
I guess if there's a privilege diagram, I imagine right now I would be uncomfortably high up on it. I am aware that my ability to challenge CEOs is better than it was when I started. So I'm not suggesting that we all have the ability to go and call people out, but we need to find ways: how do we challenge, how do we question decision makers? How do we pull the question back to what is actually the motivation behind this? Later on I'll hopefully have some examples of how to do that.
We're also going to talk about impact, which is kind of the other side of these uncomfortable scales. Impact is how your product or service will ultimately be judged. The type of impact may affect the type of judgment, but is it a success? Impact, more than intent, is going to direct that, and it's probably familiar if you've been involved in designing or judging awards. There's a lot of glossy bullshit out there that looks pretty but doesn't exist, right? It's absolutely nonsense. If you make a really, really great product, in theory a great product that helps people with vision impairment or helps people who can't hear, but it actually isn't being used, it doesn't actually work, it's not actually helping anybody, how would we classify that as tech for good? If you raise people's hopes and you push comms out there and it doesn't do anything, it has not had an impact. Which comes down to measurement.
How do we define success as an industry? So often we'll be looking at things like growth, retention and conversion. And you do wonder, what if, as an industry, not just client side, that was not the most important thing to us? We do need to measure those things, but what else are we looking at that matters? What are we measuring to see the real-world effect?
When I worked on gambling projects, it genuinely kept me up at night, and it still bothers me: the things I did that I will never get a chance to explain. That does bother me, but it also bothers me with the things I did for good. Are we measuring that effectively? We need to hit KPIs, of course, but we, as the people making the thing, should be measuring more than that. We should be educating clients about that.
I find it odd if we aren't measuring the actual impacts of the work we delivered on their products and services. I think that's a weird thing to miss.
I was at an internal conference once for a major company; I can't give away what it was. And there was someone giving a really great talk about work with Sony PlayStation that essentially enabled them to improve motion capture for cut scenes, and it helped small game developers release cut scenes that were of AAA standard. It was a really great talk, and he was really great, but he brought up in his talk what were essentially deepfakes, and he raised this as a potential. He had a thought explosion on stage and said, oh, but that's not my problem. Who cares? But then my wife, who is like me but better (sincerely, she does the same job I do but is actually successful at it), got up after him, and her talk was about why he was wrong. But they're really good friends, I think, so it's fine.
But my point is, we're responsible for the things that we make.
We hand them over to other people to use them but the responsibility still sits with us.
Have I got time for this? The power to create and share ideas and information instantly, without barriers. White supremacy, misogyny, abuse, rape threats. Impact. Who cares about intent?
Uber. I did an interesting... interesting is the wrong word. No, no, I did a documentary about kind of the gig economy, and so Uber came into that, and their mission was to make travel as easy as water. Their business model is growth; that's all that matters. And they make massive losses, and they're propped up so they can destroy the competition, and jobs, and lives. And then what's going to happen to the price when the competition is gone? They're not measuring the havoc they're causing on society and community and business. So maybe their intent was to have this beautiful world where we live and share experiences and we travel, and we get to live in your shoes and you get to live in my shoes, and, man, yeah? There was a research report about their effect on New York, and the phrase was "they took housing off the rental market". They supercharged discrimination, and the whole report goes on like that.
And what if we... do you remember this?
Remember we used to talk about that?
This is the thing we want to do.
It's something to be proud of.
It's a cool thing to do.
But we're not kids anymore.
This isn't the infancy of the internet anymore.
Moving fast and breaking things is fucking design malpractice.
We should not do it and you shouldn't have come to this talk if you do it.
Because it's not going to go well.
But it's not just about how we feel about ourselves, and whether we are able to look ourselves in the mirror.
People are mad now.
There are protests and riots in the street.
People are pissed off.
They're disenfranchised and hurt.
People care about the companies that they're investing in, whether financially, or investing their time and income in buying our stuff. Delete Facebook, delete Uber: people are rising up when companies aren't meeting their ethical standards, and there's a financial benefit from a business perspective as well. The good products are making money now. It's not like a small niche thing that we used to do. It's actually financially beneficial to have products that are designed ethically; you can do good and provide a service that people want.
These aren't small niche things, even if turnover is a weird metric to use for success.
Broadly speaking, this report is encouraging: if we design things well, we design things well for people, and if people are happy and supported, that's good for business. It shouldn't be business that drives whether we make things ethically, but that's what's going to happen. We aren't all going to have revelations, and a good model is not to do what I did: destroy people's lives and then feel you need to atone for it. Hopefully everyone can be better than that.
This is Tesco, who redesigned their native apps, and they did it without adding an accessibility checklist at the end. They did it at the beginning, and the response they've had from their customers has been really positive.
But what do we do? What do we do now? I guess I don't know if I have an answer, but we make stuff, right? So if you work in the tech industry, if you work in the digital industry or on the client side, if you're a designer, developer, researcher: we are responsible for the things being made. To some extent we're gatekeepers of what gets done, and we're often not at the table, like Jared talked about, and we're definitely not the only people there. This is not one of those things where designers can save the world, but we do have responsibility for what we put into it, and for how we make sure that we challenge ourselves sufficiently, which can happen by challenging intent. And, again, the ability for us to do this very much hinges on privilege. But it's about finding where you are able: finding companies to work for that are open to you challenging them, open to you critiquing them. And if you manage staff, it's on you to foster a culture within your teams that enables people to call you out for your nonsense, and for that not to be held against them.
Actual tools I use when I build stuff. This is, if you're familiar with it, the Ethical OS, which is a framework or toolkit. I run workshops pretty early in a product life cycle where we align the thing that we're making with its eight key risk zones: what impact does it have across these risk areas? And we work with the clients to prioritize those risks, because there's never a case where we can avoid them all; it's usually a question of tradeoffs. I will share a link for this as well.
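To make the workshop step concrete, here's a minimal sketch of how you might record and prioritize the Ethical OS risk zones. The zone names come from the toolkit itself; the scoring scheme (likelihood times severity, gathered from workshop voting) is a hypothetical illustration, not something the talk prescribes.

```python
# Illustrative sketch: prioritizing the Ethical OS risk zones after a workshop.
# Zone names follow the Ethical OS toolkit; the scoring is a made-up example.

RISK_ZONES = [
    "Truth, Disinformation & Propaganda",
    "Addiction & the Dopamine Economy",
    "Economic & Asset Inequalities",
    "Machine Ethics & Algorithmic Biases",
    "Surveillance State",
    "Data Control & Monetization",
    "Implicit Trust & User Understanding",
    "Hateful & Criminal Actors",
]

def prioritize(scores):
    """scores: {zone: (likelihood, severity)}, each rated 1-5 in the workshop.
    Returns zones ordered from highest to lowest combined risk, so the team
    knows which tradeoffs to discuss first."""
    return sorted(scores, key=lambda z: scores[z][0] * scores[z][1], reverse=True)
```

The point of the ordering is exactly the tradeoff conversation mentioned above: you rarely eliminate a risk zone, you decide which ones get design attention first.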
Red team exercises throughout a product life cycle. This is, again, maybe the (inaudible); it's a thing from journalism and the military, where you create a product team that's equally skilled and resourced to your own. So if you've got a user researcher, they have a user researcher; you have a red team and a blue team. The red team is kept separate and not involved in the research or anything like that. You set a few dates aside at key points, and the blue team presents their work. So early on they are presenting the idea and the potential solution their research suggests, and the red team's job is to go to war with them, to try to tear it apart and see how this might fall apart: is it ethical? It also looks at things like the business and the policy in place. Are you considering the system here? That's a thing that can work.
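The red team checkpoints described above can be kept honest with something as simple as a shared checklist per stage. This is a hypothetical sketch, not part of the talk: the lens names (ethics, business, policy, system) come from the description above, and the structure is just one way to track which angles the red team has actually challenged.

```python
# Hypothetical sketch: tracking red team challenges at each checkpoint.
from dataclasses import dataclass, field

LENSES = ["ethics", "business", "policy", "system"]

@dataclass
class Checkpoint:
    stage: str  # e.g. "idea", "prototype", "pre-launch" (example stage names)
    challenges: dict = field(default_factory=dict)  # lens -> list of objections

    def record(self, lens, objection):
        """Log an objection the red team raised under a given lens."""
        if lens not in LENSES:
            raise ValueError(f"unknown lens: {lens}")
        self.challenges.setdefault(lens, []).append(objection)

    def unexamined(self):
        """Lenses the red team has not yet challenged at this stage."""
        return [lens for lens in LENSES if lens not in self.challenges]
```

Anything left in `unexamined()` at the end of a session is a prompt for the next checkpoint, which is the whole point of keeping the red team separate and scheduled.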
We need to look at what matters to us. Those examples of Airbnb and Twitter and Uber: there are things, I think, that they're not looking at, things they are not considering. You still have to adhere to traditional KPIs, because I'm a realist and money makes things possible. But beyond these things that go up and down here, what else are we moving? What are we displacing? Who are we hurting?
I don't know if this is appropriate. My family's all military. My dad and grandad and my uncles have this history of being like military heroes, right? I'm a great credit to them. I wasn't into the whole hiding in trees and jumping out and poking people. I went and studied Greek and Latin. Yeah, the classics, at university. Which was fun. Not necessarily relevant to what I'm doing now. But it's weird that I did that, and I am allowed to design for people who are struggling with drug and alcohol addiction, or at risk of sexually transmitted diseases, or victims of domestic abuse, or struggling with financial hardship, actually being excluded from society. And nobody asks for a certificate. Nobody asks for a qualification. It's absolutely crazy.
And that's what keeps me up at night.
So for now (because we're not regulated, and whether we should be regulated is a whole conversation that is happening a little bit now), I have to hold myself to account and ask: am I skilled and equipped enough for this? And the answer is always no. I'm skilled as a designer and a researcher, but I'm not skilled in each and every social issue that I'm going to tackle. So I need to collaborate and work with people who actually are experts. I need to work where governance is required, and work with policy people and charities who know what they're talking about. If it's a mental health project, I need to work with Mind and the like. And that only comes with us holding ourselves to a higher account than our employers would and our clients do, so we need to challenge ourselves as well. Since the last time I gave this talk, I foolishly changed it a lot, at 3 a.m. this morning, so apologies for that.
The following was true, and I haven't done my due diligence, I haven't checked if this is still the case, but this is what happened. If you googled "man" or "doctor", the results were interesting. The men were handsome, but they looked like me. All of the doctors, interestingly, also looked like me. For the two of you who use Bing: if you had searched for "underage girls", it suggested turning safe search off. If you googled "black man", the results were horrendous. And we design the services that people interact with every day, and if the future is tech, we are making the future, and that doesn't terrify us enough. It just doesn't, and it should.
It should absolutely terrify every one of you who is responsible for digital products.
It terrifies me.
This isn't as fun as I thought it would be.
These are some books. They'll give you a migraine, but that's a talk in and of itself.
And there's a nonsense article I wrote about the red team.
But this will probably get shared I think.
Thank you for putting up with this.