WEBVTT

00:00:00.000 --> 00:00:30.228
*When and Why to use Analytics*
Primarily, we're going to be using analytics on existing solutions.
So, if you're talking about *green field* – which is a brand-new solution, hasn't been built and delivered yet – 
versus *brown field* – which is something that's  already running but perhaps we want to improve it –
then we're decidedly on the brown field side.

00:00:30.228 --> 00:01:00.529
So, we're looking at existing solutions because it's only existing solutions that can provide us with the analytics.
If you haven't got an existing solution, you're going to have to use another technique.
And there are obviously many other techniques, but they're not going to provide you with much in the way of *quantitative data*.
We do have early-research methods, which we'll be talking about very briefly as an alternative,
but predominantly we'll be using analytics for existing, deployed solutions.

00:01:00.529 --> 00:01:31.231
Having said that, if you're looking at a rework of an existing site or app,
then looking at current analytics can tell you a lot about what you might like to address
and what questions you might like to raise with your team members, stakeholders, and users.
So, those are important considerations.
Analytics is a good starting point in organizations or teams with low UX maturity
because analytics are easier to sell – to be honest – than qualitative methods.

00:01:31.231 --> 00:02:01.280
If you're new to an organization, or if they're only just getting into user experience,
it can be hard to persuade colleagues that they should be making important decisions on the basis of
six to eight qualitative sessions,
which is typically what we do in the usability lab.
By comparison, you should find web analytics a much easier thing
to persuade people with.
And the other issue particularly relevant to qualitative methods

00:02:01.280 --> 00:02:33.800
is that quantitative methods tend to be very much cheaper – certainly relative to the amount of data:
you often have to talk in terms of hundreds of dollars or pounds per participant
in a *qualitative* study, for various expenses,
whereas a hundred dollars or pounds will get you potentially hundreds or thousands of users.
And, in fact, if you're talking about platforms like Google Analytics which are free,
there is no cost other than the cost of understanding and using

00:02:33.800 --> 00:03:01.591
the statistics that you get out; so, obviously it is very attractive from a cost perspective.
There are some things we'll need to talk about as alternatives to analytics
or indeed *in addition* to analytics:
Analytics can often *highlight* areas that we might need to investigate, and
we would then have to go and consider what alternatives we might use to get to the bottom of that particular problem.

00:03:01.591 --> 00:03:32.951
Obviously, *usability testing* because you'll need to establish *why* users are doing what they're doing.
You can't know from analytics what users' motivations are.
All you can know is that they went to *this* page and then they went to *that* page.
So, the way to find out if it isn't obvious when you look at the pages
– like there's something wrong or broken or the text makes no sense –
is to bring users in and watch them actually doing it,
or even use remote sessions – watching users doing the thing that has

00:03:32.951 --> 00:04:00.978
come up as a big surprise in your analytics data.
A/B testing is another relatively low-cost approach.
It's – again – a *quantitative* one, so we're talking about numbers here.
And A/B testing – or *multivariate testing*, when several elements are varied at once – is also
often performed using Google tools,
but many, many other tools are available as well;
and you show users different designs;

00:04:00.978 --> 00:04:33.664
and you get statistics on how people behaved and how many converted, for example.
And you can then decide "Well, yes, putting that text there
with this picture over here is better than the other way around."
People do get carried away with this, though; you can do this ad nauseam,
to the point where you're starting to change the background color by
minute shades to work out which gets you the best result.
These kinds of results tend to be fairly temporary:
you get a blip and then things just settle down afterwards.

00:04:33.664 --> 00:05:03.008
So, mostly in user experience we're interested in things which actually really
change the user experience rather than getting you temporary blips in the analytics results.
And then, finally, *contextual inquiry* and *early-design testing*:
Contextual inquiry is going out and doing research in the field
– so, with real users doing real things to try to find out how they operate
in this particular problem domain; what's important to them; what frustrations they have;

00:05:03.008 --> 00:05:30.510
how they expect a solution to be able to help them.
And early-design testing – mostly in the web field these days,
though it can also be done with software and mobile apps –
includes approaches like *tree testing*, which simulates a menu hierarchy.
And you don't actually have to do anything other than put your menu hierarchy
into a spreadsheet and upload it – it's as simple as that;
and then give users tasks and see how they get on.

00:05:30.510 --> 00:06:00.080
And you can get some very interesting and useful results from tree testing.
And another early-design testing approach is *first-click testing*.
So, you ask users to do something and you show them a screenshot
– it doesn't have to be of an existing site; it can be just a design that you're considering –
and find out where they click – and whether where they click is helpful to them, or to you.
So, these are examples of early-design testing – things that you can do *before* you start building

00:06:00.080 --> 00:06:34.376
a product to work out what the product should look like or what the general shape or terminology or
concepts in the product should be.
And both of these can be used to find out whether you're on the right track.
I have actually tested solutions for customers where users had no idea
what the proposition was: "What does this site do?"; "What are they actually trying to sell me?"
or "What is the purpose of it?" – and it's a bit late to be finding that out in usability testing towards the end
of a project, I have to say. And that was indeed exactly what happened in this particular example

00:06:34.376 --> 00:07:08.398
I'm thinking of. So, doing some of these things really early on is very important
and, of course, is totally the opposite of
trying to use web analytics, which can only be done once the solution is finished and live.
So, do bear in mind that you do need  some of these approaches to be sure that you're
heading in the right direction *long before* you start building web pages or mobile app screens.
Understand your organization's *goals* for the interactive solution that you're building.

00:07:08.398 --> 00:07:31.246
Make sure that you know what they're trying to get out of it.
Speak to stakeholders – stakeholders are people typically within your organization who have a vested interest in your projects.
So, find out what it's supposed to be doing; find out why they're rebuilding this site
or why this mobile app is being substantially rewritten.
You need to know that; so, don't just jump in and start looking for interesting numbers.

00:07:31.246 --> 00:08:02.663
It's not necessarily going to be that useful.
Do know the solutions; become familiar with them.
Find out how easy it is to use them for the kinds of things
which your stakeholders or others have told you are important.
Understand how the important journeys through the app or website work.
And get familiar with the URLs – that's, I'm afraid, something that
you're going to be seeing a lot of in analytics reports –
the references for the individual pages or screens,

00:08:02.663 --> 00:08:33.834
so that you'll understand, when you actually start looking at reports of user journeys,
what they actually mean – "What do all these URLs mean in my actual product?"
So, you're going to have to do some homework on that front.
You're also going to have to know the users – you need to speak to the users;
find out what they think is good and bad about your solutions;
find out how they think about this problem domain and how it differs from others
and what kind of solutions they know work and what kind of problems they have with typical solutions.

00:08:33.834 --> 00:08:59.920
Also ask stakeholders and colleagues about known issues and aspirations for current solutions.
So, you know, if you're in the process of rebuilding a site or an app,
*why*? Is it just a bit slow?
Is it just the wrong technology? Maybe.
Or are there things which were causing real problems in the previous or current version
that you're hoping to address in the rebuild?