
How well are you delivering your experience?



The web has always had fragmentation, though not on the scale we're seeing now with new devices - and that's before we consider hybrid-touch laptops, microscreen smart watches, gesture interfaces or displays the size of a wall. Testing all the user permutations of your application is becoming almost impossible, so how do you go about working out whether you're delivering a good experience or not? In this session, we'll look at the use of responsive-design-oriented analytics, coupled with a few statistical methods, that will help determine how well you're delivering your experiences and highlight the areas you need to focus on next in order to maintain a decent level of coverage.


Page 1: How well are you delivering your experience?

How well are you delivering your experience?

Andrew Fisher @ajfisher!

Web Directions Respond!5 February, 2014

Hi! My name is Andrew Fisher and I'm an Interaction Analyst. Today I'm going to talk about "How well are you delivering your experience?" for the next 15 minutes.

Page 2: How well are you delivering your experience?

Just the way it is

Throughout its history, the web has always suffered from fragmentation. Whether it was due to different browser versions, variations in platforms or, now, device types, it has always been there; it's a side effect of the web's openness. Recently, we've used responsive design as a method of combating device fragmentation.

Image (cc) Jared: http://www.flickr.com/photos/generated/7854132806/

Page 3: How well are you delivering your experience?

Ignoring isn’t an answer

But today I think many of us are fixated on screen sizes. All sorts of fragmentation are appearing that most of us aren't considering right now at all.

Image (cc): macromary http://www.flickr.com/photos/macromary/5124878394

Page 4: How well are you delivering your experience?

Bandwidth fragmentation

For example, in Australia we have massive variations in bandwidth, especially on the mobile network and in rural areas.

Image (cc) United Soybean Board: http://www.flickr.com/photos/unitedsoybean/9629679217

Page 5: How well are you delivering your experience?

Touch is starting to dominate

We also have new input methods to consider. Mouse and keyboard are giving way to touch, but what about voice, gesture and that weird pad thing on the arm of Google Glass?

Page 6: How well are you delivering your experience?

New input devices

And that's before we consider the effects of things like Myo or Fin on interaction affordances.

Image (c) Thalmic Labs

Page 7: How well are you delivering your experience?

No more secrets

So today, I want to show you how to use UX-oriented analytics and analysis techniques to understand how well you're delivering your experience, and where to focus your efforts to deliver a more responsive one.

Image: Sneakers (c) Universal Films

Page 8: How well are you delivering your experience?

What we’ll cover

1. What behaviours are important?
2. Getting meaningful data.
3. How well are we doing?

To do that we're going to look at: What sorts of behaviour are we looking for? How do we get meaningful data? How well are we doing? So let's get going.

Page 9: How well are you delivering your experience?

Understanding behaviour and context.

To start with, let's think about what behaviours we need to look for in our users. When designing an experience, it's all too easy to say it needs to work in every situation, which leads to a mediocre experience for everyone; or we overly constrain our targets, which leads to fragility and a lack of future-proofing.

Page 10: How well are you delivering your experience?

Context matters

Our first step should always be to consider the contexts our users may be in, and work out what we need to analyse from there. Let's consider two experiences that have very different user contexts, so we can see how the behavioural analytics requirements change.

Image (cc): Henry Stradford http://www.flickr.com/photos/henry_stradford/5348910688/sizes/o/

Page 11: How well are you delivering your experience?

Tinkercad

This is a site called Tinkercad. Has anyone used it? It allows you to do 3D modelling in the browser. Now, 3D modelling is a complex application. From a user's perspective we need: 3D acceleration in the browser; a nice big screen is helpful; and we really need a mouse and keyboard to provide a fine level of control over our objects.

Page 12: How well are you delivering your experience?

Measures for Tinkercad

Available screen size
Frames per second
3D acceleration capability
Model downloads
Model shares

When we look at UX analytics, then, we might consider things like the following: the total screen size available to us; the frames per second in the editor, and whether it is changing; whether we have hardware or software acceleration available; and whether people are downloading or sharing their models, which means they are having a positive outcome. So there's a bunch of things we can consider.
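One of those measures, frames per second, is easy to derive from frame timestamps. Here's a minimal sketch of the arithmetic: in the browser the timestamps would come from requestAnimationFrame callbacks, but the calculation itself is kept as a pure function (the function name is my own, not from the talk).

```typescript
// Rolling FPS estimate from frame timestamps (in milliseconds).
// In the browser the timestamps would come from requestAnimationFrame;
// this pure function just does the arithmetic so it can run anywhere.
function fpsFromTimestamps(timestamps: number[]): number {
  if (timestamps.length < 2) return 0;
  const elapsedMs = timestamps[timestamps.length - 1] - timestamps[0];
  const frames = timestamps.length - 1; // count intervals, not samples
  return (frames / elapsedMs) * 1000;
}

// Seven timestamps roughly 16.7 ms apart gives an estimate near 60 fps.
const stamps = [0, 16.7, 33.4, 50.1, 66.8, 83.5, 100.2];
```

Periodically sampling this value, rather than recording every frame, keeps the instrumentation itself cheap.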

Page 13: How well are you delivering your experience?

Guardian

Here's a different example. I'm sure most of you are familiar with the Guardian site, but this could be any news website. Clearly this is very content-oriented: people are reading, following links and sharing stories. It's a very different type of experience from Tinkercad; it's much more transient, and it's heavily focused on the story being viewed.

Page 14: How well are you delivering your experience?

Measures for the Guardian

Story density
Scroll depth
Next action
Parked / focused ratio

So here we want to look at very different behaviours. Story density: how much of the viewport is the story taking up? Scroll depth will tell us how far someone has read. We can consider the next actions people take. And I can look at whether the tab has been parked or not, in order to tell whether people are dipping in and out over a period of time or reading in one sitting.

Page 15: How well are you delivering your experience?

Look as close as you can

These two examples show how different behaviours and contexts drive very different UX analytics requirements. If you're only considering visits, maybe broken down by device, then you need to dig a LOT deeper to get a better understanding of your users' behaviour. So we start with the contexts we are targeting, then derive the behaviours we want to examine in more detail, so we can start assessing how well we're aligned to them.

Image (cc): Roland http://www.flickr.com/photos/roland/3672485590/

Page 16: How well are you delivering your experience?

Getting meaningful data to work with.

Now we can look at how we get meaningful data to work with. Something you may need to consider here: if you're using a standard marketing analytics platform, you may need to supplement or enhance it to cope with proper behavioural data, as few of them do it well out of the box.

Page 17: How well are you delivering your experience?

What we want

{
    "interactionTime": "Tue 4 Feb 2014 12:05:15",
    "ua": "Mozilla/5.0 (Macintosh; … Safari/537.36",
    "deviceClass": "desktop",
    "touchDevice": false,
    "timeInModel": 846142,
    "currentFPS": 64,
    "objectsInModel": 36,
    "facesInModel": 268,
    "action": "Object ADD",
    "acceleration": true
}

So we're looking to store something like this for every single user interaction against a behaviour we're after. You can see IDs, times, metadata about the tech they are using, and the behavioural values we're after too. So how do we get something like this?

Page 18: How well are you delivering your experience?

Tallying is just the start

Let's look at an example. We'll consider page scroll, as it's relatively simple and it gives a sense of how we approach this from a generic view.

Image (cc): with associates http://www.flickr.com/photos/withassociates/4385364607

Page 19: How well are you delivering your experience?

Scroll measures

Total scroll distance
Scroll-back distance
Back to top
Rate of scroll

So with page scroll we're interested in multiple facets. How far does someone go? Do they scroll back at all? Do they scroll back a bit, or all the way back to the top? These are different things. What is the rate of their scrolling? This gives me a bunch of things I can instrument and start writing the code for.

Page 20: How well are you delivering your experience?

Keep it fast

However, I need to be concerned about performance, because I don't want my instrumentation code to be the cause of the experience chugging because I'm tracking so much stuff. I've seen this happen.

Image (cc): Nathan Bittinger http://www.flickr.com/photos/nathanbittinger/4206583265/

Page 21: How well are you delivering your experience?

Capture tradeoffs

Scroll event? Polling on a timer?

So tradeoffs start to happen. I could hook the window.onscroll event, but that's going to fire every time I move a pixel - ouch. Alternatively, I could poll the position, maybe every 100 ms or every 10 seconds, depending on what I'm doing. Maybe I use a mix: polling for forward scroll, the event for backwards.
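The hybrid tradeoff described above can be sketched as a small sampler: record forward movement at most once per interval (polling-style), but record any backward movement immediately, since scroll-backs are a behaviour we specifically care about. The function name, interval and structure are illustrative assumptions, not the talk's actual code.

```typescript
type Sample = { t: number; y: number };

// Decide whether a (timestamp, scrollY) observation is worth recording.
// Forward scroll is rate-limited to one sample per intervalMs; any
// backward scroll is recorded immediately.
function makeScrollSampler(intervalMs: number) {
  let lastRecorded: Sample | null = null;
  return function shouldRecord(t: number, y: number): boolean {
    if (lastRecorded === null) {
      lastRecorded = { t, y };
      return true; // always record the first observation
    }
    const movedBack = y < lastRecorded.y;
    const intervalElapsed = t - lastRecorded.t >= intervalMs;
    if (movedBack || intervalElapsed) {
      lastRecorded = { t, y };
      return true;
    }
    return false; // forward movement inside the interval: skip it
  };
}
```

In the browser you'd call the returned function from the scroll handler and only do work when it returns true, which keeps the hot path cheap.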

Page 22: How well are you delivering your experience?

Keep appending

{ "interactions": [ {…},{…},{…},{…} ] }

At whatever point I'm trapping the data, I just append it to an object, so my event code returns nice and fast and doesn't lock anything up. Periodically I can then post all of this data to the server, which I do as a broadcast - I don't care what the server responds. This works for any variable I can use to assess the experience.
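A minimal sketch of that append-then-broadcast pattern might look like this. The transport is injected so the buffering logic stays testable; in a real page the send function could wrap navigator.sendBeacon, which fits the fire-and-forget intent. The names here are my own, not from the talk.

```typescript
type Interaction = Record<string, unknown>;

// Event handlers append cheaply; a timer (or requestIdleCallback) flushes
// the batch to the capture server via the injected send function.
function makeInteractionBuffer(send: (batch: Interaction[]) => void) {
  let interactions: Interaction[] = [];
  return {
    // Called from event code: just push and return, no I/O on the hot path.
    record(i: Interaction): void {
      interactions.push(i);
    },
    // Called periodically: ship whatever has accumulated and reset.
    flush(): number {
      if (interactions.length === 0) return 0;
      const batch = interactions;
      interactions = [];
      send(batch);
      return batch.length;
    },
  };
}
```

Swapping the array before sending means a slow transport never blocks new records from being appended.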

Page 23: How well are you delivering your experience?

All variables work

← 64°  ← 32°  ← 20°  ← 120°

For example, I could use the device orientation API to determine whether someone is sitting, standing, walking or reclining, which might be important for shaping your experience or determining your user context, such as dealing with some of the reach issues that David touched on.
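As a sketch, the beta angle (front-to-back tilt in degrees, as exposed by the DeviceOrientation API) could be bucketed into a rough posture guess. The thresholds and labels below are assumptions for illustration, not calibrated values from the talk.

```typescript
type Posture = "flat" | "in-hand" | "upright" | "reclining";

// Illustrative posture guess from the device-orientation beta angle.
// Thresholds are assumed, not measured.
function guessPosture(betaDegrees: number): Posture {
  if (betaDegrees < 15) return "flat";     // lying on a table
  if (betaDegrees < 55) return "in-hand";  // typical seated hold
  if (betaDegrees <= 90) return "upright"; // standing or walking
  return "reclining";                      // screen tilted past vertical
}
```

Recorded alongside the other interaction data, a value like this becomes just another variable to cluster and compare.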

Page 24: How well are you delivering your experience?

Data capture

Trigger state capture
Aggregate variables
Broadcast to capture server

So our approach is basically: trigger on the event, dump its value somewhere, and then periodically broadcast it to the server when the browser isn't doing anything.

Page 25: How well are you delivering your experience?

Understand your users

So we look at the behaviour and context we're trying to understand, then we design some methods to capture data relating to them.

Image (cc) Johan Larsson http://www.flickr.com/photos/johanl/5619897608/

Page 26: How well are you delivering your experience?

How good is our experience for different users?

The next step is to work with that data to see how good an experience we are delivering.

Page 27: How well are you delivering your experience?

❤ maths

Here's where we are going to hit some maths, but it's fairly straightforward. The way I approach UX analysis: for each behaviour or metric I'm looking at, I assume I'm actually doing a pretty good job as my baseline. Then I create an experience "crapness" measure to represent divergence from this optimal state.

Image (cc) Zebra squares http://www.flickr.com/photos/zebrasquares/4556045398/sizes/l/

Page 28: How well are you delivering your experience?

Analysis example

Now, I know I said we consider screen size too much, but we'll use it here because it's something you can apply this method to on your own site right now, and it keeps the maths easy. Let's assume you've built a nice responsive site, with three optimisation points in the design for mobile, tablet and desktop type experiences. It's working on all the devices we've tested, which is great!

Page 29: How well are you delivering your experience?

Visits by resolution

We can take the screen resolutions of all the people visiting the site and cluster them according to our known GOOD screen widths. So we end up with a chart like this: lots of people clustered around the peaks of popular devices and our breakpoints. We're not doing too badly. Now we start to look at the divergence from this. But we need to be careful, because not all diverging experiences are as bad as each other.
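The clustering step can be sketched as assigning each observed width to its nearest known-good breakpoint while keeping the signed divergence, so "too narrow" and "too wide" stay distinguishable for the weighting that follows. The breakpoint values below are placeholder assumptions; you'd use your own design's optimisation points.

```typescript
// Assumed mobile / tablet / desktop optimisation points (placeholders).
const BREAKPOINTS = [320, 768, 1200];

// Assign a viewport width to its nearest breakpoint, keeping the signed
// divergence: negative means the screen is narrower than the target.
function nearestBreakpoint(width: number): { breakpoint: number; divergence: number } {
  let best = BREAKPOINTS[0];
  for (const bp of BREAKPOINTS) {
    if (Math.abs(width - bp) < Math.abs(width - best)) best = bp;
  }
  return { breakpoint: best, divergence: width - best };
}
```

Grouping visits by the returned breakpoint reproduces the clustered chart; the divergence field feeds the weighting rule.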

Page 30: How well are you delivering your experience?

Good or bad could be pixels

Take these two. The one on the left is nowhere near as bad as the one on the right. Who hates it when this happens and you can't zoom? Even though the divergence is the same, the user experience is worse when the screen is too small.

Page 31: How well are you delivering your experience?

Weighting experiences

XP = 10 where Ws < Wc
XP = 1 where Ws > Wc

So let's create a rule to account for that: screen widths that are too small are 10x worse than screen widths that are too large for the content area. Now we've got the beginning of a comparative metric we can use for crapness.
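That rule is simple enough to write down directly: the weight from the slide, times how far off we are. Treating an exact fit as zero divergence is my own assumption to complete the sketch; the slide only gives the two unequal cases.

```typescript
// Weight from the rule above: too-narrow screens hurt 10x more than
// too-wide ones. Widths are in CSS pixels.
function experienceWeight(screenWidth: number, contentWidth: number): number {
  if (screenWidth < contentWidth) return 10; // too small: content is cut off
  if (screenWidth > contentWidth) return 1;  // too large: just wasted space
  return 0; // exact fit: no divergence (assumed)
}

// Crapness score for one visit: weight times the size of the divergence.
function crapness(screenWidth: number, contentWidth: number): number {
  return experienceWeight(screenWidth, contentWidth)
    * Math.abs(screenWidth - contentWidth);
}
```

Two visits diverging by the same 80 pixels now score very differently depending on which side of the content width they fall, matching the point about the two screens above.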

Page 32: How well are you delivering your experience?

Weighted divergence

Now, if we look at these weighted scores against the frequency of divergence, we can start to see trouble spots. By creating groups, we can see whether any group is underperforming on key behavioural metrics compared to the others. And so we look at this over time.

Page 33: How well are you delivering your experience?

A new device

And then what happens? Someone launches a new device that sits right between two of our optimisation points. We can see the experience measure is terrible for this group, and as a result our key number for reading content has dropped by 40% compared to all the other groups. OUCH.

image (cc): Janitors http://www.flickr.com/photos/janitors/9072453459/sizes/o/in/photostream/

Page 34: How well are you delivering your experience?

Watch, then act

So now we know who is affected, how bad the experience is, and how many people are affected: we have enough information to act. We can implement some changes to make the experience better, track the results, and sure enough our content metric almost immediately fixes itself.

image (cc) Penguin Cakes http://www.flickr.com/photos/penguincakes/3119527981/sizes/o/in/photostream/

Page 35: How well are you delivering your experience?

Any variable works

Mouse = 1
Touch = 5
Gesture = 20
Voice = 100

We can use this approach for any data point you can think of, so long as we can capture it, including things that are categorical. For example, if your experience assumes a mouse but the person can only use touch, maybe that's a 5x worse experience. If they can only use gesture, maybe it's 20x worse, and if it's voice, maybe it's 100x worse. So any set of variables can be used to measure divergence, letting us assess how well we're delivering the experience.
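Categorical weights like those on the slide just become a lookup table, and averaging the weighted scores per group gives one comparable number. Treating unknown input methods as the baseline weight is my own assumption for the sketch.

```typescript
// Categorical divergence weights from the slide: the experience assumes
// a mouse, and other inputs are weighted by how much worse they make it.
const INPUT_WEIGHTS: Record<string, number> = {
  mouse: 1,
  touch: 5,
  gesture: 20,
  voice: 100,
};

// Mean weighted score across a group of visits; unknown inputs fall back
// to the baseline weight of 1 (assumed).
function meanInputScore(inputs: string[]): number {
  if (inputs.length === 0) return 0;
  const total = inputs.reduce((sum, i) => sum + (INPUT_WEIGHTS[i] ?? 1), 0);
  return total / inputs.length;
}
```

A group full of touch and gesture users will score far above the mouse baseline, flagging it as a segment whose experience diverges from the one you designed.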

Page 36: How well are you delivering your experience?

Summary

Examine behaviours and contexts
Create instrumentation to get data
Use weighted divergence scores to assess where the issues are

In summary, then: we look at the behaviour we are trying to understand and the contexts we are designing for; we design instrumentation to assess what people are doing and get us the data; then we establish known-good benchmarks and use weighted divergence from these to assess where we are having issues.

Page 37: How well are you delivering your experience?

How well are you delivering your experience?

@ajfisher (twitter / github / etc) [email protected]

If you have any questions, including about how to instrument something that looks unmeasurable, then hit me up - here are my details. Hopefully you can see how this approach can be used on your own site or application, and now you should all be able to go and see how well you are delivering your experiences. Thanks.