[This is a transcript of Part 2 - Store Conversion on Steroids. Below is a link to the video.]

Introduction


Kausambi: Let's get started. I definitely have a few questions already, and I don't want to miss the opportunity to get to those. So, today is day two, and we are super, super pumped for our 9 Days, 9 Experts livestream series, which we're calling Store Conversions on Steroids. It's all about intense but quick deep dives into what it really means to run a high-converting online store.

And a big part of the process of optimizing conversions is: experiment, experiment, and experiment. What works for me, my industry, my products, and my brand? Everyone's trying to run a bunch of A/B tests, but practically speaking, how do I experiment successfully?

That's pretty important, right? And is A/B testing something that's really right for my stage of brand or my stage of business? We'll dive into all of this with Oliver, but first, a quick 30-second elevator pitch. Oliver, tell us about your work and why it's important.

Oliver: Sure. Hi Kaus! Lovely to be here.

I guess I would say my work is really split along two lines. On one side, I do a lot of work with large, enterprise-scale organizations, retailers, financial services organizations, and so on, that have really invested in optimization and personalization. I help them build and scale their programs, manage their programs better, and ensure that they're really getting good results out of those programs.

The other side of it is working with small to medium-sized e-commerce businesses, where my work is really focused on helping create a strategic blueprint. A lot of my work is creating a diagnostic: looking at all of the research sources, doing qualitative research and quantitative analysis, getting the sort of picture that you can really only see from outside a business, and helping to say, what are the levers that are going to really drive results?

And in those cases, very, very rarely do I recommend A/B testing.

Are Experiments Important? (A/B Tests)

Kausambi: That's really interesting actually, because it's a super popular strategy, right? Everybody's talking about it; it doesn't matter, it could literally be a business at any GMV, and they're saying we need to run A/B tests. And most of the time, their version of an A/B test is: hey, I just did a 50-50 split and put one copy on one segment and another copy on the other segment.

I don't really know much more about it and I'm not taking a calculated call on whether that's right for the business that I'm running and the kind of visitors that I'm seeing in my store. So are these really relevant for all stores? And you just said that you don't recommend it for some kinds of businesses.

Can you talk a little bit more about that?

Oliver: Yeah, absolutely. I mean, the thing about A/B testing is that it's not easy. There's a narrative out there that says it's really easy, and there are a lot of vendors and agencies, and everyone talks about changing some button colors and some copy text and getting double-digit conversion increases.

That narrative, I think, has sobered up a little bit in recent years, but for a long time that was really the case; everybody was talking about it. And I think anyone that was running tests and not seeing those results felt a little bit embarrassed about it, you know, why am I not getting the results everyone's talking about? Because that was really the dominant narrative for a very long time.

And certainly what I see working at enterprise scale is that, yeah, absolutely, it's not easy. It's very hard to run a good optimization program and get good results out of A/B testing, and it gets even harder when you're working with smaller and medium-sized organizations that don't have the enormous resources and that enormous scale.

And there are many, many reasons for that. One of the most pertinent ones, I think, relates to traffic. Any experiment needs to obtain statistical significance, and there are a couple of inputs you need to consider. One is how many visitors are coming to my site; another is what's my baseline conversion rate.

Another is: what is the minimum change that I'm looking to detect? For instance, I'm looking at an A/B test duration calculator here, and I recommend people have a look at these; they can be really useful for assessing how relevant A/B testing is for you. I've got one up here that says, right:

a 3% conversion rate, pretty normal, maybe on the good side; a minimum detectable effect of 5%; the site gets 5,000 visitors a day; and a hundred percent of visitors are in the test. That would take 41 days to run. And the thing you have to overlay on top of that is that most A/B tests don't deliver a statistically significant uplift.

Most A/B tests do not deliver a statistically significant result. - Oliver

Most A/B tests don't deliver a statistically significant result at all. There are benchmarks from organizations with really large, sophisticated testing programs, like Google, Airbnb, booking.com and so on, and they go on record saying that really only about 10% of their experiments deliver a statistically significant uplift.

So in this example I just set up here, with a store that's got 5,000 visitors a day, not a lot, that's 41 days to run that experiment. Maybe you run 10 tests a year; perhaps you might be able to get away with that, and at the end of it you might get a 5% uplift. There's a huge amount of effort that goes into that.
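For readers who want to sanity-check numbers like these themselves, here is a minimal sketch of the arithmetic that sits behind an A/B test duration calculator. The function name, the 80% power, and the two-sided 5% significance level are illustrative assumptions, not taken from the calculator Oliver is describing; different calculators make different assumptions, which is why this more conservative setup estimates a longer run than the 41 days he quotes.

```python
# Rough A/B test duration estimate from baseline conversion rate, minimum
# detectable effect, and daily traffic. Assumptions (not from the transcript):
# two-sided 5% significance, 80% power, an even 50/50 split, and the standard
# two-proportion z-test sample-size formula.
from math import ceil, sqrt
from statistics import NormalDist

def test_duration_days(baseline_rate, relative_mde, daily_visitors,
                       traffic_share=1.0, alpha=0.05, power=0.80):
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_mde)          # e.g. 3% -> 3.15%
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)    # significance threshold
    z_beta = NormalDist().inv_cdf(power)             # power threshold

    # Visitors needed in each variant to detect the lift reliably.
    pooled = (p1 + p2) / 2
    n_per_variant = ((z_alpha * sqrt(2 * pooled * (1 - pooled))
                      + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                     / (p2 - p1) ** 2)

    total_needed = 2 * n_per_variant
    return ceil(total_needed / (daily_visitors * traffic_share))

# Oliver's example: 3% baseline, 5% minimum detectable effect, 5,000 visitors
# a day, all traffic in the test. Under these conservative assumptions this
# prints roughly 84 days; calculators with looser defaults (lower power,
# one-sided tests) report shorter runs.
print(test_duration_days(0.03, 0.05, 5_000))
```

Either way, the takeaway matches Oliver's point: at 5,000 visitors a day, a single test powered to detect a 5% lift ties up the whole site for weeks or months.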

My feeling is ultimately that people can spend their efforts much more wisely and get those results far more efficiently.

How Often Should You Run Tests?

Kausambi: That's pretty interesting. So what do they really do then? When you put it in perspective that you've got to run a single test for 41 days, how many days are you left with to run the rest of your experiments, right?

It sort of makes you step back and realize that it's not going to be fast or iterative enough for me to see the change that I'm expecting to see. So then what options do I really have left?

Oliver: And also, just to put that in perspective: that's a 5% uplift, and in this case the store is doing 5,000 visitors a day.

Depending on average revenue per visitor, maybe 5% is a lot to them, maybe it's not. I've certainly run experiments with really large e-commerce retailers where we've got a 1% uplift, and maybe we've run an experiment on millions of visitors over the course of a month or even longer to be able to reliably detect that 1% uplift, but it's worth it because that means millions and millions of dollars of incremental revenue for those businesses.

In the case of this store that's got 5,000 visitors a day, it might not be that much.

Kausambi: Yeah, absolutely. It depends on the difference, and the baseline itself is only 5,000, so even if we're talking about 5%, that's not really going to be the impact that you were hoping for.

Oliver: So in my work, typically, I start with just a diagnostic audit of analytics, and I think that's a great place to start. I've never, ever encountered a perfect analytics implementation, and in fact almost all of them are completely broken. I don't know if that matches your own experience, but I just see it again and again and again; even the organizations that think their analytics are good, their analytics are not good.

They're typically not comprehensive enough, they're not tracking the right things, or they're just delivering junk data, which is distorted and doesn't give an accurate read on overall business performance. That's a really good place to start, particularly before engaging in A/B testing. Even if you've got the scale, I think it's really important to ensure that your analytics are in a good place first.

Checklist When Assessing Analytics

Kausambi: Yeah. Are there any benchmarks or checklists that folks can use if they get stuck assessing their analytics?

Oliver: Yeah. There are some automated testing tools which are a good place to start. There's one called Verified Data, which I think costs a hundred euros or something; you link up your Google Analytics property and it runs a huge number of tests on it.

It will tell you things like, are you recording a large number of sessions which have no origin, for instance, a dead giveaway, or are you capturing PII? It runs all sorts of automated tests, which will give you a very good sense of the health of your analytics.

And, yeah, I really recommend doing that.

Steps to Optimize Conversions

Kausambi: So let's say I did that, and I checked the health of my dashboards and the data that I'm getting, and I realized that I'm probably in the ballpark: there are some things to fix, but I still really need to optimize conversions. What are the next steps I can take?

I always recommend user research. It's... almost like a superpower! - Oliver

Oliver: One of the things that I almost always recommend, and I think it's just the most powerful thing, almost like a superpower, and it baffles me that more people don't do it, is dead simple: it's user research. People have this idea that user research means, oh, you have to have a fancy lab with two-way glass so you can watch the participants from the other side.

User research can be as simple or as sophisticated as you want it to be. - Oliver

But it can be as simple or as sophisticated as you want it to be. I worked for many years with a large retailer here in Australia, which has a lot of bricks-and-mortar department stores in addition to their online business. Pre-pandemic, I used to often just sit in their outdoor furniture section at a sort of plastic table and chair with my laptop and a stack of gift vouchers and talk to customers and say, 'Hey, can I ask you about how you shop online? What do you find difficult, confusing, or off-putting about this website?'

If we were contemplating changes, I would show them clickable prototypes and so on. It's just the most invaluable way to understand problems that you don't see because you're in it every day, if it's your own business, your own store. You know, there's that old saying: you can't read the label if you're inside the tin.

Start talking to people that use your products - you will find some really surprising things about what you have missed. - Oliver

And I see that again and again, all of these websites have been conceived, designed, built almost entirely independently of the people that are actually going to use them. And when you start talking to the people that use them, you find some really surprising things about what you've missed!

How to Conduct User Interviews

Kausambi: Yeah. But what are the right questions to ask? Because that's also something that can lead people in a particular direction, or it might not.

Oliver: Yeah, absolutely. The questions can really vary depending on what you're trying to achieve. Rather than specific questions, I'd offer advice on the approach that you might want to take and the sort of caveats that you would give.

One of the things that I always say, that I've found is really useful, is just to say: 'You cannot hurt my feelings here. Whatever you say about this website, feedback is a gift. I didn't make this, and even if I did, please just tell me openly and honestly what you think...' - Oliver

Because oftentimes people will be... the thing about user research is, there's a great blog post from one of my former colleagues, a guy called Harry Brignull, who's written a lot of interesting stuff about user research and UX design.

I think the post is called 'The Unique Thing About User Research'. He says: what makes user research unique is that if you're doing it properly, it hurts. It's painful; you're bringing bad news. He says if you run a usability study and it doesn't feel like a kick in the guts, you've done it wrong.

Either the website's perfect, or you've done it wrong.

Kausambi: Definitely. I love that; maybe we can link that out to folks who are viewing it live, or to everybody else later over email. Just a heads up: if anybody's watching live on YouTube or LinkedIn, please drop in a few questions. We already have a couple of questions from some of our audience who couldn't make it today, so I'm going to pick those up.

Question: Is there any instance where qualitative data or insights, i.e. shopper feedback, contradicted the quantitative data?

Oliver: It's an interesting question. I can't think of an example that really comes to hand. I know people do say there is an issue whereby there is typically a gap between what people say they do and what they actually do.

And I've certainly seen that, but I can't think of an example that particularly comes to mind.

Kausambi: The next one, I think we've covered a bit of it, but maybe we can dive into it a little deeper.

Steps to Collect Feedback

Question: What would you suggest is the best method for collecting feedback? I think what you mentioned about asking for an honest opinion on the product and the experience is great, but are there any pointers or a three-step checklist that people should keep in mind when they're speaking to shoppers?

Oliver: Yeah, I don't know that I have a checklist. One thing that took me a while to learn, which, like everything, seems very obvious in retrospect, is that it's really good to talk to people that aren't your customers already. If you've got an existing online store, you've got a database of all of your customers, and it's very easy to email them and say, hey, would anyone like to participate in some research?

But the fact is most stores convert at sub 3%, let's say, so at least 97% of the visitors to your website aren't buying anything. And I think they're the people that have got the really rich insights. If you want to increase conversion, they're the people that you want to talk to. It's great to talk to existing customers as well, but there are just such bountiful insights to be had in talking to the people that didn't convert.

That's something to bear in mind in terms of recruitment. Simply putting a pop-up on your site that says, 'Hey, would you like to talk to us?' and inviting those people whose details you don't have can be a really good way to find the right people to talk to.

Examples of Collecting Feedback

Kausambi: Do you have any examples or anecdotes around this? This is very interesting for me because I think it's applicable no matter whether it's e-commerce or SaaS or whatever else you're doing: talk to people who don't use your product. Any examples, any stories that come to mind?

Oliver: The first time I ever ran a usability research study of any kind was back in 2008.

I was managing a small e-commerce store in Australia and New Zealand that sold magazine subscriptions. It was the online arm of a business that actually still exists, which then only sold magazines and now sells other things as well. And I had just discovered usertesting.com, which still exists, but it's now quite an expensive enterprise service.

Now you have to enter into a contract and give them a lot of money; back then you could just give them something like US$50 per person. You could set a task, which might be: go to my website, find a magazine that you want to purchase, go through the whole process of purchasing, and go all the way to the end,

but don't put in any credit card details. That's a typical task. Back then you could only do it with people in the US; now you can do it with people everywhere. And there are lots of other sites like this; if anyone wants to check these out, I really recommend just looking into unmoderated usability testing.

Anyway, so we ran this test, with five people going through it, and everyone got to the same point in the checkout and said exactly the same thing. We had a line in our checkout which said you will receive the first issue of your magazine in eight to 12 weeks. That's normal for the magazine business.

And the reason for that is a magazine could come out monthly or bimonthly, but let's say it comes out monthly. One issue has just come out, so it's four weeks until the next one; there's processing, there are middlemen, there are slow systems, there's importing. It's just very normal that magazines take that long to arrive.

So we had that on our website, and the reason we had it in the checkout was because we wanted to be transparent with our customers. We thought, that's what Amazon would do, right? They'll tell you exactly what you should expect; that's a good customer experience. But every single person got to that same part of the checkout

and said, are you kidding? This is a joke, eight to 12 weeks? Why? I showed this to my boss, we looked at the videos, and we went, wow, we had no idea; we were just being transparent. And we looked at all of our competitors, who were being supplied by the same companies, because typically the publishers themselves send out the products.

So they had the exact same conditions that we were operating under, and no one else said a thing. So we, very cautiously even then, removed that line from the checkout and said, all right, we're just going to keep an eye on complaints and hopefully it will be okay. And we immediately had a step-change conversion uplift, just by stepping outside of the square and seeing something from our customer's perspective.

Kausambi: Yeah. And that's not something that just your Amplitude or your Google Analytics will give you, right? This is literally how your customers are reacting when they see that statement right in front of them while they're checking out, right?

Oliver: Yeah. And I think, probably, when somebody is right in front of you, or right in front of you in a video, saying this is a joke, and you've got five people saying that, you're a lot less prone to confirmation bias than you might be if you're just looking at the reports in your analytics. Absolutely.

Kausambi: Yeah. Love that example. And there's a third question. Just a heads up, we had a couple of folks joining in; if you have any questions, please drop them in on chat. I'm on the third question in the list right now. So, from your experience, and I think you kind of touched upon it a little bit:

When Should You Start Running Tests?

Question: How many visitors do I really need to get to a place where constant testing starts making sense?

Oliver: Oh, wow, constant testing. Look, I think it'll really depend; it'll change for everyone. It'll depend on how much effort is involved in putting together a test, whether you have a team set up for it, whether constant testing for you means engaging an agency or building an in-house team,

what your AOV is, for instance, and what sort of ROI you expect to get from it. But as a very basic rule of thumb, I typically think I wouldn't really consider investing in optimization until I had at least a couple of hundred thousand visitors every month. That's a good benchmark.
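As a rough cross-check on that rule of thumb, you can reuse the hypothetical test_duration_days sketch from earlier. The 3% baseline and 5% minimum detectable effect are carried over purely for illustration; they are not figures from this answer.

```python
# At roughly 200,000 visitors a month (about 6,600 a day), the earlier
# conservative assumptions still imply about two months per test, so even at
# that scale a properly powered program only fits a handful of sitewide
# experiments into a year.
print(test_duration_days(0.03, 0.05, 200_000 / 30))  # roughly 63 days
```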

Kausambi: We are nearing the stop time. If anybody has any questions once we close the livestream, you can definitely ask them. Otherwise, Oliver, this was super interesting. I loved the anecdote; I'm actually going to take it back and hopefully get a bunch of recordings from our customers too, on how they're reacting when they see flows within the app.

But thank you so much. This was very insightful, and we hope to work on the blog and send it out to our audience. We'd be so excited to work with you; excited to have had you on today's session. Thank you so much.

Oliver: Thank you. Thank you very much. Lovely to be here. Thank you.