S01 E12 – Ecommerce SVP Justin Christianson on How to Listen to Customers & Split Test
Guest: Justin Christianson, SVP Ecommerce, Conversion Fanatics
Host: Andrew Figgins, Founder, AOV Lab
Episode Synopsis:
Justin shares his experience setting up sites to listen to customers, delivering value to those customers through split testing, and delivering value to the business by collecting email addresses.
Transcript:
[00:00:15] Andrew: Hello ecommerce fans, and welcome to 10-Minute Ecom, an AOV Lab podcast. Every episode, we break down a new and different tactic that can help you improve your ecommerce KPIs (key performance indicators). I’m your host, Andrew Figgins, and like most of you, I am an ecommerce professional. You may know me as the founder of AOV Lab, the former VP of Digital Product Innovation at Scrubs & Beyond, or the former director of E-Commerce Technology at Rural King. Or from LinkedIn.
Today I’m excited to be talking to Justin Christianson, an ecommerce colleague who has a long track record of success in ecommerce and conversion rate optimization. But he’s also humble, which I love. Anyway, Justin, what did you come on the show to share today with your ecommerce colleagues?
[00:01:05] Justin: Hey, Justin Christianson here from Fusion 92 and Conversion Fanatics. Excited to be here and share some of the knowledge I have in ecommerce. One thing that I know and I’ve seen work in ecommerce is truly just listening to your customers, following that qualitative data, and supporting it with actionable split tests to really drive your growth and scalability.
[00:01:28] Andrew: Thank you, Justin. And right after the ad, we’ll get right into the chat.
Today’s episode of 10-Minute Ecom is brought to you by Constructor.io. If your website isn’t relying on Constructor’s machine learning algorithms to power your site search, recommendations, and quizzes, you are leaving money on the table. And I know what you’re thinking. The host of this show, Andrew Figgins, not only can he host the show, but he can also read podcast ads that he wrote. Is this man proving right now that he’s a double threat? Not a triple threat, just a double threat. You’re not wondering that. Okay, back to Constructor. I saw up close just how effective Constructor’s technology is when I implemented it at Scrubs & Beyond. It’s also trusted by Sephora, Life is Good, Birkenstock, Ashley, Bonobos, Petco, Backcountry, Plow and Hearth, and even Target Australia.
Go to AOVLab.com and click on Vendor Network to set up your 30-minute discovery call with Constructor today.
And now, back to the show.
[00:02:35] Andrew: Justin, it’s a pleasure to have you on 10-Minute Ecom today. I know you’ve got a long career in ecommerce, so I’m super excited to chat with you. Tell me a little bit about split testing. I think I’ve heard it called A/B testing quite a bit in the past. What’s your approach with a brand new company that you’re just starting to consult with? There needs to be some split testing or A/B testing happening. How do you go about getting it implemented?
[00:02:59] Justin: Most companies don’t actually split test. They don’t actually run A/B tests, and they don’t deploy true optimization strategies to make their marketing decisions. And as a marketer, assumptions kill. You can’t assume how one thing is going to perform, so we validate it with A/B testing.
So the first thing that I do when I jump in is just get a basic read of the analytics. The big reports that I look at are the demographic breakdown, the mobile versus desktop breakdown, which pages visitors are actually going to, and then the shopping behavior of where the drop-off points are. In ecommerce that part is easy: you’ve got your homepage, collections, product, cart, checkout kind of flow, and you just look and see specifically where the drop-off is, and that’ll pinpoint where we need to start our strategy. From there, I’ll jump in and I’ll actually pull some qualitative feedback using heat maps, click maps, and scroll maps. One powerful solution to back that up is exit polling, which most companies don’t do. They just guess, or they see it: oh, so-and-so’s site has this on it, we’re just going to use that. They must have tested it; it must work. When that’s a recipe for disaster.
The whole goal of testing is really just to understand the behaviors of the visitors and how they interact with your site. Where are they falling off? What are they ignoring? What holds weight in their eyes? We prove or disprove our assumptions with A/B testing. So you find out, oh, they really respond well to social proof, or they really respond well to a shipping expectation timer, or heavy visuals, or reviews, or whatever it is for the demographic breakdown that you have, and then you can expand that to a bunch of things with the exit polling. That’s just our way to see the forest through the trees. We ask simple open-ended questions like, hey, what’s holding you back from buying from us today? and just let them vent. They’re going to tell you whether you like it or not. Most people don’t like that outcome or those answers, but it’s very valuable information. It’s usually a pretty big forehead slap once you’re in there and understand where the visitors are at.
A good example of this was a men’s grooming company we worked with, where the number one question was, “I don’t know what scent to choose.” We had the scent guide on the page, we had all of this stuff on there, and all we did was move it up on the page, and it increased mobile conversions by thirty-some percent. It was just a question the visitors had, and we helped solve the problem and it drove the growth.
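The "validate it with an A/B test" step Justin keeps coming back to can be made concrete with a quick significance check. Here is a minimal sketch of a two-proportion z-test; the session and order counts are invented for illustration (they are not figures from the grooming-company test) and this is not a claim about how Conversion Fanatics runs its analysis:

```python
import math

def ab_test_zscore(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate really higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                             # z > 1.96 ~ significant at 95%

# Hypothetical numbers: control converts 400 of 20,000 sessions (2.0%);
# the variant with the scent guide moved up converts 520 of 20,000 (2.6%),
# a ~30% relative lift.
z = ab_test_zscore(400, 20_000, 520, 20_000)
print(f"z = {z:.2f}")  # well above 1.96, so the lift is unlikely to be noise
```

The point of the check is the one Justin makes about assumptions: a lift that looks real on a dashboard only "holds weight" once the sample is large enough that a z-score (or equivalent) clears your significance bar.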
[00:05:24] Andrew: That’s amazing. That’s a great gain from such a small change. Just moving something up a little bit on a page like that.
[00:05:33] Justin: I’ve seen one word swing conversions 80%.
[00:05:36] Andrew: What was that word? I have to know.
[00:05:38] Justin: It was changing “add to bag” to “add to cart.” And it’s about a 50/50 coin toss whether it’s going to go one way or the other. But most people just assume, because the big brands have “bag.” Particularly in home goods, clothing and accessories, or high-end retail, they’ll use “bag,” and most of the time it doesn’t work.
[00:06:03] Andrew: I’m super curious now about that specific A/B test, because I know in my last gig, for about five years, we had “add to bag.” It was one of those things where internally we would often trip over ourselves saying, add to cart. Oh, no, it’s add to bag. But we never actually tested it with customers. Now I’m kicking myself.
[00:06:22] Justin: I’m like, man, the big fish in the pond is Amazon, so you have to pull some ideas or concepts from that known experience. Even though their experience is absolutely terrible, it’s a known experience. You know exactly where you’re going to go, exactly what you’re going to click on, exactly where that fancy orange button is.
And it says “cart,” it doesn’t say bag. So it’s always unique and always fun to see the split test results in there.
[00:06:50] Andrew: How do you tend to generate ideas for tests, Justin? We talked a little bit before we started recording about that tendency brands have, and I’ve definitely been at brands like this: you watch your competitors, see what they’re doing, and maybe they have a good idea or a cool new section of the site. Before you know it, you’re basically designing the same thing and trying to add it to your own site without necessarily validating it with customers first. So how do you come up with those good ideas? Is it really just a matter of trying as much as you can?
[00:07:26] Justin: It’s a little bit of art and science. It goes back to common sense in a lot of ways, too. At this point in my career, I’ve just seen it so much that it’s, okay, I know this thing has a good chance of winning. I’m still proven wrong, but after 20,000-plus split tests, it’s easier to see. But yeah, we pull swipes all the time.
So, ideas or concepts from past split tests. But then I also use a common-sense kind of approach: they’re females, 35 to 45, they’re primarily on mobile devices, they’re falling off, they have a high abandon rate. So I’m just going to go through a user experience flow of what that looks like.
Okay, home page. What’s confusing? Where am I hung up? Do I know exactly what this brand is all about? Do I know exactly what the next step is?
Collections: is it easy to filter and sort? Do I know exactly where things are at?
Product page: does it have all the stuff that’s needed above the fold? Is it confusing, or does it have redundant information? Same thing with cart. I just go through that user journey and then make a list of about the first ten or so ideas. I don’t make a bigger list than that because you never know; it’s never just a matter of going one through 100 on the split test ideas. So I’ll go in and try to cover all five of those key site areas and just incrementally test.
So: hiding stuff, changing stuff, rearranging things on the page to see what matters. Does it matter if we move that collection block up on the home page, right below the fold? Or the bestseller block, that carousel of products, do we move that up? How big is the hero image? Does it actually push things down below the fold of the page, and does that matter? You just test, and the visitor is going to tell you: this raised or lowered the conversion rate, which means it holds weight, or it’s a null and it just really doesn’t matter. Then you’ve got to find something for that spot that actually does matter, and you just wash, rinse, and repeat. From there, we find out they really shop well by categories. So how many different ways can we do categories and collection breakdowns, and how can we lead them down the path to where we want them to go?
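Running the kind of incremental, element-by-element tests Justin walks through requires each visitor to see the same variant on every visit, or the results are meaningless. A common way to get that, sketched here purely as an illustration (the visitor IDs and experiment names are made up, and this is not a description of any particular testing tool), is to hash the visitor ID together with the experiment name:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "treatment")):
    """Deterministically bucket a visitor into a variant.

    Hashing visitor ID + experiment name gives a stable assignment:
    the same visitor always sees the same version of the page, and
    each experiment splits the audience independently of the others.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A visitor is bucketed once and stays there for the whole experiment.
print(assign_variant("visitor-123", "hero-image-size"))
print(assign_variant("visitor-123", "bestseller-carousel-position"))
```

Because the experiment name is part of the hash, the hero-image test and the carousel-position test split the audience independently, so one experiment's buckets don't line up with another's.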
[00:09:32] Andrew: Excellent. And something that came to mind as you were describing, that is, I feel like you’ve landed on a pretty healthy way of looking at it because you’ve got a lot of expertise and experience over 20 years. But you said something interesting, which is sometimes you’re wrong, basically, and you give yourself that room in that space.
Do you find that companies that you’re working with are open to the idea of failing with split testing or not?
[00:10:03] Justin: A lot of them, no. The smaller companies, definitely not. We live in a very growth-hacker kind of world where everybody’s screaming, particularly in ecom: oh, just spin up the store and you’re magically going to be doing $100k a month. And I have a saying that marketers ruin everything, because we typically do. We manage to just completely derail the purpose of a lot of things. But yeah, a lot of companies think they need this 30% month-over-month growth and all of that, where at these bigger companies, if you get 5% a year, they’re like, you’re a god.
So some companies aren’t. They just want those big fixes and those big swings that will magically boost their conversion rate from 2% to 3% or boost their average order value $10. The fact of the matter is, I can do that, and I can get you those quick wins. You’re just not going to like me in six months when the offer doesn’t work anymore. You can’t jeopardize your long-term scale, growth, and sustainability for quick wins with gimmicks, tricks, and tactics. But I’ve gotten to a point where we win almost as much as we lose, where I think the industry-average win rate, a measurable improvement in a split test, is only around 12%.
We have about a 45% win rate because of the way we approach it: we take that incremental approach and then use it as a springboard to the next step. So if you can get good at persistent, consistent experimentation, it’s less about your conversion rate and more about learning, understanding, adapting, and evolving. Same thing we saw during COVID, for example, when the world shut down. All these companies freaked out, but the companies we were working with just leaned into it. We already knew what we needed to do and what the understanding was, and all we did was expand on setting some expectations and a few different things in that regard, and they came out on top. It’s being able to have that adaptability that allows you to pivot and adjust to different market conditions and different cycles as they come about. We’re coming into an election year; same thing. It’s going to be a lot of nonsense for the next year.
How do you cut through the noise and adapt and evolve? The only way to really do that effectively is by understanding your visitors. And the best way to do that, I’ve found, is to either, one, ask them, or two, prove or disprove whether you’re right with an A/B test.
[00:12:36] Andrew: I appreciate you bringing that tactic today, Justin. Unfortunately, we don’t have a whole lot more time to chat, but before we head out, I wanted to ask you if there’s anything else. You’ve got an audience of ecommerce peers and colleagues out there listening to the show today. Is there anything that you’d like to share with them?
[00:12:51] Justin: The biggest thing that I say is just put your ego aside, no matter how much you think you know. Like I said, I’m proven wrong all the time. I’m humbled all the time. I’m always a student and I have been for a very long time, and I learn something new every single day. And don’t be afraid to go out there and take some calculated risks. If you’re supporting it with data, it’s not really a big risk, because either you win or you learn. And that’s the biggest thing you can do to grow your brand or grow your company.
[00:13:20] Andrew: Thanks so much again, Justin. It’s been a pleasure having you on 10-Minute Ecom today.
[00:13:24] Justin: Yeah, thanks for having me.
[00:13:25] Andrew: Well, we have hit that ten-minute mark, so that’s a wrap for today’s episode. I want to again thank our guest, Justin Christianson. If you have a moment, be sure to subscribe, like, or follow the show on Apple Podcasts, Spotify, Amazon Music, Google Podcasts, Deezer, or wherever it is that you listen. If you have a topic suggestion, or if you’re an ecommerce professional who would like to join for an upcoming episode, reach out to humans at AOVLab.com and a human will read and respond to your inquiry. I hope you’re enjoying these first episodes of the show. I will continue to publish two episodes each week this month. And here’s a little surprise: the plan is to produce additional episodes each week throughout 2024. Over 150 episodes next year.
So gear up! Until next time, this is Andrew Figgins saying, have a good one.