Sarah W Johnson on SaaS Experiments
In this episode, I talk to Sarah W Johnson, a Product Owner and former Growth & Optimization Manager at American Addiction Centers.
We talk about how faster … isn’t always better 🤔
Listen on SoundCloud
… or on YouTube (with subtitles)
- “The really interesting thing about this project is what happened after [we hit our task completion time target]”
- “We weren’t expecting that the speed at which we were presenting results would impact [trust]”
- “We put a fake loading screen in :)”
- “Think about a video game giving you tutorial tips …”
- “It was a total placebo”
- “When our software came in and did [the task] at the push of a button … they didn’t trust it”
- “People want to imagine that the software they’re interacting with … has to think”
- “You’ll see the same kind of approach used in other places”
- “Talking to the users was key”
- (Narrator) Nobody likes to wait. So the faster your app, the happier your customers, right? Today on SaaS Experiments, we’ll hear from Sarah W. Johnson who found the exact opposite to be true.
- Sarah W. Johnson is a product owner and former growth and optimization manager at American Addiction Centers. She’s worked in Conversion Rate Optimization, UX and SEO for companies and clients from e-commerce to restaurants, to insurance. But today she’s here to share a very counterintuitive result from a former SaaS client. Sarah, welcome, great to have you.
- Hi Brian, thank you so much for having me.
- It’s a pleasure. So let’s get right into this. Can you tell us a little bit about the company and the product that you worked on?
- This story comes from a construction software company. They work on material planning, which is resource allocation, buying materials and things like that. And their software helps construction companies do that faster.
- All right. And so the user here then it’s a construction company or a contractor, or tell us more about who needs this stuff.
- So the actual end client for the software is construction companies, architects, engineers, in some cases, anyone that would be planning for what materials they need to bring onto a job site or ahead of construction work being done.
- Okay, what is the olden way of solving this problem? What did people do before they had software to handle this material planning stuff?
- So before the software solutions existed, people would do it manually. Say you've got to build a 10 foot by 20 foot wall: you would figure out how many studs you needed, how many screws you needed, what kind of sheetrock or cladding you needed, what kind of insulation, all of that, and figure it out manually. And then add every wall for the project and every door and all of that. And that would be your material list. It's a lot of manual work.
- So somebody who’s an experienced domain expert who knows construction, sits down and just does this with some graph paper or a spreadsheet or something and plans all this out basically by hand.
- Got it, yeah, that, that sounds hard. And so I guess the importance here is that this work has to be done pretty early in the process for the purpose of budgeting and planning out the entire build?
- Yes. It's generally part of the initial rundown after blueprints are delivered, so that companies can make a proposal for what their costs are, because a huge chunk of construction costs are the materials and then the labor. But if you don't know what you're gonna need to build a building or whatever, you can't really put a decent proposal in.
- Got it. So getting this right can make or break a project entirely or can make or break the actual profitability of the project?
- So this company has produced a software solution for this material planning, which is crucial for budgeting and approval of a project. I get why it’s important. Tell me more about how the product actually made money. How do they charge their users? How did that work?
- So the original model was buy a license and you could use it from here to forever. But they moved to a subscription model during this process. So we were making that adjustment as well to being able to add renewals and things like that within there. So they made money by selling the software to different construction companies and planning firms and things like that.
- Okay, got it, and so it’s a monthly subscription?
- It’s a monthly, quarterly or yearly.
- And so what was your role in all of this? Which KPIs were you paying attention to?
- So my big KPI for tracking this was: were people getting through the entire process? What we were doing was taking the timeframe from the first upload of their information to the final printout of the material sheets that the software did. So think of it like the time from you opening your refrigerator to writing down the grocery list for your trip to the store. That timeframe is what we were looking at.
- And expectation-wise, for a medium-scale project, we were expecting that to take about two hours. What we were seeing was that it was taking about four to six hours, because people were having to do a lot of tweaking along the way, which in some regards is expected. But the fact that it was taking three times longer than planned was a bit weird. So we wanted to go back and track every little thing along the way to figure out where exactly people were having problems, or what we anticipated to be problems. They didn't complain about it, they liked it.
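As an aside, the completion-time metric Sarah describes (first upload of information to final printout of the material sheets) could be sketched roughly like this. The event names, shapes, and function are hypothetical; the episode shows no code:

```typescript
// Hypothetical sketch of the metric: hours from a user's first "upload"
// event to their last "printout" event, computed from an event log.
interface UsageEvent {
  userId: string;
  type: string; // e.g. "upload" or "printout" (assumed event names)
  at: number;   // ms timestamp
}

function completionHours(events: UsageEvent[], userId: string): number | null {
  const mine = events.filter((e) => e.userId === userId);
  // Earliest upload for this user, or null if none exists.
  const firstUpload = mine
    .filter((e) => e.type === "upload")
    .reduce<number | null>((m, e) => (m === null || e.at < m ? e.at : m), null);
  // Latest printout for this user, or null if none exists.
  const lastPrintout = mine
    .filter((e) => e.type === "printout")
    .reduce<number | null>((m, e) => (m === null || e.at > m ? e.at : m), null);
  if (firstUpload === null || lastPrintout === null) return null;
  return (lastPrintout - firstUpload) / 3_600_000; // ms to hours
}
```

By this measure, the team's target for a medium-scale project was about 2, and the observed value was about 4 to 6.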
- Huh, well, so yeah, what did you find?
- So what we found out is that the reason it was taking longer than we were expecting was that some of our interface elements were a little difficult; they weren't as intuitive as planned. We had basically just copied the original UX from the existing software to bring it onto the web. So we started cleaning that up, and the result was that we got down to that two-hour expectation within the first 60 days of updating the UI. But the really interesting thing about this project is what happened after we got that time down. When people were at the end of their sweep of the blueprints, tweaking the requirements for what materials the process needs to produce, and got to the point where they were ready to hit the submit button to print out their materials list, what we were seeing was people hitting that submit button and our software pretty much instantly printing off a material list. And instead of people immediately downloading it and coming out of the software, they were backing up in the process, going back a few steps, and double-checking things. We could see them moving their mouse line by line through all their settings and things like that, and then resubmitting it, thinking stuff wasn't quite right. We saw that behavior in about 45% of the user base that was letting us do the session recordings. So we reached out to them and asked what was happening. And from the roughly 25 people we talked to one on one about the software, what they were saying was: we don't trust it. It's happening really fast. We think we made a mistake or something's not calculating right, so we're going back and triple-checking, and then we're sitting here without downloading everything, just to make sure that the printout list is actually accurate to everything we put in.
And that was astounding to us, because we hadn't changed anything about what calculates the material list. That particular calculation, the final printout, had been exactly the same for the last 13 months. Nothing had changed except the UX and the quickness with which they hit that button and the material list prints out. And these companies we were working with had been using the software for years at this point. So we were wondering why, all of a sudden, they weren't trusting the output. That was something really weird. We weren't expecting that the speed at which we were presenting the results would actually impact it.
- So people are spending a couple hours to input all kinds of information about the intended build that they’re trying to get plans and material lists for, and you managed to get them even more swiftly to the point where they’ve completed all that process and they’re ready to press the button. This is the moment when they obtain the value from the app and it was happening too fast.
- Yeah, it was happening too fast.
- So they weren’t going back and catching mistakes. They weren’t going back and changing the inputs because they had gotten something wrong. They were just going back and checking their work?
- Yeah, they were going back and checking and making sure they had actually put everything in, because they didn't understand why it had gone so fast. They felt they had missed some major step or something. We weren't seeing them change inputs; it wasn't that they needed to. It was like they were going back through and reviewing everything just to make sure that what they got at the end was accurate.
- That’s wild. So what did you do about this?
- Well, for lack of a better way to put it, we put a fake loading screen in.
- Okay, fake loading screen. Tell me more.
- So we tried a few different things. We tried giving them a review checklist on the page. After they hit the button to submit, it asked: do you want to review this? And while they were reviewing it, we were already doing the calculations and had the next screen ready for them. Most people didn't really like that. So the other option we tested at the same time was quite literally putting an animated GIF on the page, for lack of a better way to put it, that looked like a beach-ball loading icon and said: we're processing your information, we'll give you a materials list in a couple minutes. And we tried out a range of times, from 15 seconds to about two minutes, to see where the sweet spot was. And we ended up at the 35 to 38 second mark. Yes, we got (frequency drowns out speaker) 35 to 38 seconds of loading screen, with little tips we would switch out every now and then. Think of a video game giving you tutorial tips or something. And then we would print out their material list. It was a total placebo. There was no thought process, our software didn't take that long, none of it. But that 45% of people who were going back through to recheck their list dropped to like 5% when we rolled it out to everybody.
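The mechanism Sarah describes, revealing an instantly-computed result only after an artificial minimum delay while rotating tutorial tips, could be sketched like this. This is a hypothetical reconstruction, not the team's actual code; every name here is invented:

```typescript
// Minimal sketch of a placebo loading screen: the real computation finishes
// immediately, but the result is held back until a minimum "thinking" time
// has elapsed, while tips rotate on screen.
async function withArtificialDelay<T>(
  compute: () => T,
  minDelayMs: number,
  tips: string[],
  onTip: (tip: string) => void,
  tipIntervalMs: number = 5000,
): Promise<T> {
  const started = Date.now();
  const result = compute(); // the real work is effectively instant
  let i = 0;
  onTip(tips[i % tips.length]); // show a first tip right away
  const timer = setInterval(() => {
    i += 1;
    onTip(tips[i % tips.length]); // rotate tips while "processing"
  }, tipIntervalMs);
  const remaining = minDelayMs - (Date.now() - started);
  if (remaining > 0) {
    await new Promise((resolve) => setTimeout(resolve, remaining));
  }
  clearInterval(timer);
  return result; // reveal the long-ready result
}
```

In the experiment, the sweet spot for `minDelayMs` turned out to be roughly 35,000 to 38,000 ms, found by testing a range from 15 seconds to about two minutes.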
- Wow. So beach ball and a loading screen and tips for 35 to 38 seconds. This was the magic number that allowed people to relax and trust that the software was really working?
- Yes, it was amazing to us. It completely made us rethink how we were approaching this. And we talked to a couple people during the process. Our goal with the software was to save time at construction companies: let them quote more, get more accurate with the materials, plan better overall. What we didn't anticipate was that the people using the software had been doing this for 10, 15, 20, 30 years. They had it down to a T, the calculations, how they did all this stuff. To them it was time-consuming and manual, and they were experts at it. So when our software came in and did it at the press of a button, they didn't trust it.
- I wonder if it felt like a threat or if it hurt somebody’s feelings to have something that you think of as being so much work happens so quickly.
- It's entirely possible. It's not like we heard construction folks say that in particular. But I think it was part of it, for sure. They wanted to know that the computer was having to work as well.
- Wild. This is an amazing result and an amazing solution. Can you go back again to the numbers? I think you said something like 40% of people were doing this backtracking, don't-quite-trust-the-machine movement, was that right? What was the result you landed on?
- We had about 40 to 45% of users going back after they got their material lists and going back and checking all their inputs because they thought they’d done something wrong or that they missed a step or something because it was too fast. Once we put in the loading screen with the tips and the 35 to 38 second wait, that number dropped from 40 to 45% of the users to less than 5%.
- This is an experiment that happened a while back, and you've done a lot of conversion optimization work since then. I wonder if this experiment has in any way changed your approach, or how you think about a new client, or optimization in general. Anything like that?
- It has. I've actually taken a step back for a couple of similar setups, like sauna processes, insurance claim submissions, things like that, and executed a very similar process where we were measuring all the same kinds of metrics: time from start to end, whether they go back, those kinds of things. And this has been a pretty consistent result. It's that people want to imagine that the computer they're interacting with, the software they're interacting with, has to think about it. They don't want to know that it's a checklist of yeses and nos and a real easy, quick evaluation, especially when submitting insurance. They want to feel like it's being considered.
- You're just blowing my mind. (laughing) I don't know what else to say. I'd kind of thought this was just a standalone, one-time, crazy-experiment-yielded-an-unexpected-result kind of story.
- Oh yeah, I thought it would be too. I thought I'd never run into the same thing again. But we come up against situations, not exactly the same, not exactly where people were going back and reviewing it, but where people were, say, submitting an insurance claim and then instantly calling because it wasn't the result they expected, or they didn't trust that the software had actually given them an accurate result. So we tried something similar to the wait screen, the loading screen, and it worked. And you'll see the same kind of approach used in other places that I haven't touched, and that no one I'm aware of from this team has touched. Out in the wild, TurboTax is a great example of this process. If you go through and submit all your tax information, there are consistent loading screens between each section. Not because the software is having to work hard, but because people want to feel like it's actually taking the time to think about it and double-check everything. The submit button and the processes in the back end can do it pretty much instantaneously.
- Of course, I never thought about that. I know exactly what you’re talking about. And I have seen that and I guess I assumed. They got me. I thought, wow, this must be some really complex calculation they’re doing. But now that you say it (laughing),
- They got you.
- Surely not. (laughing) It's math, it's just arithmetic. Wow, okay. Well, so this is an idea I never once thought about, that there might be an actual optimal lag for at least certain key steps of a process. But it sounds like you've seen this more than once.
- Yes, definitely seen it more than once, and there's definitely no set timeframe I can give you for what people expect. It does seem that the longer and more involved the process, the longer people expect the software or the web app or whatever to spend calculating stuff. But aside from that, there's no hard and fast rule for it.
- Yeah, well, of course not I guess. I would expect too maybe by user, it depends on the user’s sophistication, the nature of the information it’s processing?
- Oh yeah, for sure. What I'd love to do one of these days, next time I get a chance to get hands-on with something like this, would be to test and see if we change the timeframe for that load screen depending on how fast people made it through the process.
- I haven’t done that.
- Okay, so personalized, optimized, phony loading time based on behavioral triggers from the user.
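Sarah's untested idea, scaling the placebo delay to how long the user invested in the process, might look something like the heuristic below. This is purely a speculative sketch; the scaling factor and the function itself are invented, and only the 15-second and two-minute bounds come from the range the team actually tested:

```typescript
// Hypothetical heuristic: scale the fake "processing" time with the time the
// user spent entering data, clamped to the range tested in the episode.
function delayForSession(sessionMinutes: number): number {
  const MIN_MS = 15_000;  // shortest delay tested (15 seconds)
  const MAX_MS = 120_000; // longest delay tested (about two minutes)
  // Assumed scaling: ~0.5% of the time the user invested.
  const scaled = sessionMinutes * 60_000 * 0.005;
  return Math.min(MAX_MS, Math.max(MIN_MS, scaled));
}
```

With this particular (assumed) factor, a two-hour session would get a 36-second delay, which happens to land inside the 35 to 38 second sweet spot the team found.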
- You heard it here first, amazing. Something I wish I'd asked more about a few minutes ago: of all the different approaches you took, all the different techniques, session recordings and heat maps and user interviews, it sounds like, of all the research methods you applied in finally figuring out this issue, did anything stand out? Was there anything you think was the key to figuring out this very unintuitive problem?
- Talking to the users. Because data is data; it's cold, hard facts. It doesn't take the human element into account, and when you're looking at data, especially when you're sitting there looking at software every day trying to optimize it, you put your own spin on it: what does this mean to me? You don't necessarily know what frustrations the users are having. That's why we could never have anticipated that people thought the software was too fast, and we'd never have gotten there if we hadn't talked to them and identified that a large chunk of them were going backwards. And then, once we identified that, going and asking them why.
- I love this story because you did of course apply data to optimize the onboarding flow in the first place, and you used data to uncover this behavior that seemed counterintuitive. But then you reached a point where talking to humans was the key to figuring out what it was all about and what you could do about it.
- You can’t really do optimization without both quantitative and qualitative data. There are just some things you’re gonna have to break down and talk to another person, figure it out.
- For sure, there’s some things that you just can’t imagine. Or at least I can’t.
- If we could all figure it out, we would be out of the job.
- Right? Okay, well, this is great. Thank you so much for your time, Sarah. This is quite educational. I never thought that I would learn anything of the sort. And now I realize that I’ve actually experienced this myself, the TurboTax example, perfect example. So thank you so much for coming on. If people wanna find you online, where should they go?
- They can find me on LinkedIn. Look me up as Sarah W. Johnson in the Atlanta area.
- Perfect, well, thank you so much.
- Thanks Brian.