The Marketer’s Guide to Customer Experience explores how customer perceptions about brands inform marketing strategies — if you haven’t taken a spin through the guide, check it out here.

One of our CX experts featured in The Guide is Ty Magnin, Director of Marketing at Appcues. Appcues is a tool that helps improve onboarding and drive feature adoption by removing friction and flattening user learning curves.

Below is our “behind the scenes” interview with Ty in which he shares how customer experience shapes his team’s marketing — and why collecting more quantitative data isn’t always the right answer.

Thanks for taking the time to chat, Ty. So, how do you think about the connection between what your customer is experiencing, and Appcues’ marketing?

Ty Magnin, Director of Marketing at Appcues

Ty Magnin: I think about alignment a lot.

When you’re thinking about building out a marketing campaign, or any piece of marketing, it has to align very closely to something your customers are doing in your product. Whether that be a Facebook ad, an email drip or anything else, it has to focus on the value that they’re getting from your product and what they’re doing with it.

We talk about being customer-centric a lot now — SaaS companies, that is. I don’t think it’s always been that way and it’s actually still something that’s hard to do, but the idea of having all your data in one place for the team to use as a representation of the product is really important to achieving that mission.

So when your team is figuring out, “What next project do we want to take on?” how do you tie customer experience into that? Where does that play into the discussion?

Ty: The way I think about any marketing activity is I’m either doubling down on something, or I’m trying something new and filling in a gap, if you will.

In both cases the decision to double down or seize an opportunity comes from looking at some data initially. Maybe I’m pulling monthly data on, say, first touches to revenue. Where’s our revenue coming from through the lens of first touch or lead source? If I see that a certain webinar or a certain channel — broadly speaking, something like organic traffic — is driving a lot of revenue, I’m going to think about doubling down there. I think revenue is a decent representation of customer experience, right? If customers are paying you money after going through that experience, that’s a good sign.

However, it’s not perfect. Money isn’t a representation of happiness — of product-market fit, necessarily. And it’s hard to have revenue data represent churn, too.

So some of it does come from a gut feeling or quick decision. Maybe the sales team is telling you that this certain use case is heating up, or they’re sharing some stories about how company XYZ is trying to solve problem A, and you hear that enough times to finally think, “We don’t have enough collateral on that,” or, “We should tell that story through a blog post.” These anecdotes signal, “Hey, there’s a gap here, there’s an opportunity.”

Are you able to think of any examples of something you’ve doubled down on because you realized, “Oh, this has led to a really positive result, so let’s look at what people have been doing and do more of that?”

Ty: One thing that we see all the time in the wild — so not with Appcues customers necessarily — is companies using a certain pattern to introduce new features. Often that pattern is a single tooltip. If you go onto Facebook or Twitter, you might notice a little tooltip pointing out something new, and we really thought that was super effective for us as users.

So we’ve been writing about that and promoting that use-case, and customers have been picking it up and imitating it.

Another one might be a case study we wrote on AdRoll and how they used modal windows, basically pop-ups, to advertise their integrations. We’ve promoted that case study to our customers. We’re constantly telling them to use just one pop-up, one single tooltip.

It seems like they’re following it and having success; I wish I had data as to how our whole customer base has done this thing and how it’s helped them get more users to activate their individual features or integrations.

As much as we want to be data-driven, it’s really hard to get all the data you need. So what data do you use to assess or get signal on the ways customers are experiencing your marketing?

Ty: It’s mostly quantitative. When you talk about data, I default to quantitative.

(Although there is some qualitative stuff and I think that’s a very interesting conversation to have.)

Just talking about the quantitative: I spend a lot of time in HubSpot at the end of the week or the end of the month pulling data. They make first touch really easy to measure and revenue easy to track. I’m also looking at email metrics — opens and clicks — to see how certain messages or certain campaigns are resonating with a group of users.

I also look at it on a campaign basis or a single-email basis. I’m looking at how certain emails have resonated with someone, or how certain Facebook ads have resonated. We have a few disparate solutions, but most of the time it’s going through our CRM, which is also our marketing automation tool: HubSpot.

You mentioned resonance. How do you determine whether something has resonated? Is it just the open rate? Are you looking at click-through rates? What matters most?

Ty: The bottom line is engagement.

I think that more and more, product managers and marketers are starting to think about engagement as the key metric to measure happiness or a successful customer experience.

That’s hard to do with an email. You can measure opens and clicks, and kind of get a gauge of it. But knowing if it truly resonated? I don’t really know. I just know that a lot of people looked at it, clicked on it.

Then maybe I can look at time on page for a website. Time on page is definitely a good metric for that. Bounce rate, too, we look at those kinds of things for our blog, for our homepage. But it’s really hard to track how engaged a person is in our product, all the way back to, say, the email that made them sign up for it.

Attribution is hard. I don’t know if having all that data would make me crazy, or if it would help me be that much more effective. Probably a little bit of both.

What would you say are the biggest obstacles to really understanding that customer journey? Or understanding attribution?

Ty: I imagine some of it has to do with tooling. Like I said earlier, HubSpot makes first-touch attribution pretty easy to do, but last touch is actually really hard. There are just different ways of doing it, and it would be a full-time job to keep up with that stuff. I don’t think there’s a single tool that’s plug-and-play.

It’s a time versus value thing, too. It’s not going to give me that much value to spend weeks and weeks building up attribution reports. I have so much data now and get more and more insight as we go. I think it’s just a matter of scaling with your team and your needs.

On the note of scaling with your team, how big is your marketing team? And when you’re working on a project, how often are you working alongside the customer success team, or the sales team?

Ty: Marketing [at Appcues] is a team of two — with a lot of freelance help. How we work with customer success or sales is project-by-project. For example, a salesperson might suggest a blog post, or I might have a salesperson help me out with an email. Or with customer success, maybe we have a new feature coming out and we’ll work with them to craft the announcement.

Can you think of an example of a marketing project that had a particularly positive impact on customer experience?

Ty: We use Amplitude, which can be really helpful for me to measure profile data. So, for example, how often does a marketer buy our product versus an engineer?

That definitely helps me. I ran that exact report two months ago, and it helped me rethink our welcome emails because I realized how valuable signing up engineers for our product was. Engineers have the ability to install this thing and be really successful with Appcues.

That led me to target our welcome email that goes to engineers, and so far, the new email has been opened at a really high clip.

So on the opposite end, can you think of any examples of a marketing project or activity you decided not to do, or maybe to change, because of the potential impact on your customers’ experience?

Ty: Yeah, I think there have been a few. When we first launched Appcues, we weren’t sure what our most valuable use-case was going to be. People used us for user onboarding, people used us for feature adoption, people used us for in-app feedback forms, and those are all sort of different. At that time, we led our marketing with like, “You can do all these things,” you know, “Do it all in-app with Appcues.”

When you hit something with a blunt instrument, you’re just not going to sink that deep. So we wanted to make that message more focused, and we really started to lean into activation and product adoption.

We’ve changed our home page a million times, as everyone has, but at one point we had all these pages for different use-cases — user retention, and user engagement, and feature adoption.

Appcues homepage focuses on user onboarding.

Based on what our customers are really doing with our product, we’ve realized that user onboarding is really the story we want to tell. We’ve continued to move in that direction, and someday we might come back with a new story focused on one of the other things like feedback forms, but now that we’ve focused, we have a much more tangible solution to offer people today.

That’s really interesting. How did you come to that realization? How did you figure out that’s what you wanted to drill down on?

Ty: The problem was apparent through feedback — qualitative feedback — that people just weren’t sure what we do. That makes it hard to get them to sign up for the product.

After hearing that over and over, we knew we had to focus the product message. A lot of that was feedback from the website, and some of it over the phone, in support — and from our moms when they went to our website!

One thing I did last week is put chat on our website using Drift. I just wanted to hear from more people about what was missing. The prompt I used in chat was, “What do you need to know?”

And I got a few tips from visitors, which was helpful. I had to use FullStory a number of times to see what people were talking about.

Okay, so you’re using it to track activity on the marketing site?

Ty: Yes — I mean, the product team is using it all the time, and so is support and so is customer success — and I’m looking at it when I put a new page out to see if people can navigate it appropriately. Or see where they stop — that kind of thing. We definitely use it a ton to look at our onboarding flow.

We typically go from prototyping and user testing to full-blown user testing and then launch with FullStory. I’ve found that to be an effective way to “user test” the marketing site and get some of that qualitative feedback.

In watching the onboarding flow, can you think of any examples of somewhere you noticed, “Wow, everyone rage clicks here,” or any other major change you made?

Ty: Yeah. After you fill out our signup form, it takes a minute for the app to load, because there’s a ton of stuff that has to happen on the backend. So originally, after that button click, the button text would change from something like “signup” to something like, “Warming up our engines.” It was subtle and didn’t really explain what was happening, so no one knew what that meant and a lot of people would bounce there.

After seeing that in FullStory, I actually went into the app and coded up a change so the button text was something like, “Hang on a sec,” or “We’re building your account.” Instead of trying to be too cute, we switched the text to explain exactly what was going on.

So you looked at FullStory to see people were bouncing, and we’ve talked about some of the other tools you’re using, too. But if you could change something about how you understand your customer experience, or how you manage it, what would you change? What would you ask for?

Ty: It’s definitely on the qualitative side. Which is good for FullStory.

There’s so much that’s already measured today. But as product owners, knowing where someone’s motivation level really drops, where they got confused, or what didn’t resonate is still really tough — knowing what’s going on inside the emotional side of the brain.

Getting that would be awesome. It’s hard to get that feedback out of people because sometimes you (the user) don’t quite know what made you bounce. And as a product owner it’s hard to know when to ask that question and how to ask that question without getting in the way of your user.

If I had a magic wand, I would use it to get tons of feedback about where people are frustrated, and what they’re excited about and use that data to make the app even stickier and all-around more awesome.

In relation to marketing and your customers’ experience — or your leads’ experience — can you think of any other stories of how the two play together? Or where you use the data you have to make a decision that’s meant to help the customer?

Ty: You know, I’m trying to think of tangible input/output stories, but so much of our opinions about our customers and what they need and what works happens slowly over time. Our product team has to hear something 10 times before they realize, “This is a thing.” It’s just because there’s so much feedback coming through support and coming through analytics and so on and so forth. You can’t just jump at one opinion, right?

Often I’m hearing things 10 times before I make a change to marketing. I’m hearing things 10 times, probably from different people and at different points throughout the day, the month, the week, the year. A lot of these things are like undercurrents. A lot of the quantitative feedback and qualitative feedback — the outputs that help align you to your customers — happen slowly over time.

Like figuring out what channels to go through. You test out a channel a little bit and then you see it get a little better, and then you do a little bit more there. And that’s how things evolve.

So it requires sitting and waiting and watching to see what happens for a while before making changes.

Ty: Yeah, and we’re in the world of deploying 10 times a day and minimum viable products, right? So the change is going to be a tiny little experiment, and then you’re going to say, “Okay, it worked. I’m going to double down a little more. I’m going to double down a little more.”

If you think about the engineering welcome email, that’s just, “Okay I saw this pattern, I noticed it.” That’s one test. I can go way further — and I probably will over time.

As you do that, do you see the onboarding flow changing in a way that makes it more targeted to the different types of users? When you say “go further with that,” what do you think that will look like?

Ty: That’s a great question. I think that it means we’ll be onboarding them uniquely. We’d nurture them along their trial differently based on who they are and what they need — not just that first email but the next five as well.

I’d start marketing to them differently. I’d be trying to figure out how to get those engineers that are interested in solving for onboarding into a very particular drip. Give them a very particular content upgrade. Nurture them differently to make sure that they are getting the info that’s most valuable to them.

In theory, it all sounds easy, but it all takes work.

Ty: Yep, definitely. You know, this stuff is hard to talk about, and I think it’s because marketers know customer experience is important, but it’s harder to define than, say, lead generation. I could talk about lead gen all day and articulate it well, but thinking about how we define and manage customer experience as marketers is challenging in a different way.

So what you’re saying is that while “lead gen” is a part of the marketer’s daily vocabulary, “customer experience” is still very new.

Ty: I think it’s because it’s hard, but it’s valuable to think about. This conversation is pushing me to think, “Am I really customer-centric? Am I really thinking about how my marketing aligns all the way down the funnel?”

This is a good exercise. I think it’s important work.

Agreed. And I think this is a good place to wrap up — it was so great to chat with you.

Ty: You, too.

— May 2017


We thoroughly enjoyed hearing from Ty about how he is thinking about customer experience and marketing for Appcues. As Ty ended the conversation, we couldn’t help but agree: for marketers, talking about customer experience is hard but important work.

Thank you Ty (and Appcues) for sharing with us!

If you liked this interview, be sure to check out The Marketer’s Guide to Customer Experience!