Digital marketing promised to make our marketing measurable with clear lines of connection between the dollars spent and the results produced. Yet, as any data-driven marketer will tell you, tracking and attributing marketing activities in the real world is far from straightforward.
Today’s guest argues that it is actually possible to get analytics right and that moving past “true but useless” metrics is essential for building marketing’s credibility within an organization. His name is Chris Sietsema. Chris is a 20-year agency veteran and analytics expert who’s advised some of the most recognized names in both industry and higher education.
Why Is Measuring Marketing So Hard, and How Do We Get It Right?
According to Chris, when people think about the challenges of marketing analytics, they tend to list things like budget, technical difficulties, and the inherent limitations of the available tools. While these challenges are real, Chris argues there are deeper issues that are often never discussed. Among these lesser-discussed challenges are a lack of well-defined goals, inconsistent processes and vocabulary around measurement, and marketing workflows that turn measurement into an afterthought.
In contrast, analytics-savvy organizations do three things exceptionally well. First, they place a strong emphasis on process and organization, which tends to clarify thinking and align the team’s analytics vocabulary. Second, they work hard to avoid “true but useless” scenarios by identifying meaningful metrics that will actually inform decision-making. Third, they carefully craft reports to align with the unique needs of the different audiences that will be reading them.
Building Your Team’s Analytics Capabilities
In Chris’s view, many organizations already have all the analytics talent they need to be effective. However, when managers are evaluating which team members to grow into analytics roles, they’re often faced with a common question: is it better to train a marketer how to do data science or train a data scientist how to do marketing? Chris suggests both are viable options if the person in question has the right characteristics. First, they must be curious and sufficiently motivated to learn the other side. Second, they must be stubborn problem solvers, almost to a fault. Third, they need to understand how their activities contribute to the organization’s bigger picture. “You don’t need a nerd with a calculator,” he explains, “You need a nerd that can eloquently articulate the core reasons your organization exists in the first place.”
Chris wraps up by describing a straightforward eight-step process to make sure your next marketing initiative is set up for measurement success from the start.
Subscribe to the Higher Ed Marketing Lab
Transcript
Jarrett Smith:
You’re listening to the Higher Ed Marketing Lab. I’m your host, Jarrett Smith.
Welcome to the Higher Ed Marketing Lab. I’m Jarrett Smith. Each episode, it’s my job to engage with some of the brightest minds in higher ed and the broader world of marketing to bring you actionable insights that you can use to level up your school’s marketing and enrollment performance. In this episode, we’ll be talking about marketing analytics. If you’re like many marketers out there, you’re sold on the value of data-driven marketing, but actually doing it in the real world is easier said than done.
Today’s guest argues that getting analytics right is not only possible, it’s essential if you want to gain credibility and prove the value of marketing. His name is Chris Sietsema, and he’s a 20-year agency veteran and analytics expert who’s advised some of the most recognized names in both industry and higher education. Chris starts by outlining some of the sneaky reasons measurement is hard, and then he explains the three things analytics-savvy organizations consistently get right.
Chris then talks about how to cultivate analytics talent in your organization, and he gives his take on the age-old question of whether it’s better to teach a marketer to do analytics or teach a data analyst to do marketing. He wraps up by describing a straightforward eight-step process to make sure your next marketing initiative is set up for measurement success right from the start. Chris is a deep thinker on this topic, and he provides a ton of useful insight for anyone looking to do analytics better. Without further ado, here’s my conversation with Chris Sietsema.
Chris, welcome to the show.
Chris Sietsema:
Thank you for having me.
Jarrett Smith:
Absolutely. I’m really excited to dive into this analytics conversation. Before we get there, could you just give us a quick snapshot of your background and the work you’re doing today at Teach to Fish Digital?
Chris Sietsema:
For sure. It’s been about 20 years now that I’ve been in the digital marketing space; it’s pretty much all I’ve done since graduating school. I started off by doing a little bit of everything: a little bit of search marketing, digital advertising, a ton of email marketing. When social media came around, that was definitely added to the repertoire. Over the past three years, I’ve really focused on analytics because, while I love all those disciplines, it’s like picking amongst your children, who do you love more? Analytics is basically the layer over everything.
That’s really allowed me to still be involved in every aspect of digital marketing, but really help my clients focus on what all these numbers mean and what we can do with them and what we do next. I do a lot of audits, a lot of configuration projects, some training, stuff like that.
Jarrett Smith:
Very cool. Yeah, I mean, I feel like, to piggyback on what you just said, analytics is kind of where the rubber meets the road. You’re trying to distill it down into something useful and say, okay, now what? How do we act on this? Where do we go? I know we’re going to spend a lot of time talking about all of that, but I want to dive headlong into sort of the core challenge, which I think will be familiar to anybody who’s tried to measure their marketing efforts. Digital marketing kind of carries with it the promise that everything can be measured, but the reality is that implementing it in the real world is actually pretty difficult, and not everything is as measurable as you would like it to be. It’s just not as straightforward. Could you talk to us for a minute about what makes measurement so challenging? Why is this so difficult to get right?
Chris Sietsema:
Well, first of all, I would say, for anybody who has had that conversation about measurement being challenging, if the response is, what are you talking about? It’s easy. I don’t know what you mean. That person’s off their rocker. It should be hard. It’s supposed to be difficult. I think one of the reasons that’s kind of a tease is that, when you compare digital marketing specifically to other more traditional marketing, whether it be broadcast, or print, or direct mail, you’re right, it does kind of have that promise of being so much easier to measure, but in practice, it’s not.
One of the reasons I think is that not every marketing program is uniform. We’re all a little bit different. We have different objectives. We’re working in different environments. Because of that unique quality of our own marketing program, the measurement program has to then reflect that as well, and thus the measurement program has to be very unique and specific to whatever it is that we’re trying to accomplish with our marketing goals.
When you look at the analytics space, are there standards? Kind of. There are tools that everybody utilizes that are pretty consistent, but the manner in which we utilize those tools is completely frayed and totally different from one client to the next or one marketing program to the next. Part of the reason, too, I think, is that when we think about analytics, we think about a report, like we think about the finish line, or the final state. Where really, if we’re doing it correctly, we have to start way at the front and talk about, what is it that we’re trying to accomplish? What are the true goals that we’re trying to measure? What’s our business objective?
If those are undefined or poorly defined or not shared amongst the entire group of people that are working on a singular project, we’re in trouble from the start, and so is the measurement program.
Jarrett Smith:
Yeah. It brings to mind the idea that doing analytics right involves a lot of things that don’t look like a report, if I can say it that way. There are a lot of activities that don’t look like a nice report but that hopefully result in one, and there’s a lot of work to do on the back end.
Chris Sietsema:
Yeah. There’s a lot of background work for sure. There’s a lot of setting up of dominoes. In terms of the whole, I thought this was going to be easier, why is this so challenging? I’ve had that conversation with a lot of people. How I’d equate it is, I’m sure when Orville and Wilbur Wright made their maiden flight at Kitty Hawk, it brought a lot of promise for things like transportation and supply chains and exploration and measuring weather and all the different implications that flight has. But when you look at where we came from then and where we are now in aviation, we’re not even into jet engines yet in terms of the comparison from an analytics perspective.
We’ve got such a long way to go, and it’s a challenging thing. Despite all the promise that digital marketing has for measurement, we still have to pilot the thing, we still have to navigate, we still have to have a strategy, we still have to have equipment to get us from here to there. There’s a lot of work we still have to do yet. But we’ve already come such a great distance, so that’s a good thing.
Jarrett Smith:
Yeah. That reminds me of another sort of challenge here, which is that the tools we’re using are changing. I know we’ve talked a little bit, prior to this conversation, about how Google Analytics is getting a major overhaul. Google is coming out with GA4. That’s going to kind of upend everybody’s boat, and it’s, of course, one of the most common measurement tools out there in digital marketing, so there’s a lot there. The other thing I think that’s tricky, that’s kind of higher ed … Maybe it’s not really higher ed specific, but institutions are large, complex organizations, and so there’s just the people involved, right?
Getting appropriate goals set up on your website: it may be enrollment management that’s really super tuned into conversions on your online application, but it may be marketing who actually owns those properties, and they may not always play well together. So even if you straighten out all the technical aspects, there’s still the human element that can make things really difficult.
Chris Sietsema:
Yeah, and with higher ed, which you and I both have a ton of experience in, for some of these institutions that are massive, like you said, there’s a lot of different hands in the pot. One thing that I would say is a challenge for a lot of folks is that enrollment does analytics a little bit different than marketing does analytics, especially now with all the online learning, the manner in which students are actually utilizing the applications to access courses and syllabi and assignments and all those different types of things, and whether those things are being utilized properly.
There’s a lot of different ways to look at it, but I think that as long as we are kind of utilizing a common language or a common dialect in the manner in which we are measuring, that’s really key. Sometimes from department to department that language or that dialect changes, which can cause challenges. Still, part of it is just really kind of speaking the same language from a measurement standpoint. Oftentimes, that language, unfortunately, does not exist when you start building the tools or building the processes, and you have to create it and adopt it and make sure that everyone can speak that language.
Jarrett Smith:
That’s a really interesting point. I think one of the core traps that I see people falling into is that any analytics package, whether you’re looking at what’s available on your social media platform or what’s available out of vanilla Google Analytics, or whatever tool you’re using, is going to serve up some things to you automatically. They may or may not be useful. You’ve talked to me about, say, something like time on site in Google Analytics being a problematic measure. Interesting to look at, but there’s a lot of nuance there, and it’s not as straightforward as it seems. I guess my question to you is, where do you see the danger in focusing only on the things that we can easily measure?
Chris Sietsema:
It certainly limits us. The things that we can easily measure aren’t necessarily indicative of success and they don’t really provide a good representation for whether we reached our business goal or not. Some things like brand awareness, like how do you use Google Analytics to measure that? I don’t know that you can, quite frankly. When you have a specific goal that you’re trying to measure that the standard tools do not provide an easy answer for, you’ve got a few approaches. One is just measure what you can, which is not my preferred method, which is basically giving up. Let’s just deal with what we can, right?
The other method is to go to great lengths to configure measurement tools or processes that truly answer the most important questions. For example, to use that brand awareness example, maybe we don’t use a digital analytics tool set. Maybe we actually do a study with real people, like a survey, right? Like an unaided brand awareness type study. The challenge there is that those are time-consuming and expensive. A third approach, as opposed to just dealing with what the tools we have give us, or going above and beyond and finding the metric that we’re looking for no matter what, no matter how expensive it is and how time-consuming it is to get that answer, is to use more attainable metrics as a proxy.
For example, to stick with brand awareness, perhaps we look at branded search volume, right? How often is our brand searched for on Google and other search engines today versus six months ago or a year ago or 18 months ago, those types of things, and that’s just one example, right? From an analytics perspective, oftentimes we’re thought of as the pocket protector kids and the nerds and very left-brain thinkers, and that’s totally fine. That’s what we are. But at the same time, analytics requires a great deal of creativity in terms of how to get the answers that we’re looking for to some of the tough questions that are being posed by leadership, by ourselves, by marketing managers, marketing directors, etc.
So, it does take a bit of creativity to think through like how we can find a real good answer that provides at least a clue as to how we’re performing in that respect.
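To make the proxy idea concrete, here’s a minimal sketch in Python, assuming a hypothetical CSV export of search query data (for example, from Google Search Console). The file name, column names, and brand terms are all illustrative assumptions, not a format Chris prescribes:

```python
# Minimal sketch: branded search volume as a proxy for brand awareness.
# Assumes a hypothetical CSV export with columns: date, query, impressions.
import pandas as pd

BRAND_TERMS = {"acme university", "acme u"}  # illustrative brand queries

df = pd.read_csv("search_queries.csv", parse_dates=["date"])
branded = df[df["query"].str.lower().isin(BRAND_TERMS)]

# Roll daily impressions up into monthly totals.
monthly = branded.groupby(pd.Grouper(key="date", freq="MS"))["impressions"].sum()

# Compare the latest month against 6, 12, and 18 months ago.
latest = monthly.iloc[-1]
for months_back in (6, 12, 18):
    if len(monthly) > months_back:
        past = monthly.iloc[-1 - months_back]
        change = (latest - past) / past * 100
        print(f"Branded search vs. {months_back} months ago: {change:+.1f}%")
```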
Jarrett Smith:
To kind of serve as a reasonable proxy.
Chris Sietsema:
Totally. Yeah. That can be difficult because sometimes, yes, that’s a proxy, but is it really a good indicator? Does it provide a factual representation of the question that we’re trying to answer? The most often discussed issues with analytics are that it’s expensive, it’s difficult to configure, or, just like you said, the data is simply not available. It’s not easily attainable. However, I think the least discussed problem, which is more symptomatic of the actual core issue, is that measurement is often an afterthought.
Sometimes I believe, at least in my experience, I don’t know if you’ve found this as well with some of your clients, not to name names, there’s a need to work backwards. What I mean by that is from a marketing and advertising perspective, we tend to think in a very linear fashion, like, all right, see audience, create ad, choose media for ad, deliver ad. Then at the very end, oh yeah, we should probably measure this too.
Let’s figure out how we’re going to prove that we were successful. When in reality, it’s often necessary to identify the ideal scenario in which we can prove success: almost start with the number, or the metric that we’re trying to attain, or the answer that we’re trying to get at, and work backwards a little bit from that. Now, should that sway your marketing strategy or your approach to how you put a message in front of your audience? No, but if you’re thinking about it from the outset, it can solve a lot of problems later on that often manifest as, oh, shoot, we should probably create a report for this, two months into the campaign.
Jarrett Smith:
Yeah. Well, that allows you to really have that conversation around what metrics are truly going to be useful: how would this guide our decision-making if we knew this and were able to measure it? You could even ask yourself in advance, okay, if we’re underperforming on this particular metric once we’ve launched, what does that suggest? I mean, you can start to pressure test your thinking a little bit. Would we actually find that useful? Might we change our plans or activities? On the flip side, if we’re doing really great, what does that suggest? Might that also guide our efforts once we actually get into full swing on this campaign, or whatever it is that we’re doing?
Chris Sietsema:
Exactly. I think once you’ve got everything configured, you’ve got a dialect and a language that you’re relying upon: this metric is going to tell us that, and we measure this way because it’s going to provide an answer to this important question. It’s a never-ending process, right? A lot of what my work involves, once everything is configured from a marketing standpoint, is that a lot of times my clients will have me just hang out. Help us read the tea leaves a little bit here. Help us understand what these numbers mean, because as long as a campaign is live, as long as a website is live, as long as a social media account is active, so is analytics, so is measurement.
It’s a never-ending process. It’s a never-ending struggle to achieve better results than we did last time. I don’t know about you, but I often get asked the question, what’s a good benchmark for X? What’s a good click rate, or what’s a good conversion rate? The answer is, well, what’s the click rate or the conversion rate now? Whatever’s better than that is the answer. Oftentimes, I think we rely heavily upon what the “industry” is doing. That doesn’t matter. Let’s just focus on us. The more time we spend on others, the less time we can focus on ourselves.
Even if you’ve achieved it, let’s say the industry benchmark for email open rate is 25% and we have a 30% open rate, does that mean we’re going to stop trying to get a 35% or 40% open rate? It never stops. You always want to improve, improve, improve, and that’s part of the fun.
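As a small worked example of benchmarking against yourself rather than the industry, here’s a sketch that compares the latest email’s open rate to your own trailing average instead of a published 25% figure. All campaign numbers are invented for the illustration:

```python
# Compare each new campaign to your own history, not an industry benchmark.
# All campaign figures below are made up for the example.
campaigns = [
    {"name": "Jan newsletter", "delivered": 10000, "opens": 2600},
    {"name": "Feb newsletter", "delivered": 9800, "opens": 2750},
    {"name": "Mar newsletter", "delivered": 10200, "opens": 3100},
]

rates = [c["opens"] / c["delivered"] for c in campaigns]
baseline = sum(rates[:-1]) / len(rates[:-1])  # your own trailing average
latest = rates[-1]

print(f"Own baseline open rate: {baseline:.1%}")
print(f"Latest campaign: {latest:.1%} "
      f"({'beat' if latest > baseline else 'missed'} your own benchmark)")
```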
Jarrett Smith:
That’s a really interesting point. Just thinking about it, if you manage to get this right and you’re able to really think through, in advance, what’s going to be the most useful information, that then really becomes a tool for building credibility within your organization for marketing specifically. I think one of the core challenges we have, and I think this is especially true in higher ed, is that you’ve got a lot of PhDs, in a lot of cases a PhD in a hard science, something that is super rigorous and empirical.
Marketing kind of comes in, and it’s hard to nail down. It has this creative aspect to it if it’s done right. It’s part art, part science, and so I think it opens the door for people to kind of write off marketing activities and discount what’s happening. Analytics can help build that credibility to say, no, the things we’re doing actually have impact in the real world, and these things that we’re doing matter and deserve to be funded and deserve to be expanded. What’s kind of been your experience with that?
Chris Sietsema:
My experience has been it’s a huge challenge, especially for … You work in higher ed quite a bit, and I do as well. When you have an audience, whether it be those PhDs or in a different industry, whether it’s the board or the CEO that definitely has chops and has earned her own credibility, it’s difficult to prove your worth from a marketing perspective, and I can totally get that. The metaphor I often think about when I’m talking about this topic is, do you know who Bryan Cranston is?
Jarrett Smith:
Yeah. Breaking Bad, best TV show ever recorded.
Chris Sietsema:
Exactly. My wife and I spent like six months a few years ago and watched the thing from start to finish, and it’s excellent. If you’ve never watched Breaking Bad, for your audience, he’s also the dad on Malcolm in the Middle, which just exemplifies the range that that guy has as an actor. He’s a terrific actor. Right?
Jarrett Smith:
Yeah.
Chris Sietsema:
We also started watching, not to take us on too big of a tangent, but we also started watching the show on Showtime called Your Honor, have you seen it or heard of it?
Jarrett Smith:
No.
Chris Sietsema:
Don’t bother. It’s trash. It’s the worst. It is so bad. It was such a letdown, because he’s a fantastic actor and he’s excellent at his craft, but it kind of speaks to the point that if the script isn’t there, if the writing isn’t up to snuff, there’s nothing an actor can do to salvage it. In a similar way, marketers can only elevate a fantastic product. We can’t turn a garbage product into gold. The reason that’s a problem, the situation being that marketers can only elevate a product and we kind of struggle when the product is weak, is that there’s a question that is posed, I think, oftentimes by leadership, and that is, what if we just did nothing? What if we did none of this marketing stuff or any of this advertising or spent none of this budget on promoting ourselves, would we have gotten the same result?
Because that question exists, and I don’t know if you’ve faced that question or at least heard it muttered subtly, that’s why analytics exists in some ways, in that we have to actually prove that, because of our efforts, we elevated this product, we improved the state of this company, this organization, this product, whatever it is, due to our efforts to promote and advertise and get the message out. Oftentimes, when we’re looking at measurement, we can run tests, or we can run flights, or we can turn our marketing off and then turn it on and see what the results are and what the differences are.
To build credibility, we really have to have analytics to show the change we’re making and the positive impact we’re bringing to this scenario. It’s invaluable from a marketing perspective. The best moments of my career are when a client of mine or a coworker says, “Hey, Chris, I’m taking this report that we worked on and I’m bringing it to my next review,” because they can actually prove, with metrics, that what they’re doing is elevating not only the product, but the organization and the overall company or the mission.
Jarrett Smith:
Well, I want to circle back to something you said earlier, which is how important it is to agree upfront on what’s going to be measured, and also the idea that we all need to agree on what these things mean and the language we’re using. If you’re the one actually gathering the data and performing the analysis, that means you’ve got to walk into your boss’s office and say, “Hey, I’m working with my marketing counterparts to conceive this new campaign. Here’s what we think we’re going to measure. Is this actually useful to you?” Is this actually useful to your boss? Are they going to say, yes, that’s great, or, no, actually that’s kind of nice to know, but not actually that useful?
Then you can have that conversation of, okay, well, what can we do with that? What’s within our grasp that might be better? Maybe it is that compromise of, hey, the best we can do in this scenario, because of budgets and timelines and internal capabilities, is to come up with a good proxy, but we can at least recognize that problem in advance and all agree that here’s the sensible compromise we’re going to make.
Chris Sietsema:
Yeah. That meeting or that discussion between the measurement person or the analytics person, whoever’s going to be pulling those numbers, and their reporting audience, whether it be the director of marketing or the CEO, or the CMO, or whomever, is absolutely vital, because you have to understand from the get-go what it is we’re all trying to accomplish, so we’re all aiming for the same target. Yeah, and sometimes those conversations do sound exactly like you described, where it’s, hey, this is what we’re trying to achieve and this is how we’re going to measure the results of this effort, agree or disagree.
Other times it’s, do you agree that we are aiming for this target? Do you agree that this is the business objective, this is our quantified goal? We’re going to increase X by this percent, or we’re going to decrease the cost of that by Y percent, whatever it is. Do you agree? Yes, I do. Great. Then the metrics don’t get discussed until the report happens. Oftentimes, those discussions with stakeholders are all about the goal, or what it is we’re trying to achieve. The actual metrics sometimes get brought up in conversation from the outset, but not every time, primarily because those stakeholders either don’t know or don’t care. I don’t care how you do it. Just make it happen. Prove that this effort is worthwhile, that we should continue doing it or continue to augment it, etc.
Jarrett Smith:
As you were talking about the importance of the product, it reminded me of the classic four Ps of marketing: product, price, promotion, placement. Historically, promotion is just one part of marketing, and there’s been a recognition that marketing can and should involve many other things. Obviously, as a marketing team, you’re not going to be deciding the next great program that’s going to be unveiled at your school. But I think it is good to remember that as a marketer, you do tie into product. You do have something useful to contribute, and you have a perspective that your provost might be interested in, that the dean of a particular college might be interested in, that they wouldn’t otherwise have access to. Hopefully, within your organization, it’s okay to express that perspective and lend a hand as they try to make the best decisions that they can.
Chris Sietsema:
Exactly. Yeah, I wish there were a fifth P. Maybe it’s performance, because that absolutely needs to be configured into the whole marketing schema, for sure. The other thing that’s really rewarding about measurement and analytics, from my perspective, is that when you uncover those juicy insights or those nuggets that we can show our reporting audiences, whether it be, to use your higher ed example, the dean or the provost, it really is extremely gratifying. For a lot of those people, they build a program, they build a university, they build a curriculum, and oftentimes they don’t necessarily see how its intended audience reacts to it unless that audience is in a classroom.
Up until then, we have to rely upon measurement to understand what resonates with the audience, which programs are really interesting, what aspects of specific programs are most interesting, that kind of thing. There are a lot of ins and outs to the measurement process, including providing leadership with visibility into the front lines, essentially.
Jarrett Smith:
Absolutely. Chris, we’ve talked at length about all the various challenges of measurement and analytics, but I think you would probably agree that some organizations do get it right. I know you’ve worked with a lot of different folks, not just in higher ed but across various industries, and it is possible, I guess, depending on how you define it. But is there anything that you would bring to us, maybe as a common thread, or what kind of universal lessons might we apply? If someone’s listening to this and they’re saying, okay, I know we’ve got a lot more we could be doing, or we could be doing better, where should they focus their energy? What are the common threads that tie together the folks that are doing a good job?
Chris Sietsema:
Yeah. When we first started talking, we noted that one of the biggest challenges with analytics specifically is that there isn’t a standard. Every situation is unique. Every marketing program is a fingerprint, right? They’re all a little bit different. However, to your point, there are some common threads or common elements that I seem to find with successful measurement programs. The first is, and this is one of the first questions I ask when I’m entering a new analytics engagement, do you have documentation? Is this stuff written down?
Because if they do have documentation, you can almost prove that there’s a process. If they don’t have documentation, you automatically are a little bit … It’s not clear whether there’s a true process or not. One of the mantras that I have used before, and you’ve heard me say this, is: if it’s not written down, it doesn’t exist. Even the simplest measurements, the way that things are configured, the way that we pull numbers for reporting, and things like that: write it all down, because that’s knowledge that has to be shared, and it’s also knowledge that has to be referenced later on.
Focus on process and documentation; the two have to go together. The other thing I would note is that a lot of times, when we think about the final report, we tend to think, well, what are some of the other reports that I’ve seen, and what are the metrics that they utilized? It’s like, ugh, I don’t know, because what you really have to do is make sure that your reports reflect the business objectives that you’re trying to achieve. If your business objective is, get more leads, whatever that number is, however that’s quantified, leads should be at the top of the report.
We really have to avoid including measurement tables or charts or graphs that provide true but useless information. All the information that we provide in our reporting should have some kind of actionable next step, like a yes-and: something that we can read and interpret and then know exactly what we should do the next time we run a campaign, or tomorrow, or within the next hour. They should all be actionable. Avoid true but useless scenarios, and really focus on actionable metrics and insights that we can rely upon to make stuff happen.
The other thing that I would say is that I think … I come from an agency background, and we were definitely guilty of this, and I don’t know if your agency has been guilty of this as well, but when we think about providing a report, I think we have the notion that more is more. Like, here’s your report. Thud.
Jarrett Smith:
Let me justify my hourly rate.
Chris Sietsema:
Exactly. Yeah, I really want them to think that I know my stuff, and I really want to prove that I worked really hard on this based upon the size of this document and the number of pages in it. I say that’s baloney. The truth is that the reporting format should be aligned to the audience. If you are talking to the provost, or the dean, or the CEO, I’m guessing they have limited time. As a rule, I try to make most of the reports I provide one page. Just one singular page, and that’s all you get, because it really forces you to eliminate those true but useless scenarios, focus on the actionable stuff, and get right to it. It’s the whole, be bright, be brief, and be gone. Get in and get out.
Jarrett Smith:
It reminds me of the quote, well, at least the internet attributes it, I think, to Mark Twain, about, I would have written you a shorter letter if I’d had more time. The internet probably attributes that to Abraham Lincoln too, I don’t know, and Einstein, one of those guys.
Chris Sietsema:
One of those guys, yeah.
Jarrett Smith:
Smart dude back in the day.
Chris Sietsema:
That’s exactly it. I think the value that we provide to our reporting audiences is not the breadth of data, it’s the clarity of our next steps. It’s the interpretation that we can provide and say, here’s the approach that we should venture into next.
Jarrett Smith:
Yeah. For some reason, while you were saying that, maybe because I’m in Florida, an orange juice metaphor appeared in my mind: the analytics person is the one selecting the oranges from the tree and squeezing them, but by the time it makes it to the executive level, it really needs to be orange juice concentrate.
Chris Sietsema:
Yeah. I’m going to use that. Yeah.
Jarrett Smith:
If someone’s listening to this, we do have a lot of higher ed leaders and folks in decision-making positions, people in charge of guiding teams and staffing teams, who listen to this podcast. I can imagine there are folks listening right now saying, okay, I hear what you’re saying, but I don’t know that I really have the right person on my team, or at least I don’t have the right role defined, and I think we’re going to need to develop our organization in this way if we’re going to do all these wonderful things that we keep talking about with analytics.
I guess, how do you think about and advise people when they’re thinking about their own team and expanding its analytics capabilities? What advice might you have for us?
Chris Sietsema:
You focused on the people aspect, so I will take that and run with it as well. Because oftentimes, when people think of analytics, it’s like, what tools should we be using? What software license should I get next? That kind of thing. Really, it starts with the human resources piece of it. What I’ve seen from my clients, and this is outside the higher ed space specifically, but with other industries, is that for those organizations that have an individual or individuals that are specifically dedicated to the discipline of analytics, some of them do great and some of them struggle.
When they struggle, it’s because they are almost too close to the data. They’re too focused on all the data. It’s one of those can’t-see-the-forest-for-the-trees type of things. From a leadership standpoint, I would say you probably have the right people on board already. It’s just a matter of changing their view so they understand your business and its objectives just as well, if not better, than they understand something like linear regression, something very nerdy, right?
You don’t need a nerd with a calculator. You need a nerd who can eloquently articulate the core reasons your company exists in the first place. That’s where I see people struggle the most. It’s like, hold off on the data just a little bit. What is it we’re trying to do here? Focus all your energy and efforts on that. That’s where I’ve seen some analytics folks struggle a little bit: they’re trying to fill a role from an analytics perspective, but not fill the role within the organization.
One thing I would say, too, if you’re looking for somebody who would be a great analytics person: I definitely think a background in all the math, and the science, and the art of it, and having good experience with all the tools, is important. But the one quality that I see that’s consistent amongst analytics people is that they are problem solvers who are just insanely stubborn. The type of people who cannot stop looking at something and trying to figure it out until it’s figured out. They examine it from every possible angle.
They’re persistent to a fault. I think that sounds negative, but it’s actually a good thing from an analytics perspective, because that’s where some of that creativity we spoke about earlier comes in: if we can’t solve it this way, let’s look at another angle to solve the problem, which can be really great. Oftentimes, too, especially with the agencies I’ve worked with or trained, there’s a question about, all right, do we go find an analytics person, or do we just take one of our marketing people and train them to do analytics?
It’s like, well, I don’t know. It really depends, because it absolutely depends upon the individual that we’re talking about. For example, I work with some agencies out in Arizona or California, or on the West Coast, Colorado, etc., where it’s, hey, we have this person on our team. He or she is a great account manager and they really want to learn more about analytics. Then when you talk to that person, they just really want to be a great account manager, and there’s nothing wrong with that. But unless there’s a passion for the metrics, and the ability to pull those numbers and configure all the data and do all the tagging work, to really get your hands dirty with all that work, as well as to lead the meeting in which the insights are provided, they’re going to struggle a little bit.
There has to be a passion for both, if that makes sense. Because oftentimes, when we try and convert a person who doesn’t have a lot of analytics experience into something that is more of that analytics person, if that passion isn’t there, there could be a struggle there. Then the last thing I would say is invest in the necessary tools for sure. That’s not easy for everybody, but depending upon what questions we’re trying to answer, it might be worth whatever the budget requirement is to get that testing tool, or that analytics upgrade, or that tag management solution, or whatever the case might be.
But for an analytics person to really be powerful, not only do they have to have that creativity, stubbornness, and those problem-solving qualities, but they also have to have the right tools at their disposal to do the work.
Jarrett Smith:
Okay. I’m going to ask a question that may be so ridiculously broad that it’s not possible to answer it in a good way, so just tell me if that’s the case. Just thinking through the kinds of digital marketing operations that you have encountered over the years, what does that tool stack typically look like? I mean, things come to mind like Google Data Studio, which a lot of folks in higher ed have adopted over the years. It used to be that not many folks really knew how to use it. Now I think it’s pretty common.
You’ve got Google Analytics, you mentioned testing tools, but what else do you consider to be the basic tech stack for good quality digital marketing? Is that even a question that makes sense, or is it too particular to the organization?
Chris Sietsema:
No, it’s a good question. I think there are probably three categories of items and then some extra stuff. Three primary categories and some wildcards. The first primary category is whatever tool you’re utilizing to track and bring all the data and all the behavior that you’re measuring into one cohesive tool, whether that be Google Analytics or Adobe Analytics, or Google Analytics 360, which is the souped-up, non-free version of Google Analytics, that kind of thing.
We talked about configuration before. Configuration basically equates to making sure that all of the behaviors that occur with an advertisement or on a webpage or a landing page or an app can be tracked adequately, and typically, for that type of thing, you need some kind of tag management solution. The most common one is Google Tag Manager, but there are others out there. Those tags are basically little tracking codes that you place on the website to enable you to measure certain behaviors that occur that are important to you.
So, you’ve got your primary analytics repository and your tag management solution. Then, once it’s time to actually prepare a report, you need some visualization tool. For many, that visualization tool is Excel, or Numbers on the Mac, or whatever; they basically pull the numbers out by hand. That concept is not foreign to me. That’s how I used to do all my reports up until probably two or three years ago. I would pull the data and make it pretty on my own. But at the same time, there are visualization tools like the ones you mentioned, Google Data Studio or Tableau. Oh my gosh, there are so many. I could throw a rock right now and hit eight of them, all the Domos and the Power BIs and all those different types of tools. They’re out there. When you asked the question earlier, I don’t think there’s one right tool, but those are the ones I see most often.
As long as you’ve got an analytics database or repository like Google Analytics, a tag management solution, which you’ll likely need, and a visualization tool, you’re good. Then there are those wildcards for things like call tracking, or landing pages, or A/B split testing, all those different types of things. Then, to make matters a little more complicated, certain channels that we rely upon heavily, such as Google Ads, Facebook Ads, even YouTube, all have their own little dashboards, Facebook insights and things like that, that we need to marry in. That can be a little challenging, but those are the core tools I think you probably have in your analytics toolbox.
Jarrett Smith:
Very good. Let’s bring this down to earth. Maybe someone is saying, all right, I hear what you’re saying. I’m on board. I want to do measurement right, and they’ve got a new project coming up. Maybe it’s a new microsite they’re putting up. Maybe it’s a new campaign that’s just getting kicked off, and they’re thinking, all right, now’s our chance to think ahead on this one. Can you give us some concrete steps that we might take to experiment with doing analytics right on this particular project?
Chris Sietsema:
For sure. The first steps involve something pretty archaic, like a whiteboard or a piece of scratch paper, honestly. Let’s just bring it back to basics. The first thing we want to do, as we alluded to earlier, is ask, what is it we’re trying to achieve? Just write down the business objectives and quantify them if you can. That’s step one, and it’s not an easy step a lot of times, right? We have to determine what those objectives are. I will say, though, if you’re trying to accomplish 23 objectives, then you don’t have an objective. Let’s limit it to maybe three to five, six max, something that can be accomplished.
Then write down, for every objective, the key questions that surround it: what is it we’re trying to answer? If the objective is, I want to generate X amount of leads in February 2021, then there are probably some questions that surround that goal, such as, what is the source of those leads? What is the conversion rate of those leads? How are people accessing the website? How many times do they need to see the website before they feel comfortable filling out the lead generation form? All those different questions you want to answer, right?
So, write those questions down. For every question you’ve got, and you might have a handful, maybe eight to 12, assign a metric: lead volume, conversion rate, lead volume by source, conversion rate by source, all those different types of things. Then prioritize those metrics. Which are most indicative of success? What are the most important metrics, or the most important questions that we’re trying to answer? All this time, we’re just sketching this out. We’re writing this on a whiteboard or a piece of scratch paper.
Now that we’ve got our questions identified, our metrics identified and defined, and our metrics prioritized, the report almost writes itself. It kind of configures itself, where you can start to say, all right, because this is a top-priority metric, it’s going to be at the top of the report. These nice-to-know and complementary metrics and questions that are going to help us determine whether we accomplished our business objectives are going to go next, and then I’m going to provide some notes and some interpretation on that.
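As an illustration of this step, here’s a minimal sketch of the whiteboard exercise as a data structure: each row ties a question to a metric and a priority, and sorting by priority yields the report order. The objective, questions, and metric names are examples, not a prescribed taxonomy:

```python
# Objectives -> questions -> metrics, with a priority that sets report order.
# Names and numbers are illustrative.
measurement_plan = [
    {"question": "How many leads did we generate?",
     "metric": "lead volume", "priority": 1},          # top of the report
    {"question": "Which sources convert best?",
     "metric": "conversion rate by source", "priority": 2},
    {"question": "How many visits before someone converts?",
     "metric": "sessions per converting user", "priority": 3},  # nice to know
]

objective = "Generate X leads in February 2021"
print(f"Objective: {objective}\n")

# The report "almost writes itself": sort by priority, top metric first.
for row in sorted(measurement_plan, key=lambda r: r["priority"]):
    print(f"{row['priority']}. {row['metric']:<28} answers: {row['question']}")
```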
The structure of that report is pretty much good to go. You’ve got your metrics prioritized, and you know the structure in terms of who your reporting audience is, like that single page versus maybe a couple of pages, depending upon how detailed you want to get, and so on. You also, at that point, probably need to determine the cadence or the frequency with which you deliver that report. Oftentimes the question comes up, Chris, how often should I be preparing a report? Should it be monthly? Should it be weekly?
The answer is, how often can you enact change? How often can you make decisions and actually work with them and apply some of the insights that you’ve derived from your report? If you can do that every week, then you should probably do a weekly report. If you can only do that once a month or once a quarter, then change your reporting cadence accordingly. I would say, with all your reports, automate the data collection if you can, just because it’s easier and you can spend more of your time and resources actually reading the tea leaves, looking at those numbers, and trying to understand what’s going on.
So, automate if you can, but don’t limit yourself to just the automated metrics. You might need to dig in a little bit to what’s going on in Google Ads, why that ad on Facebook worked so well, or why that ad on LinkedIn didn’t. Then, once you’ve got all your data configured and set to go, it’s time to actually interpret that data. The greatest value you can bring is your ability to translate data points into meaningful insights and actionable next steps. That’s how we get paid as analysts. We don’t get paid for numbers. We get paid for the actions that come out of those numbers, and that’s the most important thing to remember.
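To ground the “automate the data collection” step, here’s a hedged sketch that computes lead volume and conversion rate by source from a hypothetical CSV export; in practice the data might come from the Google Analytics API or a connector, and the file name and column names here are assumptions for the example:

```python
# Sketch of an automated pull: lead volume and conversion rate by source.
# Assumes a hypothetical export with columns: source, sessions, leads.
import pandas as pd

df = pd.read_csv("sessions_by_source.csv")  # illustrative file name

report = df.groupby("source").agg(
    sessions=("sessions", "sum"),
    leads=("leads", "sum"),
)
report["conversion_rate"] = report["leads"] / report["sessions"]

# Lead with the metric tied to the business objective, then the detail.
print(f"Total leads: {report['leads'].sum()}")
print(report.sort_values("leads", ascending=False)
            .to_string(formatters={"conversion_rate": "{:.1%}".format}))
```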
Jarrett Smith:
Really good advice and very actionable, Chris, thank you for that. If folks want to connect with you more, to talk more, geek out on all things analytics, where’s the best place to connect with you online?
Chris Sietsema:
A couple of places. I’m not too active on the socials these days. LinkedIn’s probably a good place; you can find me there as Chris Sietsema. My email is chris@teachtofishdigital. The website’s teachtofishdigital.com. Those are probably the primary places to get ahold of me.
Jarrett Smith:
Good deal. Well, Chris, thank you for your time today. This was such a fantastic conversation. Thank you.
Chris Sietsema:
Thanks for having me. Appreciate it.
Jarrett Smith:
The Higher Ed Marketing Lab is produced by Echo Delta, a full service enrollment marketing agency for colleges and universities of all sizes. To see some of the work we’ve done and how we’ve helped schools, just like yours, visit echodelta.co. If you enjoyed this podcast, please subscribe and leave a review on Apple Podcasts, and as always, if you have a comment, question, suggestion, or episode idea, feel free to drop us a line at podcast@echodelta.co.