Making DEI Practical at University of St. Thomas

Diversity, equity, and inclusion is top of mind for many higher ed marketers, but figuring out the practical side of applying DEI concepts to the actual creative work marketing teams produce is often easier said than done. In this episode, we’ll hear from Kymm Martinez and Katie Jensen at the University of St. Thomas, Minnesota about how they made DEI less intimidating and more actionable for their marketing team.

We discuss:

  • How the St. Thomas marketing team developed a shared vision around DEI
  • The steps they took to evaluate their existing marketing materials from a DEI perspective
  • The DEI Viewfinder tool they developed to help their team create more inclusive marketing materials
  • Practical advice for making DEI a part of your team’s day-to-day operations

Subscribe to the Higher Ed Marketing Lab

Apple Podcasts

Spotify

Stitcher

Google Podcasts

Transcript

Jarrett Smith:

You’re listening to the Higher Ed Marketing Lab. I’m your host, Jarrett Smith. Welcome to another episode of the Higher Ed Marketing Lab, I’m Jarrett Smith. Diversity, equity and inclusion is top of mind for many higher ed marketers, but figuring out the practical side of applying DEI concepts to the actual creative work marketing teams produce is often easier said than done. For instance, how exactly does a team review their work from a DEI perspective, without putting their colleagues on the defensive? Who decides when work needs to change? And how do you portray your school as an inclusive and welcoming community without overselling the level of diversity that actually exists on your campus?

In this episode, we’ll hear from two guests who’ve tackled these questions head-on and have some valuable learnings to share. Joining us is Kymm Martinez, VP of Marketing, Insights and Communications, and chief marketing and communications officer at the University of St. Thomas, Minnesota, and Katie Jensen, AVP of Insights and Analytics at St. Thomas. Like many university marketers, Kymm and Katie have grappled with the challenges of applying DEI concepts in their team’s creative work, and they have a unique perspective on how to make it less intimidating and more actionable.

We start by exploring how the St. Thomas marketing team developed a shared vision around DEI, and how they organized a thoughtful evaluation of their existing marketing materials. Then we hear about the DEI Viewfinder, a tool their team developed to help evaluate their creative work as it’s being produced. And finally, Kymm and Katie share their best advice for making DEI a regular part of your team’s operations. So without further ado, here’s my conversation with Katie Jensen and Kymm Martinez. Kymm, Katie, welcome to the show.

Kymm Martinez:

Welcome. We’re so excited to be here.

Jarrett Smith:

Well, I’m so excited to have you here, and I think it’s going to be just a fantastic and very relevant topic that our audience is going to get a lot out of. Before we dive into that, I would love it if you could just tell us a little bit about the University of St. Thomas in Minnesota and your roles there.

Kymm Martinez:

Well, I’ll start with the University of St. Thomas. We are the largest private university in the State of Minnesota. We are proudly a Catholic university. We are among the top 20 national Catholic universities in the country. We have eight schools and colleges, about 10,000 students, and lots of exciting things happening here in Minnesota. We have just launched a brand new school of nursing. We also are the very first university in modern NCAA history to go directly from D-3 to D-1, which just happened in July. So we’re super excited about that.

There’s just a lot of really exciting things happening at the university, and so it’s been fun to be a part of that. I’ve been here for about five and a half years. My background, my title, I am the chief marketing and chief communication officer here at the university. I came from industry, as I’ve learned to say in higher ed, about five years ago. I was at General Mills for 20 years prior to that. And I also hold the title of Vice President of the Marketing, Insights and Communications team here on campus.

Katie Jensen:

Yeah, so I’m Katie Jensen, I lead the insights and analytics team within the marketing, insights and communications team which Kymm leads. And so my team is really all about bringing the audiences to the forefront, and helping people understand and have empathy for the prospective students we’re trying to talk to, or our donors, or our alumni, and just really understand where they’re coming from, as well as measure and track our success in terms of our digital campaigns, our websites, all the analytics from the marketing side work with my team as well.

Jarrett Smith:

Excellent. So we are here today to talk about all things DEI and really how to make that practical for marketers. And I think this is one of those topics that is just on everybody’s mind. You may have seen it. I came across a poll, I think it was on Inside Higher Ed from the Art & Science Group, and they were talking about how prospective students view DEI, and it is top of mind. It is obviously not the only thing they’re considering, but they are looking at that as they’re evaluating schools. And I’m just curious kind of to start things off, as university marketers, where do you see yourselves fitting within your institution’s broader DEI efforts?

Kymm Martinez:

Yeah. I feel like we should also perhaps start off with a little bit of a disclaimer about, we’re not DEI experts. We have a ton of faculty here and other folks within the university that could claim a more theoretical expertise than we have. I would call us more practitioners and we know how important the DEI journey is, and so we have committed ourselves and our department to being on it, but just full disclosure, kind of on that front that this is a learning journey. And I know we’re going to talk a little bit more about that as we kind of get into the podcast.

But I think that the role of marketing and communications at our university or any university is critical to DEI because we’re the ones that are the storytellers that are really fashioning the narratives about the university and helping everybody to have the perceptions about the university that hopefully are accurate. And so making sure that we are really being inclusive with our messaging is really, really important. And so I would say that we’re essential. But of course, when we do think about marketing and communications, it’s really important that it falls on a base of something authentic and that it is embedded in the values of the university, so obviously the entire university plays a big role in making sure that we’re walking the walk on this.

And so as marketers, we are shining spotlights on things, but it’s up to the whole university to really feel it at its core because otherwise we shouldn’t be marketing it. So yeah, we’re critical I think to the university’s efforts to get the story out, but it also needs to be an authentic journey for the university.

Katie Jensen:

The other thing I would maybe add to that, Kymm, is just we work with every part of the university, and so we have the ability to see patterns across different teams or needs across different teams. And so when we think about the role we’ve played in DEI, a lot of times it’s, we’re just in the conversation on a regular basis, and so we have the ability to influence and kind of help push things forward. Even just the way that our faculty communicate with students, or the way that our staff communicates with each other also needs to be inclusive and bring this sense of belonging because we want our entire community to feel included and welcomed here. And so some of our role too is just, we’re there as colleagues and coworkers trying to push this ahead for everybody.

Jarrett Smith:

I think that’s a great perspective. So St. Thomas, like many schools, is associated with its faith tradition, and in this case you are a Catholic school. I’m curious to what extent that has played into, or possibly created any challenges with regard to DEI. I mean, I think I’m thinking specifically about individuals that identify as LGBTQIA+. Has that presented any sort of unique hurdles that you would call out for your school?

Kymm Martinez:

I guess the first thing I would say is our faith tradition is what calls us to this. We have a conviction of dignity, which is very much rooted in Catholic social thought, which is about respecting the dignity of all and loving everybody and really welcoming diversity into our community. And we absolutely do that. So I would argue that everything that we do as a university on the DEI front, including welcoming the LGBTQIA+ members of our community here, is rooted in Catholic social traditions. So it is part of who we are as a university to accept and embrace, and to really allow everybody to bring their full potential, and to know that we as a community are stronger because of all those diverse perspectives.

Now having said that, there’s a wide variety of perspectives within the Catholic faith about how welcoming institutions should be. And that’s the thing: we are Catholic, but we’re also a university. And so we are just making it very clear that our convictions call us to welcome and embrace. We do celebrate and uplift all members of our community here. And the best way we can just make sure that we overcome any resistance to that, again, is to just clearly communicate, “This is who we are. That’s not up for debate. It’s part of our Catholic social teaching,” and be very honest about that so that nobody is left to wonder what our position is on various topics. So we make it very clear it’s part of who we are and it’s part of our Catholicism.

Jarrett Smith:

I think that’s a great perspective and thank you for sharing that. I think the way your team got started on the DEI journey and the actual marketing products you’re putting out there, the marketing and communications products you’re putting out there, is really interesting. And Kymm, you said at the beginning of this, you said, “We are not approaching this from a fully informed theoretical standpoint, we’re really coming at this from a very practical angle.” But I think the way you went about as a team deciding, “How are we going to put this into practice and how are we going to make progress on our journey towards better diversity, equity and inclusion in our marketing materials?” was super interesting. Could you just kind of tell us about, I guess at a high level, kind of a 30,000 foot view, where did you start? How did you approach this in a sensible way?

Kymm Martinez:

Actually it started with the university’s convictions. As I mentioned, one of them is dignity, and we obviously have a university definition of what that means. But as a department, we were going through the university’s convictions and then taking a marketing, insights and communications lens to them and writing, “What does it mean to embrace dignity and diversity in marketing and communication?” So trying to articulate what that meant was where we started. And this was probably about three years ago. In that first paragraph we came up with, we were talking about wanting to make sure that we were never treating any members of our community as tokens or trying to oversell what the actual experience of being here on campus looks like. So three years ago, we kind of started there and started on the journey.

And then just recently we went back to that description to just see, is there any updating that we would want to do? And because of the work that we’ve been on, we’re now specifically saying that we’re an anti-racist department. We want to root out systemic racism in our place. So our journey, even in terms of our language, has really evolved. But it did start from that place. We ourselves are white cisgender females, so recognizing that and diving into white privilege and what that means, and again, recognizing that we don’t want our communities of color, either in our department or in the university, to always be the ones educating people on what DEI should look like. We said, “Well, we’re going to take that on.”

And so coming from that articulation of it, we started a committee that we call the Representing DEI with Integrity Committee. Actually, it initially started as the Marketing DEI with Integrity Committee, and then we thought, “You know what? That doesn’t feel right to say we’re marketing it. We want to represent it.” So even that language was a bit of an evolution, and we have members of our group meeting monthly, bringing forward issues and topics. And one of the ideas that the group came up with is, maybe we should do an audit of some of our marketing and communications, and get feedback on how we think we’re doing. We are looking at this stuff every day, we’re thinking about this stuff every day, but potentially we’re missing something. And so that’s where Katie’s team actually came in to help facilitate that audit. So I don’t know, if you want to talk a little bit about that?

Katie Jensen:

Yeah. So with this audit, part of our goal was to, as Kymm said, just get outside of our team, get some fresh eyes on things, but also knowing that, we’re not experts in DEI and wanting to tap into the experts we have on campus, or at least people who have been engaged in the journey. And so we worked with a group called SEED. It stands for Seeking Educational Equity and Diversity, which is a national program. So we happen to have a chapter of that here at St. Thomas. And so we partnered with them and we recruited a handful of people who had graduated from this program where they dive deep into DEI concepts, and we said, “Hey, look at our materials and tell us what you see. Put sticky notes all over it.”

And so we gathered a ton of feedback. We then had focus groups with them to really kind of dive deep into the topics and learn more, and we came out of it with a really interesting perspective then on all of the things we had missed, even though we’d been paying attention and an ability maybe to spot those issues a little more easily. So from there we developed what we’re calling the DEI Viewfinder tool, which kind of lays out nine questions you can ask yourself to help develop more inclusive materials.

Jarrett Smith:

Yeah. And we definitely want to set aside some time to talk about the Viewfinder, because it’s super cool and just such a practical tool. If we’re going to dive into that, I guess one question I have is just, how did you decide what to audit in the first place? I am sure your team is turning out a lot of materials. Where did you focus your effort to make it doable?

Katie Jensen:

Yeah. I mean, the key was really that doable piece, because we spent a lot of time actually just going round and round about, “What should we put into this audit?” And so, for anybody who would want to do something like this, my advice would be just keep it simple. Pick a few things, pick things that are really important, really central pieces. So for us, we picked our undergraduate view book. We picked the program pages that we have for each of our 150 plus majors and minors, because we know those are some of the first places that our undergraduate students see and get a perspective for, “What is this community like?”

And then we also picked something from our graduate side, so we picked from our Opus College of Business. They had an awareness campaign that we also put into it. So trying to find a bit of range and also kind of pick some of those big high profile items that we could learn from. We also found that, especially the view book, a lot of what we do in other places kind of flows from what that view book creative looks like. And so that was another reason to pick a piece like that, that sort of is representative, it has a lot of tentacles in other work that we do.

Jarrett Smith:

Yeah. It’s going to set a lot of direction for the other materials that you create. I know we’re in a podcast, so it’s a little hard to kind of visualize some of these things, but maybe we can paint a picture with words. I’m curious, in your audit, were there any things that kind of leapt out at you, things that surprised you, that the team came back with that you just weren’t expecting, or that opened your eyes to maybe seeing some things in different ways?

Katie Jensen:

Yeah. What’s interesting about it is that most of the stuff they came back with were topics we were aware of. So things like, “Don’t reinforce stereotypes. Don’t use language that might be perceived differently by somebody from a different background, code words or idioms, for example.” So these are things we knew about, but the trick was getting ourselves to actually spot them. And so I think our biggest aha from it is if we could have sort of a mental checklist, which makes it sound simple and it’s not, but a mental checklist of, “Here are some of the things to check for,” and then examples of how that actually comes to life. I think it was the examples that really unlocked for us a much better ability to just be able to spot those things.

And then I would say too, just it reinforced for us a sense of how important it is to get other people’s eyes on it. Even if it’s within our own team, people who haven’t been working on the project, because you swap in a headline or you swap in a picture and you Frankenstein this thing together and suddenly you have a combination of things that you might never have put together in the first place. And by the time you get to that final product, you can’t even see some of the issues that you have. So that was for me, one of the biggest ahas.

Kymm Martinez:

Yeah. I guess I would say the other aha for me was just how many comments people had and how many things they spotted. Again, we’ve been on this journey, we thought we were… I mean, we obviously knew we were going to learn something or we wouldn’t have undertaken it in the first place, but the sheer breadth of the feedback that we got back was pretty interesting. And then just to underscore another point about what Katie said, the examples are key because it’s one thing to say something like, “Don’t reinforce the stereotype.” I mean, who’s not going to nod their head to that? But it’s the examples of like, “Did you realize you were reinforcing this stereotype, or this one or this one?” That was really what brought it to life for people to be able to go, “Oh, wow.”

Because it’s easy to agree with statements and things that you know you should do, but once you see it in action, it’s more important. So what’s interesting about this whole thing is we started it because we thought it was just going to be for us. We just were using it within our team as a tool, but the insights were so rich that that’s when we said, “Wow. This has implications for anything that we’re doing, anytime we’re communicating, even internal communications.” And so that’s when we realized, “Hey, we need to package this up so that others can learn from these insights.” And we started internally at the university and then now we’re going external with it as well, just to help spread the word of things that were insights for us that we hope others… If we can help somebody else from making a mistake that we made, great, we’ll all be better off.

Jarrett Smith:

I think it’s such an interesting point you both make that, you’re both approaching this thoughtfully, like nobody is going out to create a piece of creative that’s going to reinforce a stereotype or use language that is loaded, and yet it happens in the moment. And I think it’s almost like when you’re that close to the creative, maybe it’s hard to read the label when you’re inside your own bottle. You’re just too close to it, you’ve got too many other considerations that you’re thinking about, and it can kind of fall to the background. I think that’s just such an interesting point. Just so our audience has a sense of the type of things that came up, could you throw out maybe a couple of examples of things either big or small that were brought to your attention that you hadn’t seen before?

Kymm Martinez:

I mean, I’ll start off with one that, so as a Catholic university, one of the headlines that we love to use actually either digitally or in other places is, Blessed are the Nerdy. We love that headline because it reinforces the academic excellence that we represent, but at the same time it has a little bit of humor and a nod to our faith-based tradition. So Blessed are the Nerdy, when we were originally looking at using that, we were using a stock image of a nerd. And in this case, the nerd happened to be Asian, and we were using that sort of as a juxtaposition. And that’s a classic example of, “You are reinforcing a stereotype by using an Asian in this particular image.” So that’s a great example of like… And we also were using stock photography, which again, we don’t love to use, but sometimes if you can’t get the authentic shot in your community, it’s easier.

But so we went back to student affairs and we said, “All right, this is what we’re trying to convey. We need somebody that is going to be comfortable looking kind of a little bit more bookish. Can you help us here?” And they came forward with a student, an actual student who loves to dress in bow ties, like that’s his authentic self to do that. Happened to be an African American guy with a wonderful smile and just a wonderful way about him. And he was excited to be featured. And so we flipped it. So we have him, an African American male, as our model for Blessed are the Nerdy. And so anyway, that’s a great example of a stereotype that we wouldn’t have necessarily thought of. I’m sure I could think of others offhand, but do you have any others that are top of mind for you?

Katie Jensen:

The other one I’d maybe mention is just the idea of subtleties and how if you are from a marginalized group, you might really pick up on things. So for example, we had one of our ads from the Opus campaign, there was a person who you couldn’t immediately tell, “Is it a man or a woman?” A little bit androgynous. And so if you are part of the non-binary community, it just signals a bit of, “Oh, maybe there’s a place for me here.” Or we had an image in one of our undergraduate program pages on the website where a young man is holding up a frog and he’s got a wedding ring on. So that signals maybe if you’re an older, maybe non-traditional student that this might be a place welcoming for you as well. So I thought that was really interesting too, not just the things to avoid, but the things to include that might signal something in a positive way, not in a message you didn’t intend sort of way.

Kymm Martinez:

One other example that pops to mind too is, in our, I believe it was also in our undergrad view book, our creatives were working with illustration and so we would have the picture, but there were little doodles kind of around the picture, just for more of a friendlier feel and tone, and that was going all the way through the book. There was one photo of a professor talking to a couple of students and there’s no words in these doodles, so they’re all just kind of like doodles. And above one of the black student’s heads is kind of a thought bubble that has just lines in it. Because again, we’re not using words, but one thing that was pointed out to us is, “Are you trying to say that she has no thoughts, that she couldn’t come up with anything to say to the professor?”

And I think that that’s actually a really interesting thing because if you put that thought bubble on top of a white male student, you’re probably not going to get the same person drawing the same conclusion. So it’s a good example of just context and being careful, even about unintended messages in what you’re trying to say. Which actually ladders to another point that I think is really important to make. You were talking before about just putting these things together. Our creative team, we have an internal creative team who are just amazing and they do really wonderful work in partnership with all of our schools and colleges. And it’s really important to remember, this DEI journey is everyone’s job and it’s not their fault if they come out with something like that instance, for example. We’re not going to say, “Why did you put a thought bubble with no words in it above the black woman?”

I mean, it’s all of our jobs to actually think about this and catch this, and you can’t personalize the fact that you missed it because we all missed it too. I mean, this went all the way through Katie, it went through me and it took this group, this external group looking at it with a different lens to find it. So that’s another, I think really important lesson on the journey is to not expect perfection of yourself and to lean on others. And again, not just to lean on your communities of color to help you with that. That’s not fair, but to lean on others and to expect that it takes a village in some respects to get this right.

Jarrett Smith:

I think that is such an important point, Kymm, the attitude with which you approach your team about… And from the very beginning you said this was rooted in sort of the authentic principles and values of the institution. You were thinking about, “How does this apply to our department? Let’s try and articulate that.” And then as you said, you’ve gone back and you’ve evolved this over time, and so you’re kind of taking people on this journey where it’s like, “Okay, we’re not going to achieve perfection, and it’s okay. We’re all doing our level best to try and continue to improve and to be more mindful about the products that we’re creating and what messages they’re sending.” I think it’s such an important point.

But that does bring me to my next question, which is, at some point you’ve got to ship work, you have deadlines to meet. And I think in this case, it’s one of those areas where you will never achieve perfect. You will always be able to point to something in the work and say, “That’s not quite ideal. There may be a better way to handle this.” So how do you talk to your team about it? How do you on your own think about kind of balancing the need to make improvement and be mindful, thoughtful about what you’re communicating, but also at the end of the day ship work that you know to some extent is never going to meet that perfect standard.

Kymm Martinez:

I mean, I think the trick is to just acknowledge right from the get-go that this is a journey. The other thing is that it’s constantly evolving. This landscape is evolving, language evolves. So even something that might have been appropriate a year ago maybe isn’t appropriate today because of new learning out there. So even if you studied up and got the A on the test originally of doing all the things right, again you have to stay current on this. And as a result of that, it’s impossible because there’s always going to be something. So I think it’s just about setting that expectation that you want to do the best that you can. It is obviously about focusing on intent, although I think it is important to also remember impact. So it’s not enough just to say, “Well, we intended good, so sorry if the impact wasn’t there.”

I think we have to own the impact that we make in addition to our intentions. But just again, to assure everybody that this is a journey. I mean, one of the ways that we tangibly reinforce that for our group is we have personal diversity goals that we ask every employee of the marketing, insights and communications team to commit to at the beginning of the year as part of their annual objective setting. And we don’t dictate what those are. Those are very personal. So you can choose for yourself what it is, but I expect you to have like two to three of your own personal goals, that’s going to help you on that journey.

It could be reading books by diverse authors or consuming media about different populations. It could be anything that you feel is going to help you on your journey, but it reinforces to the full team, “This is a journey and we expect you to get on it, but you can kind of dictate how you’re comfortable moving forward.” Because obviously we’ve got people across our team that are at very different points on the spectrum. I don’t know if you have anything you’d add to that.

Katie Jensen:

Yeah. Well, I would just say that one of the things that’s important when you’re thinking about the balance between, “Do we make this edit? How critical are we going to be of our own work?” versus, “We got to get stuff out of the door,” and we’re full up to capacity most of the time, but what’s really important is it is worth the time to take a minute and make sure that we’re sending inclusive messages, that we are not sending unintended messages. And so I think that’s been a bit of a shift for us too in opening up the conversation to DEI and being open to it all the way up the chain to Kymm, as a leader of our department. It’s expected that you’ll speak up. If you see something, you need to say something. And I think maybe prior to having some of these tools and having been on this journey, people were maybe noticing something and thinking, “Oh, maybe I shouldn’t speak up. I don’t want to derail the project.”

And now I think we’re much more comfortable being able to say, “Hey, I noticed something here, can we have a conversation about it?” It doesn’t mean we need to make a change, it just means we should talk about it and make sure we’re all comfortable and make sure we feel good about a risk we might take, or, “Hey, I saw it this way, but maybe everyone else sees it a different way.” And so it’s really about that conversation, and if we can pause for a minute to have the conversation. A lot of times it’s not throwing out the whole project, it’s swapping in one different picture or changing one headline slightly. So I don’t think it’s taken as much time as we might have feared to be able to do it, and it feels good to have taken that moment and said, “Okay, do we feel good about this? All right. Let’s go.”

Kymm Martinez:

And that’s where that committee that I was talking about also plays a role, because if you as an individual see something that you’re maybe a little bit uncomfortable with, or maybe your spidey sense has gone off but you can’t really put your finger on why, there is a place for you to bring that image or whatever it is that sort of caused your spine to tingle and to say, “Is anyone else seeing this? Am I overthinking or is there something here?” So there’s a forum actually to bring that forward to get others’ points of view. And again, then we as a group can decide, “All right. Is this something we want to approach somebody else on campus with or something like that?” Again, always with humility. We’re not the DEI police, we don’t have all the answers with it.

But I think the other message to really underscore with some of the things we’ve done, like in terms of starting off with just, “What does the diversity conviction mean for our group, the dignity conviction?” Setting up our committee, our personal diversity goals, these are all things we didn’t need university approval to do. It’s not like we waited for HR to say, “All right, now we’re formally going to add these personal diversity goals into our annual plan.” I mean, we just did it, and it’s not in our online form, but everybody has kind of their offline objectives that they have as well. So hopefully your viewers are taking from this that, this is stuff you can do and implement that you don’t necessarily need to wait for somebody to give you approval to do in order to start on the journey.

Jarrett Smith:

And we definitely, I would like to dig into implementation a little bit, but before we get there, I would love to talk about the DEI Viewfinder. Could you tell us what it is and how it works?

Katie Jensen:

Sure. Yeah. It’s a really simple tool, which is kind of the beauty of it. What the Viewfinder does is it gives you nine questions you can ask yourself, or ask with a buddy, or ask somebody else to take a look through, and it really calls out some of those key topics in terms of diversity, equity and inclusion, across all types of diversity. And so really what it does is it gives us these questions we can tick through in our heads. It also gives us common language to use with each other. So instead of saying, “Wow, that picture and that headline together is super racist,” we can say, “I think that’s reinforcing a stereotype,” or, “I think there’s some unconscious bias coming in there, and let’s talk about it.”

So it’s really a tool that you can use, and for any piece of creative, we tick through and we say, “All right, does it hit on any of these?” And if it does, we have a conversation. And so that’s one of the key things with it. The point of it is to be super critical, super detailed about the work that you’re doing. Sometimes we even catch ourselves going, “Oh gosh, are we being overly sensitive, overly critical?” And that’s the point. The point is to really take that critical look and then have the conversation and say, “What do we think? Do other people take the same thing away? Do we need to pull in other people to give us additional feedback?” And then-

Kymm Martinez:

Can I just add to that particular point too? Even when we’ve flagged something, we don’t always make the change. And I think Katie said that before, but I mean, we sometimes agree that we’re going to lean into risk, either because there’s another benefit that would be going away if we swapped out the image or what have you. So it’s not a done deal that just because somebody brings something up, we’re automatically going to be like, “Ooh, we have to stay 10 feet away from that.” It’s the discussion, and then you make a decision on the basis of the risk and the reward, the benefit to that. So I just wanted to make that point as well.

Katie Jensen:

Yeah. And one of the things that we love about it is that it’s given us that common language we can use with other teams as well, so when we see something another team has put together. Because I’m sure like many of your listeners, if they’re marketers in higher ed, they are not the only people creating content on behalf of their institution. And so it gives us the ability to say, “Hey, we’ve been on this journey too. We made all these mistakes and here’s all these examples of mistakes that we’ve made.” And it comes with a bit of humility as well to be able to have that tool and be able to show people that, “We’re on a journey and we invite you to join us. And we notice something in your work we’d like to share.”

But the meat of the tool really is these nine questions: for each question we give an overview of what it’s all about, and then we give a handful of examples that really bring it to life. And again, that’s one of the pieces we think is so critical: if you can give those examples, it really helps people kind of put it in their brains and really understand, and potentially makes it easier for them to spot it in their own work.

Jarrett Smith:

So at this point, how many different groups I guess, across your university are actively using the Viewfinder in their day-to-day work?

Katie Jensen:

I don’t know how many are actually using it. I hope they are. We’ve trained several units on it. We did a training with the entire faculty and staff of our school of education, for example. We’ve also done it with all of our social media content creators, who live across all kinds of departments within the university. We’ve done leadership academy trainings, which is our internal HR kind of professional development group. We’ve done trainings there where we get everybody from the admin for the provost, who puts together all of his emails and presentations and things like that, to faculty members, to even students in some cases. So-

Kymm Martinez:

Actually, that’s another group you’ve done it for. We have a student-run media agency on campus, which I’m sure a lot of schools do; we call it TommieMedia. And there’s been a presentation to all the reporters, all the folks that are involved with TommieMedia, because sometimes we see things that they’re posting and doing, and we’re like, “Whoa, they might be able to benefit from these questions as well.”

Jarrett Smith:

I’m curious, as you’ve rolled it out to these different groups and kind of socialized this more, how has it been received overall, and have you received any sort of specific pushback?

Katie Jensen:

Yeah. So I would say it’s been really well-received, particularly just as a very actionable tool that people can use. I think the tough part with DEI is once you’re kind of on the journey for a little while and you’re spending time and you’re learning and your eyes are so open to all the things that aren’t great about how marginalized groups are treated and you want to take some action, and it’s hard to know how to do that. And a lot of times I think people leave DEI trainings going, “Okay, I get it. I know there’s issues here, but what can I do?” And so that’s been one of the biggest pieces of feedback is just how actionable and accessible it is. You don’t have to know every stereotype that exists in the world to know that, “I’ve seen that trope before and I just want to research it a little bit and learn more about it.”

So it’s really accessible to anyone, no matter where they are on the journey. In practice, some of the more difficult conversations, and I don’t know if I’d call it pushback, have been with people who are creating content. Our creative team, people who write stories for us: this is an additional thing that we’re coming back to them and giving them feedback on, and I give creative teams a ton of credit. I couldn’t do what they do and have people tell me my baby’s ugly all day. But that’s been the toughest thing. And so what has again been helpful with this tool is that common language and the ability to say, “I think this is sending a message you didn’t intend, and here’s kind of the category that falls in, and let’s talk about that.” I think this has made those conversations easier than they were before we had something like this.

Jarrett Smith:

And again, it just underscores the importance for me about leading with the attitude of progress not perfection, awareness, mindfulness around these issues, but there is no end state where we check it off and it’s just done. And so that sets you up to be able to have a more positive interaction on those things.

Kymm Martinez:

Actually, that does remind me of another, I guess I wouldn’t call it a pushback, but another question that we get. As a university we’re on a journey, and we would like our faculty, student, and staff community to be more diverse than it is. So that leads to a question, from a marketing and communications standpoint, of where you draw the line: you want to represent an aspiration of the university such that you can draw from a population that will help increase diversity, but at the same time, you don’t want to overpromise and create an image of something that doesn’t actually exist here. So that’s another conversation that we do get into when we roll this out.

And sometimes I think people are looking for, they want like the answer, they want the black or white answer. So every picture should have 25% people of color and 75% white people, because that’s what the diversity of the student body in the first year class looks like. And it’s not that simple. And again, all these things are nuanced and it’s a little bit more gray. So the line that we walk is, we do want to nod to the university that we want to become, so we think it is all right to lean a little bit into aspirational, but we never want to cross the line to where we’re selling again, a look, a tone, a feel that just doesn’t exist here at the university. So we have dialogues around that.

An example is we were doing a campaign here on campus. It was an internal campaign in conjunction with our advancement group, thanking people for their philanthropy, the many gifts that have come onto campus. And so we had taken photos of students with their thumbs up, just sort of thanking donors. And there were six students originally that were photographed, and of the six, four were students of color. And so that was something where we said, “Okay, that would be two thirds of the folks in the picture.” So that was an example of something we took back to our committee to discuss. We said, “How do we feel about that?” Because again, we’re not trying to come up with a rule like, “Every campaign has to have this percentage.” But in this particular case, everybody was going to have a chance to see all the images, because they were going to be what we like to call campus famous; they were going to be everywhere on campus.

It just didn’t feel like the right mix. It felt like we were perhaps going into that token area that we do not want to be in. So our solve for that was to shoot four more students to bring it up to 10 in total, and to keep the original four students of color as part of that 10, just to help that mix feel a little bit more true to life. So that’s just an example of how we try and walk that line, and that is one of the questions that we sometimes get when rolling out the tool: aspirational versus reality.

Jarrett Smith:

Yeah. I think that’s such a great point because it’s not a math problem at the end of the day that has a perfect answer.

Kymm Martinez:

No, people would love it to be because then it’s easier. You count the number of people and then you… But it’s not that, that’s why the conversation is important.

Katie Jensen:

And I would say that applies across all of the concepts in the Viewfinder. People want it to be simple: “This is your set of rules, and if you follow these rules, you’re going to be okay.” And that’s just not the case. Diversity and inclusion concepts are complex and nuanced. Another example here, from our creative team: we had a picture that we said we really shouldn’t be using, of a kid doing a thumbs up. He happened to be in front of the Colosseum in Rome, and in parts of Italy, a thumbs up is offensive.

And so we gave that feedback and we said, “Look, we can’t use it in this case because he’s really kind of bringing his own culture into this place where he’s supposed to be immersing in Italian culture.” And so our creative team was like, “Got it. We won’t use thumbs up anymore.” And it was like, “No, no, you can use thumbs up, it’s just in the context of that situation.” So that’s a big piece of this is just embracing there’s a lot of gray area and that’s why the conversations are so important to just make sure that we’re thinking through it, and is it okay in this case, because the answer might be different depending on the context.

Jarrett Smith:

So thinking about operationalizing this, people are listening to this and saying, “Wow, this is very cool. I think we could do something similar on our campus.” What advice do you have for folks that are thinking about rolling out something similar at their own institution?

Katie Jensen:

My advice would be to start with buy-in at the leadership level. One of the things we are really lucky to have is this woman sitting next to me here, Kymm, a champion of DEI at St. Thomas and even in her personal life. And so, we are lucky to have an executive sponsor who expects us to do this work, who puts it in our personal objectives every year, who asks us to engage and welcomes us to engage no matter where we are on that journey. And I think that piece is really important, because now it is an expectation of our creative team when we brief them, or our writers when we brief them, that they know that one of the things when they bring it back to us is, “Yes, does it hit on the key messages we want to hit? Does it speak to the audience? Is it inclusive?”

They know that that is going to be a bar that they’re held to. And so when the feedback comes, it’s not a surprise; that’s part of the deal. I would also say, though, in fairness to them, they’re so close to those pieces that we can’t always expect them to come with a perfectly inclusive piece. And so, one thing we’ve done is we’ve said, “We’re not going to make applying the DEI Viewfinder a step in the process.” It’s something that anybody who comes in contact with a piece on its journey to being developed can step in with: “Hey, I’m noticing this thing from a DEI lens.” And so that’s been really good too. It’s this shared accountability; it’s not one person’s job. It’s not a thing we do at one point in time, it’s something we’re all aware of and all free and open to say, “All right, let’s take a second and take a look at this.”

Jarrett Smith:

That’s really interesting. My assumption was that, “Oh, this must just be a formal step in the process.” And that’s interesting that you found that actually it seems to work better for your team that it’s not again, maybe a box that you’re ticking.

Katie Jensen:

Right. Exactly. That’s part of what we’re trying to do: it shouldn’t be this cursory thing. Anybody who’s reviewing the materials should be looking at it that way. It also means, though, that we have some freedom for people who are maybe more immersed in DEI concepts. When they come to it, they’re for sure looking at it. For those who are maybe less comfortable, they know that there will be other people along the way who can apply it. And then it also means that sometimes, if people are less comfortable, they might tap those of us who are more used to using it. So it provides a little more freedom, and feels less like a hoop you have to jump through and more just like, anybody who’s reviewing it is going to want to be looking at this stuff.

Jarrett Smith:

It becomes more of a community effort at that point.

Katie Jensen:

Yeah. And much more conversation too. I mean, I just actually, I got a text from a coworker on the way to the bus stop to drop my daughter off this morning to say, “Hey, what do you think of this picture? Well, now let me show you it in context.” Or just kind of constantly having these little conversations, and these are conversations where we would’ve in the past felt like, “Ooh, I got to sit down and have a really careful one-on-one conversation with somebody and really…” And it’s just opened up our comfort levels with it.

Jarrett Smith:

That is really interesting. Okay. So let me flip this around and you may have kind of already hit on all these things, but how could somebody go wrong trying to roll this out on their campus? I think you’ve kind of already touched on a few what not to do type things.

Katie Jensen:

Yeah. I mean, I think you for sure don’t want to be the only one. I don’t think you want to be the lone wolf. A lot of the magic with it is the conversations you can have and the dialogue that it opens up. That’s actually been one of the really positive things for us: showing others in the university that we have opened ourselves up to this feedback, we’re open to it, and we’re having these conversations. We’re getting more feedback from people when they see something. It used to be that we would hear it three or four people down the road: “Oh, somebody saw this,” and then finally it gets to somebody who’s comfortable talking to us. And we’re hearing from a bigger variety of people now too, both on the positive and negative. So I think it’s definitely got to be ingrained in your team culture, and you have to be sort of committed to, “We’re on a journey together and we’re going to hold hands and do this as a group.”

Kymm Martinez:

Yeah. I would just echo, again, the importance of approaching this with humility, and making sure people know that you’re not considering yourself the expert who’s now going out to tell everybody how to do it. Especially in an academic setting, where sometimes people can look at marketing and say, “Wait a minute, you’re just spinning.” We want to make very clear what our roles are in there. But I said this at the beginning of the podcast and I can’t underscore it enough: the other really, really, really important way you can go wrong is if the institution that you’re talking about does not have a true commitment to this, and yet you’re out there trying to represent that you do.

That would be very difficult, kind of a showstopper from the very, very beginning. So it’s important as a university, as an institution, as an organization to have a true commitment to walking the walk on DEI, because people will smell it immediately if you are just trying to spin something that isn’t there, or put lipstick on a pig, as we sometimes like to say. That’s not going to work. It doesn’t work in this space. It needs to be authentic.

Katie Jensen:

Can I just add one more, I think, really critical thing? And you touched on this earlier, but the other way you can go wrong is relying on the people on your team from marginalized communities to do this work. We talk a lot about this, and it’s part of the humility we try to have: we realize we’re two white women up here talking about this stuff, white, cisgendered, able-bodied, straight. And so, that’s important, because we hear a lot from our colleagues from marginalized groups that they are tired. They have what we call representation fatigue, which I thought was a really eye-opening term, because they are being asked to draw on experiences of oppression and hurt, and to do it for free on top of their day jobs.

And so, one of the things we really like about this tool is it’s given us a way to contribute and not always have to rely on those folks on our team to speak up and do the work. It’s tiring. And so, that would be the other piece is just, that’s one of the reasons it’s important that it’s everybody, because it can’t just be those few people on the team.

Jarrett Smith:

Wow. What great advice. So what’s next? What’s next for St. Thomas and your DEI efforts? What’s next for your team? Where do you go from here?

Kymm Martinez:

So I’ll talk at the university level first and then bring it back to our group. Our university has all sorts of really wonderful initiatives underway. Right after George Floyd was murdered here in Minneapolis, we founded a racial justice initiative with a fantastic scholar, Dr. Yohuru Williams, at the helm. It’s doing a lot of work in terms of historical recovery, and really trying to engage in conversations about, “How do we grow from here? How do we do better from here?” So that’s just one aspect of what the university is doing.

We also have a new college that launched back in 2016, the Dougherty Family College, which is focused on underrepresented students, helping them get to a four-year degree. It’s a two-year program that wraps around and provides all sorts of different support for a person who might not have been able to enroll directly in a four-year school. We provide that support in the first two years so that students can then transfer, hopefully to St. Thomas, but also to other schools. The most important thing is that they get a degree.

So the university has a lot of initiatives like that, that we’re involved in. We just launched a new college of health that is very much engaged in healthcare disparities and eliminating health inequities. Our school of education is very much focused on the gap: Minnesota has one of the worst education gaps in the country when you compare graduation rates for students of color versus white students. So there’s a lot of work at the university level that we’re doing, which we’re then privileged and proud to be able to represent and tell the stories of in our group.

So for our group, we’re just going to continue on with our journey. We’ve just done our very first ever three-year DEI plan for our group. It includes things like looking at our suppliers and our freelancers, making sure that we’re working with more BIPOC-owned businesses and freelancers. There’s all sorts of things in our strategic plan, but we’ve written it down just to make sure it stays part of our journey. And with this particular Viewfinder tool, we’ve already been doing presentations outside of St. Thomas as people have become aware of it and are interested in it, and we’re very happy to share it, because it has applicability not only for other higher ed institutions, but for anyone in any organization.

So we’ve been doing presentations on that, but obviously our ability to get out there and do all the presentations is limited. So we’re in the process of trying to turn this into an online course that people could access, which would hopefully bring the tool to more people, because that would be our goal. We want everyone to have access to it. That’s one of the things I love about working in higher ed: the willingness to share things that you’ve uncovered and discovered that could be of use to other people, and that’s what we are trying to do here.

Jarrett Smith:

Well, that is great. So Kymm and Katie, if folks want to reach out to you and find out more, continue this conversation, what are the best places to do that?

Kymm Martinez:

Well, for both of us, it’s easy to reach us via email, and the email nomenclature here is pretty simple: first name, dot, last name, at stthomas.edu, that’s S-T-T-H-O-M-A-S dot edu. And I’m assuming our names, the spellings, are going to be published somewhere, so we don’t have to run through that here. So there’s that. LinkedIn is another great place to reach either one of us. And we’re very open to having conversations with anyone who wants to start on this journey and learn from what we’ve done.

Jarrett Smith:

Well, thank you both so much for sharing so openly and freely with the journey that you’ve been on, and I think so many folks are going to find this very useful in their day-to-day work. So I just want to say thank you for joining us today.

Kymm Martinez:

Thank you for having us. Really appreciate it.

Katie Jensen:

Thanks Jarrett.

Jarrett Smith:

The Higher Ed Marketing Lab is produced by Echo Delta, a full service enrollment marketing agency for colleges and universities of all sizes. To see some of the work we’ve done and how we’ve helped schools just like yours, visit echodelta.co. If you enjoyed this podcast, please subscribe, and leave a review on Apple Podcasts. And as always, if you have a comment, question, suggestion or episode idea, feel free to drop us a line at podcast@echodelta.co.

 


Understanding Financial Aid Leveraging with Dr. Jimmy Jung

In this episode, we talk about financial aid leveraging—the common and sometimes controversial practice of using data-driven approaches to inform how schools use institutional funds to manage enrollment and tuition revenue.

Joining us in the conversation is Echo Delta data scientist and consultant, Dr. Jimmy Jung. As a higher ed veteran with over 20 years of experience, Jimmy has successfully led enrollment management at multiple institutions around the country and has consulted internationally in the areas of marketing, program evaluation, data analytics, and student success.

We start by covering the basics of financial aid leveraging—what it is, how it works, and the kinds of data schools use to inform their financial aid models. Then, we dig deeper into the impact of financial aid leveraging, including improving net tuition revenue, controlling discount rates, and how it can help schools uncover hidden insights about their institution and the students they hope to attract.

We wrap up by talking about some of the criticisms of the practice and hear Jimmy’s thoughts on the ethical and responsible use of financial aid leveraging.

This was a fascinating conversation full of essential information for anyone looking to understand more about financial aid leveraging.


Transcript

Jarrett Smith:

You’re listening to the Higher Ed Marketing Lab. I’m your host, Jarrett Smith.

Welcome to the Higher Ed Marketing Lab. I’m Jarrett Smith. Each episode, it’s my job to engage with some of the brightest minds in higher education and the broader world of marketing to bring you actionable insights that you can use to level up your school’s marketing and enrollment performance. In this episode, we’ll be talking about financial aid leveraging: the common but sometimes controversial practice of using data-driven approaches to inform how schools award financial aid.

Joining us in the conversation is Echo Delta data scientist and consultant Dr. Jimmy Jung. As a higher ed veteran with over 20 years of experience, Jimmy has successfully led enrollment management at multiple institutions around the country and has consulted internationally in the areas of marketing, program evaluation, data analytics, and student success. We start by covering some of the basics of financial aid leveraging: what it is, how it works, and the kinds of data that schools use to inform their financial aid models.

Then we dig deeper into the impact of financial aid leveraging from improving net tuition revenue and controlling discount rates, to showing how it can help schools uncover hidden insights about their institution and the students they hope to attract. We wrap up by talking about some of the criticisms of the practice, and we hear Jimmy’s thoughts on the ethical and responsible use of financial aid leveraging.

This was an interesting conversation full of essential information for anyone looking to understand more about financial aid leveraging. Without further ado, here’s my conversation with Dr. Jimmy Jung.

Jarrett Smith:

Jimmy, welcome to the show.

Jimmy Jung:

Good to be here.

Jarrett Smith:

Jimmy, I would love it if you could just start off by giving us a quick snapshot of your work and expertise in higher ed and financial aid leveraging.

Jimmy Jung:

Definitely will do, Jarrett. I’ve spent the last 20 years working in higher education. The first job I ever had was as an analyst creating these financial aid leveraging models and looking at the impact they had on institutions, driving their revenue model up and, of course, impacting students, allowing students to have a chance to go to college, because financial aid leveraging gives additional aid to students. My passion began there. I saw the policy changes that financial aid leveraging can make to help students and institutions. Now I’ve been doing it for 20 years for many different institutions.

Jarrett Smith:

Good deal. For those who might not be familiar with the term financial aid leveraging, how do you define that? What do you mean when you say that?

Jimmy Jung:

Financial aid leveraging is a pretty simple concept, though obviously it gets a little bit more complicated in how an institution applies it: using institutional aid or financial aid strategies to impact enrollment and retention and to attract students. Something as simple as a scholarship offer on the website is technically financial aid leveraging, because you’re trying to drive some type of student behavior.

Jarrett Smith:

It’s interesting you mention that, because I know in the past you’ve told me that, in a sense, everybody has a financial aid leveraging strategy, whether they realize it or not. Based on that definition, I see how you get there. But when somebody is formally engaging in the practice of financial aid leveraging and really trying to use it as a tool to shape a class, can you give me a sense of what kinds of schools are using it? How is it actually done? I also want to preface that by saying I realize the practice has evolved over time and gotten increasingly sophisticated. I think I just bundled like three questions up in one for you.

Jimmy Jung:

I’ll try to deconstruct it.

Jarrett Smith:

Okay. Thank you.

Jimmy Jung:

Bring me back to questions or pieces of it that I don’t catch.

Jarrett Smith:

Sure.

Jimmy Jung:

I think the first part was talking about what types of schools should do it. The answer is that many types of schools are doing it. I think the key difference between a scholarship strategy that is put out there versus financial aid leveraging is: are you using data? A lot of times in the past, a scholarship strategy was based on what you hear from the market: “Hey, students are asking for this,” or, “Hey, our competitors are doing it.” Then institutions choose to do it. Financial aid leveraging begins where you’re looking at data and saying, “Hey, is this scholarship having an impact? Is there something in the data that tells us how to structure the scholarships so they have an impact on yield or on where students are applying?”

I think that’s the key distinction between coming up with a scholarship strategy versus the technical term financial aid leveraging. I’ve seen financial aid leveraging used at all different types of schools: private, public, small, large, medium, public flagships. Once an institution realizes that it has data and can use that data to look at its aid strategy, it often continues to do so, and it allows the school to really think about and develop reports to track how students are responding to the financial aid offers or the scholarship packages they’re giving students.

Jarrett Smith:

Right. What I take away from that is really it’s about making data-driven decisions. Kind of digging into the data aspect, for a minute, I mean, what kind of data are we talking about? It’s historical data, so how far back does it need to go?

Jimmy Jung:

Let’s tackle what type of data, because every institution in some way collects data. I mean, it’s impossible not to nowadays, because you have these sophisticated student information systems developed by Oracle, Banner, Campus Solutions, just to name a couple. They’re very good at collecting data. Data has become cheaper to collect over time. Storage has become cheaper. Processors are faster. Data analytics has become a thing in the last five years. Institutions are sitting on [inaudible 00:05:57], and institutions collect a lot of data. The data that’s used for financial aid leveraging is no different than the data you use to track students for retention or track when students register. Demographic data, the characteristics of students at your school, is one big one. Admissions data: which students came or did not come to your school, whether or not they put in a deposit, whether they melted.

When they applied, when a financial aid package got out to them. Finally, some financial aid data: whether or not they filed the FAFSA, whether or not they were offered a scholarship, and the loans, state aid, and federal aid they qualify for are all important data that schools use for leveraging.

Technically speaking, we could go as far back as five to 10 years, depending on how stable a look you want and whether you're answering bigger questions like which academic programs have been attracting students over time, right? Or looking at how effective the different changes in strategy were over time. To start, you need at least two to three years of data. Anything with data and trends can be spurious, right. One year you're up maybe 5%; the next year, you're down 5%. You want at least three years of data to solidify any type of strategy you're putting out there, or at least to see whether there are drastic changes in trend, because if you're looking at just one or two years of data, the market can change that quickly, right. You don't want to make big financial decisions on one or two years' worth of data.
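As a concrete illustration of the historical analysis Jimmy describes, here is a minimal Python sketch that computes deposit (yield) rates by aid band across several admission cycles. All field names and figures are invented for illustration; a real model would draw on a full student information system extract and many more variables.

```python
# Hypothetical sketch: yield (deposit rate) by aid band per admission cycle.
from collections import defaultdict

def yield_by_aid_band(records):
    """records: dicts with 'year', 'aid_band', and 'deposited' (bool)."""
    counts = defaultdict(lambda: [0, 0])  # (year, band) -> [admits, deposits]
    for r in records:
        key = (r["year"], r["aid_band"])
        counts[key][0] += 1
        counts[key][1] += r["deposited"]
    return {k: deposits / admits for k, (admits, deposits) in counts.items()}

# Three cycles of tiny, fabricated admit data (illustration only).
admits = [
    {"year": 2019, "aid_band": "0-5k", "deposited": True},
    {"year": 2019, "aid_band": "0-5k", "deposited": False},
    {"year": 2019, "aid_band": "5-10k", "deposited": True},
    {"year": 2020, "aid_band": "0-5k", "deposited": False},
    {"year": 2020, "aid_band": "0-5k", "deposited": False},
    {"year": 2020, "aid_band": "5-10k", "deposited": True},
    {"year": 2021, "aid_band": "0-5k", "deposited": True},
    {"year": 2021, "aid_band": "0-5k", "deposited": False},
    {"year": 2021, "aid_band": "5-10k", "deposited": True},
]

rates = yield_by_aid_band(admits)
print(rates[(2019, "0-5k")])  # 0.5
```

Having at least three cycles in the table is exactly what lets you tell a stable pattern from the one-year swings Jimmy warns about.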

Jarrett Smith:

Yeah. When you’re thinking about the data that you’re feeding into a financial aid leveraging model, how do you think about a year like we just had, right? 2020 is anomalous. 2021 could be equally as anomalous. How do you account for a year like that? How do you control for that so it doesn’t just like wildly throw off your model or should it wildly throw off your model? I mean, how do you navigate that I guess is my question.

Jimmy Jung:

Two thoughts. You're right. I mean, nothing like this has happened before, at least in data-tracking history. My colleagues in higher education and I have been talking about this: "Hey, what kind of financial aid model do we build? How do we offer this?" Based on my experience and what people have been saying this particular year, your market doesn't necessarily shift drastically, even in a pandemic. The students who knew about you still know about you, and students who want to come to you will come to you. What we take a look at is the economy, right? Higher education generally has a decent sense of how the economy impacts enrollment, right? One, the freshmen, the incoming students, we believe are more price sensitive. You have to assume that for some families, income is lower.

That's a given. Income is going to be lower. It's going to disproportionately impact some parents and families, and they're going to be more price sensitive. The discussion becomes, "Do you provide more aid upfront to attract students?" On the other end, the graduate end, in times of economic downturn, people retool. Older students who have been impacted negatively by this pandemic will want to come back for a graduate degree, and there are government programs that support that. You can expect an expansion of your graduate programs.

Jarrett Smith:

Yeah. That's an interesting point you bring up there. I came across a little data point the other day saying law school applications and admissions were way up year over year. A perfect example of exactly what you're talking about, people going back to retool in times like this. Setting aside the current immediate challenges that we're all facing and what they mean for the models in a financial aid leveraging strategy: just in general, when schools are adopting financial aid leveraging for the first time, what are some of the more common challenges they tend to encounter?

Jimmy Jung:

One of the most common ones is that even though we've been collecting data in the student information system, getting the correct data out always takes the longest. It's 90% of the work when you first start financial aid leveraging: extracting the right data, making sure the timing of the data is correct so you have the most accurate model, and working with the financial aid and admissions offices to ask them those questions. "Hey, does this really reflect the admissions behavior of your students? Are the financial aid offers correct? Is the timing right?" That's the biggest upfront challenge.

The second piece is less of a technical piece. It's the campus cultural piece. Financial aid leveraging is not something an institution goes into lightly, because there's an investment, generally an investment of institutional funds. The senior level of the university has to be on board: "Hey, here's the risk. We're offering more aid. We're expecting this return. This is the model it's built on. These are the assumptions we're building this model around." Getting buy-in from the president, the CFO, and even the VP of marketing is going to be key to a successful leveraging strategy.

Jarrett Smith:

Mm-hmm (affirmative). There’s so many different types of schools out there and financial aid leveraging can be applied in so many different ways. I know historically financial aid leveraging has been used by private schools. It’s becoming a lot more common in the publics. Can you talk to me a little bit about how the leveraging strategy might change based on some institutional characteristics?

Jimmy Jung:

One of the most interesting parts of financial aid leveraging, especially for me, helping different schools develop leveraging strategies, is that you never know what the data is going to show you. Having done it for over 20 years, there's no one financial aid leveraging model that fits another school. Our schools are unique. The markets in which they sit are very unique, right? A financial aid leveraging model shows that. Developing these models at different types of schools, sometimes the same program at one school is not going to yield as well as that academic program at another school, for many reasons. You might not leverage off the same programs. You might not leverage off the same characteristics. Certainly, you have to think about the different rules that exist around a leveraging strategy.

I think you mentioned public versus private. At a public institution, the leveraging model's the same, but the things you have to keep in mind are different. Different states have different rules about how an institution can use its own aid, and you've got to be sensitive to that. Then there's the equity piece: is the model built so that it gives everybody with a certain characteristic a fair chance at the aid, right. And at a public institution, you have to be politically sensitive about offering aid to out-of-state and international students. These are some of the additional things to think about with public versus private.

Jarrett Smith:

I want to go back to an interesting point you made, that the data is going to surface things unique to that institution that you did not expect. I think that is the value of any type of research you do: surfacing something you did not previously know, shining light on something. When you surface these kinds of insights, I imagine there are times where the institution is confronted with a truth or a reality that maybe they hadn't thought of before. Can you talk about some of those things you've observed over the years? Any surprising insights that have come out of work you've done with a school, the sorts of things they've discovered along the way that were a little counterintuitive or unexpected?

Jimmy Jung:

Definitely. I'll talk about my most recent example. This particular institution wanted to create a financial aid leveraging strategy to attract more freshmen. In looking at that, we found out they were already doing extremely well at attracting freshmen. A financial aid leveraging strategy wasn't going to change how they attract freshmen, right? That tool wasn't for them. Instead, it was one of their graduate programs, which they thought was doing extremely well, that would have worked better with a change in financial aid strategy. For this institution, it brought in context, like benchmarking, right?

They hadn't thought outside the institution: what is the standard historical yield for a public institution's freshman class? When they took a look at that, they said, "Hey, this is a core strength. We're going to keep an eye on it and work on these graduate programs," where they could gain larger market share or more revenue. That's something they had never considered, just because the assumption, and what they were hearing, was, "Hey, let's work on the freshman class." That gave them more to think about. It gave them options to focus their institutional resources on something that would have a bigger impact.
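The benchmarking step in this anecdote amounts to comparing the institution's own yield against an external standard. A minimal sketch of that comparison follows; the function name, benchmark figures, and tolerance are all invented for illustration, not drawn from any real sector data.

```python
def classify_yield(own_yield, sector_benchmark, tolerance=0.02):
    """Label a program's yield relative to an external benchmark.

    Returns 'core strength' when yield clearly beats the benchmark,
    'leveraging candidate' when it clearly trails it, else 'at benchmark'.
    """
    if own_yield >= sector_benchmark + tolerance:
        return "core strength"
    if own_yield <= sector_benchmark - tolerance:
        return "leveraging candidate"
    return "at benchmark"

# Invented numbers: freshman yield beats the sector norm, so aid dollars
# are better aimed at the graduate program trailing its benchmark.
print(classify_yield(0.38, 0.30))  # freshman class -> core strength
print(classify_yield(0.22, 0.30))  # graduate program -> leveraging candidate
```

The point is not the arithmetic but the framing: without an external benchmark, the institution had no way to see that its freshman class was a strength rather than a problem.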

Jarrett Smith:

When we talk about leveraging, it's obviously a very technical topic, but when you're engaging in that sort of initiative, ideally you're going to have stakeholders from a lot of different areas around the school. You're going to have financial aid at the table, obviously enrollment at the table, and maybe even your institutional leadership and marketing at the table. As you engage in financial aid leveraging, does it have the potential to change the way those departments interact? Have you seen it alter the way those departments work together?

Jimmy Jung:

Oh, absolutely. Financial aid leveraging works best when you get a lot of different departments working hand in hand, and of course the two biggest are admissions and financial aid. Traditionally, admissions thinks of its job as just recruitment: once we get the students to apply and be admitted, we hand them off to financial aid. And financial aid offices think, "Okay, until they're admitted or deposited, I'm not going to think about giving them a financial aid offer." Leveraging works best when these two departments collaborate. "Hey, we're going to admit the students. They're going to get a financial aid offer." But then it goes back to admissions to engage with the students and ask them questions: "Hey, did you get a financial aid offer? Do you have any questions? How do you feel about it?" Right?

That gets you market feedback, right. Because sometimes you make that phone call and hear, "Hey, a different institution offered me more. I'd love to come here, but your offer is less than theirs." You bring that feedback back to financial aid to see if there's anything additional they can do about it. That type of financial aid leveraging allows for more market feedback if admissions and financial aid talk to each other. One more piece: the insights you bring back when you're doing financial aid leveraging change marketing, right? At the institution in my example, for freshmen, they were actually doing really well. They didn't need an additional leveraging strategy there. It's the graduate programs, so marketing will switch its focus from marketing to potential freshmen to those graduate programs. That type of feedback loop allows an institution to pivot its resources and gets different departments really communicating and working together to achieve institutional goals.

Jarrett Smith:

Yeah. That's a really interesting point, Jimmy. It makes me think that, especially as so many schools are trying to be smarter and more data-driven about how they approach marketing, this is an interesting source of information and insight. I'm wondering if you can dig in a little more on the broader impact that financial aid leveraging can have, though. I mean, it's certainly an important tool for driving net revenue, but it's not only and exclusively about maximizing net tuition revenue, is that right?

Jimmy Jung:

Absolutely. That's where I think financial aid leveraging gets a bad rap. Yes, maybe the most important outcome, depending on who you're talking with, is net tuition revenue. But it's a data process, right, one that with a lot of variables essentially provides a market demand study. Another way to look at it: it's a market demand study that allows you to put financial aid on top to shift the behavior of students, right? The market piece can't be discounted, because that's where you get the insights. That's where an institution learns the most about itself and the students it attracts, right? Like the topic we just talked about, how it drives business process changes between departments. Those insights allow you to look at certain market segments and how they're doing. A lot of times, those are more important than the financial aid leveraging part, right?

Here are things we could change or focus on without investing a lot of money to yield students. Here are some processes to change. Do we want to work on those first, before we invest a lot of money into the leveraging part? What doesn't get told is the times you look at leveraging and maybe the financial aid doesn't get applied yet. It's the marketing changes that come first, because you get very good data about how certain academic programs are doing, very good data about whether or not you're attracting the students you want to attract, right? The process itself is definitely worth going through because it's a market demand study. The market demand study then flows into this question: can that market be impacted by financial aid? If the answer's yes, the outcome is net tuition revenue, right?

Jarrett Smith:

I do want to talk at least a little bit about some of the criticisms financial aid leveraging has received over the years. For instance, there are those that would say the practice can put certain groups of students at a disadvantage. It could ultimately hurt diversity and access. Obviously that would run counter to the mission of so many schools. How do you think about that? How do you help the schools you work with navigate those issues?

Jimmy Jung:

That's a great question. A couple of thoughts on it. The first is that financial aid leveraging is a tool. It's statistical analysis that tells an institution more about itself. It's a way of self-reflection, [inaudible 00:19:55] if I were being philosophical. It allows institutions to really take a hard look at what students are looking for in their institution. What you do with the results is a choice, right? With any tool, whether it's your iPhone or your car, how you choose to use it is going to be key, right? Financial aid leveraging is no different. But the reason I fell in love with financial aid leveraging as a driver to benefit institutions and students is that at the first institution where I learned this, the president took a look at the leveraging model and said, "Jimmy, how do we help inner-city, low-income students? Because it doesn't look like our model's doing enough."

He made a choice, and it was the mission of that school to provide opportunities for these students. He said, "We need to invest more. Your model is telling us that we need to invest more." He made a choice to do that. But that was only possible because the model showed that these students weren't benefiting as much as they should. Financial aid leveraging is a choice, an institutional choice. If you don't think the model is geared toward your mission, you can make policy decisions so that it aligns with your mission.

Jarrett Smith:

That’s such a great point, Jimmy. What I hear you saying is the model is essentially values agnostic. It’s a spreadsheet. It’s a machine learning model. It’s a whatever fancy software package you want to run on it. But at the end of the day, it’s up to the humans running the model to decide how to use it and what they’re going to optimize for.

Jimmy Jung:

It gives institutional leadership choices, right? I think that's a very good thing in terms of planning, in terms of making decisions about where your institution is headed and whether or not you're aligning with your mission and values.

Jarrett Smith:

Well, Jimmy, thank you so much for sharing your experience and insights about financial aid leveraging over the years. Before we go, I'm wondering if you could give any parting advice for folks who might be listening to this and considering whether financial aid leveraging might be right for their school. Any advice you might give them?

Jimmy Jung:

Well, if you're an institution looking to become more data-driven, to think more deeply about how financial aid is impacting your enrollment, from a data perspective and a market perspective, then the financial aid leveraging process is something you want to do. It's a choice, right? First, it's a market demand study: taking a look at which students you're attracting and how they're choosing your institution, then layering on the financial aid to see how it has impacted your enrollment in the past, and potentially thinking about policies you might want to change looking into the future. Obviously, I'm a big proponent of financial aid leveraging, especially for institutions taking a data-driven look at themselves and thinking about what they could do to help students enroll.

Jarrett Smith:

Excellent. Thank you for your time today, Jimmy. I really appreciate it.

Jimmy Jung:

Okay. Thank you, Jarrett. Great to be talking to you about this stuff.

Jarrett Smith:

The Higher Ed Marketing Lab is produced by Echo Delta, a full service enrollment marketing agency for colleges and universities of all sizes. To see some of the work we’ve done and how we’ve helped schools just like yours, visit echodelta.co. If you enjoyed this podcast, please subscribe and leave a review on Apple Podcasts. As always, if you have a comment, question, suggestion, or episode idea, feel free to drop us a line at podcast@echodelta.co.

 


Marketing Analytics: How to Build Your Team’s Credibility and Data Savvy

Digital marketing promised to make our marketing measurable with clear lines of connection between the dollars spent and the results produced. Yet, as any data-driven marketer will tell you, tracking and attributing marketing activities in the real world is far from straightforward.

Today’s guest argues that it is actually possible to get analytics right and that moving past “true but useless” metrics is essential for building marketing’s credibility within an organization. His name is Chris Sietsema. Chris is a 20-year agency veteran and analytics expert who’s advised some of the most recognized names in both industry and higher education.

Why is Measuring Marketing So Hard and How Do We Get it Right?

According to Chris, when people think about the challenges of marketing analytics, they tend to list things like budget, technical difficulties, and the inherent limitations of the available tools. While these challenges are real, Chris argues there are deeper issues that are often never discussed. Among these lesser-discussed challenges are a lack of well-defined goals, inconsistent processes and vocabulary around measurement, and marketing workflows that turn measurement into an afterthought.

In contrast, analytics-savvy organizations do three things exceptionally well. First, they place a strong emphasis on process and organization, which tends to clarify thinking and align the team’s analytics vocabulary. Second, they work hard to avoid “true but useless” scenarios by identifying meaningful metrics that will actually inform decision-making. Third, they carefully craft reports to align with the unique needs of the different audiences that will be reading them.

Building Your Team’s Analytics Capabilities

In Chris’s view, many organizations already have all the analytics talent they need to be effective. However, when managers are evaluating which team members to grow into analytics roles, they’re often faced with a common question: is it better to train a marketer how to do data science or train a data scientist how to do marketing? Chris suggests both are viable options if the person in question has the right characteristics. First, they must be curious and sufficiently motivated to learn the other side. Second, they must be stubborn problem solvers–almost to a fault. Third, they need to understand how their activities contribute to the organization’s bigger picture. “You don’t need a nerd with a calculator,” he explains, “You need a nerd that can eloquently articulate the core reasons your organization exists in the first place.”

Chris wraps up by describing a straightforward eight-step process to make sure your next marketing initiative is set up for measurement success from the start.

Subscribe to the Higher Ed Marketing Lab

Apple Podcasts

Spotify

Stitcher

Google Podcasts

Transcript

Jarrett Smith:
You’re listening to the Higher Ed Marketing Lab. I’m your host, Jarrett Smith.

Welcome to the Higher Ed Marketing Lab. I'm Jarrett Smith. Each episode, it's my job to engage with some of the brightest minds in higher ed and the broader world of marketing to bring you actionable insights that you can use to level up your school's marketing and enrollment performance. In this episode, we'll be talking about marketing analytics. If you're like many marketers out there, you're sold on the value of data-driven marketing, but actually doing it in the real world is easier said than done.

Today's guest argues that getting analytics right is not only possible, it's essential if you want to gain credibility and prove the value of marketing. His name is Chris Sietsema, and he's a 20-year agency veteran and analytics expert who's advised some of the most recognized names in both industry and higher education. Chris starts by outlining some of the sneaky reasons measurement is hard, and then he explains the three things analytics-savvy organizations consistently get right.

Chris then talks about how to cultivate analytics talent in your organization, and he gives his take on the age-old question of whether it's better to teach a marketer to do analytics or teach a data analyst to do marketing. He wraps up by describing a straightforward eight-step process to make sure your next marketing initiative is set up for measurement success right from the start. Chris is a deep thinker on this topic, and he provides a ton of useful insight for anyone looking to do analytics better. Without further ado, here's my conversation with Chris Sietsema.

Chris, welcome to the show.

Chris Sietsema:
Thank you for having me.

Jarrett Smith:
Absolutely. I’m really excited to dive into this analytics conversation. Before we get there, could you just give us a quick snapshot of your background and the work you’re doing today at Teach to Fish Digital?

Chris Sietsema:
For sure. It’s been about 20 years now that I’ve been in the digital marketing space, pretty much all I’ve done since graduating school. I started off by doing a little bit of everything, a little bit of search marketing, digital advertising, ton of email marketing. When social media came around, that was definitely kind of added to the repertoire. Over the past three years, I’ve really kind of focused on analytics because I feel that, while I love all those disciplines, it’s like picking amongst your children, who do you love more? Analytics is basically the layer over everything.

That’s really allowed me to still be involved in every aspect of digital marketing, but really help my clients focus on what all these numbers mean and what we can do with them and what we do next. I do a lot of audits, a lot of configuration projects, some training, stuff like that.

Jarrett Smith:
Very cool. Yeah, I mean, I feel like, to piggyback on what you just said, analytics is kind of where the rubber meets the road. You're trying to distill it down into something useful and say, okay, now what? How do we act on this? Where do we go? I know we're going to spend a lot of time talking about all of that, but I guess I want to dive headlong into the core challenge, which I think will be familiar to anybody who's tried to measure their marketing efforts. Digital marketing kind of carries with it the promise that everything can be measured, but the reality is implementing that in the real world is actually pretty difficult, and not everything is as measurable as you would like it to be. It's just not as straightforward. Could you talk to us for a minute about what makes measurement so challenging? Why is this so difficult to get right?

Chris Sietsema:
Well, first of all, I would say, for anybody who has had that conversation about measurement being challenging, perhaps with someone whose response is, "What are you talking about? It's easy. I don't know what you mean," that person's off their rocker. It should be hard. It's supposed to be difficult. I think one of the reasons that's kind of a tease is that when you compare digital marketing specifically to other, more traditional marketing, whether it be broadcast, or print, or direct mail, you're right, it does kind of have that promise of being so much easier to measure, but in practice, it's not.

One of the reasons I think is that not every marketing program is uniform. We’re all a little bit different. We have different objectives. We’re working in different environments. Because of that unique quality of our own marketing program, the measurement program has to then reflect that as well, and thus the measurement program has to be very unique and specific to whatever it is that we’re trying to accomplish with our marketing goals.

When you look at the analytics space, are there standards? Kind of. There are tools that everybody utilizes that are pretty consistent, but the manner in which we utilize those tools varies completely from one client to the next or one marketing program to the next. Part of the reason, too, I think, is that when we think about analytics, we think about a report, like we think about the finish line, or the final state. Where really, if we're doing it correctly, we have to start way at the front and talk about, what is it that we're trying to accomplish? What are the true goals that we're trying to measure? What's our business objective?

If those are undefined or poorly defined or not shared amongst the entire group of people that are working on a singular project, we’re in trouble from the start, and so is the measurement program.

Jarrett Smith:
Yeah. It brings to mind the idea that doing analytics right looks like a lot of things that don’t look like a report, if I can say that the right way. There’s a lot of activities that don’t look like a nice report that ultimately result in that hopefully, but there’s a lot of work to do on the backend.

Chris Sietsema:
Yeah. There's a lot of background work, for sure. There's a lot of setting up of dominoes. In terms of the whole "I thought this was going to be easier, why is this so challenging?" question, I've had that conversation with a lot of people. How I'd equate it: when Orville and Wilbur Wright made their maiden flight at Kitty Hawk, it brought a lot of promise for things like transportation and supply chains and exploration and measuring weather, all the different implications that flight has. But when you look at where we came from then and where we are now in aviation, from an analytics perspective, we're not even into jet engines yet.

We've got such a long way to go, and it's a challenging thing. Despite all the promise that digital marketing has for measurement, we still have to pilot the thing, we still have to navigate, we still have to have a strategy, we still have to have equipment to get us from here to there. There's a lot of work we still have to do. But we've already come such a great distance, so that's a good thing.

Jarrett Smith:
Yeah. That reminds me of another sort of challenge here is that the tools that we’re using are changing. I know that we’ve talked a little bit, prior to this conversation, about Google Analytics is getting a major overhaul. We’re coming out with GA4. That’s going to kind of upend everybody’s boat, and of course, one of the most common measurement tools out there in digital marketing, so there’s a lot there. The other thing I think that’s tricky, that’s kind of higher ed … Maybe it’s not really higher ed specific, but institutions, they’re large complex organizations, and so just the people involved, right?

Getting appropriate goals set up on your website, it may be enrollment management that’s really super tuned into conversions on your online application, but it may be marketing who actually owns those properties and they may not always play well together so that even if you straighten out all the technical aspects, there’s still the human element that can make things really difficult.

Chris Sietsema:
Yeah, and with higher ed, which you and I both have a ton of experience in, for some of these institutions that are massive, like you said, there's a lot of different hands in the pot. One thing I would say is a challenge for a lot of folks is that enrollment does analytics a little bit differently than marketing does analytics, especially now with all the online learning: the manner in which students are actually utilizing the applications to access courses and syllabi and assignments and all those different types of things, and whether those things are being utilized properly.

There's a lot of different ways to look at it, but I think that as long as we're utilizing a common language or a common dialect in the manner in which we're measuring, that's really key. Sometimes from department to department that language or that dialect changes, which can cause challenges. A big part of it is just speaking the same language from a measurement standpoint. Oftentimes, that language, unfortunately, does not exist when you start building the tools or building the processes, and you have to create it and adopt it and make sure that everyone can speak that language.

Jarrett Smith:
That's a really interesting point. I think one of the core traps I see people falling into is that any analytics package, whether you're looking at what's available on your social media platform or out of vanilla Google Analytics, or whatever tool you're using, is going to serve up some things to you automatically. They may or may not be useful. You've talked to me about something like time on site in Google Analytics being a problematic measure: interesting to look at, but there's a lot of nuance there, and it's not as straightforward as it seems. I guess my question to you is, where do you see the danger in focusing only on the things that we can easily measure?

Chris Sietsema:
It certainly limits us. The things that we can easily measure aren’t necessarily indicative of success and they don’t really provide a good representation for whether we reached our business goal or not. Some things like brand awareness, like how do you use Google Analytics to measure that? I don’t know that you can, quite frankly. When you have a specific goal that you’re trying to measure that the standard tools do not provide an easy answer for, you’ve got a few approaches. One is just measure what you can, which is not my preferred method, which is basically giving up. Let’s just deal with what we can, right?

The other method is to go to great lengths to configure measurement tools or processes that truly answer the most important questions. For example, to use that brand awareness example, maybe we don’t use a digital analytics tool set. Maybe we actually do a study with real people, like a survey, right? Like an unaided brand awareness type study. The challenge there is that those are time consuming and expensive. A third approach, rather than just settling for the tools we have, or going above and beyond to find the metric we’re looking for no matter how expensive or time consuming it is, is to use more attainable metrics as a proxy.

For example, to stick with brand awareness, perhaps we look at branded search volume, right? How often is our brand searched for on Google and other search engines today versus six months ago or a year ago or 18 months ago, those types of things, and that’s just one example, right? From an analytics perspective, oftentimes we’re thought of as the pocket protector kids and the nerds and very left brain thinkers, and that’s totally fine. That’s what we are. But at the same time, analytics requires a great deal of creativity in terms of how to get the answers that we’re looking for to some of the tough questions that are being posed by leadership, by ourselves, by marketing managers, marketing directors, etc.
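As a concrete illustration of the proxy idea, here is a minimal sketch of tracking branded search volume over time. The monthly counts are hypothetical; in practice they would come from Google Search Console, Google Trends, or a keyword research tool.

```python
# Sketch: branded search volume as a proxy for brand awareness.
# All numbers below are made up for illustration.

def pct_change(current, baseline):
    """Percent change of the current period versus a baseline period."""
    return (current - baseline) / baseline * 100

# Hypothetical monthly searches for the institution's brand name
branded_searches = {
    "18_months_ago": 8200,
    "12_months_ago": 8900,
    "6_months_ago": 9600,
    "this_month": 11300,
}

for period in ("18_months_ago", "12_months_ago", "6_months_ago"):
    change = pct_change(branded_searches["this_month"], branded_searches[period])
    print(f"vs. {period}: {change:+.1f}%")
```

A steady upward trend in branded queries does not prove that awareness grew, but it is an attainable, directional signal when a survey is out of budget.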

So, it does take a bit of creativity to think through like how we can find a real good answer that provides at least a clue as to how we’re performing in that respect.

Jarrett Smith:
To kind of serve as a reasonable proxy.

Chris Sietsema:
Totally. Yeah. That can be difficult because sometimes, yes, that’s a proxy, but is it really a good indicator? Does it provide a faithful representation of the question that we’re trying to answer? The most often discussed issues with analytics are that it’s expensive, it’s difficult to configure, or, just like you said, simply not available. It’s not easily attainable. However, I think the least discussed problem, which is more symptomatic of the actual core issue, is that measurement is often an afterthought.

Sometimes I believe, at least in my experience, I don’t know if you’ve found this as well with some of your clients, not to name names, there’s a need to work backwards. What I mean by that is from a marketing and advertising perspective, we tend to think in a very linear fashion, like, all right, see audience, create ad, choose media for ad, deliver ad. Then at the very end, oh yeah, we should probably measure this too.

Let’s figure out how we’re going to prove that we were successful. When in reality, it’s often necessary to identify the ideal scenario in which we can prove success, almost start with the number, or the metric that we’re trying to attain, or the answer that we’re trying to get at, and work backwards a little bit from that. Now, should that sway your marketing strategy or your approach to how you put a message in front of your audience? No, but if you’re thinking about it from the outset, it can solve a lot of problems later on that often manifest when, oh, shoot, we should probably create a report for this, two months into the campaign.

Jarrett Smith:
Yeah. Well, and that allows you to really have that conversation around what metrics are truly going to be useful, how would this guide our decision-making if we knew this and we were able to measure this, and you could even ask yourself in advance. It’s like, okay, if we’re underperforming on this particular metric once we launch this, what does that suggest? I mean, you could start to pressure test your thinking a little bit. Would we actually find that useful? Might we change our plans or activities? On the flip side, if we’re doing really great, what does that suggest? Might that also guide our efforts once we actually get into full swing on this campaign or whatever it is that we’re doing?

Chris Sietsema:
Exactly. I think once you’ve got everything configured and you’ve got a dialect and a language that you’re relying upon, as in, this metric is going to tell us that, and we measure this way because it’s going to provide an answer to this important question. It’s a never ending process, right? A lot of what my work involves, once everything is configured from a marketing standpoint, is that a lot of times my clients will have me just hang out. Help us read the tea leaves a little bit here. Help us understand what these numbers mean, because as long as a campaign is live, as long as a website is live, as long as a social media account is active, so is analytics, so is measurement.

It’s a never ending process. It’s a never ending struggle to achieve better results than we did last time. I don’t know about you, but I often get asked the question, what’s a good benchmark for X? What’s a good click rate or what’s a good conversion rate? The answer is, well, what’s the click rate or what’s the conversion rate now? Whatever’s better than that is the answer. Oftentimes, I think we rely heavily upon what the “industry” is doing. That doesn’t matter. Let’s just focus on us. The more time we spend on others, the less time we can focus on ourselves.

Even if you kind of achieved, let’s say the industry benchmark for open rate for emails is 25%, we have a 30% open rate. Does that mean we’re going to stop trying to get 35% or 40% open rate? It never stops. You always want to improve, improve, improve, and that’s part of the fun.
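The "benchmark against yourself" point comes down to simple arithmetic. Here is a sketch with hypothetical numbers:

```python
# Sketch: compare an email open rate to your own previous result,
# not to an industry average. All figures are hypothetical.

def open_rate(opens, delivered):
    """Open rate as a percentage of delivered emails."""
    return opens / delivered * 100

last_campaign = open_rate(3000, 10000)   # our own baseline
this_campaign = open_rate(3300, 10000)   # the latest send
industry_benchmark = 25.0                # interesting, but not the target

# The only target that matters is beating our own last result.
print(f"last: {last_campaign:.1f}%, this: {this_campaign:.1f}%, "
      f"improved: {this_campaign > last_campaign}")
```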

Jarrett Smith:
That’s a really interesting point. Just thinking about if you managed to get this right and you are able to really think through, in advance, what’s going to be the most useful information, that then really becomes a tool for building credibility within your organization for marketing specifically. I think one of the core challenges we have, and I think this is especially true in higher ed, where you’ve got a lot of PhDs. In a lot of cases, they’re a PhD in like a hard science, something that is super rigorous and empirical.

Marketing kind of comes in, it’s hard to nail down. It has this creative aspect to it if it’s done right. It’s part art, part science, and so I think it opens the door for people to kind of ride off marketing activities and discount what’s happening. Analytics can help build that credibility to say, no, the things we’re doing actually have impact in the real world, and these things that we’re doing matter and deserve to be funded and deserve to be expanded. What’s kind of been your experience with that?

Chris Sietsema:
My experience has been it’s a huge challenge, especially for … You work in higher ed quite a bit, and I do as well. When you have an audience, whether it be those PhDs or in a different industry, whether it’s the board or the CEO that definitely has chops and has earned her own credibility, it’s difficult to prove your worth from a marketing perspective, and I can totally get that. The metaphor I often think about when I’m talking about this topic is, do you know who Bryan Cranston is?

Jarrett Smith:
Yeah. Breaking Bad, best TV show ever recorded.

Chris Sietsema:
Exactly. My wife and I spent like six months a few years ago and watched the thing from start to finish, and it’s excellent. If you’ve never watched Breaking Bad, for your audience, he’s also the dad on Malcolm in the Middle, which just exemplifies the range that that guy has as an actor. He’s a terrific actor. Right?

Jarrett Smith:
Yeah.

Chris Sietsema:
We also started watching, not to take us on too big of a tangent, but we also started watching the show on Showtime called Your Honor, have you seen it or heard of it?

Jarrett Smith:
No.

Chris Sietsema:
Don’t bother. It’s trash. It’s the worst. It is so bad. It was such a let down, because he’s a fantastic actor and he’s excellent at his craft, but it kind of speaks to the point that if the script isn’t there, if the writing isn’t up to snuff, there’s nothing an actor can do to salvage it. In a similar way, marketers can only elevate a great product. We can’t turn a garbage product into gold. The reason that’s a problem, the reason we struggle when the product is weak, comes back to a question that I think is posed a lot by leadership, and that is: what if we just did nothing? What if we did none of this marketing stuff or any of this advertising, or didn’t spend any of this budget on promoting ourselves, would we have gotten the same result?

Because that question exists, and I don’t know if you’ve faced that question or at least heard that question muttered subtly, that’s why analytics exists in some ways, in that we have to actually prove that, because of our efforts, we elevated this product, we improved the state of this company, this organization, this product, whatever it is, due to our efforts to promote and advertise and get the message out. Oftentimes, when we’re looking at measurement, we can run tests or we can run flights, or we can turn our marketing off and then turn it on and see what the results are and see what the differences are.

Chris Sietsema:
To build credibility, we really have to have analytics to show the change we’re making and the positive impact that we’re bringing to this scenario. It’s invaluable from a marketing perspective. The best moments of my career are when a client of mine or a coworker or whatever says, “Hey, Chris, I’m taking this report that we worked on and I’m bringing it to my next review,” because they can actually prove, with metrics, that what they’re doing is elevating not only the product, but the organization and the overall company or the mission.

Jarrett Smith:
Well, and I want to circle back to something you said earlier, which is how important it is to agree upfront on what’s going to be measured. Then also, the idea that we all need to agree on what these things mean and the language we’re using. If you’re the one actually gathering the data and performing the analysis, that means you’ve got to walk into your boss’s office and say, “Hey, I’m working with my marketing counterparts to conceive this new campaign. Here’s what we think we’re going to measure. Is this actually useful to you?” Is this actually useful to your boss? Are they going to say, yes, that’s great, or no, actually that’s kind of nice to know, but not actually that useful?

Then you can have that conversation of, okay, well, what can we do with that? What’s within our grasp that might be better? Maybe it is that compromise of, hey, the best we can do in this scenario, because of budgets and timelines and internal capabilities, is to come up with a good proxy, but we can at least recognize that problem in advance and all agree that here’s the sort of sensible compromise we’re going to make.

Chris Sietsema:
Yeah. That meeting or that discussion with the measurement person or the analytics person, or whoever’s going to be pulling those numbers and whoever their reporting audience is, whether it be the director of marketing or the CEO, or the CMO, or whomever it is, is absolutely vital, because you have to understand from the get-go what it is we’re all trying to accomplish, so we’re all kind of aiming for the same target. Yeah, and sometimes those conversations do sound exactly like you described, where it’s like, hey, this is what we’re trying to achieve and this is how we’re going to measure the results of this effort, agree or disagree.

Other times it’s, do you agree that we are aiming for this target? Do you agree that this is the business objective, this is our quantified goal? We’re going to increase X by that percent, or we’re going to decrease the cost of that by Y percent, whatever it is. Do you agree? Yes, I do. Great. Then the metrics don’t get discussed until the report happens. Oftentimes, those discussions with stakeholders are all about the goal or what it is we’re trying to achieve. The actual metrics sometimes get brought up in conversation from the outset, but not every time, primarily because those stakeholders either don’t know or don’t care. I don’t care how you do it. Just make it happen. Prove that this effort is worthwhile. We should continue doing it or continue to augment it, etc.

Jarrett Smith:
As you were talking about the importance of the product, it reminded me of the classic four Ps of marketing, product, price, promotion, placement, and how promotion is just one part of marketing. Historically, there’s been a recognition that marketing can and should involve many other things. Obviously, as a marketing team, you’re not going to be deciding the next great program that’s going to be unveiled at your school. I think it is good to remember that as a marketer, you do tie into product. You do have something useful to contribute, and you have a perspective that your provost might be interested in. You have a perspective that the dean of a particular college might be interested in, that they wouldn’t otherwise have access to. Hopefully, within your organization, it’s okay to express that perspective and lend a hand as they try to make the best decisions that they can.

Chris Sietsema:
Exactly. Yeah, I wish there was like a fifth P, maybe it’s performance. Because that absolutely needs to be configured into the whole marketing schema, for sure. The other thing that’s really rewarding about measurement and analytics from my perspective is that, when you kind of uncover those juicy insights or those nuggets in which we can kind of show our reporting audiences, whether it be, to use your higher ed example, the dean or the provost, it really is extremely gratifying, because for a lot of those people, they build a program, they build a university, they build a curriculum, and oftentimes, they don’t necessarily see how its intended audience reacts to that, unless that audience is in a classroom.

But up until that, we have to rely upon measurement to understand what resonates with the audience, what programs are really interesting, what aspects of specific programs are most interesting, that kind of thing. There’s a lot of ins and outs to the measurement process from providing leadership with visibility to the front lines, essentially.

Jarrett Smith:
Absolutely. Chris, we’ve kind of talked at length about all the various challenges of measurement and analytics, but I think you would probably agree that some organizations do get it right. I know you’ve worked with a lot of different folks, not just in higher ed, but across various industries, and it is possible, I guess, depending on how you define it. But I guess, is there anything that you would bring to us, maybe as a common thread, or what kind of universal lessons might we apply? If someone’s listening to this and they’re saying, okay, I know we’ve got a lot more we could be doing, we could be doing better, where should they kind of focus their energy? What are sort of the common threads that tie together the folks that are doing a good job?

Chris Sietsema:
Yeah. When we first started talking, we noted that one of the biggest challenges with analytics specifically is that there isn’t a standard. Every situation is unique. Every marketing program is a fingerprint, right? They’re all a little bit different. However, to your point, there are some common threads or common elements that I seem to find with successful measurement programs. The first is, and one of the first questions I even ask when I’m entering a new analytics engagement is, do you have documentation? Is this stuff written down?

Because if they do have documentation, you can almost prove that there’s a process. If they don’t have documentation, it’s not clear whether there’s a true process or not. One of the mantras that I have used before, and you’ve heard me say this, is: if it’s not written down, it doesn’t exist. Even the simplest measurements, and the way that things are configured, and the way that we pull numbers for reporting and things like that, write it all down, because that’s knowledge that has to be shared, and it’s also knowledge that has to be referenced later on.

Chris Sietsema:
Focus on process and documentation, the two have to go together. The other thing I would note is that a lot of times when we think about the final report, we tend to try and think about, well, what are some of the other reports that I’ve seen, and what are the metrics that they utilized? It’s like, ugh, I don’t know, because oftentimes what happens is you really have to make sure that your reports reflect the business objectives that you’re trying to achieve. If your business objective is like, get more leads, whatever that number is, however that’s quantified, leads should be at the top of the report.

We really have to avoid including measurement tables or charts or graphs that provide true information, but useless information. All the information that we provide in our reporting should have some kind of actionable next step, like a yes, and. Something that we can read and interpret and then know exactly what we should do the next time we run a campaign, or tomorrow, or within the next hour. They should all be actionable. Avoid true but useless scenarios, and really focus on actionable metrics and insights that we can rely upon to make stuff happen.

The other thing that I would say is that I come from an agency background, and we were definitely guilty of this, and I don’t know if your agency has been guilty of this as well, but when we think about providing a report, I think we have the notion that more is more. Like, here’s your report. Thud.

Jarrett Smith:
Let me justify my hourly rate.

Chris Sietsema:
Exactly. Yeah, I really want them to think that I know, and I really want to prove that I worked really hard on this based upon the size of this document and the number of pages in this document. I say that’s baloney. The truth is that the reporting format should be aligned to the audience. If you are talking to the provost, or the dean, or the CEO, I’m guessing they have limited time. I try and make, as a rule, most of my reports that I provide, one page. Just one singular page, and that’s all you get, because it really forces you to eliminate those true, but useless scenarios, really focus on the actionable stuff, and get right to it. It’s kind of the whole be bright and be gone. Get in and get out. Be bright, be brief, and be gone, I should say.

Jarrett Smith:
It reminds me of the quote, well, at least the internet attributes it, I think to Mark Twain, about I would have written you a shorter letter if I had more time. The internet probably attributes that to Abraham Lincoln too, I don’t know, and Einstein, one of those guys.

Chris Sietsema:
One of those guys, yeah.

Jarrett Smith:
Smart dude back in the day.

Chris Sietsema:
That’s exactly it. I think the value that we provide to our reporting audiences is not the breadth of data, it’s the clarity of our next steps. It’s the interpretation that we can provide and say, here’s the approach that we should venture into next.

Jarrett Smith:
Yeah. For some reason, when you were saying that, maybe it’s because I’m in Florida, an orange juice metaphor appeared in my mind: the analytics person is the one selecting the oranges from the tree and squeezing them, but by the time it makes it to the executive level, it really needs to be orange juice concentrate.

Chris Sietsema:
Yeah. I’m going to use that. Yeah.

Jarrett Smith:
If someone’s listening to this, we do have a lot of higher ed leaders and folks in decision-making positions that are in charge of guiding teams and staffing teams, folks that listen to this podcast. I can imagine there are folks listening to this right now saying, okay, I hear what you’re saying, but I don’t know that I really have the right person on my team, or at least I don’t have the right role defined, and I think we’re going to need to develop our organization in this way if we’re going to do all these wonderful things that we keep talking about with analytics.

I guess, how do you sort of think about and advise people when they’re thinking about their own team and they’re thinking about expanding their teams’ analytics capabilities, what advice might you have for us?

Chris Sietsema:
You focused on the people aspect, so I will take that and run with it as well. Because oftentimes, when people think of analytics, it’s like, what tools should we be using? What software license should I get next? That kind of thing. Really, it starts with the human resources piece of it. What I’ve seen from my clients, and this is outside the higher ed space specifically, but with other industries, is that for those organizations that have an individual or individuals that are specifically dedicated to the discipline of analytics, some of them do great and some of them struggle.

When they struggle, they are almost too close to all the data. They’re too focused on all the data. It’s one of those can’t see the forest for the trees type of things. I would say, from a leadership standpoint, you probably have the right people on already. It’s just a matter of changing their view to understand your business and its objectives just as well, if not better, than they understand something like linear regression, something very nerdy, right?

You don’t need a nerd with a calculator. You need a nerd who can eloquently articulate the core reasons your company exists in the first place. That’s where I see people struggle the most. It’s like, hold off the data just a little bit. What is it we’re trying to do here and focus all your energy and efforts on that. That’s where I’ve seen some analytics folks just struggle a little bit. They’re trying to fill a role from an analytics perspective, but not fill the role within the organization.

One thing I would say too, if you’re looking for somebody who would be a great analytics person, I definitely think a background in all the math, and the science, and the art of it, and having good experience with all the tools is important. But the one quality that I see that’s consistent amongst analytics people is that they are problem solvers that are just insanely stubborn. The type of people that cannot stop looking at something and trying to figure something out until it’s figured out. They examine it from every possible angle.

They’re persistent to a fault. I think that sounds negative, but it’s actually a good thing from an analytics perspective, because that’s where some of that creativity we spoke about earlier comes in, where if we can’t solve it this way, let’s look at another angle to solve the problem, which can be really great. Oftentimes too, especially with the agencies I’ve worked with or trained, there’s a question about, all right, do we go find an analytics person or do we just take one of our marketing people and train them to do analytics?

It’s like, well, I don’t know. It really depends, because it absolutely depends upon the individual that we’re talking about. For example, I work with some agencies like out in Arizona or California, or on the West Coast, Colorado, etc, where, hey, we have this person on our team. He or she is a great account manager and they really want to learn more about analytics. Then when you talk to that person, they just really want to be a great account manager, and there’s nothing wrong with that, but unless there’s a passion for the metrics and the ability to pull those numbers and configure all the data and do all the tagging work, really get your hands dirty and all that work, as well as kind of lead the meeting in which the insights are provided, they’re going to struggle a little bit.

There has to be a passion for both, if that makes sense. Because oftentimes, when we try and convert a person who doesn’t have a lot of analytics experience into something that is more of that analytics person, if that passion isn’t there, there could be a struggle there. Then the last thing I would say is invest in the necessary tools for sure. That’s not easy for everybody, but depending upon what questions we’re trying to answer, it might be worth whatever the budget requirement is to get that testing tool, or that analytics upgrade, or that tag management solution, or whatever the case might be.

But for an analytics person to really be powerful, not only do they have to have that creativity and the stubbornness, and the problem solving qualities, but they also have to have the right tools at their disposal to do the work.

Jarrett Smith:
Okay. I’m going to ask a question that may be so ridiculously broad that it’s not possible to answer it in a good way, so just tell me if that’s the case, but just thinking through the kind of digital marketing operations that you have encountered over the years, what does that tool stack typically look like? I mean, things come to mind like Google Data Studio, which a lot of folks in higher ed have adopted over the years. It used to be that not many folks really knew how to use it. Now I think it’s pretty common.

You’ve got Google Analytics, you mentioned testing tools, but what else do you consider to be the basic tech stack for good quality digital marketing? Is that even a question that makes sense, or is it too particular to the organization?

Chris Sietsema:
No, it’s a good question. I think that there’s probably three categories of items and then there’s some extra stuff. There’s three primary categories and some wildcards. The three primary categories are whatever tool you’re utilizing to track and bring all the data and all the behavior that you’re measuring into one cohesive tool, whether that be Google Analytics or Adobe Analytics, or Google Analytics 360, which is the souped up version of Google Analytics, the non-free version, that kind of thing.

We talked about configuration before. Configuration basically equates to making sure that all of the behaviors that occur with an advertisement or on a webpage or a landing page or an app or whatever, that all those behaviors can be tracked adequately, and typically, for that type of thing, you need some kind of tag management solution. The most common one is Google Tag Manager, but there are others out there. Then those tags are basically little tracking codes that you place on the website to enable you to measure certain behaviors that occur that are important to you.
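To make the idea of a tag concrete, here is a sketch of the kind of event payload a tracking tag ultimately produces. The shape mirrors Google Analytics 4's Measurement Protocol, but the client ID, event name, and parameters here are placeholders, and in practice a tag manager assembles and sends this for you.

```python
# Sketch: a GA4-style event payload representing one tracked behavior.
# The IDs and parameter values are placeholders for illustration.
import json

def build_event_payload(client_id, event_name, params):
    """Assemble an analytics event payload for a single tracked behavior."""
    return {
        "client_id": client_id,
        "events": [{"name": event_name, "params": params}],
    }

payload = build_event_payload(
    client_id="555.12345",               # anonymous browser identifier
    event_name="generate_lead",          # the behavior we care about
    params={"form_id": "rfi_form", "program": "mba"},
)
print(json.dumps(payload, indent=2))
```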

So, you’ve got your primary kind of analytics repository, your tag management solution. Then, once it’s time to actually prepare a report, you do need some visualization tool. For many, that visualization tool is Excel, or Numbers on the Mac, or whatever. They basically pull those numbers out. That concept is not foreign to me. That’s how I used to do all my reports up until probably two or three years ago. I would pull the data and make it pretty on my own. But at the same time, there are some of these visualization tools, like you mentioned, Google Data Studio or Tableau. Oh my gosh, there are so many. I could throw a rock right now and hit eight of them, all the Domos and the Power BIs and all those different types of tools. They’re out there. When you asked the question earlier, I don’t think there’s a right tool, but those are the ones I see most often.

As long as you’ve got an analytics database or repository like Google Analytics, a tag management solution, which you’ll likely need, and a visualization tool, you’re good. Then there’s those wild cards for things like call tracking, or landing pages, or A/B split testing, all those different types of things. Then, to make matters a little bit more complicated, certain channels that we rely upon heavily, such as Google Ads, Facebook Ads, even YouTube, all have their own little dashboards and Facebook Insights and things like that, that we need to marry in, that kind of thing. That can be a little challenging, but those are the core tools I think that you probably have in your analytics toolbox.

Jarrett Smith:
Very good. Let’s bring this down to earth. Maybe someone is saying, all right, I hear what you’re saying. I’m on board. I want to do measurement right, and they’ve got a new project coming up. Maybe it’s a new micro site they’re putting up. Maybe it’s a new campaign that’s just getting kicked off, and they’re thinking, all right, now’s our chance to think ahead on this one. Can you give us some concrete steps that we might take to sort of experiment with doing analytics right on this particular project?

Chris Sietsema:
For sure. The first steps I think would involve something pretty archaic, like a whiteboard or a piece of scratch paper, honestly. Let’s just bring it back to basics, and the first thing we want to do, as we alluded to earlier, is just, what is it we’re trying to achieve? Just write down the business objectives and quantify them if you can. That’s step one. And step one is not an easy step a lot of times, right? We have to determine what those are. I will say, though, if you’re trying to accomplish 23 objectives, then you don’t have an objective. Let’s probably limit it to maybe three to five, maybe six max, something that can be accomplished.

Then write down, for every objective, the key questions that surround those objectives, what is it we’re trying to answer. If the objective is I want to generate X amount of leads in February 2021, then there’s probably some questions that surround that goal, such as, what is the source of those leads? What is the conversion rate of those leads? How are people accessing the website? How many times do they need to see the website before they feel comfortable filling out the lead generation form? All those different questions you want to answer, right?

So, write those questions down. You might have a handful of them, maybe eight to 12. Assign a metric to each. So, lead volume, conversion rate, lead volume by source, conversion rate by source, all those different types of things. Then prioritize those metrics. Which are most indicative of success? What are the most important metrics or the most important questions that we’re trying to answer? All this time, we’re just sketching this out. We’re writing this on a whiteboard or a piece of scratch paper.

Now that we’ve got our questions identified, our metrics identified and defined, and our metrics prioritized, the report almost writes itself. It kind of configures itself. You can start to say, all right, because this is a top priority metric, this is going to be at the top of the report. These are nice to know and complementary metrics and questions that are going to help us determine whether we accomplished our business objectives, so they go next, and then I’m going to provide some notes and some interpretation on that.
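The whiteboard exercise above can be sketched as a simple data structure: objectives break down into questions, each question gets a metric and a priority, and sorting by priority yields the report order. The metrics, questions, and priorities below are illustrative.

```python
# Sketch: turn a prioritized measurement plan into a report outline.
# Metrics, questions, and priorities are illustrative placeholders.

measurement_plan = [
    {"metric": "lead volume", "question": "How many leads did we generate?", "priority": 1},
    {"metric": "conversion rate", "question": "What share of visitors became leads?", "priority": 2},
    {"metric": "lead volume by source", "question": "Where are the leads coming from?", "priority": 3},
    {"metric": "sessions by channel", "question": "How are people accessing the website?", "priority": 4},
]

# Highest-priority metrics go at the top of the report.
report_order = sorted(measurement_plan, key=lambda row: row["priority"])
for row in report_order:
    print(f"{row['priority']}. {row['metric']}: {row['question']}")
```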

The structure of that report is pretty much good to go. You’ve got your metrics prioritized, you know the structure in terms of who your reporting audience is, like that single page versus maybe a couple of pages, depending upon how detailed you want to get, and so on. You also, at that point, probably need to determine the cadence or the frequency with which you deliver that report. Oftentimes, the question comes up, Chris, how often should I be preparing a report? Should it be monthly? Should it be weekly?

The answer is, how often can you enact change? How often can you make decisions and actually work with them and apply some of the insights that you’ve derived from your report? If you can do that every week, then you should probably do a weekly report. If you can only do that once a month or once a quarter, then change your reporting cadence accordingly. I would say, with all your reports, automate the data collection if you can, just because it’s easier and you can spend more of your time and resources on actually reading the tea leaves and looking at those numbers and try and understand what it is that’s going on.

So, automate if you can, but don’t limit yourself to just the automated metrics. You might need to dig in a little bit to, what’s going on in Google Ads? And why did that ad in Facebook work so well? Or why did that ad in LinkedIn Ads not work so well? Then, once you’ve got all your data configured and set to go, now it’s time to actually interpret that data. The greatest value you can bring is your ability to translate data points into meaningful insights and actionable next steps. That’s how we get paid as analysts. We don’t get paid for numbers. We get paid from the actions that come out of those numbers, and that’s the most important thing to remember with all that.
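The workflow Chris walks through — write down your questions, assign a metric to each, prioritize them, then let the report order itself — can be sketched in a few lines of code. This is an illustrative Python sketch, not anything from the episode: the metric names, values, and the `build_report` helper are all hypothetical examples.

```python
# Illustrative sketch of the questions -> metrics -> priorities -> report
# workflow. All metric names, values, and priorities are hypothetical.

def build_report(metrics):
    """Sort metrics by priority (1 = most important) and render a
    plain-text report with top-priority metrics at the top."""
    ordered = sorted(metrics, key=lambda m: m["priority"])
    lines = []
    for m in ordered:
        lines.append(f'{m["name"]}: {m["value"]}  (answers: {m["question"]})')
    return "\n".join(lines)

metrics = [
    {"name": "Lead volume", "value": 128, "priority": 2,
     "question": "How many leads did we generate?"},
    {"name": "Conversion rate", "value": "3.2%", "priority": 1,
     "question": "Are visits turning into leads?"},
    {"name": "Lead volume by source", "value": "see breakdown", "priority": 3,
     "question": "Which channels drive leads?"},
]

print(build_report(metrics))
```

The point of the sketch is the ordering step: once priorities are assigned, the report’s structure falls out of a simple sort, which mirrors Chris’s observation that the report “almost writes itself.”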

Jarrett Smith:
Really good advice and very actionable, Chris, thank you for that. If folks want to connect with you more, to talk more, geek out on all things analytics, where’s the best place to connect with you online?

Chris Sietsema:
A couple of places. I’m not too active in the socials these days. LinkedIn’s probably a good place. Chris Sietsema, you can find me there. My email is chris@teachtofishdigital. Website’s teachtofishdigital.com. Those are probably the primary places to get ahold of me or get in contact with me.

Jarrett Smith:
Good deal. Well, Chris, thank you for your time today. This was such a fantastic conversation. Thank you.

Chris Sietsema:
Thanks for having me. Appreciate it.

Jarrett Smith:
The Higher Ed Marketing Lab is produced by Echo Delta, a full service enrollment marketing agency for colleges and universities of all sizes. To see some of the work we’ve done and how we’ve helped schools, just like yours, visit echodelta.co. If you enjoyed this podcast, please subscribe and leave a review on Apple Podcasts, and as always, if you have a comment, question, suggestion, or episode idea, feel free to drop us a line at podcast@echodelta.co.


8 Ways to Rethink Yield Strategy During the Pandemic

As data pours in telling us that admission applications and FAFSA filings are down at many schools around the country, enrollment managers are grappling with how to improve yield. Laura Martin-Fedich suggests strategies for rethinking yield strategy in these incredibly challenging times.

Continue Reading

Strategies for Engaging Parents During College Search

It’s no secret that parents play a hugely influential role during the college search process. From helping students research schools to getting hands-on with the nitty-gritty of applying, visiting, and deciding where to enroll, their impact spans every step of the journey.

Despite wide recognition of the importance of parents among higher ed marketers and enrollment managers, many schools struggle to implement a cohesive communications plan targeting the parents of prospective students.

In this episode, we sit down with Will Patch, Enrollment Marketing Leader at Niche.com, to talk about strategies for more effectively engaging parents during the college search. We start our conversation with a review of some key findings from a recent survey the Niche team conducted with parents, then we dive into concrete steps schools can take to engage parents more effectively.

Understanding the Mindset of Today’s Parents

Niche’s recent parent survey provides an informative snapshot of what’s top-of-mind right now for parents as they navigate the college search. Not surprisingly, given the current global health crisis, parents are overwhelmingly concerned with their student’s safety. But Will is quick to point out that parents define safety in broader terms, not simply in terms of how schools are managing the COVID-19 pandemic, and that other factors like reputation are still a top priority. Encouragingly, nearly three-quarters of parents reported being comfortable with their student living in a residence hall, and 80% indicated they were comfortable with the safety measures being taken by schools.

The survey also reveals some interesting geographic nuances around parent involvement, both at the regional level and between rural, suburban, and urban communities. For instance, parents in the Midwest reported being the most highly involved in their student’s college search.

Tips for Effectively Engaging Parents

One of the first challenges schools face in communicating with parents is simply gathering parent names and contact details in the first place. Will’s advice to schools is simple: if you want to capture parents’ information, you need to ask for it. To accomplish this, he recommends placing a set of optional fields on inquiry forms that allow students to enter separate information for their parents.

Will then outlines several ideas for crafting better communication flows to parents, including specific topics to address, how to strike the right tone, and how schools might consider tailoring communications to specific segments of parents.

Links

Niche Parent Survey

Niche Enrollment Insights Blog

Niche Enrollment Research

Find us on:

Apple Podcasts

Spotify

Stitcher

Google Play