The mistakes we make and a better path with Rob Dwyer (Table Service 106)

Episode 6 April 01, 2025 00:41:41
Table Service

Show Notes

Join host Jordan Hooker and Rob Dwyer, a contact center pro, VP of Customer Engagement at Happitu, podcast host, and customer experience thought leader, as they dig into topics like improving customer surveys, choosing and implementing the right technology, and getting the most value from AI.

Want to connect with Rob? Find him on LinkedIn: https://www.linkedin.com/in/j-robert-dwyer/

Learn more about Rob's work with Happitu: https://happitu.com/

Listen and subscribe to Rob's podcast, Next in Queue: https://www.youtube.com/@nextinqueue

Want to connect with Jordan Hooker? Find him on LinkedIn: https://www.linkedin.com/in/jordanhooker

Table Service is presented by Tavolo Consulting. Hosted by Jordan Hooker. Music by Epidemic Sound.


Episode Transcript

[00:00:02] Speaker A: Welcome to the Table Service podcast where we'll dish on all things support, success and beyond with the people and companies building the future of customer experience. Table Service is presented by Tavolo Consulting and I'm your host, Jordan Hooker. Rob Dwyer has been in the contact center space for nearly two decades, focused on creating exceptional customer experiences through training, quality and leadership development. Today, Rob leads client engagement at Happitu, which is using AI to fill the QA gap for contact centers. He's also the host of the podcast Next in Queue. Rob, welcome to the table, man. [00:00:39] Speaker B: Thanks for having me, Jordan. [00:00:40] Speaker A: For our listeners who may not be familiar with you, I'd love if you could just walk us through a little bit of your story, tell us a little bit about yourself. [00:00:48] Speaker B: Yeah, well, I thought that I was going to retire in the mortgage banking world, and then 2008 hit and I had to reconsider things. And I found myself making the savvy career move of going into the contact center space as an agent. Yes, quite savvy. But it turns out that you can grow and learn an awful lot in the contact center space. And I ended up traveling all over the world supporting sites, and in 2016 I moved to a St. Louis based BPO called Customer Direct to lead their training and quality. And then since 2020 I've been on this journey with Happitu, which spun out of Customer Direct. And we provide a couple of different software solutions for contact centers. And what we're really excited about today is filling the gap that exists in quality assurance, and has always existed in quality assurance, because you just don't have enough people to listen to all the phone calls. And so we're utilizing AI to fill that gap and provide actionable insights so that agents can get really targeted coaching that actually helps them drive performance. [00:02:10] Speaker A: Awesome. Yeah.
[00:02:12] Speaker B: Rock Chalk, Jayhawk. Just so that. [00:02:17] Speaker A: We'll definitely leave that in. There's some, there's some rivalry here. So folks, just in case you didn't know. Friendly, at least. Well, hey, thanks for sharing that with us. I know that for me as a support leader, I struggle with that same thing. I have a small team and I struggle with that. I can't imagine for an organization with a gigantic team what it would look like to be able to do that in a quality way. So it's exciting to see tools on the market that are being used for that, that are also from folks who actually have done the job. Whereas so many of the companies I encounter that are the next great AI solution for my quality assurance, et cetera, they have never worked a desk a day in their life. They have no idea what we deal with, what we think through, etc. So it's exciting to see folks who are in our domain building those tools and helping us do that job better. [00:03:07] Speaker B: Yeah, it really allows you to have a different perspective about the actual problems that teams face at different levels and how we can approach those thoughtfully. And also what won't work. [00:03:23] Speaker A: Right. [00:03:24] Speaker B: Because there are some things that someone thinks is a good idea, but if they've never been in one of those roles, they don't really know. And it makes it really easy for you to go to development and go, no, let's not do that. [00:03:37] Speaker A: Yeah, absolutely. And I think that's the biggest thing. Let's have the voices at the table that know a little bit more about this than your average Joe who happens to come in and maybe know how to build some good software, but has never actually done this part of the job. Well, I know we've got a few topics to talk about that I know you and I are both rather passionate about. So let's start with the first one, which is how and why we should listen to customer feedback.
Let's just dig in here. So first, kind of a general question: let's talk a little bit about the difference between transactional and relational surveys, which clearly are one of our hot button topics here in support leader world. So let's start there. [00:04:16] Speaker B: Well, look, Jordan, you, I know, will relate to this because we all relate to this. We get surveyed everywhere. You get surveyed when you go to the grocery store and you get your oil changed and really, when you buy anything. Every time you get something on Amazon, you get surveyed. Anytime you buy anything, anywhere online, you get surveyed. I've even gotten surveys for products that never got shipped to me that I ordered. And then they're like, hey, tell us how awesome this is. And I'm like, there's a problem with this because I don't actually have it. This over-reliance on us just throwing out surveys to get feedback leads to, number one, survey fatigue. So I, as a consumer, am so over surveys. I am over them because I don't trust that anything's going to happen. And this is the problem that you get when you're surveying everyone: you're so focused on throwing out the surveys and getting back the number, the results, that you don't spend time understanding what is it that we can do differently to make a better experience? How can we serve our customers better through this process? And it creates a perverse incentive at multiple levels of organizations, but mostly at the level of the people that you're interacting with. Whether that's your support rep, whether that's the person at, I mean, dealerships, car dealerships are notorious for this. [00:06:08] Speaker A: Right. [00:06:08] Speaker B: And it infuriates me. They will ask you before you leave, hey, you're going to receive a survey. Please give me a 10. Why? Because their pay is actually based in part upon survey results. [00:06:27] Speaker A: Absolutely.
[00:06:28] Speaker B: So what's happened is these people know that their own personal finances are going to be impacted by what you do, and instead of trying to ensure that you have a great experience, they beg for you to give a particular result. So what does that mean? It means I, as an organization, can't rely on that data because I've got people out there saying, hey, give me a 5 out of 5 or a 10 out of 10. We've gotten past the "I want your honest feedback. Tell me how we can improve." And that's what executives are touting. Look at our NPS, look at our CSAT, look at whatever. [00:07:12] Speaker A: Right? [00:07:14] Speaker B: That transactional survey has ceased to be useful, for the most part. A relational survey, a survey that is based off of a relationship, can be targeted, timed, and more meaningful. If I understand where in the journey you are as my customer, or potential customer, or lost customer, I can ask for real feedback. Tell me what was awesome about your experience. Tell me what sucked about your experience. [00:07:55] Speaker A: Where. [00:07:56] Speaker B: Where did you find friction in this? What can we do better? And I don't have to do that for every single customer because, let's face it, like, sample size. And this is going to sound funny coming from someone who's really focused on increasing sample size and quality, but sample size is a thing. But also, there's a certain difference between surveying a customer who. Let's just take an example. If I'm traveling, I'm getting ready to be in Nashville next week. If I go, for instance, to a grocery store and they survey me as a customer, I'm not coming back. I'm there for two days. Like, is my opinion valid? Sure, it probably is valid. But am I going to return based off of any changes that you make? No, I'm not. I'm not going to spend any more money. I am not going to appreciate you as a company any more than I do for that one interaction that I have. So why are you surveying me?
You should be focused on your customers, the people that are likely to return over and over when service improves, or may choose to find another option if they don't feel like you're doing a good job and offering a good service. I think companies just need to be much more targeted when they reach out to their customer base for feedback. And there are all these other listening posts that you can utilize. A contact center or support center is one of those. Like, if they're reaching out to you, they're giving you information that you can use. So listen. [00:09:51] Speaker A: Yeah, yeah. I think that's an element that seems so simple yet so foreign to some people: the support desk is a treasure trove of customer feedback and customer sentiment, obviously at high volumes. We've got to figure out a good way to collect it. I understand that that part is a challenge, but it's just been in the last few years, I feel like, that I've really started to see companies understand our customers are already talking to us every single day. Why am I sending them a survey? And it's mainly because I just want the number that, you know, I can go tout. I have a, you know, 95% NPS. Yeah, sure, okay. I believe that. So that's something that I think will make a difference, because that feedback, I feel like, is also so raw and unrefined, because it is in the day to day action of the experience. Not: I had some time to think about it, I was mad, but I had some time to cool down, so I'm going to be a little bit more professional, especially if we're talking in a B2B kind of setting. Whereas in the moment you're getting that raw, unfiltered feedback, which is probably much more valuable than the nice little thing you're going to get back on your survey, if they respond at all. [00:11:12] Speaker B: Yeah, absolutely.
And you can understand from future behavior things like sentiment predicting churn, for instance. So you can see this in those interactions that are already coming to you, without putting the onus on the customer to do yet another thing for us. Why do we do that? I'll tell you why we do it. It's because people want a number. Because numbers are easy to understand. They go up, they go down. But the real value in getting any kind of feedback is understanding what can we do differently? How can we get better? [00:12:03] Speaker A: Absolutely, absolutely. So what do you think in terms of this wider thing? What are some other things we could do if, you know, a company was like, you realize we've been doing this wrong, we're not getting any of the value? What are some of the options that companies could consider to help them do this better? [00:12:19] Speaker B: Well, I think number one, evaluate. If you survey, why? What's your goal behind that? If your goal is to tout a number, then, I mean, don't change what you're doing. That's what you're getting. Sure. If you really do want to improve the experience, start investigating how you can utilize other listening posts and start investigating how you can be more strategic in how you execute whatever survey function that you have. I will tell you, if you send out a survey after every single interaction, it's a waste of your time. I'll give you an example. I used to rent an apartment. I had a service call for an issue with my blinds in my spare bedroom. I don't remember exactly what the nature of this was, but they had someone come out and it was the on-site maintenance guy. He had been to my place before. Nice enough guy. Wasn't able to resolve that issue because they needed to custom order the blinds. But, you know, he could get the ball rolling. Then they send me a survey, and it was an NPS survey. How likely are you to recommend, based off of your recent experience with our maintenance person?
Recommending where someone lives is, for me, a pretty big deal. You're going to be spending a lot of money. This is not an item you pick up at the grocery store. And my recommendation is going to be based off of where maintenance is on the punch list of things that I would consider, which is relatively low. Is it important? Yes. If you live in an apartment, having a responsive staff with good maintenance that comes out and fixes things when something's broken is important. It's not the most important. [00:14:20] Speaker A: Absolutely. [00:14:21] Speaker B: There are a wide range of other things that I would consider first. Let's start with budget, location. These are things that you would obviously want to consider, and what are your needs in an apartment complex? Because this may or may not fit your needs. Maintenance is, like, way down there. So this is just the kind of nonsensical thing that companies are doing, and they've bought into this: well, we need feedback, we need survey responses. We're literally in an apartment complex. I visit the office. When someone comes to visit, just say, hey, do you mind if we take five minutes of your time and ask you some questions about your experience here? And you could identify the really long-term tenants, people who have stayed there for a long time, and figure out what is it that has kept you here versus going somewhere else. And why would I survey someone that has been there for two weeks? Maybe I want to understand their onboarding experience. And by onboarding experience, I mean their move-in experience. What was that like? Could we do a better job of that? If that's where your focus is, great. But if you're just saying, hey, would you recommend us? Waste of time. Waste of time, right. [00:15:45] Speaker A: Have I had enough time to actually evaluate that in a way that allows me to reasonably tell you that?
And if it's just centered on that narrow slice of the maintenance team or the amenities or whatever it may be, instead of the overall experience, it's definitely not going to tell you anything. One that I think I've experienced, in a similar apartment situation: maintenance teams would come, and thankfully, the company got better at actually just asking, how was that experience? Did we resolve the issue? Was our staff member friendly? Were they helpful? Like, all of those pieces. And those are great questions for this very transactional thing that needs to happen, but definitely not in terms of would I recommend it because the maintenance staff is good. Although, having lived in some apartments with some really bad maintenance staff, it's potentially higher up on my punch list than it might be for some other people, which is fair. [00:16:40] Speaker B: I don't want to minimize how important maintenance is in an apartment complex. It is important, but is it top five, right? [00:16:50] Speaker A: Depends. Yeah. Maybe it's top five for the person who had a really bad experience at the last place. But if you don't know your customers and you don't know why they came to you, how do you know that? [00:17:00] Speaker B: That's. Yeah, that's a great point. [00:17:02] Speaker A: Yeah. Well, let's keep focusing in a little bit. I know maybe this sounds a little more on the negative side of things, but shifting to a slightly different conversation around technology in customer service and customer support: what are mistakes that companies make in terms of this tooling and the things that they do with technology in this space? [00:17:22] Speaker B: I already hit on one aspect of that, and that is not asking yourself, honestly, why? Why are we doing this? What are we trying to accomplish before we jump in? You and I both know that AI is the shiny new thing. And it's in everything. It's in things that sometimes I go, why? Why do I want AI in my refrigerator? I don't think so.
I don't think so. But it is in all of the things. Now the question is, what purpose is it serving? Is it better equipped to solve that challenge than what we relied on two years ago, three years ago? And I don't think enough companies are asking what they're trying to achieve. They really are just in this fear-of-missing-out, FOMO, moment where they're like, we need to bring AI in. Because maybe the CEO keeps hearing about AI and asks his technical team or his CTO or her CTO, why aren't we adopting AI? Don't we need to be adopting AI? [00:18:45] Speaker A: Right. [00:18:46] Speaker B: And then the CTO is going, sure, we can adopt AI. [00:18:50] Speaker A: Right. [00:18:51] Speaker B: Let me sift through the tens of thousands of vendors that say that they've got some great AI solution and we'll pick one. That's a real struggle. So understanding what it is that you're trying to achieve is first and foremost. [00:19:11] Speaker A: Absolutely. [00:19:12] Speaker B: I think the next thing is understanding, or setting expectations for, how a relationship with a company and a product is going to go. So there's a couple of things that fall into that. The first is unrealistic timelines and unrealistic expectations about how much effort will go into implementing a new tool, which are often a source of failure to implement. [00:19:44] Speaker A: Absolutely. [00:19:46] Speaker B: If I think that this is going to take two weeks and the timeline is really three months, things are going to go poorly. [00:19:56] Speaker A: Absolutely. [00:19:57] Speaker B: If I need to allocate resources to help with this implementation and/or oversee it after the fact, I need to understand that before we get started. Because if I think it's just plug and play, and then just magic is going to happen, like the magic elves are going to come out at night and they're going to do their thing and then profit, that's not going to happen.
So you need to understand what really will go into that and how easily it's going to fit into the rest of your tech stack, which is not always an easy thing. Integrations get easier and easier with public APIs, but there are tons of companies that are relying on older technology, and it's going to take a lot of work to get those to talk to each other. [00:20:51] Speaker A: Yeah, absolutely. I'm chuckling a little bit over here because by the time this episode comes out, I will have made or missed a deadline for implementing a large new software product. So I am definitely moving right along with you here on this thought. It's that challenge of, you know, I have estimated this amount of time. I have some experience with this type of tool. I assume it's going to take this amount of time. I've given myself a little buffer. Did I give myself enough? Join us next time to find out. It'll be that level of things. But you're absolutely right about really thinking through those pieces. And I often think of young support managers in a role where they've been given an opportunity to implement, say, Intercom or Zendesk or a new tool, who've never done it before; they have no idea how long it's going to take. And that's one thing I love about groups like Support Driven and CX Accelerator that you and I are part of, because now you have this community where you can go, hey, I'm going to implement Zendesk. Here's kind of what I need, here's what we do. How long should I expect? And you can actually get some good answers, because Zendesk is not going to tell you the actual amount of time it's going to take to implement the product, nor is Intercom. But people who have done it before are going to tell you that. So I think that's a blessing in these days of having those kinds of communities. [00:22:07] Speaker B: Yeah. I also believe in understanding the type of company that you're working with and what kind of support you can expect from them.
So you just mentioned two huge names in the support world, Zendesk and Intercom. Are they going to give you support in that implementation phase? Yes, sure. What does support look like after implementation? It probably depends on how large of an organization you are. [00:22:38] Speaker A: Absolutely. [00:22:38] Speaker B: It depends on how much money you've got to pay them. And you have to consider your budget, the time that you have internally to support it, and what kind of help you're going to get. I think for smaller organizations, it often is better to work with smaller vendors, because they will be able to get more support at a price point that is amenable to them. [00:23:11] Speaker A: Yeah. [00:23:11] Speaker B: If you are a tiny little organization, you've got five support agents and you expect the world from Zendesk, good luck. You're not going to get that. [00:23:21] Speaker A: Absolutely. [00:23:22] Speaker B: You are not looked upon as anything more than a number to an organization the size of Zendesk, unfortunately. And I don't mean to say anything bad about Zendesk; that's just the reality of any large enterprise. [00:23:36] Speaker A: Yes, yeah, yeah. And I think that that is a key thing, not necessarily an element I have fully thought of. You know, we're gonna need support. And how important is this technology that we're putting into our tech stack? What would happen if it went down? And if I know that it's gonna take a business day to get the support that I need, maybe that's fine. Maybe it is technology where you go, that's not a big deal, no issue; if it goes down, we have other things we can do, other ways we can focus. But if it's your support desk and it goes down, I really can't wait till the next day when the support reps who are assigned are able to get to it. It's gotta be today.
So I think you have a very valid point there. And unfortunately, I think a lot of these companies, when they go sell, don't communicate that, because that maybe wouldn't reflect well on them. And I understand it's hard to communicate that. But, I guess, newsflash for support leaders out there: you might have some problems with support. So think about that as you make these transitions. Definitely a critical piece to be thinking about. [00:24:41] Speaker B: Yeah. And at the same time, if you're a large organization, you probably want to ensure that the vendor that you're working with has the scale and the capacity to support your needs. Because if you have a thousand or more support agents, a really large organization, you're going to be taking a little bit of a gamble going with a vendor that's never supported anything of that size. [00:25:12] Speaker A: Right. [00:25:13] Speaker B: And you may find that they're just not ready for it. They don't have the support structure internally to support you as you go through this process. So I think it's about finding a good match and asking the right questions as you vet different technology vendors. Are they going to be right for me? [00:25:36] Speaker A: Yeah, absolutely. Are they going to be right for the space we work in, for the needs we have? It's very easy, I feel like, to go buy the shiny new thing, but it may not be the shiny new thing that's actually good for your organization today. Yeah, absolutely. Well, you mentioned AI briefly in that conversation, which of course is the shiny new topic for all of us. So let's talk just a little bit about AI and humans working together in customer experience, customer support, customer success, wherever inside of this large tent that we live in as CX. Talk a little bit about how that can be done.
Well, I think that's a thing that a lot of us are not quite sure what it should look like. [00:26:23] Speaker B: Yeah, I think we're all still figuring it out, but I do have some thoughts about what works well, what you might want to avoid, and some specific use cases where I think AI actually solves some pretty incredible challenges. So let's talk about what works well. What works well often are things that are internal facing and don't have anyone trying to maliciously break what you are doing. So what do I mean by that? There's a big difference between using generative AI to guide your agents internally, to provide them knowledge that they need to help a customer, versus giving that access to a customer directly. The motivation for my agent is, I hope, if we've hired the right people, always to help my customer. And they will recognize, ideally, if things have gone off the rails, which is possible, and they'll also be the ones to report that and say, hey, we've got a potential issue here. I think, Jordan, you should look into this, because I don't think this is functioning the way we expect it to function. [00:27:44] Speaker A: Hmm. [00:27:45] Speaker B: When I give that access to a customer, some customers may just want to see, for the heck of it, what they can get that bot to regurgitate, to spit out. And that could be dangerous. It can be dangerous from a brand reputation standpoint. It could be dangerous in that if it gives inaccurate information, what's the monetary value of that information, and am I going to be on the hook for that as an organization? And who do we hold accountable for that? Which is a really interesting question that I've explored before, with some really interesting possibilities. But I just think that you have to be cautious. I'm not saying don't use customer-facing AI, but be cautious about how you do that.
What I think for support in particular, if you have a global customer base, where AI does things that are really hard to replicate with people, is when it comes to language support. [00:28:58] Speaker A: Right. [00:29:00] Speaker B: So I think Klarna is maybe one of the better examples of this. I can't speak to the efficacy of the interactions that customers have with their AI support. What I know is Klarna is trying to support dozens of languages. And for a global organization to support dozens of languages, that means a lot of people. Especially if I want to have 24/7 support. Just having one person that speaks Czech is not enough, because I need to cover 24 hours a day, seven days a week, 365 days a year. So I need way more than that. And if I have a very small customer base that's Czech, that gets really expensive on a per-customer basis. And it's often a challenge for support centers and contact centers once they have that global reach. AI speaks Czech just like it speaks English, and it does it really well and it does it really quickly and it's always available. So I do think that there is a place. Another example is just instant translation. So we are seeing now the translation speeds. Look, we've had Google Translate for a long time. Google Translate is good enough in certain situations, but no one should rely on that for consistent translation. It just struggles with idioms. AI is way better at that. And so you can use that to support your humans who are actually offering the support, so that they can then work in those different languages. So there's some really interesting opportunities that I think allow the people who are already great at supporting your customers to become super at supporting your customers, or support customers that they couldn't support in the past. [00:31:13] Speaker A: Right? Yeah, absolutely. Yeah. I think that the thinking about AI so much has been how do we get this to deflect tickets? I think that's what.
Which goes back to this pressure likely coming from the C-suite: let's push down. It's about numbers, which I understand are important. They keep the lights on. Very critical for us to think about. But at the same time, are we quite ready to let AI do that? And I think in most instances we are not. But there's some space, like you just mentioned, with languages particularly, where I think there's a lot of room to support agents, have someone that can go along with them to help them do their job well, and then also to serve a customer base that might be a challenge due to a language barrier, et cetera. So I think those are great examples of where we could begin to utilize that, get more comfortable with it, and then maybe we can figure out what's next, instead of trying to get to the what's next way too early in the process. [00:32:14] Speaker B: Yeah. I will tell you one of the things that I find fascinating too. You don't see this as much in support as you do in contact centers that are B2C. [00:32:25] Speaker A: Right. [00:32:26] Speaker B: But in B2C, there's often this language support that I was just talking about. But even if I have humans. So I, I don't speak Spanish, Jordan. Un poco, un poquito. I don't. You know, I can order a coffee if I'm in a country where they don't speak English. [00:32:45] Speaker A: Right. [00:32:46] Speaker B: But I can't really do much that's very useful in the business world. If I listen to a phone call that's completely in Spanish between agent and customer, little tiny bits, maybe, I can pick up. But if I have a transcript in English, now, all of a sudden, I understand what this whole interaction was about. And if my agent is bilingual, we can still have a really valid coaching conversation about their performance. And I will tell you, as someone who has led a quality organization with bilingual agents.
I would always rely on someone else to provide input when we listened to a call that was in Spanish, because I just don't know. Now I can know. I can get a transcript and I can follow along and still hear things like intonation. Even though I am reading words in English and hearing the Spanish, I can still hear tone and demeanor and the customer's attitude and all of those things. So I do think there is, again, room for AI to really empower people in support, in service, in sales to do the things that they've been doing, just do them even better. [00:34:13] Speaker A: Well, as we kind of start to land the plane here, I would love to just take a moment. Any closing thoughts? Anything you would love to share with our listeners that we haven't talked about during this conversation? [00:34:24] Speaker B: Yeah, I'd like to just quickly circle back to our conversation about surveys. One of the things that I think is really exciting today is what has happened with sentiment analysis that we can do on a transcript, on a phone call, and the signals that we can pick up from that. For a business to be able to understand when things went south at the end of a conversation and be able to quickly go back and say, is this a customer I need to follow up with? Because maybe my agent didn't know how to help them. Maybe my agent was having a really bad day and they just didn't handle things right. Maybe this customer just needed time to calm down. The agent did a good job, but the customer just was too hot and they hung up, and I need to give them some time to cool off, but I want to make sure I come back to them. For those types of things, a survey is just going to get you a bad number. That's what's going to happen. [00:35:33] Speaker A: Yep. Right. Very true.
[00:35:34] Speaker B: But if I'm paying attention to the signals that they're providing me and then following up for that service recovery aspect, that's really powerful. And being able to do that before they publicly go to Google or Yelp or G2 or whatever platform that your company might get rated on, before they hit that one star or two stars, because there are problems with those rating systems as well. But right now, that's the world that we live in. If you can get in touch with that customer and have an opportunity for service recovery before they fire off that angry one-star review, man, that can save you not only a ton of grief, but it potentially means that you're not losing revenue to your competitors because your rating just went below them, right? [00:36:42] Speaker A: Yeah, absolutely. [00:36:43] Speaker B: So the more we focus on the data that we're already getting, I think the better off we're going to be. [00:36:52] Speaker A: I think that still goes back to this: we've been collecting so much data for so much time and doing nothing with it. It's like a drug. It's hard to get off the data coming in. What do we actually need to do with it? So I appreciate that thought. I agree that this, particularly sentiment analysis and thinking about this at a deeper level, is a great way to actually get progress from this feedback we're collecting and the things we're doing. Well, Rob, thanks for taking the time to talk with us. As we kind of wrap up here, we'd love to just share with our listeners, if they wanted to reach out or learn more about you or connect with you, how could they go about doing that? [00:37:29] Speaker B: Well, Jordan, you know exactly how they can connect with me. I'm on LinkedIn way too much, so everyone can find me on LinkedIn. I think I'm the top result if you just search for Rob Dwyer. I have a nice black and white photo with a light blue background. That's me. [00:37:47] Speaker A: Perfect.
[00:37:47] Speaker B: You can always visit us at happitu.com if you think that we might be able to help you. If you're in the contact center space, we'd love to talk about how we could help you there. But yeah, I'm always interested in talking to people on LinkedIn. So even if we can't help you, don't be afraid to just say hello and pick my brain. I love talking with people. [00:38:11] Speaker A: Well, I'll make sure to include those links down in the show notes below, as well as a link to your Next in Queue podcast, which, if you enjoy Table Service, I'm confident you'll enjoy the conversations over there. So I definitely encourage our listeners to head over there and subscribe and listen to those great conversations. Well, Rob, thanks again for joining us here at the table. [00:38:29] Speaker B: Yeah, appreciate it, Jordan. [00:38:32] Speaker A: Thanks for listening to the Table Service podcast. You can learn more about today's guest in the show notes below. Table Service is presented by Tavolo Consulting, hosted by Jordan Hooker, music by Epidemic Sound.

Other Episodes

Episode 5

March 25, 2025 00:32:07

Navigating the AI Revolution with Conor Pendergrast (Table Service 105)

Join host Jordan Hooker and Conor Pendergrast, a veteran customer support and success leader as they discuss how to navigate the AI revolution. How...


Episode

February 17, 2025 00:19:49

Breaking Silos and Driving Value Across the Customer Lifecycle (Lara Barnes) - Table Service 002

Join host Jordan Hooker, Principal Consultant @ Tavolo Consulting, as he discusses breaking silos and driving value across the customer lifecycle with Lara Barnes,...


Episode 7

April 09, 2025 00:27:56

Scaling support from day one and beyond with Neil Travis (Table Service 107)

Join host Jordan Hooker and Neil Travis, Head of Customer Experience at the Academy to Innovate HR and founder of Growth Support, as they...
