Bonus Episode: Defining Success Before Launching Anything with Tim Wilson
Jun 12, 2025 · 2 min read


Tim Wilson is the Head of Solutions at facts & feelings, a consultancy focused on helping organizations put their data to productive use through clear thinking, aligned teams, and actionable insights. A seasoned analytics leader, Tim brings over two decades of experience across enterprise BI, agency strategy, and digital analytics to help brands translate complexity into clarity.

Before co-founding facts & feelings, Tim led analytics practices at multiple agencies, advised Fortune 500 companies on digital data strategy, and built out BI infrastructure at a $500M B2B tech firm. He’s also the co-author of Analytics the Right Way: A Business Leader’s Guide to Putting Data to Productive Use and co-host of the long-running Analytics Power Hour podcast.

Whether clarifying what “success” really looks like before a new feature launch or helping teams choose the right level of analytical rigor for a given decision, Tim focuses on making data work for the business, not the other way around. He offers a practical framework for leaders overwhelmed by dashboards, and a philosophy for analysts who want to be more than just report generators.

In This Conversation We Discuss: 

  • [00:39] Intro
  • [01:15] Shifting from in-house roles to agency work
  • [02:16] Highlighting the cost of overbuilding tech stacks
  • [04:36] Pushing back on data-only decision making
  • [07:13] Avoiding narrow ad metrics that mislead growth
  • [10:08] Using AI to scale low-effort interactions smartly
  • [12:38] Translating ideas into testable hypotheses
  • [19:02] Differentiating high-credibility opinions in UX
  • [20:00] Using split tests to validate costly changes
  • [21:14] Skipping tests for clear conversion blockers
  • [23:32] Filtering user recordings for CRO opportunities
  • [26:13] Using logic when data can’t prove causality
  • [29:39] Measuring what actually matters in performance

Resources:

If you’re enjoying the show, we’d love it if you left Honest Ecommerce a review on Apple Podcasts. It makes a huge impact on the success of the podcast, and we love reading every one of your reviews!

Transcript

Tim Wilson

Organizations have this tendency to chase more data when they should be thinking more about what problems they're trying to solve.

Chase Clymer

Welcome to Honest Ecommerce, a podcast dedicated to cutting through the BS and finding actionable advice for online store owners. I'm your host, Chase Clymer. And I believe running a direct-to-consumer brand does not have to be complicated or a guessing game. 

On this podcast, we interview founders and experts who are putting in the work and creating  real results. 

I also share my own insights from running our top Shopify consultancy, Electric Eye. We cut the fluff in favor of facts to help you grow your Ecommerce business.

Let's get on with the show.

Chase Clymer

Hey everybody, welcome back to another episode of Honest Ecommerce. I'm your host, Chase Clymer. And today, I'm welcoming to the show Tim Wilson. Tim is another Columbus Ohioan, coming to us as the co-founder and head of solutions over at facts & feelings. Tim, welcome to the show. 

Tim Wilson

Thanks for having me. 

Chase Clymer

Well, I know there's at least 8 people that listen to this show from Columbus. So let's shout out where we met: the DAW, I believe is what it was called. A meetup that happens about once a month. 

Tim Wilson

It's a Data Analytics Wednesday. Been running for 17 years here in Columbus.

Chase Clymer

Absolutely. And before we get into what I saw there, because it's what we're going to talk about today, give us a quick background on your history in the space. 

Tim Wilson

Sure. So I'm approaching a quarter of a century in the analytics space. I cut my teeth in the world of digital analytics, actually when I was still living in Austin, which still has a good chunk of my heart and soul. After the first 10 years or so of my career, I wound up in analytics. 

And that was in-house, then that kind of moved to the agency world. I've been consulting for the most part for the last decade or so, largely working with organizations across a range of industries and company types, not actually specific to Ecommerce, though I certainly work with Ecommerce clients, really helping them figure out how to put all this data they're spending all this time collecting to productive use.

So that's my background. 

Chase Clymer

Absolutely. So where we crossed paths was obviously at that meetup, and you were touting your new book. 

Tim Wilson

Yeah. So my book is Analytics the Right Way: A Business Leader's Guide to Putting Data to Productive Use. The new book and the new company happened semi-simultaneously, and they're both motivated by a similar deeply held belief, based on observation from working with a number of clients, that a lot of organizations are getting very, very obsessed with gathering the data, collecting the data. And the more data you collect, the more complicated it is to manage it.

And there's this kind of death spiral, with more vendors out there saying, add this other tooling to your tech stack. So the book and the company were kind of a reaction to that: we have a lot of data. Most organizations have enough data, and it's clean enough. We're pushing to get organizations to say, we need to spend more time thinking about our business and actually putting that data to use, and putting that data to use doesn't mean buying more tech or adding more resources or adding more people to it. 

That's kind of the premise of the book. I'm a co-author on the book; I wrote it with Dr. Joe Sutherland, who I worked with for a couple of years. We met when we were working together. I like to say that he's more credentialed, he's younger, he's better looking, and he's more articulate than I am. 

So it was kind of a humbling experience, but he did a stint in the Obama White House. He's got a PhD from Columbia. He's the founding director of the Center for AI Learning at Emory University. So he does bring the data science clout to the book. He and I connected so well because we feel very, very similarly that organizations have this tendency to chase more data when they should be thinking more about what problems they're trying to solve.

Chase Clymer

Absolutely. And the name of your consultancy is facts & feelings. You've got to break down that name a bit for me here, because facts are very analytical. There's objectivity there. But then the feelings side of it: shouldn't we be letting data help us make these decisions and taking the feelings out of it?  

Tim Wilson

You would think. And I would say the digital analytics space is my world, my people. Even just recently, I was at a conference with people I've known for years, and they're like, feelings, coming from Tim Wilson? You're kind of a cranky curmudgeon. That seems weird.  

But the reality is, the way we look at it is that there has been this over-indexing towards the data, this idea that the data has this objective truth. Analysts like to run around quoting W. Edwards Deming, the famous statistician who brought statistical process control into the world of manufacturing. There's this quote attributed to him: in God we trust, all others must bring data. I feel like analysts and even a lot of Ecommerce people and marketers like to bring that up. 

They set up this false dichotomy of, you have an opinion, it doesn't matter, because data is the be-all and end-all. There are cases where that matters. But what we believe at facts & feelings, and what I've experienced again and again in my career, is that we do want people's opinions, because their ideas and opinions come from experience and creative thinking and thought. 

That should be complementing the data. This idea that if you just hook up to Google AdWords, then your paid media spend is going to automatically be optimized, it's not healthy to think that way, because it doesn't account for the value that human creativity and human thought bring to the process. So that's really where the feelings part comes from: the organizations we see being effective with data have hard thinkers and smart people, and they say, we need to tap into those ideas as much as we need to tap into the data. They need to really complement each other and not be treated as one being vastly inferior to the other. 

Chase Clymer

Yeah. And even going with your Google AdWords analogy, if your goal is to pay the least amount for clicks, that's probably not the most qualified traffic. If you're trying to get the highest return on your ad spend, that's maybe not a repeat buyer that's good for the lifetime value of your business. Looking at these things through such a narrow lens may not get you the best results. And that's where the feelings come in, and even to extrapolate further from there, human beings taking a look at this stuff, not AI, not algorithms, to really help understand it, because computers are dumb. They only do exactly what you tell them.  

Tim Wilson

And if you think about it, if you just say, I'm going to try to outsource my creative thought to the Google algorithm, then that's pretty easy for all your competitors to do as well. Fantastic user experiences and great marketing ideas come from somebody who was creative and said, I have this idea. I want to try it out.

They should be trying it out in a way that they can say, yeah, I want to use data to validate whether it works or not, but I don't want to be beholden to some external technology, which by the way is highly motivated to show me that the money I'm spending with them is actually returning results. 

Yeah, and that's on the AI front. I also have a podcast, the Analytics Power Hour.

And the other founding co-host of that, Michael Helbling (I don't think he'd claim to have originated this), is the one who I've heard say that AI will take you from zero to average very, very well. 

Because AI is kind of standing on the backs of all of the collective data that's out there, it's definitionally not going to be helping you really innovate when it comes to doing a lot of things. So if I'm a really, really experienced marketer, I can certainly use it, and I want to use it for ideation. I'm not naysaying AI. But when a lot of people get really impressed with AI, it's because, look, I can't draw, and look at this image I generated. It's not the people who are saying, I'm an artist with a vision who labors over my canvas.

I went into AI and, gee whiz, look at this cool thing that I did. So yeah, that's a little mini rant, but I'm absolutely not an anti-AI person. It just needs to be recognized for what it is. 

Chase Clymer

Yeah. I am going to thank your co-host for that, because I always say AI helps you go from zero to one. But you are correct: if AI is just an amalgamation of all the stuff it has consumed, it, by definition, will be average. 

Tim Wilson

Yeah. The way I look at it, if you want to have an online presence and you have customer support, a call center that's responding with chat, or even a human being, maybe getting to average at scale is good, because these are relatively low-effort interactions. So there are lots of really, really good use cases to say, I don't have a chatbot on my site, and if I can get one that can tap into some corpus of information that we have and provide a decent level of service, then providing an average chatbot experience absolutely could make a lot of sense from an analytics perspective. 

I would strongly say, be really clear on what that idea is, what problem you think you're trying to solve with it, and be really, really clear on how you're going to objectively measure whether it's actually working. And that's the other part of AI, or lots of new features, where I see missed opportunities: someone gets excited about the idea, so they roll it out, and then they'll either dive into the data themselves or they'll come back to an analyst and say, hey, I did this thing. It was cool. Can you tell me if it worked or not?

And what was missed was any upfront definition of what success looks like. How are we going to know if this worked? Which sometimes means, oh, we should roll this out in an experimental way, like an A/B test, so we can truly measure the incremental lift. In other cases, it may just be, no, we just need to decide on some volume, riffing on that chatbot example. 

How many people need to actually use this and at the end of it report that their issue was resolved and that they were pleased with how it was resolved. And then we're just going to hold ourselves to that. That has to be figured out ahead of time, which gets back to human thought and human kind of alignment within an organization. We're investing in this thing. Why are we investing? Like what business problem, what business challenge are we trying to address?  

And how are we going to agree? Let's agree now on how we're going to know whether it was successful or not. Because if we wait until after we've invested and after people have poured in blood, sweat, and tears into doing it, it's really, really hard to after the fact say, did it work or not? It's like, if we don't have an objective way to  determine that, that we've all already agreed to, it gets really kind of personal. 

You don't want to tell somebody, congratulations, you worked weekends, you were up late at night to pull this thing off, and it was a technically challenging thing to do, but it turns out it's just not delivering the results we expected. That starts to feel like a judgment of the person, as opposed to saying, hey, we tried it. We all thought this was going to work. But you know what? It actually didn't. It's easier to say, we appreciate the effort, we appreciate the execution, but you know what? We were all incorrect about the results we thought it would deliver. 
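
For listeners who want to see what that upfront definition of success can look like in practice, here is a minimal sketch. It is not from the episode or the book; the visitor counts, conversion counts, and the 0.5-point lift threshold are made-up placeholders. It simply checks whether an A/B test cleared a success criterion that was agreed on before launch.

```python
# Minimal sketch (not from the episode): checking an A/B test against a
# success criterion agreed before launch. All numbers are made-up placeholders.
from statistics import NormalDist

def ab_test_lift(control_conversions, control_visitors,
                 treatment_conversions, treatment_visitors):
    """Return (absolute lift, two-sided p-value) from a two-proportion z-test."""
    p_c = control_conversions / control_visitors
    p_t = treatment_conversions / treatment_visitors
    # Pooled conversion rate and standard error under the null of "no difference"
    p_pool = (control_conversions + treatment_conversions) / (control_visitors + treatment_visitors)
    se = (p_pool * (1 - p_pool) * (1 / control_visitors + 1 / treatment_visitors)) ** 0.5
    z = (p_t - p_c) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_t - p_c, p_value

# Success defined upfront: at least a 0.5-point lift in conversion rate, p < 0.05
lift, p = ab_test_lift(480, 10_000, 545, 10_000)
print(f"lift = {lift:.2%}, p = {p:.3f}, success = {lift >= 0.005 and p < 0.05}")
```

The point is not the statistics; it is that the threshold in the last line was written down before anyone built anything.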

Chase Clymer

Absolutely. To pivot a little bit here, when I saw you speak the other day, one of the things that you had mentioned that stuck in my brain was this concept of ladders of evidence. And obviously, this is part of your book. I'm going to follow up asking, what else can we learn from the book? But quickly, can you break down that ladder of evidence for us? 

Tim Wilson

Sure. In the book, one thing we talk about is using data to measure performance, and that's the world of KPIs; there's a whole part around how you set targets effectively for a KPI and so on and so forth. But there's this other part, which is validating hypotheses. So as an analyst, when somebody asks me a question or asks me to pull some data, I'm always trying to get to what's the underlying idea. 

Let's translate that idea into a hypothesis and make sure that when we're articulating that hypothesis (and there's stuff in the book about how to do all of that) we have a path to action. If we can validate this hypothesis, or in technical speak, fail to invalidate it, then what are we going to do? That's all upfront work to say, what is the decision we're trying to make, or the action that would come out of validating this hypothesis?

There's a tendency to say, oh, we've done all of that. Now, what is the way to validate this hypothesis? And the ladders of evidence say there are always multiple ways to validate a hypothesis. Kind of the dirty little secret of analytics, which I wish wasn't a secret, is that we never know perfectly, absolutely, if something is true or not. If we look at the way scientists work with the scientific method, they're never just 100% sure, right? They kind of max out at 99% sure of anything. And on the business side, it's a similar idea: if we want to validate this thing, we can maybe validate it very, very quickly with some weak evidence, which would be anecdotal evidence. 

And that's kind of the lowest rung of the ladder of evidence, anecdotal evidence. It's easy to dismiss anecdotal evidence. But the fact is, I can learn a lot if I'm looking at my checkout process and I think it could be better. There is nothing wrong with me going around to a handful of friends and saying, hey, can you just go through my checkout process and tell me what you think of it? That's not scientific. It's anecdotal. 

But if they all say, why do you not give me the opportunity to say that my shipping and billing address are the same? If I get that feedback and then logically apply it, where I thought there was an opportunity to streamline, that's not strong evidence, but it's evidence, and compared to no evidence at all, anecdotal evidence is still useful. If we move up from anecdotal evidence, the next rung up on that ladder, with stronger evidence, is descriptive evidence. 

And that's the data that falls out of our digital analytics platform, the stuff we get from Google AdWords or from Meta or from wherever, these millions and millions of rows of data that we have that we can build models off of. And because organizations have invested so much time and energy into gathering all this data, there's a tendency to say, all the answers are there if we just run the right models on them. 

And descriptive evidence definitely can be stronger than anecdotal evidence, but it also has a lot of pitfalls in it. There are things like confounding bias, there's sampling error, there are various forms of pitfalls with descriptive evidence. Certainly in an Ecommerce context, when we're looking at data collected over time, it is really, really easy to get a strong correlation between two variables and be led to a conclusion, when really they're both just being affected by time. They're both trending upwards, and they're not directly related to each other. 

So that's descriptive evidence. At its best, it can be correlative. It can never be causal and yet it gets treated that way. The top rung on that ladder of evidence is scientific evidence and really that is controlled experimentation. In the world of Ecommerce, that's generally A-B testing.

In the world of marketing, there can be geo lift testing, kind of incrementality testing. That is the one place where we can start to get to causality, because there's some scientific rigor in it. That's the strongest evidence. And the point that we make in the book, and that I've made throughout my career, is that the goal is not to get scientific evidence for everything.

The goal is not to live on any one of those rungs. The goal is to say, what are the stakes and the scale of the decision that I'm trying to make, the hypothesis I'm trying to validate? And, based on the operating environment that I'm in, which one of these rungs do I want to live on, and how do I live on that rung as appropriately and effectively as possible? 

Chase Clymer

Yeah. And if we're going to look at this with Ecommerce and a conversion rate optimization lens, which is the worst term ever because you can... 

Tim Wilson

But we all use it. 

Chase Clymer

Optimize for a lot of things. But I had two questions here. One is more of a lightning round: I want to ask where these types of evidence gathering would fall, which bucket is the best one to put them in. The first example would be, where would an expert's opinion land? Someone does a teardown or an audit of your website's UX. Where would that fall? 

Tim Wilson

That's definitely anecdotal evidence. But there are ranges of anecdotal evidence: asking my mother to evaluate somebody's website versus asking Jakob Nielsen or somebody who's a UX expert. Those are both anecdotal evidence, but applying judgment, some anecdotal evidence has more credibility than others. That would definitely be anecdotal, though. 

Chase Clymer

Gotcha. And then by taking the ideas that come from that, the hypotheses you inferred, and running split tests on them, it goes to that next level. 

Tim Wilson

Exactly. Well, that'll take you up to science. If you run a split test on it, it takes you to scientific evidence. So that's a great example. 

Somebody looks at it and says, this aspect, this accordion fold in your checkout, is a terrible idea, and here's why I believe that with my expertise. And you'd say, okay, but if I change it, it may take some significant investment. Is there a way for me to have an MVP that I can run as an A/B test? Because I don't want to go make that investment if it doesn't really hold up and have the lift that I want. 

It's one level of validation to get an expert's opinion. But if you say, this is going to mean an ongoing investment or a significant investment, and I want to make sure it either really moves the needle or at least doesn't make me slide backwards, this may be the case where I want to do a split test and actually get a good handle on saying, yes, I have high confidence that I'm making the right decision here. 

Chase Clymer

Yeah, I like how you explained it there. If it's going to take a lot of investment, whether time, effort, money, etc., to make that change, maybe find an MVP and run the split test. Now to the feelings thing: when someone points something out on your website and you're like, yeah, that is wrong, is it valid to just make the change? 

Tim Wilson

Yeah, absolutely. Just do it. And I think that in the CRO world, there are increasingly people saying, yeah, that's just a go-do. A form field that doesn't validate, it's a bug fix, or it's just, yeah, this is just obviously broken. 

Chase Clymer

You don't need to test that that's broken. It is broken. You can see it.

Tim Wilson

The one exception is if you're in an environment where somebody's going to say, it was definitely broken, we definitely needed to fix it, but how much of an impact did it have? That's only if you can predict that somebody's going to ask, we fixed this broken thing, what was the lift? Then it may be worth considering a test, but that's more because there's some organizational factor that's going to require that you...

Chase Clymer

Micro-managing?

Tim Wilson

Yeah, maybe. Because even with descriptive evidence, it's why having a statistician involved is really useful, if possible, to do a pre-post: we made a change at this point, now how did things actually change going forward? There's a lot of messiness in this. You can't just say, let me look at a line chart. 

My hope is that I've made this change and there's going to be a big step-function change. It rarely is that dramatic. So then you wind up looking at a chart and saying, I made this change. I know when I made the change. 

But if you take something like a holiday or back to school, there's a bunch of stuff going on, a bunch of external factors like what season it is, that are going to influence the results. If you made this change while all these other factors were going on, it's almost impossible to tease out the impact of that change. But if you made that change in an experimental fashion, then it doesn't matter that it's a holiday or back to school. 

Chase Clymer

Yeah. You're going to see a lift the week of Christmas regardless. 

Tim Wilson

Yeah, exactly. 

Chase Clymer

Okay. I've got three things here: heat maps, user recordings, or even paid user testing.  

Tim Wilson

Those all fall under anecdotal. Those are great examples, like, okay, for the second edition of the book. Those are all really useful. They can be painful to go through, but I think the platforms that provide those are getting better and better at aggregating them. You can have an aggregated heat map that's a hundred users and that's still... okay, I'm going to backtrack a little bit. A heat map may be considered descriptive if you're gathering it at sufficient scale.

If you're doing user testing, that's almost always anecdotal, but can be really useful. 

Most qualitative data is anecdotal, yeah, I think that's a fair way to put it. You still want to gather it rigorously and be asking the people who are your target audience for those user tests. So that was heat maps and user testing. There was a third one, the session replay type. 

Chase Clymer

And I think that's worth clarifying for the listener. A user test is where you're paying a group of people to do things on your website. 

Tim Wilson

Yep. 

Chase Clymer

And I would argue they're usually better internet users that sign up for these things, whereas user recordings are all the dumb-dumbs on your website using it the wrong way.  

Tim Wilson

Yeah. And there's a lot of useful information there. From a data storytelling perspective, when you integrate it, we see people moving from this step to this step on the website. Now let me filter that down to people who did something goofy. How many recordings do I have of that? The recording will show where they were clicking, and you can be like, why do they seem to be hanging out there, scrolling up and down and up and down? So that definitely requires a lot of human observation. I'm just watching their behavior, and I can only watch one at a time.

And if I can see 2 or 3 people who do the same thing, and I can logically explain what I think is going on, that is still very, very, very much anecdotal. It can lead to a hypothesis as to why that's happening and an idea for fixing it. And then you can actually run a test to say, does that remove that type of behavior?  

Chase Clymer

Yeah. 

Tim Wilson

So yeah. So we came up with a lot of things for anecdotal, and then in the middle was descriptive. Where would we find that descriptive evidence in the Ecom world? Definitely the digital analytics platform is the big descriptive data source. It's data we already have. It's already been collected. It feels really beefy, we feel like we have this truth, because it's reporting on what's happened from the past to the present. So in a lot of ways, I'd say that's the easiest to go to, because it's got the feeling of heft. It represents the people who came to the site and allowed themselves to be tracked, but not everyone, so there's a little bit of sampling bias. There are some people at the margins you're going to miss, and you may have AI agents or bots mixed in with that. So that's the most common data.

But I think that's also where the whole industry took a wrong turn when digital came onto the scene. And it was like, oh my God, we have more granularity comparing a website to an in-store experience. We have so much more granular data. And there was a lot we could do instinctively with that. But then we headed down this path of thinking that gives us  like all of the answers. 

That's if we treat it as being too much of a truth. I feel like the CRO world, the split testing crowd, is kind of always pushing back against all of the gaps that there are in that data. So it's useful. I'll say, in the book we go through some of the most common pitfalls. I've certainly fallen into those pitfalls myself.

We've got some examples showing how easy it is to stumble on those. But the goal is not to make people scared of using that data. Again, if traffic to my website is just generally growing over time, it is very easy to take any two metrics, put them against each other, and make it look like they're correlated. But they're correlated because there's another factor, time, that is actually causing the correlation. Even through holiday peaks and valleys, even through back to school, they just keep moving together. 

It's like, no, there's all this other stuff going on that makes them move together, but they're not actually related to each other. And there are techniques for teasing apart the time component and saying, are these actually related to each other? But that still doesn't tell you whether one causes the other just because they're moving together, which means you can either run a test and see if they are causally related, or you apply feelings, human logic. 

Can I give a really strong explanation of why I think one of these two things is causing the other, and how confident am I in my feeling that that's the relationship? 
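
To make that time-trend trap concrete, here is a minimal sketch. It is not from the episode or the book, and the metrics and numbers are invented. It shows how two independently generated, upward-trending metrics can look strongly correlated, and how one simple way of teasing apart the time component (comparing week-over-week changes instead of raw values) makes most of that apparent relationship disappear.

```python
# Minimal sketch (not from the episode): two independently generated metrics
# that both trend upward over time look highly correlated in their raw form;
# differencing (week-over-week change) strips out the shared time trend.
import random

random.seed(1)
weeks = range(104)  # two years of weekly data
sessions = [1000 + 8 * t + random.gauss(0, 40) for t in weeks]           # grows over time
newsletter_signups = [200 + 2 * t + random.gauss(0, 15) for t in weeks]  # also grows, independently

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def week_over_week(series):
    return [b - a for a, b in zip(series, series[1:])]

print("raw correlation:       ", round(pearson(sessions, newsletter_signups), 2))  # very high
print("correlation of changes:", round(pearson(week_over_week(sessions),
                                               week_over_week(newsletter_signups)), 2))  # near zero
```

Neither number proves or disproves causality on its own; as Tim says, that still takes an experiment or a well-reasoned, honestly held explanation.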

Chase Clymer

Absolutely. Go on. I'm assuming your book is on Amazon and everywhere else. 

Tim Wilson

Amazon, Target, Walmart, Bookshop.org I'm going to butcher that. The independent bookstore one. 

Chase Clymer

Well, let's say I go and find your book and I read it. What are some of the other headline topics that I'm going to learn about? 

Tim Wilson

There's one thing that I love to talk about, the potential outcomes framework, which sounds super academic and not helpful. But to me, it's at the core of some of the ways that help us think about what the data is and is not telling us. 

It talks about counterfactuals, which kind of blew my mind five or six years ago when I came across it. The core of the book, which I touched on a little bit earlier, lays out three distinct ways that we use data. We've talked mostly about validating hypotheses, but there is a lot to just measuring performance in an objective and useful and meaningful way.

It's the reason that counting impressions and CPM is not nearly as useful as actually looking at something like ROAS or actually getting to conversions. But there's a lot that organizations tend to stumble on when trying to meaningfully measure the outcomes of the work that they're doing.  

There are some misconceptions about data that I think are also useful to call out, like the way that the MarTech and AdTech industrial complex is set up to push organizations to just build increasingly complex data technology and data infrastructure. It's understandable. It's not like they're evil. It's just everybody acting in their own self-interest, and collectively it adds up to, let's all drink this.

So those are some of the biggies, and it's written in a way that's not super, super serious. It's definitely not super technical. I was writing it with a guy who has a PhD in, I think, political science, so we certainly could have gone off the rails. There are lots of little footnotes he had to put in, kind of to protect his professional reputation, saying, what we've described is accurate enough for the point that we're trying to make. 

But yes, we recognize that x, y, and z. So we've gotten feedback that people are entertained by the footnotes and the tone and some of the examples that try to help people think about data in a productive way without feeling like they have to  get a degree in statistics. 

Chase Clymer

Absolutely. Yeah. I always appreciate when I'm trying to learn a subject and they dumb it down for me. I'm not necessarily saying they dumb it down, but just make it more fun to read and easier. The business-adjacent books that I recommend most frequently are the ones that weren't a slog to read. 

Tim Wilson

Yeah. And there are also illustrations in it that are all meant to be silly examples. That has been very gratifying. There's a fellow who I have learned a lot from in the space. I won't name him, but he can be a little hard to follow. He's a delightful human being, and he will patiently explain again and again and again. Part of my career has been taking what I've learned from him and finally distilling it down in a way I can explain to a broader group. His wife started reading it and was messaging me. She's a CEO at a small company. And she was like, oh my God. 

She's like, I've been married to this guy, and now I'm understanding some of the stuff he's talked about. I was like, okay, this is great. 

Chase Clymer

Absolutely. The point of mastery is when you can take a complex subject and distill it down for people like me that don't understand anything. Tim, is there anything I didn't ask you about today that you think would register with our audience or that they would enjoy? 

Tim Wilson

You know, I think you covered the bases pretty well. I'm afraid if I said something else, I would head down a rabbit hole and undermine any credibility that I've built up over the last 25 minutes. 

Chase Clymer

Absolutely. Where do you hang out on the internet? If I'm listening to this, I want to learn more about you, learn more about facts and feelings, go check out the book. What should I do? 

Tim Wilson

So I am very findable on LinkedIn, even though I have one of the most generic names ever. So I should know this. I think TG Wilson is the handle or the base of the URL. I'm on Blue Sky these days, a lot more than that other platform. Connect with me on LinkedIn. Check out factsandfeelings.io. That's our site. But I'm pretty easy to find and love to connect with people. 

Chase Clymer

Awesome. Thank you so much for coming on the show. 

Tim Wilson

All right. Thanks for having me. 

Chase Clymer

We can't thank our guests enough for coming on the show and sharing their knowledge and journey with us. We've got a lot to think about and potentially add into our own business. You can find all the links in the show notes. 

You can subscribe to the newsletter at https://honestecommerce.com/ to get each episode delivered right to your inbox. 

If you're enjoying this content, consider leaving a review on iTunes, that really helps us out. 

Lastly, if you're a store owner looking for an amazing partner to help get your Shopify store to the next level, reach out to Electric Eye at electriceye.io/connect.

Until next time!