[00:00:00] Jennifer Hood: It feels like people are divorcing how they actually use Google from what they’re trying to do with SEO. Because we just talked about this with a recipe: if you found a recipe, you’d print it out, you’d save it, you’d move on.
Um, and we see this in other industries, that it’s really easy when you’re on the back end of things to start thinking really theoretical and trying to solve the whole problem and kind of forgetting the human element of what do we actually want out of it? What’s your goal when you search Google? How might you use Google that would trigger it to, um, not answer correctly and you refine your search or not to send you somewhere else?
I would be really surprised if the experience of everybody working in SEO is not quite similar to most of the customers’, when they back out of the, uh, the real theoretical working on it from the business perspective.
[00:01:09] Bumper Intro-Outro: Welcome to Endless Coffee Cup, a regular discussion of marketing news, culture, and media for our complex digital lifestyle. Join Matt Bailey as he engages in conversation to find insights beyond the latest headlines and deeper understanding for those involved in marketing. Grab a cup of coffee, have a seat, and thanks for joining.
[00:01:37] Matt Bailey: Jen, my first question to you. What do you think of the world of SEO?
[00:01:45] Jennifer Hood: I think the world of SEO has the exact same problems that every other industry that’s trying to understand their data has. And like every other industry, they feel like they’re the only ones facing it.
[00:01:57] Matt Bailey: That is a very, I would say politically correct way of approaching it. And, uh, great observation. Great observation. Jeff, I got to tell you, I so enjoyed the article you wrote.
[00:02:13] Jeff Ferguson: Oh, thank you very much.
[00:02:14] Matt Bailey: Um, it was, I, I mean, I felt like applauding after reading it because, I mean, it was a clinic on how to write an article. And, and so, my first thought was the editors over there at Search Engine Journal must have gone through some special consideration in order to publish an article greater than a thousand words.
[00:02:40] Jeff Ferguson: Yeah, it’s, it’s actually kind of funny. The, the, I mean, their, their minimum over there is a thousand words. That’s, I mean, that’s, that’s, that’s what they, they usually ask us…
[00:02:47] Matt Bailey: Oh, ok.
[00:02:47] Jeff Ferguson: …they, to shoot for. Um, but, uh, you know, that one ended up clocking in at like 4,600. And, um, I was, uh, I mean, Jen was there through come of it, she’s been through a lot of it, but she saw a lot of like drafts and things like that. And, and, uh, um, I was going to say, yeah, it’s, it’s running a bit long. I’m getting into like New Yorker territory at, at this point here. And, and, uh, I, I’m not sure if they’re going to, uh, ask me to like break it up and, and things like that. And, and, uh, you know, I spoke with Danny Goodwin, the, the editor over there, and, um, he had actually said something to the effect that, that, um, multipart articles actually don’t do as well. And so, you know, if, if I was okay with, uh, keeping it as the, as it was, uh, that was fine. Right?
And, and, um, and then the funny thing was, is that, uh, we even tried our best to kind of like roll it back a little bit. There was, uh, a couple times where, uh, you know, Jen and I, um, you know, uh, compared some notes on a few different things and said, “Okay, great. I’m going to, uh, I’m going to change this and I’m going to pull this back and I’m going to take this paragraph out entirely and, and I’m going to run it through Grammarly to make sure that I’m just not overwriting things and stuff like that.” And, um, I think, uh, Jen was surprised it came back and it was actually longer, uh, this, after I did all that and, and, uh, it was just because we had, uh, so much, uh, great information to work with, um, through all of this.
I mean, we, Jen and I worked on this, it felt like at, at least two months, if not longer, um, on the process of, of this and it, it was done through a lot of, um, interviews and we, you know, we, we occasionally talked on the phone, we did a lot over email. Um, and a, a lot of it was just, uh, some classic back and forth where, uh, I would send over, um, you know, a different study that I had seen in the past or, or recently, um, recent passes, what I was shooting for for most of these and, and, uh, just letting her kind of have at it, uh, without really, um, you know, saying, “Hey, here’s my, here’s what I’m worried about,” or anything like that. It was more like, “Hey, as a statistician, what do you just think of this,” right?
And, and she would come back with stuff and, and I would in turn reach out to, um, different people that were mentioned in the, um, in the different, uh, pieces like, uh, Rob Osby or, or, uh, Rand Fishkin or things like that and kind of say, “Hey, what, you know, what is your thoughts on these types of things?” And, uh, if they wrote back like, um, uh, (?) did, um, for Rob Osby, because he had, he had since moved to a different company, you know, they would send over other data and that data would end up, you know, causing more discussions and things like that. And some of them you can actually see in almost like a storyteller type format, uh, on some of these things where we just kind of, you know, walk through all of these things.
And, uh, the funny thing is, and going back to the length of the article, is that, uh, we probably could have written four or five more articles that looked just like it, uh, based on other data and other things that came back and other explanations and things like that and, and we, uh, well, I mean, I ended up like cooling it down to this to really kind of make sure that I was proving the big points that I wanted to land, um, for an article like this, so.
[00:05:51] Matt Bailey: To the listener, it’s going to sound like we start, we’re doing things a little backwards here. So, I’m going to get into that a little bit more, Jeff, but first of all, I, I want to thank you listener for joining us, uh, on the Endless Coffee Cup podcast and a real special episode today. Uh, there’s an article that we’re going to be referring to and it’s in the show notes. It’s over at Search Engine Journal and the, the title of the article is, “Do We Have the Math to Truly Decode Google’s Algorithms?”
And, uh, a good friend of mine, Jeff Ferguson, has written an extensive article, uh, really examining the claims of SEOs that, that claim to understand Google’s algorithm scientifically, uh, through studies, uh, through things like that. And, and Jeff also brought along, uh, Jennifer Hood, who is a statistician. Uh, tell you what, Jennifer, if you could introduce yourself and then pass it over to Jeff to introduce himself, as well?
[00:06:52] Jennifer Hood: Sure. Uh, so like you said, I’m a, a statistician. I have 15 years of experience working in the, the data analytics world. So, first working for Volvo, uh, dealing with our product mainly in North America, but also having responsibility globally. So, rather complex products when it comes to modern vehicles. And then in the last two or three years, I branched out and started my own consulting firm where I do analytics consulting for small businesses, uh, for, for my company, Avant Analytics, um, mostly companies that can’t afford an in-house analyst. And I also train people on analytics, how to get into analytics careers, how to incorporate more analytics in the work they do, uh, under my second business, The Career Force.
[00:07:38] Matt Bailey: Oh, it is a pleasure to meet you, Jennifer. And we should probably talk more later, uh, coming from, uh, you know, coming from a journalist background, but being forced into analytics, uh, mainly, mainly because I wanted to see where I was making money, uh, and after that, I fell in love with it. Uh, so, Jeff, hey, little bit of background, Jeff and I, I think we’ve known each other, I think we’re, oh man, we’re coming up well over 15 years now, I think.
[00:08:08] Jeff Ferguson: Yeah, at least. Yeah, absolutely. Yeah, uh, you and I met on the, on the road, uh, from my speaking appearances over the years and, and, um, I actually can’t even remember where the, the first time would’ve been. I mean, I, I have a feeling it probably would’ve been New York, um, where we, I think we bonded on our, our love of a good whiskey, but, uh, um, you know, we’ve since met up like literally all over the world, um…
[00:08:32] Matt Bailey: Right
[00:08:32] Jeff Ferguson: …since then, right? You know, like, uh, I think the last time we actually, uh, saw each other was in Scotland where we were both working at a search conference out there, um, ironically enough with Rand Fishkin who’s in this article, but, uh, um, who was, who was kind of brought in as a keynote speaker. Um, but, uh, yeah, and I, I think that was, uh, one of the funny coincidences, too, I think the last time we tried to do a podcast together we did it in Scotland and made the big mistake of trying to doing it in a pub. And, um, was just…
[00:08:59] Matt Bailey: During the Fringe Festival.
[00:09:00] Jeff Ferguson: Yeah, during the Fringe Festival. Yeah. So, it was just way too loud. I don’t think any of it was useful. So, we, uh, uh, it never, it never saw the light of day, but, uh, yeah. Anyway, uh, just to kind of take my, my turn at it, uh, my name is Jeff Ferguson. Uh, I am obviously an author for, um, Search Engine Journal, but my day job is actually, um, a partner and head of production for Amplitude Digital, which is a, a small, uh, digital, uh, agency out here in Los Angeles. Actually, we’re not small anymore. No, we’re, we’re actually a pretty decent size. Yeah. And then I also am a, an adjunct professor for, uh, UCLA, um, teaching classes on, uh, digital marketing and search engine optimization.
[00:09:39] Matt Bailey: Great. I, congrats to you, Jeff. That is, uh, I know you love teaching and, uh, I do as well, and that is so very cool to hear, uh, all that you’re doing. And then, uh, you know, getting your company sort of, uh, brought in, acquired by another larger company and doing well, so congrats to you. That, that’s, that’s awesome.
[00:10:00] Jeff Ferguson: Thanks a bunch.
[00:10:01] Matt Bailey: Well, I tell you what, like I said, this article, I, I’ll start first with your illustration to start, because like I said, this was a masterclass on how to write a good article. Uh, you started with an illustration and I love the concept of the gentleman scientist, and, and I have used this a couple of times now because it is such a great allusion to SEO, that there are people coming from so many different backgrounds, and, you, you know, very few grew up in the search engine world studying search engines and then became SEOs.
Um, you know, those of us that have been in it for decades now, we came from, you know, sales, uh, you know, journalism, some came from programming which had a little bit of a jump. Uh, but it so well describes the industry. And you and I have talked for years about the problems in the industry and what gives it that, that snake oil tarnish, uh, that, that has been consistently attached to search engine optimization.
[00:11:10] Jeff Ferguson: Yeah, absolutely. And I, and I, um, I think an early draft for this article or an early title for this article was, um, you know, can, can SEOs, um, kind of, uh, move themselves away from this hacker past, uh, that they have?
Um, you know, where, uh, in the beginning it really was this, this idea of, um, these kind of like mad scientist characters that were, um, you know, trying their best to, to figure out like how, uh, you know, Google’s algorithms worked and what they needed to do. And they were, uh, probably a lot more successful, uh, back in the early days of it because it was a less complicated, uh, algorithm. And, um, you could do, you know, some fairly simple tests and figure out that, “Hey, look, if you put the keywords, um, you know, you know, a bunch of times then, then you’ll show up better.”
Um, you know, the, the guy that, the, the godfather of the word SEO or search engine optimization, um, you know, he was working on a website for, um, Jefferson Starship and, and, uh, his, his client was all ticked at him for not showing up for their own name. And so, his first act was actually to go through and repeat the band’s name over and over again. Um, you know, and which is ironic because he, he basically did something that would get you, you know, thrown off Google these days.
Um, but, um, yeah, so we, we, you know, we had this, this type of beginning to it and, and this very snake oily kind of feel to it. And, and a lot of people that were taking, um, credit for a lot of things that would, uh, happen fairly naturally and, and things that, um, you know, Google was already kind of looking for. And over time, uh, Google got better. All the rest of the search engines just got better at figuring this type of stuff out, but we’ve still got this, uh, kind of snake oil kind of feeling to it. It’s just that now, um, those snake oil sellers have, uh, better tools and better data, uh, to kind of prove themselves, uh, throughout all this.
And that’s, that was kind of like the heart of, of a lot of this, but, you know, as somebody that, um, kind of makes his living, uh, in SEO, along with, you know, paid search and, and, uh, any type of digital media and things like that, we, we have to contend with both clients or, uh, other SEOs and things like that that appear, um, you know, with so-called data or so-called proof, um, that the methodologies that they’re using in SEO are based on fact. Um, and a lot of times they will point to these types of, um, SEO studies and say, “Hey, look, um, you know, it’s not up to me, uh, you know, Semrush did a, a study that said this,” or, you know, “Moz did a study that said this,” or whatever it is.
And while, I don’t know. Like I, I don’t want to make, uh, those companies out as complete charlatans and things like that. Um, but I think the, they’re not really taking on the responsibility of presenting these, these findings that they have in the, in the proper way. Um, instead they, they love to just kind of show them off as, “Hey, look what we found. We have this correlate of data, um, that, you know, shows this is related to this.” And, um, and then they kind of leave it there. Right? You know, and sometimes they, they do make the leap where they’ll specifically say, “Hey, look, based on this we can safely assume that, um, you know, this is a ranking factor,” right?
But, um, all too often, they, they kind of just put it out there. Right? It’s just kind of saying, “Hey, look, this is, this study says this,” and we should pay attention to that kind of stuff. And, and, um, you know, but they really don’t go into the idea that saying, “Hey, while it might mean this, it could also mean this,” um, you know, that kind of stuff.
And they, uh, you know, and as we kind of like delve into in the article, um, that, uh, you know, Jen took a lot of them to task based on the idea that even the way that they’re claiming that these things are related may not, um, be the strongest of correlation, either. And there, there’s, sometimes there’s really good reasons for that, which we, we discussed with a few other, uh, people that, that wrote some of these studies.
Uh, but for the most part, like, there’s so many other things wrong with how it’s actually done, um, that it just, it isn’t right. You know, it’s not, it’s not the way we should be doing this and, and, uh, thankfully so many SEOs, um, that read the article came back and said, “This is, this is great. I’m, I’m glad somebody finally said this out loud. And, and, you know, we’ve, we’ve been seeing a variation of this for years now, but it never really seemed to, to pick up.”
And, um, I guess the, the almost the, the, the better compliment, there were actually several better compliments, but, um, one of them was that the only people that, that seemed to be against the article were the people that were selling these, these type of studies to people who, uh, read it or…
[00:15:28] Matt Bailey: What?
[00:15:29] Jeff Ferguson: Yeah, exactly. And then, um, probably the greatest one of all was that, um, the Vice President of Marketing for Semrush, um, said in a Twitter exchange that, “Yes, we can do better. Look, we can do better on how we present this data. Um, and, you know, hopefully we can all work together, uh, to kind of make that happen.”
[00:15:46] Matt Bailey: That is good to hear. Um, because, and it’s not just the companies that are putting out this data. It’s that it also spawns hundreds of other companies that repeat the same thing because they rely on the studies that are done by these larger companies. And really, they take it as gospel. As soon as certain people in the industry put forth a study and say, “This is what it means,” there are hundreds to thousands of other SEOs that hang on every word, and they go and repeat it.
[00:16:21] Jeff Ferguson: Oh, absolutely. Yeah, I mean the…
[00:16:22] Matt Bailey: And then it just rolls on.
[00:16:24] Jeff Ferguson: Yeah. I mean, a good example of it, and one of the, the two big studies that we kind of covered in there, was a Rand Fishkin study, uh, at SparkToro, um, using information that, that claimed that, um, you know, more than 50% of Google searches, um, don’t get any sort of click, right? And, and it’s something where, um, you know, Jen kind of takes that to task very nicely, too, but the, but the interesting thing is that that information, um, you know, you’ll see it start off so many other articles, right? You know, like you see it on Search Engine Journal, on Search Engine Watch, all of these different places. You’ll see, um, things where it’s really statements like, “Hey, we all know that, you know, more than 50% of searches don’t get a click these days.” And it’s like, we…
[00:17:05] Matt Bailey: Right.
[00:17:05] Jeff Ferguson: …we actually don’t know that, right? And the, the, the even, uh, crazier thing about it is that information like that can actually get brought up in front of Congress. A, you know, Rand Fishkin study got used in a, in a congressional review in the early stages of, of them deciding whether or not, uh, Google’s a monopoly, right? And, and, um, senators brought this out and, and quoted it and things like that. And, and, uh, um, you know, everybody had a, a nice, you know, like, “Yay for us,” that kind of stuff, like, “What a great piece of data,” but as Jen has told us, that data actually wasn’t very valid, and, uh, you know, that’s just dangerous.
[00:17:40] Matt Bailey: You know, let’s talk about this a little bit deeper, um, because yeah. And I’m sure you’ve experienced this a number of times where I see something cited, I follow the link, and what I find is another article that repeats the same claim and cites somebody else. And you start this rabbit hole. And ultimately, you cannot find the source of the data, but it has been repeated so much that it’s now accepted as fact. And I’ve seen these in IBM reports, I’ve seen them in, you, you know, Fortune 100 reports citing a statistic that doesn’t exist anywhere.
[00:18:18] Jeff Ferguson: Right.
[00:18:20] Matt Bailey: And this is, this is part of that danger. And yeah, to get to a congressional record now, amazing.
[00:18:26] Jeff Ferguson: Yeah. Jen probably can speak to that a little bit too, but she, as she was doing some of the research on it, I would hand her a specific study and she would, um, she would start down that rabbit hole and come back and say, “Hey, I found this,” and I hadn’t even known about that. Like, I didn’t even know where that started, but, uh, but yeah.
[00:18:40] Matt Bailey: Well, Jen, what, what was your experience in this specific report and, and what Jeff brought out or what you brought out was the availability bias. Uh, so explain a little bit of that to us and, and also what you observed.
[00:18:55] Jennifer Hood: Sure. So, what I saw in most of these studies is exactly what you were just talking about, that one thing references another and studies that may have actually started out perfectly fine in explaining their context, um, get wildly misrepresented when they’ve been repeated two, three times through multiple iterations. They lose that context, which is so important and makes them actually have some value in the first place. They’ve lost it by the time they’ve been copied a half a dozen times and, and really lost everything around it.
I, I talked about a, several different types of bias. I’m not sure that all of them even made it into the article, but there’s so many types of bias that you can have with logic and with how you select your data. And they’re problems that even great mathematicians make, because we like to pick things that are easy for us to work with.
[00:20:00] So, availability bias is just that. We take the information that’s easily available to us, and we assume that it represents the entire population. So, in the case of, uh, Rand’s article, he took data from Jumpshot, which was a sample of data for just US searches. It excluded mobile and Mac users completely. Um, and…
[00:20:15] Matt Bailey: Oh wow.
[00:20:16] Jennifer Hood: …and so it ended up being less than 1% of all searches. Uh, but it, this was, went a step further than some of the other ones in him claiming that it was statistically significant. So, not just implying that this was okay data, that it’s something that we could see correlation in, but flat out stating it, it has statistical relevance. Um, so, availability bias is us just taking data that’s easy to access and assuming that it’s a good sample representing everything.
[00:20:48] Jeff Ferguson: Yeah. And, and, Jen, the, the other point I thought was really interesting, um, for that study specifically, was the idea that, um, I think you did the calculation, and we ended up pulling up, uh, some of the calculations that you did in the interview that I didn’t include in there, but, uh, we had said, “Hey, look, uh, in reality, the percentage of the population is probably closer to this.”
However, um, the, the real one that, that really nailed it down, I thought, was the idea that, given the fact that, uh, this was, you know, data that was just from, um, I think, Avast users, which is a, uh, virus-checking type software, um, that pulls this information, it lacks, um, being random, which is one of the cornerstones of, of, um, good data quality, and I, I’d love to hear more about that.
[00:21:38] Jennifer Hood: Yeah, so you want to have data that’s random and representative, and it’s really hard to get truly random data, because it, it’s just not that easy to take a whole population, not have it affected by anything, and pick out a certain number that ideally represent the whole. Um, so, when we talk about representative, we are looking at how well the sample we’re taking mimics what our entire population experiences.
So, of these, um, searches that they use as their data set, how well do they represent every search, this 1% they’ve selected? So, when you’re leaving out different groups of users, you’re probably leaving out people that are using it in a different way. We could, I could theorize that a Mac user is probably a slightly different demographic search wise than possibly a Windows user is.
I don’t have the data to say that for sure, but, but we know that the people that buy Macs have a little more disposable income, um, they’re probably a little more into the tech, so they’re probably looking at things, looking for things a little differently than, um, maybe your, your average 70 year old that is maybe one of the people that’s Googling “Google,” which is something that…
[00:23:00] Matt Bailey: Right.
[00:23:00] Jennifer Hood: …came up, as well, when we talked about number of searches is that our, Google is not a saint of a company, but I think we can also be fair to them and say, “Should we really expect them to be sending people anywhere else, um, for some of these simple things that, at least to me logically not being from the SEO world, don’t seem like there’s added value to going anywhere else. Um, but that’s the message of that article is, um, Google’s stealing all of this from everybody else rather than, “Hey, this is what it is, and it doesn’t look good for Google, but we need to do more to explain why this might be happening.”
[00:23:43] Matt Bailey: And that’s interesting because what jumped out to me when I was reading the article, I’m like, “Wait a minute, Avast sounds so familiar.” And you don’t have to go far. Avast is actually selling the data from people that use their antivirus software. Uh, and up until late 2019, uh, people were automatically opted in, uh, they were having this data collected on them without knowing about it. And then it’s being resold into an article that talks about how bad Google’s being. So, that whole string of logic was just blowing my mind as I went through it here.
[00:24:28] Jennifer Hood: Yeah. So, we referenced that study and I, if I remember correctly, Jumpshot’s not even in business anymore because of those sorts of things.
[00:24:38] Matt Bailey: Yeah. They were the one that was brokering the data collected by Avast on users that had not opted in. Um, they started implementing an explicit opt in and people stopped using it. So, I, I think that helps the availability bias a little bit there.
[00:24:57] Jeff Ferguson: Yeah.
[00:24:57] Jennifer Hood: Yeah, absolutely.
[00:24:59] Matt Bailey: Well, the other thing, and you brought this up very well, uh, both of you, is that the companies that are either commissioning or performing the studies have a, a natural vested interest in the data that’s produced from the study. Uh, I often laughed when, when Google, uh, produces studies that tell us about the power and the reach and the influence of video advertising, uh, you know, and it’s a commissioned study from Google, you know, and I’m like, “Well, thanks, Google. Yeah. Who benefits the most from this?”
Um, but we see this in the companies that are producing these reports, where, well, the natural conclusion is, “Our software helps you do this.” I, I guess, I’m sure you see this in so many other industries, Jen, that the company that commissions it just happens to benefit from it.
[00:25:52] Jennifer Hood: Yeah. And I think this is where it comes down more to packaging than anything because the studies themselves, most of them aren’t bad if they’re in the right context. It’s when they get transformed and packaged as, “This tells us something meaningful that we should take action on,” or, “This is the reason you should buy our product,” and then you look at it and, and really it’s not justification. It’s, it’s kind of clouding everything in enough statistical terminology that it’s hard enough for people that aren’t in that realm to understand that it sounds completely legitimate.
Um, so, it’s really the sales pitch around it that I take more issue with than anything, um, that these studies are being represented as things that in any way should inform a decision. That’s problematic. I think, I want to be careful here that we don’t kind of just write off every mathematician or every analyst…
[00:26:56] Matt Bailey: No. Yeah.
[00:26:56] Jennifer Hood: …who’s been involved in this because they may have been doing good things. And we saw this come up with, um, with one of the things that, I think it was, uh, Tom Capper was, had some slides in what we were doing and after the article came out and he looked into it, um, he started responding and saying, “What you’re saying, I agree with. I in no way meant that this is supposed to represent everything.”
But by the time that it got quoted and re-quoted and referenced, all of a sudden somebody else was holding this up as, “This is fact. We should make decisions off of this. This is why our product is better.” Um, and that becomes extremely problematic and extremely difficult then for anyone to trust the results, because then they question the statistics. They don’t necessarily question the packaging.
[00:27:47] Matt Bailey: That is a great, great observation. Um, yeah, I mean, my background is in journalism and I, I can only imagine my, my professor that I had in journalism, just the, the, the rage they must be feeling at the modern internet. That, because she preached to us over and over, editorial and advertising. The two should never meet. They should be, there should be a wall between the two, editorial and advertising.
And yet, like our modern internet and, and studies and research reports are now, like you said, Jen, it’s the context. The context of the research and the sales pitch are integrated to a point where they can’t be pulled apart.
[00:28:33] Jeff Ferguson: And that, I think that was, uh, one of the common responses we got on this stuff is, is that because we brought to the, the forefront a, a lot of the, uh, well, not a lot. Some of the responses that came back from people that actually issued these, these studies or whatever, um, you know, kind of said, “Oh, you just don’t get it because, you know, this, this information could be useful. It can spawn, um, you know, other experiments and other studies and things like that.”
And that we, we agree with. Like, that’s fine. That’s actually how this is supposed to work. Or, but it’s when it’s suddenly, um, kind of presented as this holy grail of, of how things work, then, then it, it gets more sinister in nature and, and gets really dangerous. And, and I mean, a really good example and, and Jen dives in this a little bit is the, the, uh, concept of domain authority, which is a, a metric that, that Moz uses and, and most of the other, uh, SEO tools have a version of that, um, that same kind of metric that’s out there.
And, you know, Google has said publicly that that actually doesn’t align with anything that they’re doing, but, you know, Moz specifically says domain authority is, is something where it’s supposed to replicate, um, what used to be like PageRank, basically. I mean, they, they give some nice language around that kind of stuff and, and, um, you know, but over and over again they have to keep clarifying, like, “Hey, it’s not a ranking factor. Um, you know, it’s something that we use wherever,” and, and that’s because this confusion is so widespread.
[00:30:00] But the, uh, the crazy thing is, is that that domain authority gets used to sell other things, right? And you’ve, you’ve probably crossed paths with them before where, uh, it gets used to sell domain names. It gets used to sell, um, you know, uh, guest posts on, uh, on certain websites and things like that. Um, it gets used to sell links and all these kinds of things, too. It, it is a way to kind of say, “Hey, look, this is a verifiable quantity, uh, that we can all measure.” But as, as, you know, Jen discovered, it’s, it’s a very weak metric, right? It’s not something that actually aligns very well with much of anything. So, like, why, why are we using it? Like, why is this a verifiable thing?
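Jeff’s point about a “weak metric” can be made concrete with a toy calculation. The authority scores and ranking positions below are entirely made up (not real Moz or Google data): even when a third-party score visibly correlates with rankings, squaring the correlation coefficient shows how little of the ranking variance it actually accounts for.

```python
import statistics

# Invented data: a hypothetical third-party "authority" score for ten
# pages, next to the position each page actually ranks at (1 = top).
authority = [55, 72, 40, 63, 48, 80, 35, 60, 45, 68]
rank_pos  = [ 6,  8,  9,  2,  3,  1, 10,  4,  7,  5]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(authority, rank_pos)
print(f"r   = {r:+.2f}")    # direction and strength of the relationship
print(f"r^2 = {r * r:.2f}")  # share of ranking variance "explained"
```

With these invented numbers r comes out around -0.63 (higher authority, better position), so r² is roughly 0.40: even a seemingly solid correlation leaves about 60% of the ranking variance unexplained, and says nothing about causation either way.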
And, and, like again, I don’t want to completely tear down all of these tools. They, they provide a lot of great data. I mean, I use, myself, I use Moz and, uh, Semrush and Ahrefs, right? Like I, I use all of them on my job on a regular basis for multiple things, but it’s something where, um, you know, I know to ignore certain things in there because I say, “Hey, look, this, this actually isn’t based on fact at all.” Um, you know, when it runs a report and says, “Hey, look, you know, your, your, uh, title tags are too long,” or, “You’ve got duplicate this,” or things like that, that, that’s useful. That’s something that actually, you know, I, I, you know, will use over and over again and report back to the clients or whatever, but in instances where we turn around and kind of say, “Hey, look, uh, we’re providing this service and it’s increasing your domain authority,” then it gets really weak and it’s not really what it’s intended for.
[00:31:24] Matt Bailey: Absolutely. Absolutely. Um, yeah. And, and that’s the thing. I mean, people who I think have a much deeper understanding, I know where that number’s coming from and, you know, that’s your own metric, it’s your own, uh, sort of index, but now you’re selling it as, “Well, this is what we’re going to measure everything by.” And like you said, things get traded based on that. Um, yeah, you’ve got an entirely new economy now, uh, and a new vocabulary and everything surrounds that rather than, uh, kind of the traditional things that people just know they should do.
[00:32:01] Jeff Ferguson: Yeah, absolutely. Yeah. And that’s the, that’s always been the core of SEO for me. I mean, it’s something where, um, you know, like, and, and I’m writing a book on the subject right now, and it’s something we teach at UCLA as well, which is the idea that like, hey, just, just about everything that, that is really like the core of SEO work are really things that you should be doing anyway, if you really think about it, right? There, there are some, truly, um, things that are like, yeah, I would call them like true SEO type functions where you really are doing things on your website that speak directly to a search engine. Uh, you know, like the schema.org code and things like that, and that’s all fine and good.
But most of the really big stuff that Google, um, will tell you that they’re looking for, um, are, you know, time-tested concepts of, of writing great relevant content, uh, on a properly built website that, you know, is, is fast and mobile friendly, um, and obviously just crawlable so they can get in there and look at it, um, and using, you know, public relations-style tactics to get links from other types of websites, um, you know, something that was already in place, which is why Google used it, um, because that’s how the scientific community, uh, used to, um, share information for so long, right? So, these are, these are things that, they’re not, they’re not earth-shattering, and there are some specifics to it on how to do those in a best practice sort of way.
Um, but where this gets corrupted is, um, all the people that, um, that have kind of like, you know, another type of bias called confirmation bias where they’re, they’re looking for, um, the easy answer on how to accomplish certain things, and so, when they see these types of studies that, that may not necessarily, um, you know, be sinister in nature, um, but if they see somebody come along and say, “Hey, look, um, you know, you’ll rank better if you have a certain amount of words in your article, or if you’ve got a certain, uh, if you’ve got a lot of, uh, Twitter followers or, or Facebook likes.”
So, things like that. Things that actually do have like a, a strong correlation to it, or at least some type of correlation to this type of information, but they’re not ranking factors. And Google will tell you, like, “Hey, that’s actually not a ranking factor.” Um, you know, we understand that you might see a correlation there, but it’s for the same reasons, right? You know, you’re, you’re doing, you know, you’re ranking very well on, on Google for the same reason that you are, um, you know, getting so much activity on social media because it’s a great article. Right? You know, they get, that’s, that’s kind of the heart of it, right? And, but that’s never brought up.
Instead, it gets presented as this idea of, you know, um, hey, hey, it’s a, it’s a direct ranking factor or it’s driving ranking, right? Which isn’t the way to kind of look at this stuff. And, um, you know, if you call out, you know, these people that present this data, um, usually the toolmakers, whatever it is, they’ll just kind of shrug and say, “Hey look, we never claimed this was a scientific study,” uh, you know, “We never said specifically that it was a ranking factor,” whatever it is, but they, they know this is how this information is, is getting used out in the world, and they do nothing about it. Right? And, and that’s, that’s just irresponsible as far as I’m concerned.
[00:34:47] Jennifer Hood: Yeah. The most blatant, the most blatant example of that is Moz having to clarify that domain authority isn’t a ranking factor. It seems, even being brand new to the world, it seems obvious to me that of course it wouldn’t be a ranking factor, but people are getting confused, and so that they’ve needed to clarify that they know that people are misinterpreting this information. And I say Moz because that’s the easy example that came up in some of these studies, but I’m sure they’re not the only one.
[00:35:17] Matt Bailey: No. And that’s, you, you know, unfortunately, I, I would say even in our media driven age, there are people who traffic in confusion and really just, you know, just mow people down, uh, with data that sounds good. It might have enough numbers in it to sound scientific-y. Uh, but, and, and that’s, and, and, and, you know, we see this in all areas of media. Uh, and then I think every industry has its own flavor of people. I, I think, Jeff, we’ve said it for years. SEO is art and science. It’s a mix of art because there’s, there’s expression, there’s creativity, but there’s also some science to it.
And I think a lot of these studies try to reduce it all to a number, to a science, to an index. And, and I think there’s a danger of that because every, it’s going to try and conform everyone, or everyone’s going to try and conform to what this latest study said. And to me, all it does is tell Google what to target, uh, if you’re trying to get rid of people who are trying to game the algorithm.
[00:36:27] Jeff Ferguson: Absolutely. Yeah, I mean, another great example of that is, is the, uh, and kind of goes back to what I was saying about, um, the word count in an article, uh, Gary Illyes, um, this morning, uh, on Twitter, uh, posted a conversation, um, that somebody had had with, uh, with one of their clients, an SEO client, or, and basically they said, “Hey, look, we, we need you to write this article about this subject and it has to be 2,000 words.”
And, um, the writer came back and says, “Why does it have to be 2,000 words?” And they go, “Well, for SEO,” and the, the, the, the writer goes like, “It’s an article on how to download a CSV file,” right? And he goes, “Boy, how, how on earth could this be 2,000 words long?” Right? Like, he had, uh, you know, it’s like, “No, just do it. You just make it what it is.” And you can see this kind of like confusion get really prevalent. I mean, um, you know, as a, as a…
[00:37:15] Matt Bailey: Oh, I’m not trying to bake a cake, you know?
[00:37:16] Jeff Ferguson: Yeah, exactly. Well, that’s even a good, that’s even another good example of it is that if you’ve ever seen, um…
[00:37:22] Matt Bailey: Oh…
[00:37:22] Jeff Ferguson: …like recipe pages, right? It’s…
[00:37:24] Matt Bailey: Absolutely. I hate it.
[00:37:25] Jeff Ferguson: Yeah. You don’t get a recipe, you get the life story of the, of the author and how their, their parents used to make it when they were kids, right? And I’m going like, “What on earth is all this, this muck?” And then I go like, “Oh, you know what? That’s, that’s my fault.” And I’ve actually had to explain that to multiple friends…
[00:37:40] Matt Bailey: Yeah.
[00:37:40] Jeff Ferguson: …where, you know, it’s, it’s not my fault specifically, but it’s one thing, I go, “Hey, look, it’s the, it’s the damned SEO industry’s fault because, you know, this, this information gets passed around and somebody says, ‘Oh, if I really want to rank very well, like, I have to have this certain amount of words.’” No, but, but here’s the funny thing to it is again, this is where that correlation gets bizarre, right?
In the process of this person writing so much about that, they end up writing a really great article, right? In, in, in some forms, it may be a bit long, right? But in the sense that, that they really do end up illustrating that they’re an expert on that topic, at least that, that recipe and, and they give good examples of it and they, they use, um, you know, variations of words and things like that, all things that Google looks at, um, to kind of say, “Hey, look, this is a good example, uh, of a good answer for this query,” right? It provides a certain amount of authority and, and trustworthiness and things like that that they look for. And it ends up, you know, solving that kind of problem for it.
But for most people, what, what they looked at is that, “Hey, look, I have a word count I need to get to.” And, and it’s like, if you can say that with half as many words and still pull off everything else, then, then do that because otherwise you’re just frustrating your audience.
[00:38:48] Jennifer Hood: You bring up recipes. I love to cook. One of the, the things I’ve noticed cropping up the last few months is everybody’s putting a “Jump to Recipe” button at the top of their page.
[00:38:56] Matt Bailey: Yes.
[00:38:57] Jeff Ferguson: Right.
[00:38:57] Jennifer Hood: So, they know people are getting fed up with it and just leaving because it’s a, they’ve got these nonsense articles that don’t relate.
[00:39:07] Matt Bailey: Yeah. And, and that’s, that’s what I was going to ask is at what cost, you know? Yes, you’re doing all the right things. You’re writing a great article. I don’t want an article right now. That’s, and, and, and that, actually, Jen, I, I’d like for you, you, you know, for the audience here, causation and correlation, uh, this, this is invoked numerous times throughout the article. Uh, if you could explain the difference between that and, and how people can apply that to, you know, their assumptions about some of these studies or, or even what happens in their business.
[00:39:41] Jennifer Hood: I think most people generally understand causation and correlation are different, but maybe not in a very concrete way. So, my favorite example of this is the correlation of ice cream sales and drownings. Um, so there’s a, a very strong correlation…
[00:39:58] Matt Bailey: I shouldn’t laugh at that.
[00:40:00] Jennifer Hood: …there, there’s a very strong correlation between ice cream sales and drowning. As ice cream sales go up, drownings go up. So, in theory, I could predict the number of drownings based on ice cream sales. That doesn’t mean ice cream sales are causing drownings. There’s a third factor that is much more likely to be impacting both of them, and that’s temperature. It gets hotter, we want something cold to eat, we want to go swimming, and as a result, both go up.
Um, so I think that’s a really simplistic way that seems really obvious to all of us, but what ends up happening is as we get closer to things that we don’t really understand but we’re trying to put it in boxes or put it into a formula that really makes sense of it, we’re more likely to accept things as causation when it may or may not be a meaningful correlation or maybe it’s a meaningful correlation, but it doesn’t necessarily mean that the first factor that you’re using to predict what happens is what you should focus on. There may be something else entirely that’s driving both of that.
So, that somewhat goes back to what Jeff said about social media ranking and websites. So, um, like he said, people could look at it as social media presence and, um, good follower counts on social media drives website views, but it could just be that you’re doing everything right or doing a lot of things right that are driving both of them higher.
[00:41:30] Matt Bailey: Absolutely. Yeah. There, it’s tricky when you get into that. And I know one of the first things that comes to mind is when Google Trends came out and I think I spent an entire week digging into Google Trends because it was fun to see these trends and see them spike every year around the same time, and now I got to figure out what that is. What’s causing that? What, what’s happening with that? And, and sometimes you come up with a, I think, a legitimate connection there. Uh, other times it’s just fun to try and figure out why that would happen.
Yeah. But I think, as humans, we naturally try to connect events and it gets hard when we’re trying to predict them, but yet, when we’re looking at past data, sometimes we make connections that aren’t necessarily the right connections.
[00:42:15] Jeff Ferguson: Yeah. And the, the, again, the interesting thing about correlation and causation, that’s yet another, um, kind of statistical definition that is kind of paraded out in a lot of these, these studies where they’ll lead with that, right? They’ll lead with the explanation of, of causation doesn’t equal, uh, cause, you know, correlation doesn’t equal causation and, and it’s important to remember this kind of stuff, whatever it is.
So, again, it gives them a certain air of authority, uh, by the fact that they actually say that kind of stuff up front, yet, then they still, um, present the rest of this data without, uh, really analyzing both sides of the argument, right? And, and so they end up, you know, causing the real problem that they, they said that they were trying to avoid by doing it, but, you know, and again, it gives them a nice out. It gives them a nice place to say, “Hey look, we, we warned everybody that this, this doesn’t mean this. We warned everybody this isn’t a scientific study.”
I mean, the, the caveats before some of these articles, um, are ridiculously long, right? You know, and, and that’s if they put them up front at all. A lot of times they end up at the bottom of these things, or they even have to, like, add them later.
Um, you know, that was something again we saw, um, with, uh, Rand Fishkin’s original SparkToro, um, study on, on clicks was that there was, um, things added later after the initial publication of the article, uh, that, you know, basically said, “Hey, look, I got some good questions about this data. And, um, you know, obviously it, it, um, you know, there’s no way to tell if people, um, search on something and then, um, you know, don’t find what they’re looking for and then go search for something else. So, that would be a big part of that group. Um, we obviously, you know, we can’t track people that click over to mobile apps. We can’t, um, you know, there’s a big chunk of people in there that are finding things that are just direct answers. Uh, you know, somebody looks up and goes to like, ‘Hey, what time is it in Helsinki?’ You’re just going to get an answer, right?”
And, and, you know, Jen touched on that a little bit as well, but it’s something where, you know, he came out later on and clarified that kind of information. But somebody that was reading that, that article for the first time when it first got published wouldn’t have seen all that, and, and instead, they kind of carry forward with the idea of like, “Hey, look, Google is keeping, you know, more than 50% of the clicks.”
[00:44:19] Matt Bailey: Well, and, and where it really comes out in the article is, uh, when it comes to bounce rate. Uh, those few paragraphs just, my mind exploded because there, there’s this underlying correlation that people are making of bounce rate and content quality. And that connection just makes my head explode because, you know, we just brought up recipes. Uh, am I there, you know, once I find my recipe, I find it, I print it, I’m off the website. I’m, you know, I, I, and so, is that an indicator of low quality or is it an indicator that I found what I wanted, and I’m gone?
Uh, and, and I’ve got numerous data studies to show, like one example was, uh, I was working for a company that made a sugar substitute and the website averages an 80% bounce rate consistently. Well, it’s because their primary searches are people looking for how many teaspoons of the substitute equal a cup of sugar or a, you know, a, a tablespoon of sugar. They’re looking for the conversion table. And those are 80, you know, 80% of the searches. So, naturally they land on the page with the conversion table, they have their answer, they’re gone.
And so, when someone says bounce rate is an indicator of quality, I, I, you know, immediately I’m trying to say, “No, it’s not.” It’s, you, you’re making a, an assumption there. Now, you know, and, and talking with numerous people at Google about that, you know, those are some of the obstacles to making that connection in the algorithm. But yet, I see this consistently in the writing and some of the analysis that’s brought forward.
[00:46:04] Jeff Ferguson: Yeah, absolutely. It, I mean, it’s, uh, you know, it’s, it’s really kooky. I mean, we still see it. Um, the whole concept of all of what’s usually called like the engagement, um, metrics, uh, that are around things where you’ve got time on site and, uh, bounce rate and many other factors that, that come with this. Um, again, they’ll bring back in the idea of social shares and, and a bunch of other factors, as well, and they’ll, they’ll specifically say, “Hey, look, these are, um, you know, most likely, uh, signals of authority or, uh, signals of a variety of different factors,” and you really have to do your best with it.
And, you know, all too often people will, um, sometimes they’ll, they’ll destroy perfectly great, uh, articles that are very useful, um, in the name of actually improving things like bounce rate. Uh, sometimes they’ll do other sinister things like, um, you know, make it so it’s impossible to back out of an article, um, so that it ends up tricking things. I mean, there’s, there’s so many different ways to do this rather than just saying, “Hey, look, you know, bounce rate might actually have some usefulness here as far as this, but let’s make sure we’re comparing it to the right things.”
And in, instead of like comparing it to themselves, which is what they should do, like you said, there, you know, there, there is a way to look at this from the stance of, “Hey, look, there’s an average for the website. Let’s compare this one piece of content to that average and see where we fall,” right? Instead, they try and find this universal, uh, industry standard, uh, type bounce rate that’s out in the world, uh, that says that like, “Hey, what’s a good bounce rate?”
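Jeff’s suggestion, benchmarking each page against the site’s own average rather than an external “industry standard,” can be sketched in a few lines. The page paths and bounce rates below are invented purely for illustration:

```python
# Hypothetical bounce rates by page (made-up data for illustration).
pages = {
    "/recipes/prime-rib": 0.82,
    "/blog/knife-skills": 0.55,
    "/conversion-table": 0.88,  # answers a quick question, so a high
    "/about": 0.40,             # bounce rate may be success, not failure
}

# Benchmark against the site's own average, not an external number:
site_avg = sum(pages.values()) / len(pages)

for page, rate in pages.items():
    delta = rate - site_avg
    flag = "above site avg" if delta > 0 else "at or below site avg"
    print(f"{page}: {rate:.0%} ({flag}, {delta:+.0%})")
```

The point is the comparison baseline, not the numbers: a page well above the site’s own average is worth a look, but only alongside what that page is for (a conversion table is supposed to be a quick in-and-out visit).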
I mean, you see it up on Quora constantly where people are asking the same questions, and they ask very similar questions about other types of ratios, such as like, “What’s a good click-through rate? Uh, what’s a good cost per click? What’s a good,” whatever. And, you know, you have to break their, their hearts and basically say, “Hey, look, it’s the one that works for you,” right?
It, it, it is not something where, look, you can go out and look, and I, I know, um, you know, uh, Larry Kim and, and his, uh, one of his first companies, um, uh, WordStream I think, whatever it is, still puts out this industry average click-through rate, um, type of presentation every year. And, and it actually does break it down by industry. So, you can actually see things like, “Hey, look, this is the average click-through rate for automotive,” or whatever it is, and people live or die by this number. Right?
And when in reality, it, it’s not even that specific. I mean, I, you know, I mentioned a really good example of it is, uh, the automotive category. Like, what are we talking about with automotive? Are we talking about individual parts? Are we talking about cars? What type of cars? You know, would the click-through rate for BMW be the same as the click-through rate for the Toyota? Right?
I mean, it’s something where you can’t make those kinds of big leaps, but people do all the time, and then if they’re not reaching it, they, you know, they kill themselves doing it and they call themselves a failure, or sometimes they even get fired for not actually reaching these goals, and other times, if they’re doing better than the average, uh, they kind of kick back their heels and say, “Hey, look, I’m doing a great job here,” while meanwhile, like, they’re not selling anything, right? You know, like that’s…
[00:48:53] Matt Bailey: Right. Right.
[00:48:54] Jeff Ferguson: …that’s the problem with this type of data and it gets abused constantly.
[00:48:58] Matt Bailey: Wow. Jen, you had, I think, a lot to say about that, that Backlinko study, um, about bounce rates, that it seemed to actually go different ways depending upon how you looked at it.
[00:49:09] Jennifer Hood: Yeah. It, it seems to go different ways and I think what’s striking me listening to Jeff talk about this is again, very similar problem to other industries that it feels like people are divorcing how they actually use Google from what they’re trying to do with SEO. So, it, because we just talked about this with a recipe. How would you go and if you found a recipe, you’d print it out, you’d save it, you’d move on.
[00:50:00] Um, and we see this in other industries, that it’s really easy when you’re on the back end of things to start thinking really theoretical and trying to solve the whole problem and kind of forgetting the human element of what do we actually want out of it? What’s your goal when you search Google? How might you use Google that would trigger it to, um, not answer correctly and you’d refine your search or not to send you somewhere else?
I would be really surprised if everybody’s experience that’s working in SEO is not quite similar to most of the customers’ when they back out of the, uh, the real theoretical working on it from a business perspective.
[00:50:20] Matt Bailey: I think that’s an amazing point that you just made that when they’re analyzing Google, they’re, they’re, yes, they’re analyzing completely different behavior, uh, than they would exhibit themselves personally. I think that is just, that, that I, I, I had to mute the mic because I was just laughing at that. That, it is so true, um, you know, and, and in the industry, we have our own jokes of, you know, when you’re presenting to someone and like the CEO says, “Well, that’s not how I use a website,” or, “That’s not how I use that.” And, and we, you joke about that all the time, but yet, you just brought that up that as the industry, we’re doing it.
[00:50:59] Jennifer Hood: Yeah, but that’s everybody. Everybody says, “That’s why, that’s how I’ve always done it,” or, um, “We don’t do it this way because…” It’s just our natural hesitation to change things. But when we’re in work mode, I think in everyday mode, too, but we really want to have everything be scientific and it’s rare that everything is scientific. There’s usually something that we don’t understand or something that adds variation that, uh, maybe we could figure out, but it’s just not worth the amount of time that it would take to actually figure out every factor, um, that may come into play.
[00:51:36] Jeff Ferguson: Right. And that’s the, yeah, and that’s a, a concept, um, uh, that Jen and I talked about, uh, we didn’t include in the article but it’s, it’s called emergence and it’s, um, it’s a, it’s a, it’s a really big, um, kind of like thought experiment, I guess, in the industry, um, uh, that’s used in multiple industries right now and, and the whole idea is, is, uh, something classic we’ve all heard probably our, our whole professional life, which is the idea that, um, things can be greater than the sum of their parts, right?
And, and that’s, that’s just, I mean, it’s been around forever. And, um, from a statistics standpoint, that, that exists as well, where, when you’ve got, you know, incredibly complex systems, uh, like a Google algorithm or the human body or a car or things like that, there’s, there’s going to be factors to where, um, they’re going to show up and they, they could be weak in nature when you look at them on an individual basis, which is something that, that, you know, Jen did highlight in some of these studies where, “Hey, this is, this is weak and, and, um, you know, it, it shouldn’t be trusted.”
And we, we did get some pushback from that. And for, and it was a very valid pushback, um, from a, a very smart, uh, guy named Russ Jones, um, that worked for Moz at one point, and I think he’s with another company now. And it, we go like, that’s a, that’s a very valid point and I wish it’s something that would get brought up in these studies more, which is the concept that, “Hey, look, there’s, there’s going to be a valid reason why some of this information’s weak.”
Um, that said, right, you, you can’t use that as a crutch, uh, for some of this data, right? Some of this time, you, you, there’s still certain, um, standards when it comes to the information that, um, you know, things can still be, uh, really weak, right? It can be far too weak that it’s, it’s useful at all, right? And, and Jen found, um, one of those, I think it was in the analysis of, uh, domain authority and, and how it was dealing with inbound links and, and, uh, um, I, again, I thought that was amazing, too, where, um, you know, it was the correlation to, I think it was like the, the first, what, 5 or 10, uh, results on a, on a page and, and how domain authority, you know, stated specifically that it was an extraordinarily weak correlation, um, uh, metric wise, and that it was stronger later on.
And, um, this kind of goes back to the, the study, uh, by Rob, uh, Ousbey that we, we talked about before, where they kind of use that as a way to basically say, “Hey, most likely these top ones have more to do with engagement than they do links, right?” But again, you, you go down the same rabbit hole where it’s, it’s kind of like, “Well, no, because that could just as, as easily mean that, that, you know, they’re, they’re getting more engagement because they are the first 10 links.” Right?
You know, like, but it’s not something that’s actually being discussed when this information is presented, so people just kind of like fall back on the idea of, we’re going, “Hey, I have to, I have to get more engagement, so I rank better.” It’s like, “Well, no, you’ll, you’ll get more engagement because you rank better.” Right?
You know, like, you know, and, and there’s nothing saying that you shouldn’t try your best, too, to make your article more engaging. Right? You know, like you should take the time to write a great title and meta description tag so that it’s more compelling for your audience. Um, you know, you should be writing stuff that’s interesting for people in the first place, right? You know, like, that’s your job as a writer…
[00:54:38] Matt Bailey: Oh yeah.
[00:54:38] Jeff Ferguson: …to kind of create stuff for it, right? But like, it’s something where, you know, but they want it, again, to be this kind of mechanical, uh, kind of solution to it, which is like, “Hey, look, I’m going to try and do this, and as a reward I’ll get this in search engines,” and it’s completely backwards. Right? It’s, it’s something where everything that we’re doing in SEO we should be doing because it’s the right thing to do. It’s the right thing, you know, uh, as a marketer, it’s the right thing as a website builder, it’s the right thing as a public relations person.
It, you know, it’s like the right thing to do for a job. These are our jobs and the, and the, you know, going for higher quality is just that what we should be striving for, and as a bonus, we rank really well in search engines, right? Like that’s the, that’s the order of operations for these type of activities, but SEO has had that backwards forever, uh, as a way to kind of make it sound, um, like they’ve, they’ve figured out something that nobody else has.
[00:55:28] Matt Bailey: You know, and you’re dead on, Jeff. I mean, the more statistical studies, the more things like that, it wraps people around, you know, “These are the 300 things I need to do,” and it takes the focus off of, “Did I answer the user’s question? Are they happy?” I, I mean, that’s, that’s what it comes down to. “Am I making someone happy with what they see? Am I giving them what they need?” And instead, it’s focused on, you, I, and Jen, I have to say, I love the quote that, you know, a coin flip does a better job than some of these studies because they’re focusing on such, such meticulously isolated things.
[00:56:06] Jeff Ferguson: Yeah. She almost didn’t want to include that one. We actually went back and forth on that, but, Jen, you know, I thought your answer on there was, like, you really, you know, you saw it in print and you went, “Uh oh. I’m rethinking this now,” but, uh, but yeah…
[00:56:18] Matt Bailey: Oh, no. It was, it was wonderful. It was wonderful. And, and because we’re, these studies focus on one or two elements. Um, and, and Jen, I mean, I, I would love for you, the paragraph’s in the article, but what would it take to create a valid statistical study of Google’s algorithm?
[00:56:41] Jennifer Hood: A valid statistical study would take things that I think would be nearly impossible, um, partly because we just don’t see this kind of rigor and, um, this level of work going into things that aren’t life and death, like medical drug trials. Um, but we would need a sample of data that is statistically representative of the entire population. And whether that’s the US, if you’re primarily focused on how you, uh, rank in the US, but maybe it’s beyond that. Maybe you’re interested in English-speaking, um, countries or English-speaking search results, because that kind of separates into a whole ‘nother thing. What about other languages, and how do they rank, and are the factors different?
Um, so you need enough of a sample. You need to be able to control factors at different times so that you could keep everything the same and change one thing and see how it adjusts, how it impacts the final result. Uh, you would need Google’s algorithm to stay the same long enough to do that, which my understanding from Jeff after I said, “They must constantly be updating because they likely have a lot of people working on this.” Um, it sounds like they’re updating once a day or more in some cases. Um, so you would need a, a lot of focus on that. You’d need to be able to control for variability, um, and even then, it would still be a struggle to name everything as a causal factor.
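The one-factor-at-a-time design Jen describes can be sketched as a toy experiment. The scoring function below is entirely made up for illustration (it deliberately ignores word count, echoing the earlier point that word count isn’t a ranking factor); in a real study, the function is a black box that, as she notes, won’t even hold still long enough to probe:

```python
# Toy controlled experiment: hold everything fixed, vary one input at a
# time, observe the output. The scoring function is invented; a real
# study would be probing an opaque, constantly changing algorithm.
def toy_rank_score(word_count, load_ms, inbound_links):
    # Pretend scoring: rewards links and speed, ignores word count.
    return inbound_links * 2.0 - load_ms * 0.01 + 0 * word_count

baseline = dict(word_count=800, load_ms=1200, inbound_links=40)

# Vary a single factor against the baseline and measure the change:
for factor, new_value in [("word_count", 2000),
                          ("load_ms", 400),
                          ("inbound_links", 80)]:
    trial = dict(baseline, **{factor: new_value})
    delta = toy_rank_score(**trial) - toy_rank_score(**baseline)
    print(f"{factor}: {new_value} -> score change {delta:+.1f}")
```

In this toy setup, inflating word count to 2,000 moves the score by exactly zero, which is the kind of conclusion a controlled design can support and a correlation study cannot.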
Um, but you would also likely need to separate things in segments that I didn’t see come up in the studies, which is, um, I would think the intent of how people are searching would matter. I know I get different search results if I search for, uh, “How to do this,” versus, um, a recipe for prime rib, um, versus, “What cut is prime rib?”
And so, you need to separate those factors because they’re likely a part of the algorithm, and separating them would let you tell whether they actually matter or not. Maybe my theory’s wrong. Maybe it just happens to be how the result turns out, but, um, you would have to control all of those different things. Uh, and at the end of the day, I would be curious. I, I’ve speculated that the results you get from Google may not be the same results I get from Google on the same search terms.
[00:59:07] Matt Bailey: That was going to be my next question is knowing that Google is attempting to personalize results, I mean, that just throws everything haywire, doesn’t it?
[00:59:18] Jennifer Hood: Yeah. So, then it’s, what are you measuring? And I would guess probably an incognito search. Um, I, I do make YouTube videos and so I have some experience on the SEO side for YouTube videos and trying to rank for the YouTube algorithm, um, with some success, but again, it’s not a science. Um, but with that, the results I get if I search for data analytics are very different than what you’re going to get because I search for analytics topics all of the time. Um, and you may get more introductory things, um, things that are maybe people that, um, have a good match overall, probably have a little more of a following, those sorts of things.
[01:00:00] So, that, that machine learning and that constant personalization thing is going to make this even more difficult going forward. That it’s, we get further away from one truth and there are really many things that are true. Um, so ultimately, what is your goal of the work you’re doing? Is the goal just to rank or is the goal to drive traffic or business or sales? Um, and I think that’s maybe from the outside looking in what becomes more important is, are you able to accomplish your end goal of doing all that work, um, or is your goal just to be a, a Google search star versus a, a YouTube star? Um, that, that’s what comes to mind when I look at it. Ultimately, what do any of those rankings get you?
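The personalization problem Jennifer and Matt describe — the same query returning different orderings for different users — can be illustrated with a minimal sketch. The reordering rule, result titles, and user histories below are all invented for illustration; no claim is made about how any real engine personalizes.

```python
# Toy personalization: re-rank the same result set per user by boosting
# topics the user has engaged with before (a hypothetical rule).
def personalized_order(results, user_history):
    return sorted(results,
                  key=lambda r: user_history.count(r["topic"]),
                  reverse=True)

results = [{"title": "Intro to Analytics", "topic": "beginner"},
           {"title": "Advanced SQL Window Functions", "topic": "advanced"}]

analyst = ["advanced", "advanced", "beginner"]   # searches analytics daily
newcomer = ["beginner"]                          # first-time searcher

print([r["title"] for r in personalized_order(results, analyst)])
print([r["title"] for r in personalized_order(results, newcomer)])
```

Even this two-line rule is enough to break the idea of "the" ranking for a query: any study measuring position has to say whose position, from what search history, under what conditions (e.g. incognito).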
[01:00:53] Matt Bailey: That was a great, great, excellent point, Jen. Thank you. And it’s, it’s great to hear, uh, some outsider views of the SEO industry. And I, I think, you know, it’s funny just the past 20 minutes we keep coming back to quality, of are you giving people information that they want? And that, that is really a true indicator of success rather than some of these things that are being measured. It’s that, I, I would say the immeasurable quality of are people happy with what they see and the information they’re getting? Did they go away saying, “It answered my question,” and then go on with their day? Uh, it really sometimes it seems to be very, uh, inconsequential what we do considering the larger picture.
[01:01:42] Jeff Ferguson: Absolutely.
[01:01:43] Matt Bailey: So, I, I would say, Jeff, if you could sum up, uh, you, you know, why’d you, why did you write this article? Uh, you know, or, I don’t want to keep you any longer here. We’ve been, we’ve been going quite a while, but who were you writing this article for?
[01:01:56] Jeff Ferguson: Um…
[01:01:56] Matt Bailey: Who was in your mind?
[01:01:57] Jeff Ferguson: It was, it was multiple people, I think. I mean, we’re, you know, at the, the broader level of it, I guess the, the most basic level is for, um, you know, people that, um, may have a certain amount of stock in, uh, these types of studies and a certain amount of belief in them. And, and hopefully it change, kind of changes the mind of people that are maybe just getting into SEO or, you know, that, that, uh, you know, have to work with organic search as part of their business and things like that. And, and, um, you know, they at least see this type of thing and understand that like, these, the, this information that’s being presented may not be in their best interest and, and something that they need to take with a, a certain grain of salt.
Um, but, you know, a secondary reaction was, uh, to hopefully, um, you know, cause some change, uh, in the industry. Right? You know, and, and this was something that, that came up, um, in some of the Twitter discussions about this stuff where, um, you know, we had a few people that, that were really, uh, trying to tear the article apart from different angles and things like that. And, um, in the end I said, “Hey, look, you can, you can spend time trying to tear me apart or trying to tend, you know, tear Jen apart, um, or you guys can do better. Right? You can do better at what you’re actually presenting. We can, we can do a lot better in presenting this type of information.”
Again, that, it goes back to the, the, the head of marketing for Semrush that did say, “Look, we can do better. This is something we want to do.” And, you know, she gets the, the grand prize for, like, how to handle this situation. Right? Well, you know, everybody else, um, has either been completely quiet on the matter or, you know, spent all their time, uh, you know, trying to make me look like a fool, uh, on Twitter, you know, like it’s, it’s something where, um, I go, “Look, no, you can do this. You can embrace and learn something from this experience and, and move forward, um, and, and actually do something about it.”
[01:03:44] Matt Bailey: Great. Great. I really appreciate it. And, uh, Jen, thank you so much for taking part in this article and, and also in the podcast, uh, having your perspective on things has been very, very helpful, uh, and, and very pleasant to talk with you, as well.
[01:04:00] Jennifer Hood: You’re welcome. Thank you for having me. It’s been, it was very interesting to work on this. As a, a small business owner myself, I was hoping to find the opposite of what we found because I was really hoping that I would just have a checklist at the end of all the things I needed to do and I could go do them and it, it would demystify this whole process, but, um, it just confirmed why all of the things that I thought I was trying to follow before just weren’t yielding this automatic amazing result. Um, so it’s been very interesting to work on and, uh, to see some of the challenges and really dig into the numbers behind what’s being shared.
[01:04:41] Matt Bailey: You and millions of other people, Jen, would just love to have that checklist. I know. I know. That’s, and that’s why I think we keep saying it’s an art and a science. Uh, there are things you absolutely have to do, and then there are times where you just have to, uh, you, you know, go on your knowledge of your audience and just go out on your own and make those decisions.
Um, but thank you, both of you. I’ve really enjoyed this time. Uh, I, I wish you the best and Jeff, I look forward to more articles and, uh, also look forward to your upcoming book, as well. Uh, you know, I, you and I have talked how many times about similar things within this industry that either need to be cleaned up, uh, but to Jen’s point, you know, as a business owner, “Just give me some straight information,” and, uh, that’s something you and I have talked about numerous times. So, I’m looking forward to more stuff from you about that.
[01:05:36] Jeff Ferguson: Thanks a bunch, Matt.
[01:05:37] Matt Bailey: Alright. Thank you so much, listener. It has been another edition of the Endless Coffee Cup. Uh, please follow the links, find Jeff and Jen, follow the link to the article, read it all. You’ll love it. And, uh, follow both of them, uh, through their social media on LinkedIn. Uh, I challenge you because you’re going to get great information. Jeff and Jen, thank you so much for joining me. I really appreciate the time.
[01:06:03] Jennifer Hood: Thanks Matt.
[01:06:03] Jeff Ferguson: Thank you.