An Agency Workflow for Google My Business Dead Ends



There are times when your digital marketing agency will find itself serving a local business with a need for which Google has made no apparent provisions. Unavailable categories for unusual businesses come instantly to mind, but scenarios can be more complex than this.

Client workflows can bog down as you worry over what to do, fearful of making a wrong move that could get a client’s listing suspended or adversely affect its rankings or traffic. If your agency has many employees, an entry-level SEO could be silently stuck on an issue, or even doing the wrong thing because they don’t know how or where to ask the right questions.

The best solution I know of consists of a combination of:

  • Client contracts that are radically honest about the nature of Google
  • Client management that sets correct expectations about the nature of Google
  • A documented process for seeking clarity when unusual client scenarios arise
  • Agency openness to experimentation, failure, and on-going learning
  • Regular monitoring for new Google developments and changes
  • A bit of grit

Let’s put the fear of often-murky, sometimes-unwieldy Google on the back burner for a few minutes and create a proactive process your team can use when hitting what feels like a procedural dead end on the highways and byways of local search.

The apartment office conundrum

As a real-world example of a GMB dead end, a few months ago, I was asked a question about on-site offices for apartment complexes. The details:

  • Google doesn’t permit the creation of listings for rental properties but does allow such properties to be listed if they have an on-site office, as many apartment complexes do.
  • Google’s clearest category for this model is “apartment complex”, but the brand in question was told by Google (at the time) that if they chose that category, they couldn’t display their hours of operation.
  • This led the brand I was advising to wonder if they should use “apartment rental agency” as their category because it does display hours. They didn’t want to inconvenience the public by having them arrive at a closed office after hours, but at the same time, they didn’t want to misrepresent their category.

Now that’s a conundrum!

When I was asked to provide some guidance to this brand, I went through my own process of trying to get at the heart of the matter. In this post, I’m going to document this process for your agency as fully as I can to ensure that everyone on your team has a clear workflow when puzzling local SEO scenarios arise.

I hope you’ll share this article with everyone remotely involved in marketing your clients, and that it will prevent costly missteps, save time, move work forward, and support success.

Step 1: Radical honesty sets the stage right

Whether you’re writing a client contract, holding a client onboarding meeting, or having an internal brand discussion about local search marketing, setting correct expectations is the best defense against future disappointments and disputes. Company leadership must task itself with letting all parties know:

  1. Google has a near-monopoly on search. As such, they can do almost anything they feel will profit them. This means that they can alter SERPs, change guidelines, roll out penalties and filters, monetize whatever they like, and fail to provide adequate support to the public that makes up and interacts with the medium of their product. There is no guarantee any SEO can offer about rankings, traffic, or conversions. Things can change overnight. That’s just how it is.
  2. While Google’s monopoly enables them to be whimsical, brands and agencies do not have the same leeway if they wish to avoid negative outcomes. There are known practices which Google has confirmed as contrary to their vision of search (buying links, building listings for non-existent locations, etc.). Client and agency agree not to knowingly violate Google’s published guidelines.

Don’t accept work under any other conditions than that all parties understand Google’s power, unpredictability, and documented guidelines. Don’t work with clients, agencies, software providers, or others that violate guidelines. These basic rules set the stage for both client and agency success.

Step 2: Confirm that the problem really exists

When a business believes it is encountering an unusual local search marketing problem, the first task of the agency staffer is to vet the issue. The truth is, clients sometimes perceive problems that don’t really exist. In my case of the apartment complex, I took the following steps.

  1. I confirmed the problem. I observed that hours of operation were not displaying on GMB listings using the “apartment complex” category.
  2. I called half-a-dozen nearby apartment complex offices and asked if they were open either by appointment only, or 24/7. None of them were. At least in my corner of the world, apartment complex offices have set, daily business hours, just like retail, opening in the AM and closing in the PM each day.
  3. I did a number of Google searches for “apartment rental agency” and all of the results Google brought up were for companies that manage rentals city-wide — not rentals of units within a single complex.

So, I was now convinced that the business was right: they were encountering a real dead end. If they categorized themselves as an “apartment complex”, their missing hours could inconvenience customers. If they chose the “apartment rental agency” designation to get hours to display, they could end up fielding needless calls from people looking for city-wide rental listings. The category would also fail to be strictly accurate.

As an agency worker, be sure you’ve taken common-sense steps to confirm that a client’s problem is, indeed, real before you move on to next steps.

Step 3: Search for a similar scenario

As a considerate agency SEO, avoid wasting the time of project leads, managers, or company leadership by first seeing if the Internet holds a ready answer to your puzzle. Even if a problem seems unusual, there’s a good chance that somebody else has already encountered it, and may even have documented it. Before you declare a challenge to be a total dead-end, search the following resources in the following order:

  1. Do a direct search in Google with the most explicit language you can (e.g. “GMB listing showing wrong photo”, “GMB description for wrong business”, “GMB owner responses not showing”). Click on anything that looks like it might contain an answer, look at the date on the entry, and see what you can learn. Document what you see.
  2. Go to the Google My Business Help Community forum and search with a variety of phrases for your issue. Again, note the dates of responses for the currency of advice. Be aware that not all contributors are experts. Look for thread responses from people labeled Gold Product Expert; these members have earned special recognition for the amount and quality of what they contribute to the forum. Some of these experts are widely-recognized, world-class local SEOs. Document what you learn, even if it means noting down “No solution found”.
  3. Often, a peculiar local search issue may be the result of a Google change, update, or bug. Check MozCast to see if the SERPs are undergoing turbulent weather, and consult Sterling Sky’s Timeline of Local SEO Changes. If the dates of a surfaced issue correspond with something appearing on these platforms, you may have found your answer. Document what you learn.
  4. Check trusted blogs to see if industry experts have written about your issue. The nice thing about blogs is that, if they accept comments, you can often get a direct response from the author if something they’ve penned needs further clarification. For a big list of resources, see: Follow the Local SEO Leaders: A Guide to Our Industry’s Best Publications. Document what you learn.

If none of these tactics yields a solution, move on to the next step.

Step 4: Speak up for support

If you’ve not yet arrived at an answer, it’s time to reach out. Take these steps, in this order:

1) Each agency has a different hierarchy. Now is the time to reach out to the appropriate expert at your business, whether that’s your manager or a senior-level local search expert. Clearly explain the issue, share your documentation of what you’ve learned or failed to learn, and see if they can provide an answer.

2) If leadership doesn’t know how to solve the issue, request permission to take it directly to Google in private via one of its support channels.

In the case of the apartment complex, I chose to reach out via Twitter. Responses can take a couple of days, but I wasn’t in a hurry, and Google’s team did get back to me.

As I had suspected, Google was treating apartment complexes like hotels. Not very satisfactory, since the business models are quite different, but at least it was an answer I could document. I’d hit something of a dead end, but it was interesting to consider Google’s advice about using the description field to list hours of operation. Not a great solution, but at least I would have something to offer the client, right from the horse’s mouth.

In your case, be advised that not all Google reps have the same level of product training. Hopefully, you will receive some direct guidance if you describe the issue well; document Google’s response and act on it. If not, keep moving.

3) If Google doesn’t respond, responds inexpertly, or doesn’t solve your problem, go back to your senior-level person. Explain what happened and request advice on how to proceed.

4) If the senior staffer still isn’t certain, request permission to publicly discuss the issue (and the client). Head to supportive fora. If you’re a Moz Pro customer, feel free to post your scenario in the Moz Q&A forum. If you’re not yet a customer, head to the Local Search Forum, which is free. Share a summary of the challenge and your failure to find a solution, and ask the community what they would do, given that you appear to be at a dead end. Document the advice you receive, and evaluate it based on the expertise of respondents.

Step 5: Make a strategic decision

At this point in your workflow, you’ve now:

  • Confirmed the issue
  • Searched for documented solutions
  • Looked to leadership for support
  • Looked to Google for support
  • Looked to the local SEO industry for support

I’m hoping you’ve arrived at a strategy for your client’s scenario by now, but if not, you have three things left to do.

  1. Take your entire documentation back to your team/company leader. Ask them to work with you on an approved response to the client.
  2. Take that response to the client, with a full explanation of any limitations you encountered and a description of what actions your agency wants to take. Book time for a thorough discussion. If what you are doing is experimental, be totally transparent about this with the client.
  3. If the client agrees to the strategy, enact it.

In the case of the apartment complex, there were several options I could have brought to the client. One thing I did recommend was that they do an internal assessment of how great the risk really was of the public being inconvenienced by the absent hours.

How many people did they estimate would stop by after 5 PM in a given month and find the office closed? Would that be one person a month? Twenty? Did the convenience of these people outweigh the risks of incorrectly categorizing the complex as an “apartment rental agency”? How many erroneous phone calls or walk-ins might that lead to? How big of a pain would that be?

Determining these things would help the client decide whether to go with Google’s advice of keeping the accurate category and using the description to publish hours, or to take some risks by miscategorizing the business. I was in favor of the former, but be sure your client has input in the final decision.

And that brings us to the final step — one your agency must be sure not to overlook.

Step 6: Monitor from here on out

In many instances, you’ll find a solution that should be all set to go, with no future worries. But where you run into dead-end scenarios like the apartment complex case and have to cobble together a workaround to move forward, do these two things:

  1. Monitor outcomes of your implementation over the coming months. Traffic drops, ranking drops, or other sudden changes require a re-evaluation of the strategy you selected. This is why it is so critical to document everything and to be transparent with the client about Google’s unpredictability and the limitations of local SEOs.
  2. Monitor Google for changes. Today’s dead end could be tomorrow’s open road.
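One lightweight way to act on point 2 is to keep dated snapshots of the fields visible on a client’s listing and diff them over time. The sketch below assumes you record snapshots as simple dictionaries (by hand or from whatever listing data you have access to); the field names and values are purely illustrative.

```python
# Hedged sketch: snapshot a listing's visible fields periodically and diff the
# snapshots, so a change like hours suddenly becoming available for a category
# gets noticed. Field names and values here are illustrative, not a real API.
def diff_snapshots(old, new):
    """Return {field: (old_value, new_value)} for every field that changed."""
    changes = {}
    for field in set(old) | set(new):
        if old.get(field) != new.get(field):
            changes[field] = (old.get(field), new.get(field))
    return changes

before = {"category": "Apartment complex", "hours": None}
after = {"category": "Apartment complex", "hours": "Mon-Fri 9:00-17:00"}
print(diff_snapshots(before, after))
```

Run against last month’s snapshot, an empty diff means nothing changed; a non-empty one is your cue to re-evaluate the workaround.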

This second point is particularly applicable to the apartment complex I was advising. About a month after I’d first looked at their issue, Google made a major change: all of a sudden, they began showing hours for the “apartment complex” category!

If I’d stopped paying attention to the issue, I’d never have noticed this game-changing alteration. When I did see hours appearing on these listings, I confirmed the development with apartment marketing expert Diogo Ordacowski.

Moral: be sure you continue to keep tabs on any particularly aggravating dead ends in case solutions emerge in the future. It’s a happy day when you can tell a client their worries are over. What a great proof of the engagement level of your agency’s staff!

When it comes to Google, grit matters


“What if I do something wrong?”

It’s totally okay if that question occurs to you sometimes when marketing local businesses. There’s a lot on the line — it’s true! The livelihoods of your clients are a sacred trust. The credibility your agency is building matters.

But fear not. Unless you flagrantly break guidelines, a dose of grit can take you far when dealing with a product like Google My Business, which is, itself, an experiment. Sometimes you just have to make a decision about how to move forward. If you make a mistake, chances are good you can correct it. When a dead end with no clear egress forces you to test out solutions, you’re just doing your job.

So, be transparent and communicative, be methodical and thorough in your research, and be a bit bold. Remember, your clients don’t just count on you to churn out rote work. In Google’s increasingly walled garden, the agency that can see over the walltops when necessity calls is the one bringing extra value.



The Data You’re Using to Calculate CTR is Wrong and Here’s Why



Click-through rate (CTR) is an important metric that’s useful for making a lot of calculations about your site’s SEO performance, from estimating revenue opportunity and prioritizing keyword optimization to measuring the impact of SERP changes within the market. Most SEOs know the value of creating custom CTR curves for their sites to make those projections more accurate. The only problem with building custom CTR curves from Google Search Console (GSC) data is that GSC is known to be a flawed tool that can give out inaccurate data. This muddies the data we get from GSC and can make it difficult to accurately interpret the CTR curves we create from it. Fortunately, there are ways to help control for these inaccuracies so you get a much clearer picture of what your data says.

By carefully cleaning your data and thoughtfully implementing an analysis methodology, you can calculate CTR for your site much more accurately using 4 basic steps:

  1. Extract your site’s keyword data from GSC — the more data you can get, the better.
  2. Remove biased keywords — Branded search terms can throw off your CTR curves so they should be removed.
  3. Find the optimal impression level for your data set — Google samples data at low impression levels so it’s important to remove keywords that Google may be inaccurately reporting at these lower levels.
  4. Choose your rank position methodology — No data set is perfect, so you may want to change your rank classification methodology depending on the size of your keyword set.

Let’s take a quick step back

Before getting into the nitty gritty of calculating CTR curves, it’s useful to briefly cover the simplest way to calculate CTR since we’ll still be using this principle. 

To calculate CTR, download the keywords your site ranks for with click, impression, and position data. Then divide the sum of clicks by the sum of impressions at each rank level, and you’ll come out with a custom CTR curve. For more detail on actually crunching the numbers for CTR curves, you can check out this article by SEER if you’re not familiar with the process.
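The basic recipe above can be sketched in a few lines of Python. The keyword rows here are synthetic stand-ins for a GSC export (query, clicks, impressions, average position); positions are bucketed by rounding, matching the rounded-position approach discussed later.

```python
# Minimal sketch of the basic CTR-curve calculation: group keywords by rounded
# position, then divide total clicks by total impressions in each bucket.
# Rows are synthetic examples, not real GSC data.
from collections import defaultdict

rows = [
    # (query, clicks, impressions, avg_position)
    ("blue widgets", 120, 1000, 1.2),
    ("widget store", 30, 900, 2.7),
    ("buy widgets", 45, 800, 1.8),
    ("cheap widgets", 10, 700, 3.4),
]

def ctr_curve(rows):
    clicks = defaultdict(int)
    imps = defaultdict(int)
    for _, c, i, pos in rows:
        bucket = round(pos)          # rounded-position bucket
        clicks[bucket] += c
        imps[bucket] += i
    return {p: clicks[p] / imps[p] for p in sorted(imps)}

print(ctr_curve(rows))  # e.g. {1: 0.12, 2: 0.05625, 3: 0.025}
```

Everything that follows in this post is about which rows you should (and shouldn’t) feed into a function like this.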

Where this calculation gets tricky is when you start to try to control for the bias that inherently comes with CTR data. Even though we know GSC gives flawed data, we don’t really have many other options, so the best we can do is eliminate as much bias as possible from our data set and stay aware of the problems that come with using it.

Without controlling and manipulating the data that comes from GSC, you can get results that seem illogical. For instance, you may find your curves show position 2 and 3 CTRs having wildly larger averages than position 1. If you don’t know that the data you’re using from Search Console is flawed, you might accept that data as truth and a) try to come up with hypotheses as to why the CTR curves look that way based on incorrect data, and b) create inaccurate estimates and projections based on those CTR curves.

Step 1: Pull your data

The first part of any analysis is actually pulling the data. This data ultimately comes from GSC, but there are many platforms that you can pull this data from that are better than GSC’s web extraction.

Google Search Console — The easiest platform to get the data from is GSC itself. You can go into GSC and pull all your keyword data for the last three months, and Google will automatically download a CSV file for you. The downside to this method is that GSC only exports 1,000 keywords at a time, making your data size much too small for analysis. You can try to get around this by using the keyword filter for the head terms you rank for and downloading multiple 1k files to get more data, but this process is an arduous one. Besides, the other methods listed below are better and easier.

Google Data Studio — For any non-programmer looking for an easy way to get much more data from Search Console for free, this is definitely your best option. Google Data Studio connects directly to your GSC account data, but there are no limitations on the data size you can pull. For the same three-month period in which GSC would give me 1k keywords (its max), Data Studio gave me back 200k keywords!

Google Search Console API — This takes some programming know-how, but one of the best ways to get the data you’re looking for is to connect directly to the source using their API. You’ll have much more control over the data you’re pulling and get a fairly large data set. The main setback here is you need to have the programming knowledge or resources to do so.

Keylime SEO Toolbox — If you don’t know how to program but still want access to Google’s impression and click data, then this is a great option to consider. Keylime stores historical Search Console data directly from the Search Console API so it’s as good (if not better) of an option than directly connecting to the API. It does cost $49/mo, but that’s pretty affordable considering the value of the data you’re getting.

The platform you get your data from matters because each one listed gives out a different amount of data. I’ve listed them here in order from least to most data. Using GSC’s UI directly gives by far the least, while Keylime can connect to GSC and Google Analytics and combine data to actually give you more information than the Search Console API alone would. This is good because the more data you can get, the more likely the CTR averages you make for your site are to be accurate.

Step 2: Remove keyword bias

Once you’ve pulled the data, you have to clean it. Because this data ultimately comes from Search Console, we have to make sure we clean it as best we can.

Remove branded search & knowledge graph keywords

When you create general CTR curves for non-branded search, it’s important to remove all branded keywords from your data. Branded keywords tend to have high CTRs, which will throw off the averages of your non-branded searches, which is why they should be removed. In addition, if you’re aware of any SERP features like the knowledge graph that you rank for consistently, you should try to remove those keywords as well, since we’re only calculating CTR for positions 1–10 and SERP-feature keywords could throw off your averages.
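The branded-keyword cleanup is usually just a pattern filter. In this sketch, the brand terms (“acme” and a misspelling) are hypothetical placeholders; in practice you’d list the client’s brand names, abbreviations, and common misspellings.

```python
# Hedged sketch of the branded-keyword filter: drop any query that contains a
# brand term (case-insensitive). Brand terms here are hypothetical examples.
import re

BRAND_TERMS = ["acme", "acmee"]  # brand name + a common misspelling
branded = re.compile("|".join(re.escape(t) for t in BRAND_TERMS), re.IGNORECASE)

keywords = ["acme running shoes", "running shoes", "buy shoes online", "Acmee store hours"]
non_branded = [kw for kw in keywords if not branded.search(kw)]
print(non_branded)  # ['running shoes', 'buy shoes online']
```

The same pattern-filter approach works for stripping known SERP-feature queries, if you keep a list of them.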

Step 3: Find the optimal impression level in GSC for your data

The largest bias in Search Console data appears to come from keywords with low search impressions, which is the data we need to try and remove. It’s not surprising that Google doesn’t accurately report low-impression data, since we know that Google doesn’t even include data with very low search volume in GSC. For some reason, Google drastically over-reports CTR for these low-impression terms. As an example, here’s an impression distribution graph I made with data from GSC for keywords that have only 1 impression, showing the CTR at every position.

If that doesn’t make a lot of sense to you, I’m right there with you. This graph says that a majority of the keywords with only one impression have 100 percent CTR. It’s extremely unlikely, no matter how good your site’s CTR is, that one-impression keywords would mostly have 100 percent CTR. This is especially true for keywords that rank below #1. This gives us pretty solid evidence that low-impression data is not to be trusted, and we should limit the number of keywords in our data with low impressions.
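Mechanically, the fix is a simple threshold filter applied before the CTR calculation. The threshold and rows below are illustrative; finding the right threshold for your site is what the rest of this step is about.

```python
# Sketch of the low-impression filter: drop keywords whose impression count is
# below a chosen threshold before computing CTR averages. Rows and threshold
# are illustrative.
rows = [
    # (query, clicks, impressions, avg_position)
    ("kw a", 1, 1, 1.0),      # 1 impression, 100% CTR: almost certainly noise
    ("kw b", 2, 2, 3.0),
    ("kw c", 40, 1200, 2.0),
    ("kw d", 90, 2500, 1.0),
]

MIN_IMPRESSIONS = 100
filtered = [(kw, c, i, p) for kw, c, i, p in rows if i >= MIN_IMPRESSIONS]
print([kw for kw, *_ in filtered])  # ['kw c', 'kw d']
```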

Step 3 a): Use normal curves to help calculate CTR

For more evidence of Google giving us biased data, we can look at the distribution of CTR for all the keywords in our data set. Since we’re calculating CTR averages, the data should adhere to a normal bell curve. In most cases, CTR curves from GSC are highly skewed to the left with long tails, which again indicates that Google reports very high CTR at low impression volumes.

If we raise the minimum number of impressions for the keyword sets we’re analyzing, we end up getting closer and closer to the center of the graph. Here’s an example: below is the distribution of a site’s CTR in increments of .001.

The graph above shows the data at a very low impression level, around 25 impressions. The distribution is mostly on the right side of the graph, with a small, high concentration on the left, which implies that this site has a very high click-through rate. However, by increasing the impression filter to 5,000 impressions per keyword, the distribution of keywords gets much, much closer to the center.

This graph most likely would never be centered around 50% CTR, because that would be a very high average CTR to have, so the graph should be skewed to the left. The main issue is we don’t know by how much, because Google gives us sampled data; the best we can do is guess. But this raises the question: what’s the right impression level to filter my keywords by to get rid of faulty data?

One way to find the right impression level for creating CTR curves is to use the above method to get a feel for when your CTR distribution is getting close to a normal distribution. A normally distributed set of CTR data has fewer outliers and is less likely to include a high number of misreported data points from Google.
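One way to make the “is this close to normal?” check quantitative is to compute the sample skewness of the CTR distribution at each impression threshold and raise the threshold until the skew settles near zero. The sketch below uses only the standard library and synthetic CTR values; how close to zero is “close enough” remains a judgment call for your site.

```python
# Hedged sketch: compare the skewness of CTR distributions at a low vs. a high
# impression threshold. The CTR values are synthetic illustrations.
import statistics

def skewness(values):
    # Fisher-Pearson coefficient of skewness (biased/population form)
    n = len(values)
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return sum((v - mean) ** 3 for v in values) / (n * sd ** 3)

# Low threshold: a clump of implausible ~100% CTRs distorts the distribution.
low_threshold_ctrs = [1.0, 1.0, 0.9, 0.05, 0.06, 0.04, 0.05, 0.06]
# Higher threshold: the implausible values are filtered out.
high_threshold_ctrs = [0.04, 0.05, 0.06, 0.05, 0.04, 0.06, 0.05, 0.05]

print(skewness(low_threshold_ctrs), skewness(high_threshold_ctrs))
```

As the threshold rises, the skewness should shrink toward zero, which is exactly the visual “moving toward the center” effect described above.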

Step 3 b): Finding the best impression level to calculate CTR for your site

You can also create impression tiers to see where there’s less variability in the data you’re analyzing instead of Normal Curves. The less variability in your estimates, the closer you’re getting to an accurate CTR curve.

Tiered CTR tables

Creating tiered CTR tables needs to be done for every site, because GSC’s sampling differs for every site depending on the keywords you rank for. I’ve seen CTR curves vary by as much as 30 percent without the proper controls added to CTR estimates. This step is important because using all of the data points in your CTR calculation can wildly offset your results, while using too few data points gives you too small a sample size to get an accurate idea of what your CTR actually is. The key is to find the happy medium between the two.

In the tiered table above, there’s huge variability from All Impressions to >250 impressions. After that point, though, the change per tier is fairly small. Greater than 750 impressions is the right level for this site, because the variability among curves is fairly small as we increase impression levels in the other tiers, and >750 impressions still gives us plenty of keywords at each ranking level of our data set.

When creating tiered CTR curves, it’s important to also count how much data is used to build each data point throughout the tiers. For smaller sites, you may find that you don’t have enough data to reliably calculate CTR curves, but that won’t be apparent from just looking at your tiered curves. So knowing the size of your data at each stage is important when deciding what impression level is the most accurate for your site.
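A tiered table like the one described can be generated by computing, for each threshold, both the CTR estimate and the number of keywords that survive the filter. The thresholds mirror the tiers mentioned above; the rows are synthetic, and this sketch only summarizes position 1 to keep it short.

```python
# Hedged sketch of a tiered CTR table for position 1: for each impression
# threshold, report the surviving keyword count and the pooled CTR, so you can
# see where the estimate stabilizes and how much data each tier keeps.
def tier_summary(rows, thresholds):
    summary = {}
    for t in thresholds:
        tier = [(c, i) for _, c, i, pos in rows if i >= t and round(pos) == 1]
        clicks = sum(c for c, _ in tier)
        imps = sum(i for _, i in tier)
        summary[t] = {
            "keywords": len(tier),
            "ctr": clicks / imps if imps else None,
        }
    return summary

rows = [
    # (query, clicks, impressions, avg_position) -- synthetic
    ("kw1", 3, 3, 1.0), ("kw2", 50, 400, 1.2),
    ("kw3", 200, 1500, 0.9), ("kw4", 700, 6000, 1.1),
]
print(tier_summary(rows, [0, 250, 750]))
```

Reading across the output, you’d look for the lowest threshold at which the CTR stops moving much while the keyword count stays comfortably large.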

Step 4: Decide which position methodology to use for your data

Once you’ve figured out the correct impression level to filter your data by, you can start actually calculating CTR curves using impression, click, and position data. The problem with position data is that it’s often inaccurate, so if you have great keyword tracking, it’s far better to use your own tracking numbers than Google’s. Most people can’t track that many keyword positions, though, so it’s necessary to use Google’s position data. That’s certainly possible, but it’s important to be careful with how we use it.

How to use GSC position

One question that may come up when calculating CTR curves from GSC average positions is whether to use rounded positions or exact positions (i.e., only average positions from GSC that are whole numbers — ranks of exactly 1.0 or 2.0 are exact positions, while 1.3 or 2.1, for example, are not).

Exact position vs. rounded position

The reasoning behind using exact position is that we want data that’s most likely to have been ranking in position 1 for the time period we’re measuring. Exact position gives us the best idea of what CTR is at position 1, because exact-rank keywords are more likely to have held that position for the duration of the period you pulled keywords from. The problem is that Average Rank is an average, so there’s no way to know whether a keyword ranked solidly in one place for the full period or the average just happens to land on an exact rank.

Fortunately, if we compare exact position CTR vs rounded position CTR, they’re directionally similar in terms of actual CTR estimations with enough data. The problem is that exact position can be volatile when you don’t have enough data. By using rounded positions we get much more data, so it makes sense to use rounded position when not enough data is available for exact position.

The one caveat is for position 1 CTR estimates. For every other position, averaging can pull a keyword’s reported rank both up and down: a keyword with an average ranking of 3 could have ranked at #1 and #5 at different points and averaged out to 3. For #1 ranks, however, the average can only be pulled down, which means that the CTR for a rounded-position-1 keyword will always be reported lower than reality.

A rank position hybrid: Adjusted exact position

So, if you have enough data, only use exact position for position 1. For smaller sites, you can use adjusted exact position. Since Google gives averages up to two decimal points, one way to get more “exact position” #1s is to include all keywords with an average position below 1.1. I find this gets a couple hundred extra keywords, which makes my data more reliable.
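The adjusted-exact-position idea reduces to widening the position-1 filter from `== 1.0` to `< 1.1`. The rows below are synthetic; the point is just to show how the wider filter admits more keywords into the position-1 CTR estimate.

```python
# Hedged sketch of "adjusted exact position" for position 1: instead of
# requiring an average position of exactly 1.0, keep everything below 1.1.
# Rows are synthetic (query, clicks, impressions, avg_position).
rows = [
    ("kw1", 120, 1000, 1.0),
    ("kw2", 80, 700, 1.05),
    ("kw3", 60, 600, 1.3),
    ("kw4", 40, 500, 2.0),
]

exact_p1 = [r for r in rows if r[3] == 1.0]
adjusted_p1 = [r for r in rows if r[3] < 1.1]

def pooled_ctr(rows):
    return sum(c for _, c, _, _ in rows) / sum(i for _, _, i, _ in rows)

# The adjusted filter keeps twice as many keywords in this toy example.
print(len(exact_p1), len(adjusted_p1), round(pooled_ctr(adjusted_p1), 4))
```

The 1.1 cutoff isn’t magic; it’s just tight enough that the extra keywords almost certainly spent the period at or very near #1.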

And this also shouldn’t pull down our average much at all, since GSC is somewhat inaccurate in how it reports Average Ranking. At Wayfair, we use STAT as our keyword rank tracking tool, and after comparing GSC average rankings with average rankings from STAT, I found the rankings near the #1 position are close, but not 100 percent accurate. Once you start going farther down in the rankings, the difference between STAT and GSC becomes larger, so watch how far down the rankings you go to include more keywords in your data set.

I’ve done this analysis for all the rankings tracked on Wayfair, and I found that the lower the position, the less closely rankings matched between the two tools. So Google isn’t giving great rankings data, but it’s close enough near the #1 position that I’m comfortable using adjusted exact position to increase my data set, within reason, without worrying about sacrificing data quality.


GSC is an imperfect tool, but it gives SEOs the best information we have to understand an individual site’s click performance in the SERPs. Since we know that GSC is going to throw us a few curveballs with the data it provides, it’s important to control as many pieces of that data as possible. The main ways to do so are to choose your ideal data extraction source, get rid of low-impression keywords, and use the right rank-rounding methods. If you do all of these things, you’re much more likely to get accurate, consistent CTR curves for your own site.



How Does the Local Algorithm Work? – Whiteboard Friday



When it comes to Google’s algorithms, there’s quite a difference between how they treat local and organic. Get the scoop on which factors drive the local algorithm and how it works from local SEO extraordinaire, Joy Hawkins, as she offers a taste of her full talk from MozCon 2019.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hello, Moz fans. I’m Joy Hawkins. I run a local SEO agency from Toronto, Canada, and a search forum known as the Local Search Forum, which basically is devoted to anything related to local SEO or local search. Today I’m going to be talking to you about Google’s local algorithm and the three main factors that drive it. 

If you’re wondering what I’m talking about when I say the local algorithm, this is the algorithm that fuels what we call the three-pack here. When you do a local search, or a search that Google thinks has local intent, like “plumbers” let’s say, you traditionally will get three results at the top with the map, and then everything below it I refer to as organic. This algorithm I’ll be breaking down is what fuels this three-pack, also known as Google My Business listings or Google Maps listings.

They’re all talking about the exact same thing. If you search Google’s Help Center for what they look at when ranking these entities, they tell you that there are three main things fueling this algorithm: proximity, prominence, and relevance. I’m going to break down each one and explain how the factors work.

1. Proximity

I’ll kind of start here with proximity. Proximity is basically defined as your location when you are searching on your phone or your computer and you type something in. It’s where Google thinks you are located. If you’re not really sure, often you can scroll down to the bottom of your page, and at the bottom of your page it will often list a zip code that Google thinks you’re in.

Zip code (desktop)

The other way to tell is that if you’re on a phone, you can sometimes see a little blue dot on the map, which is exactly where Google thinks you’re located. On a high level, we often assume that Google thinks we’re located in a city, but this is actually pretty false; there’s been a lot of talk at MozCon about how Google pretty much always knows your location more precisely than that.

Generally speaking, if you’re on a computer, they know what zip code you’re in, and they’ll list that at the bottom. There are a variety of tools that can help you check ranking based on zip codes, some of which would be Moz Check Your Presence Tool, BrightLocal, Whitespark, or Places Scout. All of these tools have the ability to track at the zip code level. 

Geo coordinates (mobile)

However, when you’re on a phone, Google usually knows your location in even more detail: it generally knows the geo coordinates of your actual location, and it pinpoints this using that little blue dot.

It knows more than just your zip code; it knows where you’re actually located. It’s a bit creepy. But there are a couple of tools that will let you see results based on geo coordinates, which is really cool and very accurate. Those tools include Local Falcon, and there’s a 100% free Chrome extension you can put in your browser called GS Location Changer.
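As a toy illustration only (Google doesn’t publish its implementation), you can think of proximity as the great-circle distance between the searcher’s geo coordinates and the business location. The coordinates below are invented:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical searcher (the blue dot) and two dental offices in Toronto
searcher = (43.6532, -79.3832)
office_a = (43.6510, -79.3470)  # a few km away
office_b = (43.7615, -79.4111)  # roughly four times farther

print(haversine_km(*searcher, *office_a) < haversine_km(*searcher, *office_b))  # True
```

All else being equal, the closer office wins on the proximity factor, which is why two scans a few blocks apart can return different three-packs.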

I use this all the time in an incognito browser if I want to see what search results look like from a very, very specific location. Depending on what industry you work in, it’s really important to know which of these two levels you need to be looking at. If you work with lawyers, for example, zip code level is usually good enough.

There aren’t enough lawyers to make a huge difference between specific points inside a given zip code. However, if you work with dentists or restaurants, let’s say, you really need to be looking at the geo coordinate level. We have seen lots of cases where we scan a specific keyword using these two tools and, depending on where in that zip code we are, see completely different three-packs.

It’s very, very key to know that this proximity factor really influences the results that you see. This can be challenging, because when you’re trying to explain it to clients or business owners, they search from their home and they’re like, “Why am I not there?” It’s because their location is different from where their office is located.

I realize this is a challenging problem to solve for a lot of agencies on how to represent this, but that’s kind of the tools that you need to look at and use. 

2. Prominence

Moving to the next factor: prominence. This is basically how important Google thinks you are. Is this business a big deal, or is it some random, crappy business or a new business that we don’t know much about?

  • This looks at things like links, for example. 
  • Store visits: if you are a brick-and-mortar business and you get no foot traffic, Google likely won’t think you’re very prominent. 
  • Reviews: the number of reviews often factors in here. We often see that businesses with a lot of reviews, including a lot of old reviews, generally have a lot of prominence.
  • Citations: the number of citations you have can also factor into prominence. 

3. Relevance

Moving into the relevance factor, relevance is basically, does Google think you are related to the query that is typed in? You can be as prominent as anyone else, but if you do not have content on your page that is structured well, that covers the topic the user is searching about, your relevance will be very low, and you will run into issues.

It’s very important to know that these three things all kind of work together, and it’s really important to make sure you are looking at all three. On the relevance end, it looks at things like:

  • content
  • onsite SEO, so your title tags, your meta tags, all that nice SEO stuff
  • Citations also factor in here, because it looks at things like your address. Like are you actually in this city? Are you relevant to the city that the user is trying to get locations from? 
  • Categories are huge here: your Google My Business categories. Google currently has just under 4,000 different Google My Business categories, and they add an insane number every year and remove others. It’s very important to keep on top of that and make sure that you have the correct categories on your listing, or you won’t rank well.
  • The business name is unfortunately a huge factor as well in here. Merely having keywords in your business name can often give you relevance to rank. It shouldn’t, but it does. 
  • Then review content. I know Mike Blumenthal did a really cool experiment on this a couple years ago, where he actually had a bunch of people write a bunch of fake reviews on Yelp mentioning certain terms to see if it would influence ranking on Google in the local results, and it did. Google is definitely looking at the content inside the reviews to see what words people are using so they can see how that impacts relevance. 
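To make the “these three work together” point concrete, here is a purely hypothetical toy model, not Google’s actual algorithm — the blend and the weights are invented — in which a listing’s score combines all three factors and zero relevance zeroes everything out:

```python
def toy_local_score(proximity, prominence, relevance,
                    weights=(0.4, 0.3, 0.3)):
    """Hypothetical blend of the three factors, each scored 0-1.

    Multiplying the weighted sum by relevance mimics the idea that
    a prominent, nearby business still won't rank for a query it
    isn't relevant to. The weights are invented for illustration.
    """
    wp, wm, wr = weights
    base = wp * proximity + wm * prominence + wr * relevance
    return base * relevance

# A prominent, nearby business with zero relevance scores nothing
print(toy_local_score(1.0, 1.0, 0.0))               # 0.0
# A moderately relevant, nearby competitor outscores it
print(round(toy_local_score(1.0, 0.4, 0.8), 3))     # 0.608
```

The exact math is made up; the takeaway is simply that optimizing one factor while ignoring another leaves rankings on the table.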

How to rank without proximity, prominence, or relevance

Obviously you want all three of these things, but it is possible to rank when you don’t have all three, and I’ll give a couple of examples. Say you’re looking to expand your radius because you service a lot of people.

You don’t just service people on your block. You’re like, “I serve the whole city of Chicago,” for example. You are not likely to rank in all of Chicago for very common terms, things like dentist or personal injury attorney. However, if you have a lot of prominence and you have really relevant pages or content for niche terms, we often see that it is possible to really expand your radius for long-tail keywords, which is great.

Prominence is probably the number one thing that will expand your radius inside competitive terms. We’ll often see Google bringing in a business that is slightly outside of the same area as other businesses, just because they have an astronomical number of reviews, or maybe their domain authority is ridiculously high and they have all these linking domains.

Those two factors are definitely what influences the amount of area you cover with your local exposure. 

Spam and fake listings

On the flip side, spam is something I talk a lot about. Fake listings are a big problem in the local search space. Lead gen providers create these fake listings, and they rank with zero prominence.

They have no prominence. They have no citations. They have no authority. They often don’t even have websites, and they still rank because of the other two factors. If you create 100 listings in a city, you are going to be close to someone searching. Then if you stuff a bunch of keywords into your business name, you will have some relevance, and by sidestepping the prominence factor entirely, they are able to get these listings to rank, which is very frustrating.

Obviously, Google is kind of trying to evolve this algorithm over time. We are hoping that maybe the prominence factor will increase over time to kind of eliminate that problem, but ultimately we’ll have to see what Google does. We also did a study recently to test to see which of these two factors kind of carries more weight.

An experiment: Linking to your site within GMB

One thing I’ve highlighted here is that when you link to a website inside your Google My Business listing, there’s often a debate: should I link to my homepage, or should I link to my location page if I’ve got three or four or five offices? We did an experiment to see what happens when we switch a client’s Google My Business listing from their location page to their homepage, and we’ve almost always seen a positive impact from switching to the homepage, even when that homepage is not relevant at all.

In one example, we had a client that was in Houston, and they opened up a location in Dallas. Their homepage was optimized for Houston, but their location page was optimized for Dallas. I had a conversation with a couple of other SEOs, and they were like, “Oh, well, obviously link to the Dallas page on the Dallas listing. That makes perfect sense.”

But we were wondering what would happen if we linked to the homepage, which was optimized for Houston. We saw a lift in rankings and in the number of search queries this business showed up for when we switched to the homepage, even though the homepage didn’t really mention Dallas at all. Something to think about. Make sure you’re always testing these different factors and chasing the right ones when you’re coming up with your local SEO strategy. Finally, something I’ll mention at the top here.

Local algorithm vs organic algorithm

As far as the local algorithm versus the organic algorithm goes, some of you might be thinking, okay, these things really look at the same factors; they really kind of, sort of work the same way. Honestly, if that is your thinking, I would really strongly recommend you change it. To quote a recent Moz whitepaper: only 8% of local pack listings had their website also appearing in the organic search results below.

I feel like the overlap between these two is definitely shrinking, which is kind of why I’m a bit obsessed with figuring out how the local algorithm works to make sure that we can have clients successful in both spaces. Hopefully you learned something. If you have any questions, please hit me up in the comments. Thanks for listening.

Video transcription by

If you liked this episode of Whiteboard Friday, you’ll love all the SEO thought leadership goodness you’ll get from our newly released MozCon 2019 video bundle. Catch Joy’s full talk on the differences between the local and organic algorithm, plus 26 additional future-focused topics from our top-notch speakers:

Grab the sessions now!

We suggest scheduling a good old-fashioned knowledge share with your colleagues to educate the whole team — after all, who didn’t love movie day in school? 😉


Source link

Amazon vs. Google: Decoding the World’s Largest E-commerce Search Engine



A lot of people forget that Amazon is a search engine, let alone the largest search engine for e-commerce. With 54 percent of product searches now taking place on Amazon, it’s time to take it seriously as the world’s largest search engine for e-commerce. In fact, if we count YouTube as part of Google, Amazon is technically the second largest search engine in the world.

As real estate on Google becomes increasingly difficult to maintain, moving beyond a website-centric e-commerce strategy is a no brainer. With 54% of shoppers choosing to shop on e-commerce marketplaces, it’s no surprise that online marketplaces are the number one most important digital marketing channel in the US, according to a 2018 study by the Digital Marketing Institute. While marketplaces like Etsy and Walmart are growing fast, Amazon maintains its dominance of e-commerce market share owning 47 percent of online sales, and 5 percent of all retail sales in the US.

Considering that there are currently over 500 million products listed on Amazon, and that more than two-thirds of clicks happen on the first page of Amazon’s search results, selling products on Amazon is no longer as easy as “set it and forget it.” 

Enter the power of SEO.

When we think of SEO, many of us are aware of the basics of how Google’s algorithm works, but not many of us are up to speed with SEO on Amazon. Before we delve into Amazon’s algorithm, it’s important to note how Google and Amazon’s starkly different business models are key to what drives their algorithms and ultimately how we approach SEO on the two platforms.

The academic vs. The stockbroker

Google was born in 1998 through a Ph.D. project by Lawrence Page and Sergey Brin. It was the first search engine of its kind designed to crawl and index the web more efficiently than any existing systems at the time.

Google was built on a foundation of scientific research and academia, with a mission to:

“Organize the world’s information and make it universally accessible and useful” — Google

Now, answering 5.6 billion queries every day, Google’s mission is becoming increasingly difficult, which is why its algorithm is the most complex of any search engine in the world, continuously refined through hundreds of updates every year.

In contrast to Brin and Page, Jeff Bezos began his career on Wall Street in a series of jobs before starting Amazon in 1994 after reading that the web was growing at 2,300 percent. Determined to take advantage of this, he made a list of the top products most likely to sell online and settled with books because of their low cost and high demand. Amazon was built on a revenue model, with a mission to:

“Be the Earth’s most customer-centric company, where customers can find and discover anything they might want to buy online, and endeavors to offer its customers the lowest possible prices.” — Amazon

Amazon doesn’t have searcher intent issues

When it comes to SEO, the contrasting business models of these two companies lead the search engines to ask very different questions in order to deliver the right results to the user.

On one hand, we have Google who asks the question:

“What results most accurately answer the searcher’s query?”

Amazon, on the other hand, wants to know:

“What product is the searcher most likely to buy?”

On Amazon, people aren’t asking questions, they’re searching for products—and what’s more, they’re ready to buy. So, while Google is busy honing an algorithm that aims to understand the nuances of human language, Amazon’s search engine serves one purpose—to understand searches just enough to rank products based on their propensity to sell.

With this in mind, working to increase organic rankings on Amazon becomes a lot less daunting.

Amazon’s A9 algorithm: The secret ingredient

Amazon may dominate e-commerce search, but many people haven’t heard of the A9 algorithm. That might seem unusual, but the reason Amazon isn’t keen on promoting its algorithm as that of a large-scale search engine is simply that Amazon isn’t in the business of search.

Amazon’s business model is a well-oiled revenue-driving machine — designed first and foremost to sell as many products as possible through its online platform. While Amazon’s advertising platform is growing rapidly, and AWS continues as their fastest-growing revenue source — Amazon still makes a large portion of revenue through goods sold through the marketplace.

With this in mind, the secret ingredient behind Amazon’s A9 algorithm is, in fact: Sales Velocity

What is sales velocity, you ask? It’s essentially the speed and volume at which your products sell on Amazon’s marketplace.

There are lots of factors that Amazon SEOs refer to as “direct” and “indirect” ranking factors, but ultimately every single one of them ties back to sales velocity in some way.

At Wolfgang Digital, we approach SEO on Google based on three core pillars — Technology, Relevance, and Authority.

Evidently, Google’s ranking pillars are all based on optimizing a website in order to drive click through on the SERP.

On the other hand, Amazon’s core ranking pillars are tied back to driving revenue through sales velocity — Conversion Rate, Keyword Relevance and of course, Customer Satisfaction.

Without further ado, let’s take a look at the key factors behind each of these pillars, and what you can optimize to increase your chances of ranking on Amazon’s coveted first page.

Conversion rate

Conversion rates on Amazon have a direct impact on where your product will rank because this tells Amazon’s algorithm which products are most likely to sell like hotcakes once they hit the first page.

Of all variables to monitor as an Amazon marketer, working to increase conversion rates is your golden ticket to higher organic rankings.

Optimize pricing

Amazon’s algorithm is designed to predict which products are most likely to convert. This is why the price has such a huge impact on where your products rank in search results. If you add a new product to Amazon at a cheaper price than the average competitor, your product is inclined to soar to the top-ranking results, at least until it gathers enough sales history to determine the actual sales performance.

Even if you’re confident that you have a supplier advantage, it’s worth checking your top-selling products and optimizing pricing where possible. If you have a lot of products, repricing software is a great way to automate pricing adjustments based on the competition while still maintaining your margins.
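The core logic that repricing software automates can be sketched roughly like this; the `reprice` rule, the numbers, and the margin floor are all invented for illustration:

```python
def reprice(cost, min_margin, competitor_prices, undercut=0.01):
    """Price just below the cheapest competitor, respecting a margin floor.

    cost: unit cost; min_margin: minimum acceptable margin (0.20 = 20%);
    competitor_prices: current competitor prices for the same product.
    """
    floor = cost * (1 + min_margin)           # never sell below this
    target = min(competitor_prices) - undercut
    return round(max(target, floor), 2)

# Competitors at 12.99 / 13.49; our cost is 8.00 with a 20% margin floor
print(reprice(8.00, 0.20, [12.99, 13.49]))  # 12.98
# Competitors race to the bottom; the margin floor holds at 9.6
print(reprice(8.00, 0.20, [9.10, 9.25]))    # 9.6
```

Real repricers add plenty of nuance (Buy Box status, shipping costs, velocity targets), but the undercut-with-a-floor pattern is the gist.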

However, Amazon knows that price isn’t the only factor that drives sales, which is why Amazon’s first page isn’t simply an ordered list of items priced low to high. See the below Amazon UK search results for “lavender essential oil:”

Excluding the sponsored ads, we can see that the cheap products aren’t all ranked high with the more expensive ones lower down the page. So, if you’ve always maintained the idea that selling on Amazon is a race to the bottom on price, read on, my friends.

Create listings that sell

As we discussed earlier, Amazon is no longer a “set it and forget it” platform, which is why you should treat each of your product listings as you would a product page on your website. Creating listings that convert takes time, which is why not many sellers do it well, so it’s an essential tactic for stealing conversions from the competition.


Titles

Make your titles user-friendly, include the most important keywords at the front, and provide just enough information to entice clicks. Gone are the days of keyword stuffing titles on Amazon; in fact, it may even hinder your rankings by reducing clicks and therefore conversions.

Bullet points

These are the first thing your customer sees, so make sure to highlight the best features of your product using a succinct sentence in language designed to convert.

Improve the power of your bullet points by including information that your top competitors don’t provide. A great way to do this is to analyze the “answered questions” for some of your top competitors.

Do you see any trending questions that you could answer in your bullet points to help shorten the buyer journey and drive conversions to your product?

Product descriptions

Given that over 50 percent of Amazon shoppers said they always read the full description when they are considering purchasing a product, a well-written product description can have a huge impact on conversions.

Your description is likely to be the last thing a customer will read before they choose to buy your product over a competitor, so give these your time and care, reiterating points made in your bullet points and highlighting any other key features or benefits likely to push conversions over the line.

Taking advantage of A+ content for some of your best selling products is a great way to craft a visually engaging description, like this example from Safavieh.

Of course, A+ content requires additional design costs which may not be feasible for everyone. If you opt for text-only descriptions, make sure your content is easy to read while still highlighting the best features of your product.

For an in-depth breakdown on creating a beautifully crafted Amazon listing, I highly recommend this post from Startup Bros.

AB test images

Images are incredibly powerful when it comes to increasing conversions, so if you haven’t tried split testing different image versions on Amazon, you could be pleasantly surprised. One of the most popular tools for Amazon AB testing is Splitly — it’s really simple to use, and affordable with plans starting at $47 per month.

Depending on your product type, it may be worth investing the time into taking your own pictures rather than using the generic supplier provided images. Images that tend to have the biggest impact on conversions are the feature images (the one you see in search results) and close up images, so try testing a few different versions to see which has the biggest impact.
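If you want to sanity-check a split test yourself, the math behind it is a standard two-proportion z-test. A minimal sketch, with made-up session and conversion counts:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: original image converts 80/1000, new image 110/1000
z = two_proportion_z(80, 1000, 110, 1000)
print(abs(z) > 1.96)  # True: significant at roughly the 95% level
```

Tools like Splitly do this bookkeeping for you, but knowing the underlying test helps you avoid calling a winner on too little traffic.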

Amazon sponsored ads

The best thing about Amazon SEO is that your performance on other marketing channels can help support your organic performance.

Unlike on Google, where advertising has no impact on organic rankings, if your product performs well on Amazon ads, it may help boost organic rankings. This is because if a product is selling through ads, Amazon’s algorithm may see this as a product that users should also see organically.

A well-executed ad campaign is particularly important for new products, in order to boost their sales velocity in the beginning and build up the sales history needed to rank better organically.

External traffic

External traffic involves driving traffic from social media, email, or other sources to your Amazon products.

While external sources of traffic are a great way to gain more brand exposure and increase customer reach, a well-executed external traffic strategy also impacts your organic rankings because of its role in increasing sales and driving up conversion rates.

Before you start driving traffic straight to your Amazon listing, you may want to consider using a landing page tool like Landing Cube in order to protect your conversion rate as much as possible.

With a landing page tool, you drive traffic to a landing page where customers get a special offer code to use on your product listing page—this way, you only drive traffic which is guaranteed to convert.

Keyword relevance

A9 still relies heavily on keyword matching to determine the relevance of a product to a searcher’s query, which is why this is a core pillar of Amazon SEO.

While your title, bullet points, and descriptions are essential for converting customers, if you don’t include the relevant keywords, your chances of driving traffic to convert are slim to none.

Every single keyword incorporated in your Amazon listing will impact your rankings, so it’s important to deploy a strategic approach.

Steps for targeting the right keywords on Amazon:

  1. Brainstorm as many search terms as you think someone would use to find your product.
  2. Analyze 3–5 competitors with the most reviews to identify their target keywords.
  3. Validate the top keywords for your product using an Amazon keyword tool such as Magnet, Ahrefs, or
  4. Download the keyword lists into Excel, and filter out any duplicate or irrelevant keywords. 
  5. Prioritize search terms with the highest search volume, bearing in mind that broad terms will be harder to rank for. Depending on the competition, it may make more sense to focus on lower volume terms with lower competition—but this can always be tested later on.
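Steps 4 and 5 can be done in a few lines of plain Python instead of Excel. A rough sketch, with the keyword and volume data invented:

```python
# Hypothetical exported rows: (keyword, monthly search volume)
raw = [
    ("lavender essential oil", 40500),
    ("lavender oil", 33100),
    ("Lavender Essential Oil", 40500),   # duplicate, different casing
    ("essential oil diffuser", 27100),   # irrelevant to this product
]

IRRELEVANT = {"essential oil diffuser"}

seen, keywords = set(), []
for kw, vol in raw:
    key = kw.lower().strip()
    if key in seen or key in IRRELEVANT:
        continue                  # drop duplicates and irrelevant terms
    seen.add(key)
    keywords.append((key, vol))

# Prioritize by search volume, highest first
keywords.sort(key=lambda item: item[1], reverse=True)
print(keywords[0][0])  # lavender essential oil
```

The irrelevant-terms list is judgment-driven either way; the script just takes the tedium out of deduping and sorting.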

Once you have refined the keywords you want to rank for, here are some things to remember:

  • Include your most important keywords at the start of the title, after your brand name.
  • Use long-tail terms and synonyms throughout your bullets points and descriptions.
  • Use your backend search terms wisely — these are a great place for including some common misspellings, different measurement versions e.g. metric or imperial, color shades and descriptive terms.
  • Most importantly, don’t repeat keywords. If you’ve included a search term once in your listing (e.g., in the title), you don’t need to include it in your backend search terms. Repeating a keyword, or keyword stuffing, will not improve your rankings.
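A small script can catch accidental repeats between your visible listing and your backend search terms before you upload them. A sketch with an invented listing:

```python
def repeated_terms(listing_text, backend_terms):
    """Return backend search terms that already appear in the listing."""
    listing_words = set(listing_text.lower().split())
    return [t for t in backend_terms if t.lower() in listing_words]

title = "Pure Lavender Essential Oil 100ml for Diffusers"
backend = ["lavander", "aromatherapy", "lavender", "100 ml"]

print(repeated_terms(title, backend))  # ['lavender']
```

Here "lavender" is flagged as a wasted backend slot, while the misspelling "lavander" and the synonym "aromatherapy" are exactly the kind of terms the backend field is for.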

Customer satisfaction

Account health

Part of Amazon’s mission statement is “to be the Earth’s most customer-centric company.” This relentless focus on the customer is what drives Amazon’s astounding customer retention, with 85 percent of Prime shoppers visiting the marketplace at least once a week and 56 percent of non-Prime members reporting the same. A focus on the customer is at the core of Amazon’s success, which is why stringent customer satisfaction metrics are a key component of selling on Amazon.

Your account health metrics are the bread and butter of your success as an Amazon seller, which is why they’re part of Amazon’s core ranking algorithm. Customer experience is so important to Amazon that, if you fail to meet the minimum performance requirements, you risk getting suspended as a seller—and they take no prisoners.

On the other hand, if you are meeting your minimum requirements but other sellers are performing better than you by exceeding theirs, they could be at a ranking advantage. 

Customer reviews

Customer reviews are one of the most important Amazon ranking factors — not only do they tell Amazon how customers feel about your product, but they are one of the most impactful conversion factors in e-commerce. Almost 95 percent of online shoppers read reviews before buying a product, and over 60 percent of Amazon customers say they wouldn’t purchase a product with less than 4.5 stars.

On Amazon, reviews help to drive both conversion rate and keyword relevance, particularly for long-tail terms. In short, they’re very important.

Increasing reviews for your key products on Amazon was historically a lot easier, through acquiring incentivized reviews. However, in 2018, Amazon banned sellers from incentivizing reviews which makes it even more difficult to actively build reviews, especially for new products.

Tips for building positive reviews on Amazon:

  • Maintain consistent communication throughout the purchase process using Amazon email marketing software. Following up to thank customers for their order and notifying them when the order is fulfilled creates a seamless buying experience that leaves them more likely to give a positive review.
  • Adding branded package inserts to thank customers for their purchase makes the buying experience personal, differentiating you as a brand rather than a nameless Amazon seller. Including a friendly reminder to leave a review in a nice delivery note will have better response rates than the generic email they receive from Amazon.
  • Providing upfront returns information without a customer having to ask for it shows customers you are confident in the quality of your product. If a customer isn’t happy with your product, adding fuel to the fire with a clunky or difficult returns process is more likely to result in negative reviews through sheer frustration.
  • Follow up with helpful content related to your products such as instructions, decor inspiration, or recipe ideas, including a polite reminder to provide a review in exchange.
  • And of course, deliver an amazing customer experience from start to finish.

Key takeaways for improving Amazon SEO

As a marketer well versed in the world of Google, venturing onto Amazon can seem like a culture shock — but mastering the basic principles of Amazon SEO could be the difference between getting lost in a sea of competitors and driving a successful Amazon business.

  • Focus on driving sales velocity through increasing conversion rate, improving keyword relevance, nailing customer satisfaction and actively building reviews.
  • Craft product listings for customers first, search engines second.
  • Don’t neglect product descriptions in the belief that no one reads them—over 50% of Amazon shoppers report reading the full description before buying a product.
  • Keywords carry a lot of weight. If you don’t include a keyword in your listing, your chances of ranking for it are slim.
  • Images are powerful. Take your own photos instead of using generic supplier images and be sure to test, test, and test.
  • Actively build positive reviews by delivering an amazing customer experience.
  • Invest in PPC and driving external traffic to support organic performance, especially for new products.

What other SEO tips or tactics do you apply on Amazon? Tell me in the comments below!



Lead Volume vs. Lead Quality, by Ruth Burr Reedy



Ruth Burr Reedy is an SEO and online marketing consultant and speaker and the Vice President of Strategy at UpBuild, a technical marketing agency specializing in SEO, web analytics, and conversion rate optimization. This is the first post in a recurring monthly series and we’re excited! 

When you’re onboarding a new SEO client who works with a lead generation model, what do you do?

Among the many discovery questions you ask as you try to better understand your client’s business, you probably ask them, “What makes a lead a good lead?” That is, what are the qualities that make a potential customer more likely to convert to sale?

A business that’s given some thought to their ideal customer might send over some audience personas; they might talk about their target audience in more general terms. A product or service offering might be a better fit for companies of a certain size or budget, or be at a price point that requires someone at a senior level (such as a Director, VP, or C-level employee) to sign off, and your client will likely pass that information on to you if they know it. However, it’s not uncommon for these sorts of onboarding conversations to end with the client assuring you: “Just get us the leads. We’ll make the sales.”

Since SEO agencies often don’t have access to our clients’ CRM systems, we’re often using conversion to lead as a core KPI when measuring the success of our campaigns. We know enough to know that it’s not enough to drive traffic to a site; that traffic has to convert to become valuable. Armed with our clients’ assurances that what they really need is more leads, we dive into understanding the types of problems that our client’s product is designed to solve, the types of people who might have those problems, and the types of resources they might search for as they tend to solve those problems. Pretty soon, we’ve fixed the technical problems on our client’s site, helped them create and promote robust resources around their customers’ problems, and are watching the traffic and conversions pour in. Feels pretty good, right?

Unfortunately, this is often the point in a B2B engagement where the wheels start to come off the bus. Looking at the client’s analytics, everything seems great — traffic is up, conversions are also up, the site is rocking and rolling. Talk to the client, though, and you’ll often find that they’re not happy.

“Leads are up, but sales aren’t,” they might say, or “yes, we’re getting more leads, but they’re the wrong leads.” You might even hear that the sales team hates getting leads from SEO, because they don’t convert to sale, or if they do, only for small-dollar deals.

What happened?

At this point, nobody could blame you for becoming frustrated with your client. After all, they specifically said that all they cared about was getting more leads — so why aren’t they happy? Especially when you’re making the phone ring off the hook?

A key to client retention at this stage is to understand things from your client’s perspective — and particularly, from their sales team’s perspective. The important thing to remember is that when your client told you they wanted to focus on lead volume, they weren’t lying to you; it’s just that their needs have changed since having that conversation.

Chances are, your new B2B client didn’t seek out your services because everything was going great for them. When a lead gen company seeks out a new marketing partner, it’s typically because they don’t have enough leads in their pipeline. “Hungry for leads” isn’t a situation any sales team wants to be in: every minute they spend sitting around, waiting for leads to come in is a minute they’re not spending meeting their sales and revenue targets. It’s really stressful, and could even mean their jobs are at stake. So, when they brought you on, is it any wonder their first order of business was “just get us the leads?” Any lead is better than no lead at all.

Now, however, you’ve got a nice little flywheel running, bringing new leads to the sales team’s inbox all the livelong day, and the team has a whole new problem: talking to leads that they perceive as a waste of their time. 

A different kind of lead

Lead-gen SEO is often a top-of-funnel play. Up to the point when the client brought you on, the leads coming in were likely mostly from branded and direct traffic — they’re people who already know something about the business, and are closer to being ready to buy. They’re already toward the middle of the sales funnel before they even talk to a salesperson.

SEO, especially for a business with any kind of established brand, is often about driving awareness and discovery. The people who already know about the business know how to get in touch when they’re ready to buy; SEO is designed to get the business in front of people who may not already know that this solution to their problems exists, and hopefully sell it to them.

A fledgling SEO campaign should generate more leads, but it also often means a lower percentage of good leads. It’s common to see conversion rates, both from session to lead and from lead to sale, go down during awareness-building marketing. The bet you’re making here is that you’re driving enough qualified traffic that even as conversion rates go down, your total number of conversions (again, both to lead and to sale) is still going up, as is your total revenue.

So, now you’ve brought in the lead volume that was your initial mandate, but the leads are at a different point in their customer journey, and some of them may not be in a position to buy at all. This can lead to the perception that the sales team is wasting all of their time talking to people who will never buy. Since it takes longer to close a sale than it does to disqualify a lead, the increase in less-qualified leads will become apparent long before a corresponding uptick in sales — and since these leads are earlier in their customer journey, they may take longer to convert to sale than the sales team is used to.

At this stage, you might ask for reports from the client’s CRM, or direct access, so you can better understand what their sales team is seeing. To complicate matters further, though, attribution in most CRMs is kind of terrible. It’s often very rigid; the CRM’s definitions of channels may not match those of Google Analytics, leading to discrepancies in channel numbers; it may not have been set up correctly in the first place; it’s opaque, often relying on “secret sauce” to attribute sales per channel; and it still tends to encourage salespeople to focus on the first or last touch. So, if SEO is driving a lot of traffic that later converts to lead as Direct, the client may not even be aware that SEO is driving those leads.

None of this matters, of course, if the client fires you before you have a chance to show the revenue that SEO is really driving. You need to show that you can drive lead quality from the get-go, so that by the time the client realizes that lead volume alone isn’t what they want, you’re prepared to have that conversation.

Resist the temptation to qualify at the keyword level

When a client is first distressed about lead quality, it’s tempting to do a second round of keyword research and targeting to try to dial in their ideal decision-maker; in fact, they may specifically ask you to do so. Unfortunately, there’s not a great way to do that at the query level. Sure, enterprise-level leads might be searching “enterprise blue widget software,” but it’s difficult to target that term without also targeting “blue widget software,” and there’s no guarantee that your target customers are going to add the “enterprise” qualifier. Instead, use your ideal users’ behaviors on the site to determine which topics, messages, and calls to action resonate with them best — then update site content to better appeal to that target user.

Change the onboarding conversation

We’ve already talked about asking clients, “what makes a lead a good lead?” I would argue, though, that a better question is “how do you qualify leads?” 

Sit down with as many members of the sales team as you can (since you’re doing this at the beginning of the engagement — before you’re crushing it driving leads, they should have a bit more time to talk to you) and ask how they decide which leads to focus on. If you can, ask to listen in on a sales call or watch over their shoulder as they go through their new leads. 

At first, they may talk about how lead qualification depends on a complicated combination of factors. Often, though, the sales team is really making decisions about who’s worth their time based on just one or two factors (usually budget or title, although it might also be something like company size). Try to nail them down on their most important one.

Implement a lead scoring model

There are a bunch of different ways to do this in Google Analytics or Google Tag Manager (Alex from UpBuild has a writeup of our method, here). Essentially, when a prospect submits a lead conversion form, you’ll want to:

  • Look for the value of your “most important” lead qualification factor in the form,
  • And then fire an Event “scoring” the conversion in Google Analytics as e.g. Hot, Warm, or Cold.

This might look like detecting the value put into an “Annual Revenue” field or drop-down and assigning a score accordingly, or using RegEx to detect when the “Title” field contains Director, Vice President, or CMO and scoring higher. I like to use the same Event Category for all conversions from the same form, so they can all roll up into one Goal in Google Analytics, then use the Action or Label field to track the scoring data. For example, I might have an Event Category of “Lead Form Submit” for all lead form submission Events, then break out the Actions into “Hot Lead — $5000+,” “Warm Lead — $1000–$5000,” etc.
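As a rough illustration of the scoring step above, here’s a minimal sketch of the kind of function that might run in a Google Tag Manager custom tag on form submit. The field names (`annual_revenue`, `title`), thresholds, and event shape are all illustrative assumptions, not a specific tool’s API:

```javascript
// Score a lead into Hot / Warm / Cold buckets based on the "most important"
// qualification factor. Field names and thresholds are hypothetical.
function scoreLead(formValues) {
  var revenue = parseInt(formValues.annual_revenue, 10) || 0;
  // RegEx check on the Title field, as described above.
  var seniorTitle = /director|vice president|vp|cmo/i.test(formValues.title || "");

  if (revenue >= 5000 || seniorTitle) {
    return "Hot Lead — $5000+";
  }
  if (revenue >= 1000) {
    return "Warm Lead — $1000–$5000";
  }
  return "Cold Lead — under $1000";
}

// Build the GA event: one shared Category so all submissions from this form
// roll up into a single Goal, with the score carried in the Action field.
// Only the score bucket is passed, never individual lead details (PII).
function buildLeadEvent(formValues) {
  return {
    event: "gaEvent",
    eventCategory: "Lead Form Submit",
    eventAction: scoreLead(formValues)
  };
}

// In GTM, this object would then be pushed to the dataLayer, e.g.:
// window.dataLayer.push(buildLeadEvent(values));
console.log(buildLeadEvent({ annual_revenue: "7500", title: "Analyst" }).eventAction);
```

Because every form submission shares one Event Category, total lead volume stays a single Goal, while the Action dimension lets you segment Hot, Warm, and Cold in any GA report.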

Note: Don’t use this methodology to pass individual lead information back into Google Analytics. Even something like Job Title could be construed as Personally Identifiable Information, a big no-no where Google Analytics is concerned. We’re not trying to track individual leads’ behaviors, here; we’re trying to group conversions into ranges.

How to use scored leads

Drive the conversation around sales lifecycle. The bigger the company and the higher the budget, the more time and touches it will take before they’re ready to even talk to you. This means that with a new campaign, you’ll typically see Cold leads coming in first, then Hot and Warm trickling in over time. Capturing this data allows you to set an agreed-upon time in the future when you and the client can discuss whether this is working, instead of cutting off campaigns/strategies before they have a chance to perform (it will also allow you to correctly set Campaign time-out in GA to reflect the full customer journey).

Allocate spend. How do your sales team’s favorite leads tend to get to the site? Does a well-timed PPC or display ad after their initial visit drive them back to make a purchase? Understanding the channels your best leads use to find and return to the site will help your client spend smarter.

Create better-targeted content. Many businesses with successful blogs will have a post or two that drives a great deal of traffic, but almost no qualified leads. Understanding where your traffic goals don’t align with your conversion goals will keep you from wasting time creating content that ranks, but won’t make money.

Build better links. The best links don’t just drive “link equity,” whatever that even means anymore — they drive referral traffic. What kinds of websites drive lots of high-scoring leads, and where else can you get those high-quality referrals?

Optimize for on-page conversion. How do your best-scoring leads use the site? Where are the points in the customer journey where they drop off, and how can you best remove friction and add nurturing? Looking at how your Cold leads use the site will also be valuable — where are the points on-site where you can give them information to let them know they’re not a fit before they convert?

The earlier in the engagement you start collecting this information, the better equipped you’ll be to have the conversation about lead quality when it rears its ugly head.


How to Get a Customer to Edit Their Negative Review



“When you forgive, you in no way change the past — but you sure do change the future.” — Bernard Meltzer

Your brand inhabits a challenging world in which its consumers’ words make up the bulk of your reputation. Negative reviews can feel like the ultimate revenge, punishing unsatisfactory experiences with public shaming, eroded local rankings, and attendant revenue loss. Some business owners become so worried about negative reviews, they head to fora asking if there is any way to opt out, even querying whether they should simply remove their business listings altogether rather than face the discordant music.

But hang in there. Local business customers may be more forgiving than you think. In fact, your customers may see things quite differently than you’d expect.

I’ve just completed a study of consumer behavior as it relates to negative reviews becoming positive ones and I believe this blog post will hold some very welcome surprises for concerned local business owners and their marketers — I know that some of what I learned both surprised and delighted me. In fact, it’s convinced me that, in case after case, negative reviews aren’t what we might think they are at all.

Let’s study this together, with real-world examples, data, a poll, and takeaways that could transform your outlook. 

Stats to start with

Your company winds up with a negative review, and the possibility of a permanently lost customer. Marketing wisdom tells us that it’s more costly to acquire a new customer than to keep an existing one happy. But it’s actually more far-reaching. The following list of stats tells the story of why you want to do anything you can to get the customer to edit a bad review to reflect more positive sentiment:

  • 57 percent of consumers will only use a business if it has four or more stars — (BrightLocal)
  • One study showed that a ~1.5-star rating increase improved conversions from 10.4 percent to 12.8 percent, representing about 13,000 more leads for the brand. — (Location3)
  • 73.8 percent of customers are either likely or extremely likely to continue doing business with a brand that resolves their complaints. — (GatherUp)
  • A typical business only hears from four percent of its dissatisfied customers, meaning that the negative reviews you rectify for outspoken people could solve problems for silent ones. — (Ruby Newell-Lerner)
  • 89 percent of consumers read businesses’ responses to reviews. — (BrightLocal)

The impact of ratings, reviews, and responses is so clear that every local brand needs to devote resources to better understanding this scenario of sentiment and customer retention.

People power: One reason consumers love reviews

The Better Business Bureau was founded in 1912. The Federal Trade Commission made its debut just two years later. Consumer protections have long been deemed a necessity, but until the internet put the potential of mass reviews directly into individuals’ hands, the “little guy” often felt he lacked a truly audible voice when the “big guy” (business) didn’t do right by him.

You can see how local business review platforms have become a bully pulpit, empowering everyday people to make their feelings known to a large audience. And, you can see from reviews, like the one below, the relish with which some consumers embrace that power:

Here, a customer is voicing the belief that they outwitted an entity which would otherwise have defrauded them, if not for the influence of a review platform. That’s our first impression. But if we look a little closer, what we’re really seeing is that the platform is a communications tool between consumer and brand. The reviewer is saying:

“The business has to do right by me if I put this on Yelp!”

What they’re communicating isn’t nice, and may well be untrue, but it is certainly a message they want to be amplified.

And this is where things get interesting.

Brand power: Full of surprises!

This month, I created a spreadsheet to organize data I was collecting about negative reviews being transformed into positive ones. I searched Yelp for the phrase “edited my review” in cities in every region of the United States and quickly amassed 50 examples for in-depth analysis. In the process, I discovered three pieces of information that could be relevant to your brand.

Surprise #1: Many consumers think of their reviews as living documents

In this first example, we see a customer who left a review after having trouble making an appointment and promising to update their content once they’d experienced actual service. As I combed through consumer sentiment, I was enlightened to discover that many people treat reviews as live objects, updating them over time to reflect evolving experiences. How far do reviewers go with this approach? Just look:

In the above example, the customer has handled their review in four separate updates spanning several days. If you look at the stars, they went from high to low to high again. It’s akin to live updates from a sporting event, and that honestly surprised me to see.

Brands should see this as good news because it means an initial negative review doesn’t have to be set in stone.

Surprise #2: Consumers can be incredibly forgiving

“What really defines you is how you handle the situation after you realize you made a mistake.”

I couldn’t have said it better myself, and this edited review typifies for me the reasonableness I saw in case after case. Far from being the scary, irrational customers that business owners dread, it’s clear that many people have the basic understanding that mistakes can happen… and can be rectified. I even saw people forgiving auto dealerships for damaging their cars, once things had been made right.

Surprise #3: Consumers can be self-correcting.

The customer apparently isn’t “always right,” and some of them know it. I saw several instances of customers editing their reviews after realizing that they were the ones who made a mistake. For example, one rather long review saga contained this:

“I didn’t realize they had an hourly option so my initial review was 3 stars. However, after the company letting me know they’d be happy to modify my charges since I overlooked the hourly option, it was only fair to edit my review. I thought that was really nice of them. 5 stars and will be using them again in the future.”

When a customer has initially misunderstood a policy or offering and the business in question takes the time to clarify things, fair-minded individuals can feel honor-bound to update their reviews. Many updated reviews contained phrases like “in good conscience” and “in all fairness.”

Overall, in studying this group of reviewers, I found them to be reasonable people, meaning that your brand has (surprising) significant power to work with dissatisfied customers to win back their respect and their business.

How negative reviews become positive: Identifying winning patterns

In my case study, the dominant, overall pattern of negative reviews being transformed into positive ones consisted of these three Rs:

  1. Reach — the customer reaches out with their negative experience, often knowing that, in this day and age, powerful review platforms are a way to reach brands.
  2. Remedy — Some type of fix occurs, whether this results from intervention on the part of the brand, a second positive experience outweighing an initial negative one, or the consumer self-correcting their own misunderstanding.
  3. Restoration — The unhappy customer is restored to the business as a happy one, hopefully, ready to trust the brand for future transactions, and the reputation of the brand is restored by an edited review reflecting better satisfaction.

Now, let’s bucket this general pattern into smaller segments for a more nuanced understanding. Note: There is an overlap in the following information, as some customers experienced multiple positive elements that convinced them to update their reviews.

Key to review transformation:

  • 70 percent mentioned poor service/rude service rectified by a second experience in which staff demonstrated caring.
  • 64 percent mentioned the owner/manager/staff proactively and directly reached out to the customer with a remedy.
  • 32 percent mentioned item replaced or job re-done for free.
  • 20 percent mentioned customer decided to give a business a second chance on their own and was better-pleased by a second experience.
  • 6 percent mentioned customer realized the fault for a misunderstanding was theirs.

From this data, two insights become clear and belong at the core of your reputation strategy:

Poor and rude service seriously fuel negative reviews

This correlates well with the findings of an earlier GatherUp study demonstrating that 57 percent of consumer complaints revolve around customer service and employee behavior. It’s critical to realize that nearly three-quarters of these disasters could be turned around with subsequent excellent service. As one customer in my study phrased it:

“X has since gone above and beyond to resolve the issue and make me feel like they cared.”

Proactive outreach is your negative review repair kit

Well over half of the subjects in my study specifically mentioned that the business had reached out to them in some way. I suspect many instances of such outreach went undocumented in the review updates, so the number may actually be much higher than represented.

Outreach can happen in a variety of ways:

  • The business may recognize who the customer is and have their name and number on file due to a contract.
  • The business may not know who the customer is but can provide an owner response to the review that includes the company’s contact information and an earnest request to get in touch.
  • The business can DM the customer if the negative review is on Yelp.

You’re being given a second chance if you get the customer’s ear a second time. It’s then up to your brand to do everything you can to change their opinion. Here’s one customer’s description of how far a local business was willing to go to get back into his good graces:

“X made every effort to make up for the failed programming and the lack of customer service the night before. My sales rep, his manager and even the finance rep reached out by phone, text and email. I was actually in meetings all morning, watching my phone buzz with what turned out to be their calls, as they attempted to find out what they could do to make amends. Mark came over on my lunch break, fixed/reprogrammed the remote and even comped me a free tank of gas for my next fill up. I appreciated his sincere apologies and wanted to update/revise my review as a token of my appreciation.”

What a great example of dedication to earning forgiveness!

Should you actively ask restored customers to edit their negative reviews?

I confess — this setup makes me a bit nervous. I took a Twitter poll to gauge sentiment among my followers:

Respondents showed strong support for asking a customer who has been restored to happiness to edit their review. However, I would add a few provisos.

Firstly, not one of the subjects in my study mentioned that the business requested they update their review. Perhaps it went undocumented, but there was absolutely zero suggestion that restored customers had been prompted to re-review the business.

Secondly, I would want to be 100 percent certain that the customer is, indeed, delighted again. Otherwise, you could end up with something truly awful on your review profile, like this:

Suffice it to say, never demand an edited review, and certainly don’t use one as blackmail!

With a nod to the Twitter poll, I think it might be alright to mention you’d appreciate an updated review. I’d be extremely choosy about how you word your request so as not to make the customer feel obligated in any way. And I’d only do so if the customer was truly, sincerely restored to a sense of trust and well-being by the brand.

So what are negative reviews, really?

In so many cases, negative reviews are neither punishment nor the end of the road.

They are, in fact, a form of customer outreach that’s often akin to a cry for help.

Someone trusted your business and was disappointed. Your brand needs to equip itself to ride to the rescue. I was struck by how many reviewers said they felt uncared-for, and impressed by how business owners like this one completely turned things around:

In this light, review platforms are simply a communications medium hosting back-and-forth between customer people and business people. Communicate with a rescue plan and your reputation can “sparkle like diamonds”, too.


I want to close by mentioning how evident it was to me, upon completing this study, that reviewers take their task seriously. The average word count of the Yelp reviews I surveyed was about 250 words. If half of the 12,584 words I examined expressed disappointment, your brand is empowered to make the other half express forgiveness for mistakes and restoration of trust.

It could well be that the industry term “negative” review is misleading, causing unnecessary fear for local brands and their marketers. What if, instead, we thought of this influential content as “reviews-in-progress,” with the potential for transformation charting the mastery of your brand at customer service?

The short road is that you prevent negative experiences by doubling down on staff hiring and training practices that leave people with nothing to complain about in the entire customer service ecosystem. But re-dubbing online records of inevitable mistakes as “reviews-in-progress” simply means treading a slightly longer road to reputation, retention, and revenue. If your local brand is in business for the long haul, you’ve got this!


Supercharge Your Link Building Outreach! 5 Tips for Success – Whiteboard Friday



Spending a ton of effort on outreach and waking up to an empty inbox is a demoralizing (and unfortunately common) experience. And when it comes to your outreach, getting those emails opened is half the battle. In today’s Whiteboard Friday, we welcome recent MozCon 2019 alum Shannon McGuirk to share five of her best tips to make your outreach efficient and effective — the perfect follow-up to her talk about building a digital PR newsroom.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Shannon McGuirk. I’m the Head of PR and Content at a UK-based digital marketing agency called Aira. So at this year’s MozCon, I spoke about how to supercharge your link building with a digital PR newsroom and spoke about the three different types of media and journalist writing that we should be tapping into.

But I only had half an hour to be able to share my insights and thoughts. As a next step from that presentation, I need to equip you guys with everything in order to be able to go out and actually speak to these journalists. So for my Whiteboard Friday today, I’m going to be sharing my five tips for success for supercharging your outreach, specifically evolved around email outreach alone.

In the U.K. and in the U.S. as well, we’re seeing, as our industry grows and develops, journalists don’t want to be called anymore, and instead the best way to get in touch with them is via email or on social media. So let’s dive straight in. 

1. Subject lines A/B tests

So tip one then. I want to share some insights with you that I did for subject lines and specifically around some A/B testing.

Back in the early part of the summer, around April time, we started working with a tool called BuzzStream. That allowed us to send out test emails with a variety of different subject lines, so we could understand how many opens we were getting and try to encourage journalists, through the use of our language and emojis, to open up those all-important pitch emails so that we could follow up and make sure that we were bringing those links home.

Journalist’s name in subject line

So we ran two different types of A/B tests. The first one here you can see was with the journalist’s name in the subject line and the journalist’s name without. It turns out then that actually, when we were running this data, we were seeing far more opens if we had the journalist’s name in the subject line. It was getting their attention. It was getting that cut-through that we needed when they’re getting hundreds of emails per day and to see their name in a little nib meant that we were increasing open rates. So that was our first learning from test number one. 

“Data” vs “story tip”

Now test number two, we had a bit of a gut feel and a little bit of an instinct to feel that there were certain types of words and language that we were using that were either getting us more open rates or not. For this one specifically, it was around the use of the word “data.” So we compared the use of the word “data” with story tip, and again including the journalist’s name and not, to try and see how many journalists were opening up our emails.

At Aira, we have around a 33% open rate with any campaigns that we launch, and again this is tracked through BuzzStream. But when we started to do these A/B tests, combining “story tip,” the journalist’s full name, and then the word “data,” we increased that to 52%. That jump doesn’t mean you’re going to get 52% more links off the back of your outreach, but it means more people are opening your email and considering your data and your campaigns, which is half of the problem; we all know as outreachers, content marketers, and digital PRs how difficult it can be to get someone to even open that initial approach.
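Before acting on a lift like 33% to 52%, it’s worth checking the result isn’t just noise. Here’s a sketch of a standard two-proportion z-test you could run on your own numbers; the per-variant send count of 200 emails below is a hypothetical figure, not from the study:

```javascript
// Two-proportion z-test: is the open-rate difference between two subject
// line variants statistically meaningful?
function twoProportionZ(opensA, sentA, opensB, sentB) {
  var pA = opensA / sentA;
  var pB = opensB / sentB;
  // Pooled proportion under the null hypothesis of no real difference.
  var pooled = (opensA + opensB) / (sentA + sentB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / sentA + 1 / sentB));
  return (pB - pA) / se;
}

// Hypothetical sample: 33% opens on plain subject lines vs. 52% with
// "story tip + name + data", 200 sends per variant.
var z = twoProportionZ(66, 200, 104, 200);

// |z| > 1.96 corresponds to p < 0.05, two-tailed.
console.log(z > 1.96 ? "lift is significant" : "keep testing");
```

With small send volumes, a gap this size may not clear the significance bar, so it’s worth letting tests run longer before declaring a winning subject-line formula.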

So now, off the back of those A/B tests, make sure that whenever you’re writing those emails out you have story tip for Tom and then followed by data and whatever research you’ve got in that campaign. 

2. Headline language

For tip two then, keeping on the theme of language, I did a piece of research for another conference that I was speaking at earlier in the summer called SearchLeeds and another one called outREACH.

I analyzed 35,000 articles across 6 different top 10 news sites in the U.K. The language that came out of that, around the headlines specifically, was so interesting. So I split these 35,000 articles down into relevant sectors, took the likes of travel, automotive, business, what have you, and then I was able to create around 30 word clouds according to different articles that had been produced within these different industries at different titles.

I was able to start to see common words that were used in headlines, and that got my mind ticking a bit. I was starting to think, well, actually as a team, at Aira, we should be starting to pitch and use language within our pitches that journalists are already using, because they straightaway resonate with the story that we’ve got. So here’s a quick snapshot of the kind of word clouds that the analysis revealed.

You can kind of see some core words shining through. So we’ve got research, best, stats, experts, that kind of thing. Now the top five words that were most commonly used across all sectors within the headlines were: best, worst, data, new, and revealed. Now “data” is really interesting, because if we go back to our A/B testing, we know that that’s a strong word and that that will get you more opens with your subject lines.

But it also reaffirms that that A/B test is right and that we definitely should be using “data.” So combine story tip for that journalist’s name, Tom or what have you, with data and then start to use some of the language here, out of these top five, and again you’re going to increase your open rates, which is half of the problem with what we’re doing with outreach.
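The word-cloud analysis described above boils down to a frequency count over headlines. Here’s a minimal sketch of that idea; the sample headlines and stop-word list are illustrative assumptions, not the actual 35,000-article dataset:

```javascript
// Count word frequencies across a set of headlines to surface the language
// journalists are already using, as in the word-cloud analysis above.
function topHeadlineWords(headlines, limit) {
  // Tiny illustrative stop-word list; a real analysis would use a fuller one.
  var stopWords = ["the", "of", "to", "in", "a", "is", "for", "and"];
  var counts = {};
  headlines.forEach(function (headline) {
    (headline.toLowerCase().match(/[a-z']+/g) || []).forEach(function (word) {
      if (stopWords.indexOf(word) === -1) {
        counts[word] = (counts[word] || 0) + 1;
      }
    });
  });
  // Sort words by frequency, most common first.
  return Object.keys(counts)
    .sort(function (a, b) { return counts[b] - counts[a]; })
    .slice(0, limit);
}

// Hypothetical headlines in the style of the study's findings.
var headlines = [
  "New data reveals the best cities for drivers",
  "Revealed: the worst airports for delays, according to new data",
  "Experts reveal the best time to book a holiday"
];
console.log(topHeadlineWords(headlines, 3));
```

Run per sector, this is essentially how you’d arrive at word clouds like the ones described, and at a top-words list to fold back into your pitch language.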

3. Use color

So tip three then. Now this was quite an experimental approach that we took, and a huge recommendation of mine, when you’re doing your email outreach, is actually to start to use color within that all-important pitch email itself. So we’ve moved from subject lines into looking at the body of the email. We use color and bolding back at Aira.

So we use color straightaway when we’re writing the email. So we’ll start with something like, “Dear Tom, I have a story that you might be interested in.” Straight under that, so we’re already using again the language that they’ll be using, story, going back to our A/B test. But then straight under that, we will bold, capitalize, and put in a really bright color — reds, greens, blues — nice, strong primary colors there the headline that we think Tom might write off the back of our outreach.

So here’s an example. “New data reveals that 21% of drivers have driven with no insurance.” Not the most exciting headline in the world. But if Tom here is an automotive editor or a digital online automotive writer, straightaway he knows what I’m talking to him about. Again, he can start to see how this data can be used to craft stories for his own audience.

Again, as I said, this is quite experimental. We’re in the early phases of it at Aira, but we know it’s working, and it’s something that I learnt, again, at outREACH conference too. Straight under this use of color with headline, you should pull out your key stats. Now only keep those bullet points to three to five. Journalists are busy.

They’re on deadlines. Don’t be having huge, bulk paragraphs or long-winded sentences. Tell them the headline, follow it up with the key stats. Be clean, be punchy, and get to the point really quickly. Below this, obviously sign off and include any press material, Google Drive links, press packs that you’ve got under that. Again, we’re seeing this work really, really well.

We’re still in the early stages, and I hope to share some insights and some data and metrics on how successful it’s been. But we’ve been able to secure links from the likes of the Mail Online and the Telegraph back in the U.K., and also, just last week, FoxBusiness, using this exact approach. 

4. Use emojis

So tip four then, and again this is a really playful technique and something that we only learnt with experimentation.

Start to use emojis within your pitches as well. Now this can be used within the subject line. Again, you’re looking to try and get the journalist to get that piece of attention straightaway and look at your headline. Or start to use them within the body of the email too, because they break up that text and it makes your email stand out far more than if you have someone that’s pitching in a business piece of data and you’ve just got huge stacks and research pieces.

Actually throw in some emojis that are relating to the business world, a laptop or whatever it may be, something that proves your point around the campaign. Again, it’s more engaging for a journalist to read that. It means that they’ll probably remember your email over the other 200 that they’re getting that day. So really nice, simplistic tip then for me.

If you’re pitching something in the automotive world, put a car or traffic lights on the end. If you’re doing something in the travel sphere, sun, beaches, something that just gets that journalist’s eye. It means that your email is going to be opened above anyone else’s. 

5. Use Twitter

Finally then, so I know I’ve kept this around email outreach for the last couple of points.

But one thing that we’re seeing work really well with the implementation of this digital PR newsroom is starting to approach and speak to journalists on Twitter. Twitter, we know, is a news source for journalists. Trending topics will obviously be picked up in the press and covered on a daily if not hourly basis. As soon as something breaks on Twitter, we’ll see journalists, writers, and bloggers turn that trending feature into an article that’s really resonant and relevant for their audience.

So in the run-up to your campaign, way before the launch, we’re talking like three or four weeks here, reach out to the journalists on Twitter. Start to engage with them. Like some articles. Start to let them know that you’re in and engaging with them on their social media platform. Don’t push it too hard.

You don’t want to go overboard with this. But a little bit of engagement here and there means that when your email comes into their inbox, it’s not a new name, and you’re already starting to build the foundations of that relationship. Secondary to this then, feel free and start to experiment with DM’ing journalists as well. We know that they’re getting two, three, or four hundred emails per day. If you take to Twitter and send them a quick overview of your up-and-coming campaign via a Twitter DM, it’s likely that they’ll read that on the journey home or potentially when they’re walking from meeting to meeting.

Again, it puts you one step ahead of your competitors. Recently we’ve got some of our best pieces of coverage through warming the press up and specific journalists through Twitter, because when your campaign launches, you’re not going out with it cold. Instead the journalist knows that it’s coming in. They may even have the editorial space to cover that feature for you too. It’s something that we’ve seen really work, and again I can’t stress enough that you really have to find that balance.

You don’t want to be plaguing journalists. You don’t want to be a pain and starting to like every single tweet they do. But if it is relevant and you find an opportunity to engage and speak to them about your campaign in the weeks in advance, it opens up that door. Again, you may be able to secure an exclusive out of it, which means that you get that first huge hit. So there are my five tips for link building in 2019, and they’ll help you supercharge things.

Now if you have any comments for me, any questions, please pop them in the thread below or reach out to me on Twitter. As I’ve just said, feel free to send me a DM. I’m always around and would love to help you guys a little bit more if you do have any questions for me. Thanks, Moz fans.


Did you miss Shannon’s groundbreaking talk at MozCon 2019, How to Supercharge Link Building with a Digital PR Newsroom? Download the deck here and don’t miss out on next year’s conference — super early bird discounts are available now!


Case Study: How a Media Company Grew 400% and Used SEO to Get Acquired



Disclaimer: I’m currently the Director of Demand Generation at Nextiva, and writing this case study post-mortem as the former VP of Marketing at Sales Hacker (Jan. 2017 – Sept. 2018).

Every B2B company is investing in content marketing right now. Why? Because they all want the same thing: Search traffic that leads to website conversions, which leads to money.

But here’s the challenge: Companies are struggling to get traction because competition has reached an all-time high. Keyword difficulty (and CPC) has skyrocketed in most verticals. In my current space, Unified Communication as a Service (UCaaS), some of the CPCs have nearly doubled since 2017, with many keywords hovering close to $300 per click.

Not to mention, organic CTRs are declining, and zero-click queries are rising.

Bottom line: If you’re not creating 10x quality content based on strategic keyword research that satisfies searcher intent and aligns back to business goals, you’re completely wasting your time.

So, that’s exactly what we did. The outcome? We grew from 19k monthly organic sessions to over 100k monthly organic sessions in approximately 14 months, leading to an acquisition by Outreach.

We validated our hard work by measuring organic growth (traffic and keywords) against our email list growth and revenue, which correlated positively, as we expected. 

Organic Growth Highlights

January 2017–June 2018

As soon as I was hired at Sales Hacker as Director of Marketing, I began making SEO improvements from day one. While I didn’t waste any time, you’ll also notice that there was no silver bullet.

This was the result of daily blocking and tackling. Pure execution and no growth hacks or gimmicks. However, I firmly believe that the homepage redesign (in July 2017) was a tremendous enabler of growth.

Organic Growth to Present Day

I officially left Sales Hacker in August of 2018, when the company was acquired by Outreach. However, I thought it would be interesting to see the lasting impact of my work by sharing a present-day screenshot of the organic traffic trend, via Google Analytics. There appears to be a dip immediately following my departure; however, it looks like my successor, Colin Campbell, has picked up the slack and got the train back on the rails. Well done!

Unique considerations — Some context behind Sales Hacker’s growth

Before I dive into our findings, here’s a little context behind Sales Hacker’s growth:

  • Sales Hacker’s blog is 100 percent community-generated — This means we didn’t pay “content marketers” to write for us. Sales Hacker is a publishing hub led by B2B sales, marketing, and customer success contributors. This can be a blessing and a curse at the same time — on one hand, the site gets loads of amazing free content. On the other hand, the posts are not even close to being optimized upon receiving the first draft. That means, the editorial process is intense and laborious.
  • Aggressive publishing cadence (4–5x per week) — Sales Hacker built an incredible reputation in the B2B Sales Tech niche — we became known as the go-to destination for unbiased thought leadership for practitioners in the space (think of Sales Hacker as the sales equivalent to Growth Hackers). Due to high demand and popularity, we had more content available than we could handle. While it’s a good problem to have, we realized we needed to keep shipping content in order to avoid a content pipeline blockage and a backlog of unhappy contributors.
  • We had to “reverse engineer” SEO — In short, we got free community-generated and sponsored content from top sales and marketing leaders at SaaS companies like Intercom, HubSpot, Pipedrive, LinkedIn, Adobe and many others, but none of it was strategically built for SEO out of the box. We also had contributors like John Barrows, Richard Harris, Lauren Bailey, Tito Bohrt, and Trish Bertuzzi giving us a treasure trove of amazing content to work with. However, we had to collaborate with each contributor from beginning to end and guide them through the entire process. Topical ideation (based on what they were qualified to write about), keyword research, content structure, content type, etc. So, the real secret sauce was in our editorial process. Shout out to my teammate Alina Benny for learning and inheriting my SEO process after we hired her to run content marketing. She crushed it for us!
  • Almost all content was evergreen and highly tactical — I made it a rule that we’d never agree to publish fluffy pieces, whether it was sponsored or not. Plain and simple. Because we didn’t allow “content marketers” to publish with us, our content had a positive reputation, since it was coming from highly respected practitioners. We focused on evergreen content strategies in order to fuel our organic growth. Salespeople don’t want fluff. They want actionable and tactical advice they can implement immediately. I firmly believe that achieving audience satisfaction with our content was a major factor in our SEO success.
  • Outranking the “big guys” — If you look at the highest-ranking sales content, it’s the usual suspects. HubSpot, Salesforce, Forbes, Inc, and many other sites that were far more powerful than Sales Hacker. But it didn’t matter as much as traditional SEO wisdom tells us, largely due to the fact that we had authenticity and rawness to our content. We realized most sales practitioners would rather read insights from their peers in their community, above the traditional “Ultimate Guides,” which tended to be a tad dry.
  • We did VERY little manual link building — Our link building was literally an email from me, or our CEO, to a site we had a great relationship with. “Yo, can we get a link?” It was that simple. We never did large-scale outreach to build links. We were a very lean, remote digital marketing team, and therefore lacked the bandwidth to allocate resources to link building. However, we knew that we would acquire links naturally due to the popularity of our brand and the highly tactical nature of our content.
  • Our social media and brand firepower helped us to naturally acquire links — It helps A LOT when you have a popular brand on social media and a well-known CEO who authored an essential book called “Hacking Sales”. Most of Sales Hacker’s articles would get widely circulated by 50+ SaaS partners, which would help drive natural links.
  • Updating stale content was the lowest hanging fruit — The biggest chunk of our new-found organic traffic came from updating / refreshing old posts. We have specific examples of this coming up later in the post.
  • Email list growth was the “north star” metric — Because Sales Hacker is not a SaaS company, and the “product” is the audience, there was no need for aggressive website CTAs like “book a demo.” Instead, we built a very relationship heavy, referral-based sales cadence that was supported by marketing automation, so list growth was the metric to pay attention to. This was also a key component to positioning Sales Hacker for acquisition. Here’s how the email growth progression was trending.

So, now that I’ve set the stage, let’s dive into exactly how I built this SEO strategy.

Bonus: You can also watch the interview I had with Dan Shure on the Evolving SEO Podcast, where I break down this strategy in great detail.

1) Audience research

Imagine you are the new head of marketing for a well-known startup brand. You are tasked with tackling growth and need to show fast results — where do you start?

That’s the exact position I was in. There were a million things I could have done, but I decided to start by surveying and interviewing our audience and customers.

Because Sales Hacker is a business built on content, I knew this was the right choice.

I also knew that I would be able to stand out in an unglamorous industry by talking to customers about their content interests.

Think about it: B2B tech sales is all about numbers and selling stuff. Very few brands are really taking the time to learn about the types of content their audiences would like to consume.

When I was asking people if I could talk to them about their media and content interests, their response was: “So, wait, you’re actually not trying to sell me something? Sure! Let’s talk!”

Here’s what I set out to learn:

  • Goal 1 — Find one major brand messaging insight.
  • Goal 2 — Find one major audience development insight.
  • Goal 3 — Find one major content strategy insight.
  • Goal 4 — Find one major UX / website navigation insight.
  • Goal 5 — Find one major email marketing insight.

In short, I accomplished all of these learning goals and implemented changes based on what the audience told me.

If you’re curious, you can check out my entire UX research process for yourself, but here are some of the key learnings:

Based on these outcomes, I was able to determine the following:

  • Topical “buckets” to focus on — Based on the most common daily tasks, the data told us to build content on sales prospecting, building partnerships and referral programs, outbound sales, sales management, sales leadership, sales training, and sales ops.
  • Thought leadership — 62 percent of site visitors said they kept coming back purely due to thought leadership content, so we had to double down on that.
  • Content Types — Step by step guides, checklists, and templates were highly desired. This told me that fluffy BS content had to be ruthlessly eliminated at all costs.
  • Sales Hacker Podcast — 76 percent of respondents said they would listen to the Sales Hacker Podcast (if it existed), so we had to launch it!

2) SEO site audit — Key findings

I can’t fully break down how to do an SEO site audit step by step in this post (because it would be way too much information), but I will share the key findings and takeaways from our own Site Audit that led to some major improvements in our website performance.

Lack of referring domain growth

Sales Hacker was not able to acquire referring domains at the same rate as competitors. I knew this wasn’t because of a link building acquisition problem, but due to a content quality problem.

Lack of organic keyword growth

Sales Hacker had been publishing blog content for years (before I joined) and there wasn’t much to show for it from an organic traffic standpoint. However, I do feel the brand experienced a remarkable social media uplift by building content that was helpful and engaging. 

Sales Hacker did happen to get lucky and rank for some non-branded keywords by accident, but the amount of content published versus the amount of traffic they were getting wasn’t making sense. 

To me, this immediately screamed that there was an issue with on-page optimization and keyword targeting. It wasn’t anyone’s fault – this was largely due to a startup founder thinking about building a community first, and then bringing SEO into the picture later. 

At the end of the day, Sales Hacker was only ranking for 6k keywords at an estimated organic traffic cost of $8.9k — which is nothing. By the time Sales Hacker got acquired, the site had an organic traffic cost of $122k.

Non-optimized URLs

This is common among startups that are just looking to get content out. This is just one example, but truth be told, there was a whole mess of non-descriptive URLs that had to get cleaned up.

Poor internal linking structure

The internal linking concentration was poorly distributed. Most of the equity was pointing to some of the lowest value pages on the site.
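This audit finding can be made concrete with a small script. This is a minimal sketch, not the tooling we actually used, and it assumes you've exported your crawl's internal links as (source, target) pairs:

```python
from collections import Counter

def inlink_distribution(edges):
    """Given (source, target) pairs for every internal link found in a
    site crawl, return pages ranked by inbound internal links -- a quick
    proxy for where the site's link equity is concentrated."""
    counts = Counter(target for _, target in edges)
    return counts.most_common()
```

Running this over a crawler's inlink export makes it obvious when low-value pages are soaking up most of the internal links.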

Poor taxonomy, site structure, and navigation

I created a mind-map of how I envisioned the new site structure and internal linking scheme. I wanted all the content pages to be organized into categories and subcategories.

My goals with the new proposed taxonomy would accomplish the following:

  • Increase engagement from natural site visitor exploration
  • Allow users to navigate to the most important content on the site
  • Improve landing page visibility from an increase in relevant internal links pointing to them.

Topical directories and category pages eliminated with redirects

Topical landing pages used to exist on the site, but they were eliminated with 301 redirects and disallowed in robots.txt. I didn’t agree with this configuration. Example: /social-selling/
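As a sketch of why I disagreed (the paths are hypothetical, not the site's actual files): when a URL is both 301-redirected and disallowed in robots.txt, crawlers are blocked from requesting it, so they may never see the redirect, and the old page's link equity can be stranded:

```
# robots.txt -- the directory is blocked from crawling
User-agent: *
Disallow: /social-selling/

# .htaccess (Apache) -- a redirect that blocked crawlers never get to follow
Redirect 301 /social-selling/ /blog/
```

Pick one mechanism: if you want the redirect to consolidate equity, leave the URL crawlable.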

Trailing slash vs. non-trailing slash duplicate content with canonical errors

Multiple pages for the same exact intent. Failing to specify the canonical version.
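The standard fix is to pick one version, redirect the other to it, and declare the survivor as canonical on both. A sketch with a hypothetical path:

```html
<!-- Emitted on both /sales-operations and /sales-operations/ (hypothetical path),
     pointing search engines at the single canonical version -->
<link rel="canonical" href="https://www.saleshacker.com/sales-operations/" />
```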

Branded search problems — “Sales Hacker Webinar”

Some of the site’s most important content is not discoverable from search due to technical problems. For example, a search for “Sales Hacker Webinar” returns irrelevant results in Google because there isn’t an optimized indexable hub page for webinar content. It doesn’t get that much search volume (0–10 monthly volume according to Keyword Explorer), but still, that’s 10 potential customers you are pissing off every month by not fixing this.

3) Homepage — Before and after

Sooooo, this beauty right here (screenshot below) was the homepage I inherited in early 2017 when I took over the site.

Fast forward six months later, and this was the new homepage we built after doing audience and customer research…

New homepage goals

  • Tell people EXACTLY what Sales Hacker is and what we do.
  • Make it stupidly simple to sign up for the email list.
  • Allow visitors to easily and quickly find the content they want.
  • Add social proof.
  • Improve internal linking.

I’m proud to say that it all went according to plan. I’m also proud to say that, as a result, organic traffic skyrocketed shortly after.

Special Note: Major shout out to Joshua Giardino, the lead developer who worked with me on the homepage redesign. Josh is one of my closest friends and my marketing mentor. I would not be writing this case study today without him!

There wasn’t one super measurable thing we isolated in order to prove this. We just knew intuitively that there was a positive correlation with organic traffic growth, and figured it was due to the internal linking improvements and increased average session duration from improving the UX.

4) Updating and optimizing existing content

Special note: We enforced “Ditch the Pitch”

Before I get into the nitty-gritty SEO stuff, I’ll tell you right now that one of the most important things we did was blockade contributors and sponsors from linking to product pages and injecting screenshots of product features into blog articles, webinars, etc.

Side note: One thing we also had to do was add a nofollow attribute to all outbound links within sponsored content that sent referral traffic back to partner websites (which is no longer applicable due to the acquisition).
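In markup terms, that boils down to something like this (the URL is illustrative). Google has since also introduced a dedicated rel="sponsored" value for paid placements:

```html
<!-- Outbound link in sponsored content, marked so it passes no link equity -->
<a href="https://partner.example.com/product" rel="nofollow">Partner Product</a>
```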

The #1 complaint we discovered in our audience research was that people were getting irritated with content that was “too salesy” or “too pitchy” — and rightfully so, because who wants to get pitched at all day?

So we made it all about value. Pure education. School of hard knocks style insights. Actionable and tactical. No fluff. No nonsense. To the point.

And that’s where things really started to take off.

Before and after: “Best sales books”

What you are about to see is classic SEO on-page optimization at its finest.

This is what the post originally looked like (and it didn’t rank well for “best sales books”).

And then after…

And the result…

Before and after: “Sales operations”

What we noticed here was a crappy article attempting to explain the role of sales operations.

Here are the steps we took to rank #1 for “Sales Operations:”

  • Built a super optimized mega guide on the topic.
  • Since the old crappy article had some decent links, we 301 redirected it to the new mega guide.
  • Promoted it on social, email, and our usual channels.

Here’s what the new guide on Sales Ops looks like…

And the result…

5) New content opportunities

One thing I quickly realized Sales Hacker had to its advantage was topical authority. Exploiting this was going to be our secret weapon, and boy, did we do it well: 

“Cold calling”

We knew we could win this SERP by creating content that was super actionable and tactical with examples.

Most of the competing articles in the SERP were definition style and theory-based, or low-value roundups from domains with high authority.

In this case, DA doesn’t really matter. The better man wins.

“Best sales tools”

Because Sales Hacker is an aggregator website, we had the advantage of easily out-ranking vendor websites for best and top queries.

Of course, it also helps when you build a super helpful mega list of tools. We included 150+ options to choose from in the list, whereas SERP competitors did not even come close.

“Channel sales”

Notice how Sales Hacker’s article from 2017 still beats HubSpot’s 2019 version. Why? Because we probably satisfied user intent better than they did.

For this query, we figured out that users really want to know about Direct Sales vs Channel Sales, and how they intersect.

HubSpot went for the generic, “factory style” Ultimate Guide tactic.

Don’t get me wrong, it works very well for them (especially with their 91 DA), but here is another example where nailing the user intent wins.

“Sales excel templates”

This was pure lead gen gold for us. Everyone loves templates, especially sales excel templates.

The SERP was easily winnable because the competition was so BORING in their copy. Not only did we build a better content experience, but we used numbers, lists, and power words that salespeople like to see, such as FAST and Pipeline Growth.

Special note: We never used long intros

The one trend you’ll notice is that all of our content gets RIGHT TO THE POINT. This is inherently obvious, but we also uncovered it during audience surveying. Salespeople don’t have time for fluff. They need to cut to the chase ASAP, get what they came for, and get back to selling. It’s really that straightforward.

When you figure out something THAT important to your audience, (like keeping intros short and sweet), and then you continuously leverage it to your advantage, it’s really powerful.

6) Featured Snippets

Featured snippets became a huge part of our quest for SERP dominance. Even for SERPs where organic clicks have declined, we didn’t mind as much, because we knew we were getting the snippet and free brand exposure.

Here are some of the best featured snippets we got!

Featured snippet: “Channel sales”

Featured snippet: “Sales pipeline management”

Featured snippet: “BANT”

Featured snippet: “Customer success manager”

Featured snippet: “How to manage a sales team”

Featured snippet: “How to get past the gatekeeper”

Featured snippet: “Sales forecast modeling”

Featured snippet: “How to build a sales pipeline”

7) So, why did Sales Hacker get acquired?

At first, it seems weird. Why would a SaaS company buy a blog? It really comes down to one thing — community (and the leverage you get with it).

Two learnings from this acquisition are:

1. It may be worth acquiring a niche media brand in your space

2. It may be worth starting your own niche media brand in your space

I feel like most B2B companies (not all, but most) come across as only trying to sell a product — because most of them are. You don’t see the majority of B2B brands doing a good job on social. They don’t know how to market to emotion. They completely ignore top-funnel in many cases and, as a result, get minimal engagement with their content.

There are really so many areas of opportunity to exploit in B2B marketing if you know how to leverage that human emotion — it’s easy to stand out if you have a soul. Sales Hacker became that “soul” for Outreach — that voice and community.

But one final reason why a SaaS company would buy a media brand is to get the edge over a rival competitor. Especially in a niche where two giants are battling over the top spot.

In this case, it’s Outreach’s good old arch-nemesis, Salesloft. You see, both Outreach and Salesloft are fighting tooth and nail to win a new category called “Sales Engagement”.

As part of the acquisition process, I prepared a deck that highlighted how beneficial it would be for Outreach to acquire Sales Hacker, purely based on the traffic advantage it would give them over Salesloft.

Sales Hacker vs. Salesloft vs Outreach — Total organic keywords

This chart from 2018 (data exported via SEMrush) displays that Sales Hacker was ranking for more total organic keywords than Salesloft and Outreach combined.

Sales Hacker vs. Salesloft vs Outreach — Estimated traffic cost

This chart from 2018 (data exported via SEMrush) displays the cost of the organic traffic compared by domain. Sales Hacker had the highest traffic cost because it ranked for more commercial terms.

Sales Hacker vs. Salesloft vs Outreach — Rank zone distributions

This chart from 2018 (data exported via SEMrush) displays the rank zone distribution by domain. Sales Hacker ranked for more organic keywords across all search positions.

Sales Hacker vs. Salesloft vs Outreach — Support vs. demand keywords

This chart from 2018 (data exported via SEMrush) displays support vs. demand keywords by domain. Because Sales Hacker did not have a support portal, all its keywords were inherently demand focused.

Meanwhile, Outreach was mostly ranking for support keywords at the time. Compared to Salesloft, they were at a massive disadvantage.


I wouldn’t be writing this right now without the help, support, and trust that I got from so many people along the way.

  • Joshua Giardino — Lead developer at Sales Hacker, my marketing mentor and older brother I never had. Couldn’t have done this without you!
  • Max Altschuler — Founder of Sales Hacker, and the man who gave me a shot at the big leagues. You built an incredible platform and I am eternally grateful to have been a part of it.
  • Scott Barker — Head of Partnerships at Sales Hacker. Thanks for being in the trenches with me! It’s a pleasure to look back on this wild ride, and wonder how we pulled this off.
  • Alina Benny — My marketing protege. Super proud of your growth! You came into Sales Hacker with no fear and seized the opportunity.
  • Mike King — Founder of iPullRank, and the man who gave me my very first shot in SEO. Thanks for taking a chance on an unproven kid from the Bronx who was always late to work.
  • Yaniv Masjedi — Our phenomenal CMO at Nextiva. Thank you for always believing in me and encouraging me to flex my thought leadership muscle. Your support has enabled me to truly become a high-impact growth marketer.

Thanks for reading — tell me what you think below in the comments!



How to Boost Content Linkability Without Wasting Your Marketing Budget



I’m always fascinated by the marketing budgets of enterprise-level companies that are ready to pay astronomical sums to contractors. A recent stir in the community was thanks to Hertz, which paid $32M to the Accenture agency, which (so far) hasn’t resulted in any substantial changes to their site.

Though I personally don’t work with clients who throw around millions of dollars, that doesn’t affect the quality of the services I provide. My average client wants to get the maximum by spending as little as possible. It might sound like a tough job for me, and indeed it is, but I love the challenges that a small budget brings, as it helps me stay creative and reach new professional heights.

So while the budget isn’t a challenge, changing my client’s mindset is, and that’s because all of my clients are victims of one of the biggest misconceptions about content marketing: They think that once they start publishing content pieces regularly, inbound traffic will hit their site like a meteorite.

And it’s not just the traffic — links are subject to a similar misconception. Each time I share studies like the one by Brian Dean that clearly show links don’t come on their own, there’s always someone who’s going to say: “That’s because their content’s just not good enough.” I hear the same thing on calls with clients who ask for quality content with zero focus on links.

The bottom line is, traffic and links don’t just show up out of thin air. Regardless of how good your content is, how well structured and valuable it may seem, it has nearly zero chances of getting attention in today’s overcrowded digital space.

In this post, I want to share with you five bulletproof tactics that help me boost content linkability without having a big fat budget to waste.

A note on content and modern-day link building

Before we dive into the best ways to boost your content without breaking the bank, it’s important to touch on what link-building is today. Links are a digital marketing currency — which you need to earn and spend wisely. And to earn them, you need to build relationships. 

A while ago, I noticed a shift in a client’s mindset: After a few projects delivered together, they started to ask for in-depth forms of content like how-to’s, case studies, and guides — which (according to Brian’s research) is exactly the type of content that has the highest chances of getting links. But that’s not necessarily the number one reason why people allocate links.

Links are inherently relationships. And if you agree that linking to a strategic partner brings more benefits than referring to a random stranger, then you’ll appreciate Robbie Richards’ methods.

Robbie’s roundups are a textbook definition of highly linkable content. A post about the best keyword research tools published not that long ago on his blog attracted nearly 300 referring domains and a decent organic traffic share:

What’s his secret?

Robbie made sure to target the experts within his business circle. In a nutshell, his roundup posts work as part of a well-delivered outreach strategy that has a strong focus on gaining links by leveraging existing relationships. This is the key to modern-day link-building — a combination of content, links, and partnerships. 

Without further ado, let’s talk about the best ways to promote content without any where-do-I-get-the-money-for-it drama.

5 bulletproof ways to blow up your content without breaking the bank

If you’re creating quality content with zero focus on links, you won’t be getting optimal traffic. The only chance to make your content stand out is to focus on its potential linkability even before you actually start writing it. Here are some of my favorite ways to get your content seen.

1. Adding expert quotes

Quoting an expert is one of my favorite ways to boost content linkability and shareability. It’s quick, easy, and doesn’t require a significant time investment. When you write outside your area of expertise, adding a quote from a thought leader grants your content more credibility and value, not to mention boosting its linking potential.

Depending on how influential your company is, you can either select an existing quote or reach out to the experts and ask for a new one.

Here’s a tip: If you decide to go with a pre-existing quote, contact the expert in advance to confirm it. This way, you can make sure that they still stand by that opinion, plus, they’re okay with you quoting them.

Remember, while quoting experts is a good idea, you also need to find the right expert and the right quote. Here’s how to do that:

  • If your brand has a big audience, I recommend starting by checking your current followers and subscribers across various channels, including social media. You might not know it, but there’s a good chance you’ll find real influencers among people who follow your brand’s pages. To speed up the process of spotting influencers among your Twitter followers, you can use Followerwonk. This tool allows you to export all your followers to a list and sort them by the size of their audience.
  • Another way is to analyze the websites that link back to your site. To do that, you can use Moz Link Explorer that will show the list of URLs that are referring to your site. Chances are, some of those authors are pretty influential in their niche.
  • Finally, you could use BuzzSumo to find relevant influencers to contact. For example, you could export a list of bloggers who are contributing to the industry-leading blogs.

The last option is less suitable for link building purposes, as the influencers you find this way have no idea your business exists and are harder to get on board. However, it’s not impossible. Before getting in touch, make sure to scratch their backs: Share their content on your social media, sign up for their newsletter, etc. To find an influencer’s most recent pieces, search BuzzSumo Content Analyzer by “Author: [INSERT NAME].” This helps build a bridge and create the right first impression.

Don’t forget that expert quotes need to be set off in your content with special formatting, which means you may need to involve a designer/developer.

Here are a few examples that I personally find quite visually appealing:

And another one:
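Going back to the Followerwonk tip in the first bullet above: once you have the CSV export of your followers, surfacing the biggest accounts is a few lines of scripting. A sketch; the column names are assumptions, so match them to your actual export:

```python
import csv

def top_followers(csv_path, n=10):
    """Rank a Followerwonk-style CSV export by audience size and return
    the top n (screen_name, followers) pairs.
    NOTE: the 'screen_name' and 'followers' column names are assumptions --
    rename them to match the headers in your actual export."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    rows.sort(key=lambda r: int(r["followers"]), reverse=True)
    return [(r["screen_name"], int(r["followers"])) for r in rows[:n]]
```

The result is a shortlist of follower accounts worth approaching for quotes first.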

2. Strategically linking back to blogs that you’re interested in

    Strategic link building is like playing poker while blindfolded. A strategic approach always pays off in the long run in almost any area, but when applied to link building, it depends on how well you can spot linking opportunities. Based on this, your chances of acquiring links are either very high or very low.

    If you want industry leaders to link back to your content someday, you have to prove that your content deserves their attention. The best way to get your foot in the door is to link back to them.

    You need to find the right experts to link back to. How do you do that?

    The mechanics behind finding the right sites to refer to are similar to what I shared in the section about expert quotes. However, there are a few more strategies that I want to add:

    • Are you a part of any industry groups on Facebook? If so, go and check the members of those groups and find people who are also involved in link building. Now you have a legit reason to contact them (since you’re both part of the same group on Facebook/LinkedIn) and ask whether they’re interested in getting a link in your upcoming post. Please note that you shouldn’t skip this step: By asking, you’re making them aware that you expect the favor to be returned.
    • Have you ever participated in any roundups? If yes, then reach out to the experts that were also featured in this post.
    • Finally, check your current blog subscribers, clients, and partners. The chances that they’re also interested in partnering up on a link building side are quite high.

    3. Adding good images/GIFs and hiring a designer for professional-looking visuals

    In 2019, using stock images in your content is a big no. After all, they are easily recognizable for their abstract nature and give away the fact that the author didn’t invest much in creating custom visuals.

    However, there is a way to adapt stock imagery to your unique brand style and still make it work. And to do it, you don’t even need to hire a designer right away.

    Drag-and-drop tools like Venngage, Canva, or Visme make it easy to create pretty nice graphics. For example, Canva has a lot of great grids and predefined templates, which makes the whole design process really fast.

    What you need to do is take a good-looking cover image, for example, like the ones we use on our blog, and dress it up with custom-made designs in Canva. You can add your picture, your brand’s logo, or anything else your heart desires. Such an approach allows us to maintain our own unique style while staying within budget.

    Static images are not the only way to pretty up your content. One of my favorite visual elements is GIFs. They are perfect for visualizing step-by-steps and how-tos and can easily demonstrate how to perform something in a digital tool. You can even use them to tell a story. At one of my recent presentations, I used a GIF to explain why simply posting on Twitter is not enough to get attention to a brand.

    I’ve seen many posts that were able to acquire loads of links and social shares thanks to good graphics, for instance, this post that featured SEO experts in Halloween costumes.

    Without a doubt, this requires a little bit of a budget, but I’d say it’s 100 percent worth it because it creates value. The last time our company did something like this for a client, we hired a designer who charged us $30 USD for one image. That’s not too bad, since custom-made images make it way easier to pitch your posts to other blogs to get more links!

    Hint: When you’re looking for custom graphics that won’t make your wallet cry, you can always find freelancers on sites like Upwork or on freelancing Facebook groups.

    4. Delivering email outreach by targeting the “low-hanging fruit”

    We’ve done a lot of email outreach campaigns here at Digital Olympus, and so, I’ve noticed that we have a fast turnaround rate when our outreach targets are in the “right state of mind,” meaning they’re interested in cooperating with us.

     There are many reasons why they might show interest. For example, perhaps they’ve recently published a piece and are now invested in promoting it. To spot content marketers and authors like these, you can use Pitchbox. Pitchbox lets you create a list of posts that were published within the last 24 hours based on the keywords of your choice.

    The biggest bonus of Pitchbox is that it not only pulls together a list of content pages but it also provides contact details. In addition to this, Pitchbox automates the whole outreach process.

    Another tool that can pull together a list of posts published within the last 24 hours is BuzzSumo. Here’s a great piece by Sujan Patel that shows how to deliver outreach the right way.

    There can be much speculation about which email outreach techniques work and which don’t, but the truth remains: It’s a very hard, time-consuming job that requires lots of skill and practice. In one of my recent posts, I write about proven email outreach techniques and how to master them.

    5. Adding stats that don’t involve a huge time investment

    You’ve heard that a picture is worth a thousand words. How about this: A number knocks out 10 thousand words. By adding statistics to your piece, you can spare readers the whole process of having to refer to another page.

    But fresh, relevant stats don’t grow on trees. You need to know where you can find them.

    The easiest and most cost-efficient way of adding numbers to your piece is by running Twitter polls. They can collect up to 1k results for only $100 USD in properly targeted paid promotion. The biggest plus of running polls on Twitter is that you can create a specific list of people (aka a tailored audience) who will see your ad. For a detailed explanation of how to work with tailored audiences, I recommend checking this post.

    Besides running Twitter polls, you can use survey tools that will help you collect answers for a fee:

    • Survata will show your survey across their online publisher network, with the average cost per answer starting from $1 USD;
    • SurveyMonkey’s market research module starts from $1.25 for 200 complete responses, and it allows you to set up a more laser-targeted group by selecting a particular industry.

    Another quick hack that I use from time to time is comparing already existing data sets to reveal new insights. Statista is a great site for getting data on any topic. For instance, on one graph you can show the revenue growth on the major SMM platforms as well as the growth of their audience. Plus, don’t forget that while the numbers are good, the story is key. Statistics tend to be dry without a proper story that they are wrapped in. For inspiration, you can use this great post that shares many stories that were built on numbers.

    It doesn’t always have to be serious. Numbers draw more attention than written copy, so you can create a fun poll, for example, whether your followers are more into dogs or cats.

    Creating captivating content is hard work and often costs a hella lot of money, but there are ways to spare a few bucks here and there. By utilizing the strategies I’ve shared, you can make sure your content gets the audience it needs without wasted time, huge costs, or stress. The amount of backend work you put into research and advertising is what makes your audience not only scroll through your content but actually read it. This is what will differentiate your piece from millions of similar ones.

    Create a strategy and go for it! Whether it’s polling, graphics, emails, quotes, or backlinks, make a game plan that will promote your content the right way. Then your site will rock.

    Do you have any other tips or suggestions? Tell me below in the comments!



Spying On Google: 5 Ways to Use Log File Analysis To Reveal Invaluable SEO Insights



Log File Analysis should be a part of every SEO pro’s tool belt, but most SEOs have never conducted one, which means most SEOs are missing out on unique and invaluable insights that regular crawling tools just can’t produce.

Let’s demystify Log File Analysis so it’s not so intimidating. If you’re interested in the wonderful world of log files and what they can bring to your site audits, this guide is definitely for you. 

What are Log Files?

Log Files are files containing detailed logs on who and what is making requests to your website server. Every time a bot makes a request to your site, data (such as the time, date, IP address, user agent, etc.) is stored in this log. This valuable data allows any SEO to find out what Googlebot and other crawlers are doing on your site. Unlike regular crawls, such as with the Screaming Frog SEO Spider, this is real-world data — not an estimation of how your site is being crawled. It is an exact overview of how your site is being crawled.

Having this accurate data can help you identify areas of crawl budget waste, easily find access errors, understand how your SEO efforts are affecting crawling and much, much more. The best part is that, in most cases, you can do this with simple spreadsheet software. 

In this guide, we will be focusing on Excel to perform Log File Analysis, but I’ll also discuss other tools such as Screaming Frog’s less well-known Log File Analyser, which can make the job a bit easier and faster by helping you manage larger data sets.

Note: owning any software other than Excel is not a requirement to follow this guide or get your hands dirty with Log Files.

How to Open Log Files

Rename .log to .csv

When you get a log file with a .log extension, it really is as easy as renaming the file extension to .csv and opening the file in spreadsheet software. Remember to set your operating system to show file extensions if you want to edit these.

How to open split log files

Log files can come in either one big log or multiple files, depending on the server configuration of your site. Some servers will use server load balancing to distribute traffic across a pool or farm of servers, causing log files to be split up. The good news is that it’s really easy to combine, and you can use one of these three methods to combine them and then open them as normal:

  1. Use the command line in Windows by Shift + right-clicking in the folder containing your log files and selecting “Run Powershell from here”

Then run the following command:

Get-Content *.log | Set-Content mylogfiles.csv

(If you’re in the classic Command Prompt rather than PowerShell, copy *.log mylogfiles.csv achieves the same thing; note that in PowerShell, copy is an alias for Copy-Item and won’t concatenate files.)

You can now open mylogfiles.csv and it will contain all your log data.

Or if you are a Mac user, first use the cd command to go to the directory of your log files:

cd Documents/MyLogFiles/

Then, use the cat or concatenate command to join up your files:

cat *.log > mylogfiles.csv

  2. Use the free tool Log File Merge to combine all the log files, then edit the file extension to .csv and open as normal.

  3. Open the log files with the Screaming Frog Log File Analyser, which is as simple as dragging and dropping the log files:

Splitting Strings

(Please note: This step isn’t required if you are using Screaming Frog’s Log File Analyser)

Once you have your log file open, you’re going to need to split the cumbersome text in each cell into columns for easier sorting later.

Excel’s Text to Columns function comes in handy here, and it is as easy as selecting all the filled cells (Ctrl / Cmd + A), going to Data > Text to Columns, selecting the “Delimited” option, and setting the delimiter to a Space character.

Once you’ve separated this out, you may also want to sort by time and date — you can do so in the Time and Date stamp column, commonly separating the data with the “:” colon delimiter.

Your file should look similar to the one below:

As mentioned before, don’t worry if your log file doesn’t look exactly the same — different log files have different formats. As long as you have the basic data there (time and date, URL, user-agent, etc.) you’re good to go!

Understanding Log Files

Now that your log files are ready for analysis, we can dive in and start to understand our data. There are many formats that log files can take with multiple different data points, but they generally include the following:

  1. Server IP
  2. Date and time
  3. Server request method (e.g. GET / POST)
  4. Requested URL
  5. HTTP status code
  6. User-agent

More details on the common formats can be found below if you’re interested in the nitty gritty details:

  • W3C
  • Apache and NGINX
  • Amazon Elastic Load Balancing
  • HA Proxy
  • JSON
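As a rough illustration of what one of these formats looks like in practice, here’s a minimal Python sketch that splits a space-delimited W3C-style line into named fields. The field order below is an assumption for the example; a real W3C log declares its actual order in the “#Fields:” header line at the top of the file.

```python
# Hypothetical field order for this example; check your own log's
# "#Fields:" header line for the real one.
FIELDS = ["date", "time", "c-ip", "cs-method", "cs-uri-stem",
          "sc-status", "cs(User-Agent)"]

def parse_line(line):
    """Split one space-delimited log line into a dict keyed by field name."""
    return dict(zip(FIELDS, line.strip().split(" ")))

entry = parse_line("2019-03-01 10:15:00 66.249.66.1 GET /blog/ 200 Googlebot")
```

Once each line is a dict, all of the filtering and counting in the sections below becomes straightforward scripting.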

How to quickly reveal crawl budget waste

As a quick recap, Crawl Budget is the number of pages a search engine crawls on every visit to your site. Numerous factors affect crawl budget, including link equity or domain authority, site speed, and more. With Log File Analysis, we will be able to see what sort of crawl budget your website has and where there are problems causing crawl budget to be wasted.

Ideally, we want to give crawlers the most efficient crawling experience possible. Crawling shouldn’t be wasted on low-value pages and URLs, and priority pages (product pages for example) shouldn’t have slower indexation and crawl rates because a website has so many dead weight pages. The name of the game is crawl budget conservation, and with good crawl budget conservation comes better organic search performance.

See crawled URLs by user agent

Seeing how frequently URLs of the site are being crawled can quickly reveal where search engines are putting their time into crawling.

If you’re interested in seeing the behavior of a single user agent, this is as easy as filtering the relevant column in Excel. In this case, with a W3C format log file, I’m filtering the cs(User-Agent) column by Googlebot:

And then filtering the URI column to show the number of times Googlebot crawled the home page of this example site:

This is a fast way of seeing if there are any problem areas by URI stem for a singular user-agent. You can take this a step further by looking at the filtering options for the URI stem column, which in this case is cs-uri-stem:

From this basic menu, we can see what URLs, including resource files, are being crawled to quickly identify any problem URLs (for example, parameterized URLs that shouldn’t be crawled).

You can also do broader analyses with Pivot tables. To get the number of times a particular user agent has crawled a specific URL, select the whole table (Ctrl/cmd + A), go to Insert > Pivot Table and then use the following options:

All we’re doing is filtering by User Agent, with the URL stems as rows, and then counting the number of times each User-agent occurs.

With my example log file, I got the following:

Then, to filter by specific User-Agent, I clicked the drop-down icon on the cell containing “(All),” and selected Googlebot:

Understanding what different bots are crawling, how mobile bots are crawling differently to desktop, and where the most crawling is occurring can help you see immediately where there is crawl budget waste and what areas of the site need improvement.
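If you’d rather script this than build a pivot table, the same count of requests per URL per user agent can be sketched in a few lines of Python. The URLs and user agents below are toy data, not from a real log:

```python
from collections import Counter

# Toy (URI stem, user agent) pairs standing in for parsed log lines.
requests = [
    ("/", "Googlebot"), ("/", "bingbot"),
    ("/blog/", "Googlebot"), ("/blog/", "Googlebot"),
    ("/products/", "bingbot"),
]

# Count requests per (URL, user agent) pair, like the Excel pivot table.
crawl_counts = Counter(requests)
```

Sorting `crawl_counts.most_common()` then gives you the most-crawled URL/bot combinations at the top, just like sorting the pivot table.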

Find low-value add URLs

Crawl budget should not be wasted on low value-add URLs, which are normally caused by session IDs, infinite crawl spaces, and faceted navigation.

To do this, go back to your log file and filter the URL column (containing the URL stem) for URLs that contain a “?” question mark symbol. To do this in Excel, remember to use “~?” or tilde question mark, as shown below:

A single “?” or question mark, as stated in the auto filter window, represents any single character, so adding the tilde acts as an escape character and makes sure you filter for the literal question mark symbol itself.

Isn’t that easy?
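The same filter is a one-liner in Python, where the question mark needs no tilde escaping. Toy URL stems for illustration:

```python
stems = [
    "/products/", "/products/?sessionid=123",
    "/search?q=shoes&color=red", "/blog/",
]

# A plain substring test does the job that "~?" does in Excel's filter.
parameterized = [stem for stem in stems if "?" in stem]
```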

Find duplicate URLs

Duplicate URLs can be a crawl budget waste and a big SEO issue, but finding them can be a pain. URLs can sometimes have slight variants (such as a trailing slash vs a non-trailing slash version of a URL).

Ultimately, the best way to find duplicate URLs is also the least fun way to do so — you have to sort by site URL stem alphabetically and manually eyeball it.

One way you can find trailing and non-trailing slash versions of the same URL is to use the SUBSTITUTE function in another column and use it to remove all forward slashes:

=SUBSTITUTE(C2, "/", "")

In my case, the target cell is C2 as the stem data is on the third column.

Then, use conditional formatting to identify duplicate values and highlight them.

However, eyeballing is, unfortunately, the best method for now.
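If you’d prefer to script part of the check, here’s a small Python sketch that groups URL stems by their trailing-slash-stripped form. Unlike the remove-every-slash SUBSTITUTE trick above, it only normalizes the trailing slash, so paths that differ anywhere else stay distinct:

```python
from collections import defaultdict

# Toy URL stems; in practice these come from your parsed log file.
stems = ["/blog", "/blog/", "/about/", "/contact"]

# Group URLs by their trailing-slash-stripped form.
groups = defaultdict(list)
for stem in stems:
    groups[stem.rstrip("/")].append(stem)

# Any group with more than one member is a trailing-slash duplicate pair.
duplicates = {key: urls for key, urls in groups.items() if len(urls) > 1}
```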

See the crawl frequency of subdirectories

Finding out which subdirectories are getting crawled the most is another quick way to reveal crawl budget waste. Keep in mind, though: just because a client’s blog has never earned a single backlink and only gets three views a year from the business owner’s grandma doesn’t mean you should consider it crawl budget waste — internal linking structure should be consistently good throughout the site, and there might be a strong reason for that content from the client’s perspective.

To find out crawl frequency by subdirectory level, you will mostly need to eyeball it, but the following formula can help:

=IF(RIGHT(C2,1)="/",SUM(LEN(C2)-LEN(SUBSTITUTE(C2,"/","")))/LEN("/")+SUM(LEN(C2)-LEN(SUBSTITUTE(C2,"=","")))/LEN("=")-2, SUM(LEN(C2)-LEN(SUBSTITUTE(C2,"/","")))/LEN("/")+SUM(LEN(C2)-LEN(SUBSTITUTE(C2,"=","")))/LEN("=")-1) 

The above formula looks like a bit of a doozy, but all it does is check if there is a trailing slash and, depending on the answer, count the number of slashes and subtract either 2 or 1 from that number. This formula could be shortened if you remove all trailing slashes from your URL list using the RIGHT formula — but who has the time. What you’re left with is the subdirectory count (starting from 0 for the first subdirectory).

Replace C2 with the first URL stem / URL cell and then copy the formula down your entire list to get it working.

Make sure you replace all of the C2s with the appropriate starting cell, and then sort the new subdirectory-count column from smallest to largest to get a good list of folders in a logical order, or easily filter by subdirectory level. For example, as shown in the below screenshots:

The above image is subdirectories sorted by level.

The above image is subdirectories sorted by depth.

If you’re not dealing with a lot of URLs, you could simply sort the URLs by alphabetical order but then you won’t get the subdirectory count filtering which can be a lot faster for larger sites.
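For the script-inclined, the slash-counting logic of that Excel formula can be expressed as a short Python function. This sketch ignores query strings and counts the first subdirectory as 0, matching the formula above:

```python
def subdirectory_depth(stem):
    """Depth of a URL stem, with the first subdirectory counted as 0.

    Mirrors the slash-counting Excel formula: strip any query string,
    drop the trailing slash, and count the remaining separators.
    """
    path = stem.split("?")[0].rstrip("/")
    return max(path.count("/") - 1, 0)
```

Applied to each URL stem in your parsed log, the result sorts and filters exactly like the formula’s subdirectory-count column.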

See crawl frequency by content type

Finding out what content is getting crawled, or if there are any content types that are hogging crawl budget, is a great check to spot crawl budget waste. Frequent crawling on unnecessary or low priority CSS and JS files, or how crawling is occurring on images if you are trying to optimize for image search, can easily be spotted with this tactic.

In Excel, seeing crawl frequency by content type is as easy as filtering by URL or URI stem using the Ends With filtering option.

Quick Tip: You can also use the “Does Not End With” filter and use a .html extension to see how non-HTML page files are being crawled — always worth checking in case of crawl budget waste on unnecessary js or css files, or even images and image variations (looking at you WordPress). Also, remember if you have a site with trailing and non-trailing slash URLs to take that into account with the “or” operator with filtering.
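The same content-type breakdown can be sketched in Python with a simple extension check. The extension list here is illustrative; extend it to whatever file types matter on your site:

```python
from collections import Counter

# Toy URL stems standing in for a parsed log's URI column.
stems = ["/", "/blog/", "/style.css", "/app.js", "/logo.png", "/page.html"]

def content_type(stem):
    """Rough content type based on the URL extension (illustrative list)."""
    for ext in (".css", ".js", ".png", ".jpg", ".html"):
        if stem.endswith(ext):
            return ext
    return "page"  # no recognized extension: assume an HTML page

crawl_by_type = Counter(content_type(stem) for stem in stems)
```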

Spying on bots: Understand site crawl behavior

Log File Analysis allows us to understand how bots behave by giving us an idea of how they prioritize. How do different bots behave in different situations? With this knowledge, you can not only deepen your understanding of SEO and crawling, but also take a huge leap in understanding the effectiveness of your site architecture.

See most and least crawled URLs

This strategy was touched on previously with seeing crawled URLs by user-agent, but this approach is even faster.

In Excel, select a cell in your table and then click Insert > Pivot Table, make sure the selection contains the necessary columns (in this case, the URL or URI stem and the user-agent) and click OK.

Once you have your pivot table created, set the rows to the URL or URI stem and the value to a count of the user-agent.

From there, you can right-click in the user-agent column and sort the URLs from largest to smallest by crawl count:

Now you’ll have a great table to make charts from or quickly review and look for any problematic areas:

A question to ask yourself when reviewing this data is: Are the pages you or the client would want crawled actually being crawled? How often? Frequent crawling doesn’t necessarily mean better results, but it can be an indication of what Google and other user-agents prioritize most.

Crawl frequency per day, week, or month

Checking crawling activity over a given period (for instance, after a Google update, or during an emergency loss of visibility) can inform you where the problem might be. This is as simple as selecting the “date” column, making sure the column is in the “date” format type, and then using the date filtering options on the date column. If you’re looking to analyze a whole week, just select the corresponding days with the filtering options available.

Crawl frequency by directive

Understanding what directives are being followed (for instance, if you are using a disallow or even a no-index directive in robots.txt) by Google is essential to any SEO audit or campaign. If a site is using disallows with faceted navigation URLs, for example, you’ll want to make sure these are being obeyed. If they aren’t, recommend a better solution such as on-page directives like meta robots tags.

To see crawl frequency by directive, you’ll need to combine a crawl report with your log file analysis.

(Warning: We’re going to be using VLOOKUP, but it’s really not as complicated as people make it out to be)

To get the combined data, do the following:

  1. Get the crawl from your site using your favorite crawling software. I might be biased, but I’m a big fan of the Screaming Frog SEO Spider, so I’m going to use that.

    If you’re also using the spider, follow the steps verbatim, but otherwise, make your own call to get the same results.

  2. Export the Internal HTML report from the SEO Spider (Internal Tab > “Filter: HTML”) and open up the “internal_all.xlsx” file.

    From there, you can filter the “Indexability Status” column and remove all blank cells. To do this, use the “does not contain” filter and just leave it blank. You can also add the “and” operator and filter out redirected URLs by making the filter value equal “does not contain → “Redirected” as shown below:

    This will show you the canonicalized and no-index-by-meta-robots URLs.

  3. Copy this new table out (with just the Address and Indexability Status columns) and paste it in another sheet of your log file analysis export.
  4. Now for some VLOOKUP magic. First, we need to make sure the URI or URL column data is in the same format as the crawl data.

    Log Files don’t generally have the root domain or protocol in the URL, so we either need to remove the head of the URL using “Find and Replace” in our newly made sheet, or make a new column in your log file analysis sheet that appends the protocol and root domain to the URI stem. I prefer this method because then you can quickly copy and paste a URL that you are seeing problems with and take a look. However, if you have a massive log file, it is probably a lot less CPU intensive to use the “Find and Replace” method.

    To get your full URLs, use the following formula, but with the URL field changed to whatever site you are analyzing (and make sure the protocol is correct as well). You’ll also want to change D2 to the first cell of your URL column:

    ="https://www.example.com"&D2

    Drag the formula down to the end of your log file table to get a nice list of full URLs:

  5. Now, create another column and call it “Indexability Status”. In the first cell, use a VLOOKUP similar to the following: =VLOOKUP(E2,CrawlSheet!A$1:B$1128,2,FALSE). Replace E2 with the first cell of your “Full URL” column, and point the lookup table at your new crawl sheet. Remember to use the dollar signs so that the lookup table doesn’t shift as you apply the formula to further rows. Then, select the correct column (1 would be the first column of the lookup table, so number 2 is the one we are after). Use the FALSE range lookup mode for exact matching. Now you have a nice tidy list of URLs and their indexability status matched with crawl data:
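If you’re doing this outside of Excel, the whole VLOOKUP step collapses into a dictionary lookup in Python. A minimal sketch, with made-up URLs and statuses in Screaming Frog-style column roles:

```python
# Crawl export: Address -> Indexability Status (values here are made up).
crawl = {
    "https://example.com/a": "Canonicalised",
    "https://example.com/b": "noindex",
}

# Full URLs from the log file sheet (built in step 4).
log_urls = ["https://example.com/a", "https://example.com/b",
            "https://example.com/a", "https://example.com/c"]

# dict.get is the exact-match VLOOKUP: URLs absent from the crawl get None,
# the equivalent of Excel's #N/A.
statuses = [crawl.get(url) for url in log_urls]
```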

    Crawl frequency by depth and internal links

    This analysis allows us to see how a site’s architecture is performing in terms of crawl budget and crawlability. The main aim is to see if you have far more URLs than you do requests — and if you do then you have a problem. Bots shouldn’t be “giving up” on crawling your entire site and not discovering important content or wasting crawl budget on content that is not important.

    Tip: It is also worth using a crawl visualization tool alongside this analysis to see the overall architecture of the site and see where there are “off-shoots” or pages with poor internal linking.

    To get this all-important data, do the following:

    1. Crawl your site with your preferred crawling tool and export whichever report has both the click depth and number of internal links with each URL.

      In my case, I’m using the Screaming Frog SEO Spider and exporting the Internal report:

    2. Use a VLOOKUP to match your URL with the Crawl Depth column and the number of Inlinks, which will give you something like this:
    3. Depending on the type of data you want to see, you might want to filter to only URLs returning a 200 response code at this point, or make them filterable options in the pivot table we create later. If you’re checking an e-commerce site, you might want to focus solely on product URLs, or if you’re optimizing crawling of images, you can filter by file type by matching the URI column of your log file against the “Content-Type” column of your crawl export and making a filter option in a pivot table. As with all of these checks, you have plenty of options!
    4. Using a pivot table, you can now analyze crawl rate by crawl depth (filtering by the particular bot in this case) with the following options:

    To get something like the following:

    Better data than Search Console? Identifying crawl issues

    Search Console might be a go-to for every SEO, but it certainly has flaws. Historical data is harder to get, and there are limits on the number of rows you can view (at the time of writing, it is 1,000). But with Log File Analysis, the sky’s the limit. With the following checks, we’re going to discover crawl and response errors to give your site a full health check.

    Discover Crawl Errors

    An obvious and quick check to add to your arsenal, all you have to do is filter the status column of your log file (in my case “sc-status” with a W3C log file type) for 4xx and 5xx errors:

    Find inconsistent server responses

    A particular URL may have varying server responses over time, which can either be normal behavior (such as when a broken link has been fixed) or a sign of a serious server issue (such as when heavy traffic to your site causes a lot more internal server errors and affects your site’s crawlability).

    Analyzing server responses is as easy as filtering by URL and by Date:

    Alternatively, if you want to quickly see how a URL is varying in response code, you can use a pivot table with the rows set to the URL, the columns set to the response codes and counting the number of times a URL has produced that response code. To achieve this setup create a pivot table with the following settings:

    This will produce the following:

    In the above table, you can clearly see that “/inconcistent.html” (highlighted in the red box) has varying response codes.
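Scripted, the same check amounts to collecting the distinct status codes seen per URL. A Python sketch with toy data:

```python
from collections import defaultdict

# Toy (URI stem, status code) pairs pulled from a parsed log.
hits = [("/ok.html", 200), ("/inconsistent.html", 200),
        ("/inconsistent.html", 500), ("/ok.html", 200)]

# Collect the set of status codes seen per URL.
codes_per_url = defaultdict(set)
for stem, status in hits:
    codes_per_url[stem].add(status)

# URLs that have returned more than one distinct status code.
inconsistent = sorted(url for url, codes in codes_per_url.items()
                      if len(codes) > 1)
```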

    View Errors by Subdirectory

    To find which subdirectories are producing the most problems, we just need to do some simple URL filtering. Filter out the URI column (in my case “cs-uri-stem”) and use the “contains” filtering option to select a particular subdirectory and any pages within that subdirectory (with the wildcard *):

    For me, I checked out the blog subdirectory, and this produced the following:

    View Errors by User Agent

    Finding which bots are struggling can be useful for numerous reasons including seeing the differences in website performance for mobile and desktop bots, or which search engines are best able to crawl more of your site.

    You might want to see which particular URLs are causing issues with a particular bot. The easiest way to do this is with a pivot table that allows for filtering the number of times a particular response code occurs per URI. To achieve this make a pivot table with the following settings:

    From there, you can filter by your chosen bot and response code type, such as in the image below, where I’m filtering for Googlebot desktop to seek out 404 errors:

    Alternatively, you can also use a pivot table to see how many times a specific bot produces different response codes as a whole by creating a pivot table that filters by bot, counts by URI occurrence, and uses response codes as rows. To achieve this use the settings below:

    For example, in the pivot table (below), I’m looking at how many of each response code Googlebot is receiving:

    Diagnose on-page problems 

    Websites need to be designed not just for humans, but for bots. Pages shouldn’t be slow to load or a huge download, and with log file analysis, you can see both of these metrics per URL from a bot’s perspective.

    Find slow & large pages

    While you can sort your log file by the “time taken” or “loading time” column from largest to smallest to find the slowest loading pages, it’s better to look at the average load time per URL, as factors other than the web page’s actual speed can contribute to a single slow request.

    To do this, create a pivot table with the rows set to the URI stem or URL and the summed value set to the time taken to load or load time:

    Then click the drop-down arrow (in this case, where it says “Sum of time-taken”) and go to “Value Field Settings”:

    In the new window, select “Average” and you’re all set:

    Now you should have something similar to the following when you sort the URI stems by largest to smallest and average time taken:
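The same averaging can be done in a few lines of Python once you’ve parsed your log into (URL, time-taken) pairs. The values below are invented for illustration, with time-taken in milliseconds as in IIS-style logs:

```python
from collections import defaultdict

# Toy (URI stem, time-taken) pairs; time-taken in milliseconds.
hits = [("/slow/", 900), ("/slow/", 1100), ("/fast/", 120)]

timings = defaultdict(list)
for stem, ms in hits:
    timings[stem].append(ms)

# Average load time per URL, slowest first (the pivot's "Average" setting).
averages = sorted(((sum(times) / len(times), stem)
                   for stem, times in timings.items()), reverse=True)
```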

    Find large pages

    You can now add the download size column (in my case “sc-bytes”) using the settings shown below. Remember to set the size to the average or sum depending on what you would like to see. For me, I’ve done the average:

    And you should get something similar to the following:

    Bot behavior: Verifying and analyzing bots

    The best and easiest way to understand bot and crawl behavior is with log file analysis as you are again getting real-world data, and it’s a lot less hassle than other methods.

    Find un-crawled URLs

    Simply take the crawl of your website with your tool of choice, then take your log file and compare the URLs to find unique paths. You can do this with the “Remove Duplicates” feature of Excel or with conditional formatting, although the former is a lot less CPU intensive, especially for larger log files. Easy!
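In Python, this comparison is literally a set difference. A minimal sketch with made-up URL lists:

```python
# URLs discovered by the site crawler vs. URLs actually requested per the logs.
crawled_by_tool = {"/", "/blog/", "/products/", "/orphan/"}
seen_in_logs = {"/", "/blog/", "/products/"}

# Pages the crawler found that bots never requested.
never_requested = crawled_by_tool - seen_in_logs

# And the reverse: URLs hit in the logs that your crawl never discovered.
only_in_logs = seen_in_logs - crawled_by_tool
```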

    Identify spam bots

    Unnecessary server strain from spam and spoof bots is easily identified with log files and some basic command line operators. Most requests will have an IP associated with them, so using your IP column (in my case, it is titled “c-ip” in a W3C format log), remove all duplicates to find each individual requesting IP.

    From there, you should follow the process outlined in Google’s document for verifying IPs (note: For Windows users, use the nslookup command):

    Or, if you’re verifying a Bingbot, use their handy tool:
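If you’re verifying at scale, Google’s documented process (reverse DNS lookup on the IP, check the domain, then a forward DNS lookup to confirm it resolves back to the same IP) can be scripted. The sketch below separates the pure hostname check from the network-dependent lookups; treat it as an outline rather than production code:

```python
import socket

def hostname_is_google(hostname):
    """Check that a reverse-DNS hostname belongs to Google, per Google's
    published verification domains."""
    return hostname.endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip):
    """Full (network-dependent) verification: reverse lookup, check the
    domain, then forward-resolve the hostname back to the same IP."""
    hostname = socket.gethostbyaddr(ip)[0]       # reverse DNS
    if not hostname_is_google(hostname):
        return False
    return socket.gethostbyname(hostname) == ip  # forward confirmation
```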

    Conclusion: Log File Analysis — not as scary as it sounds

    With some simple tools at your disposal, you can dive deep into how Googlebot behaves. When you understand how a website handles crawling, you can diagnose a whole range of problems — but the real power of Log File Analysis lies in being able to test your theories about Googlebot and in extending the above techniques to gather your own insights and revelations.

    What theories would you test using log file analysis? What insights could you gather from log files other than the ones listed above? Let me know in the comments below.

