It’s Content and It’s Links – Are We Making SEO Too Complicated?

AndrewDennis33

Content and links — to successfully leverage search as a marketing channel you need useful content and relevant links.

Many experienced SEOs have run numerous tests and experiments to correlate backlinks with higher rankings, and Google has espoused the importance of “great content” for as long as I can remember.

In fact, a Google employee straight up told us that content and links are two of the three (the other being RankBrain) most important ranking factors in its search algorithm.

So why do we seem to overcomplicate SEO by chasing new trends and tactics, overreacting to fluctuations in rankings, and obsessing over the length of our title tags? SEO is simple — it’s content and it’s links.

Now, this is a simple concept, but it is much more nuanced and complex to execute well. However, I believe that by getting back to basics and focusing on these two pillars of SEO we can all spend more time doing the work that will be most impactful, creating a better, more connected web, and elevating SEO as a practice within the marketing realm.

To support this movement, I want to provide you with strategic, actionable takeaways that you can leverage in your own content marketing and link building campaigns. So, without further ado, let’s look at how you can be successful in search with content and links.

Building the right content

As the Wu-Tang Clan famously said, “Content rules everything around me, C.R.E.A.M,” …well, it was something like that. The point is, everything in SEO begins and ends with content. Whether it’s a blog post, infographic, video, in-depth guide, interactive tool, or something else, content truly rules everything around us online.

Content attracts and engages visitors, building positive associations with your brand and inspiring them to take desired actions. Content also helps search engines better understand what your website is about and how they should rank your pages within their search results.

So where do you start with something as wide-reaching and important as a content strategy? Well, if everything in SEO begins and ends with content, then everything in content strategy begins and ends with keyword research.

Proper keyword research is the difference between a targeted content strategy that drives organic visibility and simply creating content for the sake of creating content. But don’t just take my word for it — check out this client project where keyword research was executed after a year of publishing content that wasn’t backed by keyword analysis:

[Chart: organic sessions from content, broken down by year of publication]

(Note: Each line represents content published within a given year, not total organic sessions of the site.)

In 2018, we started creating content based on keyword opportunities. The performance of that content has quickly surpassed (in terms of organic sessions) the older pages that were created without strategic research.

Start with keyword research

The concept of keyword research is straightforward — find the key terms and phrases that your audience uses to find information related to your business online. However, the execution of keyword research can be a bit more nuanced, and simply starting is often the most difficult part.

The best place to start is with the keywords that are already bringing people to your site, which you can find within Google Search Console.

Beyond the keywords that already bring people to your website, a baseline list of seed keywords can help you expand your keyword reach.

Seed keywords are the foundational terms that are related to your business and brand.

As a running example, let’s use Quip, a brand that sells oral care products. Quip’s seed keywords would be:

  • [toothbrush]
  • [toothpaste]
  • [toothbrush set]
  • [electric toothbrush]
  • [electric toothbrush set]
  • [toothbrush subscription]

These are some of the most basic head terms related to Quip’s products and services. From here, the list could be expanded, using keyword tools such as Moz’s Keyword Explorer, to find granular long-tail keywords and other related terms.
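Keyword tools handle this expansion at scale by mining real search data, but the basic mechanic is easy to sketch. Here's a toy Python illustration that pairs seed terms with common modifiers to produce long-tail candidates; the seed and modifier lists are assumptions for the Quip example, not output from any real tool:

```python
from itertools import product

# Toy illustration of seed-keyword expansion. Real tools like Moz's
# Keyword Explorer mine actual search data; these seed and modifier
# lists are assumptions for the Quip example.
seeds = ["toothbrush", "electric toothbrush"]
modifiers = ["holder", "for braces", "timer", "subscription"]

def expand(seeds, modifiers):
    """Pair every seed with every modifier to form long-tail candidates."""
    return [f"{seed} {modifier}" for seed, modifier in product(seeds, modifiers)]

for candidate in expand(seeds, modifiers):
    print(candidate)
```

Each candidate would then be run through a keyword tool to check search volume and difficulty before it earns a place in the content plan.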

Expanded keyword research and analysis

The first step in keyword research and expanding your organic reach is to identify current rankings that can and should be improved.

Here are some examples of terms Moz’s Keyword Explorer reports Quip has top 50 rankings for:

  • [teeth whitening]
  • [sensitive teeth]
  • [whiten teeth]
  • [automatic toothbrush]
  • [tooth sensitivity]
  • [how often should you change your toothbrush]

These keywords represent “near-miss” opportunities for Quip, where it ranks on page two or three. Optimization and updates to existing pages could help Quip earn page one rankings and substantially more traffic.
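If you export your query data to CSV, surfacing these near-miss keywords takes only a few lines. Here's a minimal Python sketch, assuming the column names from a standard Search Console performance export ("Top queries", "Impressions", "Position") and made-up sample numbers:

```python
import csv
from io import StringIO

def near_miss_queries(csv_text, min_position=11.0, max_position=30.0):
    """Return queries ranking on pages two and three (positions 11-30),
    sorted by impressions so the biggest opportunities surface first."""
    hits = []
    for row in csv.DictReader(StringIO(csv_text)):
        position = float(row["Position"])
        if min_position <= position <= max_position:
            hits.append({
                "query": row["Top queries"],
                "impressions": int(row["Impressions"]),
                "position": position,
            })
    return sorted(hits, key=lambda hit: hit["impressions"], reverse=True)

# Made-up sample data in the Search Console export format.
sample = """Top queries,Clicks,Impressions,CTR,Position
electric toothbrush,120,5400,2.2%,4.1
how often should you change your toothbrush,8,2100,0.4%,14.3
toothbrush subscription,3,900,0.3%,22.7
"""

for hit in near_miss_queries(sample):
    print(f"{hit['query']} (position {hit['position']})")
```

The first query already ranks well, so it's filtered out; the two page-two queries come back with the higher-impression opportunity first.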

For example, here are the first page results for [how often should you change your toothbrush]:

[Screenshot: page-one results for [how often should you change your toothbrush]]

As expected, the results here are hyper-focused on answering the question of how often a toothbrush needs to be changed, and there is a rich snippet that answers the question directly.

Now, look at Quip’s page where we can see there is room for improvement in answering searcher intent:

[Screenshot: Quip's page]

The title of the page isn’t optimized for the main query, and a simple title change could help this page earn more visibility. Moz reports 1.7k–2.9k monthly search volume for [how often should you change your toothbrush]:

[Screenshot: Keyword Explorer volume data for [how often should you change your toothbrush]]

This is a stark contrast to the volume Moz reports for [why is a fresh brush head so important], which is “no data” (usually meaning very low volume):

[Screenshot: Keyword Explorer showing “no data” for [why is a fresh brush head so important]]

Quip’s page is already ranking on page two for [how often should you change your toothbrush], so optimizing the title could help the page crack the top ten.

Furthermore, the content on the page is not optimized either:

[Screenshot: Quip's on-page copy]

Rather than answering the question of how often to change a toothbrush concisely (like the page that has earned the rich snippet), the content is closer to ad copy. Putting a direct, clear answer to this question at the beginning of the content could help this page rank better.

And that’s just one query and one page!

Keyword research should uncover these types of opportunities, and with Moz’s Keyword Explorer you can also find ideas for new content through “Keyword Suggestions.”

Using Quip as an example again, we can plug in their seed keyword [toothbrush] and get multiple suggestions (MSV = monthly search volume):

  • [toothbrush holder] – MSV: 6.5k–9.3k
  • [how to properly brush your teeth] – MSV: 851–1.7k
  • [toothbrush cover] – MSV: 851–1.7k
  • [toothbrush for braces] – MSV: 501–850
  • [electric toothbrush holder] – MSV: 501–850
  • [toothbrush timer] – MSV: 501–850
  • [soft vs medium toothbrush] – MSV: 201–500
  • [electric toothbrush for braces] – MSV: 201–500
  • [electric toothbrush head holder] – MSV: 101–200
  • [toothbrush delivery] – MSV: 101–200

Using this method, we can extrapolate one seed keyword into ten more granular and related long-tail keywords — each of which may require a new page.

This handful of terms generates a wealth of content ideas and different ways Quip could address pain points and reach its audience.

Another source of keyword opportunities and inspiration is your competitors. For Quip, one of its strongest competitors is Colgate, a household name brand. Moz demonstrates the difference in market position with its “Competitor Overlap” tool:

[Screenshot: Moz Competitor Overlap data for Quip and Colgate]

Although many of Colgate’s keywords aren’t relevant to Quip, there are still opportunities to be gleaned here. One such example is [sensitive teeth], where Colgate ranks in the top five but Quip is on page two:

[Screenshot: ranking comparison for [sensitive teeth]]

While many of the other keywords show Quip is ranking outside of the top 50, this is an opportunity that Quip could potentially capitalize on.

To analyze this opportunity, let’s look at the actual search results first.

[Screenshot: search results for [sensitive teeth]]

It’s immediately clear that the intent here is informational — something to note when we examine Quip’s page. Also, scrolling down we can see that Colgate has two pages ranking on page one:

[Screenshot: two Colgate pages ranking on page one]

One of these pages is from a separate domain for hygienists and other dental professionals, but it still carries the Colgate brand and further demonstrates Colgate’s investment into this query, signaling this is a quality opportunity.

The next step for investigating this opportunity is to examine Colgate’s ranking page and check if it’s realistic for Quip to beat what they have. Here is Colgate’s page:

[Screenshot: Colgate's ranking page]

This page is essentially a blog post:

[Screenshot: Colgate's blog-style article]

If this page is ranking, it’s reasonable to believe that Quip could craft something that would be at least as good of a result for the query, and there is room for improvement in terms of design and formatting.

One thing that is likely helping this page rank is the clear definition of “tooth sensitivity,” along with the signs and symptoms listed in the sidebar:

[Screenshot: Colgate's sidebar defining tooth sensitivity]

Now, let’s look at Quip’s page:

[Screenshot: Quip's page]

This appears to be a blog-esque page as well.

[Screenshot: Quip's blog-style page]

This page offers solid information on sensitive teeth, which matches the query’s intent and is likely why the page ranks on page two. However, the page appears to be targeted at [tooth sensitivity]:

[Screenshot: Quip's page title targeting [tooth sensitivity]]

This is another great keyword opportunity for Quip:

[Screenshot: keyword metrics for [tooth sensitivity]]

However, this should be a secondary opportunity to [sensitive teeth], mixed into the copy on the page rather than made the focal point. Also, the page one results for [tooth sensitivity] are largely the same as those for [sensitive teeth], including Colgate’s page:

[Screenshot: page-one results for [tooth sensitivity]]

So, one optimization Quip could make to the page could be to change some of these headers to include “sensitive teeth” (also, these are all H3s, and the page has no H2s, which isn’t optimal). Quip could draw inspiration from the questions that Google lists in the “People also ask” section of the SERP:

[Screenshot: “People also ask” questions for this query]

Also, a quick takeaway I had was that Quip’s page does not lead off with a definition of sensitive teeth or tooth sensitivity. We learned from Colgate’s page that quickly defining the term (sensitive teeth) and the associated symptoms could help the page rank better.
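Structural issues like those skipped heading levels are easy to catch programmatically. Here's a quick sketch using only Python's standard library; it flags any heading that jumps more than one level past the previous heading. Auditing a live page would mean fetching its HTML first, so the markup below is illustrative:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect h1-h6 headings in document order and flag skipped levels,
    e.g. an h3 appearing with no h2 before it."""

    def __init__(self):
        super().__init__()
        self.levels = []
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # Only h1-h6 tags: two characters, "h" plus a digit.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.levels and level > self.levels[-1] + 1:
                self.issues.append(f"h{level} follows h{self.levels[-1]} (skipped a level)")
            elif not self.levels and level > 1:
                self.issues.append(f"h{level} appears before any h1")
            self.levels.append(level)

# Illustrative markup mirroring the problem above: h3s with no h2.
audit = HeadingAudit()
audit.feed("<h1>Sensitive teeth</h1><h3>Causes</h3><h3>Treatments</h3>")
print(audit.issues)
```

A real audit would pull each URL's HTML with an HTTP client and run the parser over every page in the crawl.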

These are just a few of the options available to Quip to optimize its page, and as mentioned before, an investment in a sleek, easy-to-digest design could separate its page from the pack.

If Quip were able to move its page onto the first page of search results for [sensitive teeth], the increase in organic traffic could be significant. And [sensitive teeth] is just the tip of the proverbial iceberg — there is a wealth of opportunity in associated keywords that Quip could also rank well for:

[Screenshot: associated keyword opportunities]

Executing well on these content opportunities and repeating the process over and over for relevant keywords is how you scale keyword-focused content that will perform well in search and bring more organic visitors.

Google won’t rank your page highly for simply existing. If you want to rank in Google search, start by creating a page that provides the best result for searchers and deserves to rank.

At Page One Power, we’ve leveraged this strategy and seen great results for clients. Here is an example of a client that is primarily focused on content creation and their corresponding growth in organic sessions:

[Chart: organic sessions for 15 pages published in January]

These 15 pages were all published in January, and you can see that roughly one month after publishing, they started taking off in terms of organic traffic. This is because these pages are backed by keyword research and optimized so well that, even with few external backlinks, they can rank on or near page one for multiple queries.

However, this doesn’t mean you should ignore backlinks and link acquisition. While the above pages rank well without many links, the domain they’re on has a substantial backlink profile cultivated through strategic link building. Securing relevant, worthwhile links is still a major part of a successful SEO campaign.

Earning real links and credibility

The other half of this complicated “it’s content and it’s links” equation is… links, and while it seems straightforward, successful execution is rather difficult — particularly when it comes to link acquisition.

While there are tools and processes that can increase organization and efficiency, at the end of the day link building takes a lot of time and a lot of work — you must manually email real website owners to earn real links. As Matt Cutts famously said (we miss you, Matt!), “Link building is sweat, plus creativity.”

However, you can greatly improve your chances of success with link acquisition if you identify which pages on your site (existing or yet to be created) are link-worthy and promote them for links.

Spoiler alert: these are not your “money pages.”

Converting pages certainly have a function on your website, but they typically have limited opportunities when it comes to link acquisition. Instead, you can support these pages — and other content on your site — through internal linking from more linkable pages.

So how do you identify linkable assets? Well, there are some general characteristics that directly correlate with link-worthiness:

  • Usefulness — concept explanation, step-by-step guide, collection of resources and advice, etc.
  • Uniqueness — a new or fresh perspective on an established topic, original research or data, prevailing coverage of a newsworthy event, etc.
  • Entertaining — novel game or quiz, humorous take on a typically serious subject, interactive tool, etc.

Along with these characteristics, you also need to consider the size of your potential linking audience. The further you move down your marketing funnel, the smaller the linking audience size; converting pages are traditionally difficult to earn links to because they serve a small audience of people looking to buy.

Instead, focus on assets that exist at the top of your marketing funnel and serve large audiences looking for information. The keywords associated with these pages are typically head terms that may prove difficult to rank for, but if your content is strong you can still earn links through targeted, manual outreach to relevant sites.

Ironically, your most linkable pages aren’t always the pages that will rank well for you in search, since larger audiences also mean more competition. However, using linkable assets to secure worthwhile links will help grow the authority and credibility of your brand and domain, supporting rankings for your keyword-focused and converting pages.

Going back to our Quip example, we see a page on their site that has the potential to be a linkable asset:

[Screenshot: Quip's page on how to brush your teeth]

Currently, this page is geared more toward conversions, which hurts its linkability. However, Quip could easily move conversion-focused elements to another page and link to it internally from this page, maintaining a pathway to conversion while improving link-worthiness.

To truly make this page a linkable asset, Quip would need to add depth on the topic of how to brush your teeth and hone in on a more specific audience. As the page currently stands, it is targeted at everybody who brushes. To make the page more linkable, Quip could focus on a specific age group (toddlers, young children, the elderly, etc.), or perhaps a profession or group that works odd hours or travels frequently and doesn’t have the convenience of brushing at home. An increased focus on audience will help with linkability, making this a page that shares useful information in a way that is unique and entertaining.

It also happens that [how to properly brush your teeth] was one of the opportunities we identified earlier in our (light) keyword research, so this could be a great opportunity to earn keyword rankings and links!

Putting it all together and simplifying our message

Now before we put it all together and solve SEO once and for all, you might be thinking, “What about technical and on-page SEO?!?”

And to that, I say, well those are just makeu…just kidding!

Technical and on-page elements play a major role in successful SEO and getting these elements wrong can derail the success of any content you create and undermine the equity of the links you secure.

Let’s be clear: if Google can’t crawl your site, you’re not showing up in its search results.

However, I categorize these optimizations under the umbrella of “content” within our content and links formula. If you’re not considering how search engines consume your content, along with human readers, then your content likely won’t perform well in the results of said search engines.

Rather than dive into the deep and complex world of technical and on-page SEO in this post, I recommend reading some of the great resources here on Moz to ensure your content is set up for success from a technical standpoint.

But to review the strategy I’ve laid out here, to be successful in search you need to:

  1. Research your keywords and niche – Having the right content for your audience is critical to earning search visibility and business. Before you start creating content or updating existing pages, make sure you take the time to research your keywords and niche to better understand your current rankings and position in the search marketplace.
  2. Analyze and expand keyword opportunities – Beyond understanding your current rankings, you also need to identify and prioritize available keyword opportunities. Using tools like Moz you can uncover hidden opportunities with long-tail and related key terms, ensuring your content strategy is targeting your best opportunities.
  3. Craft strategic content that serves your search goals – Using keyword analysis to inform content creation, you can build content that addresses underserved queries and helpful guides that attract links. An essential aspect of a successful content plan is balancing keyword-focused content with broader, more linkable content and ensuring you’re addressing both SEO goals.
  4. Promote your pages for relevant links – Countless new pages go live every day, and without proper promotion, even the best pages will be buried in the sea of content online. Strategic promotion of your pages will net you powerful backlinks and extra visibility with your audience.

Again, these concepts seem simple but are quite difficult to execute well. However, by drilling down to the two main factors for search visibility — content and links — you can avoid being overwhelmed or focusing on the wrong priorities and instead put all your efforts into the strategies that will provide the most SEO impact.

However, along with refocusing our own efforts, as SEOs we also need to simplify our message to the uninitiated (or as they’re also known, the other 99% of the population). I know from personal experience how quickly the eyes start to glaze over when I get into the nitty-gritty of SEO, so I typically pivot to focus on the most basic concepts: content and links.

People can wrap their minds around the simple process of creating good pages that answer a specific set of questions and then promoting those pages to acquire endorsements (backlinks). I suggest we embrace this same approach, on a broader scale, as an industry.

When we talk to potential and existing clients, colleagues, executives, etc., let’s keep things simple. If we focus on the two concepts that are the easiest to explain we will get better understanding and more buy-in for the work we do (it also happens that these two factors are the biggest drivers of success).

So go out, shout it from the rooftops — CONTENT AND LINKS — and let’s continue to do the work that will drive positive results for our websites and help secure SEO’s rightful seat at the marketing table.

The Content Distribution Playbook – Whiteboard Friday

rosssimmonds

If you’re one of the many marketers who share their content on Facebook, Twitter, and LinkedIn before calling it good and moving on, this Whiteboard Friday is for you. In a super actionable follow-up to his MozCon 2019 presentation, Ross Simmonds reveals how to go beyond the mediocre when it comes to your content distribution plan, reaching new audiences in just the right place at the right time.

Video Transcription

What’s going on, Whiteboard Friday fans? My name is Ross Simmonds from Foundation Marketing, and today we’re going to be talking about how to develop a content distribution playbook that will drive meaningful and measurable results for your business. 

What is content distribution and why does it matter?

First and foremost, content distribution is the thing you need to be thinking about if you want to combat the fact that it is becoming harder than ever to stand out as a content marketer, as a storyteller, and as a content creator in today’s landscape. It’s getting more and more difficult to rank for content. It’s getting more and more difficult to get organic reach through our social media channels, and that is why content distribution is so important.

You are facing a time when organic reach on social continues to drop more and more, where the ability to rank is becoming even more difficult because you’re competing against more ad space. You’re competing against more featured snippets. You’re competing against more companies. Because content marketers have screamed at the top of their lungs that content is king and the world has listened, it is becoming more and more difficult to stand out amongst the noise.

Most marketers have embraced this idea because for years we screamed, “Content is king, create more content,” and that is what the world has done. Most marketers start by just creating content, hoping that traffic will come, hoping that reach will come, and hoping that as a result of creating content, profits will follow. In reality, the profits never come, because they miss a significant piece of the puzzle: content distribution.

In today’s video, we’re going to be talking about how you can distribute your content more effectively across a few different channels, a few different strategies, and how you can take your content to the next level. 

There are two things that you can spend when it comes to content distribution: 

  1. You can spend time, 
  2. or you can spend money. 

In today’s video, we’re going to talk about exactly how you can distribute your content so when you write that blog post, you write that landing page, when you create that e-book, you create that infographic, whatever resource you’ve developed, you can ensure that that content is reaching the right people on the right channel at the right time.

◷: Owned channels

So how can you do it? We all have heard of owned channels. Owned channels are things that you own as a business, as a brand, as an organization. These are things that you can probably start doing today.

Email marketing

For example, email marketing, it’s very likely that you have an email list of some sort. You can distribute your content to those people. 

In-app notifications

Let’s say you have a website that offers people a solution or a service directly inside of the site. Say it’s software as a service or something of that nature. If people are logging in on a regular basis to access your product, you can use in-app notifications to let those people know that you’ve launched a blog post. Or better yet, if you have a mobile app of any sort, you can do the same thing. You can use your app to let people know that you just launched a new piece of content.

Social channels

You have social media channels. Let’s say you have Twitter, LinkedIn, Facebook. Share that content to your heart’s desire on those channels as well. 

On-site banner

If you have a website, you can update an on-site banner, at the top or in the bottom right, that is letting people know who visit your site that you have a new piece of content. Let them know. They want to know that you’re creating new content. So why not advise them that you have done such?

Sales outreach

If you have a sales team of any sort, let’s say you’re in B2B and you have a sales team, one of the most effective ways is to empower your sales team, to communicate to your sales team that you have developed a new piece of content so they can follow up with leads, they can nurture those existing relationships and even existing customers to let them know that a new piece of content has gone live. That one-to-one connection can be huge. 

◷: Social media / other channels

So when you’ve done all of that, what else can you do? You can go into social media. You can go into other channels. Again, you can spend time distributing your content into these places where your audience is spending time as well. 

Social channels and groups

So if you have a Twitter account, you can send out tweets. If you have a Facebook page, of course you can put up status updates.

If you have a LinkedIn page, you can put up a status update as well. These three things are typically what most organizations do in that Phase 2, but that’s not where it ends. You can go deeper. You can do more. You can go into Facebook groups, whether as a page or as a human, and share your content into these communities as well. You can let them know that you’ve published a new piece of research and you would love for them to check it out.

Or you’re in these groups, watching and waiting for somebody to ask a question that your blog post or your research has answered, and then you respond to that question with the content you’ve developed. Or you do the same exact thing in a LinkedIn group. LinkedIn groups are an awesome opportunity for you to go in and start seeding your content as well.

Medium

Or you go to Medium.com. You repurpose the content that you’ve developed. You launch it on Medium.com as well. There’s an import function on Medium where you can import your content, get a canonical link directly to your site, and you can share that on Medium as well. Medium.com is a great distribution channel, because you can seed that content to publications as well.

When your content is going to these publications, they already have existing subscribers, and those subscribers get notified that there’s a new piece being submitted by you. When they see it, that’s a new audience that you wouldn’t have reached before using any of those owned channels, because these are people who you wouldn’t have had access to before. So you want to take advantage of that as well.

Keep in mind you don’t always have to upload even the full article. You can upload a snippet and then have a CTA at the bottom, a call to action driving people to the article on your website. 

LinkedIn video

You can use LinkedIn video to do the same thing. Very similar concept. Imagine you have a LinkedIn video. You look into the camera and you say to your connections, “Hey, everyone, we just launched a new research piece that is breaking down X, Y, and Z, ABC. I would love for you to check it out. Check the link below.”

If you created that video and you shared it on your LinkedIn, your connections are going to see this video, and it’s going to break their pattern of what they typically see on LinkedIn. So when they see it, they’re going to engage, they’re going to watch that video, they’re going to click the link, and you’re going to get more reach for the content that you developed in the past. 

Slack communities

Slack communities are another great place to distribute your content. Slack isn’t just a great channel to build internal culture and communicate as an internal team.

There are actual communities, people who are passionate about photography, people who are passionate about e-commerce, people who are passionate about SEO. There are Slack communities today where these people are gathering to talk about their passions and their interests, and you can do the same thing that you would do in Facebook groups or LinkedIn groups in these various Slack communities. 

Instagram / Facebook stories

Instagram stories and Facebook stories, awesome, great channel for you to also distribute your content. You can add a link to these stories that you’re uploading, and you can simply say, “Swipe up if you want to get access to our latest research.” Or you can design a graphic that will say, “Swipe up to get our latest post.” People who are following you on these channels will swipe up. They’ll land on your article, they’ll land on your research, and they’ll consume that content as well. 

LinkedIn Pulse

LinkedIn Pulse, you have the opportunity now to upload an article directly to LinkedIn, press Publish, and again let it soar. You can use the same strategies that I talked about around Medium.com on LinkedIn, and you can drive results. 

Quora

Quora, it’s like a question-and-answer site, like Yahoo Answers back in the day, except with a way better design. You can go into Quora, and you can share just a native link and tag it with relevant content, relevant topics, and things of that nature. Or you can find a few questions that are related to the topic that you’ve covered in your post, in your research, whatever asset you developed, and you can add value to that person who asked that question, and within that value you make a reference to the link and the article that you developed in the past as well.

SlideShare

SlideShare, one of the OGs of B2B marketing. You can go to SlideShare and upload a presentation version of the content that you’ve already developed. Let’s say you’ve written a long blog post. Why not take the assets within that blog post, turn them into a PDF or a SlideShare presentation, upload them there, and then distribute them through that network as well? Once you have those SlideShare presentations put together, what’s great is that you can take those graphics and share them on Twitter, on Facebook, and on LinkedIn, and you can put them into Medium.com and distribute them further there as well.

Forums

You can go into forums. Let’s think about it. If your audience is spending time in a forum communicating about something, why not go into these communities and into these forums and connect with them on a one-to-one basis as well? There’s a huge opportunity in forums and communities that exist online, where you can build trust and you can seed your content into these communities where your audience is spending time.

A lot of people think forums are dead. They could never be more alive. If you search for your audience or your industry plus “forums,” I promise you’ll probably come across something that will surprise you as an opportunity to seed your content.

Reddit communities

Reddit communities, a lot of marketers get the heebie-jeebies when I talk about Reddit. They’re all like, “Marketers on Reddit? That doesn’t work. Reddit hates marketing.” I get it.

I understand what you’re thinking. But what they actually hate is the fact that marketers don’t get Reddit. Marketers don’t get the fact that Redditors just want value. If you can deliver value to people using Reddit, whether it’s through a post or in the comments, they will meet you with happiness and joy. They will be grateful for the fact that you’ve added value to their communities, to their subreddits, and they will reward you with upvotes, with traffic and clicks, and maybe even a few leads or a customer or two in the process.

Do not write off Reddit as a site you can’t embrace. Whether you’re B2B or B2C, Redditors can like your content. Redditors will like your content if you go in with value first.

Imgur

Sites like Imgur, another great distribution channel. Take some of those slides that you developed in the past, upload them to Imgur, and let them sing there as well.

There are way more distribution channels and distribution techniques that you can use beyond what I’ve described here. But these are just a few examples that show you that the power of distribution doesn’t exist in just a couple of posts. It exists in actually spending the time to distribute your stories and your content across a wide variety of different channels.

$: Paid marketing

That’s spending time. You can also spend money through paid marketing. Paid marketing is also an opportunity for any brand to distribute their stories. 

Remarketing

First and foremost, you can use remarketing. Let’s talk about that email list that you’ve already developed. If you take that email list and you run remarketing ads to those people on Facebook, on Twitter, on LinkedIn, you can reach those people and get them engaged with new content that you’ve developed.

Let’s say people are already visiting your website and your content. Why not run remarketing ads to those people, who have already demonstrated some type of interest, to get them back on your site, back engaged with your content, and tell your story to them as well? Another great opportunity: if you’ve leveraged video in any way, you can run remarketing ads on Facebook to people who have watched 10, 20, or 30 seconds, whatever it may be, of your content as well.

Quora ads

Then one of the opportunities that is definitely underrated is the fact that Quora now offers advertising as well. You can run ads on Quora to people who are asking or looking at questions related to your industry, related to the content that you’ve developed, and get your content in front of them as well. 

Influencer marketing

Then influencers: you can do sponsored content. You can reach out to these influencers and have them talk about your stories, talk about your content, and share it as well, because you’ve developed something new and interesting.

Think differently & rise above mediocrity

When I talk about influencer marketing, I talk about Reddit, I talk about SlideShare, I talk about LinkedIn video, I talk about Slack communities, a lot of marketers will quickly say, “I don’t think this is for me. I think this is too much. I think that this is too much manual work. I think this is too many niche communities. I think this is a little bit too much for my brand.”

I get that. I understand your mindset, but this is what you need to recognize: most marketers think this way. If you think that distributing your content into the communities where your audience is spending time is just a little bit off-brand or doesn’t really suit you, that’s what most marketers already think. Most marketers already think that Twitter, Facebook, and LinkedIn are all they need to share their stories, get their content out there, and call it a day.

If you want to be like most marketers, you’re going to get what most marketers receive as a result, which is mediocre results. So I push you to think differently. I push you to push yourself to not be like most marketers, not to go down the path of mediocrity, and instead start looking for ways that you can either invest time or money into channels, into opportunities, and into communities where you can spread your content with value first and ultimately generate results for your business at the end of all of it.

So I hope that you can use this to uncover for yourself a content distribution playbook that works for your brand. Whether you’re in B2C or you’re in B2B, it doesn’t matter. You have to understand where your audience is spending time, understand how you can seed your content into these different spaces and unlock the power of content distribution. My name is Ross Simmonds.

I really hope you enjoyed this video. If you have any questions, don’t hesitate to reach out on Twitter, @TheCoolestCool, or hit me up any other way. I’m on every other channel. Of course I am. I love social. I love digital. I’m everywhere that you could find me, so feel free to reach out.

I hope you enjoyed this video and you can use it to give your content more reach and ultimately drive meaningful and measurable results for your business. Thank you so much.

Video transcription by Speechpad.com


If Ross’s Whiteboard Friday left you feeling energized and inspired to try new things with your content marketing, you’ll love his full MozCon 2019 talk — Keywords Aren’t Enough: How to Uncover Content Ideas Worth Chasing — available in our recently released video bundle. Learn how to use many of these same distribution channels as idea factories for your content, plus access 26 additional future-focused SEO topics from our top-notch speakers:

Grab the sessions now!

And don’t be shy — share the learnings with your whole team, preferably with snacks. It’s what video was made for!




The Featured Snippet Cheat Sheet and Q&A


BritneyMuller

Earlier this week, I hosted a webinar all about featured snippets covering essential background info, brand-new research we’ve done, the results of all the tests I’ve performed, and key takeaways. Things didn’t quite go as planned, though. We had technical difficulties that interfered with our ability to broadcast live, and lots of folks were left with questions after the recording that we weren’t able to answer in a follow-up Q&A.

The next best thing to a live webinar Q&A? A digital one that you can bookmark and come back to over and over again! We asked our incredibly patient, phenomenally smart attendees to submit their questions via email and promised to answer them in an upcoming blog post. We’ve pulled out the top recurring questions and themes from those submissions and addressed them below. If you had a question and missed the submission window, don’t worry! Ask it down in the comments and we’ll keep the conversation going.

If you didn’t get a chance to sign up for the original webinar, you can register for it on-demand here:

Watch the webinar

And if you’re here to grab the free featured snippets cheat sheet we put together, look no further — download the PDF directly here. Print it off, tape it to your office wall, and keep featured snippets top-of-mind as you create and optimize your site content. 

Now, let’s get to those juicy questions!


1. Can I win a featured snippet with a brand-new website?

If you rank on page one for a keyword that triggers a featured snippet (in positions 1–10), you’re a contender for stealing that featured snippet. It might be tougher with a new website, but you’re in a position to be competitive if you’re on page one — regardless of how established your site is.

We’ve got some great Whiteboard Fridays that cover how to set a new site up for success:

2. Does Google provide a tag that identifies traffic sources from featured snippets? Is there a GTM tag for this?

Unfortunately, Google does not provide a tag to help identify traffic from featured snippets. I’m not aware of a GTM tag that helps with this, either, but would love to hear any community suggestions or ideas in the comments!

It’s worth noting that it’s currently impossible to determine what percentage of your traffic comes from the featured snippet versus the duplicate organic URL below the featured snippet.

3. Do you think it’s worth targeting longer-tail question-based queries that have very low monthly searches to gain a featured snippet?

Great question! My advice is this: don’t sleep on low-search-volume keywords. They often convert really well, and in aggregate they can do wonders for a website. I suggest prioritizing long-tail keywords that you foresee providing a high potential ROI.

For example, there are millions of searches a month for the keyword “shoes.” Very competitive, but that query is pretty vague. In contrast, the keyword “size 6 red womens nike running shoes” is very specific. This searcher knows what they want and they’re dialing in their search to find it. This is a great example of a long-tail keyword phrase that could provide direct conversions.

4. What’s the best keyword strategy for determining which queries are worth creating featured snippet-optimized content for?

Dr. Pete wrote a great blog post outlining how to perform keyword research for featured snippets back in 2016. Once you’ve narrowed down your list of likely queries, you need to look at keywords that you rank on page one for, that trigger a snippet, and that you don’t yet own. Next, narrow your list down further by what you envision will have the highest ROI for your goals. Are you trying to drive conversions? Attract top-of-funnel site visitors? Make sure the queries you target align with your business goals, and go from there. Both Moz Pro and STAT can be a big help with this process.

A tactical pro tip: Use the featured snippet carousel queries as a starting point. For instance, if there’s a snippet for the query “car insurance” with a carousel of “in Florida,” “in Michigan,” and so on, you might consider writing about state-specific topics to win those carousel snippets. For this technique, the bonus is that you don’t really need to be on page one for the root term (or ranking at all) — often, carousel snippets are taken from off-SERP links.

5. Do featured snippets fluctuate according to language, i.e. if I have several versions of my site in different languages, will the snippet display for each version?

This is a great question! Unfortunately, we haven’t been able to do international/multi-language featured snippet research just yet, but hope to in the future. I would suspect the featured snippet could change depending on language and search variation. The best way to explore this is to do a search in an incognito (and un-logged-in) browser window of Google Chrome.

If you’ve performed research along these lines, let us know what you found out down in the comments!

6. Why do featured snippet opportunities fluctuate in number from day to day?

Change really is the only constant in search. In the webinar, I discussed the various tests I did that caused Moz to lose a formerly won featured snippet (and what helped it reappear once again). Changes as simple as an extra period at the end of a sentence were enough to lose us the snippet. With content across the web constantly being created and edited and deprecated and in its own state of change, it’s no wonder that it’s tough to win and keep a featured snippet — sometimes even from one day to the next.

The SERPs are incredibly volatile things, with Google making updates multiple times every day. But when it comes down to the facts, there are a few things that reliably cause volatility (is that an oxymoron?):

  • If a snippet is pulling from a lower-ranking URL (not positions 1–3); this could mean Google is testing the best answer for the query
  • Google regularly changing which scraped content is used in each snippet
  • Featured snippet carousel topics changing

The best way to change-proof yourself is to become an authority in your particular niche (E-A-T, remember?) and strive to rank higher to increase your chances of capturing and keeping a featured snippet.

7. How can I use Keyword Lists to find missed SERP feature opportunities? What’s the best way to use them to identify keyword gaps?

Keyword Lists are a wonderful area to uncover featured snippet (and other SERP feature) opportunity gaps. My favorite way to do this is to filter the Keyword List by your desired SERP feature. We’ll use featured snippets as an example. Next, sort by your website’s current rank (1–10) to determine your primary featured snippet gaps and opportunities.

The filters are another great way to tease out additional gaps:

  • Which keywords have high search volume and low competition? 
  • Which keywords have high organic CTR that you currently rank just off page one for?

8. What are best practices around reviewing the structure of content that’s won a snippet, and how do I know whether it’s worth replicating?

Content that has won a featured snippet is definitely worth reviewing (even if it doesn’t hold the featured snippet over time). Consider why Google might have provided this as a featured snippet:

  • Does it succinctly answer the query? 
  • Might it sound good as a voice answer? 
  • Is it comprehensive for someone looking for additional information? 
  • Does the page provide additional answers or information around the topic? 
  • Are there visual elements? 

It’s best to put on your detective hat and try to uncover why a piece of content might be ranking for a particular featured snippet:

  • What part of the page is Google pulling that featured snippet content from? 
  • Is it marked up in a certain way? 
  • What other elements are on the page? 
  • Is there a common theme? 
  • What additional value can you glean from the ranking featured snippet?

9. Does Google identify and prioritize informational websites for featured snippets, or are they determined by a correlation between pages with useful information and frequency of snippets? 

In other words, would being an e-commerce site harm your chances of winning featured snippets, all other factors being the same?

I’m not sure whether Google explicitly categorizes informational websites. They likely establish a trust metric of sorts for domains and then seek out information or content that most succinctly answers queries within their trust parameters, but this is just a hypothesis.

While informational sites tend to do overwhelmingly better than other types of websites, it’s absolutely possible for an e-commerce website to find creative ways of snagging featured snippets.

It’s fascinating how various e-commerce websites have found their way into current featured snippets in extremely savvy ways. Here’s a super relevant example: after our webinar experienced issues and wasn’t able to launch on time, I did a voice search for “how much do stamps cost” to determine how expensive it would be to send apology notes to all of our hopeful attendees. 

This was the voice answer:

“According to stamps.com, the cost of a one ounce first class mail stamp is $0.55 at the Post Office, or $0.47 if you buy and print stamps online using stamps.com.”

Pretty clever, right? I believe there are plenty of savvy ways like this to get your brand and offers into featured snippets.

10. When did the “People Also Ask” feature first appear? What changes to PAAs do you anticipate in the future?



People Also Ask boxes first appeared in July 2015 as a small-scale test. Their presence in the SERPs grew over 1700% between July 2015 and March 2017, so they certainly exploded in popularity just a few years ago. Funny enough, I was one of the first SEOs to come across Google’s PAA testing — you can read about that stat and more in my original article on the subject: Infinite “People Also Ask” Boxes: Research and SEO Opportunities

We recently published some great PAA research by Samuel Mangialavori on the Moz Blog, as well: 5 Things You Should Know About “People Also Ask” & How to Take Advantage

And there are a couple of great articles cataloging the evolution of PAAs over the years here:

When it comes to predicting the future of PAAs, well, we don’t have a crystal ball yet, but featured snippets continue to look more and more like PAA boxes with their new-ish accordion format. Is it possible Google will merge them into a single feature someday? It’s hard to say, but as SEOs, our best bet is to maintain flexibility and prepare to roll with the punches the search engines send our way.

11. Can you explain what you meant by “15% of image URLs are not in organic”?

Sure thing! The majority of images that show up in featured snippet boxes (or to be more accurate, the webpage those images live on) do not rank organically within the first ten pages of organic search results for the featured snippet query.

12. How should content creators consider featured snippets when crafting written content? Are there any tools that can help?

First and foremost, you’ll want to consider the searcher:

  • What is their intent? 
  • What desired information or content are they after? 
  • Are you providing the desired information in the medium in which they desire it most (video, images, copy, etc.)?

Look to the current SERPs to determine how you should be providing content to your users. Read all of the results on page one:

  • What common themes do they have? 
  • What topics do they cover? 
  • How can you cover those better?

Dr. Pete has a fantastic Whiteboard Friday that covers how to write content to win featured snippets. Check it out: How to Write Content for Answers Using the Inverted Pyramid



You might also get some good advice from this classic Whiteboard Friday by Rand Fishkin: How to Appear in Google’s Answer Boxes

13. “Write quality content for people, not search engines” seems like great advice. But should I also be using any APIs or tools to audit my content? 

The only really helpful tool that comes to mind is the Flesch-Kincaid readability test, but even that can be a bit disruptive to the creative process. The best tool for reviewing your content might be a real person. I would ensure that your content can be easily understood when read out loud to your target audience. It may help to consider whether your content, as a featured snippet, would make for an effective, helpful voice search result.
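For context, the reading-ease half of that test is a simple formula: 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words). Here is a rough, illustrative Python sketch; the vowel-run syllable counter is a crude approximation, and dedicated readability tools do this better:

```python
import re

def flesch_reading_ease(text):
    """Rough Flesch Reading Ease score; higher scores mean easier reading.
    Syllables are approximated as runs of vowels, which is only a rough guess."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = sum(
        max(1, len(re.findall(r"[aeiouy]+", word.lower()))) for word in words
    )
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)
```

Scores in the 60–70 range are generally considered plain English; short sentences built from short words score higher.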

14. What’s the best way to stay on top of trends when it comes to Google’s featured snippets?

Find publications and tools that resonate, and keep an eye on them. Some of my favorites include:

  1. MozCast to keep a pulse on the Google algorithm
  2. Monitoring tools like STAT (email alerts when you win/lose a snippet? Awesome.)
  3. Cultivating a healthy list of digital marketing heroes to follow on Twitter
  4. Industry news publications like Search Engine Journal and, of course, the Moz Blog 😉
  5. Subscribing to SEO newsletters like the Moz Top 10

One of the very best things you can do, though, is performing your own investigative featured snippet research within your space. Publishing the trends you observe helps our entire community grow and learn. 


Thank you so much to every attendee who submitted their questions. Digging into these follow-up thoughts and ideas is one of the best parts of putting on a presentation. If you’ve got any lingering questions after the webinar, I would love to hear them — leave me a note in the comments and I’ll be on point to answer you. And if you missed the webinar sign-up, you can still access it on-demand whenever you want.

We also promised you some bonus content, yeah? Here it is — I compiled all of my best tips and tricks for winning featured snippets into a downloadable cheat sheet that I hope is a helpful reference for you:

Free download: The Featured Snippets Cheat Sheet

There’s no reason you shouldn’t be able to win your own snippets when you’re armed with data, drive, and a good, solid plan! Hopefully this is a great resource for you to have on hand, either to share around with colleagues or to print out and keep at your desk:

Grab the cheat sheet

Again, thank you so much for submitting your questions, and we’ll see you in the comments for more.




SEO Analytics for Free – Combining Google Search with the Moz API


Purple-Toolz

I’m a self-funded start-up business owner. As such, I want to get as much as I can for free before convincing our finance director to spend our hard-earned bootstrapping funds. I’m also an analyst with a background in data and computer science, so a bit of a geek by any definition.

What I try to do, with my SEO analyst hat on, is hunt down great sources of free data and wrangle it into something insightful. Why? Because there’s no value in basing client advice on conjecture. It’s far better to combine quality data with good analysis and help our clients better understand what’s important for them to focus on.

In this article, I will tell you how to get started using a few free resources and illustrate how to pull together unique analytics that provide useful insights for your blog articles if you’re a writer, your agency if you’re an SEO, or your website if you’re a client or owner doing SEO yourself.

The scenario I’m going to use is that I want to analyze some SEO attributes (e.g. backlinks, Page Authority, etc.) and look at their effect on Google ranking. I want to answer questions like “Do backlinks really matter in getting to Page 1 of SERPs?” and “What kind of Page Authority score do I really need to be in the top 10 results?” To do this, I will need to combine data from a number of Google searches with data on each result that includes the SEO attributes I want to measure.

Let’s get started and work through how to combine the following tasks to achieve this, all of which can be set up for free:

  • Querying with Google Custom Search Engine
  • Using the free Moz API account
  • Harvesting data with PHP and MySQL
  • Analyzing data with SQL and R

Querying with Google Custom Search Engine

We first need to query Google and get some results stored. To stay on the right side of Google’s terms of service, we’ll not be scraping Google.com directly but will instead use Google’s Custom Search feature. Google’s Custom Search is designed mainly to let website owners provide a Google-like search widget on their website. However, there is also a REST-based Google Search API that is free and lets you query Google and retrieve results in the popular JSON format. There are quota limits, but these can be configured and extended to provide a good sample of data to work with.

When configured correctly to search the entire web, you can send queries to your Custom Search Engine, in our case using PHP, and treat them like Google responses, albeit with some caveats. The main limitations of using a Custom Search Engine are: (i) it doesn’t use some Google Web Search features, such as personalized results; and (ii) it may have a subset of results from the Google index if you include more than ten sites.

Notwithstanding these limitations, there are many search options that can be passed to the Custom Search Engine to proxy what you might expect Google.com to return. In our scenario, we passed the following when making a call:

https://www.googleapis.com/customsearch/v1?key=<google_api_id>&userIp=
<ip_address>&cx=<custom_search_engine_id>&q=iPhone+X&cr=countryUS&start=1

Where:

  • https://www.googleapis.com/customsearch/v1 – is the URL for the Google Custom Search API
  • key=<GOOGLE_API_ID> – Your Google Developer API Key
  • userIp=<IP_ADDRESS> – The IP address of the local machine making the call
  • cx=<CUSTOM_SEARCH_ENGINE_ID> – Your Google Custom Search Engine ID
  • q=iPhone+X – The Google query string (‘+’ replaces ‘ ‘)
  • cr=countryUS – Country restriction (from Google’s Country Collection Name list)
  • start=1 – The index of the first result to return, e.g. 1 for SERP page 1. Successive calls increment this by 10 (start=11, start=21, and so on) to get subsequent pages.
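Assembling those parameters programmatically looks something like the sketch below. The original harvesting code was PHP; this is an illustrative Python version, and the key, CSE ID, and IP values are placeholders rather than real credentials:

```python
from urllib.parse import urlencode

def build_cse_url(api_key, cse_id, query, user_ip, country="countryUS", start=1):
    """Build a Google Custom Search API request URL.
    urlencode() handles the escaping, turning spaces into '+' for us."""
    base = "https://www.googleapis.com/customsearch/v1"
    params = {
        "key": api_key,      # your Google Developer API key
        "userIp": user_ip,   # IP address of the machine making the call
        "cx": cse_id,        # your Custom Search Engine ID
        "q": query,          # the query string
        "cr": country,       # country restriction
        "start": start,      # 1 = SERP page 1, 11 = page 2, etc.
    }
    return base + "?" + urlencode(params)

url = build_cse_url("<google_api_id>", "<cse_id>", "iPhone X", "<ip_address>")
```

Fetching that URL and decoding the JSON response then yields the ten ranked results for the requested page.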

Google has said that the Google Custom Search Engine differs from Google.com, but in my limited testing comparing results between the two, I was encouraged by the similarities and so continued with the analysis. That said, keep in mind that the data and results below come from Google Custom Search (using ‘whole web’ queries), not Google.com.

Using the free Moz API account

Moz provide an Application Programming Interface (API). To use it you will need to register for a Mozscape API key, which is free but limited to 2,500 rows per month and one query every ten seconds. Current paid plans give you increased quotas and start at $250/month. Having a free account and API key, you can then query the Links API and analyze the following metrics:

Each Moz data field has a numeric Moz API code and a description:

  • ueid (32) – The number of external equity links to the URL
  • uid (2048) – The number of links (external, equity or non-equity) to the URL
  • umrp** (16384) – The MozRank of the URL, as a normalized 10-point score
  • umrr** (16384) – The MozRank of the URL, as a raw score
  • fmrp** (32768) – The MozRank of the URL’s subdomain, as a normalized 10-point score
  • fmrr** (32768) – The MozRank of the URL’s subdomain, as a raw score
  • us (536870912) – The HTTP status code recorded for this URL, if available
  • upa (34359738368) – A normalized 100-point score representing the likelihood of a page to rank well in search engine results
  • pda (68719476736) – A normalized 100-point score representing the likelihood of a domain to rank well in search engine results

NOTE: Since this analysis was captured, Moz has documented that these fields are deprecated. However, in testing (15-06-2019), the fields were still present.

Moz API Codes are added together before calling the Links API with something that looks like the following:

http://lsapi.seomoz.com/linkscape/url-metrics/http%3A%2F%2Fwww.apple.com%2F?Cols=103616137253&AccessID=MOZ_ACCESS_ID&
Expires=1560586149&Signature=<MOZ_SECRET_KEY>

Where:

  • http://lsapi.seomoz.com/linkscape/url-metrics/ – The URL for the Moz Links API
  • http%3A%2F%2Fwww.apple.com%2F – An encoded URL that we want to get data on
  • Cols=103616137253 – The sum of the Moz API codes from the table above
  • AccessID=MOZ_ACCESS_ID – An encoded version of the Moz Access ID (found in your API account)
  • Expires=1560586149 – A timeout for the query – set a few minutes into the future
  • Signature=<MOZ_SECRET_KEY> – A URL-encoded request signature generated using your Moz Secret Key (found in your API account)
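As a sanity check, the Cols value in the example request is just the sum of the field flags. Note that this sketch assumes the request also includes the page title (ut) and canonical URL (uu) fields; their flag values of 1 and 4 come from the legacy Mozscape docs and are not in the list above:

```python
# Legacy Mozscape URL Metrics bit flags; each flag switches on one response field.
MOZ_FLAGS = {
    "ut": 1,             # page title (assumed flag value; not listed above)
    "uu": 4,             # canonical URL (assumed flag value; not listed above)
    "ueid": 32,          # external equity links
    "uid": 2048,         # all links
    "umrp/umrr": 16384,  # MozRank, normalized and raw (one shared flag)
    "fmrp/fmrr": 32768,  # subdomain MozRank, normalized and raw (one shared flag)
    "us": 536870912,     # HTTP status code
    "upa": 34359738368,  # Page Authority
    "pda": 68719476736,  # Domain Authority
}

cols = sum(MOZ_FLAGS.values())  # the value to pass as Cols=
```

Summing these flags reproduces the Cols=103616137253 used in the example request.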

Moz will return something like the following JSON (shown here decoded as a PHP array):

Array
(
    [ut] => Apple
    [uu] => www.apple.com/
    [ueid] => 13078035
    [uid] => 14632963
    [umrp] => 9
    [umrr] => 0.8999999762
    [fmrp] => 2.602215052
    [fmrr] => 0.2602215111
    [us] => 200
    [upa] => 90
    [pda] => 100
)

For a great starting point on querying Moz with PHP, Perl, Python, Ruby and Javascript, see this repository on Github. I chose to use PHP.

Harvesting data with PHP and MySQL

Now that we have a Google Custom Search Engine and our Moz API key, we’re almost ready to capture data. Google and Moz respond to requests in the JSON format and so can be queried by many popular programming languages. In addition to my chosen language, PHP, I wrote the results of both Google and Moz to a database, choosing MySQL Community Edition for this. Other databases could also be used, e.g. Postgres, Oracle, Microsoft SQL Server, etc. Doing so enables persistence of the data and ad-hoc analysis using SQL (Structured Query Language), as well as other languages (like R, which I will go over later). After creating database tables to hold the Google search results (with fields for rank, URL, etc.) and a table to hold Moz data fields (ueid, upa, pda, etc.), we’re ready to design our data harvesting plan.

Google provide a generous quota with the Custom Search Engine (up to 100M queries per day with the same Google developer console key), but the Moz free API is limited to 2,500 rows per month. (Moz’s paid options provide between 120k and 40M rows per month, depending on the plan, and range in cost from $250–$10,000/month.) Since I’m just exploring the free option, I designed my code to harvest 125 Google queries over 2 pages of SERPs (10 results per page), allowing me to stay within the Moz 2,500-row quota. As for which searches to fire at Google, there are numerous resources to choose from. I chose Mondovo, as they provide numerous lists by category and up to 500 words per list, which is ample for the experiment.
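The arithmetic behind that 125-query budget, as a quick sketch:

```python
# Plan the harvest so the Moz free tier (2,500 URL lookups/month) isn't exceeded.
MOZ_FREE_ROWS = 2500
PAGES_PER_QUERY = 2      # SERP pages harvested per search term
RESULTS_PER_PAGE = 10    # Google CSE returns 10 results per page

rows_per_query = PAGES_PER_QUERY * RESULTS_PER_PAGE  # 20 Moz lookups per term
max_queries = MOZ_FREE_ROWS // rows_per_query        # 125 search terms fit the quota
```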

I also rolled in a few PHP helper classes alongside my own code for database I/O and HTTP.

In summary, the main PHP building blocks and sources used were:

One factor to be aware of is the 10-second interval required between Moz API calls. This is to prevent Moz being overloaded by free API users. To handle this in software, I wrote a “query throttler” that blocked access to the Moz API until the interval since the previous call had elapsed. However, whilst it worked perfectly, it meant that calling Moz 2,500 times in succession took just under 7 hours to complete.
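The query-throttler idea fits in a few lines. The original was PHP; this is an illustrative Python equivalent, with the clock and sleep functions injectable so the pacing logic can be tested without real ten-second waits:

```python
import time

class QueryThrottler:
    """Blocks between successive API calls so there is never less than
    min_interval seconds between them (Moz free tier: one call per 10s)."""

    def __init__(self, min_interval=10.0, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = min_interval
        self.clock = clock    # injectable for testing
        self.sleep = sleep    # injectable for testing
        self._last = None

    def wait(self):
        """Call immediately before each API request."""
        if self._last is not None:
            remaining = self.min_interval - (self.clock() - self._last)
            if remaining > 0:
                self.sleep(remaining)
        self._last = self.clock()
```

Calling wait() before each Moz request enforces the interval; 2,500 calls at one per 10 seconds is roughly 25,000 seconds, which is where the just-under-7-hours runtime comes from.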

Analyzing data with SQL and R

Data harvested. Now the fun begins!

It’s time to have a look at what we’ve got. This is sometimes called data wrangling. I use a free statistical programming language called R along with a development environment (editor) called R Studio. There are other languages, such as Stata, and more graphical data science tools like Tableau, but these cost money, and the finance director at Purple Toolz isn’t someone to cross!

I have been using R for a number of years because it’s open source and it has many third-party libraries, making it extremely versatile and appropriate for this kind of work.

Let’s roll up our sleeves.

I now have a couple of database tables with the results of my 125 search term queries across 2 pages of SERPs (i.e. 20 ranked URLs per search term). Two database tables hold the Google data and another table holds the Moz data results. To access these, we’ll need to do a database INNER JOIN, which we can easily accomplish by using the RMySQL package with R. This is loaded by typing “install.packages(‘RMySQL’)” into R’s console and including the line “library(RMySQL)” at the top of our R script.

We can then do the following to connect and get the data into an R data frame variable called “theResults.”

library(RMySQL)
# INNER JOIN the two tables
theQuery <- "
    SELECT A.*, B.*, C.*
    FROM
    (
        SELECT 
            cseq_search_id
        FROM cse_query
    ) A -- Custom Search Query
    INNER JOIN
    (
        SELECT 
            cser_cseq_id,
            cser_rank,
            cser_url
        FROM cse_results
    ) B -- Custom Search Results
    ON A.cseq_search_id = B.cser_cseq_id
    INNER JOIN
    (
        SELECT *
        FROM moz
    ) C -- Moz Data Fields
    ON B.cser_url = C.moz_url
    ;
"
# [1] Connect to the database
# Replace USER_NAME with your database username
# Replace PASSWORD with your database password
# Replace MY_DB with your database name
theConn <- dbConnect(dbDriver("MySQL"), user = "USER_NAME", password = "PASSWORD", dbname = "MY_DB")
# [2] Query the database and hold the results
theResults <- dbGetQuery(theConn, theQuery)
# [3] Disconnect from the database
dbDisconnect(theConn)

NOTE: I have two tables to hold the Google Custom Search Engine data. One holds data on the Google query (cse_query) and one holds results (cse_results).

We can now use R’s full range of statistical functions to begin wrangling.

Let’s start with some summaries to get a feel for the data. The process I go through is basically the same for each of the fields, so let’s illustrate using Moz’s ‘UEID’ field (the number of external equity links to a URL). Typing the following into R gives us this:

> summary(theResults$moz_ueid)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
      0       1      20   14709     182 2755274 
> quantile(theResults$moz_ueid,  probs = c(1, 5, 10, 25, 50, 75, 80, 90, 95, 99, 100)/100)
       1%        5%       10%       25%       50%       75%       80%       90%       95%       99%      100% 
      0.0       0.0       0.0       1.0      20.0     182.0     337.2    1715.2    7873.4  412283.4 2755274.0 

Looking at this, you can see that the data is heavily skewed: the mean sits far above the median, pulled up by values in the upper quartile range (values beyond 75% of the observations). We can, however, plot this as a box and whisker plot in R, where each X value is the distribution of UEIDs by rank from Google Custom Search positions 1–20.
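To make that skew check concrete, here is a minimal sketch in Python using only the standard library, on a small made-up sample shaped like the UEID figures above (not the actual Moz data):

```python
import statistics

# Hypothetical, heavily skewed link counts (shaped like the UEID summary above)
ueid = [0, 0, 1, 20, 182, 1715, 2755274]

mean = statistics.mean(ueid)
median = statistics.median(ueid)

# A mean far above the median signals right-skew: a few huge values
# in the upper tail pull the mean up while the median stays put.
print(median)                # 20
print(round(mean, 1))        # 393884.6
print(mean > 10 * median)    # True, i.e. strongly right-skewed
```

The same mean-versus-median comparison is exactly what the R `summary()` output lets you eyeball.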

Note we are using a log scale on the y-axis so that we can display the full range of values as they vary a lot!

A box and whisker plot in R of Moz’s UEID by Google rank (note: log scale)

Box and whisker plots are great as they show a lot of information in them (see the geom_boxplot function in R). The purple boxed area represents the Inter-Quartile Range (IQR) which are the values between 25% and 75% of observations. The horizontal line in each ‘box’ represents the median value (the one in the middle when ordered), whilst the lines extending from the box (called the ‘whiskers’) represent 1.5x IQR. Dots outside the whiskers are called ‘outliers’ and show where the extents of each rank’s set of observations are. Despite the log scale, we can see a noticeable pull-up from rank #10 to rank #1 in median values, indicating that the number of equity links might be a Google ranking factor. Let’s explore this further with density plots.
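The components of a box and whisker plot are easy to reproduce by hand. A small Python sketch (with made-up observations, not the Moz data) computing the IQR box and the 1.5x IQR whisker bounds described above:

```python
import statistics

# Made-up observations for one rank position (not the real UEID values)
obs = [1, 2, 3, 4, 5, 6, 7, 8, 9]

# 'inclusive' matches the classic quartile definition on a complete sample
q1, q2, q3 = statistics.quantiles(obs, n=4, method="inclusive")
iqr = q3 - q1            # the height of the 'box' (25th to 75th percentile)
lower = q1 - 1.5 * iqr   # lower whisker limit
upper = q3 + 1.5 * iqr   # upper whisker limit

print(q1, q2, q3)    # 3.0 5.0 7.0  (q2 is the median line in the box)
print(iqr)           # 4.0
print(lower, upper)  # -3.0 13.0
# Any observation outside [lower, upper] would be drawn as an outlier dot.
```
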

Density plots are a lot like distributions (histograms) but show smooth lines rather than bars for the data. Much like a histogram, a density plot’s peak shows where the data values are concentrated and can help when comparing two distributions. In the density plot below, I have split the data into two categories: (i) results that appeared on Page 1 of the SERPs (ranked 1–10) are in pink; and (ii) results that appeared on SERP Page 2 are in blue. I have also plotted the medians of both distributions to help illustrate the difference in results between Page 1 and Page 2.

The inference from these two density plots is that Page 1 SERP results had more external equity backlinks (UEIDs) than Page 2 results. You can also see the median values for these two categories below, which clearly show how the value for Page 1 (38) is far greater than that for Page 2 (11). So we now have some numbers to base our backlink SEO strategy on.

# Create a factor in R according to which SERP page a result (cser_rank) is on
> theResults$rankBin <- paste("Page", ceiling(theResults$cser_rank / 10))
> theResults$rankBin <- factor(theResults$rankBin)
# Now report the medians by SERP page by calling ‘tapply’
> tapply(theResults$moz_ueid, theResults$rankBin, median) 
Page 1 Page 2 
    38     11 

From this, we can deduce that equity backlinks (UEID) matter and if I were advising a client based on this data, I would say they should be looking to get over 38 equity-based backlinks to help them get to Page 1 of SERPs. Of course, this is a limited sample and more research, a bigger sample and other ranking factors would need to be considered, but you get the idea.

Now let’s investigate another metric that has a narrower range than UEID and look at Moz’s UPA measure, which is the likelihood that a page will rank well in search engine results.

> summary(theResults$moz_upa)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
   1.00   33.00   41.00   41.22   50.00   81.00 
> quantile(theResults$moz_upa,  probs = c(1, 5, 10, 25, 50, 75, 80, 90, 95, 99, 100)/100)
  1%   5%  10%  25%  50%  75%  80%  90%  95%  99% 100% 
  12   20   25   33   41   50   53   58   62   75   81 

UPA is a number given to a URL and ranges between 0–100. The data is better behaved than the unbounded UEID variable: its mean and median are close together, making for a more ‘normal’ distribution, as we can see below by plotting a histogram in R.

A histogram of Moz’s UPA score

We’ll do the same Page 1 : Page 2 split and density plot that we did before and look at the UPA score distributions when we divide the UPA data into two groups.

# Report the medians by SERP page by calling ‘tapply’
> tapply(theResults$moz_upa, theResults$rankBin, median) 
Page 1 Page 2 
    43     39 

In summary, two very different distributions from two Moz API variables. But both showed differences in their scores between SERP pages and provide you with tangible values (medians) to work with and ultimately advise clients on or apply to your own SEO.

Of course, this is just a small sample and shouldn’t be taken literally. But with free resources from both Google and Moz, you can now see how you can begin to develop analytical capabilities of your own to base your assumptions on rather than accepting the norm. SEO ranking factors change all the time and having your own analytical tools to conduct your own tests and experiments on will help give you credibility and perhaps even a unique insight on something hitherto unknown.

Google provide you with a healthy free quota to obtain search results from. If you need more than the 2,500 rows/month Moz provide for free, there are numerous paid plans you can purchase. MySQL is a free download, and R is also a free package for statistical analysis (and much more).

Go explore!


5 Things You Should Know About “People Also Ask” & How to Take Advantage


SamuelMangialavori

It’s undeniable that the SERPs have changed considerably in the last year or so. Elements like featured snippets, Knowledge Graphs, local packs, and People Also Ask have really taken over the SEO world — and left some of us a bit confused.

In particular, the People Also Ask (PAA) feature caught my attention in the last few months. For many of the clients I’ve worked with, PAAs have really had an impact on their SERPs.

If you are anything like me, you might be asking yourself the same questions:

  • How important are these SERP features?
  • How many clicks do they “steal” from organic results?
  • And most importantly: who are these people that also ask SO MANY questions? Somehow, I always imagine the hipster-looking man from Answer the Public being the leader of such a group of people…

The first part of the post focuses on five things I’ve learned about People Also Ask, while the second part outlines some ideas on how to take advantage of such features.

Let’s get started! Here are five things you should know about PAAs.

1. PAA can occupy different positions on the SERP

I don’t know about you all, but I wasn’t fully aware of the above until a few months ago; I just assumed that most of the time PAAs appeared in the same location, IF and only IF they were actually triggered by Google. I didn’t really pay attention to this feature until I started digging into it.

Unlike featured snippets (which always appear at the top of the SERP), PAAs can be located in several different parts of the page.

Let’s look at some examples:

Keyword example: [dj software]

Example of SERP where PAA is at the top of the page

For the keyword [dj software], this is what the SERP looks like:

  • 3 PPC ads
  • Related videos
  • 4 PAA listings at the top of the page
  • 10 organic results

Keyword example: [cocktail dresses under 50 pounds]

Example of SERP where PAA is in the middle of the page

For the keyword [cocktail dresses under 50 pounds], this is what the SERP looks like:

  • Shopping ads
  • 1 PPC ad
  • Image carousel
  • 3 organic results
  • 4 PAA listings in the middle of the page

Keyword example: [tv unit]

Example of SERP where PAA is at the bottom of the page

For the keyword [tv unit], this is what the SERP looks like:

  • Shopping ads
  • 1 PPC ad
  • 10 organic results
  • 3 PAA listings at the bottom of the page

Why does this matter to you?

The position of the PAA box in the SERP impacts your organic results’ CTR, especially on mobile, where space is very precious.

2. Do PAAs have a limit?

I’m just giving away the answer now: No-ish.

This feature has the ability to trigger a potentially infinite number of questions on the topic of interest. As Britney Muller researched in this Moz post, the initial 3–4 listings could, in some cases, continue into the hundreds once clicked on.

With one simple click, the 4 PAA questions can trigger three more listings, and so on and so forth.

Has the situation changed at all since the original 2016 Moz article?

Yes, it has! What I’m seeing now is actually very mixed: PAAs can vary extensively, from a fixed number of 3–4 listings to a plethora of results.

Let’s look at an example of a query that’s showing a large number of PAAs:

Keyword example: [featured snippets]

Example of SERP where the number of PAA expands when clicked upon, and is not fixed

For the query [featured snippets], the PAA listings can be expanded when clicked, a process that generates a large number of new PAA listings at the bottom of the SERP feature.

For other queries, Google will only show you 4 PAA listings, and that number will not change even if the listings are clicked on:

Keyword example: [best italian wine]

Example of SERP where the number of PAA listings is fixed and does not expand

For the query [best italian wine], the PAA listings cannot be expanded, no matter how many times you hover or click on them.

Interestingly, it also appears that Google does not keep this feature consistent: a few days after I took the above screenshots, the fixed number of PAAs was gone. On the other hand, I’ve recently seen instances where keywords have a fixed number of only 3 PAAs instead of 4.

Now, the real question for Google would be:

“What methodology are they using to decide which keywords trigger an unlimited number of PAAs and which do not?”

As you might have guessed by now, I don’t have an answer today. I’ll continue to work on uncovering it and keep you folks posted when/if I get an answer from Google or discover further insights.

My two cents on the above:

The number of PAAs does not seem to relate to particular verticals or keyword patterns at the moment, though this may change in the future (e.g. comparative keywords becoming more or less likely to show a fixed number of PAAs).

Google’s experiments will continue, and they may change PAAs quite a bit in the next one to two years. I wouldn’t be surprised if we saw questions being answered in different ways. Read the next point to know more!

Why does this matter to you?

From an opportunity standpoint, the number of questions you can scrape to take advantage of will vary.

From a user standpoint, it impacts your search journey and offers a different number of answers to your questions.

3. PAAs can trigger video results

I came across this by reading an article on Search Engine Roundtable.

Example PAA with video results

I wasn’t able to replicate the above result myself in London — but that doesn’t matter, as we’re used to seeing Google experimenting with new features in the US first.

Answering a PAA listing with a video makes a lot of sense, especially if you consider the nature of many of the queries listed:

  • What is…
  • How to…
  • Why is/are…

And so on.

I expect this to be tested more and more by Google, to a point where most of the keywords that are currently showing video results in the SERPs will trigger video results in the PAA listings, too.

Keyword example: [how to clean suede shoes diy]

Example of SERP for keywords that often trigger video results

Video results will matter more and more in the near future. Why is that?

Just examine how hard Google is working on the interpretation and simplification of video results. Google has added key moments for videos in search results (read this article to know more). This new feature allows us to jump to the portion of the video that answers our specific query.

Why does this matter to you?

From an opportunity standpoint, you can optimize your YouTube and video results to be eligible to appear in PAAs.

From a user standpoint, it enriches your search journey for PAA queries that are better answered with videos.

4. PAA questions are frequently repeated for the same search topic and also trigger featured snippets

This might be obvious, but it’s important to understand these three points:

  1. Most PAA questions also trigger featured snippets
  2. The same PAA question (& answer) can be triggered for different keywords
  3. The same answer/listing that appears for a certain question in a PAA can also appear for different questions triggered by PAAs

Let’s look at some examples to better visualize what I mean:

1. PAA questions also trigger Featured Snippets

Keyword 1: [business card ideas]

Keyword 2: [what is on a good business card?]

Example of PAA listings for case n.1

The keyword [business card ideas] triggers some PAA listings, whose questions, if used as the main query, trigger a featured snippet.

2. Different keywords can trigger the same PAA question and show the same result. 

The same listing that appears for a PAA question for keyword X can also appear for the same question, triggered by a different keyword Y.

Keyword 1: [quality business cards]

Keyword 2: [business cards quality design]

Example of PAA listings for case n.2

To summarize: Different keywords, same question in the PAA and same listing in the PAA.

3. Different questions listed in a PAA triggered by different keywords can show the same result. 

The same listing that appears for a PAA question for keyword X can also appear for a different question, triggered by a different keyword Y.

Keyword 1: [quality business cards]

Keyword 2: [best business cards online]

Example of PAA listings for case n.3

To summarize: Different keywords, different question in the PAA but same listing in the PAA.

The above keywords are clearly different, but they show the same intent:

“I’m looking for a business card by using terms that highlight certain defining attributes — best & quality.”

Small Biz Trends, in the above screenshot, has created a page that matches that particular intent. Keyword intent is a crucial topic that the SEO community has been talking about for a few years now.

Why does this matter to you?

From an opportunity standpoint, your PAA listings can trigger featured snippets and also have the possibility to cover a portfolio of different keyword permutations.

5. PAAs have a feedback feature

Most of you have probably glossed over this feature without really paying attention to it: at the bottom of the last PAA listing, there is often a little hyperlink with the word Feedback.

By clicking on it, you’re shown the following pop-up:

Example of feedback for PAA

Google states that this option is available “on some search results” and it allows users to send feedback or suggest a translation. Even if you do go through the effort, Google says they will not reply to you directly, but rather collect the info submitted and work on the accuracy of the listings.

Does this mean they’ll actually change the PAA listing based off of feedback?

Unfortunately, I don’t have an answer for this (I’ve tried to submit feedback manually and nothing really happened) but I think it’s very unlikely.

The only for-sure thing you get from Google is the following response:

Google’s response after feedback submission

Why does this matter to you?

From an opportunity standpoint, if you notice that PAA listings (for questions you are trying to appear for) are not accurate, you can flag it to Google and hope they’ll change it.

Now that we’ve covered some interesting facts, how can we take advantage of PAA?

Determine how deeply your SERP is being affected by PAA (and other SERP features)

This task is fairly straightforward, but I guarantee you very few people actually pay much attention to it. When monitoring your rankings, you should really try to dig deeply into which other elements are affecting your overall organic traffic & organic CTR.

Start by asking yourself the following questions:

  • What elements affect the SERP for my core keywords?
  • How often do these SERP elements appear?
  • How deeply are they affecting my organic results?

You might spot an increasing amount of paid results (in the form of shopping ads for products or text ads for services) appearing for many of your key terms.

Established tools like SEMrush, Sistrix, and Ahrefs can show you the number of ads, overall spending, & how the ads look at a keyword level.

Kw: [hr software]

SEMrush ads history graph by keyword

Or it may be the case that organic SERP elements, such as video results, are being triggered in the SERP for many of your informational queries, or that featured snippets appear for a high percentage of your navigational & transactional terms, and so on.

Recently, I came across a client where over 90% of their primary keywords triggered PAAs at the top of the SERP. 90%!

Which tools can help?

At Distilled we use STAT, which reports on such insights in a really comprehensive manner with a great overview of all the SERP elements.

This is what the STAT SERP features interface looks like:

STAT SERP features

Ahrefs also does a great job of allowing you to download the SERP features of the top twenty results for any of the keywords you’re interested in.

Understanding where you stand in the current SERP landscape & how your SEO has been affected by it is a crucial step prior to implementing any SERP strategy.

Tactics to take advantage of PAAs

There are several ways to incorporate PAAs into your SEO strategy. It’s already been written about many times online, so I’m going to keep it simple and focus on a few easy tactics that I think will really improve your workflow:

1. Extract PAA listings

This one’s pretty straightforward: how can we take advantage of PAAs if we cannot find a way to extract those questions in the first place?

There are several ways to “scrape” PAAs, more or less compliant with Google’s Terms & Conditions (such as using Screaming Frog).

Personally, I like STAT’s report, so I’ll talk about how easy it is to extract PAA listings using this tool:

  • One of the features of STAT’s reporting is called “People also ask (Google),” which is pretty self-explanatory: for the keywords you’ve decided to track in the tool, this report will provide the PAA questions they trigger and the URLs appearing for those listings, along with their exact rankings within the PAA box.

This is an example of what the report looks like after you’ve downloaded the “People also ask (Google)” report:

STAT PAA report

2. Address questions in your content

Once you have a list of all PAA questions and you are able to see which URLs rank for such results, what should you do next?

This is the more complicated part: think how your content strategy can incorporate PAA findings and start experimenting. Similarly to featured snippets, PAAs should be included in your content plan. If that’s not yet the case, well, I hope this blog post can convince you to give it a go!

Since I am not focusing (sadly, for some) on content strategy with this article, I will not dwell on the topic too much. Instead, I’ll share a few tips on what you could do with the data gathered so far:

Understand what type of results such PAA questions are triggering: are they informational, navigational, transactional?

Many people think featured snippets and PAA questions are triggered by heavily informational or Q&A pages: trust me, do NOT assume anything. Check your data and behave accordingly. Keyword intent should never be taken for granted.

Create or re-optimize your content

Depending on the findings in the previous point, it may be a matter of creating new content that can address PAA questions or re-optimizing the existing content on your site.

If you discover that you have a chance at ranking in a PAA with your current transactional/editorial pages, it might be best to re-optimize what you have.

It may also be the case that one of the following options can be enough to rank in PAAs:

  • Adding questions and answers to your content (don’t limit yourself to just the bottom of the page)
  • Using the right headings to mark up such elements (h1, h2, h3, whatever works for your page)
  • Copying the formatting of results that are currently appearing in PAA
  • Simply changing the language used on your site

If you do not have any content to cover a certain keyword theme, think about creating new ones that would match the keyword intent that Google is favoring. Editorial content with SEO in mind (don’t limit yourself to PAA, but look at the overall SERP spectrum) or simple FAQs pages could really help win PAA or featured snippets.

Depending on your KPIs (traffic, leads, signups, etc.), tailor your newly optimized content and be ready to retain users on your site.

Once users land on your site after clicking on a PAA listing, what do you want them to see/do? Don’t do half the job, worry about the entire user journey from the start!

3. Test schema on your page

The SEO community has gone a bit cray-cray over the new FAQs schema — my colleague Emily Potter wrote a great post on it.

FAQs and how-to schema represent an interesting opportunity for SERP features such as featured snippets and PAAs, so why not give it a go? Having the right content & testing the right type of schema may help you win precious snippets or PAAs. In the future, I expect Google to increase the amount of markup that refers to informational queries, so stay tuned — and test, test, and test some more!

Think of the extended search volume opportunity

Without digging too much into this topic (it deserves a post on its own), I’ve been thinking about the following idea quite a lot recently:

What if we started looking at PAAs as organic listings, hence counting the search volume for the keywords that trigger such PAAs?

Since PAAs and other elements have been redefining the SERPs as we know them, maybe it’s time for us marketers to redefine how these features are impacting our organic results. Maybe it’s time for us to consider the extended search opportunity that such features bring to the table and not limit ourselves to the tactics mentioned above.

Just something to think about!

PAA can be your friend

By now, I hope you’ve learned a bit more about People Also Ask and how it can help your SEO strategy moving forward.

PAA can be your friend indeed if you’re willing to spend time understanding how your organic visibility can be influenced by such features. The fact that PAAs are now popular for a large portfolio of queries makes me think Google considers them a new, key part of the user journey.

With voice search on the rise, I expect Google to pay even more attention to elements like featured snippets and People Also Ask. I don’t think they’re going anywhere soon — so my dear fellow SEOs, you should start optimizing for the SERPs starting today!

Feel free to get in touch with us at Distilled or on Twitter at @SamuelMng to discuss this further, or just have a chat about who these people who also ask so many questions actually are…




Shopify SEO: The Guide to Optimizing Shopify


cml63

A trend we’ve been noticing at Go Fish Digital is that more and more of our clients have been using the Shopify platform. While we initially thought this was just a coincidence, we can see that the data tells a different story:

Graph Of Shopify Usage Statistics

The Shopify platform is now more popular than ever. Looking at BuiltWith usage statistics, we can see that usage of the CMS has more than doubled since July 2017. Currently, 4.47% of the top 10,000 sites are using Shopify.

Since we’ve worked with a good amount of Shopify stores, we wanted to share our process for common SEO improvements we help our clients with. The guide below should outline some common adjustments we make on Shopify stores.

What is Shopify SEO?

Shopify SEO simply means SEO improvements that are more unique to Shopify than other sites. While Shopify stores come with some useful things for SEO, such as a blog and the ability to redirect, it can also create SEO issues such as duplicate content. Some of the most common Shopify SEO recommendations are:

  • Remove duplicate URLs from internal linking architecture
  • Remove duplicate paginated URLs
  • Create blog content for keywords with informational intent
  • Add “Product,” “Article,” & “BreadcrumbList” structured data
  • Determine how to handle product variant pages
  • Compress images using crush.pics
  • Remove unnecessary Shopify apps

We’ll go into how we handle each of these recommendations below:

Duplicate content

In terms of SEO, duplicate content is the highest-priority issue we’ve seen created by Shopify. Duplicate content occurs when either duplicate or similar content exists on two separate URLs. This creates issues for search engines, as they might not be able to determine which of the two pages is the canonical version. On top of this, link signals are often split between the pages.

We’ve seen Shopify create duplicate content in several different ways:

  1. Duplicate product pages
  2. Duplicate collections pages through pagination

Duplicate product pages

Shopify creates this issue within their product pages. By default, Shopify stores allow their /products/ pages to render at two different URL paths:

  • Canonical URL path: /products/
  • Non-canonical URL path: /collections/.*/products/

Shopify accounts for this by ensuring that all /collections/.*/products/ pages include a canonical tag to the associated /products/ page. Notice how the URL in the address differs from the “canonical” field:

URL In Address Bar Is Different Than Canonical Link

While this certainly helps Google consolidate the duplicate content, a more alarming issue occurs when you look at the internal linking structure. By default, Shopify will link to the non-canonical version of all of your product pages.

Shopify collection page links to non-canonical URLs

As well, we’ve also seen Shopify link to the non-canonical versions of URLs when websites utilize “swatch” internal links that point to other color variants.

Thus, Shopify creates your entire site architecture around non-canonical links by default. This creates a high-priority SEO issue because the website is sending Google conflicting signals:

  1. “Here are the pages we internally link to the most often”
  2. “However, the pages we link to the most often are not the URLs we actually want to be ranking in Google. Please index these other URLs with few internal links”

While canonical tags are usually respected, remember Google does treat these as hints instead of directives. This means that you’re relying on Google to make a judgement about whether or not the content is duplicate each time that it crawls these pages. We prefer not to leave this up to chance, especially when dealing with content at scale.

Adjusting internal linking structure

Fortunately, there is a relatively easy fix for this. We’ve been able to work with our dev team to adjust the code in the product.grid-item.liquid file so that the collections pages on a Shopify site point to the canonical /products/ URLs.
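The change itself is usually a one-liner. A hedged sketch (your theme’s markup will differ; the key is dropping Liquid’s within filter, which prefixes product URLs with the current collection path):

```liquid
<!-- Before: "within" builds /collections/<collection-handle>/products/<handle> links -->
<a href="{{ product.url | within: collection }}">{{ product.title }}</a>

<!-- After: link straight to the canonical /products/<handle> URL -->
<a href="{{ product.url }}">{{ product.title }}</a>
```
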

Duplicate collections pages

As well, we’ve seen many Shopify sites that create duplicate content through the site’s pagination. More specifically, a duplicate is created of the first collections page in a particular series. This is because once you’re on a paginated URL in a series, the link to the first page will contain “?page=1”:

First page in Shopify pagination links to ?page=1 link

However, a URL with “?page=1” will almost always contain the same content as the original non-parameterized URL, making it a duplicate page. Once again, we recommend having a developer adjust the internal linking structure so that the first paginated link points to the canonical page.
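One way to sketch that fix in Liquid, assuming a theme that renders pagination from Shopify’s paginate.parts array (the exact file and markup vary by theme):

```liquid
{% for part in paginate.parts %}
  {% if part.is_link %}
    {% if part.title == '1' %}
      {% comment %} Send page 1 to the clean collection URL, not ?page=1 {% endcomment %}
      <a href="{{ collection.url }}">1</a>
    {% else %}
      <a href="{{ part.url }}">{{ part.title }}</a>
    {% endif %}
  {% else %}
    <span class="current">{{ part.title }}</span>
  {% endif %}
{% endfor %}
```
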

Product variant pages

While this is technically an extension of Shopify’s duplicate content from above, we thought this warranted its own section because this isn’t necessarily always an SEO issue.

It’s not uncommon to see Shopify stores where multiple product URLs are created for the same product with slight variations. This can create duplicate content issues, as the core product is often the same and only a slight attribute (color, for instance) changes. This means that multiple pages can exist with duplicate/similar product descriptions and images. Here is an example of duplicate pages created by a variant: https://recordit.co/x6YRPkCDqG

If left alone, this once again creates an instance of duplicate content. However, variant URLs do not have to be an SEO issue. In fact, some sites could benefit from these URLs as they allow you to have indexable pages that could be optimized for very specific terms. Whether or not these are beneficial is going to differ on every site. Some key questions to ask yourself are:

  • Do your customers perform queries based on variant phrases?
  • Do you have the resources to create unique content for all of your product variants?
  • Is this content unique enough to stand on its own?

For a more in-depth guide, Jenny Halasz wrote a great article on determining the best course of action for product variations. If your Shopify store contains product variants, then it’s worth determining early on whether or not these pages should exist at a separate URL. If they should, then you should create unique content for each one and optimize it for that variant’s target keywords.

Crawling and indexing

After analyzing quite a few Shopify stores, we’ve found some SEO items that are unique to Shopify when it comes to crawling and indexing. Since this is very often an important component of e-commerce SEO, we thought it would be good to share the ones that apply to Shopify.

Robots.txt file

A very important note is that in Shopify stores, you cannot adjust the robots.txt file. This is stated in their official help documentation. While you can add “noindex” to pages through theme.liquid, this is not as helpful if you want to prevent Google from crawling your content altogether.

An example robots.txt file in Shopify

Here are some sections of the site that Shopify will disallow crawling in:

  • Admin area
  • Checkout
  • Orders
  • Shopping cart
  • Internal search
  • Policies page

While it’s nice that Shopify creates some default disallow commands for you, the fact that you cannot adjust the robots.txt file can be very limiting. The robots.txt is probably the easiest way to control Google’s crawl of your site as it’s extremely easy to update and allows for a lot of flexibility. You might need to try other methods of adjusting Google’s crawl such as “nofollow” or canonical tags.

Adding the “noindex” tag

While you cannot adjust the robots.txt, Shopify does allow you to add the “noindex” tag. You can exclude a specific page from the index by adding the following code to your theme.liquid file.

{% if template contains 'search' %}
<meta name="robots" content="noindex">
{% endif %}

As well, if you want to exclude an entire template, you can use this code:

{% if handle contains 'page-handle-you-want-to-exclude' %}
<meta name="robots" content="noindex">
{% endif %}

Redirects

Shopify does allow you to implement redirects out-of-the-box, which is great. You can use this for consolidating old/expired pages or any other content that no longer exists. You can do this by going to Online Store > Navigation > URL Redirects.

So far, we haven’t found a way to implement global redirects via Shopify. This means that your redirects will likely need to be 1:1.

Log files

Similar to the robots.txt, it’s important to note that Shopify does not provide you with log file information. This has been confirmed by Shopify support.

Structured data

Product structured data

Overall, Shopify does a pretty good job with structured data. Many Shopify themes should contain “Product” markup out-of-the-box that provides Google with key information such as your product’s name, description, and price. This is probably the highest priority structured data to have on any e-commerce site, so it’s great that many themes do this for you.

Shopify sites might also benefit from expanding the Product structured data to collections pages as well. This involves adding the Product structured data to define each individual product link in a product listing page. The good folks at Distilled recommend including this structured data on category pages.
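As a rough sketch (the URLs and values below are placeholders, not Shopify output, and some implementations wrap the products in an ItemList instead), marking up each product link on a collection page might look like this, repeated once per product:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Cloth Diaper",
  "url": "https://example-store.com/products/example-cloth-diaper",
  "image": "https://example-store.com/images/example-cloth-diaper.jpg",
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```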

Every product in Shopify collections page marked up with Product structured data

Article structured data

As well, if you use Shopify’s blog functionality, you should use “Article” structured data. This is a fantastic schema type that lets Google know that your blog content is more editorial in nature. We’ve seen that Google seems to pull content with “Article” structured data into platforms such as Google Discover and the “Interesting Finds” sections in the SERPs. Ensuring your content contains this structured data may increase the chances your site’s content is included in these sections.
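A minimal Article sketch for a Shopify blog post might look like the following (all values are placeholders; in practice a theme would populate these fields from Liquid variables):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2019-01-15",
  "image": "https://example-store.com/images/example-post.jpg",
  "publisher": {
    "@type": "Organization",
    "name": "Example Store",
    "logo": { "@type": "ImageObject", "url": "https://example-store.com/logo.png" }
  }
}
</script>
```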

BreadcrumbList structured data

Finally, one addition that we routinely add to Shopify sites are breadcrumb internal links with BreadcrumbList structured data. We believe breadcrumbs are crucial to any e-commerce site, as they provide users with easy-to-use internal links that indicate where they’re at within the hierarchy of a website. As well, these breadcrumbs can help Google better understand the website’s structure. We typically suggest adding site breadcrumbs to Shopify sites and marking those up with BreadcrumbList structured data to help Google better understand those internal links.
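For illustration only (the URLs are hypothetical), a two-level breadcrumb trail marked up with BreadcrumbList might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://example-store.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Cloth Diapers",
      "item": "https://example-store.com/collections/cloth-diapers"
    }
  ]
}
</script>
```

Keep the visible breadcrumb links in the template in sync with this markup.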

Keyword research

Performing keyword research for Shopify stores will be very similar to the research you would perform for other e-commerce stores.

Some general ways to generate keywords are:

  • Export your keyword data from Google AdWords. Track and optimize for those that generate the most revenue for the site.
  • Research your AdWords keywords that have high conversion rates. Even if the volume is lower, a high conversion rate indicates that this keyword is more transactional.
  • Review the keywords the site currently gets clicks/impressions for in Google Search Console.
  • Research your high priority keywords and generate new ideas using Moz’s Keyword Explorer.
  • Run your competitors through tools like Ahrefs. Using the “Content Gap” report, you can find keyword opportunities where competitor sites are ranking but yours is not.
  • If you have keywords that use similar modifiers, you can use MergeWords to automatically generate a large variety of keyword variations.

Keyword optimization

Similar to Yoast SEO, Shopify does allow you to optimize key elements such as your title tags, meta descriptions, and URLs. Where possible, you should be using your target keywords in these elements.

To adjust these elements, you simply need to navigate to the page you wish to adjust and scroll down to “Search Engine Listing Preview”:

Optimization Options For Metadata in Shopify

Adding content to product pages

If you decide that each individual product should be indexed, ideally you’ll want to add unique content to each page. Initially, your Shopify products may not have unique on-page content associated with them. This is a common issue for Shopify stores, as oftentimes the same descriptions are used across multiple products or no descriptions are present. Adding product descriptions with on-page best practices will give your products the best chance of ranking in the SERPs.

However, we understand that it’s time-consuming to create unique content for every product that you offer. With clients in the past, we’ve taken a targeted approach as to which products to optimize first. We like to use the “Sales By Product” report which can help prioritize which are the most important products to start adding content to. You can find this report in Analytics > Dashboard > Top Products By Units Sold.

Shopify revenue by product report

By taking this approach, we can quickly identify some of the highest priority pages in the store to optimize. We can then work with a copywriter to start creating content for each individual product. Also, keep in mind that your product descriptions should always be written from a user-focused view. Writing about the features of the product they care about the most will give your site the best chance at improving both conversions and SEO.

Shopify blog

Shopify does include the ability to create a blog, but we often see this missing from a large number of Shopify stores. It makes sense, as revenue is the primary goal of an e-commerce site, so the initial build of the site is product-focused.

However, we live in an era where it’s getting harder and harder to rank product pages in Google. For instance, the below screenshot illustrates the top 3 organic results for the term “cloth diapers”:

SERP for "cloth diaper" keyword.

While many would assume that this is primarily a transactional query, we’re seeing Google is ranking two articles and a single product listing page in the top three results. This is just one instance of a major trend we’ve seen where Google is starting to prefer to rank more informational content above transactional.

Excluding a blog from a Shopify store is, we think, a huge missed opportunity for many businesses. The inclusion of a blog gives you a natural place to create this informational content. If you’re seeing that Google is ranking more blog/article types of content for the keywords mapped to your Shopify store, your best bet is to go out and create that content yourself.

If you run a Shopify store (or any e-commerce site), we would urge you to take the following few steps:

  1. Identify your highest priority keywords
  2. Manually perform a Google query for each one
  3. Make note of the types of content Google is ranking on the first page. Is it primarily informational, transactional, or a mix of both?
  4. If you’re seeing primarily mixed or informational content, evaluate your own content to see if you have any that matches the user intent. If so, improve the quality and optimize.
  5. If you do not have this content, consider creating new blog content around informational topics that seem to fulfill the user intent.

As an example, we have a client that was interested in ranking for the term “CRM software,” an extremely competitive keyword. When analyzing the SERPs, we found that Google was ranking primarily informational pages about “What Is CRM Software?” Since they only had a product page that highlighted their specific CRM, we suggested the client create a more informational page that talked generally about what CRM software is and the benefits it provides. After creating and optimizing the page, we soon saw a significant increase in organic traffic (credit to Ally Mickler):

The issue that we see on many Shopify sites is that there is very little focus on informational pages despite the fact that those perform well in the search engines. Most Shopify sites should be using the blogging platform, as this will provide an avenue to create informational content that will result in organic traffic and revenue.

Apps

Similar to WordPress’s plugins, Shopify offers “Apps” that allow you to add advanced functionality to your site without having to manually adjust the code. However, unlike WordPress, most of the Shopify Apps you’ll find are paid. This will require either a one-time or monthly fee.

Shopify apps for SEO

While your best bet is likely teaming up with a developer who’s comfortable with Shopify, here are some Shopify apps that can help improve the SEO of your site.

  • Crush.pics: A great automated way of compressing large image files. Crucial for most Shopify sites as many of these sites are heavily image-based.
  • JSON-LD for SEO: This app may be used if you do not have a Shopify developer who is able to add custom structured data to your site.
  • Smart SEO: An app that can add meta tags, alt tags, and JSON-LD.
  • Yotpo Reviews: This app can help you add product reviews to your site, making your content eligible for rich review stars in the SERPs.

Is Yoast SEO available for Shopify?

Yoast SEO is exclusively a WordPress plugin. There is currently no Yoast SEO Shopify App.

Limiting your Shopify apps

Similar to WordPress plugins, Shopify apps will inject additional code onto your site. This means that adding a large number of apps can slow down the site. Shopify sites are especially susceptible to bloat, as many apps are focused on improving conversions. Oftentimes, these apps add more JavaScript and CSS files, which can hurt page load times. You’ll want to be sure that you regularly audit the apps you’re using and remove any that are not adding value or being utilized by the site.

Client results

We’ve seen pretty good success with our clients that use Shopify stores. Below you can find some of the results we’ve been able to achieve for them. However, please note that these case studies do not rest on the above recommendations alone; for these clients, we used a combination of the recommendations outlined above as well as other SEO initiatives.

In one example, we worked with a Shopify store that was interested in ranking for very competitive terms surrounding the main product their store focused on. We evaluated their top performing products in the “Sales by product” report. This resulted in a large effort to work with the client to add new content to their product pages as they were not initially optimized. This combined with other initiatives has helped improve their first page rankings by 113 keywords (credit to Jennifer Wright & LaRhonda Sparrow).

Graph of first-page keyword rankings over time

In another instance, a client came to us with an issue that they were not ranking for their branded keywords. Instead, third-party retailers that also carried their products were often outranking them. We worked with them to adjust their internal linking structure to point to the canonical pages instead of the duplicate pages created by Shopify. We also optimized their content to better utilize the branded terminology on relevant pages. As a result, they’ve seen a nice increase in overall rankings in just several months’ time.

Graph of total ranking improvements over time.

Moving forward

As Shopify usage continues to grow, it will be increasingly important to understand the SEO implications that come with the platform. Hopefully, this guide has provided you with additional knowledge that will help make your Shopify store stronger in the search engines.




A Breakdown of HTML Usage Across ~8 Million Pages (& What It Means for Modern SEO)


Catalin.Rosu

Not long ago, my colleagues and I at Advanced Web Ranking came up with an HTML study based on about 8 million index pages gathered from the top twenty Google results for more than 30 million keywords.

We wrote about the markup results and how the top twenty Google results pages implement them, then went even further and obtained HTML usage insights on them.

What does this have to do with SEO?

The way HTML is written dictates what users see and how search engines interpret web pages. A valid, well-formatted HTML page also reduces possible misinterpretation — of structured data, metadata, language, or encoding — by search engines.

This is intended to be a technical SEO audit, something we wanted to do from the beginning: a breakdown of HTML usage and how the results relate to modern SEO techniques and best practices.

In this article, we’re going to address things like meta tags that Google understands, JSON-LD structured data, language detection, headings usage, social links & meta distribution, AMP, and more.

Meta tags that Google understands

When talking about the main search engines as traffic sources, sadly it’s just Google and the rest, with DuckDuckGo gaining traction lately and Bing almost nonexistent.

Thus, in this section we’ll be focusing solely on the meta tags that Google listed in the Search Console Help Center.

Pie chart showing the total numbers for the meta tags that Google understands, described in detail in the sections below.

<meta name=”description” content=”…”>

The meta description is a ~150 character snippet that summarizes a page’s content. Search engines show the meta description in the search results when the searched phrase is contained in the description.

SELECTOR | COUNT
<meta name="description" content="*"> | 4,391,448
<meta name="description" content=""> | 374,649
<meta name="description"> | 13,831

At the extremes, we found 685,341 meta description elements with content shorter than 30 characters and 1,293,842 elements with content longer than 160 characters.

<title>

The title is technically not a meta tag, but it’s used in conjunction with meta name=”description”.

This is one of the two most important HTML tags when it comes to SEO. It’s also a must according to W3C, meaning no page is valid with a missing title tag.

Research suggests that if you keep your titles under a reasonable 60 characters then you can expect your titles to be rendered properly in the SERPs. In the past, there were signs that Google’s search results title length was extended, but it wasn’t a permanent change.

Considering all the above, from the full 6,263,396 titles we found, 1,846,642 title tags appear to be too long (more than 60 characters) and 1,985,020 titles had lengths considered too short (under 30 characters).

Pie chart showing the title tag length distribution, with a length less than 30 chars being 31.7% and a length greater than 60 chars being about 29.5%.

A title being too short shouldn’t be a problem; after all, it’s a subjective thing that depends on the website’s business. Meaning can be expressed with fewer words, though it’s definitely a sign of wasted optimization opportunity.

SELECTOR | COUNT
<title>*</title> | 6,263,396
missing <title> tag | 1,285,738

Another interesting thing is that, among the sites ranking on page 1–2 of Google, 351,516 (~5% of the total 7.5M) are using the same text for the title and h1 on their index pages.

Also, did you know that with HTML5 you only need to specify the HTML5 doctype and a title in order to have a perfectly valid page?

<!DOCTYPE html>
<title>red</title>

<meta name=”robots|googlebot”>

“These meta tags can control the behavior of search engine crawling and indexing. The robots meta tag applies to all search engines, while the “googlebot” meta tag is specific to Google.”
– Meta tags that Google understands

SELECTOR | COUNT
<meta name="robots" content="..., ..."> | 1,577,202
<meta name="googlebot" content="..., ..."> | 139,458

HTML snippet with a meta robots and its content parameters.

So the robots meta directives provide instructions to search engines on how to crawl and index a page’s content. Leaving aside the googlebot meta count, which is fairly low, we were curious to see the most frequent robots parameters, especially given the common misconception that you have to add a robots meta tag to your HTML’s head (without one, search engines will crawl and index the page by default). Here’s the top 5:

SELECTOR | COUNT
<meta name="robots" content="index,follow"> | 632,822
<meta name="robots" content="index"> | 180,226
<meta name="robots" content="noodp"> | 115,128
<meta name="robots" content="all"> | 111,777
<meta name="robots" content="nofollow"> | 83,639

<meta name=”google” content=”nositelinkssearchbox”>

“When users search for your site, Google Search results sometimes display a search box specific to your site, along with other direct links to your site. This meta tag tells Google not to show the sitelinks search box.”
– Meta tags that Google understands

SELECTOR | COUNT
<meta name="google" content="nositelinkssearchbox"> | 1,263

Unsurprisingly, not many websites choose to explicitly tell Google not to show a sitelinks search box when their site appears in the search results.

<meta name=”google” content=”notranslate”>

“This meta tag tells Google that you don’t want us to provide a translation for this page.” – Meta tags that Google understands

There may be situations where providing your content to a much larger group of users is not desired. Just as it says in the Google support answer above, this meta tag tells Google that you don’t want them to provide a translation for this page.

SELECTOR | COUNT
<meta name="google" content="notranslate"> | 7,569

<meta name=”google-site-verification” content=”…”>

“You can use this tag on the top-level page of your site to verify ownership for Search Console.”
– Meta tags that Google understands

SELECTOR | COUNT
<meta name="google-site-verification" content="..."> | 1,327,616

While we’re on the subject, did you know that if you’re a verified owner of a Google Analytics property, Google will now automatically verify that same website in Search Console?

<meta charset=”…” >

“This defines the page’s content type and character set.”
– Meta tags that Google understands

This is one of the essential meta tags: it defines the page’s content type and character set. Considering the table below, we noticed that only about half of the index pages we analyzed define a meta charset.

SELECTOR | COUNT
<meta charset="..."> | 3,909,788

<meta http-equiv=”refresh” content=”…;url=…”>

“This meta tag sends the user to a new URL after a certain amount of time and is sometimes used as a simple form of redirection.”
– Meta tags that Google understands

It’s preferable to redirect your site using a server-side 301 redirect rather than a meta refresh, especially since 30x redirects are assumed not to lose PageRank and the W3C recommends against using this tag. Google is not a fan either, recommending a 301 redirect instead.
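For reference, this is what the discouraged pattern looks like; the content attribute carries the delay in seconds and the destination URL (a placeholder here):

```html
<!-- Discouraged: client-side redirect after 0 seconds -->
<meta http-equiv="refresh" content="0;url=https://example.com/new-page/">
```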

SELECTOR | COUNT
<meta http-equiv="refresh" content="...;url=..."> | 7,167

From the total 7.5M index pages we parsed, we found 7,167 pages that are using the above redirect method. Authors do not always have control over server-side technologies and apparently they use this technique in order to enable redirects on the client side.

Also, using Workers is a cutting-edge alternative for overcoming issues when working with legacy tech stacks and platform limitations.

<meta name=”viewport” content=”…”>

“This tag tells the browser how to render a page on a mobile device. Presence of this tag indicates to Google that the page is mobile-friendly.”
– Meta tags that Google understands

SELECTOR | COUNT
<meta name="viewport" content="..."> | 4,992,791

Starting July 1, 2019, mobile-first indexing became the default for all new websites. Lighthouse checks whether there’s a meta name=”viewport” tag in the head of the document, so this meta tag should be on every webpage, no matter what framework or CMS you’re using.

Considering the above, we would have expected more websites than the 4,992,791 out of 7.5 million index pages analyzed to use a valid meta name=”viewport” in their head sections.

Designing mobile-friendly sites ensures that your pages perform well on all devices, so make sure to check your pages with Google’s Mobile-Friendly Test.

<meta name=”rating” content=”…” />

“Labels a page as containing adult content, to signal that it be filtered by SafeSearch results.”
– Meta tags that Google understands

SELECTOR | COUNT
<meta name="rating" content="..." /> | 133,387

This tag is used to denote the maturity rating of content. It was not added to the meta tags that Google understands list until recently. Check out this article by Kate Morris on how to tag adult content.

JSON-LD structured data

Structured data is a standardized format for providing information about a page and classifying the page content. The format of structured data can be Microdata, RDFa, and JSON-LD — all of these help Google understand the content of your site and trigger special search result features for your pages.

While having a conversation with the awesome Dan Shure, he came up with a good idea to look for structured data, such as the organization’s logo, in search results and in the Knowledge Graph.

In this section, we’ll be using JSON-LD (JavaScript Object Notation for Linked Data) only in order to gather structured data info. This is what Google recommends anyway for providing clues about the meaning of a web page.

Some useful bits on this:

  • At Google I/O 2019, it was announced that the structured data testing tool will be superseded by the rich results testing tool.
  • Now Googlebot indexes web pages using the latest Chromium rather than the old Chrome 42, meaning you can mitigate the SEO issues you may have had in the past, with structured data support as well.
  • Jason Barnard had an interesting talk at SMX London 2019 on how Google Search ranking works and according to his theory, there are seven ranking factors we can count on; structured data is definitely one of them.
  • Builtvisible‘s guide on Microdata, JSON-LD, & Schema.org contains everything you need to know about using structured data on your website.
  • Here’s an awesome guide to JSON-LD for beginners by Alexis Sanders.
  • Last but not least, there are lots of articles, presentations, and posts to dive in on the official JSON for Linking Data website.

Advanced Web Ranking’s HTML study relies on analyzing index pages only. What’s interesting is that even though it’s not stated in the guidelines, Google doesn’t seem to care about structured data on index pages, as stated in a Stack Overflow answer by Gary Illyes several years ago. Yet, on JSON-LD structured data types that Google understands, we found a total of 2,727,045 features:

Pie chart showing the structured data types that Google understands, with Sitelinks searchbox being 49.7% — the highest value.

STRUCTURED DATA FEATURES | COUNT
Article | 35,961
Breadcrumb | 30,306
Book | 143
Carousel | 13,884
Corporate contact | 41,588
Course | 676
Critic review | 2,740
Dataset | 28
Employer aggregate rating | 7
Event | 18,385
Fact check | 7
FAQ page | 16
How-to | 8
Job posting | 355
Livestream | 232
Local business | 200,974
Logo | 442,324
Media | 1,274
Occupation | 0
Product | 16,090
Q&A page | 20
Recipe | 434
Review snippet | 72,732
Sitelinks searchbox | 1,354,754
Social profile | 478,099
Software app | 780
Speakable | 516
Subscription and paywalled content | 363
Video | 14,349

rel=canonical

The rel=canonical element, often called the “canonical link,” is an HTML element that helps webmasters prevent duplicate content issues. It does this by specifying the “canonical URL,” the “preferred” version of a web page.
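For example, a parameterized variant of a page can point search engines at its preferred version (the URLs are placeholders):

```html
<!-- On https://example.com/dresses?color=blue&sort=price -->
<link rel="canonical" href="https://example.com/dresses">
```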

SELECTOR | COUNT
<link rel=canonical href="*"> | 3,183,575

meta name=”keywords”

It’s not news that <meta name=”keywords”> is obsolete and Google doesn’t use it anymore. It also appears that <meta name=”keywords”> can act as a spam signal for most of the search engines.

“While the main search engines don’t use meta keywords for ranking, they’re very useful for onsite search engines like Solr.”
– JP Sherman on why this obsolete meta might still be useful nowadays.

SELECTOR | COUNT
<meta name="keywords" content="*"> | 2,577,850
<meta name="keywords" content=""> | 256,220
<meta name="keywords"> | 14,127

Headings

Within 7.5 million pages, h1 (59.6%) and h2 (58.9%) are among the twenty-eight elements used on the most pages. Still, after gathering all the headings, we found that h3 is the heading with the largest number of appearances: 29,565,562 h3s out of the 70,428,376 total headings found.

Random facts:

  • The h1–h6 elements represent the six levels of section headings. Here are the full stats on headings usage, but we also found 23,116 h7s and 7,276 h8s. That’s funny, because plenty of people don’t even use h6s very often.
  • There are 3,046,879 pages with missing h1 tags, and within the remaining 4,502,255 pages, the average h1 frequency is 2.6 per page, for a total of 11,675,565 h1 elements.
  • While there are 6,263,396 pages with a valid title, as seen above, only 4,502,255 of them use an h1 within the body of their content.

Missing alt attributes

This eternal SEO and accessibility issue still seems common in this data set. From the total of 669,591,743 images, almost 90% are either missing the alt attribute or use it with a blank value.
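As a quick refresher, only purely decorative images should ship an empty alt value; everything else deserves a description:

```html
<img src="chart.png">                                               <!-- Missing alt: counts against you -->
<img src="divider.png" alt="">                                      <!-- Decorative: empty alt tells screen readers to skip it -->
<img src="sales-chart.png" alt="Line chart of 2019 monthly sales">  <!-- Descriptive alt -->
```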

Pie chart showing the img tag alt attribute distribution, with missing alt being predominant — 81.7% from a total of about 670 million images we found.

SELECTOR | COUNT
img | 669,591,743
img alt="*" | 79,953,034
img alt="" | 42,815,769
img w/ missing alt | 546,822,940

Language detection

According to the specs, the language information specified via the lang attribute may be used by a user agent to control rendering in a variety of ways.

The part we’re interested in here is about “assisting search engines.”

“The HTML lang attribute is used to identify the language of text content on the web. This information helps search engines return language specific results, and it is also used by screen readers that switch language profiles to provide the correct accent and pronunciation.”
– Léonie Watson

A while ago, John Mueller said Google ignores the HTML lang attribute and recommended the use of link hreflang instead. The Google Search Console documentation states that Google uses hreflang tags to match the user’s language preference to the right variation of your pages.

Bar chart showing that 65% of the 7.5 million index pages use the lang attribute on the html element, at the same time 21.6% use at least a link hreflang.

Of the 7.5 million index pages that we were able to look into, 4,903,665 use the lang attribute on the html element. That’s about 65%!

When it comes to the hreflang attribute, suggesting the existence of a multilingual website, we found about 1,631,602 pages — that means around 21.6% index pages use at least a link rel=”alternate” href=”*” hreflang=”*” element.
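Put together, the two signals might look like this in the head of an English page with a German alternate (hypothetical URLs; note that hreflang annotations should be reciprocal across all language versions):

```html
<!-- The lang attribute sits on the root element: <html lang="en"> -->
<link rel="alternate" hreflang="en" href="https://example.com/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">
<!-- Fallback for users whose language isn't covered above -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```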

Google Tag Manager

From the beginning, Google Analytics’ main task was to generate reports and statistics about your website. But if you want to group certain pages together to see how people are navigating through that funnel, you need a unique Google Analytics tag. This is where things get complicated.

Google Tag Manager makes it easier to:

  • Manage this mess of tags by letting you define custom rules for when and what user actions your tags should fire
  • Change your tags whenever you want without actually changing the source code of your website, which sometimes can be a headache due to slow release cycles
  • Use other analytics/marketing tools with GTM, again without touching the website’s source code

We searched for *googletagmanager.com/gtm.js references and saw that about 345,979 pages are using the Google Tag Manager.

rel=”nofollow”

“Nofollow” provides a way for webmasters to tell search engines “don’t follow links on this page” or “don’t follow this specific link.”

Google does not follow these links and likewise does not transfer equity. Considering this, we were curious about rel=”nofollow” numbers. We found a total of 12,828,286 rel=”nofollow” links within 7.5 million index pages, with a computed average of 1.69 rel=”nofollow” per page.

Last month, Google announced two new link attribute values that should be used to mark the nofollow property of a link: rel=”sponsored” and rel=”ugc”. I’d recommend you go read Cyrus Shepard’s article on how Google’s nofollow, sponsored, & ugc links impact SEO to learn why Google changed nofollow, the ranking impact of nofollow links, and more.

A table showing how Google’s nofollow, sponsored, and UGC link attributes impact SEO, from Cyrus Shepard’s article.
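In practice, the three values annotate different kinds of links, along these lines (example URLs only; per Google’s announcement, the values can also be combined, e.g. rel="ugc sponsored"):

```html
<!-- Paid or affiliate link -->
<a href="https://advertiser.example.com/" rel="sponsored">Sponsored partner</a>
<!-- User-generated content, e.g. comments or forum posts -->
<a href="https://forum.example.com/post/123" rel="ugc">User link</a>
<!-- General "don't follow / don't pass equity" hint -->
<a href="https://example.com/" rel="nofollow">Untrusted link</a>
```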

We went a bit further and looked up these new link attributes values, finding 278 rel=”sponsored” and 123 rel=”ugc”. To make sure we had the relevant data for these queries, we updated the index pages data set specifically two weeks after the Google announcement on this matter. Then, using Moz authority metrics, we sorted out the top URLs we found that use at least one of the rel=”sponsored” or rel=”ugc” pair:

  • https://www.seroundtable.com/
  • https://letsencrypt.org/
  • https://www.newsbomb.gr/
  • https://thehackernews.com/
  • https://www.ccn.com/
  • https://www.chip.pl/
  • https://www.gamereactor.se/
  • https://www.tribes.co.uk/

AMP

Accelerated Mobile Pages (AMP) is a Google initiative that aims to speed up the mobile web. Many publishers are making their content available in parallel in the AMP format.

To let Google and other platforms know about it, you need to link AMP and non-AMP pages together.
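The pairing is a two-way reference (the URLs below are placeholders):

```html
<!-- On the canonical (non-AMP) page -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/article/">
```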

Within the millions of pages we looked at, we found only 24,807 non-AMP pages referencing their AMP version using rel=amphtml.

Social

We wanted to know how shareable or social a website is nowadays, so knowing that Josh Buchea made an awesome list with everything that could go in the head of your webpage, we extracted the social sections from there and got the following numbers:

Facebook Open Graph

Bar chart showing the Facebook Open Graph meta tags distribution, described in detail in the table below.

SELECTOR | COUNT
meta property="fb:app_id" content="*" | 277,406
meta property="og:url" content="*" | 2,909,878
meta property="og:type" content="*" | 2,660,215
meta property="og:title" content="*" | 3,050,462
meta property="og:image" content="*" | 2,603,057
meta property="og:image:alt" content="*" | 54,513
meta property="og:description" content="*" | 1,384,658
meta property="og:site_name" content="*" | 2,618,713
meta property="og:locale" content="*" | 1,384,658
meta property="article:author" content="*" | 14,289

Twitter card

Bar chart showing the Twitter Card meta tags distribution, described in detail in the table below.

SELECTOR | COUNT
meta name="twitter:card" content="*" | 1,535,733
meta name="twitter:site" content="*" | 512,907
meta name="twitter:creator" content="*" | 283,533
meta name="twitter:url" content="*" | 265,478
meta name="twitter:title" content="*" | 716,577
meta name="twitter:description" content="*" | 1,145,413
meta name="twitter:image" content="*" | 716,577
meta name="twitter:image:alt" content="*" | 30,339

And speaking of links, we grabbed all of them that were pointing to the most popular social networks.

Pie chart showing the external social links distribution, described in detail in the table below.

SELECTOR | COUNT
<a href*="facebook.com"> | 6,180,313
<a href*="twitter.com"> | 5,214,768
<a href*="linkedin.com"> | 1,148,828
<a href*="plus.google.com"> | 1,019,970

Apparently there are lots of websites that still link to their Google+ profiles, which is probably an oversight considering the not-so-recent Google+ shutdown.

rel=prev/next

According to Google, using rel=prev/next is not an indexing signal anymore, as announced earlier this year:

“As we evaluated our indexing signals, we decided to retire rel=prev/next. Studies show that users love single-page content, aim for that when possible, but multi-part is also fine for Google Search.”
– Tweeted by Google Webmasters

However, in case it matters for you, Bing says it uses them as hints for page discovery and site structure understanding.

“We’re using these (like most markup) as hints for page discovery and site structure understanding. At this point, we’re not merging pages together in the index based on these and we’re not using prev/next in the ranking model.”
– Frédéric Dubut from Bing

Nevertheless, here are the usage stats we found while looking at millions of index pages:

SELECTOR | COUNT
<link rel="prev" href="*"> | 20,160
<link rel="next" href="*"> | 242,387
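For reference, here is what this markup looks like on a middle page of a hypothetical paginated series (URLs are made up):

```html
<!-- In the <head> of page 2 of a three-page series -->
<link rel="prev" href="https://example.com/articles?page=1">
<link rel="next" href="https://example.com/articles?page=3">
```

Since this study crawled index pages, which are usually the first page of any paginated series and therefore carry only rel="next", the large gap between the two counts is not surprising.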

That’s pretty much it!

Knowing how the average web page looks, using data from about 8 million index pages, can give us a clearer idea of trends and help us visualize common usage of HTML when it comes to modern and emerging SEO techniques. But this may be a never-ending saga — while we have lots of numbers and stats to explore, there are still lots of questions that need answering:

  • We know how structured data is used in the wild now. How will it evolve, and how much structured data will be considered enough?
  • Should we expect AMP usage to increase at some point in the future?
  • How will rel="sponsored" and rel="ugc" change the way we write HTML on a daily basis? When coding external links, besides the target="_blank" and rel="noopener" combo, we now have to consider the rel="sponsored" and rel="ugc" combinations as well.
  • Will we ever learn to always add alt attribute values for images that have a purpose beyond decoration?
  • How many more meta tags or attributes will we have to add to a web page to please the search engines? Do we really need the newly announced data-nosnippet HTML attribute? What’s next, data-allowsnippet?
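As a concrete sketch of those last points, here is how the new rel values and attributes combine in everyday markup (link targets and copy are hypothetical):

```html
<!-- A paid or affiliate link: rel values can be combined -->
<a href="https://example.com/partner" target="_blank" rel="noopener sponsored">Partner offer</a>

<!-- A link submitted in a comment or forum post -->
<a href="https://example.com/their-site" target="_blank" rel="noopener ugc">Commenter site</a>

<!-- A non-decorative image with a meaningful alt value -->
<img src="usage-chart.png" alt="Bar chart of meta tag usage across 8 million pages">

<!-- data-nosnippet keeps the marked text out of Google's snippets -->
<p>Public intro text. <span data-nosnippet>Details we would rather keep out of search snippets.</span></p>
```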

There are other things we would have liked to address as well, like “time-to-first-byte” (TTFB) values, which correlate highly with ranking; I’d highly recommend HTTP Archive for that. They periodically crawl the top sites on the web and record detailed information about almost everything. According to the latest info, they’ve analyzed 4,565,694 unique websites, with complete Lighthouse scores, and they record the particular technologies in use, like jQuery or WordPress, for the whole data set. Huge props to Rick Viscomi, who does an amazing job as its “steward,” as he likes to call himself.

Performing this large-scale study was a fun ride. We learned a lot and we hope you found the above numbers as interesting as we did. If there is a tag or attribute in particular you would like to see the numbers for, please let me know in the comments below.

Once again, check out the full HTML study results and let me know what you think!



[ad_2]

Source link

6 Ways to Get More Organic Traffic, Without Ranking Your Website

[ad_1]

ryanwashere

A few years ago, I wrote a post here that caught some attention in the community.

I argued Google appears to be ranking websites heavily based on searcher intent — this is more true now than ever.

In fact, for some queries it might be algorithmically impossible to get your website on top of the SERPs.

If you find your website in this position, don’t give up on SEO!

The point of “Search Engine Optimization” is to get organic exposure through search engines — it doesn’t necessarily have to be your website.

We can leverage the ranking authority of other websites to pass organic referral traffic to our sites.

I’m going to give you six situations when you should NOT try to rank your own website.

Prefer to watch / listen? I outlined all these points as a part of a recent keynote: https://youtu.be/mMvIty5W93Y

1. When the keywords are just TOO competitive

We’ve all been there: trying to rank a website with no authority for highly competitive keywords.

These keywords are competitive because they’re valuable, so we can’t give up on them.

Here are a few workarounds I’ve used in the past.

Tactic 1: Offer to sponsor the content

Ardent sells a product that “decarboxylates” cannabis for medicinal users.

There’s a ton of challenges selling this product, mostly because patients don’t know what “decarboxylation” means.

So, naturally, ranking for the keyword “what is decarboxylation” is a critical step in their customer’s path to conversion. Problem is, that keyword is dominated by authoritative, niche relevant sites.

While Ardent should still build and optimize content around the subject, it might take years to rank.

When you’re trying to build a business, that’s not good enough.

We decided to reach out to those authoritative sites offering to “sponsor” one of their posts.

In this case, it worked exceptionally well — we negotiated a monthly rate ($250) to tag content with a CTA and link back to Ardent’s site.

Granted, this doesn’t work in every niche. If you operate in one of those spaces, there’s another option.

Tactic 2: Guest post on their site

Guest writing for Moz in 2015 put my agency on the map.

Publishing on powerful sites quickly expands your reach and lends credibility to your brand (good links, too).

More importantly, it gives you instant ranking power for competitive keywords.

As co-owner of an SEO agency, it would be amazing to rank in Google for “SEO services,” right?

[Image: Google search results for “SEO services”]

Even with an authoritative site, it’s difficult to rank your site for the search “SEO service” nationally. You can leverage the authority of industry sites to rank for these competitive searches.

The post I wrote for Moz back in 2015 ranks for some very competitive keywords (admittedly, this was unintentional).

This post continues to drive free leads, in perpetuity.

[Image: analytics showing referral traffic from the Moz post]

When we know a client has to get visibility for a given keyword but the SERPs won’t budge, our agency builds guest posting into our client’s content strategies.

It’s an effective tactic that can deliver big results when executed properly.

2. When you can hijack “brand alternative” keywords

When you’re competing for SERP visibility with a large brand, SEO is an uphill battle.

Let’s look at a couple tactics if you find yourself in this situation.

Tactic #1: How to compete against HubSpot

HubSpot is a giant on the internet — they dominate the SERPs.

Being that large can have drawbacks, including people searching Google for “HubSpot alternatives.” If you’re a competitor, you can’t afford to miss out on these keywords.

“Listicle” style articles dominate for these keywords, as they provide the best “type” of result for a searcher with that intent.

These articles rank on top for a lot of keywords that are valuable to competitors.

As a competitor, you’ll want to see if you can get included in this post (and others). By contacting the author with a pitch, we can create an organic opportunity for ourselves.

This pitch generally has a low success rate. The author needs to feel motivated to add you to the article, so your pitch needs to contain a value proposition that can move them to action.

A few tips:

  • Find the author’s social profiles and add them. Then retweet, share, and like their content to give them a boost
  • Offer to share the article with your social profiles or email list if they include you in it
  • Offer to write the section for inclusion to save them time

While the success rate isn’t great, the payoff is worth the effort.

Tactic #2: Taking advantage of store closures

Teavana is an international tea retailer with millions of advocates (over 200k searches per month in Google).

Just a few months ago, Starbucks decided to close all Teavana stores. With news of Teavana shutting down, fans of the brand would inevitably search for “Teavana replacements” to find a new company to buy similar tea from.

Teami is a small tea brand that sells a number of SKUs very similar to what Teavana carried. Getting in front of those searches would provide tremendous value to their business.

At that moment, we could do two things:

  1. Try to rank a page on Teami’s site for “Teavana replacement”
  2. Get it listed on an authority website in a roundup with other alternatives

If you asked most SEO experts what to do, they’d probably go for the first option. But we went with the second – getting Teami listed in a roundup post.

If we ranked a Teami page as a Teavana replacement — which we could do — people would check the site and see that we sell tea, but they wouldn’t take us seriously as a Teavana replacement because they didn’t trust us yet.

How to pull it off for your business

Find a writer who covers these topics on authoritative sites. You may need to search for broader keywords and look at articles from authoritative, magazine-like websites.

Check the author of the article, find their contact info, and send them a pitch.

We were able to get our client (Teami Blends) listed as the number-two spot in the article, providing a ton of referral traffic to the website.

3. When you want to rank for “best” keywords

When someone is using “best” keywords (e.g., “best gyms in NYC”), the SERPs are telling us the searcher doesn’t want to visit a gym’s website.

The SERPs are dominated by “roundup” articles from media sources — these are a far better result to satisfy the searcher’s intent.

That doesn’t mean we can’t benefit from “best keywords.” Let’s look at a few tactics.

Tactic #1: Capture searchers looking for “best” keywords

Let’s say you come to Miami for a long weekend.

You’ll likely search for “best coffee shops in Miami” to get a feel for where to dine while here.

If you own a coffee shop in Miami, that’s a difficult keyword to rank for – the SERPs are stacked against you.

A few years back we worked with a Miami-based coffee shop chain, Dr Smood, who faced this exact challenge.

Trying to jam their website in the SERPs would be a waste of resources. Instead, we focused on getting featured in press outlets for “best of Miami” articles.

[Image: “best of Miami” press features]

How can you do it?

Find existing articles (ranking for your target “best of” keywords) and pitch for inclusion. You can offer incentives like free meals, discounts, etc. in exchange for inclusion.

You’ll also want to pitch journalists for future inclusion in articles. Scan your target publication for relevant journalists and send an opening pitch:

Hey [NAME],

My name is [YOUR NAME]. Our agency manages the marketing for [CLIENT].

We’ve got a new menu that we think would be a great fit for your column. We’d love to host you in our Wynwood location to sample the tasting menu.

If interested, please let me know a date / time that works for you!

We pitched dozens of journalists on local publications for Dr Smood.

[Image: a journalist’s author page on a local publication]

It resulted in a handful of high-impact features.


Work with food service businesses? I have more creative marketing tips for restaurants here.

Tactic #2: If you have a SaaS / training company

Let’s say you work for an online training company that helps agencies improve their processes and service output.

There’s hundreds of articles reviewing “best SEO training” that would be a killer feature for your business.

Getting featured here isn’t as hard as you might think — you just have to understand how to write value propositions into your pitch.

Part of that is taking the time to review your prospect and determine what might interest them:

  • Helping get traffic to their site?
  • Discounts / free access to your product?
  • Paying them…?

Here’s a few I came up with when pitching on behalf of The Blueprint Training.

Hey [NAME],

My name is [YOUR NAME]…nice to meet you.

I’ll get to the point – I just read your article on “Best SEO Trainings” on the [BLOG NAME] blog, and I’d love consideration to be included.

I recently launched a platform called The Blueprint Training – I think it’s a perfect fit for your article.

Now, I realize how much work it is to go back in and edit an article, so I’m willing to do all of the following:

– Write the section for you, in the same format as on the site

– Promote the article via my Twitter account (I get GREAT engagement)
– Give you complimentary access to the platform to see the quality for yourself

Let me know what you think and if there’s anything else I can do for you.

Enjoy your weekend!

If you can understand value propositioning, you’ll have a lot of success with this tactic.

4. When you need to spread your local footprint

Piggybacking off the previous example, when performing keyword research we found Google displayed completely different SERPs for keywords that all described what Dr Smood offered.

  • Miami organic cafe
  • Miami coffee shop
  • Miami juice bar

The algorithm is telling us each of these keywords is different — it would be extremely difficult to rank the client’s website for all three.

However, we can use other owned properties to go after the additional keywords in conjunction with our website.

Properties like Yelp allow you to edit titles and optimize your listing just like you would your website.

We can essentially perform “on page” SEO for these properties and get them to rank for valuable keyword searches.

The structure we took with Dr Smood was as follows:

When doing this for your business, be sure to identify all the keyword opportunities available and pay attention to how the SERPs react for each.

Understand which citation pages (Yelp, MenuPages, etc.) you can rank instead of your website for local searches, and optimize them as you would your own site.

5. When you need to boost e-commerce sales

The SERPs for e-commerce stores are brutally competitive. Not only do you have to compete with massive brands / retailers, but also sites like Amazon and Etsy.

Look, I get it — selling on Amazon isn’t that simple. There’s a ton of regulations and fees that come with the platform.

But these regulations are what’s keeping a lot of larger brands from selling there, which means there’s an opportunity.

Amazon accounts for 40% of online retail in the US (and growing rapidly). Not only can your Amazon listings rank in Google searches, but 90% of sales on the platform come from internal Amazon searches.

In other words, Amazon is its own marketing engine.

While you might take a haircut on your initial sales, you can use Amazon as a customer acquisition channel and optimize the lifetime value to recoup your lost upfront sales.

Here’s how we did it for a small e-commerce client.

Tactic: Radha Beauty Oil

Radha Beauty sells a range of natural oils for skin, hair and general health. Our keyword research found that Amazon listings dominated most of their target keywords.

With clients like this, we make sure to track SERP result type to properly understand what Google wants to rank for target keywords.

Specifically, Amazon listings had the following SERP share:

  • First result = 27.3%
  • Second result = 40.9%
  • Third result = 35.9%

Fortunately, this client was already selling on Amazon. Unfortunately, they had a limited budget. We didn’t have the hours in our retainer to optimize both their e-commerce store and their Amazon store.

This data gave us the firepower to have a conversation with the client that our time would drive more revenue optimizing their Amazon store over their e-commerce platform.

We focused our efforts on optimizing their Amazon listings just like we would an e-commerce store:

  • Amazon product titles
  • Amazon descriptions
  • Generating reviews from past customers
  • Building links to Amazon store pages

The results were overwhelmingly positive.

If you’re a newer e-commerce brand, an Amazon store gives you the opportunity to outrank giants like Ulta in Google.

6. When the SERPs call for video

Predator Nutrition is an e-commerce site that sells health and fitness supplements. They have their own private label products, but they’re mainly a retailer (meaning they sell other brands as well).

While performing keyword research for them, we found a ton of search volume around people looking for reviews of products they sold.

[Image: “review” keyword research showing YouTube results in the SERPs]

The SERPs clearly show that searchers prefer to watch videos for “review” searches.

There are a couple ways you can capture these searches:

  1. Create videos for your YouTube channel reviewing products
  2. Find and pay an influencer to review products for you

I prefer method #2, as reviews on third-party channels rank better — especially if you’re targeting YouTubers with a large following.

Not only are you adding more branded content in the SERPs, but you’re getting your products reviewed for targeted audiences.

Final thoughts…

This industry tends to romanticize SEO as a traffic source.

Don’t get me wrong, I love how passionate our community is, but… we have to stop.

We’re trying to build businesses. We can’t fall in love with a single source of traffic (and turn our backs to others).

The internet is constantly changing. We need to adapt along with it.

What do you think?

[ad_2]

Source link

How to Onboard Clients with Immersion Workshops – Whiteboard Friday

[ad_1]

HeatherPhysioc

Spending quality time getting to know your client, their goals and capabilities, and getting them familiar with their team sets you up for a better client-agency relationship. Immersion workshops are the answer. Learn more about how to build a strong foundation with your clients in this week’s Whiteboard Friday presented by Heather Physioc.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, everybody, and welcome back to Whiteboard Friday. My name is Heather Physioc, and I’m Group Director of Discoverability at VMLY&R. So I learned that when you onboard clients properly, the rest of the relationship goes a lot smoother.

Through some hard knocks and bumps along the way, we’ve come up with this immersion workshop model that I want to share with you. So I actually conducted a survey of the search industry and found that we tend to onboard clients inconsistently from one to the next if we bother to do a proper onboarding with them at all. So to combat that problem, let’s talk through the immersion workshop.

Why do an immersion workshop with a client?

So why bother taking the time to pause, slow down, and do an immersion workshop with a client? 

1. Get knowledgeable fast

Well, first, it allows you to get a lot more knowledgeable about your client and their business a lot faster than you would if you were picking it up piecemeal over the first year of your partnership. 

2. Opens dialogue

Next it opens a dialogue from day one.

It creates the expectation that you will have a conversation and that the client is expected to participate in that process with you. 

3. Build relationships

You want to build a relationship where you know that you can communicate effectively with one another. It also starts to build relationships, not only with your immediate, day-to-day client contact, but also with people like their bosses and their peers inside their organization, who can be either blockers or advocates for the search work that your client is going to try to implement.

4. Align on purpose, roadmap, and measuring success

Naturally the immersion workshop is also a crucial time for you to align with your client on the purpose of your search program, to define the roadmap for how you’re going to deliver on that search program and agree on how you’re going to measure success, because if they’re measuring success one way and you’re measuring success a different way, you could end up at completely different places.

5. Understand the DNA of the brand

Ultimately, the purpose of a joint immersion workshop is to truly understand the DNA of the brand, what makes them tick, who are their customers, why should they care what this brand has to offer, which helps you, as a search professional, understand how you can help them and their clients. 

Setting

Do it live! (Or use video chats)

So the setting for this immersion workshop ideally should be live, in-person, face-to-face, same room, same time, same place, same mission.

But worst case scenario, if for some reason that’s not possible, you can also pull this off with video chats, but at least you’re getting that face-to-face communication. There’s going to be a lot of back-and-forth dialogue, so that’s really, really important. It’s also important to building the empathy, communication, and trust between people. Seeing each other’s faces makes a big difference. 

Over 1–3 days

Now the ideal setting for the immersion workshop is two days, in my opinion, so you can get a lot accomplished.

It’s a rigorous two days. But if you need to streamline it for smaller brands, you can totally pull it off with one. Or if you have the luxury of stretching it out and getting more time with them to continue building that relationship and digging deeper, by all means stretch it to three days. 

Customize the agenda

Finally, you should work with the client to customize the agenda. So I like to send them a base template of an immersion workshop agenda with sessions that I know are going to be important to my search work.

But I work side-by-side with that client to customize sessions that are going to be the right fit for their business and their needs. So right away we’ve got their buy-in to the workshop, because they have skin in the game. They know which departments are going to be tricky. They know what objectives they have in their heads. So this is your first point of communication to make this successful.

Types of sessions

So what types of sessions do we want to have in our immersion workshop? 

Vision

The first one is a vision session, and this is actually one that I ask the clients to bring to us. So we slot about 90 minutes for the client to give us a presentation on their brand, their overarching strategy for the year, their marketing strategy for the year.

We want to hear about their goals, revenue targets, objectives, problems they’re trying to solve, threats they see to the business. Whatever is on their mind or keeps them up at night or whatever they’re really excited about, that’s what we want to hear. This vision workshop sets the tone for the entire rest of the workshop and the partnership. 

Stakeholder

Next we want to have stakeholder sessions.

We usually do these on day one. We’re staying pretty high level on day one. So these will be with other departments that are going to integrate with search. So that could be the head of marketing, for example, like a CMO. It could be the sales team. If they have certain sales objectives they’re trying to hit, that would be really great for a search team to know. Or it could be global regions.

Maybe Latin America and Europe have different priorities. So we may want to understand how the brand works on the global scale as opposed to just at HQ. 

Practitioner

On day two is when we start to get a little bit more in the weeds, and we call these our practitioner sessions. So we want to work with our day-to-day SEO contacts inside the organization. But we also set up sessions with people like paid search if they need to integrate their search efforts.

We might set up time with analytics. So this will be where we demo our standard SEO reporting dashboards and then we work with the client to customize it for their needs. This is a time where we find out who they’re reporting up to and what kinds of metrics they’re measured on to determine success. We talk about the goals and conversions they’re measuring, how they’re captured, why they’re tracking those goals, and their existing baseline of performance information.

We also set up time with developers. Technology is essential to actually implementing our SEO recommendations. So we set up time with them to learn about their workflows and their decision-making process. I want to know if they have resource constraints or what makes a good project ticket in Jira to get our work done. Great time to start bonding with them and give them a say in how we execute search.

We also want to meet with content teams. Now content tends to be one of the trickiest areas for our clients. They don’t always have the resources, or maybe the search scope didn’t include content from day one. So we want to bring in whoever the content decision-makers or creators are. We want to understand how they think, their workflows and processes. Are they currently creating search-driven content, or is this going to be a shift in mentality?

So a lot of times we get together and talk about process, editorial calendaring, brand tone and voice, whatever it takes to get content done for search.

Summary and next steps

So after all of these, we always close with a summary and next steps discussion. So we work together to think about all the things that we’ve accomplished during this workshop and what our big takeaways and learnings are, and we take this time to align with our client on next steps.

When we leave that room, everybody should know exactly what they’re responsible for. Very powerful. You want to send a recap after the fact saying, “Here’s what we learned and here’s what we understand the next steps to be. Are we all aligned?” Heads nod. Great. 

Tools to use

So a couple of tools that we’ve created and we’ll make sure to link to these below.

Download all the tools

Onboarding checklist

We’ve created a standard onboarding checklist. The thing about search is when we’re onboarding a new client, we pretty commonly need the same things from one client to the next. We want to know things about their history with SEO. We need access and logins. Or maybe we need a list of their competitors. Whatever the case is, this is a completely repeatable process. So there’s no excuse for reinventing the wheel every single time.

So this standard onboarding checklist allows us to send this list over to the client so they can get started and get all the pieces in place that we need to be successful. It’s like mise en place when you’re cooking. 

Discussion guides

We’ve also created some really helpful session discussion guides. So we give our clients a little homework before these sessions to start thinking about their business in a different way.

We’ll ask them open-ended questions like: What kinds of problems are your business unit solving this year? Or what is one of the biggest obstacles that you’ve had to overcome? Or what’s some work that you’re really proud of? So we send that in advance of the workshop. Then in our business unit discussions, which are part of the stakeholder discussions, we’ll actually use a few of the questions from that discussion guide to start seeding the conversation.

But we don’t just go down the list of questions, checking them off one by one. We just start the conversation with a couple of them and then follow it organically wherever it takes us, open-ended, follow-up, and clarifying questions, because the conversations we are having in that room with our clients are far more powerful than any information you’re going to get from an email that you just threw over the fence.

Sticky note exercise

We also do a pretty awesome little sticky note exercise. It’s really simple. So we pass out sticky notes to all the stakeholders that have attended the sessions, and we ask two simple questions. 

  1. One, what would cause this program to succeed? What are all the factors that can make this work? 
  2. We also ask what will cause it to fail.

Before you know it, the client has revealed, in their own words, what their internal obstacles and blockers will be. What are the things that they’ve run into in the past that have made their search program struggle? By having that simple exercise, it gets everybody in the mind frame of what their role is in making this program a success. 

Search maturity assessment

The last tool, and this one is pretty awesome, is an assessment of the client’s organic search maturity.

Now this is not about how good they are at SEO. This is how well they incorporate SEO into their organization. Now we’ve actually done a separate Whiteboard Friday on the maturity assessment and how to implement that. So make sure to check that out. But a quick overview. So we have a survey that addresses five key areas of a client’s ability to integrate search with their organization.

  • It’s stuff like people. Do they have the right resources? 
  • Process. Do they have a process? Is it documented? Is it improving? 
  • Capacity. Do they have enough budget to actually make search possible? 
  • Knowledge. Are they knowledgeable about search, and are they committed to learning more? Stuff like that.

So we’ve actually created a five-part survey that has a number of different questions for the client to answer. We try to get as many people on the client side to answer these questions as we can. Then we take the numerical answers and the open-ended answers and compile them into a maturity assessment for the brand after the workshop.

So we use that workshop time to actually execute the survey, and we have something that we can bring back to the client not long after to give them a picture of where they stand today and where we’re going to take them in the future and what the biggest obstacles are that we need to overcome to get them there. 

So this is my guide to creating an immersion workshop for your new clients. Be sure to check out the Whiteboard Friday on the maturity assessment as well.

We’d love to hear what you do to onboard your clients in the comments below. Thanks and we’ll see you on the next Whiteboard Friday.

Video transcription by Speechpad.com


Heather shared even more strong team-building goodness in her MozCon 2019 talk. Get access to her session and more in our newly released video bundle, plus access 26 additional future-focused SEO topics from our top-notch speakers:

Grab the sessions now!

Make sure to schedule a learning sesh with the whole team and maximize your investment in SEO education!

[ad_2]

Source link

E-A-T and the Quality Raters’ Guidelines – Whiteboard Friday

[ad_1]

MarieHaynes

EAT — also known as Expertise, Authoritativeness, and Trustworthiness — is a big deal when it comes to Google’s algorithms. But what exactly does this acronym entail, and why does it matter to your everyday work? In this bite-sized version of her full MozCon 2019 presentation, Marie Haynes describes exactly what E-A-T means and how it could have a make-or-break effect on your site.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. My name is Marie Haynes, from Marie Haynes Consulting, and I’m going to talk to you today about EAT and the Quality Raters’ Guidelines. By now, you’ve probably heard of EAT. It’s a bit of a buzzword in SEO. I’m going to share with you why EAT is a big part of Google’s algorithms, how we can take advantage of this news, and also why it’s really, really important to all of us.

The Quality Raters’ Guidelines

Let’s talk about the Quality Raters’ Guidelines. These guidelines are a document that Google has provided to this whole army of quality raters. There are apparently 16,000 quality raters, and what they do is they use this document, the Quality Raters’ Guidelines, to determine whether websites are high quality or not.

Now the quality raters do not have the power to put a penalty on your website. They actually have no direct bearing on rankings. But instead, what happens is they feed information back to Google’s engineers, and Google’s engineers can take that information and determine whether their algorithms are doing what they want them to do. Ben Gomes, the Vice President of Search at Google, had a quote recently in an interview with CNBC, saying that the information in the Quality Raters’ Guidelines is fundamentally what Google wants the algorithm to do.

“They fundamentally show us what the algorithm should do.”
– Ben Gomes, VP Search, Google

So we believe that if something is in the Quality Raters’ Guidelines, either Google is already measuring this algorithmically, or they want to be measuring it, and so we should be paying close attention to everything that is in there. 

How Google fights disinformation

There was a guide produced by Google earlier this year, in February of 2019, and it was a whole guide on how they fight disinformation, how they fight fake news, and how they make it so that high-quality results appear in the search results.

There were a couple of things in here that were really interesting. 

1. Information from the quality raters allows them to build algorithms

The guide talked about the fact that Google takes the information from the quality raters and uses it to build algorithms. So we know that the things the quality raters are assessing are things we should probably be paying attention to as well.

2. Ranking systems are designed to ID sites with high expertise, authoritativeness, and trustworthiness

The thing that was the most important to me, or the most interesting at least, is the line that said their ranking systems are designed to identify sites with a high indicia of EAT: expertise, authoritativeness, and trustworthiness.

So whether or not we want to argue about whether EAT is a ranking factor, I think that’s semantics over what the term “ranking factor” means. What we really need to know is that EAT is really important in Google’s algorithms. “Your money or your life” (YMYL) really means a page that helps people make a decision in their lives or helps people part with money. We believe that if you’re trying to rank for any term like that, any term that really matters to people, then you need to pay attention to EAT, because Google doesn’t want to rank websites for important queries if those sites are lacking EAT.

The three parts of E-A-T

So it’s important to know that EAT has three parts, and a lot of people get hung up on just expertise. I see a lot of people come to me and say, “But I’m a doctor, and I don’t rank well.” Well, there are more parts to EAT than just expertise, and so we’re going to talk about that. 

1. Expertise

But expertise is very important. If you have not read the Quality Raters’ Guidelines yet, you really, really should read the document.

It’s a little bit long, but it’s full of so much good information. The raters are given examples of websites, and they’re told, “This is a high-quality website. This is a low-quality website because of this.” One of the things they say about one of the example pages is that it is to be considered low quality because the expertise of the author is not clearly communicated.

Add author bios

So the first clue we can gather from this is that for all of our authors we should have an author bio. Perhaps if you are a nationally recognized brand, then you may not need author bios. But for the rest of us, we really should be putting an author bio that says here’s who wrote this post, and here’s why they’re qualified to do so.
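One hedged way to make that bio machine-readable, in addition to the visible text, is schema.org author markup. The talk doesn’t prescribe any particular markup, so the name and URLs below are purely illustrative:

```html
<!-- Illustrative sketch only: a visible author bio plus schema.org markup.
     "Dr. Jane Example" and the URLs are hypothetical placeholders. -->
<div class="author-bio">
  Written by Dr. Jane Example, a board-certified cardiologist with 20 years
  of clinical experience.
</div>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "author": {
    "@type": "Person",
    "name": "Dr. Jane Example",
    "jobTitle": "Cardiologist",
    "url": "https://example.com/authors/jane-example"
  }
}
</script>
```

The visible bio is what readers (and the quality raters) actually see; the structured data simply restates the same qualification in a machine-readable form.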

Another example in the Quality Raters’ Guidelines was a post about the flu. The quality raters were told that there is no evidence that this author has medical expertise. There are other examples where the problem is no evidence of financial expertise, and legal expertise is another one. Think about it.

If you were diagnosed with a medical condition, would you want to be reading an article that’s written by a content writer who’s done good research? It might be very well written. Or would you rather see an article that is written by somebody who has been practicing in this area for decades and has seen every type of side effect that you can have from medications and things like that?

Hire experts to fact-check your content

Obviously, the doctor is who you want to read. Now I don’t expect us all to go and hire doctors to write all of our content, because very few doctors have time to do that, and the same goes for the experts in any other YMYL profession. But what you can do is hire these people to fact-check your posts. We’ve had some clients see really nice results from having content writers write the posts in a well-researched and referenced way, and then hiring physicians to say, “This post was medically fact-checked by Dr. So-and-so.” So this is really, really important for any type of site that wants to rank for a YMYL query.
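One possible way to declare that fact check in markup, assuming you use schema.org (this is my sketch, not something from the talk), is the `reviewedBy` and `lastReviewed` properties on the page. The reviewer name and date here are hypothetical:

```html
<!-- Hypothetical sketch: declaring a medical fact check with schema.org.
     The reviewer name and review date are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "MedicalWebPage",
  "reviewedBy": {
    "@type": "Person",
    "name": "Dr. So-and-so",
    "jobTitle": "Physician"
  },
  "lastReviewed": "2019-06-01"
}
</script>
```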

One of the things we started noticing, in February of 2017, was a number of sites coming to us with traffic drops. That’s mostly what we do: we deal with sites that were hit by Google algorithm updates. What we were noticing is that a weird thing was happening.

Prior to that, sites that were hit tended to have all sorts of technical issues, and we could say, “Yes, there’s a really strong reason why this site is not ranking well.” These sites, though, were for the most part technically sound. But what we noticed is that, in every instance, the posts that were now stealing the rankings they used to have were ones written by people with real-life expertise.

This is not something that you want to ignore. 

2. Authoritativeness

We’ll move on to authoritativeness. Authoritativeness is really very, very important, and in my opinion it is the most important part of EAT. There’s another reference in the Quality Raters’ Guidelines, this one about a good post, and it says, “The author of this blog post has been known as an expert on parenting issues.”

So it’s one thing to actually be an expert. It’s another thing to be recognized online as an expert, and what we should all be working on is having other people online recognize us or our clients as experts in our subject matter. That sounds a lot like link building, right? We want to get links from authoritative sites.

That guide on fighting disinformation actually tells us that PageRank and EAT are closely connected. So this is very, very important. I personally believe, though I can’t prove this just yet, that Google does not want to pass PageRank through sites that do not have EAT, at least for YMYL queries. This could explain why Google feels really comfortable ignoring spam links from negative SEO attacks: those links would come from sites that don’t have EAT.

Get recommendations from experts

So how do we do this? It’s all about getting recommendations from experts. In several places, the Quality Raters’ Guidelines instruct the raters to determine what other experts say about this website, this author, this brand. It’s very, very important that we get recommendations from experts. I want to challenge you right now to look at the last few links you have gotten for your website and ask, “Are these truly recommendations from other people in the industry I’m working in? Or are they ones that we made ourselves?”

In the past, pretty much every link we could make had the potential to help boost our rankings. Now, the links that Google wants to count are the ones that truly are people recommending your content, your business, your author. I did a Whiteboard Friday a couple of years ago that talked about the types of links Google might want to value, and that’s probably a good reference for figuring out how we can find these recommendations from experts.

How can we do link building in a way that boosts our authoritativeness in the eyes of Google? 

3. Trustworthiness

The last part, which a lot of people ignore, is trustworthiness. People ask, “How could Google ever measure whether a website is trustworthy?” I think it’s definitely possible. Google has a patent. Now, we know that just because there’s a patent, they’re not necessarily doing this.

Reputation via reviews, blog posts, & other online content

But they do have a patent that talks about how they can gather information about a brand, about an individual, about a website from looking at a corpus of reviews, blog posts, and other things that are online. What this patent talks about is looking at the sentiment of these blog posts. Now some people would argue that maybe sentiment is not a part of Google’s algorithms.

I do think it’s part of how they determine trustworthiness. So what we’re looking for here is whether a business really has a bad reputation: people online saying, “Look, I got scammed by this company,” or, “I couldn’t get a refund,” or, “I was treated really poorly in terms of customer service.” If there is a general negative sentiment about you online, that can affect your ability to rank well, and that’s very important. So all of these things matter in terms of trustworthiness.

Credible, clear contact info on website

You really should have very credible and clear contact information on your website. That’s outlined in the Quality Raters’ Guidelines. 

Indexable, easy-to-find info on refund policies

You should have information on your refund policy, assuming that you sell products, and it should be easy for people to find. All of this information I believe should be visible in Google’s index.

We shouldn’t be noindexing these pages. Don’t worry about the fact that they might be kind of thin or irrelevant or perhaps even duplicate content. Google wants to see this information, and so we want it to be available to their algorithms.
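As a minimal sketch of what staying indexable means in practice (the page title is hypothetical), the policy page’s head simply must not carry a noindex directive:

```html
<!-- Hypothetical refund-policy page head. Per the advice above, do NOT add
     <meta name="robots" content="noindex"> to pages like this. -->
<head>
  <title>Refund Policy | Example Store</title>
  <meta name="robots" content="index, follow">
</head>
```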

Scientific references & scientific consensus

Other things too, if you have a medical site or any type of site that can be supported with scientific references, it’s very important that you do that.

One of the things that we’ve been seeing with recent updates is a lot of medical sites are dropping when they’re not really in line with scientific consensus. This is a big one. If you run a site that has to do with natural medicine, this is probably a rough time for you, because Google has been demoting sites that talk about a lot of natural medicine treatments, and the reason for this, I think, is because a lot of these are not in line with the general scientific consensus.

Now, I know a lot of people would say, “Well, who is Google to determine whether essential oils are helpful or not, because I believe a lot of these natural treatments really do help people?” The problem though is that there are a lot of websites that are scamming people. So Google may even err on the side of caution in saying, “Look, we think this website could potentially impact the safety of users.”

You may have trouble ranking well. So if you have posts on natural medicine, or on any type of treatment outside the generally accepted scientific consensus, one thing you can do is try to show both sides of the story and talk about how conventional physicians would actually treat the condition.

That can be tricky. 

Ad experience

The other thing that can speak to trust is your ad experience. I think this is something that’s not actually in the algorithms just yet; I think it’s going to come, though perhaps it already is. But the Quality Raters’ Guidelines talk a lot about how, if you have ads that are distracting, that are disruptive, that block readers from seeing content, that can be a sign of low trustworthiness.

“If any of Expertise, Authoritativeness, or Trustworthiness is lacking, use the ‘low’ rating.”

I want to leave you with this last quote, again from the Quality Raters’ Guidelines, and this is significant. The raters are instructed that if any one of expertise, authoritativeness, or trustworthiness is lacking, then they are to rate a website as low quality. Again, that’s not going to penalize the website. But it’s going to tell the Google engineers, “Wait a second. We have these low-quality websites ranking for these terms. How can we tweak the algorithm so that doesn’t happen?”

But the important thing here is that if any one of these three things, the E, the A, or the T are lacking, it can impact your ability to rank well. So hopefully this has been helpful. I really hope that this helps you improve the quality of your websites. I would encourage you to leave a comment or a question below. I’m going to be hanging out in the comments section and answering all of your questions.

I have more information on these subjects at mariehaynes.com/eat and also /trust if you’re interested in these trust issues. So with that, I want to thank you. I really wish you the best of luck with your rankings, and please do leave a question for me below.

Video transcription by Speechpad.com


Feeling like you need a better understanding of E-A-T and the Quality Raters’ Guidelines? You can get even more info from Marie’s full MozCon 2019 talk in our newly released video bundle. Go even more in-depth on what drives rankings, plus access 26 additional future-focused SEO topics from our top-notch speakers:

Grab the sessions now!

Invest in a bag of popcorn and get your whole team on board to learn!
