2020 Local SEO Success: How to Feed, Fight, and Flip Google



Miriam Ellis

Image credit: Migaspinto

If you own or market a business location that makes a real-world community more serviceable, diverse, and strong, I’m on your side.

I love interesting towns and cities, with a wide array of useful goods and services. Nothing in my career satisfies me more than advising any brand that’s determined to improve life quality in some spot on the map. It does my heart good to see it, but here’s my completely unsentimental take on the challenges you face:

The Internet, and Google’s local platforms in particular, are a complete mess.

Google is the biggest house on the local block; you can’t ignore it. Yet, the entries into the platform are poorly lit, the open-source concept is cluttered with spam, and growing litigation makes one wonder if there are bats in the belfry.

Google comprises both risk and tremendous opportunity for local businesses and their marketers. Succeeding in 2020 means becoming a clear-eyed surveyor of any structural issues as well as seeing the “good bones” potential, so that you can flip dilapidation into dollars. And something beyond dollars, too: civic satisfaction.

Grab your tools and get your teammates and clients together to build local success in the new year by sharing my 3-level plan and 4-quarter strategy.

Level 1: Feed Google

Image credit: Mcapdevila

Information about your business is going to exist on the Internet whether you put it there or not.

Google’s house may be structurally unsound, but it’s also huge, with a 90% search engine market share globally and over 2 trillion searches per year, 46% of which are for something local.

Residents, new neighbors, and travelers seeking what you offer will almost certainly find something about your company online, whether it’s a stray mention on social media, an unclaimed local business listing generated by a platform or the public, or a full set of website pages and claimed listings you’ve actively published.

Right now, running the most successful local business possible means acquiring the largest share you can of those estimated 1 trillion annual local searches. How do you do this? 

By feeding Google:

  • Website content about your business location, products, services, and attributes
  • Corroborating info about your company on other websites
  • Local business listing content
  • Image content
  • Video content
  • Social media content

Remember, without your content and the content of others, Google does not exist. Local business owners can often feel uncomfortably dependent on Google, but it’s really Google who is dependent on them.

Whether the business you’re marketing is small or large, declare 2020 the year you go to the drafting board to render a clear blueprint for a content architecture that spans your entire neighborhood of the Internet, including your website and relevant third-party sites, platforms, and apps. Your plans might look something like this:

Image detailing the architecture of local SEO, including what you should put on GMB, website, and via 3rd parties (all detailed in text below)

I recommend organizing your plan like this, making use of the links I’m including:

  1. Begin with a rock-solid foundation of business information on your website. Tell customers everything they could want to know to choose and transact with your business. Cover every location, service, product, and desirable attribute of your company. There’s no chance you won’t have enough to write about when you take into account everything your customers ask you on a daily basis + everything you believe makes your company the best choice in the local market. Be sure the site loads fast, is mobile-friendly, and as technically error-free as possible.
  2. Create a complete, accurate, guideline-abiding Google My Business listing for each location of your business.
  3. Build out your listings (aka structured citations) on the major platforms. Automate the work of both developing and monitoring them for sentiment and change via a product like Moz Local.
  4. Monitor and respond to all reviews as quickly as possible on all platforms. They constitute your online reputation and are, perhaps, the most important content about your business on the Internet. Know that reviews are a two-way conversation and learn to inspire customers to edit negative reviews. Moz Local automates review monitoring and facilitates easy responses. If you need help earning reviews, check out Alpine Software Group’s two good products: GatherUp and Grade.Us.
  5. Audit your competition. In competitive markets, come check out our beta of Local Market Analytics for a multi-sampled understanding of who your competitors actually are for each location of your business, depending on searcher locale.
  6. Once you’ve found your competitors, audit them to understand the:
    1. quality, authority, and rate of ongoing publication you need to surpass
    2. strength and number of linked unstructured citations you need to build
    3. number and quality of Google Posts, videos, products, and other content you need to publish
    4. social engagement you need to create
  7. As to the substance of your content, focus directly on your customers’ needs. Local Market Analytics is breaking ground in delivering actual local keyword volumes, and the end point of all of your research, whether via keyword tools, consumer surveys, or years of business experience, should be content that acts as customer service, turning seekers into shoppers.
  8. Use any leftover time to sketch in the finer details. For example, I’m less excited about schema for 2020 than I was in 2019 because of Google removing some of the benefits of review schema. Local business schema is still a good idea, though, if you have time for it. Meanwhile, pursuing relevant featured snippets could certainly be smart in the new year. I’d go strong on video this year, particularly YouTube, if there’s applicability and demand in your market.
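If you do get to local business schema in step 8, a small JSON-LD block can go a long way. Here’s a minimal sketch; every detail below is a hypothetical placeholder business, so swap in your own verified information:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bookstore",
  "url": "https://www.example.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "94000"
  },
  "openingHours": "Mo-Sa 09:00-18:00"
}
</script>
```

Keep the markup consistent with what’s published on the page and in your Google My Business listing; mismatches undermine the trust signals you’re trying to build.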

The customer is the focus of everything you publish. Google is simply the conduit. Your content efforts may need to be modest or major to win the greatest possible share of the searches that matter to you. It depends entirely on the level of competition in your markets. Find that level, know your customers, and commit to feeding Google a steady, balanced diet of what they say they want so that it can be conveyed to the people you want to serve.

Level 2: Fight Google

Image credit: Scott Lewis

Let’s keep it real: ethical local companies which pride themselves on playing fair have good reason to be dubious about doing business with Google. Once you’ve put in the effort to feed Google all the right info to begin competing for rankings, you may well find yourself having to do online battle on an ongoing basis.

There are two fronts on which many people end up grappling with Google:

  • Problematic aspects within products
  • Litigation and protests against the brand

Let’s break these down to prepare you:

Product issues

Google has taken on the scale of a public utility — one that’s replaced most of North America’s former reliance on telephone directories and directory assistance numbers.

Google has 5 main local interfaces: local packs, local finders, desktop maps, mobile maps, and the Google Maps app. It’s been the company’s decision to allow these utilities to become polluted with misinformation in the form of listing and review spam and irrelevant or harmful user-generated content. Google does remove spam, but not at the scale of the problem, which is so large that global networks of spammers have sprung up to profit from the lack of quality control and failure to enforce product guidelines.

When you are marketing a local business, there’s a strong chance you will face one or more of the following issues while attempting to compete in Google’s local products:

  • Being outranked by businesses violating Google’s own guidelines with practices such as keyword-stuffed business titles and creating listings to represent non-existent locations or lead-gen companies. (Example)
  • Being the target of listing hijacking in which another company overtakes some aspect of your listing to populate it with their own details. (Example)
  • Being the target of a reputation attack by competitors or members of the public posting fake negative reviews of your business. (Example)
  • Being the target of negative images uploaded to your listing by competitors or the public. (Example)
  • Having Google display third-party lead-gen information on your listings, driving business away from you to others. (Example)
  • Having Google randomly experiment with local features with direct negative impacts on you, such as booking functions that reserve tables for your patrons without informing your business. (Example)
  • Being unable to access adequately trained Google staff or achieve timely resolution when things go wrong. (Example)

These issues have real-world impacts. I’ve seen them misdirect and scam countless consumers, including people with urgent medical and mental health needs; kill companies’ profits during holiday shopping seasons; cause owners so much loss that they’ve had to lay off staff; and even drive small brands out of business.

Honest local business owners don’t operate this way. They don’t make money off of fooling the public, maliciously attack neighboring shops, or give the cold shoulder to people in trouble. Only Google’s underregulated monopoly status has allowed the company to stay in business while conducting its affairs this way.

Outlook issues

Brilliant people work for Google and some of their innovations are truly visionary. But the Google brand, as a whole, can be troubling to anyone firmly tied to the idea of ethical business practices. I would best describe the future of Google, in its present underregulated state of monopoly, as uncertain.

In its very short history, Google has been the subject of numerous controversies, protests, and legal challenges.

I can’t predict where all this is headed. What I do know is that nearly every local business I’ve ever consulted with has been overwhelmingly reliant on Google for profits. Whether you personally favor strong regulation or not, I recommend that every local business owner and marketer keep apprised of the increasing calls by governing bodies, organizations, and even the company’s own staff to break Google up, tax it, end contracts on the basis of human rights, and prosecute it over privacy, antitrust, and a host of other concerns.

Pick your battles

With Google so deeply embedded in your company’s online visibility, traffic, reputation and transactions, concerns with the brand and products don’t exist in some far-off place; they are right on your own doorstep. Here’s how to fight well:

1. Fight the spam

To face off with Google’s local spam, earn/defend the rankings your business needs, and help clean polluted SERPs up for the communities you serve, here are my best links for you:

2. Stay informed

If you’re ready to move beyond your local premises to the larger, ongoing ethical debate surrounding Google, here are my best links for you:

Whether your degree of engagement goes no further than local business listings or extends to your community, state, nation, or the world, I recommend increased awareness of the whole picture of Google in 2020. Education is power.

Level 3: Flip Google

Image credit: Province of British Columbia

You’ve fed Google. You’ve fought Google. Now, I want you to flip this whole scenario to your advantage.

My 2020 local SEO blueprint has you working hard for every customer you win from the Internet. So far, the ball has been almost entirely in Google’s court, but when all of this effort culminates in a face-to-face meeting with another human being, we are finally at your party under your roof, where you have all the control. This is where you turn Internet-driven customers into in-store keepers.

I encourage you to make 2020 the year you draft a strategy for making a larger portion of your sales as Google-independent as possible, flipping their risky edifice into su casa, built of sturdy bricks like community, pride, service, and loyalty.

How can you do this? Here’s a four-quarter plan you can customize to fit your exact business scenario:

Q1: Listen & learn

Image credit: Chris Kiernan, Small Business Saturday

The foundation of all business success is giving the customer exactly what they want. Hoping and guessing are no substitute for a survey of your actual customers.

If you already have an email database, great. If not, you could start collecting one in Q1 and run your survey at the end of the quarter when you have enough addresses. Alternatively, you could ask each customer if they would kindly take a very short printed survey while you ring up their purchase.

Imagine you’re marketing an independent bookstore. Such a survey might look like this, whittled down to just the data points you most want to gather from customers to make business decisions:

Have pens ready and a drop box for each customer to deposit their card. Make it as convenient and anonymous as possible, for the customer’s comfort.

In this survey and listening phase of the new year, I also recommend that you:

  1. Spend more time as the business owner speaking directly to your customers, really listening to their needs and complaints and then logging them in a spreadsheet. Speak with determination to discover how your business could help each customer more.
  2. Have all phone staff log the questions/requests/complaints they receive.
  3. Have all floor/field staff log the questions/requests/complaints they receive.
  4. Audit your entire online review corpus to identify dominant sentiment, both positive and negative.
  5. If the business you’re marketing is large and competitive, now is the time to go in for a full-fledged consumer analysis project with mobile surveys, customer personae, etc.

End of Q1 Goal: Know exactly what customers want so that they’ll come to us for repeat business without any reliance on Google.

Q2: Implement your ready welcome

Image credit: Small Business Week in BC

In this quarter, you’ll implement as many of the requests you’ve gleaned from Q1 as feasible. You’ll have put solutions in place to rectify any complaint themes, and will have upped your game wherever customers have called for it.

In addition to the fine details of your business, large or small, life as a local SEO has taught me that these six elements are basic requirements for local business longevity:

  1. A crystal-clear USP
  2. Consumer-centric policies
  3. Adequate, well-trained, personable staff
  4. An in-demand inventory of products/services
  5. Accessibility for complaint resolution
  6. Cleanliness/orderliness of premises/services

The lack of any of these six essentials results in negative experiences that can either cause the business to shed silent customers in person or erode online reputation to the point that the brand begins to fail.

With the bare minimums of customers’ requirements met, Q2 is where we get to the fun part. This is where you take your basic USP and add your special flourish to it that makes your brand unique, memorable, and desirable within the community you serve.

A short tale of two yarn shops in my neck of the woods: At shop A, the premises are dark and dusty. Customer projects are on display, but aren’t very inspiring. Staff sits at a table knitting, and doesn’t get up when customers enter. At shop B, the lighting and organization are inviting, displayed projects are mouthwatering, and though the staff here also sits at a table knitting, they leap up to meet, guide, and serve. Guess which shop now knows me by name? Guess which shop has staff so friendly that they have lent me their own knitting needles for a tough project? Guess which shop I gave a five-star review to? Guess where I’ve spent more money than I really should?

This quarter, seek vision for what going above-and-beyond would look like to your customers. What would bring them in again and again for years to come? Keep it in mind that computers are machines, but you and your staff are people serving people. Harness human connection.

End of Q2 Goal: Have implemented customers’ basic requests and gone beyond them to provide delightful human experiences Google cannot replicate.

Q3: Participate, educate, appreciate

Now you know your customers, are meeting their specified needs, and doing your best to become one of their favorite businesses. It’s time to walk out your front door into the greater community to see where you can make common cause with a neighborhood, town, or city, as a whole.

2020 is the year you become a joiner. Analyze all of the following sources at a local level:

  • Print and TV news
  • School newsletters and papers
  • Place of worship newsletters and bulletins
  • Local business organization newsletters
  • Any form of publication surrounding charity, non-profits, activism, and government

Create a list of the things your community worries about, cares about, and aspires to. For example, a city near me became deeply involved in a battle over putting an industrial plant in a wetland. Another town is fundraising for a no-kill animal shelter and a walk for Alzheimer’s. Another is hosting interfaith dinners between Christians and Muslims.

Pick the efforts that feel best to you and show up, donate, host, speak, sponsor, and support in any way you can. Build real relationships so that the customers coming through your door aren’t just the ones you sell to, but the ones you’ve manned a booth with on the 4th of July, attended a workshop with, or cheered with at their children’s soccer match. This is how community is made.

Once you’re participating in community life, it’s time to educate your customers about how supporting your business makes life better in the place they live (get a bunch of good stats on this here). Take the very best things that you do and promote awareness of them face-to-face with every person you transact with.

For my fictitious bookseller client, just 10 minutes spent on Canva (you have to try Canva!) helped me whip together this free flyer I could give to every customer, highlighting stats about how supporting independent businesses improves communities:

Example of a flyer to give to customers thanking them for shopping local

If you’re marketing a larger enterprise, a flyer like this could focus on green practices you’re implementing at scale, philanthropic endeavors, and positive community involvement.

Finally, with the holiday season fast approaching in the coming quarter, this is the time to let customers know how much you appreciate their business. Recently, I wrote about businesses turning kindness into a form of local currency. Brands are out there delivering surprise flowers and birthday cakes to customers, picking them up when they’re stranded on roadsides, washing town signage, and replacing “you will be towed” plaques with ones that read “you’re welcome to park here.” Loyalty programs, coupons, discounts, sales, free events, parties, freebies, and fun are all at your disposal to say “Thank you, please come again!” to your customers.

End of Q3 Goal: Have integrated more deeply into community life, motivated customers to choose our business for aspirational reasons beyond sales, and have offered memorable acts of gratitude for their business, completely independent of Google.

Q4: Share customers and sell

Screenshot of local business allies spreadsheet

Every year, local consumer surveys indicate that 80–90% of people trust online reviews as much as they trust recommendations from friends and family. But I’ve yet to see a survey poll how much people trust recommendations they receive from trustworthy business owners.

You spent all of Q3 becoming a true ally to your community, getting personally involved in the struggles and dreams of the people you serve. At this point, if you’ve done a good job, the people who make up your brand have come closer to deserving the word “friend” from customers. As we move into Q4, it’s time to deepen alliances — this time with related local businesses.

In the classic movie Miracle on 34th Street, the owners of Macy’s and Gimbel’s begin sending shoppers to one another when either business lacks what the customer wants. They even create catalogues of their competitors’ inventory to assist with these referrals. In Q3, I’m hoping you joined a local business alliance that’s begun to acquaint you with other brands featuring goods/services that relate to yours, so that you can begin dedicated outreach.

Q4, with Black Friday and Small Business Saturday, is traditionally the quarter in which local businesses expect to get out of the red. But how many more wedding cakes would you sell if all the caterers in town referred customers to you? How many more tires would you vend if the muffler shops sent their customers your way? How many more therapeutic massages might you book if every holistic medical center in your city confidently gave out your name?

Formalize B2B customer referrals in this quarter in seven easy steps:

  1. Create a spreadsheet headed with your contact information and an itemized list of the main goods, services, and brands you sell. Include specialties of your business. Create additional rows to be filled out with the information of other businesses.
  2. Create a list of every local business that could tie in with yours in any way for a customer’s needs.
  3. Invite the owners or qualified reps of each business on your list to a meeting at a neutral location, like a community center or restaurant.
  4. Bring your spreadsheet to the meeting.
  5. Discuss with your guests how a commitment to sharing customers will benefit all of you.
  6. If others commit, have them fill out their column of the spreadsheet. Share print and digital copies with all participants.
  7. Whenever a customer asks for something you don’t offer, refer to the spreadsheet to make a recommendation. Encourage your colleagues to do likewise, and to train staff to use the spreadsheet to increase customer sharing and satisfaction.

Make a copy of my free Local Business Allies spreadsheet!

Q4 Goal: Make this the best final quarter yet by sharing customers with local business allies, decreasing dependence on Google for referrals.

Embrace truth and dare to draw the line

Image credit: TCDavis

House flipping is a runaway phenomenon in the US that has remodeled communities and sparked dozens of hit TV shows. Unfortunately, there’s a downside to the activity, as it can fuel negative gentrification, making life worse for residents.

You need have no fear of this when you flip Google, because turning their house into yours actually strengthens your real-world neighborhood, town, or city. It gives the residents who already live there more stable resources, more positive human contact, and a more closely knit community.

Truth: Google will remain dominant in the discovery-related phases of your consumers’ journeys for the foreseeable future. For new neighbors and travelers, Google will often be how your business is found in the first place. Even if governing bodies break the company up at some point, the truth is that most local businesses will need to utilize Google as a search utility for discovery.

Dare: Draw a line on the pavement outside your front door this year, with transactional experiences on your side of the line. Google wants to own the transaction phase of your customers’ journey. Bookings, lead gen, local ads, and related features show where they are headed with this. If Google could, I’m sure they’d be glad to take a cut of every sale you make, and you’ll likely have to participate in their transactional aspirations to some degree. But…

In 2020, dare yourself to turn every customer you serve into a keeper, cutting out Google as the middleman wherever you can and building a truly local, regenerative base of loyalty, referrals, and community.

Wishing you a local 2020 of daring vision and self-made success!






Better Site Speed: 4 Outside-the-Box Ideas



Tom Anthony

Most of us have done site speed audits, or seen audits done by others. These can be really helpful for businesses, but I often find they’re quite narrow in focus. Typically we use well-known tools that throw up a bunch of things to look at, and then we dive into things from there.

However, if we dig deeper, there are often other ideas on how site speed can be improved. I often see plenty of opportunities that are never covered in site speed audits. Most site speed improvements are the result of a bunch of small changes, and so in this post I’m going to cover a few ideas that I’ve never seen in any site speed audit, all of which can make a difference.

A different angle on image optimization

Consider optimized SVGs over PNGs

I was recently looking to book some tickets to see Frozen 2 (because of, erm, my kids…) and so landed on this page. It makes use of three SVG images for transport icons:

SVG images are vector images, so they’re well-suited for things like icons. If you have icons displayed as PNGs, you may want to ask your designers for the original SVGs, as there can be considerable savings: an SVG isn’t always smaller, but switching can sometimes cut file size by 60% or more.

In this case, these icons come in at about 1.2kb each, so they are quite small. They would probably fly under the radar of site speed audits (neither PageSpeed Insights nor GTmetrix mentions these images at all for this page).

So you may be thinking, “They’re less than 5k combined — you should look for bigger issues!”, but let’s take a look. Firstly, we can run them all through Jake Archibald’s SVG compression tool; this is a great free tool and on larger SVGs it can make a big difference.

In this case the files are small, so you may still be thinking “Why bother?” The tool compresses them without any loss in quality from ~1240 bytes to ~630 bytes — a good ratio but not much of an overall saving.

However… now that we’ve compressed them, we can think differently about delivering them…

Inline images

GTmetrix makes recommendations around inlining small bits of CSS or JS, but doesn’t mention inlining images. Images can also be inlined, and sometimes this is the right approach.

Even a very small image file requires a complete round trip to the server, and round trips can have a very real impact on speed. In the case of the Cineworld transport images above, I simulated a “Fast 3G” connection and saw:

The site is not using HTTP2, so there is a long wait period, and then the image (which is 1.2kb) takes almost 600ms to load (the lack of HTTP2 also means this request blocks others). There are three of these images, so between them they can have a real impact on page speed.

However, we’ve now compressed them to only a few hundred bytes each, and SVG images are actually made up of markup in a similar fashion to HTML:

You can actually put SVG markup directly into an HTML document!

If we do this with all three of the transport images, the compressed HTML for this page that is sent from the server to our browser increases from 31,182 bytes to 31,532 bytes — an increase of only 350 bytes for all 3 images!
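To picture what this looks like in practice, here’s a minimal sketch; the path data below is an invented placeholder, not Cineworld’s actual icon:

```html
<!-- Before: each icon is a separate file, costing a separate round trip -->
<img src="/images/icons/train.svg" alt="Train" width="24" height="24">

<!-- After: the compressed SVG markup sits directly in the HTML document -->
<svg viewBox="0 0 24 24" width="24" height="24" role="img" aria-label="Train">
  <path d="M4 6h16v9H4zM7 18h2v2H7zM15 18h2v2h-2z"/>
</svg>
```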

So to recap:

  • Our HTML request has increased 350 bytes, which is barely anything
  • We can discard three round trips to the server, which we can see were taking considerable time

Some of you may have realized that if the images were not inline they could be cached separately, so future page requests wouldn’t need to refetch them. But if we consider:

  • Each image was originally about 1.5kb over the network (they aren’t gzipping the SVGs), with about 350 bytes of HTTP headers on top for a total of about 5.5kb transferred. So, overall we’ve reduced the amount of content over the network.
  • This also means that it would take over 20 pageviews to benefit from having them cached.

Takeaway: Consider where there are opportunities to use SVGs instead of PNGs.

Takeaway: Make sure you optimize the SVG images, use the free tool I linked to.

Takeaway: Inlining small images can make sense and bring outsized performance gains.

Note: You can also inline PNGs — see this guide.

Note: For optimized PNG/JPG images, try Kraken.

Back off, JavaScript! HTML can handle this…

So often nowadays, thanks to the prevalence of JavaScript libraries that offer off-the-shelf solutions, I find JavaScript being used for functionality that could be achieved without it. More JS libraries mean more to download, possibly more round trips for additional files from the server, and then the cost of JavaScript parsing and execution on top.

I have a lot of sympathy for how you get to this point. Developers are often given poor briefs/specs that fail to specify anything about performance, only function. They are often time-poor and so it’s easy to end up just dropping something in.

However, a lot of progress has been made in terms of the functionality that can be achieved with HTML and/or CSS alone. Let’s look at some examples.

Combo box with search

Dropdown boxes that offer a text search are a fairly common interface element nowadays. One recent article I came across described how to use the Select2 JavaScript library to make such a list:

It is a useful UI element and can help your users. However, Select2 is a JavaScript library, which in turn relies on its own CSS and the jQuery library. This means three round trips to collect a bunch of files of varying sizes:

  • jQuery – 101kb
  • Select2 JavaScript – 24kb
  • Select2 CSS – 3kb

This is not ideal for site speed, but we could certainly make the case it is worth it in order to have a streamlined interface for users.

However, it is actually possible to have this functionality out of the box with the HTML datalist element:

This allows the user to search through the list or to free-type their own response, so it provides the same functionality. Furthermore, it has a native interface on smartphones!

You can see this in action in this codepen.
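As a reference for the pattern, a minimal datalist looks like this (the values are placeholders):

```html
<label for="city">Choose a city:</label>
<input list="cities" id="city" name="city">
<datalist id="cities">
  <option value="London">
  <option value="Madrid">
  <option value="Oslo">
</datalist>
```

The browser supplies the dropdown, the filtering, and the free-text entry itself; no script files are requested at all.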

Details/Summary

LonelyPlanet has a beautiful website, and I was looking at this page about Spain, which has a ‘Read More’ link that most web users will be familiar with:

Like almost every implementation of this that I see, they have used a JavaScript library to implement this, and once again this comes with a bunch of overheads.

However, HTML has a pair of built-in tags called details and summary, which are designed to implement exactly this functionality, natively and for free. There are no overheads, it is more accessible for users who need a screen reader, and it conveys semantic meaning to Google.

These tags can be styled in various flexible ways with CSS and recreate most of the JS versions I have seen out there.

Check out a simple demo here: https://codepen.io/TomAnthony/pen/GRRLrmm
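The entire pattern reduces to a few lines of markup, no library required:

```html
<details>
  <summary>Read more</summary>
  <p>This content stays hidden until the visitor activates the summary.
  No JavaScript, no extra requests, and assistive technologies understand
  the expand/collapse behavior natively.</p>
</details>
```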

…and more

For more examples of functionality that you can achieve with HTML instead of JS, check out these links:

  • http://youmightnotneedjs.com/
  • https://dev.to/ananyaneogi/html-can-do-that-c0n

Takeaway: Examine the functionality of your sites and see where there may be opportunities to reduce your reliance on large JavaScript libraries in favor of native HTML/CSS options.

Takeaway: Remember that it isn’t only the size of the JS files that is problematic, but the number of round trips that are required.

Note: There are cases where you should use the JS solution, but it is important to weigh up the pros and cons.

Networking tune-ups

Every time the browser has to collect resources from a server, it has to send a message across the internet and back; the speed of this is limited by the speed of light. This may sound like a ridiculous thing to concern ourselves with, but it means that even small requests add time to the page load. If you didn’t catch the link above, my post explaining HTTP2 discusses this issue in more detail.

There are some things we can do to help either reduce the distance of these requests or to reduce the number of round trips needed. These are a little bit more technical, but can achieve some real wins.

TLS 1.3

TLS (or SSL) is the encryption technology used to secure HTTPS connections. Historically it has taken two round trips between the browser and the server to set up that encryption; if the user is 50ms away from the server, then this means 200ms per connection. Keep in mind that Google historically recommends aiming for 200ms to deliver the HTML (this seems slightly relaxed in more recent updates), so you’re losing a lot of that time here.

The recently defined TLS 1.3 standard reduces this from two round trips to just one, which can shave some precious time off the user’s initial connection to your website.
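Using the figure from the text (a user “50 ms away” from the server, so a 100 ms round trip), the saving works out as follows. This is rough back-of-envelope arithmetic, not a benchmark:

```python
# Using the figure from the text: a user "50 ms away" from the server,
# so one round trip costs 100 ms. Rough arithmetic, not a benchmark.
ONE_WAY_MS = 50
RTT_MS = 2 * ONE_WAY_MS

tls12_handshake = 2 * RTT_MS  # TLS 1.2: two round trips
tls13_handshake = 1 * RTT_MS  # TLS 1.3: one round trip

print("TLS 1.2 setup:", tls12_handshake, "ms")
print("TLS 1.3 setup:", tls13_handshake, "ms")
```

That 100 ms saved applies to every fresh connection the browser opens, which is why it matters most for first-time visitors.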

Speak to your tech team about enabling TLS 1.3; browsers that don’t support it will fall back to TLS 1.2 without issue. The change happens entirely behind the scenes, with nothing user-facing to migrate. There is no reason not to do this.

If you are using a CDN, then it can be as simple as just turning it on.

You can use this tool to check which versions of TLS you have enabled.
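As a complement to checking the server, you can also see what the client side supports. Python’s standard ssl module reports whether the local OpenSSL build speaks TLS 1.3; note this inspects the machine the script runs on, not your website:

```python
import ssl

# This inspects the machine it runs on (the client side), not your web
# server: it reports whether the local OpenSSL build supports TLS 1.3
# and what range a default connection would be willing to negotiate.
print(ssl.OPENSSL_VERSION)
print("TLS 1.3 available locally:", ssl.HAS_TLSv1_3)

ctx = ssl.create_default_context()
print("Negotiable versions:", ctx.minimum_version.name, "to", ctx.maximum_version.name)
```

If this prints False, the *client* is the limiting factor; for your visitors, it’s the server-side support (checked with the tool above) that counts.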

QUIC / HTTP 3

Over the last 2-3 years we have seen a number of sites move from HTTP/1.1 to HTTP/2, a behind-the-scenes upgrade that can make a real improvement to speed (see my link above if you want to read more).

Off the back of that, there is an emerging pair of standards known as QUIC + HTTP/3, which further optimize the connection between the browser and the server, reducing the required round trips even more.

Support for these is only just beginning to become viable, but if you are a CloudFlare customer you can enable it today, and over the coming six months, as Chrome and Firefox roll out support, your users will get a speed boost.

Read more here: https://blog.cloudflare.com/http3-the-past-present-and-future/

Super routing

When users connect to your website, they have to open network connections from wherever they are to your servers (or your CDN). If you imagine the internet as a series of roads, they need to ‘drive’ to your server across those roads, and that means congestion and traffic jams along the way.

As it turns out, some of the large cloud companies have their own private roads which have fewer potholes, less traffic, and improved speed limits. If only your website visitors could get access to these roads, they could ‘drive’ to you faster!

Well, guess what? They can!

CloudFlare provides this access via its Argo product, while on AWS you can use Global Accelerator. Either allows requests to your website to travel over these private networks for a potential speed boost. Both are very cheap if you are already a customer.

Takeaway: A lot of these sorts of benefits are considerably easier to get if you’re using a CDN. If you’re not already using a CDN, then you probably should be. CloudFlare is a great choice, as is CloudFront if you are using AWS. Fastly is the most configurable of them if you’re more of a pro.

Takeaway: TLS 1.3 is now very widely supported and offers a significant speed improvement for new connections.

Takeaway: QUIC / HTTP3 are only just starting to get support, but over the coming months this will roll out more widely. QUIC includes the benefits of TLS 1.3 as well as more. A typical HTTP2 connection nowadays needs 3 round trips to open; QUIC needs just one!

Takeaway: If you’re on CloudFlare or AWS, then there is potential to get speed ups just from flipping a switch to turn on smart routing features.

Let CSS do more

Above, I talked about how HTML has built-in functionality that you can leverage to avoid relying on ‘home-rolled’ solutions, which require more code (and more processing on the browser’s side) to implement. Here I’ll talk about some examples where CSS can do the same for you.

Reuse images

Often you find pages that are using similar images throughout the page in several places. For example, variations on a logo in different colors, or arrows that point in both directions. As unique assets (however similar they may be), each of these needs to be downloaded separately.

Returning to my hunt for cinema tickets above, where I was looking at this page, we can see a carousel that has left and right arrows:

Similarly to the logic used above, while these image files are small, they still require a round trip to fetch from the server.

However, the arrows are identical — just pointing in opposite directions! It’s easy for us to use CSS’s transform functionality to use one image for both directions:

You can check out this codepen for an example.

Another example is when the same logo appears in different styles on different parts of the page; often they will load multiple variations, which is not necessary. CSS can re-color logos for you in a variety of ways:

There is a codepen here showing this technique in action. If you want to calculate the CSS filter value required to reach an arbitrary color, then check out this amazing color calculator.

Interactions (e.g. menus & tabs)

Often navigation elements such as menus and tabs are implemented in JavaScript, but these too can be done in pure CSS. Check out this codepen for an example:

Animations

CSS3 introduced a lot of powerful animation capability into CSS. These are often not only faster than JavaScript versions, but can be smoother too, as they can run in the native code of the operating system rather than having to execute relatively slower JavaScript.

Check out Dozing Bird as one example:

You can find plenty more in this article. CSS animations can add a lot of character to pages at a relatively small performance cost.

…and more

For more examples of functionality that you can achieve using pure CSS solutions, take a look at:

  • http://youmightnotneedjs.com/
  • https://dev.to/ananyaneogi/css-can-do-that-18g7m

Takeaway: Use CSS rotations and filters to reuse one image file in multiple places, reducing how many files you have to load.

Takeaway: CSS animations can add character to pages, and often require fewer resources than JavaScript.

Takeaway: CSS is perfectly capable of implementing many interactive UI elements.

Wrap up

Hopefully you’ve found these examples useful in themselves, but the broader point I want to make is that we should all try to think a bit more outside the box with regard to site speed. Of particular importance is reducing the number of round trips needed to the server; even small assets take some time to fetch and can have an appreciable impact on performance (especially on mobile).

There are plenty more ideas than we’ve covered here, so please do jump into the comments if you’ve come across others.




They’re the Best Around: The Top 25 Moz Blog Posts of 2019



Felicia Crawford

Well, folks, it’s that time of year again. It’s hard to believe we’ve already gone another turn around the ol’ sun. But I’ve consulted my analytics data and made my SQL queries, and I’m here today to present to you the list of the top Moz Blog posts of 2019!

For a little perspective, we published 207 blog posts, averaging out to about 4 per week. Out of those 207, the twenty-five I’m sharing with you below were the most-read pieces of the year. If you’re strapped for time (and who isn’t in our industry?), survey says these are the articles that aren’t to be missed. And bonus — a good chunk of them are videos, so bring out the chocolate popcorn and settle down to watch!

(If chocolate popcorn sounds new and unfamiliar to you, I implore you to check out the Cinerama in Seattle’s Belltown neighborhood the next time you’re in town for MozCon. It is life-changing. Get the mix of regular and chocolate and never, ever look back.)

I’ll be sharing the top keywords each post ranks for according to Keyword Explorer, too, to give you some idea of why these posts have continued to be favorites throughout the year. Gotta love that “Explore by Site” feature — it makes my job way too easy sometimes! 😉

(For the Keyword Explorer nerds in the audience, I’ll be filtering the rankings to positions 1–3 and organizing them by highest monthly search volume. I want to see what we’re ranking highly for that gets lots of eyeballs!)

Ready to get started? I sure am. Let’s dive in.


The top 25 Moz Blog posts of 2019

1. On-Page SEO for 2019 – Whiteboard Friday

Britney Muller, January 4th

57,404 reads

Top keywords according to Keyword Explorer: seo 2019 (#3, 501–850), seo best practices 2019 (#3, 501–850), homepage seo 2019 (#1, 0–10)

On-page SEO has long been a favorite topic for y’all, and the top number-one winner, winner, chicken dinner post of 2019 reflects that loud and proud. In this expert checklist, Britney Muller shares her best tips for doing effective on-page SEO for 2019.

And if you want a hint on one reason this puppy has been so popular, check out #10 in this very list. 😉

2. The 60 Best Free SEO Tools [100% Free]

Cyrus Shepard, June 10th

51,170 reads

Top keywords according to Keyword Explorer: seo tools (#1, 6.5k–9.3k), free seo tools (#1, 1.7k–2.9k), free seo (#1, 501–850)

This post is a testament to the power of updating and republishing your best content. Cyrus originally authored this post years ago and gave it a sorely needed update in 2019. There are literally hundreds of free SEO tools out there, so this article focused on only the best and most useful to add to your toolbox.

3. The Ultimate Guide to SEO Meta Tags

Kate Morris, July 24th

42,276 reads

Top keywords according to Keyword Explorer: seo meta tags (#1, 501–850), 1-page meta (#2, 501–850), what are meta tags (#3, 501–850)

Here’s another vote for the power of republishing really good content that you know your audience craves. Originally published in November 2010, this is the second time we’ve asked Kate to update this article and it continues to deliver value ten years later. SEO certainly changes, but some topics remain popular and necessary throughout all the ups and downs.

4. The One-Hour Guide to SEO

Rand Fishkin, throughout 2019

41,185 reads for the first post (143,165 for all six combined)

Top keywords according to Keyword Explorer: moz seo guide (#2, 201–500), moz beginners guide to seo (#3, 101–200), moz guide to seo (#2, 11–50)

A “best of the Moz Blog” list wouldn’t be complete without Rand! His six-part video series detailing all the most important things to know about SEO was originally published on the Moz Blog as six separate Whiteboard Fridays. We’ve since redirected those posts to a landing page in our Learning Center, but the first episode on SEO strategy earned over 41k unique pageviews in its time live on the blog.

5. A New Domain Authority Is Coming Soon: What’s Changing, When, & Why

Russ Jones, February 5th

38,947 reads

Top keywords according to Keyword Explorer: moving a 60 da to a 90 da seo (#1, 0–10), moz da update 2019 (#1, 0–10), upcoming domain change (#1, 0–10)

When we upgraded our Domain Authority algorithm in March, we knew it would be a big deal for a lot of people — so we put extra effort into education ahead of the launch. Russ’s initial announcement post introducing the coming changes was the foremost source for information, earning ample attention as a result.

6. How Google Evaluates Links for SEO [20 Graphics]

Cyrus Shepard, July 1st

38,715 reads

Top keywords according to Keyword Explorer: free google picture of created equal (#2, 0–10), google 1 page 2 links (#2, 0–10), google top ranking illustrations (#2, 0–10)

All right, I admit it: we did a ton of content updating and republishing this year. And it seriously paid off. Cyrus revamped a perennially popular post by Rand from 2010, bumping it from ten graphics to twenty and giving it a much-needed refresh almost a decade after the original post. The top keywords are kind of weird, right? Check out the title on the original post — looks like we’ve got a little work to do with this one to get it ranking for more relevant terms!

7. Do Businesses Really Use Google My Business Posts? A Case Study

Ben Fisher, February 12th

32,938 reads

Top keywords according to Keyword Explorer:  google my business posts (#2, 201–500), how to post on google my business (#3, 101–200), google business post (#3, 51–100)

Even a couple of years after Google My Business Posts became an option, it wasn’t clear how many businesses were actually using them. Ben Fisher asked the important questions and did the legwork to find the answers in this case study, which examined over 2,000 GMB profiles.

8. Announcing the New Moz SEO Essentials Certificate: What It Is & How to Get Certified

Brian Childs, May 1st

32,434 reads

Top keywords according to Keyword Explorer: moz certification (#3, 101–500), moz seo certification (#2, 51–100), moz academy (#3, 51–100)

One of our most-asked questions from time immemorial was “Does Moz offer an SEO certification?” With the launch of our SEO Essentials certificate in May of this year, the answer finally became yes! 

9. Optimizing for Searcher Intent Explained in 7 Visuals

Rand Fishkin, March 23rd

29,636 reads

Top keywords according to Keyword Explorer: user intent moz (#2, 0–10)

What does it mean to target the “intent” of searchers rather than just the keyword(s) they’ve looked up? These seven short visuals explain the practice of intent-targeting and optimization.

10. 7 SEO Title Tag Hacks for Increased Rankings + Traffic – Best of Whiteboard Friday

Cyrus Shepard, June 7th

26,785 reads

Top keywords according to Keyword Explorer: title tags for landing page (#2, 11–50), moz free hack (#1, 0–10),  title tag hacks (#1, 0–10)

Title tags can have a huge impact on your click-through rates when optimized correctly. In this Whiteboard Friday, Cyrus shares how to use numbers, dates, questions, top referring keywords, and more to boost your CTR, traffic, and rankings.

11. E-A-T and SEO: How to Create Content That Google Wants

Ian Booth, June 4th

25,681 reads

Top keywords according to Keyword Explorer: eat seo (#2, 201–500), eat google (#2, 51–100), eat google seo (#1, 11–50)

Ian Booth covers the three pillars of E-A-T and shares tips on how to incorporate each into your content strategy so that you can rank for the best search terms in your industry.

12. 10 Basic SEO Tips to Index + Rank New Content Faster – Whiteboard Friday

Cyrus Shepard, May 17th

24,463 reads

Top keywords according to Keyword Explorer: how to index a link faster (#2, 11–50), blog seo index (#1, 0–10),  fast on-demand seo (#2, 0–10)

When you publish new content, you want users to find it ranking in search results as fast as possible. Fortunately, there are a number of tips and tricks in the SEO toolbox to help you accomplish this goal. Sit back, turn up your volume, and let Cyrus Shepard show you exactly how in this episode of Whiteboard Friday.

13. Page Speed Optimization: Metrics, Tools, and How to Improve – Whiteboard Friday

Britney Muller, February 1st

24,265 reads

Top keywords according to Keyword Explorer: page speed optimization (#1, 51–100),  page speed metrics (#3, 11–50), optimize page speed (#1, 0–10)

What are the most crucial things to understand about your site’s page speed, and how can you begin to improve? In this edition of Whiteboard Friday, Britney Muller goes over what you need to know to get started.

14. How Google’s Nofollow, Sponsored, & UGC Links Impact SEO

Cyrus Shepard, September 10th

24,262 reads

Top keywords according to Keyword Explorer:  how to send my publishers no follow links (#1, 0–10), moz nofollow links (#2, 0–10), rel= sponsored (#2, 0–10)

Google shook up the SEO world by announcing big changes to how publishers should mark nofollow links. The changes — while beneficial to help Google understand the web — nonetheless caused confusion and raised a number of questions. We’ve got the answers to many of your questions here.

15. How to Identify and Tackle Keyword Cannibalization in 2019

Samuel Mangialavori, February 11th

21,871 reads

Top keywords according to Keyword Explorer: keyword cannibalization (#2, 201–500), ahrefs keyword cannibalization (#3, 11–50), what is keyword cannibalization (#3, 11–50)

Keyword cannibalization is an underrated but significant problem, especially for sites that have been running for several years and end up having lots of pages. In this article, learn how to find and fix keyword cannibalization before it impacts your SEO opportunities.

16. How Bad Was Google’s Deindexing Bug?

Dr. Pete, April 11th

17,831 reads

Top keywords according to Keyword Explorer: google de-indexing again (#2, 11–50), google index bug (#3, 11–50)

On Friday, April 5, Google confirmed a bug that was causing pages to be deindexed. Our analysis suggests that roughly 4% of stable URLs fell out of page-1 rankings on April 5, and that deindexing impacted a wide variety of websites.

17. What Is BERT? – Whiteboard Friday

Britney Muller, November 8th

16,797 reads

Top keywords according to Keyword Explorer: what is bert (#2, 11–50), moz wbf (#2, 0–10)

There’s a lot of hype and misinformation about the newest Google algorithm update. What actually is BERT, how does it work, and why does it matter to our work as SEOs? Join our own machine learning and natural language processing expert Britney Muller as she breaks down exactly what BERT is and what it means for the search industry.

18. How Do I Improve My Domain Authority (DA)?

Dr. Pete, April 17th

16,478 reads

Top keywords according to Keyword Explorer: how to build domain authority (#2, 501–850), how to increase domain authority (#2, 501–850), how to improve domain authority (#1, 11–50)

Written to help research and inform his MozCon 2019 talk, this article by Dr. Pete covers how and why to improve a Domain Authority score.

19. How to Get Into Google News – Whiteboard Friday

Barry Adams, January 11th

16,265 reads

Top keywords according to Keyword Explorer: how to get on google news (#3, 101–200), google news inclusion (#3, 51–100), getting into google news (#3, 11–50)

How do you increase your chances of getting your content into Google News? Barry Adams shares the absolute requirements and the nice-to-have extras that can increase your chances of appearing in the much-coveted news carousel.

20. Topical SEO: 7 Concepts of Link Relevance & Google Rankings

Cyrus Shepard, April 1st

15,579 reads

Top keywords according to Keyword Explorer: link relevance (#2, 0–10), read more on seo (#2, 0–10), relevant links (#2, 0–10)

To rank in Google, it’s not simply the number of votes you receive from popular pages, but the relevance and authority of those links as well.

21. The 5 SEO Recommendations That Matter in the End

Paola Didone, March 26th

13,879 reads

Top keywords according to Keyword Explorer: seo recommendations (#1, 11–50), 10 seo recommend (#1, 0–10), seo recommendations report (#1, 0–10)

What are the most steadfast, evergreen SEO recommendations you can make for your clients? These are the top five that this SEO has encountered that consistently deliver positive results.

22. An SEO’s Guide to Writing Structured Data (JSON-LD)

Brian Gorman, May 9th

13,862 reads

Top keywords according to Keyword Explorer: json structured data (#3, 0–10), seo json content (#3, 0–10), seomoz structured data (#3, 0–10)

This guide will help you understand JSON-LD and structured data markup. Go beyond the online generators and prepare your web pages for the future of search!

23. A Comprehensive Analysis of the New Domain Authority

Russ Jones, March 5th

13,333 reads

Top keywords according to Keyword Explorer: does post clustering build domain authority (#2, 11–50), who invented domain authority (#3, 11–50), domain authority curve (#1, 0–10)

A statistical look at Moz’s much-improved Domain Authority. Find out how it performs vs previous versions of Domain Authority, competitor metrics, and more.

24. The Practical Guide to Finding Anyone’s Email Address

David Farkas, November 26th

13,263 reads

Top keywords according to Keyword Explorer: N/A in positions #1–3

The never-ending struggle with link building begins with finding contact info. David Farkas outlines a few simple and easy ways to discover the right person to reach out to, plus some tips on which tools and strategies work best.

25. How to Use Domain Authority 2.0 for SEO – Whiteboard Friday

Cyrus Shepard, March 8th

12,940 reads

Top keywords according to Keyword Explorer: domain authority 2.0 (#2, 11–50), thought domain authority keywords (#1, 0–10), domain authority for seo (#2, 0–10)

Domain Authority is a well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this Whiteboard Friday, Cyrus Shepard explains what’s new with the new Domain Authority 2.0 update and how to best harness its power for your own SEO success.


That’s a wrap for the top posts of 2019! Did we miss any that were on your own must-read list? Let us know in the comments below. We can’t wait to see what 2020 has in store!




Actually Accurate Analytics – Whiteboard Friday



Ruth Burr Reedy

Clean, useful Google Analytics data is all-important — both for you, and for the clients and colleagues that will be working on the site in the future. Ruth Burr Reedy shares her absolute best tips for getting your Analytics data accurate, consistent, and future-proof in this week’s Whiteboard Friday.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, Moz fans. I’m Ruth Burr Reedy, and I am the Vice President of Strategy at UpBuild. We’re a technical marketing agency specializing in technical SEO and advanced web analytics. What I want to talk about today on Whiteboard Friday is analytics.

So when I talk to SEOs about analytics and ask them, “When it comes to analytics, what do you do? What do you do first? When you’re taking on a new client, what do you do?” SEOs are often really eager to tell me, “I dive into the data. Here’s what I look at. Here are the views that I set up. Here’s how I filter things. Here’s where I go to gain insights.”

But what I often don’t hear people talk about, that I think is a super important first step with a new client or a new Analytics account, or really any time if you haven’t done it, is making sure your Analytics data is accurate and consistent. Taking the time to do some basic Analytics housekeeping is going to serve you so far into the future and even beyond your time at that given client or company.

The people who come after you will be so, so, so thankful that you did these things. So today we’re going to talk about actually accurate analytics. 

Is your Analytics code on every page?

So the first question that you should ask yourself is: Is your Analytics code on every page? Is it?

Are you sure? There are a lot of different things that can contribute to your Analytics code not actually being on every single page of your website. One of them is if portions of your site have a different CMS from the main CMS that’s driving your site. 

Forums, subdomains, landing pages

We see this a lot with things like subdomains, with things like forums. A really common culprit is if you’re using a tool like Marketo or HubSpot or Unbounce to build landing pages, it’s really easy to forget to put Analytics on those pages.

Over time those pages are out there in the world. Maybe it’s just one or two pages. You’re not seeing them in Analytics at all, which means you’re probably not thinking about them, especially if they’re old. But that doesn’t mean that they don’t still exist and that they aren’t still getting views and visits. 

Find orphan pages

So, okay, how do we know about these pages? Well, before you do anything, it’s important to remember that, because of the existence of orphan pages, you can’t rely only on a tool like Screaming Frog or DeepCrawl to crawl your site and make sure that code is on every page. If the crawler can’t reach a page and your code is not on that page, it’s in an unseeable, shrouded-in-mystery area, and we don’t want that.

Export all pages

The best way, the most sure way to make sure that you are finding every page is to go to your dev team, to go to your developers and ask them to give you an export of every single URL in your database. If you’re using WordPress, there’s actually a really simple tool you can use. It’s called Export All URLs in the grand tradition of very specifically named WordPress tools.

But depending on your CMS and how your site is set up, this is something your dev team can almost certainly do: give you a list of every single URL on the website, every single URL in the database. When you get it, you could, if you wanted, simply load that list of URLs; you’d want to filter out things like images and make sure you’re just looking at the HTML documents.

Dedupe with Screaming Frog

Once you have that, you could load the whole thing into Screaming Frog as a list, though that would take a while. What you could do instead is run a Screaming Frog crawl and then dedupe your URL list against it. Now you’ve got a list of your orphan pages on one hand and a list of all the pages that Screaming Frog can find on the other; together, that’s every single page on the website.
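The dedupe step amounts to a set difference: URLs in the database export that no crawl could reach. A minimal sketch in Python, with placeholder example.com URLs standing in for your real exports:

```python
# The dedupe boils down to a set difference: URLs in the database export
# that the crawler could not reach. The example.com URLs are placeholders.
database_urls = {
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/old-landing-page",  # linked from nowhere
}
crawled_urls = {
    "https://example.com/",
    "https://example.com/about",
}

orphans = database_urls - crawled_urls
print("Orphan pages:", sorted(orphans))
```

In practice you’d read both sets from the exported files, but the logic is exactly this one-line subtraction.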

We can use either a combination of crawler and list or just the list, depending on how you want to do it, to run the following custom search. 

What to do in Screaming Frog

Configuration > Custom > Search

So in Screaming Frog, what you can do is go to Configuration and then Custom Search. A custom search field will pop up. This allows the crawler, while it’s crawling, to search for a given piece of information on each page and record it in a custom field, so that you can then go back and look at all of the pages that have that piece of information.

What I like to do when I’m looking for Analytics information is set up two filters actually — one for all of the pages that contain my UA identifier and one for all of the pages that don’t contain it. Because if I just have a list of all the pages that contain it, I still don’t know which pages don’t contain it. So you can do this with your unique Google Analytics identifier.



If you’re deploying Google Analytics through Google Tag Manager, instead you would look for your GTM Number, your GTM ID. So it just depends how you’ve implemented Analytics. You’re going to be looking for one of those two numbers. Almost every website I’ve worked on has at least a few pages that don’t have Analytics on them.

What you’ll sometimes also find is that there are pages that have the code or that should have the code on them, but that still aren’t being picked up. So if you start seeing these errors as you’re crawling, you can use a tool like Tag Assistant to go in and see, “Okay, why isn’t this actually sending information back to Google Analytics?” So that’s the best way to make sure that you have code on every single page. 
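The same “which pages carry my tracking code?” check can be reproduced offline once you have the HTML of each page. A minimal sketch, where “UA-12345-6” is a made-up property ID and the pages dict stands in for HTML you’ve collected:

```python
# Offline version of the custom-search check: flag pages whose HTML does
# not contain the tracking ID. "UA-12345-6" is a made-up property ID and
# the pages dict stands in for HTML collected by a crawler or fetch script.
TRACKING_ID = "UA-12345-6"

pages = {
    "/":        "<head><script>ga('create', 'UA-12345-6', 'auto');</script></head>",
    "/landing": "<head><title>No analytics here</title></head>",
}

missing = [url for url, html in pages.items() if TRACKING_ID not in html]
print("Pages missing the tracker:", missing)
```

If you deploy through Google Tag Manager, you’d search for your GTM container ID instead, exactly as described above.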

Is your code in the <head> and as high as possible?

The other thing you want to take a look at is whether or not your Analytics code is in the head of every page, as close to the top of the head as possible. Now I know some of you are thinking, “Yeah, that’s Analytics implementation 101.” But when you’re implementing Analytics, especially via a plug-in or via GTM (where the implementation rules are a little bit different), it’s really easy over time, especially if your site is old, for other things to get added to the head by other people who aren’t you and to push that code down.

Now that’s not necessarily the end of the world. If it’s going to be very difficult, time-consuming, or expensive to fix, you may decide it’s not worth your time if everything seems to be firing correctly. But the farther down that code gets pushed, the higher the likelihood that something goes wrong: that something fires before the tracker and isn’t picked up, or that something prevents the tracker from firing at all.

It could be a lot of different things, and that’s why the best practice is to have it as high up in the head as possible. Again, whether or not you want to fix that is up to you. 

Update your settings:

Once you’ve gotten your code firing correctly on every single page of your website, I like to go into Google Analytics and change a few basic settings. 

1. Site Speed Sample Rate

The first one is the Site Speed Sample Rate.

So this is when you’re running site speed reports in Google Analytics. Typically they’re not giving you site timings or page timings for the site as a whole because that’s a lot of data. It’s more data than GA really wants to store, especially in the free version of the tool. So instead they use a sample, a sample set of pages to give you page timings. I think typically it’s around 1%.

That can be a very, very small sample if you don’t have a lot of traffic. It can become so small that the sample size is skewed and it’s not relevant. So I usually like to bump up that sample size to more like 10%. Don’t do 100%. That’s more data than you need. But bump it up to a number that’s high enough that you’re going to get relevant data.
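To see why a small sample rate bites on a modest-traffic site, here’s some quick arithmetic; the pageview figure is illustrative, not from the transcript:

```python
# Why a 1% Site Speed Sample Rate can mislead: the number of timing hits
# GA collects becomes tiny on a modest-traffic site. The pageview figure
# below is illustrative.
monthly_pageviews = 20_000

for pct in (1, 10):
    timing_hits = monthly_pageviews * pct // 100
    print(f"{pct}% sample rate -> {timing_hits} timing hits per month")
```

A couple of hundred timings spread across every page on the site is easily skewed by a few outliers, which is the argument for bumping the rate to around 10%.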

2. Session and Campaign Timeout

The other thing that I like to take a look at when I first get my hands on a GA account is the Session and Campaign Timeout. Session timeout is basically how long somebody can go without interacting with your website before their session is considered over; if they come back and do something on your site after that window, they’re registered as a new session rather than as part of their original visit.

By default, GA sets the session timeout at 30 minutes. But this is a world where people have a million tabs open. I bet you right now are watching this video in one of a million tabs. The longer you have a tab open, the more likely it is that your session will time out. So I like to increase that timeout to at least 60 minutes.
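Session timeout logic can be shown in miniature: a new session begins whenever the gap between two hits exceeds the timeout. The timestamps below are made up:

```python
from datetime import datetime, timedelta

# How session timeout works, in miniature: a new session begins whenever
# the gap between two hits exceeds the timeout. Timestamps are made up.
TIMEOUT = timedelta(minutes=30)  # GA's default
hits = [
    datetime(2020, 1, 1, 9, 0),    # first pageview
    datetime(2020, 1, 1, 9, 20),   # 20-minute gap: same session
    datetime(2020, 1, 1, 10, 30),  # 70-minute gap: counted as a new session
]

sessions = 1
for prev, cur in zip(hits, hits[1:]):
    if cur - prev > TIMEOUT:
        sessions += 1

print("Sessions counted:", sessions)
```

With the timeout raised to 60 minutes, this 70-minute gap would still split the visit, but a 45-minute pause in another tab would no longer count the reader as a brand-new session.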

The other thing that Google automatically does is set a campaign timeout. So if you’re using UTM parameters to do campaign tracking, Google will automatically set that campaign timeout at six months. So six months after somebody first clicks that UTM parameter, if they come back, they’re no longer considered part of that same campaign.

They’re now a new, fresh user. Your customer lifecycle might not be six months. If you’re like a B2B or a SaaS company, sometimes your customer lifecycle can be two years. Sometimes if you’re like an e-com company, six months is a really long time and you only need 30 days. Whatever your actual customer lifecycle is, you can set your campaign timeout to reflect that.

I know very few people who are actually going to make that window shorter. But you can certainly make that longer to reflect the actual lifecycle of your customers. 

3. Annotations

Then the third thing that I like to do when I go into a Google Analytics account is annotate what I can. I know a lot of SEOs, when you first get into a GA account, think, “Well, no one has been annotating. Ho-hum. I guess, as of today, we’re going to annotate changes going forward.”

That’s great. You should definitely be annotating changes. However, you can also take a look at overall traffic trends and ask your coworkers or your client, whatever your relationship is to this account: “What happened here? Do you remember what happened here? Can I get a timeline of major events in the company: product launches, press releases, coverage in the press, anything that might have driven a spike in traffic?”

You can annotate those things historically, going back in time. Just because you weren’t there doesn’t mean it didn’t happen. All right. So our data is complete. It’s being collected the way that we want, and we’re tracking what’s happening.

Account setup

Cool. Now let’s talk about account setup. I have found that many, many people do not take the time to be intentional and deliberate when it comes to how they set up their Google Analytics account. It’s something that just kind of happens organically over time. A lot of people are constrained by defaults. They don’t really get what they’re doing.

What we can do, even if this is not a brand-new GA account, is try to impose some structure, order, consistency, and especially some clarity, not only for ourselves as marketers, but for anybody else who might be using this GA account either now or in the future. So starting out with just your basic GA structure, you start with your account.

Your Account Name is usually just your company name. It doesn’t totally matter what your Account Name is. However, if you’re working with a vendor, I know they’d prefer that it be your company name as opposed to something random that only makes sense to you internally, because that’s going to make it easier for them. But if you don’t care about that, you could conceivably name your account whatever you want. Most of the time it is your company name.

Then you’ve got your property, and you might have various properties. A good rule of thumb is that you should have one property per website or per group of sites with the same experience. So if you have one experience that goes on and off of a subdomain, maybe you have mysite.com and then you also have store.mysite.com, but as far as the user experience is concerned it’s one website, that could be one property.

That’s where you want to delineate properties: based on site experiences. Then, drilling down to views, you can have as many views as you want. When it comes to naming views, the convention that I like to use is to have the site or section name that you’re tracking in that specific view, and then information about how that view is set up and how it’s intended to be used.

Don’t assume that you’re going to remember what you were doing last year a year from now. Write it down. Make it clear. Make it easy for people who aren’t you to use. You can have as many views as you want. You can set up views for very small sections of your site, for very specific and weird filters if there are some customizations you want to do. You can set up as many views as you need to use.

Must-have views

1. Raw data – Unfiltered, Don’t Touch

But I think there are three views that you should make sure you have. The first is a Raw Data view. This is a view with no filters on it at all. If you don’t already have one of these, then all of your data in the past is suspect. Having a view that is completely raw and unfiltered means if you do something to mess up the filtering on all your other views, you at least have one source of total raw data.

I know this is not new information for SEOs when it comes to GA account setup, but so many people don’t do it. I know this because I go into your accounts and I see that you don’t have it. If you don’t have it, set it up right now. Pause this video. Go set it up right now and then come back and watch the rest, because it’s going to be good. In addition to naming it “Raw Data Unfiltered,” I like to also add something like “Don’t Touch” or “For Historical Purposes Only,” if you’re not into the whole brevity thing, something that makes it really clear that not only is this the raw data, but also no one should touch it.

This is not the data we’re using. This is not the data we make decisions by. This is just our backup data. Don’t touch it. 

2. Primary view – Filtered, Use This One

Then you’re going to want to have your Primary view. So however many views you as a marketer set up, there are going to be other people in your organization who just kind of want the data.

So pick a view that’s your primary filtered view. You’re going to have a lot of your basic filters on this, things like filtering out your internal IP range, filtering out known bots. You might set up some filtering to capture the full hostname if you’re tracking between subdomains, things like that. But it’s your primary view with basic filtering. You’re going to want to name that something like “Use This One.”
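To illustrate the internal-IP filtering mentioned above, here is a sketch of the kind of regular expression you might paste into a GA view filter to exclude an office IP range. The 203.0.113.x range is a documentation-reserved example, not a real network; substitute your own range.

```javascript
// Sketch: a regex for excluding an internal IP range in a GA view filter.
// 203.0.113.0/24 is a reserved documentation range used purely as an example.
const internalIpPattern = /^203\.0\.113\.\d{1,3}$/;

console.log(internalIpPattern.test('203.0.113.42'));  // true: internal, filtered out
console.log(internalIpPattern.test('198.51.100.7'));  // false: external, kept
```

In GA itself you would enter only the pattern string in the filter's IP-address field; the surrounding JavaScript here just lets you sanity-check the pattern before applying it to a view.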

Sometimes if there’s like one person and they won’t stop touching your raw data, you can even say like, “Nicole Use This One.” Whatever you need to label it so that even if you got sick and were in the hospital and unreachable, you won the lottery, you’re on an island, no one can reach you, people can still say, “Which of these 17 views that are set up should I use? Oh, perhaps it’s the one called ‘Use This One.'” It’s a clue. 

3. Test view – Unfiltered

Then I like to always have at least one view that is a Test view. That’s usually unfiltered in its base state. But it’s where I might test out filters or custom dimensions or other things that I’m not ready to roll out to the primary view. You may have additional views on top of those, but those are the three that, in my opinion, you absolutely need to have.

4. All Website Data

What you should not have is a view called “All Website Data.” “All Website Data” is what Google will automatically call a view when you’re first setting up GA. A lot of times people don’t change that as they’re setting up their Analytics. The problem with that is that “All Website Data” means different things to different people. For some people, “All Website Data” means the raw data.

For some people, “All Website Data” means that this is the “Use This One” view. It’s unclear. If I get into a GA account and I see that there is a view named “All Website Data,” I know that this company has not thought about how they’re setting up views and how they’re communicating that internally. Likely there’s going to be some filtering on stuff that shouldn’t have been filtered, some historical mishmash.

It’s a sign that you haven’t taken the time to do it right. In my opinion, a good SEO should never have a view called “All Website Data.” All right. Great. So we’ve got our views set up. Everything is configured the way that we want it. How that’s configured may be up to you, but we’ve got these basic tenets in place.

Goals

Let’s talk about goals. Goals are really interesting. I don’t love this about Google Analytics, but goals are forever. Once you set a goal in GA, the data recorded against that goal slot will always be associated with it. What that means is, say you have a goal that’s “Blue Widget Sales” and you’re tracking blue widget sales.

Goals are forever

Over time you discontinue the blue widget and now you’re only tracking red widget sales. So you rename the “Blue Widget Sales” goal to “Red Widget Sales.” The problem is that renaming the goal doesn’t change the goal itself. All of that historical blue widget data will still be associated with that goal. Unless you’re annotating carefully, you may not have a good idea of when this goal switched from tracking one thing to tracking another.

This is a huge problem when it comes to data governance and making decisions based on historical data. 

The other problem is you have a limited number of goals. So you need to be really thoughtful about how you set up your goals because they’re forever. 

Set goals based on what makes you money

A basic rule is that you should set goals based on what makes you money.

You might have a lot of micro conversions. You might have things like newsletter sign-ups or white paper downloads or things like that. If those things don’t make you money, you might want to track those as events instead. More on that in a minute. Whatever you’re tracking as a goal should be related to how you make money. Now if you’re a lead gen biz, things like white paper downloads may still be valuable enough that you want to track them as a goal.

It just depends on your business. Think about goals as money. What’s the site here to do? When you think about goals, again, remember that they’re forever and you don’t get that many of them. 

Group goals efficiently

So any time you can group goals efficiently, take some time to think about how you’re going to do that. If you have three different forms and they’re all going to be scheduling a demo in some way or another, but they’re different forms, is there a way that you can have one goal that’s “Schedule a Demo” and then differentiate between which form it was in another way?

Say you have an event category that’s “Schedule a Demo” and then you use the label to differentiate between the forms. It’s one goal that you can then drill down. A classic mistake that I see with people setting up goals is they have the same goal in different places on the website and they’re tracking that differently. When I say, “Hey, this is the same goal and you’re tracking it in three different places,” they often say, “Oh, well, that’s because we want to be able to drill down into that data.”

Great. You can do that in Google Analytics. You can do that via Google Analytics reporting. You can look at what URLs and what site sections people completed a given goal on. You don’t have to build that into the goal. So try to group as efficiently as possible and think long term. If at any time you’re setting up a goal that you know is someday going to be part of a group of goals, try to set it up in such a way that you can add to that and then drill down into the individual reports rather than setting up new goals, because those 20 slots go quick.

Name goals clearly

The other thing you’re going to want to do with goals and with everything — this is clearly the thesis for my presentation — is name them clearly. Name them things where it would be impossible not to understand exactly what they are. Don’t name your goal “Download.” Don’t name your goal “Thank You Page.”

Name your goal something specific enough that people can look at it at a glance. Even people who don’t work there right now, people in the future, the future people can look at your goals and know exactly what they were. But again, name them not so specifically that you can’t then encompass that goal wherever it exists on the site. So “Download” might be too broad.

“Blue Widget White Paper Download” might be too specific. “White Paper Download” might be a good middle ground there. Whatever it is for you, think about how you’re going to name it in such a way that it’ll make sense to somebody else, even if you don’t work there anymore and they can’t ask you. Now from talking about goals it kind of segues naturally into talking about events, event tracking.

Events

Event tracking is one of the best things about Google Analytics now. It used to be that to track an event you had to add code directly to a page or directly to a link. That was hard to do at scale and difficult to get implemented alongside competing dev priorities. But now, with Google Tag Manager, you can track as many events as you want whenever you want to do them.

You can set them up all by yourself, which means that now you, as the marketer, as the Analytics person, become the person who is in charge of Google Analytics events. You should take that seriously, because the other side of that coin is that it’s very possible to get event creep where now you’re tracking way too many events and you’re tracking them inefficiently and inconsistently in ways that make it difficult to extract insights from them on a macro level.

What do you want and why?

So with events, think about what you want and why. Any time somebody is like, “I want to track this,” ask them, “Okay, what are we going to do with that information?” If they’re like, “I don’t know. I just want to know it.” That might not be a good case to make to track an event. Understand what you’re going to do with the data. Resist the urge to track just for tracking’s sake.

Resist data for data’s sake. I know it’s hard, because data is cool, but try your best. 

Naming conventions

As you take over, now that you are the person in charge of events, which you are, you’re taking this on, this is yours now, develop naming conventions for your events and then become the absolute arbiter of those conventions. Do not let anybody name anything unless it adheres to your conventions.

Category

Now how you name things is up to you. Some suggestions, for category, I like that to be the site section that something is in or maybe the item type. So maybe it’s product pages. Maybe it’s forms. Maybe it’s videos. However you are going to group these events on a macro level, that should be your category.

Action

The action is the action. So that’s click, submit, play, whatever the action is doing. 

Label

Then the label is where I like to get unique and make sure that I’m drilling down to just this one thing. So maybe that’s where I’ll have the actual CTA of the button, or which form it was that people filled out, or what product it was that they purchased. Again, think about information that you can get from other reports.

So for example, you don’t need to capture the URL that the event was recorded on as part of the label, because you can actually go in and look at all of your events by URL and see where that happened without having to capture it in that way. The important thing is that you have rules, that those rules are something that you can communicate to other people, and that they would then be able to name their own categories, actions, and labels in ways that were consistent with yours.
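One way to enforce a category/action/label convention like the one above is to route every event on the site through a single helper. This is a sketch, not a prescribed implementation: the dataLayer event name and variable keys here are assumptions, and your GTM container's trigger and tag configuration would need to match whatever names you actually choose.

```javascript
// Sketch: one helper that all site events pass through, so category,
// action, and label always follow the same naming conventions.
// In the browser this would be window.dataLayer; it's declared locally
// here so the sketch is self-contained.
const dataLayer = [];

function trackEvent(category, action, label) {
  dataLayer.push({
    event: 'ga-event',          // hypothetical custom event name for a GTM trigger
    eventCategory: category,    // site section / item type, e.g. "Forms"
    eventAction: action,        // the verb, e.g. "Submit"
    eventLabel: label,          // the unique bit, e.g. "Schedule a Demo - Footer"
  });
}

trackEvent('Forms', 'Submit', 'Schedule a Demo - Footer');
```

Note that the label carries only what other reports can't already tell you; the page URL, for instance, is available in GA's event reports without being baked into the label.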

Over time, as you do this and as you rename old events, you’re going to have a more and more usable body of data. You’re going to be increasingly comparing apples to apples. You’re not going to have some things where Click is the action and some things where Click is the label, or things that should be in one category that are in two or three categories. Over time you’re going to have a much more usable and controllable body of event data.

Be consistent

Then you need to be ruthless about consistency with usage of these naming conventions. There will be no just setting up an event real quick. Or, in fact, there will be just setting up an event real quick, but it will be using these rules that you have very thoroughly outlined and communicated to everybody, and that you are then checking up to make sure everything is still tracking the same way. A big thing to watch for when you’re being ruthless about consistency is capitalization.

Capitalization in category, action, and label in event tracking will come back as different things. Capital “C” and lowercase “c” “category” are two different things. So make sure as you’re creating new events that you have some kind of standardization. Maybe the first letter is always capitalized. Maybe nothing is ever capitalized.

It doesn’t matter what it is as long as it’s all the same. 
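One low-effort way to guarantee that consistency is to normalize casing in code before anything is sent, so "category" and "Category" can never become two separate rows in GA reports. The Title Case convention below is just one possible choice; lowercasing everything works equally well, as long as the same rule is applied everywhere.

```javascript
// Sketch: normalize casing before an event value is sent, so variants
// like "schedule a demo" and "Schedule a Demo " collapse into one value.
function normalize(value) {
  return value
    .trim()
    .toLowerCase()
    .replace(/\b\w/g, (c) => c.toUpperCase()); // Title Case each word
}

normalize('schedule a demo');   // "Schedule A Demo"
normalize('Schedule a Demo ');  // "Schedule A Demo"
```

If this runs inside the same helper every event passes through, the convention enforces itself instead of relying on everyone remembering the rule.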

Think about the future!

Then think about the future. Think about the day when you win the lottery and you move to a beautiful island in the middle of the sea and you turn off your phone and you never think about Google Analytics again and you’re lying in the sand and no one who works with you now can reach you. If you never came back to work again, could the people who work there continue the tracking work that you’ve worked so hard to set up?

If not, work harder to make sure that’s the case. Create documentation. Communicate your rules. Get everybody on the same page. Doing so will make this whole organization’s data collection better, more actionable, more usable for years to come. If you do come back to work tomorrow, if in fact you work here for the next 10 years, you’ve just set yourself up for success for the next decade.

Congratulations. So these are the things that I like to do when I first get into a GA account. Obviously, there are a lot of other things that you can do in GA. That’s why we all love GA so much. 

Homework

But to break it down and give you all some homework that you can do right now.

Check for orphan pages

Tonight, go in and check for orphan pages.

When it comes to Analytics, those might be different or they might be the same as orphan pages in the traditional sense. Make sure your code is on every page. 
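A quick way to run that check is to look for the tracking snippet in each page's HTML. This is a sketch under assumptions: the substrings below cover the common analytics.js and gtag.js loaders, and you would adjust them for whatever snippet your site actually uses.

```javascript
// Sketch: check whether a page's HTML contains a GA loader script.
// The matched substrings are assumptions; adapt to your own snippet.
function hasAnalyticsSnippet(html) {
  return (
    html.includes('google-analytics.com/analytics.js') ||
    html.includes('googletagmanager.com/gtag/js')
  );
}

hasAnalyticsSnippet(
  '<script src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXX-Y"></script>'
); // true
hasAnalyticsSnippet('<html><body>No tracking here</body></html>'); // false
```

In practice you would fetch every URL from your sitemap, run this against each response body, and flag any page that returns false as an Analytics orphan.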

Rename confusing goals and views (and remove unused ones)

Rename all your confusing stuff. Remove the views that you’re not using. Turn off the goals that you’re not using. Make sure everything is as up to date as possible. 

Guard your raw data

Don’t let anybody touch that raw data. Rename it “Do Not Touch” and then don’t touch it. 

Enforce your naming conventions

Create them. Enforce them. Protect them. They’re yours now.

You are the police of naming conventions. 

Annotate everything

Annotate as much as you can. Going forward you’re going to annotate all the time, because you can because you’re there, but you can still go back in time and annotate. 

Remove old users

One thing that I didn’t really talk about today but you should also do, when it comes to the general health of your Analytics, is go in and check who has user permissions to all of your different Analytics accounts.

Remove old users. Take a look at that once a quarter. It’s just good governance. 

Update sampling and timeouts

Then you’re going to update your sampling and your timeouts. If you can do all of these things and check back in on them regularly, you’re going to have a healthy, robust, and extremely usable Analytics ecosystem. Let me know what your favorite things to do in Analytics are. Let me know how you’re tracking events in GTM.

I want to hear all about everything you all are doing in Analytics. So come holler at me in the comments. Thanks.

Video transcription by Speechpad.com


The Economics of Link Building



Alex-T

Life has taught me that good things should be expensive — especially when it comes to any type of digital marketing services. If you’re not an expert, you can end up getting something far from what you’ve been expecting.

Here’s an example of “the best mascot image you can get for your event” that I paid for when organizing one of our first Digital Olympus events:

Just for reference, this is how our mascot looked originally:

My point is, just like working with freelance designers, hiring SEO consultants is only safe when you know exactly what you need and can control every step of the contract. This relates both to the scope of work and to the price of the contract.

I get really confused when I hear that the price of an average SEO agency contract starts at $1k USD. This number was first shared by Rand Fishkin in 2012 when he asked 600 agencies about their typical rates. Later, in 2018, that same number was published by Ahrefs when they did a similar survey.

As an SEO practitioner, I’m a bit disappointed with the stability of rates, but what bothers me the most is that this rate doesn’t really include link building. I can hardly imagine a successful SEO campaign for an SMB site without acquiring links. To back up my statement with some numbers, I’d like to mention Ross Hudgens’ claim that acquiring a good link on a top-notch site should cost $1k USD. Ironically, that’s the whole budget of an average SEO contract.

But to be honest, I don’t quite agree with those rates even though I truly respect the opinion. It doesn’t seem that realistic at scale: if you want to build 10 links, it would cost you $10k, a hundred links, $100k etc. That’s just plain impossible for the majority of companies. Don’t get me wrong, I would LOVE to work with those rates, but I can hardly imagine a business willing to pay one hundred thousand dollars for one hundred links. And to be completely fair, in some niches even a hundred links won’t move the needle.

See for yourself. Here’s one of our clients who thought that 100 links would help them:

And here’s what’s been going on with their organic traffic coming back to their blog from the links that we built:

To give you some context for their SEO situation, this client also wanted to rank for keywords related to link building. Below you can see one of my favorite examples of how fierce the competition is in the niche where people want to rank for such a generic term as “link building”:

This screenshot is screaming a simple fact out loud: you need at least 2,000 referring domains to outrank the pages that are currently at the top. Remember the link building rates I just named? How much would such work cost? Looks like you might need a new round of investment if the rate per link remains at $1k USD.

Now, look, I feel for you. Link building should be affordable for SMB sites because what’s the point in getting into it if the game’s been fixed to begin with? In this post, I’ll show you that link building shouldn’t cost an arm and a leg, and even a small site can do it with enough dedication put into solving the issue. I’ll walk you through some of the most popular link building strategies and explain why some of them aren’t economically attractive. And I’ll explain the costs of certain options (or in other words, why the hell does your link builder charge you so much?) and show you what benefits they can offer your business.

Link building landscape: Email outreach strategy to rule them all

Some time ago, I had quite a long flight to Bali where I was speaking at the DMSS conference. I had a chance to watch a few movies, including Tolkien, about an author who was among my favorites growing up. Sadly, the movie had a weak plot that doesn’t really begin to explain how Tolkien came to invent his own languages. However, it did bring up something to do with link building, believe it or not. Connections that you build throughout your life impact you a great deal. Put “your site” in place of “you” in the last sentence and voilà — here’s my point. If you follow the wrong path, you’ll surround yourself with bad connections (and, using my link building metaphor, bad links).

I’m always keen to discuss things from a philosophical point of view, but let’s get practical for a moment. How can you build high-quality links that will bring the best SEO results and will still be affordable?

Even though there are tons of link building strategies, on a general note, you can narrow them down to a few:

Links that are acquired through email outreach 

First of all, let’s clear up the terminology. I consider any strategy that involves emailing other websites to negotiate the possibility of getting a link to be email outreach. That includes such well-known strategies as broken link building, building links through guest posting, scraping SERPs and then pitching your content to those sites, and many others. They’re all email outreach because they all involve pitching something to someone through email. The only way some of those strategies differ from the others is that they require some sort of written content. For example, guest posting requires you to write a post, which is obvious. This significantly increases the cost of the work, and here we are, approaching the above-mentioned $1k USD. To be honest, guest posting is not my favorite strategy due to the many limitations it has (I’ll share them with you later in this post, so keep reading!)

Links from digital PR campaigns 

Even though this strategy also relies on sending emails, your recipients aren’t website owners but journalists. So, this strategy is quite a bit harder to execute: it requires newsworthy content, the necessary connections, and the ability to pitch it to the journos. Also, digital PR campaigns always cost 10X more than any traditional email outreach campaign. That’s because they bring links from media outlets that not only have great SEO value, but also let your brand connect with a broader audience.

Paid links

I don’t like these types of links and I don’t recommend anyone to try to acquire them. But I feel that I can’t skip this point as, in reality, paid links are in high demand. Some marketers are always trying to find the shortcut and look for sites that sell links.

There aren’t too many options out there when it comes to link building. Let me show you how some of the listed options aren’t economically sound or simply won’t bring any solid SEO boost.

What are the pros and cons of each strategy?

Below you’ll find a quick sum-up of the most significant pros and cons of each strategy. It’s important to mention that here, at my agency, we only build links through email outreach, as I believe it is by far the most cost-effective strategy. As for links built through digital PR, I used to do that, but in my experience the results weren’t quite worth their significant costs.

Paid links

Let’s start with the tricky option — paid links. Here I’m talking about the links that you can purchase through sponsored content and that won’t be labeled with a special tag. I’m not going to talk about the ethical side of this strategy, as that would require a separate post. I just want to state that I know tons of sites that do it.

Pros:

  • It’s very fast. You can build as many links as you’d like. The only limitation is your budget.

Cons:

  • Sites that sell links do it at scale. At some point, they will be penalized by Google.
  • Consequently, if those links are risky, you’ll have to disavow them some time later.
  • Most likely there will be a tiny number of sites with exceptionally high domain ratings.

Digital PR link building

A few years ago, I was one of the biggest digital PR fans around, but time passed, and now I clearly see what kind of limitations this approach bears. Digital PR is an essential part of the promotion strategy for businesses that have recently established their brand and want to build trust with their audience. Plus, links from media outlets will automatically give Google a signal that your site is a trustworthy business. The only downside is that the majority of businesses don’t have a big fat budget for a proper digital PR campaign. Here’s a good post from Gisele Navarro that shares some extra angles on why brands do and don’t need digital PR.

Pros:

  • Getting links from media outlets will eventually grow your domain authority and give Google enough reasons to believe that your brand is trustworthy.
  • They make your brand more visible to a broader audience.
  • Showing to your potential clients that your brand was featured in The New York Times or on BBC is cool. Like, really cool.

Cons:

  • It’s very, very expensive. The costs for an average digital PR campaign start from $30k–$40k USD.
  • This strategy requires specific content which is why it gets so pricey.
  • It takes a few months to build such links — to ideate and execute the campaign, gather attention, get coverage, etc.
  • The price per link is very high. Normally it revolves around $1k USD.

Email outreach link building

I believe this to be the best link building strategy that fits nearly every business’ needs, especially if your goal is to start getting traffic to already existing pages. And to top it off, its cost per link is affordable even for small and medium-sized businesses.

Pros:

  • You can build links to nearly any page (including your commercial pages).
  • The price per link doesn’t go through the roof (it varies from $100 to $500 USD depending on the referring site’s domain quality).
  • A lot of link building agencies even allow you to buy one link (however, we aren’t within that tier as we prefer quality over quantity).
  • It allows you to build relationships with your industry peers.
  • It makes your brand more visible to your target audience.
  • It helps you get links from top-notch industry sites.

Cons:

  • Requires some special skills and knowledge (an average email has only an 8.5% open rate which makes it quite a hard practice).
  • Such links can’t be built overnight. However, the time they take is less than the PR-based links.
  • Such links have some hidden reputational risks (if you do it the wrong way, sending tons of outreach emails = being potentially seen as a spammer).

To sum it up, there are many reasons to believe that link building through email outreach is your go-to strategy if your main goal is to get more organic traffic from Google. The next big question is how many links you need and what it’s going to cost you.

How to estimate the number of links you need

A few weeks ago, I was lucky to listen to Robbie Richards’ speech at the DMSS conference where he confirmed my link building formula. If you’re competing with a site with similar on-site characteristics (both sites are https, mobile-friendly, fast, Google considers them both a brand plus a few other factors) then, in order to outrank it in search, you need to keep in mind only two factors*:

  • Your domain’s authority should be roughly the same as that of the pages that you want to outrank;
  • You should have the same or a bit more referring domains compared to the pages that currently outrank you.

*In particular cases, internal linking plays a huge role. Not that long ago, my good friend Joe Williams published a great post where he goes into more detail on the topic.

This formula might vary based on your estimated domain authority (DA) or on your domain rating (DR). If you have a higher domain score than the pages that you want to outrank, then you’ll just need fewer links. But if your DR is lower, you’ll need significantly more links, and that’s something you need to account for.

Here’s some context: let’s take a look at my own site. Digital Olympus is not doing very well in the SERPs because of its DR. On average, all sites that are ranking for search queries related to email outreach have a domain rating of 70–80, while our own site is only 56. So, this means that we need at least two times more links referring to our pages in comparison to the sites that are above us in search. For instance, to get this page to the top of search results for “email outreach,” we need to build around 200 links. As you can see from a screenshot below, the rest of the URLs have 100+ links, so we need to double that number to stand a chance:

Another approach to this situation would require us to calculate how many links we need to get to an overall domain rating of 70. That’s around 250 links from sites with a DR higher than 30 (I don’t consider sites with a lower DR to be of good quality).
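The rule of thumb above can be sketched as a rough back-of-the-envelope calculation. To be clear, this is not a precise formula: the 2x multiplier for a lower domain rating is the author's heuristic, and all numbers here are illustrative.

```javascript
// Rough sketch of the link-gap estimate described above.
// Heuristic: match the competitor's referring domains if your DR is
// comparable; aim for roughly double if your DR is lower.
function estimateLinksNeeded(yourDr, competitorDr, competitorRefDomains) {
  const multiplier = yourDr < competitorDr ? 2 : 1;
  return competitorRefDomains * multiplier;
}

// Digital Olympus example from the text: DR 56 vs. competitors around 75,
// with rival URLs at 100+ referring domains.
estimateLinksNeeded(56, 75, 100); // → 200
```

Treat the output as a planning figure for budgeting, not a guarantee; content quality, relevance, and internal linking all shift the real number.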

Once you know the necessary number of links to build, you should decide whether you’re capable of doing it on your own. I’m not trying to convince you to hire an agency, but if you’ve never done link building, it’s going to take around a year to set up the process and start building from 10–20 links a month, realistically speaking.

I don’t want to demotivate you, but such tasks are truly skill-demanding. A few years ago, I could barely build several links per month. So, if you have a budget and need links right away, it makes sense to hire someone to help you. The main reason why our clients hire us is that we’ve built relationships in the industry. We’re known, which allows us to build links fast.

What’s the right price for an email outreach link building campaign?

Different agencies have different rates when it comes to link building through email outreach. Because it’s a time-consuming strategy, pricing very much depends on the agency’s approach, which is always unique even when it relies on common practices. Some charge per campaign, some per link, and some prefer a monthly minimum.

For example, the people at LinksHero charge from $3k USD and promise to build around 5–15 links per month:

If you want to pay as you go and don’t want to be bound by any monthly commitments, then DFYlinks.com is your best choice. Their link building services are highly recommended by such well-known experts as Cyrus Shepard, Ryan Stewart, and many others. DFYlinks sells guest post links, and their cheapest option will cost you only $160 USD:

Another link building agency trusted by such industry experts as Ryan Stewart and Steven Kang is Authority Builders. Even though they don’t have a pricing page, I had a chat with their founder, Matt Diggity, and he said that their basic rate is $170–$180 USD.

If you’re wondering where my agency stands, we’re among those that charge per acquired link, after the fact. I think it’s the best option for small and mid-size businesses, as it gives you more freedom and lets you build links at your own pace.

Our rate is somewhere in the middle, even though the quality of our links is above average, as we source them from corporate and top-notch blogs. Plus, we don’t send mass emails, so you won’t face the associated reputation risks. We’ve spent the last couple of years building relationships with people, so now we simply reach out to them instead of doing mass email blasts. We charge from $300 USD per link, so you can easily calculate your overall budget to build, say, a hundred links. However, we work only in the B2B niche — specialization is important to consider before you choose an agency.

So that’s the rundown on how much it costs to build links. Hopefully, you can now estimate the budget required to build the desired number of links to your site. And let me just say this: for businesses that have already built some trust and visibility, getting even sixty new, quality referring domains can make all the difference and help them achieve sustainable organic traffic growth:
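The arithmetic behind these estimates (close the link gap to the pages above you, then price it at a per-link rate) is simple enough to sketch in a few lines. The figures below are just the illustrative numbers used in this article — a ~200-link competitor target and a $300-per-link rate — not universal benchmarks, and real campaigns rarely follow such a clean formula.

```python
# A minimal budget estimator for an email outreach campaign.
# Rates and link counts are example figures, not benchmarks.

def links_needed(current_links: int, competitor_links: int) -> int:
    """How many links to build to match competitors' referring link count."""
    return max(competitor_links - current_links, 0)

def budget(links: int, rate_per_link: float) -> float:
    """Total cost at a flat per-link rate."""
    return links * rate_per_link

# Example: competing pages have ~200 referring links, ours has ~100,
# and the agency charges $300 per link.
gap = links_needed(current_links=100, competitor_links=200)
print(gap)                 # 100
print(budget(gap, 300.0))  # 30000.0
```

The same two functions also cover the domain-rating approach: plug in the ~250 links estimated earlier and your agency’s rate to get a ceiling for that route.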

That’s a lot to take in, I know. But there’s more to talk about. For example, there are tons of hidden benefits to email outreach delivered the right way. Just stay with me, we’re getting there.

How to get more from every link that you earn

I love handmade email outreach link building as it allows you to do more than build links. You’re also building relationships that can help you move the needle far beyond link building alone.

People who are your link building partners today might organize a conference tomorrow and invite you to speak, making you more visible within your niche. That’s not as rare as it may seem! And if you’re curious, yes, I’m referring to our own experience: besides doing link building, we also run our own digital marketing conference, Digital Olympus (which, by the way, will next be held in Krakow on April 5, 2020).

Another benefit worth mentioning is that the companies you connect with during your email outreach campaigns are also investing in growing their businesses. As a result, a site with a domain score of 50 today might reach 70 in a few months. In other words, you’re paying for something that may become much more valuable in the future, and that’s what makes email outreach link building epic!

Here’s a list of sites from which we built links for one of our clients. You can see how their domain scores have grown since May 2019:

Start working on a link building profile that will rule them all!

Your next step is up to you, but in my experience, it’s important to start working on links as early as possible. Otherwise, there’ll be a huge gap between your site and your competitors who have been working on link building for a while.

Also, I know that the majority of businesses would like to run their link building campaigns in-house. Starting early gives you a leg up to build your processes and test things. If you decide to go that route, please don’t follow the “best practices,” as 99% of them are hopelessly outdated. Most of those strategies were abandoned years ago by the link building community, and only rookies still fall for them.

The list of no-BS resources

If you’re looking for more information about DIY link building, here are a few useful posts that won’t turn you into a spammer who asks for a link because “they’ve been following another person’s blog for ages” (that’s a link builders’ private joke):

Conclusion

I’m not sure what else there is for me to say to convince you that email outreach is the way to do link building. So I won’t try to convince you anymore — I’ll just sum up what I’ve told you already.

First of all, assess your situation and decide what’s more important for you at the moment: building links fast or building your own in-house link acquisition process. If you choose the first option, calculate the number of links you need to build, estimate your budget, and find a reputable agency to help you out. If you settle on the latter, get ready to spend some time building relationships, mastering your outreach email copy, and streamlining the creation of valuable content.

But don’t worry — in the end, it’s all going to be worth it.




6 Local Search Marketing DIY Tips for the Crafting Industry



MiriamEllis

Think crafting is kids’ stuff? Think again. The owners of quilting, yarn, bead, fabric, woodworking, art supply, stationers, edible arts, and related shops know that:

  • The crafting industry generated $44 billion in 2016 in the US alone.
  • 63% of American households engage in at least one crafting project annually, while more than one in four participate in 5+ per year.
  • The top three craft store chains in the country (Michaels, JOANN, Hobby Lobby) operate nearly 3,000 locations, just among themselves.
  • There are an estimated 3,200 US storefronts devoted to quilting alone. Thousands more vend everything from the stuff of ancient arts (knitting, with a 1,000-year history) to the trendy and new (unicorn slime, which, yes, is really a thing).

Our local search marketing industry has devoted abundant time to advising major local business categories over the past couple of decades, but crafting is one substantial retail niche we may have overlooked. I’d like to rectify this today.

I feel personally inspired by craft store owners. Over the years, I’ve learned to sew, quilt, embroider, crochet, knit, and bead, and before I became a local search marketer, I was a working fine artist. I even drafted a sewing pattern once that was featured in a crafting magazine. Through my own exploration of arts and crafts, I’ve come to know so many independent business owners in this industry, and have marketed several of them. These are gutsy people who take risks, work extremely hard for their living, and often zestfully embrace any education they can access about marketing.

Today, I’m offering my six best marketing tips for craft retailers for a more successful and profitable 2020.

First, a quick definition of local search marketing

Your store is your location. Your market is made up of all of your customers’ locations. Anything you do to promote your location to the market you serve is considered local search marketing. Your market could be your neighborhood, your city, or a larger local region. Local search marketing can include both offline efforts, like hanging eye-catching signage or getting mentioned in local print news, and online efforts, like having a website, building listings on local business listing platforms, and managing customer reviews.

Whatever you do to increase local awareness about your location, interact online with customers, bring them through your front door, serve them in-store, and follow up with them afterwards in an ongoing relationship counts. You’re already doing some of this, and in the words of Martha Stewart, “It’s a good thing.” But with a little more attention and intention, these six tips can craft even greater success for your business:

1. Take a page from my Google scrapbook

To engage in local search marketing is to engage with Google. Since it first started mapping out communities and businesses in 2004, the search engine giant has come to dominate the online local scene. There are other important online platforms, but to be in front of the maximum number of potential customers and to compete for rankings in Google’s local search results, your crafting business needs to:

  1. Read the Guidelines for representing your business on Google and follow them to the letter. This set of rules tells you what you can and can’t do in the Google My Business product. Listing your business incorrectly or violating the guidelines in any way can result in listing suspension and other negative outcomes.
  2. Create your free Google My Business listing once you’ve read the guidelines. Here’s Moz’s cheat sheet to all of the different fields and features you can fill out in your listing. Fill out as many fields as you possibly can and then Google will take you through the steps of verifying your listing.
  3. Reckon with Google’s power. As our scrapbook says, Google owns your Google My Business listing, but you can take a lot of control over some of its contents. Even once you’ve verified your listing, it’s still open to suggested edits from the public, questions, reviews, user-uploaded photos and other activities. Main takeaway: your GMB listing is not a one-and-done project. It’s an interactive platform that you will be monitoring and managing from here on out.

2. Weave a strong web presence

Your Google My Business listing will likely be the biggest driver of traffic to your craft store, but you’ll want to cast your online net beyond this. Once you feel confident about the completeness and ongoing management of your GMB listing, there are 4 other strands of Internet activity for you to take firm hold of:

Your website

At bare minimum, your website should feature:

  • Your complete and accurate name, address, phone number, email, and fax number
  • Clear written driving directions to your place of business from all points of entry
  • A good text description of everything you sell and offer
  • An up-to-date list of all upcoming classes and events
  • Some high-quality photos of your storefront and merchandise

A more sophisticated website can also feature:

  • Articles and blog posts
  • Full inventory, including e-commerce shopping
  • Customer reviews and testimonials
  • Online classes, webinars and video tutorials
  • Customer-generated content, including photos, forums, etc.

The investment you make in your website should be based on how much you need to do to create a web presence that surpasses your local competitors. Depending on where your store is located, you may need only a modest site, or may need to go further to rank highly in Google’s search engine results and win the maximum number of customers.

Your other local listings

Beyond Google, your business listings on other online platforms like Yelp, Facebook, Bing, Apple Maps, Factual, Foursquare, and Infogroup can ensure that customers are encountering your business across a wide variety of sites and apps. Listings in these local business information indexes are sometimes referred to as “structured citations” and you have two main choices for building and maintaining them:

  • You can manually build a listing on each important platform and check back on it regularly to manage your reviews and other content on it, as well as to ensure that the basic contact info hasn’t been changed by the platform or the public in any way.
  • You can invest in local listings management software like Moz Local, which automates creation of these listings and gives you a simple dashboard that helps you respond to reviews, post new content, and be alerted to any emerging inaccuracies across key listing platforms, all in one place. This option can be a major time saver and deliver welcome peace of mind.

Structured citation management is critical to any local business for two key reasons. Firstly, it can be a source of valuable consumer discovery and new customers for your shop. Secondly, it ensures you aren’t losing customers to frustrating misinformation. One recent survey found that 22% of customers ended up at the wrong location of a business because online information about it was incorrect, and that 80% of them lost trust in the company when encountering such misinformation. Brick-and-mortar stores can’t afford to inconvenience or lose a single customer, and that’s why managing all your listings for accuracy is worth the investment of time/money.

Your unstructured citations

As we’ve just covered, a formal listing on a local business platform is called a “structured citation.” Unstructured citations, by contrast, are mentions of your business on any type of website: local online news, industry publications, a crafter’s blog, and lists of local attractions all count.

Anywhere your business can get mentioned on a relevant online publication can help customers discover you. And if trusted, authoritative websites link to yours when they mention your business, those links can directly improve your search engine rankings.

If you’re serving a market with little local competition, you may not need to invest a ton of time in seeking out unstructured citation opportunities. But if a nearby competitor is outranking you and you need to get ahead, earning high-quality mentions and links can be the best recipe for surpassing them. All of the following can be excellent sources of unstructured citations:

  • Sponsoring or participating in local events, organizations, teams, and causes
  • Hosting newsworthy happenings that get written up by local journalists
  • Holding contests and challenges that earn public mention
  • Joining local business organizations
  • Cross promoting with related local businesses
  • Getting featured/interviewed by online crafting magazines, fora, blogs, and videos

Read The Guide to Building Linked Unstructured Citations for Local SEO for more information.

Your social media presence

YouTube, Instagram, Pinterest, Facebook, Twitter, crafting forums…choices abound! How much time and where you invest in social media should be determined by two things:

  • What your local competition is doing
  • Where your potential customers spend social time

If your shop is literally the only game in town, you may not need to win at social to win business, but if you have multiple competitors, strategic social media investments can set you apart as the most helpful, most popular local option.

In your social efforts, emphasize sharing, showing and telling — not just selling. If you keep this basic principle in mind, the DIY revolution is at your fingertips, waiting to be engaged. One thing I’ve learned about crafters is that they will travel. Quilting retreats, knitting tours, and major craft expos prove this.

If you or a staff member happen to create one of the most-viewed videos on YouTube for the three-needle bind off or crafting felt succulents, it could inspire travelers to put your shop on their bucket list. One of my favorite knitters in the world films the English/Swedish language Kammebornia podcast which is so idyllic, it would certainly inspire me to visit the island of Gotland if I were ever anywhere nearby. Think what you can do via social media to make your shop an aspirational destination for even non-local customers.

3. Abandon fear of ripping out mistakes (and negative reviews)

As the old adage goes, “Good knitters are good rippers.” When you drop a stitch in an important project, you have to know how to see it, patiently rip out stitches back to it, and correct the mistake as skillfully as you can. This exact same technique applies to managing the reviews customers leave you online. When your business “drops the ball” for a customer and disappoints them, you can often go back and correct the error.

Reviews = your business’ reputation. It’s as simple (and maybe scary) as that. Consider these statistics about the power of local business reviews:

  • 87% of consumers read local business reviews (BrightLocal)
  • 27% of people who look for local information are actually seeking reviews about a particular store. (Streetfight Mag)
  • 30% of consumers say that seeing business owners’ responses to reviews is key to how they judge the company. (BrightLocal)
  • 73.8% of customers are either likely or extremely likely to continue doing business with a brand that resolves their complaints. (GatherUp)

To be competitive, your craft store must earn reviews. Many business owners feel apprehensive about negative reviews, but the good news is:

  • You can “rip out” some negative reviews simply by responding well to them. The owner response function actually makes reviews conversational, and a customer you’ve made things right with can edit their initial review to a more positive one.
  • Most consumers expect a business to receive some negative reviews. Multiple surveys find that a perfect 5-star rating can look suspicious to shoppers.
  • If you continuously monitor reviews, either manually or via convenient software like Moz Local that alerts you to incoming reviews, there is little to fear, because customers are more forgiving than you might have thought.

For a complete tutorial, read How to Get a Customer to Edit Their Negative Review. And be sure you are always doing what’s necessary to earn positive reviews by delivering excellent customer service, keeping your online listings accurate, and proactively asking customers to review you on Google and other eligible platforms.

4. Craft what online can’t — 5 senses engagement

Consider these three telling statistics:

  • Over half of consumers prefer to shop in-store to interact with products. (Local Search Association)
  • 80% of U.S. disposable income is spent within 20 miles of home (Access Development)
  • By 2021, mobile devices alone will influence $1.4 trillion in local sales. (Forrester)

There may be no retailer left in America who hasn’t felt the Amazon effect, but as a craft shop owner, you have an amazing advantage so many other industries lack. Crafters want to touch textiles and fibers before buying, to hold fabrics up to their faces, to see true colors, and to handle highly tactile merchandise like beads and wood. When it comes to fulfilling the five senses, online shopping is miles behind what you can provide face-to-face.

And it’s not just customers’ desire to interact with products that sets you apart — it’s their desire to interact with experts. As pattern designer Amy Barickman of Indygo Junction perfectly sums it up:

“To survive and thrive, brick-and-mortar stores must now provide experiences that cannot be replicated online.”

The expertise of your staff, the classes you hold, the tie-in services you offer, the sensory appeal of your storefront, and the time you take to build relationships with customers all contribute to creating valued interactions that the Internet just can’t replace.

This advantage ties in deeply with the quality of your staff hiring and training practices. One respected survey found that 57% of customer complaints stem from employee behavior and poor service. Specifically in the crafting industry, staff who are expert with the materials being sold are worth their weight in gold. Be prepared to assist both seasoned crafters and the new generations of customers who are just now embracing the creative industries.

Play to your strengths. In every way that you market your business, emphasize hands-on experiences to draw people off their computers and into your store. In every ad you run, blog post you write, phone call you answer, listing you build, invite people to come in to engage all five senses at your place of business. Soft lighting and music, a tea kiosk, fragrant fresh flowers, some comfy chairs, and plenty of tactile merchandise are all within your reach, making shopping a pleasure which customers will want to enjoy again and again.

5. Learn to read your competitors’ patterns

Need to know: there are no #1 rankings on Google. Google customizes the search engine results they show to each person, based on where that person is physically located at the time they look something up on their phone or computer. You can walk or drive around your city, performing the identical search, and watch the rankings change in the:

  • Local packs
  • Maps
  • Organic results

If you’re doing business in an area with few competitors, you may only need to be aware of one or two other companies. But when competition is more dense and diverse, or you operate multiple locations, the need for competitive analysis can grow exponentially. And for each potential customer, the set of businesses you’re competing with changes, based on that customer’s location. 

How can you visualize and strategize for this? You have two options:

  1. If competition is quite low, you can manually find your true local competitors with this tutorial. It includes a free spreadsheet for helping you figure out which businesses are ranking for your most desired searches for the customers nearest you. This is a basic, doable approach for very small businesses.
  2. If your environment is competitive or you are marketing a large, enterprise craft store brand, you can automate analysis with software. Local Market Analytics from Moz, for example, is designed to do all the work of finding true competitors for you. This groundbreaking product multi-samples searchers’ locations and helps you analyze your strongest and weakest markets. Currently, Local Market Analytics focuses on organic results, and it will soon include data on local pack results, too.

Once you’ve completed this first task, you have one more step ahead if you find that some of your competitors are outranking you. You’ll want to stack up your metrics against theirs to analyze why they are surpassing you. Good news: we’ve got another tutorial and free spreadsheet for this project! What emerges from the work is a pattern of strengths and weaknesses that signal why Google is ranking some businesses ahead of others.

Knowing who your competitors are and gathering metrics about why they may be outranking you is what empowers you to create a winning local search marketing strategy. Whether you find you need more reviews, a stronger website, or some other improvement, you’ll be working from data instead of making random guesses about how to grow your business.

6. Open your grab bag

Every craft store and craft fair has its grab bags, and who can resist them? I’d like to close out this article by spilling a trove of marketing goodies into your hands. Sort through them and see if there’s a fresh idea in here that could really work for your business to take it to the next level.

  • Be more! This year, Michaels has partnered with UPS at 1,100 locations in a convenience experiment. You run a craft store, but could it be more? Is there something lacking in your local market that your shop could double as? A meeting house, a lending library, an adult classroom, a tea shop, a Wi-Fi spot, a holiday boutique, a place for live music?
  • Tie in! Your quilt shop can support apparel sewers with a few extra solids, textiles, and some fun patterns. Your yarn shop can find a nook for needle arts. Your woodshop could offer wooden needles for knitting and crochet, wooden hoops for embroidery, wood buttons, stamps, and a variety of wood boxes for crafters. You may sell everything needed for beading jewelry, but do you have the necessary supplies to bead clothing? Crafters are hungry for local resources for every kind of project, especially in rural areas, suburbs, and other communities where there are few businesses.
  • Teach! There are so many arts and crafts that are incredibly challenging to learn without being shown, face-to-face. Not everyone is lucky enough to have a grandparent or parent to demo exactly how you do a long tail cast on or master the dovetail joint. If you want to sell merchandise, show how to use it. Look at JOANN, which just unveiled its new concept store in Columbus, Ohio, centered on a “Creators Studio”. One independent fabric shop near me devotes half its floorspace to classes for children — the next generation of customers!
  • Email! Don’t make the mistake of thinking email is old school. Statistics say that 47% of marketers point to email marketing as delivering the highest ROI and 69% of consumers prefer to receive local business communications via email. If you’re one of the 50% of small business owners who hasn’t yet taken the leap of creating an email newsletter, do it!
  • Survey! Don’t guess what to stock or how to do business. Directly ask your customers via email, social media, and in-store surveys what they really want. I’ve seen businesses abandon scented products after finding they were deterring migraine-prone shoppers. I’ve seen others implement special ordering services to source hard-to-access items in-store instead of letting consumers drift away to the online world. Giving customers what they want is the absolute key to your store’s success.
  • Go green! Whether it’s powering your shop with solar, supporting upcycling crafts, or stocking organic and sustainable inventory, embrace and promote every green practice you can engage in. Numerous studies cite the younger generations as being particularly defined by responsible consumption. Demonstrate solidarity with their aspirations in the way you operate and market.

Doers, makers, creators, crafters, artisans, artists… your business exists to support their drive to embellish personal and public life. When you need to grow your business, you’ll be drawing from the same source of inspiration that all creative people do: the ability to imagine, to envision a plan, to color outside the lines, to gather the materials you need to make something great.

Local search marketing is a template for ensuring that your business is ready to serve every crafter at every stage of their journey, from the first spark of an idea, to discovery of local resources, to transaction, and beyond. I hope you’ll take the template I’ve sketched out for you today and make it your own for a truly rewarding 2020.




Simple Spam Fighting: The Easiest Local Rankings You’ll Ever Earn



MiriamEllis

Image credit: Visit Lakeland

Reporting fake and duplicate listings to Google sounds hard. Sometimes it can be. But very often, it’s as easy as falling off a log, takes only a modest session of spam fighting and can yield significant local ranking improvements.

If your local business/the local brands your agency markets aren’t using spam fighting as a ranking tactic because you feel you lack the time or skills, please sit down with me for a sec.

What if I told you I spent about an hour yesterday doing something that moved a Home Depot location up 3 spots in a competitive market in Google’s local rankings less than 24 hours later? What if, for you, moving up a spot or two would get you out of Google’s local finder limbo and into the actual local pack limelight?

Today I’m going to show you exactly what I did to fight spam, how fast and easy it was to sweep out junk listings, and how rewarding it can be to see results transform in favor of the legitimate businesses you market.

Washing up the shady world of window blinds

Image credit: Aqua Mechanical

Who knew that shopping for window coverings would lead me into a den of spammers throwing shade all over Google?

The story of Google My Business spam is now more than a decade in the making, with scandalous examples like fake listings for locksmiths and addiction treatment centers proving how unsafe and unacceptable local business platforms can become when left unguarded.

But even in non-YMYL industries, spam listings deceive the public, waste consumers’ time, inhibit legitimate businesses from being discovered, and erode trust in the spam-hosting platform. I saw all of this in action when I was shopping to replace some broken blinds in my home, and it was such a hassle trying to find an actual vendor amid the chaff of broken, duplicate, and lead gen listings, I decided to do something about it.

I selected an SF Bay area branch of Home Depot as my hypothetical “client.” I knew they had a legitimate location in the city of Vallejo, CA — a place I don’t live but sometimes travel to, thereby excluding the influence of proximity from my study. I knew that they were only earning an 8th place ranking in Google’s Local Finder, pushed down by spam. I wanted to see how quickly I could impact Home Depot’s surprisingly bad ranking.

I took the following steps, and encourage you to take them for any local business you’re marketing, too:

Step 1: Search

While located at the place of business you’re marketing, perform a Google search (or have your client perform it) for the keyword phrase for which you most desire improved local rankings. Of course, if you’re already ranking as well as you want to for the searchers nearest you, you can still follow this process to investigate somewhat more distant areas within your potential reach where you want to increase visibility.

In the results from your search, click on the “more businesses” link at the bottom of the local pack, and you’ll be taken to the interface commonly called the “Local Finder.”

The Local Finder isn’t typically 100% identical to the local pack in exact ranking order, but it’s the best place I know of to see how things stand beyond the first 3 results that make up Google’s local packs, telling a business which companies they need to surpass to move up towards local pack inclusion.

Find yourself in the local finder. In my case, the Home Depot location was at position 8. I hope you’re somewhere within the first set of 20 results Google typically gives, but if you’re not, keep paging through until you locate your listing. If you don’t find yourself at all, you may need to troubleshoot whether an eligibility issue, suspension, or filter is at play. But, hopefully that’s not you today.

Step 2: Create a spreadsheet

Next, create a custom spreadsheet to record your findings. Or, much easier, just make a copy of mine!

Populate the spreadsheet by cutting and pasting the basic NAP (name, address, phone) for every competitor ranking above you, and include your own listing, too, of course! If you work for an agency, you’ll need to get the client to help you with this step by filling the spreadsheet out based on their search from their place of business.

In my case, I recorded everything in the first 20 results of the Local Finder, because I saw spam both above and below my “client,” and wanted to see the total movement resulting from my work in that result set.

Step 3: Identify obvious spam

We want to catch the easy fish today. You can go down rabbit holes another day, trying to ferret out weirdly woven webs of lead gen sites spanning the nation, but today, we’re just looking to weed out listings that clearly, blatantly don’t belong in the Local Finder. 

Go through these five easy steps:

  1. Look at the Google Streetview image for each business outranking you.
    Do you see a business with signage that matches the name on the listing? Move on. But if you see a house, an empty parking lot, or Google is marking the listing as “location approximate”, jot that down in the Notes section of your spreadsheet. For example, I saw a supposed window coverings showroom that Streetview was locating in an empty lot on a military base. Big red flag there.
  2. Make note of any businesses that share an address, phone number, or very similar name.
    Make note of anything with an overly long name that seems more like a string of keywords than a brand. For example, a listing in my set was called: Custom Window Treatments in Fairfield, CA Hunter Douglas Dealer.
  3. For every business you noted down in steps one and two, get on the phone.
    Is the number a working number? If someone answers, do they answer with the name of the business? Note it down. Say, “Hi, where is your shop located?” If the answer is that it’s not a shop but a mobile business, note that down. Finally, if anything seems off, check the Guidelines for representing your business on Google to see what’s allowed in the industry you’re investigating. For example, it’s perfectly okay for a window blinds dealer to operate out of their home, but if they’re operating out of five homes in the same city, it’s likely a violation. In my case, just a couple of minutes on the phone identified multiple listings with phone numbers that were no longer in service.
  4. Visit the iffy websites. 
    Now that you’re narrowing your spreadsheet down to a set of businesses that are either obviously legitimate or “iffy,” visit the websites of the iffy ones. Does the name on the listing match the name on the website? Does anything else look odd? Note it down.
  5. Highlight businesses that are clearly spammy.
    Your dive hasn’t been deep, but by now, it may have identified one or more listings that you strongly believe don’t belong because they have spammy names, fake addresses, or out-of-service phone numbers. My lightning-quick pass through my data set showed that six of the twenty listings were clearly junk. That’s 30% of Google’s info being worthless! I suggest marking these in red text in your spreadsheet to make the next step fast and easy.

Step 4: Report it!

If you want to become a spam-fighting ace later, you’ll need to become familiar with Google’s Business Redressal Complaint Form which gives you lots of room for sharing your documentation of why a listing should be removed. In fact, if an aggravating spammer remains in the Local Finder despite what we’re doing in this session, this form is where you’d head next for a more concerted effort.

But, today, I promised the easiness of falling off a log, so our first effort at impacting the results will simply focus on the “suggest an edit” function you’ll see on each listing you’re trying to get rid of. This is how you do it:

After you click the “suggest an edit” button on the listing, a popup will appear. If you’re reporting something like a spammy name, click the “change name or other details” option and fill out the form. If you’ve determined a listing represents a non-existent, closed, unreachable, or duplicate entity, choose the “remove this place” option and then select the dropdown entry that most closely matches the problem. You can add a screenshot or other image if you like, but in my quick pass through the data, I didn’t bother.

Record the exact action you took for each spam listing in the “Actions” column of the spreadsheet. In my case, I was reporting a mixture of non-existent buildings, out-of-service phone numbers, and one duplicate listing with a spammy name.

Finally, hit the “send” button and you’re done.

Step 5: Record the results

Within an hour of filing my reports with Google, I received a confirmation email for 5 of the 6 entries I had flagged.

The only entry I received no email for was the duplicate listing with the spammy name. But I didn’t let this worry me. I went about the rest of my day and checked back in the morning.

I’m not fond of calling out businesses in public. Sometimes, there are good folks who are honestly confused about what’s allowed and what isn’t. Also, I sometimes find screenshots of the Local Finder overwhelmingly cluttered and endlessly long to look at. Instead, I created a bare-bones representational schematic of the total outcome of my hour of spam-fighting work.

The red markers are legit businesses. The grey ones are spam. The green one is the Home Depot I was trying to positively impact. I attributed a letter of the alphabet to each listing, to better help me see how the order changed from day one to day two. The lines show the movement over the course of the 24 hours.

The results were that:

  • A stayed the same, and B and C swapped positions, though that was unlikely to be due to my work; local rankings can fluctuate like this from hour to hour.
  • Five out of six spam listings I reported disappeared. The keyword-stuffed duplicate listing which was initially at position K was replaced by the brand’s legitimate listing one spot lower than it had been.
  • The majority of the legitimate businesses enjoyed upward movement, with the exception of position I which went down, and M and R which disappeared. Perhaps new businesses moving into the Local Finder triggered a filter, or perhaps it was just the endless tide of position changes and they’ll be back tomorrow.
  • Seven new listings made it into the top 20. Unfortunately, at a glance, it looked to me like 3 of these new listings were new spam. Dang, Google!
  • Most rewardingly, my hypothetical client, Home Depot, moved up 3 spots. What a super easy win!
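The day-one/day-two schematic boils down to two ordered lists of listing labels, so the movement is easy to compute. A short sketch (the letters here are invented, not the actual result set):

```python
def rank_changes(day1, day2):
    """Map each day-one label to its movement (positive = moved up, None = dropped
    out of the result set), and list brand-new entrants."""
    changes = {}
    for old_pos, label in enumerate(day1, start=1):
        if label in day2:
            changes[label] = old_pos - (day2.index(label) + 1)
        else:
            changes[label] = None  # fell out of the Local Finder
    newcomers = [label for label in day2 if label not in day1]
    return changes, newcomers

day1 = ["A", "B", "C", "D", "E", "F"]  # day-one order (toy data)
day2 = ["A", "C", "B", "F", "G", "H"]  # day-two order after reporting spam
changes, newcomers = rank_changes(day1, day2)
print(changes)                 # F moved up two spots; D and E disappeared
print("new entrants:", newcomers)
```

The same function run over the real 20-listing set is what surfaced the movements described above.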

Fill out the final column in your spreadsheet with your results.

What we’ve learned

You battle upstream every day for your business or clients. You twist yourself like a paperclip complying with Google’s guidelines, seeking new link and unstructured citation opportunities, straining your brain to shake out new content, monitoring reviews like a chef trying to keep a cream sauce from separating. You do all this in the struggle for better, broader visibility, hoping that each effort will incrementally improve reputation, rankings, traffic, and conversions.

Catch your breath. Not everything in life has to be so hard. The river of work ahead is always wide, but don’t overlook the simplest stepping stones. Saunter past the spam listings without breaking a sweat and enjoy the easy upward progress!

I’d like to close today with three meditations:

1. Google is in over their heads with spam

Google is in over their heads with spam. My single local search for a single keyword phrase yielded 30% worthless data in their top local results. Google says they process 63,000 searches per second and that as much as 50% of mobile queries have a local intent. I don’t know any other way to look at Google than as having become an under-regulated public utility at this point.

Expert local SEOs can spot spam listings in query after query, industry after industry, but Google has yet to staff a workforce or design an algorithm sufficient to address bad data that has direct, real-world impacts on businesses and customers. I don’t know if they lack the skills or the will to take responsibility for this enormous problem they’ve created, but the problem is plain. Until Google steps up, my best advice is to do the smart and civic work of watchdogging the results that most affect the local community you serve. It’s a positive not just for your brand, but for every legitimate business and every neighbor near you.

2. You may get in over your head with spam

You may get in over your head with spam. Today’s session was as simple as possible, but GMB spam can stem from complex, global networks. The Home Depot location I randomly rewarded with a 3-place jump in Local Finder rankings clearly isn’t dedicating sufficient resources to spam fighting or they would’ve done this work themselves.

But the extent of spam is severe. If your market is one that’s heavily spammed, you can quickly become overwhelmed by the problem. In such cases, I recommend that you:

  • Read this excellent recent article by Jessie Low on the many forms spam can take, plus some great tips for more strenuous fighting than we’ve covered today.
  • Follow Joy Hawkins, Mike Blumenthal, and Jason Brown, all of whom publish ongoing information on this subject. If you wade into a spam network, I recommend reporting it to one or more of these experts on Twitter, and, if you wish to become a skilled spam fighter yourself, you will learn a lot from what these three have published.
  • If you don’t want to fight spam yourself, hire an agency with the smarts to offer this as a service.
  • You can also report listing spam to the Google My Business Community Forum, but it’s a crowded place and it can sometimes be hard to get your issue seen.
  • Finally, if the effect of spam in your market is egregious enough, your ability to publicize it may be your greatest hope. Major media have now repeatedly featured broadcasts and stories on this topic, and shame will sometimes move Google to action when no other motivation appears to.

3. Try to build a local anti-spam movement

What if you built a local movement? What if you and your friendlier competitors joined forces to knock spam out of Google together? Imagine all of the florists, hair salons, or medical practitioners in a town coming together to watch the local SERPs in shifts so that everyone in their market could benefit from bad actors being reported.

Maybe you’re already in a local business association with many hands that could lighten the work of protecting a whole community from unethical business practices. Maybe your town could then join up with the nearest major city, and that city could begin putting pressure on legislators. Maybe legislators would begin to realize the extent of the impacts when legitimate businesses face competition from fake entities and illegal practices. Maybe new anti-trust and communications regulations would ensue.

Now, I promised you “simple,” and this isn’t it, is it? But every time I see a fake listing, I know I’m looking at a single pebble and I’m beginning to think it may take an avalanche to bring about change great enough to protect both local brands and consumers. Google is now 15 years into this dynamic with no serious commitment in sight to resolve it.

At least in your own backyard, in your own community, you can be one small part of the solution with the easy tactics I’ve shared today, but maybe it’s time for local commerce to begin both doing more and expecting more in the way of protections. 

I’m ready for that. And you?






All About Fraggles (Fragment + Handle) – Whiteboard Friday



Suzzicks

What are “fraggles” in SEO and how do they relate to mobile-first indexing, entities, the Knowledge Graph, and your day-to-day work? In this glimpse into her 2019 MozCon talk, Cindy Krum explains everything you need to understand about fraggles in this edition of Whiteboard Friday.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Cindy Krum, and I’m the CEO of MobileMoxie, based in Denver, Colorado. We do mobile SEO and ASO consulting. I’m here in Seattle, speaking at MozCon, but also recording this Whiteboard Friday for you today, and we are talking about fraggles.

So fraggles are obviously a name that I’m borrowing from Jim Henson, who created “Fraggle Rock.” But it’s a combination of words. It’s a combination of fragment and handle. I talk about fraggles as a new way or a new element or thing that Google is indexing.

Fraggles and mobile-first indexing

Let’s start with the idea of mobile-first indexing, because you have to kind of understand that before you can go on to understand fraggles. So I believe mobile-first indexing is about a little bit more than what Google says. Google says that mobile-first indexing was just a change of the crawler.

They had a desktop crawler that was primarily crawling and indexing, and now they have a mobile crawler that’s doing the heavy lifting for crawling and indexing. While I think that’s true, I think there’s more going on behind the scenes that they’re not talking about, and we’ve seen a lot of evidence of this. So what I believe is that mobile-first indexing was also about indexing, hence the name.

Knowledge Graph and entities

So I think that Google has reorganized their index around entities or around specifically entities in the Knowledge Graph. So this is kind of my rough diagram of a very simplified Knowledge Graph. But Knowledge Graph is all about person, place, thing, or idea.

Nouns are entities. Knowledge Graph has nodes for all of the major person, place, thing, or idea entities out there. But it also indexes or it also organizes the relationships of this idea to this idea or this thing to this thing. What’s useful for that to Google is that these things, these concepts, these relationships stay true in all languages, and that’s how entities work, because entities happen before keywords.

This can be a hard concept for SEOs to wrap their brain around because we’re so used to dealing with keywords. But if you think about an entity as something that’s described by a keyword and can be language agnostic, that’s how Google thinks about entities, because entities in the Knowledge Graph aren’t written up per se; their unique identifier isn’t a word, it’s a number, and numbers are language agnostic.

But if we think about an entity like mother, mother is a concept that exists in all languages, but we have different words to describe it. But regardless of what language you’re speaking, mother is related to father, is related to daughter, is related to grandfather, all in the same ways, even if we’re speaking different languages. So if Google can use what they call the “topic layer” and entities as a way to filter in information and understand the world, then they can do it in languages where they’re strong and say, “We know that this is true absolutely 100% all of the time.”

Then they can apply that understanding to languages they have a harder time indexing or understanding, where they’re just not as strong or the algorithm isn’t built to handle the complexities of the language, like German, where they make really long words, or other languages that use lots of short words to mean different things or to modify different words.

Languages all work differently. But if they can use their translation API and their natural language APIs to build out the Knowledge Graph in places where they’re strong, then they can use it with machine learning to also build it and do a better job of answering questions in places or languages where they’re weak. So when you understand that, then it’s easy to think about mobile-first indexing as a massive Knowledge Graph build-out.

We’ve seen this happening statistically. There are more Knowledge Graph results and more other things that seem to be related to Knowledge Graph results, like people also ask, people also search for, related searches. Those are all describing different elements or different nodes on the Knowledge Graph. So when you see those things in the search, I want you to think, hey, this is the Knowledge Graph showing me how this topic is related to other topics.

So when Google launched mobile-first indexing, I think this is the reason it took two and a half years is because they were reindexing the entire web and organizing it around the Knowledge Graph. If you think back to the AMA that John Mueller did right about the time that Knowledge Graph was launching, he answered a lot of questions that were about JavaScript and href lang.

When you put this in that context, it makes more sense. He wants the entity understanding, or he knows that the entity understanding is really important, so the href lang is also really important. So that’s enough of that. Now let’s talk about fraggles.

Fraggles = fragment + handle

So fraggles, as I said, are a fragment plus a handle. It’s important to know that fraggles — let me go over here — fraggles and fragments: there are lots of things out there that have fragments. So you can think of native apps, databases, websites, podcasts, and videos. Those can all be fragmented.

Even though they don’t have a URL, they might be useful content, because Google says its goal is to organize the world’s information, not to organize the world’s websites. I think that, historically, Google has kind of been locked into this crawling and indexing of websites and that that’s bothered it, that it wants to be able to show other stuff, but it couldn’t do that because they all needed URLs.

But with fragments, potentially they don’t have to have a URL. So keep these things in mind — apps, databases and stuff like that — and then look at this. 

So this is a traditional page. If you think about a page, Google has kind of been forced, historically by their infrastructure, to surface pages and to rank pages. But pages sometimes struggle to rank if they have too many topics on them.

So for instance, what I’ve shown you here is a page about vegetables. This page may be the best page about vegetables, and it may have the best information about lettuce, celery, and radishes. But because it’s got those topics and maybe more topics on it, they all kind of dilute each other, and this great page may struggle to rank because it’s not focused on the one topic, on one thing at a time.

Google wants to rank the best things. But historically they’ve kind of pushed us to put the best things on one page at a time and to break them out. So what that’s created is this “content is king, I need more content, build more pages” mentality in SEO. The problem is everyone can be building more and more pages for every keyword that they want to rank for or every keyword group that they want to rank for, but only one is going to rank number one.

Google still has to crawl all of those pages that it told us to build, and that creates this character over here, I think, Marjory the Trash Heap, which if you remember the Fraggles, Marjory the Trash Heap was the all-knowing oracle. But when we’re all creating kind of low- to mid-quality content just to have a separate page for every topic, then that makes Google’s life harder, and that of course makes our life harder.

So why are we doing all of this work? The answer is because Google can only index pages, and if the page is too long or too many topics, Google gets confused. So we’ve been enabling Google to do this. But let’s pretend, go with me on this, because this is a theory, I can’t prove it. But if Google didn’t have to index a full page or wasn’t locked into that and could just index a piece of a page, then that makes it easier for Google to understand the relationships of different topics to one page, but also to organize the bits of the page to different pieces of the Knowledge Graph.

So this page about vegetables could be indexed and organized under the vegetable node of the Knowledge Graph. But that doesn’t mean that the lettuce part of the page couldn’t be indexed separately under the lettuce portion of the Knowledge Graph and so on, celery to celery and radish to radish. Now I know this is novel, and it’s hard to think about if you’ve been doing SEO for a long time.

But let’s think about why Google would want to do this. Google has been moving towards all of these new kinds of search experiences where we have voice search, we have the Google Home Hub kind of situation with a screen, or we have mobile searches. If you think about what Google has been doing, we’ve seen the increase in people also ask, and we’ve seen the increase in featured snippets.

They’ve actually been kind of, sort of making fragments for a long time or indexing fragments and showing them in featured snippets. The difference between that and fraggles is that when you click through on a fraggle, when it ranks in a search result, Google scrolls to that portion of the page automatically. That’s the handle portion.

So handles you may have heard of before. They’re kind of old-school web building. We call them bookmarks, anchor links, anchor jump links, stuff like that. It’s when it automatically scrolls to the right portion of the page. But what we’ve seen with fraggles is Google is lifting bits of text, and when you click on it, they’re scrolling directly to that piece of text on a page.

So we see this already happening in some results. What’s interesting is Google is overlaying the link. You don’t have to program the jump link in there. Google actually finds it and puts it there for you. So Google is already doing this, especially with AMP featured snippets. If you have an AMP featured snippet, so a featured snippet that’s lifted from an AMP page, when you click through, Google is actually scrolling and highlighting the featured snippet so that you can read it in context on the page.

But it’s also happening in other, more nuanced situations, especially with forums and conversations where they can pick a best answer. The difference between a fraggle and something like a jump link is that Google is overlaying the scrolling portion. The difference between a fraggle and a site link is that site links link to other pages, while fraggles link to multiple pieces of the same long page.
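The scroll-and-highlight “handle” behavior described here lines up with what Chrome implements as scroll-to-text fragments: a `#:~:text=` suffix that tells the browser to jump to and highlight a phrase, with no jump link programmed into the page. A sketch of building such a link (the URL and snippet are made up):

```python
from urllib.parse import quote

def text_fragment_link(url, snippet):
    """Build a scroll-to-text link: supporting browsers scroll to and
    highlight `snippet` on the target page."""
    return f"{url}#:~:text={quote(snippet)}"

link = text_fragment_link("https://example.com/vegetables",
                          "celery is rich in vitamin K")
print(link)
# https://example.com/vegetables#:~:text=celery%20is%20rich%20in%20vitamin%20K
```

This is the mechanism by which a single long vegetables page could yield separately addressable lettuce, celery, and radish fragments.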

So we want to avoid continuing to build up low-quality or mid-quality pages that might go to Marjory the Trash Heap. We want to start thinking in terms of can Google find and identify the right portion of the page about a specific topic, and are these topics related enough that they’ll be understood when indexing them towards the Knowledge Graph.

Knowledge Graph build-out into different areas

So I personally think that we’re seeing the build-out of the Knowledge Graph in a lot of different things. I think featured snippets are kind of facts or ideas that are looking for a home or validation in the Knowledge Graph. People also ask seem to be the related nodes. People also search for, same thing. Related searches, same thing. Featured snippets, oh, they’re on there twice, two featured snippets. Found on the web, which is another way where Google is putting expanders by topic and then giving you a carousel of featured snippets to click through on.



So we’re seeing all of those things, and some SEOs are getting kind of upset that Google is lifting so much content and putting it in the search results and that you’re not getting the click. We know that 61% of mobile searches don’t get a click anymore, and it’s because people are finding the information that they want directly in a SERP.

That’s tough for SEOs, but great for Google because it means Google is providing exactly what the user wants. So they’re probably going to continue to do this. I think that SEOs are going to change their minds and they’re going to want to be in those windowed content, in the lifted content, because when Google starts doing this kind of thing for the native apps, databases, and other content, websites, podcasts, stuff like that, then those are new competitors that you didn’t have to deal with when it was only websites ranking, but those are going to be more engaging kinds of content that Google will be showing or lifting and showing in a SERP even if they don’t have to have URLs, because Google can just window them and show them.

So you’d rather be lifted than not shown at all. So that’s it for me and featured snippets. I’d love to answer your questions in the comments, and thanks very much. I hope you like the theory about fraggles.

Video transcription by Speechpad.com




App Store SEO: How to Diagnose a Drop in Traffic & Win It Back



Joel.Mesherghi

For some organizations, mobile apps can be an important means to capturing new leads and customers, so it can be alarming when you notice your app visits are declining.

However, while there is content on how to optimize your app, otherwise known as ASO (App Store Optimization), there is little information out there on the steps required to diagnose a drop in app visits.

Although there are overlaps with traditional search, there are unique factors that play a role in app store visibility.

The aim of this blog is to give you a solid foundation for investigating a drop in app store visits, and then we’ll go through some quick-fire opportunities to win that traffic back.

We’ll go through the process of investigating why your app traffic declined, including:

  1. Identifying potential external factors
  2. Identifying the type of keywords that dropped in visits
  3. Analyzing app user engagement metrics

And we’ll go through some ways to help you win traffic back including:

  1. Spying on your competitors
  2. Optimizing your store listing
  3. Investing in localization

Investigating why your app traffic declined

Step 1. Identify potential external factors

Some industries/businesses will have certain periods of the year where traffic may drop due to external factors, such as seasonality.

Before you begin investigating a traffic drop further:

  • Talk to your point of contact and ask whether seasonality impacts their business, or whether there are general industry trends at play. For example, aggregator sites like SkyScanner may see a drop in app visits after the busy period at the start of the year.
  • Identify whether app installs actually dropped. If they didn’t, then you probably don’t need to worry too much about a drop in traffic; it could simply be Google’s and Apple’s algorithms doing a better job of aligning search results with the intent behind search terms.

Step 2. Identify the type of keywords that dropped in visits

Like traditional search, identifying the type of keywords (branded and non-branded), as well as the individual keywords that saw the biggest drop in app store visits, will provide much needed context and help shape the direction of your investigation. For instance:

If branded terms saw the biggest drop-off in visits this could suggest:

  1. There has been a decrease in the amount of advertising spend that builds brand/product awareness
  2. Competitors are bidding on your branded terms
  3. The app name/brand has changed and hasn’t been able to mop up all previous branded traffic

If non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve made recent optimisation changes that have had a negative impact
  2. User engagement signals, such as app crashes, or app reviews have changed for the worse
  3. Your competition have better optimised their app and/or provide a better user experience (particularly relevant if an app receives a majority of its traffic from a small set of keywords)
  4. Your app has been hit by an algorithm update

If both branded and non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve violated Google’s policies on promoting your app.
  2. There are external factors at play
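Once you’ve exported visit counts per keyword, splitting the drop into branded and non-branded buckets is easy to script. A minimal sketch, with invented brand terms and numbers:

```python
# Hypothetical brand vocabulary; substitute your own brand and product names.
BRAND_TERMS = {"acme", "acme jobs"}

def classify_drop(before, after):
    """Split the total visit drop into branded vs. non-branded keyword buckets."""
    drops = {"branded": 0, "non-branded": 0}
    for keyword, old_visits in before.items():
        delta = old_visits - after.get(keyword, 0)
        branded = any(term in keyword.lower() for term in BRAND_TERMS)
        drops["branded" if branded else "non-branded"] += delta
    return drops

before = {"acme app": 900, "job search": 500, "employment app": 300}
after  = {"acme app": 850, "job search": 250, "employment app": 200}
print(classify_drop(before, after))  # {'branded': 50, 'non-branded': 350}
```

A result dominated by the non-branded bucket, as in this toy data, would point you toward the optimisation and engagement hypotheses above rather than brand-awareness ones.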

To get data for your Android app

To get data for your Android app, sign into your Google Play Console account.

Google Play Console provides a wealth of data on the performance of your Android app, with particularly useful insights on user engagement metrics that influence app store ranking (more on these later).

However, keyword specific data will be limited. Google Play Console will show you the individual keywords that delivered the most downloads for your app, but the majority of keyword visits will likely be unclassified: mid to long-tail keywords that generate downloads, but don’t generate enough downloads to appear as isolated keywords. These keywords will be classified as “other”.

Your chart might look like the below. Repeat the same process for branded terms.

Above: Graph of a client’s non-branded Google Play Store app visits. The visit numbers are factual, but the keywords driving them have been changed to preserve anonymity.

To get data for your iOS app

To get data on the performance of your iOS app, Apple has App Store Connect. Like Google Play Console, you’ll be able to get your hands on user engagement metrics that can influence the ranking of your app.

However, keyword data is even scarcer than in Google Play Console. You’ll only be able to see the total number of impressions your app’s icon has received on the App Store. If you’ve seen a drop in visits for both your Android and iOS app, then you could use Google Play Console data as a proxy for keyword performance.

If you use an app rank tracking tool, such as TheTool, you can somewhat plug gaps in knowledge for the keywords that are potentially driving visits to your app.

Step 3. Analyze app user engagement metrics

User engagement metrics that underpin a good user experience have a strong influence on how your app ranks and both Apple and Google are open about this.

Google states that user engagement metrics like app crashes, ANR rates (application not responding) and poor reviews can limit exposure opportunities on Google Play.

While Apple isn’t quite as forthcoming as Google when it comes to providing information on engagement metrics, they do state that app ratings and reviews can influence app store visibility.

Ultimately, Apple wants to ensure iOS apps provide a good user experience, so it’s likely they use a range of additional user engagement metrics to rank an app in the App Store.

As part of your investigation, you should look into how the below user engagement metrics may have changed around the time period you saw a drop in visits to your app.

  • App rating
  • Number of ratings (newer/fresh ratings will be weighted more for Google)
  • Number of downloads
  • Installs vs uninstalls
  • App crashes and application not responding

You’ll be able to get data for the above metrics in Google Play Console and App Store Connect, or you may have access to this data internally.
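A simple way to inspect these metrics is to compare their averages before and after the date the drop began. A sketch using hypothetical daily crash counts exported from the console:

```python
def mean(values):
    return sum(values) / len(values)

def metric_shift(daily_values, drop_index):
    """Percentage change in a metric's average before vs. after the drop date."""
    before = daily_values[:drop_index]
    after = daily_values[drop_index:]
    return (mean(after) - mean(before)) / mean(before) * 100

# Toy data: daily app crashes, with the visit drop reported from day 5 onward.
crashes_per_day = [4, 5, 3, 4, 9, 11, 10, 12]
shift = metric_shift(crashes_per_day, 4)
print(f"crash rate changed {shift:+.1f}% after the drop")
```

A sharp jump in crashes or ANR rates around the drop date, as in this toy series, would be a strong candidate explanation; run the same check for ratings, installs, and uninstalls.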

Even if your analysis doesn’t reveal insights, metrics like app rating influence conversion and where your app ranks in the app pack SERP feature, so it’s well worth investing time in developing a strategy to improve these metrics.

One simple tactic could be to ensure you respond to negative reviews and reviews with questions. In fact, users increase their rating by +0.7 stars on average after receiving a reply.

Apple offers a few tips on asking for ratings and reviews for iOS apps.

Help win your app traffic back

Step 1. Spy on your competitors

Find out who’s ranking

When trying to identify opportunities to improve app store visibility, I always like to compare the top 5 ranking competitor apps for some priority non-branded keywords.

All you need to do is search for these keywords in Google Play and the App Store and grab the publicly available ranking factors from each app listing. You should have something like the below.

| Brand | Title | Title character length | Rating | Number of reviews | Number of installs | Description character length |
| --- | --- | --- | --- | --- | --- | --- |
| COMPETITOR 1 | [Competitor title] | 50 | 4.8 | 2,848 | 50,000+ | 3,953 |
| COMPETITOR 2 | [Competitor title] | 28 | 4.0 | 3,080 | 500,000+ | 2,441 |
| COMPETITOR 3 | [Competitor title] | 16 | 4.0 | 2,566 | 100,000+ | 2,059 |
| YOUR BRAND | [Your brand’s title] | 37 | 4.3 | 2,367 | 100,000+ | 3,951 |
| COMPETITOR 4 | [Competitor title] | 7 | 4.1 | 1,140 | 100,000+ | 1,142 |
| COMPETITOR 5 | [Competitor title] | 24 | 4.5 | 567 | 50,000+ | 2,647 |

Above: anonymized table of a client’s Google Play competitors

From this, you may get some indications as to why an app ranks above you. For instance, we see “Competitor 1” not only has the best app rating, but has the longest title and description. Perhaps they better optimized their title and description?

We can also see that competitors that rank above us generally have a larger number of total reviews and installs, which aligns with both Google’s and Apple’s statements about the importance of user engagement metrics.

With the above comparison information, you can dig a little deeper, which leads us on nicely to the next section.

Optimize your app text fields

Keywords you add to text fields can have a significant impact on app store discoverability.

As part of your analysis, you should look into how your keyword optimization differs from competitors and identify any opportunities.

For Google Play, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (50 characters)
  • Keywords in the app description (4,000 characters)
  • Keywords in short description (80 characters)
  • Keywords in URL
  • Keywords in your app name

When it comes to the App Store, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (30 characters)
  • Using the 100 character keywords field (a dedicated 100-character field to place keywords you want to rank for)
  • Keywords in your app name
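The character limits above are easy to check programmatically before you submit listing copy. A minimal sketch (the limits mirror the lists above; the sample title is invented):

```python
# Per-store character limits, taken from the lists above.
LIMITS = {
    "google_play": {"title": 50, "short_description": 80, "description": 4000},
    "app_store":   {"title": 30, "keywords": 100},
}

def over_limit(store, fields):
    """Return each field whose text exceeds the store's limit, with the overage."""
    return {name: len(text) - LIMITS[store][name]
            for name, text in fields.items()
            if len(text) > LIMITS[store][name]}

fields = {"title": "Acme Jobs: Search, Track & Apply for Employment Near You"}
print(over_limit("app_store", fields))   # this 56-character title is 26 over
print(over_limit("google_play", {"title": "Acme Jobs"}))  # nothing over
```

The same title passing on Google Play but failing on the App Store illustrates why listing copy usually needs to be written per store rather than copied across.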

To better understand how your optimisation tactics hold up, I recommended comparing your app text fields to competitors.

For example, if I want to know the frequency of keywords mentioned in competitor app descriptions on Google Play (keywords in the description field are a ranking factor), then I’d create a table like the one below.

| Keyword | COMPETITOR 1 | COMPETITOR 2 | COMPETITOR 3 | YOUR BRAND | COMPETITOR 4 | COMPETITOR 5 |
| --- | --- | --- | --- | --- | --- | --- |
| job | 32 | 9 | 5 | 40 | 3 | 2 |
| job search | 12 | 4 | 10 | 9 | 10 | 8 |
| employment | 2 | 0 | 0 | 5 | 0 | 3 |
| job tracking | 2 | 0 | 0 | 4 | 0 | 0 |
| employment app | 7 | 2 | 0 | 4 | 2 | 1 |
| employment search | 4 | 1 | 1 | 5 | 0 | 0 |
| job tracker | 3 | 0 | 0 | 1 | 0 | 0 |
| recruiter | 2 | 0 | 0 | 1 | 0 | 0 |

Above: anonymized table of a client’s Google Play competitors

From the above table, I can see that the number 1 ranking competitor (competitor 1) has more mentions of “job search” and “employment app” than I do.

Whilst there are many factors that decide the position at which an app ranks, I could deduce that I need to increase the frequency of said keywords in my Google Play app description to help improve ranking.

Be careful though: writing unnatural, keyword-stuffed descriptions and titles will likely have an adverse effect.

Remember, as well as being optimized for machines, text fields like your app title and description are meant to be a compelling “advertisement” of your app for users.

I’d repeat this process for other text fields to uncover other keyword insights.
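
A frequency table like the one above can be generated with a short script. The descriptions below are invented placeholders; in practice you’d paste in each competitor’s actual Google Play description:

```python
import re

def keyword_counts(description, keywords):
    """Count how often each target keyword or phrase appears in a description
    (case-insensitive, whole-word matches only)."""
    text = description.lower()
    return {kw: len(re.findall(r"\b" + re.escape(kw) + r"\b", text))
            for kw in keywords}

# Hypothetical descriptions standing in for real store listings.
descriptions = {
    "competitor_1": "Find a job fast. The best job search app for every job seeker.",
    "your_brand": "Track every job application in one employment app.",
}
keywords = ["job", "job search", "employment"]

for name, desc in descriptions.items():
    print(name, keyword_counts(desc, keywords))
```

Note that a mention of “job search” also counts as a mention of “job”, which matches how you’d tally phrases by eye.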

Step 2. Optimize your store listing

Your store listing is the home of your app on Google Play. It’s where users can learn about your app, read reviews, and more. Surprisingly, not all apps take full advantage of building an immersive store listing experience.

Whilst Google doesn’t seem to state outright that fully utilizing the majority of store listing features impacts your app’s discoverability, it’s fair to speculate that there may be some ranking consideration behind this.

At the very least, investing in your store listing could improve conversion and you can even run A/B tests to measure the impact of your changes.

You can improve the overall user experience and content found in the store listing by adding video trailers of your app, quality creative assets, a distinctive app icon (you’ll want your icon to stand out amongst a sea of other app icons), and more.

You can read Google’s best practice guide on creating a compelling Google Play store listing to learn more.

Step 3. Invest in localization

The saying goes “think global, act local” and this is certainly true of apps.

Previous studies have revealed that 72.4% of global consumers preferred to use their native language when shopping online and that 56.2% of consumers said that the ability to obtain information in their own language is more important than price.

It makes logical sense. The better you can personalize your product for your audience, the better your results will be, so go the extra mile and localize your Google Play and App Store listings.

Google has a handy checklist for localization on Google Play and Apple has a comprehensive resource on internationalizing your app on the App Store.

Wrap up

A drop in visits of any kind causes alarm and panic. Hopefully this blog gives you a good starting point if you ever need to investigate why an app’s traffic has dropped, as well as some quick-fire opportunities to win it back.

If you’re interested in further reading on ASO, I recommend reading App Radar’s and TheTool’s guides to ASO, as well as app search discoverability tips from Google and Apple themselves.




Better Content Through NLP (Natural Language Processing) – Whiteboard Friday



RuthBurrReedy

Gone are the days of optimizing content solely for search engines. For modern SEO, your content needs to please both robots and humans. But how do you know that what you’re writing can check the boxes for both man and machine?

In today’s Whiteboard Friday, Ruth Burr Reedy focuses on part of her recent MozCon 2019 talk and teaches us all about how Google uses NLP (natural language processing) to truly understand content, plus how you can harness that knowledge to better optimize what you write for people and bots alike.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I’m Ruth Burr Reedy, and I am the Vice President of Strategy at UpBuild, a boutique technical marketing agency specializing in technical SEO and advanced web analytics. I recently spoke at MozCon on a basic framework for SEO and approaching changes to our industry that thinks about SEO in the light of we are humans who are marketing to humans, but we are using a machine as the intermediary.

Those videos will be available online at some point. [Editor’s note: that point is now!] But today I wanted to talk about one point from my talk that I found really interesting and that has kind of changed the way that I approach content creation, and that is the idea that writing content that is easier for Google, a robot, to understand can actually make you a better writer and help you write better content for humans. It is a win-win. 

The relationships between entities, words, and how people search

To understand how Google is currently approaching parsing content and understanding what content is about, Google is spending a lot of time and a lot of energy and a lot of money on things like neural matching and natural language processing, which seek to understand basically when people talk, what are they talking about?

This goes along with the evolution of search to be more conversational. But there are a lot of times when someone is searching, but they don’t totally know what they want, and Google still wants them to get what they want because that’s how Google makes money. They are spending a lot of time trying to understand the relationships between entities and between words and how people use words to search.

The example that Danny Sullivan gave online, that I think is a really great example, is if someone is experiencing the soap opera effect on their TV. If you’ve ever seen a soap opera, you’ve noticed that they look kind of weird. Someone might be experiencing that, and not knowing what that’s called they can’t Google soap opera effect because they don’t know about it.

They might search something like, “Why does my TV look funny?” Neural matching helps Google understand that when somebody is searching “Why does my TV look funny?” one possible answer might be the soap opera effect. So they can serve up that result, and people are happy. 

Understanding salience

As we’re thinking about natural language processing, a core component of natural language processing is understanding salience.

Salience, content, and entities

Salience is a one-word way to sum up the question: to what extent is this piece of content about this specific entity? At this point Google is really good at extracting entities from a piece of content. Entities are basically nouns: people, places, things, proper nouns, regular nouns.

Entities are things, people, etc., numbers, things like that. Google is really good at taking those out and saying, “Okay, here are all of the entities that are contained within this piece of content.” Salience attempts to understand how they’re related to each other, because what Google is really trying to understand when they’re crawling a page is: What is this page about, and is this a good example of a page about this topic?

Salience really goes into the second piece. To what extent might any given entity be the topic of a piece of content? It’s often amazing the degree to which a piece of content that a person has created is not actually about anything. I think we’ve all experienced that.

You’re searching and you come to a page and you’re like, “This was too vague. This was too broad. This said that it was about one thing, but it was actually about something else. I didn’t find what I needed. This wasn’t good information for me.” As marketers, we’re often on the other side of that, trying to get our clients to say what their product actually does on their website or say, “I know you think that you created a guide to Instagram for the holidays. But you actually wrote one paragraph about the holidays and then seven paragraphs about your new Instagram tool. This is not actually a blog post about Instagram for the holidays. It’s a piece of content about your tool.” These are the kinds of battles that we fight as marketers. 

Natural Language Processing (NLP) APIs

Fortunately, there are now a number of different APIs that you can use to experiment with natural language processing.

Is it as sophisticated as what they’re using on their own stuff? Probably not. But you can test it out. Put in a piece of content and see (a) what entities Google is able to extract from it, and (b) how salient Google feels each of these entities is to the piece of content as a whole. Again, to what degree is this piece of content about this thing?

So this natural language processing API, which you can try for free and which is actually not that expensive if you want to build a tool with it, will assign each entity it can extract a salience score between 0 and 1, saying, “Okay, how sure are we that this piece of content is about this thing versus just containing it?”

So the closer you get to 1, the more confident the tool is that this piece of content is about this thing. 0.9 would be really, really good. 0.01 means it’s there, but they’re not sure how well it’s related. 
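
Google’s Natural Language API returns each extracted entity with a name and a salience score between 0 and 1. The sketch below works on a made-up response in that shape (the entity names and scores are invented) and flags when the intended topic isn’t confidently the most salient entity:

```python
def flag_weak_salience(entities, expected_topic, threshold=0.5):
    """Rank entities by salience and check whether the intended topic
    is the most salient entity with a score above the threshold."""
    ranked = sorted(entities, key=lambda e: e["salience"], reverse=True)
    top = ranked[0]
    confident = top["name"] == expected_topic and top["salience"] >= threshold
    return ranked, confident

# Invented scores, mirroring the shape of an analyzeEntities response.
entities = [
    {"name": "chocolate chip cookie recipe", "salience": 0.62},
    {"name": "butter", "salience": 0.14},
    {"name": "cookie", "salience": 0.09},
]
ranked, confident = flag_weak_salience(entities, "chocolate chip cookie recipe")
print(confident)  # → True: the intended topic is the top entity with a strong score
```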

A delicious example of how salience and entities work

The example I have here, and this is not taken from a real piece of content — these numbers are made up, it’s just an example — is if you had a chocolate chip cookie recipe, you would want chocolate cookies or chocolate chip cookies recipe, chocolate chip cookies, something like that to be the number one entity, the most salient entity, and you would want it to have a pretty high salient score.

You would want the tool to feel pretty confident, yes, this piece of content is about this topic. But what you can also see is the other entities it’s extracting and to what degree they are also salient to the topic. So you can see things like if you have a chocolate chip cookie recipe, you would expect to see things like cookie, butter, sugar, 350, which is the temperature you heat your oven, all of the different things that come together to make a chocolate chip cookie recipe.

But I think that it’s really, really important for us as SEOs to understand that salience is the future of related keywords. We’re beyond the time when to optimize for chocolate chip cookie recipe, we would also be looking for things like chocolate recipe, chocolate chips, chocolate cookie recipe, things like that. Stems, variants, TF-IDF, these are all older methodologies for understanding what a piece of content is about.

Instead what we need to understand is what are the entities that Google, using its vast body of knowledge, using things like Freebase, using large portions of the internet, where is Google seeing these entities co-occur at such a rate that they feel reasonably confident that a piece of content on one entity in order to be salient to that entity would include these other entities?

Using an expert is the best way to create content that’s salient to a topic

So chocolate chip cookie recipe, we’re now also making sure we’re adding things like butter, flour, sugar. This is actually really easy to do if you actually have a chocolate chip cookie recipe to put up there. This is I think what we’re going to start seeing as a content trend in SEO is that the best way to create content that is salient to a topic is to have an actual expert in that topic create that content.

Somebody with deep knowledge of a topic is naturally going to include co-occurring terms, because they know how to create something that’s about what it’s supposed to be about. I think what we’re going to start seeing is that people are going to have to start paying more for content marketing, frankly. Unfortunately, a lot of companies seem to think that content marketing is and should be cheap.

Content marketers, I feel you on that. It sucks, and it’s no longer the case. We need to start investing in content and investing in experts to create that content so that they can create that deep, rich, salient content that everybody really needs. 

How can you use this API to improve your own SEO? 

One of the things that I like to do with this kind of information is look at — and this is something that I’ve done for years, just not in this context — but a prime optimization target in general is pages that rank for a topic, but they rank on page 2.

What this often means is that Google understands that that keyword is a topic of the page, but it doesn’t necessarily understand that it is a good piece of content on that topic, that the page is actually solely about that content, that it’s a good resource. In other words, the signal is there, but it’s weak.

What you can do is take content that ranks but not well, run it through this natural language API or another natural language processing tool, and look at how the entities are extracted and how Google is determining that they’re related to each other. Sometimes it might be that you need to do some disambiguation. So in this example, you’ll notice that while chocolate cookies is called a work of art, and I agree, cookie here is actually called other.

This is because cookie means more than one thing. There’s cookies, the baked good, but then there’s also cookies, the packet of data. Both of those are legitimate uses of the word “cookie.” Words have multiple meanings. If you notice that Google, that this natural language processing API is having trouble correctly classifying your entities, that’s a good time to go in and do some disambiguation.

Make sure that the terms surrounding that term are clearly saying, “No, I mean the baked good, not the software piece of data.” That’s a really great way to kind of bump up your salience. Look at whether or not you have a strong salient score for your primary entity. You’d be amazed at how many pieces of content you can plug into this tool and the top, most salient entity is still only like a 0.01, a 0.14.

A lot of times the API is like “I think this is what it’s about,” but it’s not sure. This is a great time to go in and bump up that content, make it more robust, and look at ways that you can make those entities easier to both extract and to relate to each other. This brings me to my second point, which is my new favorite thing in the world.

Writing for humans and writing for machines, you can now do both at the same time. You no longer have to, and you really haven’t had to do this in a long time, but the idea that you might keyword stuff or otherwise create content for Google that your users might not see or care about is way, way, way over.

Now you can create content for Google that also is better for users, because the tenets of machine readability and human readability are moving closer and closer together. 

Tips for writing for human and machine readability:

Reduce semantic distances!

What I’ve done here is some research not on natural language processing, but on writing for human readability: advice from writers and writing experts on how to write better, clearer, easier-to-understand content. Then I pulled out the pieces of advice that also work as advice for writing for natural language processing. Natural language processing, again, is the process by which Google, or really anything that processes language, tries to understand how entities are related to each other within a given body of content.

Short, simple sentences

Short, simple sentences. Write simply. Don’t use a lot of flowery language. Short sentences and try to keep it to one idea per sentence. 

One idea per sentence

If you’re running on, if you’ve got a lot of different clauses, if you’re using a lot of pronouns and it’s becoming confusing what you’re talking about, that’s not great for readers.

It also makes it harder for machines to parse your content. 

Connect questions to answers

Then closely connecting questions to answers. So don’t say, “What is the best temperature to bake cookies? Well, let me tell you a story about my grandmother and my childhood,” and 500 words later here’s the answer. Connect questions to answers. 

What all three of those readability tips have in common is they boil down to reducing the semantic distance between entities.

If you want natural language processing to understand that two entities in your content are closely related, move them closer together in the sentence. Move the words closer together. Reduce the clutter, reduce the fluff, reduce the number of semantic hops that a robot might have to take between one entity and another to understand the relationship, and you’ve now created content that is more readable because it’s shorter and easier to skim, but also easier for a robot to parse and understand.
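
One crude way to see this in practice is to count how many words separate a question term from its answer. This is a toy metric I’m inventing for illustration, not anything Google publishes:

```python
def token_distance(sentence, a, b):
    """Toy proxy for semantic distance: how many words separate the first
    occurrences of two terms in a sentence (case-insensitive)."""
    words = sentence.lower().split()
    return abs(words.index(a) - words.index(b)) - 1

wordy = "what temperature should you bake cookies at well let me tell you a story first the answer is 350"
tight = "bake cookies at 350 degrees"

print(token_distance(wordy, "cookies", "350"))  # → 12
print(token_distance(tight, "cookies", "350"))  # → 1
```

The tighter sentence moves the entities “cookies” and “350” close enough together that both a skimming human and a parser can connect them in one hop.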

Be specific first, then explain nuance

Going back to the example of “What is the best temperature to bake chocolate chip cookies at?” Now the real answer to what is the best temperature to bake chocolate cookies is it depends. Hello. Hi, I’m an SEO, and I just answered a question with it depends. It does depend.

That is true, and that is real, but it is not a good answer. It is also not the kind of thing that a robot could extract and reproduce in, for example, voice search or a featured snippet. If somebody says, “Okay, Google, what is a good temperature to bake cookies at?” and Google says, “It depends,” that helps nobody even though it’s true. So in order to write for both machine and human readability, be specific first and then you can explain nuance.

Then you can go into the details. So a better, just as correct answer to “What is the temperature to bake chocolate chip cookies?” is the best temperature to bake chocolate chip cookies is usually between 325 and 425 degrees, depending on your altitude and how crisp you like your cookie. That is just as true as it depends and, in fact, means the same thing as it depends, but it’s a lot more specific.

It’s a lot more precise. It uses real numbers. It provides a real answer. I’ve shortened the distance between the question and the answer. I didn’t say it depends first. I said it depends at the end. That’s the kind of thing that you can do to improve readability and understanding for both humans and machines.

Get to the point (don’t bury the lede)

Get to the point. Don’t bury the lead. All of you journalists who tried to become content marketers, and then everybody in content marketing said, “Oh, you need to wait till the end to get to your point or they won’t read the whole thing,” and you were like, “Don’t bury the lead,” you are correct. For those of you who aren’t familiar with journalism speak, not burying the lead basically means get to the point upfront, at the top.

Include all the information that somebody would really need to get from that piece of content. If they don’t read anything else, they read that one paragraph and they’ve gotten the gist. Then people who want to go deep can go deep. That’s how people actually like to consume content, and surprisingly it doesn’t mean they won’t read the content. It just means they don’t have to read it if they don’t have time, if they need a quick answer.

The same is true with machines. Get to the point upfront. Make it clear right away what the primary entity, the primary topic, the primary focus of your content is and then get into the details. You’ll have a much better structured piece of content that’s easier to parse on all sides. 

Avoid jargon and “marketing speak”

Avoid jargon. Avoid marketing speak. Not only is it terrible and very hard to understand. You see this a lot. I’m going back again to the example of getting your clients to say what their products do. If you work with a lot of B2B companies, you will often run into this. Yes, but what does it do? It provides solutions to streamline the workflow and blah, blah. Okay, what does it do? This is the kind of thing that can be really, really hard for companies to get out of their own heads about, but it’s so important for users, for machines.

Avoid jargon. Avoid marketing speak. Not to get too tautological, but the more esoteric a word is, the less commonly it’s used. That’s actually what esoteric means. What that means is the less commonly a word is used, the less likely it is that Google is going to understand its semantic relationships to other entities.

Keep it simple. Be specific. Say what you mean. Wipe out all of the jargon. By wiping out jargon and kind of marketing speak and kind of the fluff that can happen in your content, you’re also, once again, reducing the semantic distances between entities, making them easier to parse. 

Organize your information to match the user journey

Organize it and map it out to the user journey. Think about the information somebody might need and the order in which they might need it. 

Break out subtopics with headings

Then break it out with subheadings. This is like very, very basic writing advice, and yet you all aren’t doing it. So if you’re not going to do it for your users, do it for machines. 

Format lists with bullets or numbers

You can also really impact skimmability for users by breaking out lists with bullets or numbers.

The great thing about that is that breaking out a list with bullets or numbers also makes information easier for a robot to parse and extract. If a lot of these tips seem like they’re the same tips that you would use to get featured snippets, they are, because featured snippets are actually a pretty good indicator that you’re creating content that a robot can find, parse, understand, and extract, and that’s what you want.

So if you’re targeting featured snippets, you’re probably already doing a lot of these things, good job. 

Grammar and spelling count!

The last thing, which I shouldn’t have to say, but I’m going to say is that grammar and spelling and punctuation and things like that absolutely do count. They count to users. They don’t count to all users, but they count to users. They also count to search engines.

Things like grammar, spelling, and punctuation are very, very easy signals for a machine to find and parse. Google has been specific in things like the “Quality Rater Guidelines” that a well-written, well-structured, well-spelled, grammatically correct document, that these are signs of authoritativeness. I’m not saying that having a greatly spelled document is going to mean that you immediately rocket to the top of the results.

I am saying that if you’re not on that stuff, it’s probably going to hurt you. So take the time to make sure everything is nice and tidy. You can use vernacular English. You don’t have to be perfect “AP Style Guide” all the time. But make sure that you are formatting things properly from a grammatical standpoint as well as a technical standpoint. What I love about all of this, this is just good writing.

This is good writing. It’s easy to understand. It’s easy to parse. It’s still so hard, especially in the marketing world, to get out of that world of jargon, to get to the point, to stop writing 2,000 words because we think we need 2,000 words, to really think about are we creating content that’s about what we think it’s about.

Use these tools to understand how readable, parsable, and understandable your content is

So my hope for the SEO world and for you is that you can use these tools not just to think about how to dial in the perfect keyword density or whatever to get an almost perfect score on the salience in the natural language processing API. What I’m hoping is that you will use these tools to help yourself understand how readable, how parsable, and how understandable your content is, how much your content is about what you say it’s about and what you think it’s about so you can create better stuff for users.

It makes the internet a better place, and it will probably make you some money as well. So these are my thoughts. I’d love to hear in the comments if you’re using the natural language processing API now, if you’ve built a tool with it, if you want to build a tool with it, what do you think about this, how do you use this, how has it gone. Tell me all about it. Holla atcha girl.

Have a great Friday.

Video transcription by Speechpad.com


