12 Biggest Revelations From Google’s Updated SEO Starter Guide
Google dropped a bombshell of a document just a few weeks ago in the form of their Search Engine Optimization Starter Guide. The guide is meant to give SEO wonks and concerned site owners a 10,000-foot view of what they need to do to get all the essential SEO quality factors in place.
In many ways, the guide reads like a more in-depth version of our SEO Checklist for 2018 we recently published.
What’s most interesting about the guide is how it puts to rest many rumors regarding SEO ranking factors. For instance, the guide clearly outlines the importance of URL length and readability when it comes to user experience and search engine crawlability.
Google summarized the overview’s usefulness by asserting the following:
“Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site’s user experience and performance in organic search results.”
You can catch the full, quite lengthy Google SEO Starter guide on their site, but here are 12 of the biggest takeaways we thought we’d highlight. Each one we point out either runs contrary to commonly held beliefs or represents a factor many underestimate when attempting to work with an SEO agency in Atlanta.
1. Five Questions Webmasters Should Ask
At the very beginning of the guide, Google suggests that the majority of SEO issues can be cleared up by asking the following:
- Is my website showing up on Google?
- Do I serve high-quality content to users?
- Is my local business showing up on Google?
- Is my content fast and easy to access on all devices?
- Is my website secure?
These questions barely scratch the surface of SEO best practices, but they do reveal Google’s priorities in how they determine rankability. Ask them of your own site(s) any time you want to improve.
2. Importance of No-Crawl Pages in Robots.txt
The Robots.txt file is intended to benefit site owners by giving them control over which pages the search engine indexing bot crawls.
Many webmasters overlook the importance of blocking certain pages from being indexed. For instance, pages generated by your site’s internal search shouldn’t be crawled, since users don’t want to jump from one search results page straight into another.
On the other hand, you should allow the index bot to access things like all on-page JavaScript, CSS, and image files. Otherwise, the search engine bot could be examining completely different things than the average user, leading to contrary behavioral signals that could hurt your ranking. You don’t want to make Google’s spiders angry!
Check how your page is read by Google’s spiders using “Fetch as Google.”
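As a minimal, hypothetical sketch (the paths are placeholders; adjust them to your own site’s structure), a robots.txt that blocks internal search results while leaving page assets crawlable might look like this:

```
# Hypothetical robots.txt (adjust paths to match your own site)
User-agent: *
# Block pages generated by the site's internal search feature
Disallow: /search/
Disallow: /*?s=
# Keep JavaScript, CSS, and image files crawlable so Googlebot
# renders the page the way a visitor would see it
Allow: /assets/
```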
3. Ensure That Each Page Title Is Unique, Informative, Relevant
Title tags serve both search users scanning results and visitors already on your page. Google feels they should therefore have informative value, be relevant, and be easy to read. Make each title tag unique so that pages can be differentiated by humans and bots alike.
Also, don’t use generic titles like “Page1” or “This is a Page About ___”. And for the love of all that is sacred, don’t stuff random strings of repetitive keywords in title tags!
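To make that concrete, here’s the kind of contrast the guide is drawing, with a made-up business and page:

```
<!-- Too generic: tells neither people nor bots what the page is about -->
<title>Page1</title>

<!-- Unique and descriptive (hypothetical business and location) -->
<title>Emergency Plumbing Repair in Decatur, GA | Acme Plumbing</title>
```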
4. Use Search Console to Verify Meta Description Quality
Meta descriptions can be tough to get right, but the Google Search Console’s HTML Improvements report function can help. It’ll tell you if your metas are too short, too long, or have been recycled too many times using a template.
Also, Google finally clears the air on the fact that your meta snippet may not necessarily be what’s displayed in search. “Google may choose to use a relevant section of your page’s visible text if it does a good job of matching up with a user’s query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet.”
The search engine offers an oldie-but-a-goodie article on writing quality metas.
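For reference, the description meta tag lives in the page’s <head>; the copy below is purely illustrative, not a template to recycle across pages:

```
<head>
  <!-- A unique, human-readable summary of this specific page -->
  <meta name="description"
        content="A step-by-step checklist for auditing title tags, meta descriptions, and robots.txt on a small business website.">
</head>
```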
5. Use an Outline Structure With Nesting Heading Tags
It’s been long suspected, but rarely outright suggested, that Google cares deeply about the high-level structure of your content pages.
Ideally, the company says, headers should serve as a sort of CliffsNotes for readers skimming quickly. They should also work as buckets or summaries that accurately describe the body text underneath.
Try to think of each page in a high-level outline form, the guide suggests. But don’t get crazy!
Using a heading tag just to emphasize a particular quote, for instance, is discouraged; add a separate <div> tag with CSS formatting instead. Alternatively, just use <strong> or <em> rather than an <h1>-<h6> tag.
Also, exercise restraint. Don’t leap “erratically” from one heading size to another, and don’t pack too many headers in quick succession.
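As a rough illustration of that outline idea (the topic is invented), headings nest from general to specific, and emphasis inside body text is handled with inline tags rather than a heading:

```
<h1>Caring for Indoor Plants</h1>
  <h2>Watering</h2>
    <h3>How often to water succulents</h3>
  <h2>Light Requirements</h2>

<!-- Emphasis belongs in inline tags, not headings -->
<p>Remember: <strong>never</strong> leave pots standing in water.</p>
```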
6. Review Our Suggestions for Structured Data
Google offers many features for structured data that improve search user experience and help your listing stand out. Use these features!
We recently covered how rich snippets and structured data work, so review that guide for a refresher.
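One common way to add structured data is a JSON-LD block in the page’s <head>; this minimal local-business example uses placeholder values, so treat it as a sketch rather than a drop-in:

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "telephone": "+1-404-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Atlanta",
    "addressRegion": "GA"
  }
}
</script>
```

Whatever markup you add, run it through Google’s Structured Data Testing Tool before relying on it.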
7. Pay Close Attention to Site Hierarchy, And Make It Easy to Navigate
The general layout of your site matters to Google, since it can affect user experience. They want your website to be laid out logically, with general information displayed at first on your homepage and progressively more specific information revealed as you delve deeper into the site.
This is perhaps an underemphasized element of SEO, if only because many digital marketers don’t want to tell clients to completely rebuild their site structure.
The detailed recommendations for site structure get fairly specific, so you should probably take a look at that section if you’re interested in meeting their recommendations.
A few aspects we will highlight relate to the use of links to navigate from one page to the next (a bare-bones markup sketch follows the list):
- It’s important to have text-based menu links
- Menu item links should be displayed on page load, not after user interaction
- Avoid navigation based entirely on images or animations
- Don’t rely on script- or plugin-based event handling for navigation
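Here’s what that looks like in practice, with placeholder page names: plain text links, present in the initial HTML, with no script required to reveal them.

```
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/services/seo/">SEO</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```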
8. Ensure 404 Pages Are Useful, Designed In-Skin
Surprisingly, Google cares enough about 404 pages to mention them in their guide. Then again, even health inspectors care about garbage cans, eh?
The gist is that Google wants your 404 pages to say something helpful like “Oops! You may be lost!” and offer a convenient link back to your homepage.
They want these pages served with an actual 404 HTTP status code rather than as a normal page that returns a 200. That way, the pages won’t be indexed, and you won’t have to block them manually in your robots.txt, which Google says is a “no-no.”
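If your site happens to run on Apache (an assumption on our part; other servers have their own equivalents), one minimal way to serve a custom, in-skin error page while still returning the real 404 status code is:

```
# .htaccess (Apache), hypothetical example
# Serve the custom /404.html template but keep the 404 status code
ErrorDocument 404 /404.html
```

You can sanity-check the result by inspecting response headers (for example, curl -I on a URL you know doesn’t exist); the response should report 404, not 200.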
9. Ensure URLs Are “Clean”
Only slightly less surprising than caring about your 404s is the fact that Google cares about your site URLs. They justify this by saying that “visitors may be intimidated by extremely long and cryptic URLs that contain few recognizable words.”
It also doesn’t help that gibberish URLs are harder for spiders to index and associate with other pages compared to URLs that contain descriptive keywords. Give each URL an easy-to-read identity in a short format, and avoid generating things like Session IDs in your URLs if you can help it.
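A quick before-and-after with made-up addresses shows the difference:

```
# Hard for people and crawlers to parse (opaque parameters, session ID)
https://www.example.com/index.php?cat=360&sid=3a5ebc944f41daa6f849f730f1

# Short, readable, descriptive
https://www.example.com/services/emergency-plumbing/
```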
10. Write Good-Quality, Unique Content Suited to Search Intent
One thing we have mentioned a ton of times is how great content is the most important factor for ranking. We even took the time to cover how quality content affects ranking in two recent ranking factor study breakdowns.
What we haven’t touched upon quite as much is how you can match keyword use and article purpose to search user intent. As an example, Google suggests that “a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs].”
These aren’t hard and fast rules, but you should always think about how relevant your titles, descriptions and keywords are to the content you’re creating. If you can think of ways to deliver on expectations from search users stumbling onto your page, then do it!
11. Use Best Practices When Creating Anchor Text for Links
Ouch! Even the best SEO experts have been guilty of this one a time or two!
Apparently, Google has fairly strict recommendations when it comes to how and why you use anchor text for links. If you didn’t know, anchor text is the visible, clickable text of a link, whether it points to an internal or an external page.
Google strongly suggests using specific, descriptive phrases for anchor text rather than useless words like “page,” “article,” or “click here.”
They also advise adding the rel="nofollow" attribute when linking to sites with a worse reputation than yours. Whether you’re linking to a site as a bad example of something or just want to direct people there regardless of its poor quality, nofollow lessens the chance its cruddiness will rub off on your site’s domain authority.
Google also recommends setting links in user comments to nofollow by default, since comment sections are a magnet for link spam.
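Pulling those recommendations together in markup (the link targets are made up):

```
<!-- Vague anchor text -->
<a href="/seo-checklist-2018/">Click here</a>

<!-- Descriptive anchor text -->
Review our <a href="/seo-checklist-2018/">2018 SEO checklist</a> before your next audit.

<!-- A link you don't want to vouch for -->
<a href="https://spammy.example" rel="nofollow">this spammy example</a>
```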
12. Use an Image Sitemap
Here’s something else that’s not often talked about: You should create an image sitemap.
Everyone knows about using alt text to help bots index image information, but an image sitemap can help Google index your images even better. That improved indexability helps with both search snippets and image searches.
Take a look at Google’s image sitemap recommendations to get started. And remember to stick to image formats most browsers can read, namely: JPEG, GIF, PNG, BMP and WebP.
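For reference, an image sitemap is an ordinary XML sitemap with image entries added under Google’s image namespace; the URLs below are placeholders:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/services/</loc>
    <image:image>
      <image:loc>https://www.example.com/images/team-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```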
Some Final Notes to Consider
- In content, use keyword synonyms as well as common words and phrases used in relation to keywords.
- Avoid large chunks of text; people like reading descriptive text broken up by subheadings, lists, bullets and videos.
- Consider deleting or redirecting outdated content from your site, or moving it to an archive with a note on why you’re preserving it. An announcement post for an event that has already passed, for example, may have no value left today.
- Be open with disclosing things like affiliate marketing relationships or native ads.
- Keep an eye on user-generated content, especially spam-prone comments.
- Go back and manually remove outdated SEO practices, like keyword spam dumps.
SEO Best Practices Are Changing All the Time, So Partner With an Up-to-Date Atlanta SEO Agency
Some of the revelations in Google’s starter guide came as a surprise to us as SEO continues to evolve, but others have been in our repertoire for years. The truth is that some SEO agencies in Atlanta will steer you wrong with outdated practices that hurt rather than help your chances of ranking.
For instance, the decreasing importance of verbatim keyword use sails right over many SEO “experts’” heads. They might suggest using an exact-match keyword at least 3-6 times in a document, yet the top search results may not contain an exact match anywhere in the body.
These people are operating off of outdated information from 5 or more years ago, and you shouldn’t be!
Get the latest tips and informed best practices by partnering with EverSpark Interactive. Our SEO practices are always fresh, current, effective and based on the latest research.
Take a look at our Atlanta SEO services page to learn more about what we can do for you, then contact us to get the conversation started.