EverSpark’s What’s New In the World of Google marched on into 2014 this week, continuing ESI’s commitment to educating professionals who want to know more about how Google works along with helpful tactics to improve their own websites. This meetup occurs every week on Wednesday from 8-9:30am. All are welcome to attend.
Please join us at our next meeting on Wednesday, January 22nd, 2014 by RSVP’ing on the Meetup.com link below. Can’t make the meeting? No problem. We provide weekly catch up notes on our blog.
The meetup takes place at the EverSpark Interactive offices located at:
6 Concourse Parkway
Atlanta, GA 30328
RSVP here: http://www.meetup.com/Whats-New-In-the-World-of-Google-Everspark-Interactive/
This Week’s Recap:
The Four Commandments of SEO
EverSpark Principal Chris Watson kicked off the meeting by going over four tenets for getting the best SEO results:
- Keep relevancy at the top of your “To Do” list.
Ultimately, Google wants to see you as an educator, or someone who is providing content that answers questions. Adding content to your site is more than just writing “stuff.” In order for your site to be in the best position to get a good rank on Google, your content must be both high quality and relevant to what your site is all about. Think of sites like Wikipedia, or reference guides where people can turn to find information or get answers to questions. Going off topic or adding content just for content’s sake will do you no favors in the eyes of a search engine, no matter how big your site is.
- Ensure your code is clean.
Clean code isn’t hard to achieve, and it’s an essential element of ensuring your site gets that all-important attention from search engines. Double check your title tags and meta descriptions, and regularly test your site for errors. Additionally, make sure you’re page sculpting so that every page has a clear relationship to your site. Regular checks will help keep errors and missing elements to a minimum.
- Don’t forget that Google likes bigger sites.
Expanding your site through pages that answer questions people might search for is a great way to drive traffic to your site. By regularly updating both your blog and your site, Google is more likely to take you seriously and view you as an educator. Think about sites like Wikipedia that people visit to discover information, and set up your site to act like an online classroom for whatever your market is.
- It actually is a popularity contest.
The centerpiece of the entire exercise is popularity. You can do everything else right, but unless your site is popular, it doesn’t stand a chance of ranking on search engines.
Comparing Your Site Against The Competition
Chris went on to show the group how EverSpark starts evaluating a site against its competitors by looking at a variety of factors. In order to understand what you are up against, it’s important to understand the variables that make up a website. A comparative analysis of your site versus your competition shows you where improvements can be made and where to focus your attention.
- What is the age of the domain? Google likes older sites and cares about that first registration date. If you are looking to purchase a domain, this is a good place to start.
- What is the site’s PageRank (PR)? PageRank places a value on a site’s relevance, reliability and authority on the web by assigning a value score of 0-10.
- How many indexed pages does the site have? You can easily determine this by using Google’s Webmaster Tools.
- How fast is the page speed? A slow site is likely to be pushed to the back pages of results. Check your speed with Google’s PageSpeed tools.
- Does the site have a lot of duplicate content? Copyscape is a handy tool for finding out if sites are scraping your content, or if you have too much matching content on your own site.
- How many inbound/dofollow links does the site have? Google tends to like a maximum 3:1 ratio of links to each referring domain. For each domain, try not to have more than three links coming from it, or it can start to look spammy. Focus on quality, not quantity.
- Does the site have videos indexed on Google? Video is getting to be very important to SEO campaigns, and it’s relatively easy to turn a YouTube campaign into an SEO video campaign with Google.
- What is the value of the site? In SEMrush, the value of a site represents the amount of money that would have to be paid for a PPC campaign if the website were not already ranking for those particular keywords.
- Does the site have any egregious errors? For example, 404 or 500 errors, 30x redirect problems, or duplicate or missing title or meta tags.
- What is the site’s social media footprint? While Facebook and Twitter are important, companies often miss the point of Google+ by thinking of it as a social media platform. Instead, think of it as part of the nervous system of Google itself. Putting 1,000 posts on Facebook will not affect your rankings. However, doing the same thing on Google+ will.
What’s New With Google This Week?
Enhanced Google Webmaster Tools
EverSpark Editor Dave Paul recently posted a blog about how Google Webmaster Tools now provides more specific search query data. Until about two years ago, it was relatively easy to see how many visits a specific keyword generated by using Google Analytics. Google removed this feature, citing better privacy and security for its users as the official reason. Fortunately, this data is once again available through Google Webmaster Tools, allowing SEO experts to see which keywords are driving traffic and which ones aren’t working. This kind of information is invaluable for forming and shaping your SEO tactics.
Social Media Now Being Spotted on Google’s First Page of Search Results
Though the first page of Google’s search results was previously thought to be reserved for traditional websites, social media profiles are now making an appearance there, with several even coming up at number one in the organic results. For example, if you google “Tampa DUI lawyer,” the Twitter feed for @DUITampa comes up among the first organic results. It would appear that Google is picking up on the name, rather than the content of the feed. Using the leverage of a powerful site to springboard your own SEO is an interesting theory, and EverSpark is currently studying and testing it for possible upcoming campaigns. Does this mean you should register multiple Twitter handles so you can take up more space? No. Google will most likely show only one or two Twitter or Facebook results on the valuable real estate that is its first page.
Reverse Engineering: A Real-Life, Real-Time Demo (Continued)
Steve Miller, SEO Manager at EverSpark, continued where Jason Hennessey left off last week by introducing the group to a variety of helpful online tools you can use to continue the reverse engineering process, using EverSpark’s newest client, Wellcentive, as an example.
Copyscape is an extremely useful tool for determining whether you have duplicate content out on the web, or whether another party is using your content (i.e., “scraping” it). Should you find duplicate content being used, look for the canonical tag in the code, as Copyscape can sometimes bring up a false positive. By using a canonical link element in your code, webmasters can signal to search engines which site is responsible for the original content.
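As a sketch, the canonical link element goes in the page’s <head>; on a syndicated or duplicate copy, it points back at the original article. The URL here is a placeholder for illustration, not a real client page:

```html
<!-- Placed in the <head> of the duplicate or syndicated page -->
<!-- The href below is a placeholder URL, not a real article -->
<link rel="canonical" href="http://www.example.com/original-article" />
```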
Companies that broadcast press releases can see a lot of duplicate content across the web, as sites will oftentimes simply copy and paste directly from the release. Most press releases contain a canned “About Us” blurb at the bottom. While cookie-cutter text may be good from a branding perspective, it can actually be damaging from an online perspective. This is where smaller companies can crush the large-scale companies who refuse to change policies. The simple addition of a source link within the press release can also help search engines establish where the content originated.
What Can You Do If Someone Is Scraping Your Content?
- File a DMCA Complaint with Google. This essentially notifies Google that a site is stealing your content and you would like them to take action.
- Try sending a Cease and Desist letter to the website’s legal department.
- Attempt to find out what they’re using to crawl your site for content, determine its IP address, and block it.
- Worst case scenario? You may have to rewrite your content to escape the duplication.
- Aim to get Google to see your content first by uploading a sitemap to Webmaster Tools. Additionally, when you make any major revisions, request that Google crawl your site right away.
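If you do manage to pin down a scraper’s IP address, one way to block it is at the web server level. Here is a minimal sketch using Apache 2.2 .htaccess syntax, assuming your host runs Apache; the address is a placeholder from the documentation range, not a real scraper:

```apache
# Deny all requests from a known scraper IP
# (203.0.113.42 is a placeholder address)
Order allow,deny
Allow from all
Deny from 203.0.113.42
```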
Though scraping is problematic, it’s on the way out. As Google becomes better at recognizing where content originates from, even “Black Hat” SEO practitioners are moving away from this practice. Still, it’s a good idea to keep an eye on this and ensure your site is not falling victim to it.
Siteliner can help you find duplicate content on your own site. Since blogs oftentimes accumulate quite a lot of duplicate material, this tool can hunt down what’s coming up as a duplicate and point the shovel at where to start digging. On category pages, try using the “robots” meta tag. This is where you can tell Google that these pages shouldn’t be indexed, but that their links should still be followed. Simply add the following to your meta tags: <meta name="robots" content="noindex, follow">
The only time to use duplicate content is when you’re trying to build up citations for Google Local. In this case, it is okay to use the same address and citation descriptions, as citations are not for ranking but rather for getting your business name out there. A word of warning: do not make the mistake of reusing the “About Us” section from your site.
Too Many Duplicates? What’s The Right Ratio?
Any given full page on your site should be at least 80% unique. If you absolutely must use boilerplate information, try to limit it to 20% or less of the page.
Ahrefs.com can give you a better picture of your URL rankings and links. While this system is great for accuracy, it doesn’t crawl everything; if you want broader coverage, your better bet is MajesticSEO.
Look for relative stability in backlinks and referring domains. You can find out how many links you’re currently getting per site by dividing the backlinks by the referring domains. It’s also important to uncover whether those links are “nofollow” or “dofollow.” One other interesting thing you can do with this site is find out which TLDs (.com, .net, .org) your top referrers are coming from. You can also pinpoint where they are globally, giving you a better idea of your users and your market.
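The links-per-domain arithmetic above can be sketched in a few lines of Python; the backlink and referring-domain counts are made-up placeholders, not real Ahrefs or MajesticSEO figures:

```python
# Placeholder totals for illustration, not real export data
backlinks = 4200          # total backlinks reported for the site
referring_domains = 1500  # unique domains linking to the site

# Average links per referring domain; averages well above 3
# would run afoul of the ~3:1 per-domain guideline mentioned earlier
links_per_domain = backlinks / referring_domains
print(round(links_per_domain, 2))  # 2.8
```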
Netcomber is an extremely useful tool if you want to ascertain how Google sees you and your relationship to other sites. Technically, this is not official Google data, but it gives a good estimation of what Google actually sees. Running your site through Netcomber on a regular basis is a good way to keep on top of what’s going on. It determines ownership and can pick up registration information. Although it will pull in “private registrations,” they will be listed as private with no identifiable information shown.
Netcomber also allows you to view how Google is seeing you relate to other sites. For example, there are oftentimes instances of more than one site saying “We’re the official site.” This site can help point out when that’s happening with your website. It will sometimes find false positives, so look into any strong correlations you find. At the very least, it lets you know where to dig.
New Generic Top-Level Domains (gTLDs)
Recently, Editor Dave Paul posted a blog on EverSpark’s site discussing how global auditing and tax advisory firm KPMG is making the move from a .com extension to .kpmg. While the ramifications for SEO are currently unclear, many experts will certainly be watching over the next couple of months to see the results. More and more gTLDs are opening up every day, meaning you can have almost anything after the “dot,” like .health or .berlin. Given that the cost to take on a gTLD hovers somewhere around the $500,000 mark, it’s easy to wonder whether this is a savvy move or just a risky gamble on KPMG’s part.