Archive for the ‘tools’ Category

Monday, December 29th, 2014

XML Sitemaps – 8 Facts, Tips, and Recommendations for the Advanced SEO

XML Sitemaps for Advanced SEOs

After publishing my last post about dangerous rel canonical problems, I started receiving a lot of questions about other areas of technical SEO. One topic in particular that seemed to generate many questions was how to best use and set up xml sitemaps for larger and more complex websites.

Sure, in its most basic form, webmasters can provide a list of urls that they want the search engines to crawl and index. Sounds easy, right? Well, for larger and more complex sites, the situation is often not so easy. And if the xml sitemap situation spirals out of control, you can end up feeding Google and Bing thousands, hundreds of thousands, or millions of bad urls. And that’s never a good thing.

While helping clients, it’s not uncommon for me to audit a site and surface serious errors with regard to xml sitemaps. And when that’s the case, websites can send Google and Bing mixed signals, urls might not get indexed properly, and both engines can end up losing trust in your sitemaps. And as Bing’s Duane Forrester once said in this interview with Eric Enge:

“Your Sitemaps need to be clean. We have a 1% allowance for dirt in a Sitemap. If we see more than a 1% level of dirt, we begin losing trust in the Sitemap.”

Clearly that’s not what you want happening…

So, based on the technical SEO work I perform for clients, including conducting many audits, I decided to list some important facts, tips, and answers for those looking to maximize their xml sitemaps. My hope is that you can learn something new from the bullets listed below, and implement changes quickly.

 

1. Use RSS/Atom and XML For Maximum Coverage
This past fall, Google published a post on the webmaster central blog about best practices for xml sitemaps. In that post, they explained that sites should use a combination of xml sitemaps and RSS/Atom feeds for maximum coverage.

Xml sitemaps should contain all canonical urls on your site, while RSS/Atom feeds should contain the latest additions or recently updated urls. XML sitemaps will contain many urls, whereas RSS/Atom feeds will only contain a limited set of new or recently changed urls.

RSS/Atom Feed and XML Sitemaps

So, if you have new urls (or recently updated urls) that you want Google to prioritize, then use both xml sitemaps and RSS/Atom feeds. Google says that using RSS can help "keep your content fresher in its index". I don't know about you, but I like the idea of Google keeping my content fresher. :)

Also, it's worth noting that Google recommends maximizing the number of urls per xml sitemap. For example, don't cut up your xml sitemaps into many smaller files (if possible). Instead, use the space you have in each sitemap to include all of your urls. If you don't, Google explains, "it can impact the speed and efficiency of crawling your urls." I recommend reading Google's post to learn how to best use xml sitemaps and RSS/Atom feeds to maximize your efforts. By the way, you can include up to 50K urls per sitemap, and each sitemap must be less than 10MB uncompressed.
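
If you do end up needing more than one sitemap because of those limits, a sitemap index file ties everything together. Here is a minimal sketch of what one might look like (the file names and dates below are just placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
      <loc>http://www.example.com/sitemap-articles-1.xml</loc>
      <lastmod>2014-12-01</lastmod>
    </sitemap>
    <sitemap>
      <loc>http://www.example.com/sitemap-articles-2.xml</loc>
      <lastmod>2014-12-15</lastmod>
    </sitemap>
  </sitemapindex>

Each <loc> entry in the index points to a child sitemap, and each child sitemap stays under the 50K url and 10MB limits mentioned above.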

 

2. XML Sitemaps By Protocol and Subdomain
I find a lot of webmasters are confused by protocol and subdomains, and both can end up impacting how urls in sitemaps get crawled and indexed.

URLs included in xml sitemaps must use the same protocol and subdomain as the sitemap itself. This means that https urls should not be included in a sitemap served over http. It also means that urls on sample.domain.com should not be listed in a sitemap on www.domain.com. So on and so forth.
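
As a quick illustration (using example.com as a placeholder), a sitemap sitting at http://www.example.com/sitemap.xml should only list urls that match that protocol and hostname:

  <!-- Sitemap located at http://www.example.com/sitemap.xml -->
  <url><loc>http://www.example.com/widgets/</loc></url>      <!-- OK: same protocol and subdomain -->
  <url><loc>https://www.example.com/checkout/</loc></url>    <!-- Problem: https url in an http sitemap -->
  <url><loc>http://shop.example.com/cart/</loc></url>        <!-- Problem: different subdomain -->

In that scenario, the https urls and the shop.example.com urls belong in their own sitemaps, submitted for the matching protocol and subdomain.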

XML Sitemaps and Protocol and Subdomains

 

This is a common problem when sites employ multiple subdomains or have sections using both https and http (like ecommerce retailers). And of course, many sites are now switching to https for all urls but haven't updated their xml sitemaps to reflect the change. My recommendation is to check your xml sitemaps reporting today, while also manually checking the sitemaps themselves. You might just find issues that you can fix quickly.

 

3. Dirty Sitemaps – Hate Them, Avoid Them
When auditing sites, I often crawl the xml sitemaps myself to see what I find. And it's not uncommon to find many urls that resolve with non-200 header response codes. For example, urls that return 404s, 302s, 301s, 500s, etc.

Dirty XML Sitemaps

You should only provide canonical urls in your xml sitemaps. You should not provide non-200 header response code urls (or non-canonical urls that point to other urls). The engines do not like “dirty sitemaps” since they can send Google and Bing on a wild goose chase throughout your site. For example, imagine driving Google and Bing to 50K urls that end up 404ing, redirecting, or not resolving. Not good, to say the least.

Remember Duane’s comment from earlier about “dirt” in sitemaps. The engines can lose trust in your sitemaps, which is never a good thing SEO-wise. More about crawling your sitemaps later in this post.

 

4. View Trending in Google Webmaster Tools
Many SEOs are familiar with xml sitemaps reporting in Google Webmaster Tools, which can help surface various problems, while also providing important indexation statistics. Well, there's a hidden visual gem in the report that's easy to miss. The default view will show the number of pages submitted in your xml sitemaps and the number indexed. But if you click the "sitemaps content" box for each category, you can view trending over the past 30 days. This can help you identify bumps in the road, or surges, as you make changes.

For example, check out the trending below. You can see the number of images submitted and indexed drop significantly over a period of time, only to climb back up. You would definitely want to know why that happened, so you can avoid problems down the line. Sending this to your dev team can help them identify potential problems that can build over time.

XML Sitemaps Trending in Google Webmaster Tools

 

5. Using Rel Alternate in Sitemaps for Mobile URLs
When using mobile urls (like m.), it’s incredibly important to ensure you have the proper technical SEO setup. For example, you should be using rel alternate on the desktop pages pointing to the mobile pages, and then rel canonical on the mobile pages pointing back to the desktop pages.

Although not an approach I often push for, you can provide rel alternate annotations in your xml sitemaps. The annotations look like this:

Rel Alternate in XML Sitemaps
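
For reference, here is a simplified sketch of that annotation, using placeholder urls (www.example.com for the desktop page and m.example.com for the mobile page):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
      <loc>http://www.example.com/page-1/</loc>
      <xhtml:link rel="alternate"
                  media="only screen and (max-width: 640px)"
                  href="http://m.example.com/page-1/" />
    </url>
  </urlset>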

 

It’s worth noting that you should still add rel canonical to the source code of your mobile pages pointing to your desktop pages.
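
To tie the two together, here is roughly what those page-level tags would look like in the html head (placeholder urls again). If you handle rel alternate via your sitemap as shown above, the desktop tag is covered by the sitemap, but the rel canonical on the mobile page is still needed:

  <!-- On the desktop page: http://www.example.com/page-1/ -->
  <link rel="alternate"
        media="only screen and (max-width: 640px)"
        href="http://m.example.com/page-1/" />

  <!-- On the mobile page: http://m.example.com/page-1/ -->
  <link rel="canonical" href="http://www.example.com/page-1/" />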

 

6. Using hreflang in Sitemaps for Multi-Language Pages
If you have pages that target different languages, then you are probably already familiar with hreflang. Using hreflang, you can tell Google which pages should target which languages. Then Google can surface the correct pages in the SERPs based on the language/country of the person searching Google.

Similar to rel alternate, you can either provide the hreflang code in a page’s html code (page by page), or you can use xml sitemaps to provide the hreflang code. For example, you could provide the following hreflang attributes when you have the same content targeting different languages:

Hreflang in XML Sitemaps
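
Here is a simplified sketch of what that markup might look like for an English and German version of the same page (the urls are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
      <loc>http://www.example.com/english/page.html</loc>
      <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html" />
      <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html" />
    </url>
    <url>
      <loc>http://www.example.com/deutsch/page.html</loc>
      <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html" />
      <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html" />
    </url>
  </urlset>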

Just be sure to include a separate <loc> element for each url that contains alternative language content (i.e. all of the sister urls should be listed in the sitemap via a <loc> element).

 

7. Testing XML Sitemaps in Google Webmaster Tools
Last, but not least, you can test your xml sitemaps or other feeds in Google Webmaster Tools. Although easy to miss, there is a red “Add/Test Sitemap” button in the upper right-hand corner of the Sitemaps reporting page in Google Webmaster Tools.

Test XML Sitemaps in Google Webmaster Tools

When you click that button, you can add the url of your sitemap or feed. Once you click “Test Sitemap”, Google will provide results based on analyzing the sitemap/feed. Then you can rectify those issues before submitting the sitemap. I think too many webmasters use a “set it and forget it” approach to xml sitemaps. Using the test functionality in GWT, you can nip some problems in the bud. And it’s simple to use.

Results of XML Sitemaps Test in Google Webmaster Tools

 

8. Bonus: Crawl Your XML Sitemap Via Screaming Frog
In SEO, you can either test and know, or read and believe. As you can probably guess, I’m a big fan of the former… For xml sitemaps, you should test them thoroughly to ensure all is ok. One way to do this is to crawl your own sitemaps. By doing so, you can identify problematic tags, non-200 header response codes, and other little gremlins that can cause sitemap issues.

One of my favorite tools for crawling sitemaps is Screaming Frog (which I have mentioned many times in my previous posts). By setting the crawl mode to “list mode”, you can crawl your sitemaps directly. Screaming Frog natively handles xml sitemaps, meaning you don’t need to convert your xml sitemaps into another format before crawling (which is awesome).

Crawling Sitemaps in Screaming Frog

Screaming Frog will then load your sitemap and begin crawling the urls it contains. In real-time, you can view the results of the crawl. And if you have Graph View up and running during the crawl, you can visually graph the results as the crawler collects data. I love that feature. Then it’s up to you to rectify any problems that are surfaced.

Graph View in Screaming Frog

 

Summary – Maximize and Optimize Your XML Sitemaps
As I've covered throughout this post, there are many ways to use xml sitemaps to maximize your SEO efforts. Clean xml sitemaps can help you inform the engines about all of the urls on your site, including the most recent additions and updates. It's a direct feed to the engines, so it's important to get it right (especially for larger and more complex websites).

I hope my post provided some helpful nuggets of sitemap information that enable you to enhance your own efforts. I recommend setting some time aside soon to review, crawl, audit, and then refine your xml sitemaps. There may be some low-hanging fruit changes that can yield nice wins. Now excuse me while I review the latest sitemap crawl. :)

GG

 

Tuesday, July 22nd, 2014

How To Get More Links, Crawl Errors, Search Queries, and More By Verifying Directories in Google Webmaster Tools

Verify by Directory in Google Webmaster Tools

In my opinion, it's critically important to verify your website in Google Webmaster Tools (GWT). By doing so, you can receive information directly from Google as it crawls and indexes your website. There are many reports in GWT that can help identify various problems SEO-wise. For example, you can check the crawl errors report to surface problems Googlebot is encountering while crawling your site. You can check the HTML improvements section to view problems with titles, descriptions, and other metadata. You can view your inbound links as picked up by Google (more on that soon). You can check xml sitemaps reporting to view warnings, errors, and the indexed to submitted ratio. You can view indexation by directory via Index Status (forget about a site: command; Index Status enables you to view your true indexation number).

In addition to the reporting you receive in GWT, Google will communicate with webmasters via “Site Messages”. Google will send messages when it experiences problems crawling a website, when it picks up errors or other issues, and of course, if you’ve received a manual action (penalty). That’s right, Google will tell you when your site has been penalized. It’s just another important reason to verify your website in GWT.

Limit On Inbound Links for Sites With Large Profiles
And let's not forget about links. Using Google Webmaster Tools, you can view and download the inbound links leading to your site (as picked up by Google). And in a world filled with Penguins, manual actions, and potential negative SEO, it's extremely important to view your inbound links, and often. Sure, there's a limit of ~100K links that you can download from GWT, which can be restrictive for larger and more popular sites, but I'll cover an important workaround soon. And that workaround doesn't just apply to links. It applies to a number of other reports too.

When helping larger websites with SEO, it's not long before you run into the dreaded limit problem with Google Webmaster Tools. The most obvious limit is with inbound links: you can only download roughly 100K links from GWT. For most sites, that's not a problem. But for larger sites, it can be extremely limiting. For example, I'm helping one site now with 9M inbound links. Trying to hunt down link problems at the site-level is nearly impossible via GWT with a link profile that large.

Inbound Links in Google Webmaster Tools

 

When you run into this problem, third party tools can come in very handy, like Majestic SEO, ahrefs, and Open Site Explorer. And you should also download your links from Bing Webmaster Tools, which is another great resource SEO-wise. But when you are dealing with a Google problem, it’s optimal to have link data directly from Google itself.

So, how do you overcome the link limit problem in GWT? Well, there’s a workaround that I’m finding many webmasters either don’t know about or haven’t implemented yet – verification by directory.

Verification by Directory to the Rescue
If you’ve been following along, then you can probably see some issues with GWT for larger, complex sites. On the one hand, you can get some incredible data directly from Google. But on the other hand, larger sites inherently have many directories, pages, and links to deal with, which can make your job analyzing that data harder to complete.

This is why I often recommend verifying by directory for clients with larger and more complex websites. It's a great way to dig deep into specific areas of a website. As mentioned earlier, I've found that many business owners don't even know you can verify by directory! Yes, you can, and I recommend doing that today (even if you have a smaller site with distinct directories of content you want to monitor). For example, if you have a blog, you can verify the blog subdirectory in addition to your entire site. Then you can view reporting that's focused on the blog (versus muddying up the reporting with data from outside the blog).

Add A Directory in Google Webmaster Tools

And again, if you are dealing with an inbound links problem, then isolating specific directories is a fantastic way to proceed to get granular links data. There’s a good chance the granular reporting by directory could surface new unnatural links that you didn’t find via the site-level reporting in GWT. The good news is that verifying your directories will only take a few minutes. Then you’ll just need to wait for the reporting to populate.

Which Reports Are Available For Directories?
I’m sure you are wondering which reports can be viewed by subdirectory. Well, many are available by directory, but not all. Below, you can view the reports in GWT that provide granular data by directory.

  • Search Queries
  • Top Pages (within Search Queries reporting)
  • Links to Your Site
  • Index Status
  • Crawl Errors (by device type)
  • HTML Improvements
  • Internal Links
  • International Targeting (New!)
  • Content Keywords
  • Structured Data

 

GWT Reporting by Directory – Some Examples

Indexation by Directory
Let’s say you’re having a problem with indexation. Maybe Google has only indexed 60% of your total pages for some reason. Checking the Index Status report is great, but doesn’t give you the information you need to isolate the problem.  For example, you want to try and hunt down the specific areas of the site that aren’t indexed as heavily as others.

If you verify your subdirectories in GWT, then you can quickly check the Index Status report to view indexation by directory. Based on what you find, you might dig deeper to see what's going on in specific areas of your website. For example, running crawls of that subdirectory via several tools could help uncover potential problems. Are there roadblocks you are throwing up for Googlebot? Are you mistakenly using the meta robots tag in that directory? Is the directory blocked by robots.txt? Is your internal linking weaker in that area? Viewing indexation by directory is a logical first step to diagnosing a problem.

How To View Index Status by Directory in Google Webmaster Tools

 

Search Queries by Directory
Google Webmaster Tools provides search queries (keywords) that have returned pages on your website (over the past 90 days). Now that we live in a “not provided” world, the search queries reporting is important to analyze and export on a regular basis. You can view impressions, clicks, CTR, and average position for each query in the report.

But checking search queries at the site level can be a daunting task in Google Webmaster Tools. What if you wanted to view the search query data for a specific section instead? If you verify by directory, then all of the search query data will be limited to that directory. That includes impressions, clicks, CTR, and average position for queries leading to content in that directory only.

In addition, the “Top Pages” report will only contain the top pages from that directory. Again, this quickly enables you to hone in on content that’s receiving the most impressions and clicks.

And if you feel like there has been a drop in performance for a specific directory, then you can click the “with change” button to view the change in impressions, clicks, CTR, and average position for the directory. Again, the more granular you can get, the more chance of diagnosing problems.

How To View Search Query Reporting by Directory in Google Webmaster Tools

 

Links by Directory
I started explaining more about this earlier, and it’s an extremely important example. When you have a manual action for unnatural links, you definitely want to see what Google is seeing. For sites with large link profiles, GWT is not ideal. You can only download ~100K links, and those can be watered down by specific pieces of content or sections (leaving other important sections out in the cold).

When you verify by directory, the “links to your site” section will be focused on that specific directory. And that’s huge for sites trying to get a better feel for their link profile, unnatural links, etc. You can see domains linking to your content in a specific directory, your most linked content, and of course, the actual links. And you can download the top ~100K links directly from the report.

In addition, if you are trying to get a good feel for your latest links (like if you’re worried about negative SEO), then you can download the most recent links picked up by Google by clicking the “Download latest links” button.  That report will be focused on the directory at hand, versus a site-level download.

I’m not saying this is perfect, because some directories will have many more links than 100K. But it’s much stronger than simply downloading 100K links at the site-level.

How To View Inbound Links by Directory in Google Webmaster Tools

 

Crawl Errors By Directory
If you are trying to analyze the health of your website, then the Crawl Errors reporting is extremely helpful to review. But again, this can be daunting with larger websites (as all pages are reported at the site-level). But if you verify by directory, the crawl errors reporting will be focused on a specific directory. And that can help you identify problems quickly and efficiently.

In addition, you can view crawl errors reporting by Google crawler. For example, Googlebot versus Googlebot for Smartphones versus Googlebot-mobile for Feature Phones. By drilling into crawl errors by directory, you can start to surface problems at a granular level. This includes 404s, 500s, Soft 404s, and more.

How To View Crawl Errors by Directory in Google Webmaster Tools

Summary – Get Granular To View More Google Webmaster Tools Data
Verifying your website in Google Webmaster Tools is extremely important on several levels (as documented above). But verifying by directory is also important, as it enables you to analyze specific parts of a website on a granular basis. I hope this post convinced you to set up your core directories in GWT today.

To me, it’s critically important to hunt down SEO problems as quickly as possible. The speed at which you can identify, and then rectify, those problems can directly impact your overall SEO health (and traffic to your site). In addition, analyzing granular reporting can help surface potential problems in a much cleaner way than viewing site-wide data. And that’s why verifying subdirectories is a powerful way to proceed (especially for large and complex sites).  So don’t hesitate. Go and verify your directories in Google Webmaster Tools now. More data awaits.

GG

 

 

Tuesday, January 7th, 2014

Rap Genius Recovery: Analyzing The Keyword Gains and Losses After The Google Penalty Was Lifted

Rap Genius Recovers From Google Penalty

On Christmas Day, Rap Genius was given a heck of a gift from Google.  A penalty that sent their rankings plummeting faster than an anvil off the Eiffel tower.  The loss in traffic has been documented heavily as many keywords dropped from page one to page five and beyond.  And many of those keywords used to rank in positions #1 through #3 (or prime real estate SEO-wise).  Once the penalty was in place, what followed was a huge decrease in visits from Google organic, since most people don’t even venture to page two and beyond.  It’s like Siberia for SEO.

Gaming Links
So what happened that Google had to tear itself away from eggnog and a warm fire to penalize a lyrics website on Christmas Day? Rap Genius was gaming links, and badly. No, not just badly, but with such disregard for the consequences that they were almost daring Google to take action. That is, until Matt Cutts learned of the matter and took swift action against Rap Genius.

That was Christmas Day. Ho, ho, ho. You get coal in your lyrical stocking. I won't go nuts here explaining the ins and outs of what they were doing. That's been documented heavily across the web. In a nutshell, they were exchanging tweets for links. If bloggers added a list of rich anchor text links to their posts, then Rap Genius would tweet links to their content. The bloggers got a boatload of traffic and Rap Genius got links (and a lot of them using rich anchor text like {artist} + {song} + lyrics). Here's a quick screenshot of one page breaking the rules:

Rap Genius Unnatural Links

A 10 Day Penalty – LOL
Now, I help a lot of companies with algorithmic hits and manual actions.  Many of the companies contacting me for help broke the rules and are seeking help in identifying and then rectifying their SEO problems.  Depending on the situation, recovery can take months of hard work (or longer).  From an unnatural links standpoint, you need to analyze the site’s link profile, flag unnatural links, remove as many as you can manually, and then disavow the rest.  If you only have 500 links leading to your site, this can happen relatively quickly.  If you have 5 million, it can be a much larger and nastier project.

Rap Genius has 1.5 million links showing in Majestic’s Fresh Index.  And as you start to drill into the anchor text leading to the site, there are many questionable links.  You can reference their own post about the recovery to see examples of what I’m referring to.  Needless to say, they had a lot of work to do in order to recover.

So, you would think that it would take some time to track down, remove, and then disavow the unnatural links that caused them so much grief.  And then they would need to craft a serious reconsideration request documenting how they broke the rules, how they fixed the problem, and of course, offer a sincere apology for what they did (with a guarantee they will never do it again).   Then Google would need to go through the recon request, check all of the removals and hard work, and then decide whether the manual action should be lifted, or if Rap Genius had more work to do.  This should take at least a few weeks, right?  Wrong.  How about 10 days.

Rap Genius Recovers After 10 Days

Only 10 days after receiving a manual action, Rap Genius is back in Google. As you can guess, the SEO community was not exactly thrilled with the news. Screams of special treatment rang through the twitterverse, as Rap Genius explained that Google helped them, to some degree, understand how best to tackle the situation (or what to target). Believe me, that's rare. Really rare…

Process for Removing and Disavowing Links
Rap Genius wrote a post about the recovery on January 4th, which included the detailed process for identifying and then dealing with unnatural links.  They had thousands of links to deal with, beginning with a master list of 178K.  From that master list, they started to drill into specific domains to identify unnatural links.   Once they did, Rap Genius removed what they could and disavowed the rest using Google’s Disavow Tool.   Following their work, Google removed the manual action on January 4th and Rap Genius was back in Google.

But many SEOs wondered how much they came back, especially since Rap Genius had to nuke thousands of links.  And many of those links were to deeper pages with rich anchor text.  Well, I’ve been tracking the situation from the start, checking which keywords dropped during the penalty, and now tracking which ones returned to high rankings after the penalty was lifted.  I’ll quickly explain the process I used for tracking rankings and then provide my findings.

My Process for Analyzing Rankings (With Some Nuances)
When the penalty was first applied to Rap Genius, I quickly checked SEMRush to view the organic search trending and to identify keywords that were “lost” and ones that “declined”.  Rap Genius ranks for hundreds of thousands of keywords according to SEMRush and its organic search reporting identified a 70K+ keyword loss based on the penalty.

Note, you can't compare third party tools to a website's own analytics reporting, and SEMRush won't cover every keyword leading to the site. But, for larger sites with a lot of volume, SEMRush is a fantastic tool for viewing the gains and losses for a specific domain. I've found it to be extremely thorough and accurate.

The lost and declined keywords that SEMRush reported lined up with my manual checks. Those keywords definitely took a plunge, with Rap Genius appearing on page five or beyond. And as I mentioned earlier, that's basically Siberia for organic search.

When the penalty was lifted, I used the same process for checking keywords, but this time I checked the "new" and "improved" categories. The reporting showed 43K+ keywords in the "new" category, which means those keywords did not rank the last time SEMRush checked those queries.

I also used Advanced Web Ranking to check 500 of the top keywords that were ranking prior to the penalty (and that dropped after the manual action was applied).  The keywords I checked were all ranking in the top ten prior to the penalty.  Once the penalty was lifted, I ran the rankings for those keywords.  I wanted to see how much of an improvement there was for the top 500 keywords.

Then I dug into the data based on both SEMRush and Advanced Web Ranking to see what I could find.  I have provided my findings below.   And yes, this is a fluid situation, so rankings could change.  But we have at least a few days of data now.  Without further ado, here’s what I found.

 

Branded Keywords
This was easy. Branded keywords that were obliterated during the penalty returned quickly with strong rankings.  This was completely expected.  For example, if you search for rap genius, rapgenius, or any variation, the site now ranks at the top of the search results.  And the domain name ranks with sitelinks. No surprises here.

Rap Genius Branded Keywords

Category Keywords
For category keywords, like “rap lyrics”, “favorite song lyrics”, and “popular song lyrics”, I saw mixed results after the recovery.  For example, the site now ranks #1 for “rap lyrics”, which makes sense, but does not rank well for “favorite song lyrics” and “popular song lyrics”.  And it ranked well for each of those prior to the penalty.  Although specific song lyric queries are a driving force for rap genius (covered soon), category keywords can drive a lot of volume.  It’s clear that the site didn’t recover for a number of key category keywords.

Rap Genius Category Keywords

 

Artist Keywords
I noticed that the site ranked for a lot of artists prior to the penalty (just the artist name with no modifiers).  For example, “kirko bangz”, “lil b”, etc.  Similar to what I saw with category keywords, I saw mixed results with artists.  Searching for the two artists I listed above does not yield high rankings anymore, when they both ranked on page one prior to the penalty.  Some increased in rankings, but not to page one.  For example, “2 chainz” ranks #12 after the penalty was lifted.  But it was MIA when the penalty was in effect.  Another example is “Kendrick Lamar”, which Rap Genius ranked #8 for prior to the penalty.  The site is not ranking well at all for that query now.  So again, it seems that Rap Genius recovered for some artist queries, but not all.

Rap Genius Artist Keywords

Lyrics Keywords
Based on my research, I could clearly see the power of {song} + lyrics queries for Rap Genius. It's a driving force for the site. And Rap Genius is now ranking again for many of those queries. When the penalty was first lifted, I started checking a number of those queries and saw Rap Genius back on page one, and sometimes #1. But when I started checking at scale, you could definitely see that not all keywords returned to high rankings.

Rap Genius High Rankings for Lyrics Keywords

For example, "hallelujah lyrics", "little things lyrics", and "roller coaster lyrics" are still off of page one. Then there are keywords that skyrocketed back up the charts, I mean search rankings. For example, "swimming pool lyrics", "marvins room lyrics", and "not afraid lyrics" all returned once the penalty was lifted (after being buried). So, it seems that many song lyrics keywords returned, but there are some that rank page two and beyond.

Rap Genius Low Rankings for Lyrics Keywords

What About Keywords That Were Gamed?
I'm sure some of you are wondering how Rap Genius fared for keywords that were gamed via unnatural links. For example, "22 two's lyrics" yields extremely strong rankings for Rap Genius, when it was one of the songs gamed via the link scheme. Actually, Rap Genius ranks twice in the top 5. Go figure.

Rap Genius Rankings for Gamed Links - Jay Z

Ditto for “timbaland know bout me”, which was also one of the songs that made its way into the spammy list of links at the end of articles and posts.  Rap Genius ranks #3 right now.

Rap Genius Rankings for Gamed Links - Timbaland

And then there’s Justin Bieber, which I can’t cover with just one sentence.  Rap Genius currently ranks on page 3 for “Justin Bieber song lyrics”, when it used to rank #8!  And then “Justin Bieber baby lyrics” now ranks #12 on page 2, when it used to rank #8.  But for “Justin Bieber lyrics”, Rap Genius is #10, on page one.

Rap Genius Rankings for Justin Bieber Lyrics

Overall, I saw close to 100 Justin Bieber keywords pop back into the top few pages of Google after the penalty was lifted. But many were not on page one anymore… quite a few of those keywords now yield rankings on page two or beyond for Rap Genius. See the screenshot below:

Rap Genius Keywords for Justin Bieber

 

Summary – Rap Genius Recovers, But The Scars Remain
So there you have it.  A rundown of where Rap Genius is after the penalty was lifted.  Again, I can’t see every keyword that was lost or gained during the Christmas Day fiasco, but I could see enough of the data.  It seems that Rap Genius came back strong, but not full-blast.  I saw many keywords return, but still a number that remain buried in Google.

But let's face it, a 10 day penalty is a slap on the wrist for Rap Genius. They now have a clean(er) platform back, and can build on that platform. That's a lot better than struggling for months (or longer) with horrible rankings. As I explained earlier, too many business owners aren't as lucky as Rap Genius. Ten days and help from Google can certainly speed up the recovery process. That's for sure.

I’ll end with one more screenshot to reinforce the fact that Rap Genius is back.  And it’s a fitting query. :)

Rap Genius I'm Sorry

GG

 

 

Monday, August 12th, 2013

Facebook Ads for eCommerce – How To Combine Custom Audiences, Lookalikes, and Unpublished Posts to Target Customers and Similar Users

How to use unpublished posts as Facebook Ads

I used to be extremely critical of Facebook Ads. But that was before Facebook released a boatload of functionality for enhancing your campaigns. Sure, marketplace ads (the ads running along the right sidebar) have seen declining engagement over the years, but that's just a fraction of what you can do now with Facebook Ads. And I'm finding many advertisers don't know about the powerful options available to them.

For example, there’s FBX (or retargeting on Facebook), news feed targeting, mobile-only targeting, promoted posts, custom audiences, lookalike audiences, unpublished posts, etc.  And with this enhanced functionality comes better targeting and performance.  Now, I still think paid search can reach someone who is searching for a specific solution at the exact time they need it, and social advertising can’t do that (yet).  But, using advanced targeting within Facebook can absolutely make an impact, and on multiple levels.

In this post, I’m going to explain one method of using three pieces of functionality in Facebook Ads that might change your view of social advertising.  It has for me, and I’ve been using this technique for some time now.  It leverages unpublished posts, custom audiences, and lookalike audiences to target your current customers, and users similar to your customers, when you are running a specific promotion or sale.  It’s a great way to make the most of your current assets, and at a relatively low cost.

Meet Unpublished Posts
I find many business owners have no idea what unpublished posts are.  If you fit into this category, then today is your lucky day.  Unpublished posts enable page owners to create page updates that don’t get shared with their entire fan base.  In addition, you can run ads based on the unpublished posts and use a wealth of ad targeting to reach the right audience (which can include current customers).  Interesting, right?

Unpublished posts in Facebook

The easiest way to create an unpublished post is to use Power Editor.  And if you’re running Facebook Ads and not using Power Editor, you should start today.  It offers a lot of functionality and targeting options not available in Ads Manager (which is what advertisers use on Facebook’s website).

By clicking “Manage Pages” in Power Editor, you can actually craft a page post.  But since we want an unpublished post, you can create the update and not publish it.  That’s ultra-important, since we want to use the post as an ad, and not an update that’s broadcast to your entire fan base.

Creating an unpublished post in Facebook using Power Editor.

So, if you’re an ecommerce provider running a specific sale, you could create an update focusing on that sale, with an understanding it will reach a very specific audience (and not every fan).  I’ll cover how to target specific parts of your customer list soon, including people that are similar to those users.  Once you create your post, you can click your account ID in the left pane to return to your ads dashboard (in Power Editor).

Now we’re ready to talk about custom audiences and lookalikes.

Meet Custom Audiences and Lookalikes
I wrote a post earlier in the year about custom audiences in Facebook.  You should read that post to learn how to set them up.  You’ll need a custom audience in order to use the method I’m covering in this post (since that’s the audience you will target, and it’s also the list you will use to create a lookalike audience).

Custom audiences enable you to upload a list of current customers, based on your in-house email list.  Then, Facebook will match up the list with users on the social network.  Yes, you read that correctly.  That means you can target your in-house email list (or parts of that list) via Facebook Ads.  Awesome, right?

Using Custom Audiences in Facebook

Once your custom audience is created, you can use that list to target current customers with specific promotions and sales.  And you can use unpublished posts to reach them.  Did you catch that?  I said unpublished posts.  That means getting your targeted promotion in front of your current customers (whether they are fans of your page or not).

Great, but what’s a lookalike?
Lookalike audiences enable you to base a new audience (set of Facebook users) on a custom audience (your current customers).  Facebook reviews a number of characteristics about your custom audience (your current customer base), and then finds people similar to your customers.  Yes, once again, eye-opening targeting opportunity ahead.

Imagine you had five custom audiences set up, all containing specific customers for specific categories of products.  Then you could use lookalikes to find similar people (which you can then target via Facebook Ads).  The old days of Facebook ads seem so caveman-like, right?  :)

How To Set Up Lookalikes
Once you have set up a custom audience (following my tutorial), then you can easily select that audience in Power Editor, and choose “Create Similar Audience”.  Choose “Similarity” in the dialog box and Facebook will find users that are similar to your in-house list (based on a number of criteria).  It could take up to 24 hours to create the list, but I’ve seen it take much less time than that (especially for smaller lists).

Using Lookalike Audiences in Facebook

Combining Unpublished Posts, Custom Audiences, and Lookalikes
OK, we have covered unpublished posts that contain targeted messages about new promotions or sales.  We have also covered custom audiences based on our in-house email list.  And, we have covered lookalike audiences, which enable us to target similar people to our own customers.  Now we are ready to tie them together.

1. Create a New Campaign
In Power Editor, you can create a new campaign and set the campaign parameters like name, budget, etc.

Creating a new Facebook campaign in Power Editor.

2. Create a New Ad
Click the “Ads” tab to create your ad.  Under “Type”, choose “Ad”, and then select the radio button labeled “For a Facebook Page Using a Page Post”.  That will enable you to choose an unpublished post for your ad.

Creating an unpublished post ad in Facebook.

3. Choose a Destination
For “Destination”, choose your Facebook Page.  Note, your page’s image and title will still link users to your page, but the post itself can drive users to the sale landing page on your website.  Your post itself is where you should place the link to your landing page (on your own site).  In addition, you should add tracking parameters to your destination urls for your unpublished post (so you can track each campaign via your analytics package).
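
For example, the link within your unpublished post could point to something like the following (a hypothetical landing page with standard Google Analytics utm parameters appended):

  http://www.example.com/spring-sale/?utm_source=facebook&utm_medium=cpc&utm_campaign=spring-sale-customers
  http://www.example.com/spring-sale/?utm_source=facebook&utm_medium=cpc&utm_campaign=spring-sale-lookalike

If you follow the two-campaign approach covered later (one campaign for your custom audience and one for your lookalike audience), using a different utm_campaign value for each lets you compare performance by audience in your analytics package.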

Choosing an ad destination for unpublished post ad in Facebook.

4. Select An Unpublished Post
Now, choose your unpublished post to use that post as the actual ad.  Note, you can also create your unpublished post at this stage (using Power Editor).  That’s a nice feature that was recently added.

Selecting a page post for an unpublished post ad in Power Editor.

5. Choose your placement:
OK, how awesome is this?  You get to choose where your unpublished post shows up.  For example, in the News Feed (Desktop and Mobile).  This is the most powerful placement in my opinion.  Your ads will show up directly in someone’s news feed versus along the right side.

Choosing ad placement for unpublished post in Power Editor.

6. Choose Your Targeting
Under “Audience”, you can choose targeting, based on the goals of your campaign.  Note, this is not where you will choose your custom or lookalike audience, although the tab is titled “Audience”.  You can choose location, age, gender, etc. if you want more granular targeting than just the custom audiences we created earlier.

Choosing ad targeting for unpublished post in Power Editor.

7. Choose Your Audience (Yes, this is what we’ve been waiting for.)
Under “Advanced Options”, you’ll notice the first field is titled “Custom Audiences”.  If you start typing in that field, your custom audience should show up (based on what you named the audience when you created it).  Once selected, it should show up in the field.  You can leave the rest of the targeting options located below as-is.

Selecting a custom audience for an unpublished post ad in Power Editor.

Clarification Side Note:
To clarify what we’ve been doing, this ad will target your current customer list.  When you create a second campaign, you can choose your lookalike audience.  Then you can run both campaigns and target both your current customer list and people similar to your current customers.   And since they are in separate campaigns, with separate tracking parameters, you can track performance by audience.  Awesome.

8. Select Your Pricing and Status Options
For this example, let’s choose CPC and enter the desired cost per click.  Facebook will provide a suggested CPC to the right.  Once completed, you’re ready to rock.

How to set pricing for an unpublished post ad in Power Editor.

9. Upload Your Campaign
Click “Upload” in Power Editor and your ad will be uploaded to Facebook, where it will need to be approved.  Once approved, you’ll receive a notification that your unpublished post is live.

Uploading an unpublished post ad using Power Editor.

Why this approach works:

1. Exposure and Sharing
By using this approach, you can get your latest sale or promotion in front of your current customers as they browse Facebook, while also providing a great opportunity for that sale or promotion to get shared. For example, a current customer might like your update, and it could hit their friends' news feeds, which can provide even more exposure and opportunities to land new customers.

2. Engagement
Even though the unpublished post is technically an ad, it still looks and works like a typical page post update.  That means users can like, share, and comment on the post.  And yes, users often do like and comment on unpublished post ads.  Remember, the unpublished post ad is hitting users’ news feeds (both desktop and mobile), so there is a strong chance they will be exposed to your ad.   And if it’s crafted well, then there’s a chance that a certain percentage of that audience will engage with the post. It’s a great way to engage your current customers, while also engaging similar people (via a lookalike audience).

3. Page Likes
Gaining more page likes is an added benefit to using this approach.  Sure, you want people to click through to your sale landing page and buy, but you probably also want more page likes (so you can reach more people with your organic status updates down the line).  I’ve seen unpublished post ads work extremely well for gaining more page likes (across industries).  For example, a recent campaign I launched increased page likes by 7% during a one week period.  Not bad, when you take into account the other benefits from running the campaign (like exposure, sharing, engagement, and sales – which I’ll cover next).

4. Sales (and other important conversions)
Using this approach can be a low-CPA, high-ROAS method for increasing sales tied to specific promotions. I've run campaigns where the CPC was under $0.40 per click, and depending on the specific campaign, return on ad spend (ROAS) can be extremely strong. For example, 2,000 clicks at $0.40 per click is $800 in spend. A conversion rate of 2.0% and an average order value of $75 would yield 40 sales and $3,000 in revenue (3.75x the spend, or a 275% return after subtracting the $800 in ad costs). That's just a small and quick example, but unpublished page post ads could provide a shot in the arm pretty quickly.

And from a B2B standpoint, with average order values typically much higher than B2C, the ROAS could be even greater. Even a handful of sales could generate thousands (or tens of thousands) of dollars in revenue. For example, a recent campaign I launched for a client focused on items starting at $1000 (and some were up to $5000 per item). Even one sale at $5K based on the campaign I mentioned before would yield a strong ROAS.

And let's not forget other important micro-conversions on your website. For example, newsletter signups (which can be a great driver of revenue for any ecommerce provider), app downloads, requests for more information, etc. all fall under this category and can start forging a relationship between prospective customers and your business.

What’s the Downside?
OK, I love using this approach, but social advertising brings some unique challenges with it.  Since what we’ve covered is an actual page post, and not a straight ad, users can interact with it.  That means both positive and negative interaction can occur.  For example, you might have some unhappy customers post their negative feedback in the unpublished page post ad.  How you deal with that situation is for another post, but I always recommend addressing the problem directly (in the post).  But again, there are several situations that can arise, and I’ll try and address them in a future post.  Just keep in mind that users can comment, and those comments might not always be positive.

The Power of Unpublished Posts, Custom Audiences, and Lookalikes
After reading this post, I hope you better understand the power of using unpublished posts along with custom audiences and lookalike audiences.  Unfortunately, the features and functionality I covered in the post are not readily apparent to many Facebook advertisers.  And that’s a shame, since they can be extremely effective for businesses looking to engage current customers and new audiences, while also increasing sales.  I recommend testing this approach soon to see if it can be effective for your business.

You can start today. Create a custom audience, create a lookalike audience, and use Power Editor to create unpublished post ads.  You may never look back.  :)

GG

 

Wednesday, July 10th, 2013

Avoiding Dirty Sitemaps – How to Download and Crawl XML Sitemaps Using Screaming Frog

Dirty XML Sitemaps

SEO Audits are a core service I provide, including both comprehensive audits and laser-focused audits tied to algorithm updates.  There are times during those audits that I come across strange pages that are indexed, or I see crawl errors for pages not readily apparent on the site itself.  As part of the investigation, it’s smart to analyze and crawl a website’s xml sitemap(s) to determine if that could be part of the problem.  It’s not uncommon for a sitemap to contain old pages, pages leading to 404s, application errors, redirects, etc.  And you definitely don’t want to submit “dirty sitemaps” to the engines.

What’s a Dirty Sitemap?
A dirty sitemap is an xml sitemap that contains 404s, 302s, 500s, etc. Note, those are header response codes. A 200 code is ok, while the others signal various errors or redirects. Since the engines will retrieve your sitemap and crawl your urls, you definitely don't want to feed them errors. Instead, you want your xml sitemaps to contain canonical urls on your site, and urls that resolve with a 200 code. Duane Forrester from Bing is on record saying that Bing has very little tolerance for "dirt in a sitemap". And Google feels the same way. Therefore, you should avoid dirty sitemaps so the engines can build trust in them (versus encountering 404s, 302s, 500s, etc.).

Indexed to Submitted Ratio
One metric that can help you understand if your xml sitemaps are problematic (or dirty) is the indexed to submitted ratio in Google Webmaster Tools. When you access the "Sitemaps" section of webmaster tools (under the "Crawl" tab), you will see the number of pages submitted in the sitemap, but also the number indexed. That ratio should be close(r) to 1:1. If you see a low indexed to submitted ratio, then that could flag an issue with the urls you are submitting in your sitemap. For example, you might see 12K pages submitted, but only 6,500 indexed. That's only 54% of the pages submitted.

Here’s a screenshot of a very low indexed to submitted ratio:
A Low Submitted to Indexed Sitemap Ratio in Google Webmaster Tools

Pages “Linked From” in Google Webmaster Tools
In addition to what I explained above about the indexed to submitted ratio, you might find crawl errors in Google Webmaster Tools for urls that don’t look familiar.  In order to help track down the problematic urls, webmaster tools will show you how it found the urls in question.

If you click the url in the crawl errors report, you will see the error details as the default view. But, you will also see two additional tabs for "In Sitemaps" and "Linked From". These tabs will reveal if the urls are contained in a specific sitemap, and if the urls are being linked to from other files on your site. This is a great way to hunt down problems, and as you can guess, you might find that your xml sitemap is causing problems.

Linked From in Google Webmaster Tools

Crawling XML Sitemaps
If you do see a problematic indexed to submitted ratio, what can you do?  Well, the beautiful part about xml sitemaps is that they are public.  As long as you know where they reside, you can download and crawl them using a tool like Screaming Frog.  I’ve written about Screaming Frog in the past, and it’s a phenomenal tool for crawling websites, flagging errors, analyzing optimization, etc.  I highly recommend using it.

Screaming Frog provides functionality for crawling text files (containing a list of urls), but not an xml file (which is the format of xml sitemaps submitted to the engines).  That’s a problem if you simply download the xml file to your computer.  In order to get that sitemap file into a format that can be crawled by Screaming Frog, you’ll need to first import that file into Excel, and then copy the urls to a text file.  Then you can crawl the file.

And that’s exactly what I’m going to show you in this tutorial.  Once you crawl the xml sitemap, you might find a boatload of issues that can be quickly resolved.  And when you are hunting down problems SEO-wise, any problem you can identify and fix quickly is a win.  Let’s begin.

Quick Note: If you control the creation of your xml sitemaps, then you obviously don’t need to download them from the site.  That said, the sitemaps residing on your website are what the engines crawl.  If your CMS is generating your sitemaps on the fly, then it’s valuable to use the exact sitemaps sitting on your servers.  So even though you might have them locally, I would still go through the process of downloading them from your website via the tutorial below.

How To Download and Crawl Your XML Sitemaps

  1. Download the XML Sitemap(s)
    Enter the URL of your xml sitemap, or the sitemap index file.  A sitemap index file contains the urls of all of your xml sitemaps (if you need to use more than one due to sitemap size limitations).  If you are using a sitemap index file, then you will need to download each xml sitemap separately.  Then you can either crawl each one separately or combine the urls into one master text file.  After the sitemap loads in your browser, click “File”, and then “Save As”.  Then save the file to your hard drive.
    Download XML Sitemaps
  2. Import the Sitemap into Excel
    Next, you'll need to get a straight list of urls to crawl from the sitemap. In order to do this, I recommend using the "Import XML" functionality in the "Developer" tab in Excel. Click "Import" and then select the sitemap file you just downloaded. Excel will provide a dialog box about the xml schema. Just click "OK". Then Excel will ask you where to place the data. Leave the default option and click "OK". You should now see a table containing the urls from your xml sitemap. And yes, you might already see some problems in the list. :)
    Import XML Sitemaps into Excel
  3. Copy the URLs to a Text File
    I mentioned earlier that Screaming Frog will only crawl text files with a list of urls in them. In order to achieve this, you should copy all of the urls from column A in your spreadsheet. Then fire up your text editor of choice (mine is Textpad), and paste the urls. Make sure you delete the first row, which contains the heading for the column. Save that file to your computer (you can see a quick example of the format after this list).
    Copy URLs to a Text File
  4. Unleash the Frog
    Next, we’re ready to crawl the urls in the text file you just created.  Fire up Screaming Frog and click the “Mode” tab.  Select “List”, which enables you to load a text file containing a series of urls.
    List Mode in Screaming Frog
  5. Start the Crawl
    Once you load the urls in Screaming Frog, click “Start” to begin crawling the urls.
    Start a Crawl in Screaming Frog
  6. Analyze the Crawl
    When the crawl is done, you now have a boatload of data about each url listed in the xml sitemap.  The first place I would start is the “Response Codes” tab, which will display the header response codes for each url that was crawled.  You can also use the filter dropdown to isolate 404s, 500s, 302s, etc.  You might be surprised with what you find.
    Analyze a Sitemap Crawl in Screaming Frog
  7. Fix The Problems!
    Once you analyze the crawl, work with your developer or development team to rectify the problems you identified. The fixes can sometimes be handled quickly (in less than a day or two).
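
And for reference, the text file you feed Screaming Frog in list mode (from step 3 above) is simply one url per line, something like this (placeholder urls):

  http://www.example.com/
  http://www.example.com/category/page-1/
  http://www.example.com/category/page-2/
  http://www.example.com/about/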

 

Summary – Cleaning Up Dirty Sitemaps
Although XML sitemaps provide an easy way to submit all of your canonical urls to the engines, that ease of setup sometimes leads to serious errors.  If you are seeing strange urls getting indexed, or if you are seeing crawl errors for weird or unfamiliar urls, then you might want to check your own sitemaps to see if they are causing a problem.  Using this tutorial, you can download and crawl your sitemaps quickly, and then flag any errors you find along the way.

Let's face it, quick and easy wins are sometimes hard to come by in SEO. But finding xml sitemap errors can be a quick and easy win. And now you know how to find them. Happy crawling.

GG

 

 

Monday, May 13th, 2013

Robots.txt and Invisible Characters – How One Hidden Character Could Cause SEO Problems

How syntax errors in robots.txt can cause SEO problems.

If you’ve read some of my blog posts in the past, then you know I perform a lot of SEO technical audits.  As one of the checks during SEO audits, I always analyze a client’s robots.txt file to ensure it’s not blocking important directories or files.  If you’re not familiar with robots.txt, it’s a text file that sits in the root directory of your website and should be used to inform the search engine bots which directories or files they should not crawl.  You can also add autodiscovery for your xml sitemaps (which is a smart directive to add to a robots.txt file).

Anyway, I came across an interesting situation recently that I wanted to share.  My hope is that this post can help some companies avoid a potentially serious SEO issue that was not readily apparent.  Actually, the problem could not be detected by the naked eye.  And when a problem impacts your robots.txt file, the bots won’t follow your instructions.  And when the bots don’t follow instructions, they can potentially be unleashed into content that should never get crawled.  Let’s explore this situation in greater detail.

A sample robots.txt file:

Sample Robots.txt File
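
In plain text, a simple robots.txt file looks something like this (the blocked directories and sitemap url below are just placeholders):

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /checkout/
  Disallow: /search/

  Sitemap: http://www.example.com/sitemap.xml

The Sitemap line is the autodiscovery directive mentioned above, and the Disallow lines tell the bots which directories they should not crawl.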

Technical SEO – Cloaked Danger in a Robots.txt File
During my first check of the robots.txt file, everything looked fine.  There were a number of directories being blocked for all search engines.  Autodiscovery was added, which was great.  All looked good.  Then I checked Google Webmaster Tools to perform some manual checks on various files and directories (based on Google’s “Blocked URLs” functionality).  Unfortunately, there were a number of errors showing within the analysis section.

The first error was flagged on the User-agent line (the first line in the file).  Googlebot was choking on that line for some reason, but it looked completely fine.  And as you can guess, none of the directives listed in the file were being adhered to.  This meant that potentially thousands of files that should never be crawled would be crawled, and all because of a problem that was hiding below the surface…  literally.

Blocked URLs reporting in Google Webmaster Tools:

Blocked URLs in Google Webmaster Tools

 

Word Processors and Hidden Characters
So I started checking several robots.txt tools to see what they would return.  Again, the file looked completely fine to me.  The first few checks returned errors, but wouldn’t explain exactly what was wrong.  And then I came across one that revealed more information: there was an extra, hidden character at the very beginning of the robots.txt file.  That invisible character was throwing off the format of the file, so the bots were choking on it and ignoring the directives.  Not good.

Invisible Character in Robots.txt
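A hidden character at the start of a file is very often a byte order mark (BOM) or some other non-ASCII character left behind by a word processor.  If you want to check your own file programmatically, here’s a minimal Python sketch that fetches robots.txt and inspects the raw bytes (it assumes the requests library, and the url is just an example to swap for your own domain):

import requests

# Fetch the raw bytes of the robots.txt file (example url, use your own domain).
content = requests.get("http://www.yourdomain.com/robots.txt", timeout=10).content

# A UTF-8 byte order mark is one of the most common invisible culprits.
if content.startswith(b"\xef\xbb\xbf"):
    print("UTF-8 BOM found at the start of robots.txt")

# Flag anything in the first line that isn't plain printable ASCII.
first_line = content.split(b"\n", 1)[0].rstrip(b"\r")
suspicious = [byte for byte in first_line if byte < 32 or byte > 126]
if suspicious:
    print("Non-printable or non-ASCII bytes in the first line:", suspicious)
else:
    print("First line looks clean:", first_line.decode("ascii", errors="replace"))

A clean file should start with plain ASCII like “User-agent: *” and nothing else.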

I immediately sent this off to my client, and their dev team tracked down the hidden character and created a new robots.txt file.  The new file was uploaded pretty quickly (within a few hours), and all checks are fine now.  The bots are also adhering to the directives included in robots.txt.

 

The SEO Problems This Scenario Raises
I think this simple example underscores the fact that there’s not a lot of room for error with technical SEO… it must be precise.  In this case, one hidden character in a robots.txt file unleashed the bots on a lot of content that should never be crawled.  Sure, there are other mechanisms to make sure content doesn’t get indexed, like the proper use of the meta robots tag, but that’s for another post.  For my client, the robots.txt file looked completely fine, but one hidden character was off, and that single character forced the bots to choke on the entire file.

 

How To Avoid Robots.txt Formatting Issues
I think one person at my client’s company summed up this situation perfectly when she said, “it seems you have little room for error, SEO seems so delicate”.  Yes, she’s right (with technical SEO).  Below, I’m going to list some simple things you can do to avoid this scenario.  If you follow these steps, you can avoid uploading a faulty robots.txt file that looks accurate to the naked eye.

1. Text Editors
Always use a text editor when creating your robots.txt file.  Don’t use a word processing application like Microsoft Word.  A text editor is meant to create raw text files, and it won’t throw extra characters into your file by accident.

2. Double and Triple Check Your robots.txt Directives
Make sure each directive does exactly what you think it will do.  If you aren’t 100% sure, then ask for help.  Don’t upload a robots.txt file that could block a bunch of important content (or fail to block content that should never be crawled).

3. Test Your robots.txt File in Google Webmaster Tools and Via Third Party Tools
Make sure the syntax of your robots.txt file is correct and that it’s blocking the directories and files you want it to.  Note, Google Webmaster Tools enables you to copy and paste a new robots.txt file into a form and test it out.  I highly recommend you do this BEFORE uploading a new file to your site.  (I’ve also included a quick scripted check after these tips.)

4. Monitor Google Webmaster Tools “Blocked URLs” Reporting
The blocked urls functionality will reveal problems associated with your robots.txt file under the “analysis” section.  Remember, this is where I picked up the problem covered in this post.
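As a complement to the Google Webmaster Tools tester mentioned in tip 3, here’s a minimal sketch using Python’s built-in urllib.robotparser to confirm that specific urls are (or are not) blocked.  The urls below are hypothetical placeholders:

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("http://www.yourdomain.com/robots.txt")
parser.read()  # downloads and parses the live robots.txt file

# Spot-check a few urls against the directives (hypothetical examples).
for url in [
    "http://www.yourdomain.com/",
    "http://www.yourdomain.com/checkout/cart",
    "http://www.yourdomain.com/blog/some-post",
]:
    allowed = parser.can_fetch("Googlebot", url)
    print("ALLOWED" if allowed else "BLOCKED", url)

Note that robotparser follows the original robots.txt spec, so treat this as a sanity check rather than a perfect simulation of how Googlebot interprets the file.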

 

Extra Characters in Robots.txt – Cloaked in Danger
There you have it.  One hidden character bombed a robots.txt file.  The problem was hidden to the naked eye, but the bots were choking on it.  And depending on your specific site, that one character could have led to thousands of pages getting crawled that shouldn’t be.  I hope this post helped you understand that your robots.txt format and syntax are extremely important, that you should double and triple check your file, and that you can test and monitor that file over time.  If the wrong file is uploaded to your website, bad things can happen.  Avoid this scenario.

GG

 

Sunday, April 14th, 2013

You Might Be Losing Out – How To Make Sure Sitelink Extensions in Bing Ads Are Tracked Properly [Tutorial]

Bing Ads released sitelink extensions in October of 2012, which enable advertisers to provide additional links in their text ads.  Google AdWords has had ad sitelinks for some time, so this was a great addition by our friends at Bing Ads.  For example, if you were an ecommerce website selling sporting goods, you could provide ad sitelinks for your top categories, like football, baseball, basketball, etc. right beneath your standard text ad.  Sitelink extensions are great usability-wise, and they also provide a nice advantage in the SERPs (since they take up more real estate).

Here are two examples of sitelink extensions in action (2 Formats):
Example of Sitelink Extensions in Bing Ads for Lucky Jeans

 

Example of Sitelink Extensions in Bing Ads for Adidas

So, let’s say you set up sitelink extensions for some of your campaigns, and you’re basking in the glory of those beautiful ads (and the click through rate they are getting).  But maybe your reporting isn’t lining up clicks and visits-wise.  Sure, there are several reasons why that could be happening, but maybe it got worse right after you launched sitelink extensions.  Well, the reason could very well be a lack of tagging on your ad sitelinks.  If those additional URLs aren’t tagged properly, then your analytics package could very well be reporting that traffic as organic search.  And that would be a shame.

In this post, I’m going to walk you through why this could be happening, and how to rectify the situation.  After reading this post, you might just run to Bing Ads today and make changes.  Let’s jump in.

Sitelink Extensions and Tracking Parameters
In Bing Ads, you can include sitelink extensions several ways.  First, you can add them manually via the Bing Ads web UI.  Second, you can use Bing Ads Editor to add them locally, and then upload them to your account.  And third, and possibly the top reason ad sitelinks don’t get tagged, is that you can import them from AdWords via the “Import from Google” functionality.  Note, the import from AdWords functionality is awesome, so don’t get me wrong.  It’s just that it’s easy to import ad sitelinks and not know they are there.  Then you run the risk of uploading untagged sitelink extensions.

How To Create Sitelink Extensions in Bing Ads

So, you need to make sure that your ad sitelinks are tagged properly, based on the analytics package you are using to track campaigns.  For example, if you are using Google Analytics, then you need to make sure that you identify each click coming from your sitelink extensions.  That means you should be appending tracking parameters to your sitelink URLs.  For Google Analytics, you can use URL Builder to tag your landing page URLs.

Tagging Sitelink URLs Using URL Builder
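If you’d rather build the tagged urls in bulk instead of one at a time in URL Builder, here’s a minimal Python sketch that appends standard Google Analytics utm parameters to a sitelink destination url.  The parameter values and the url are just examples; use whatever naming convention matches your own campaigns:

from urllib.parse import urlencode

def tag_sitelink(url, campaign, content):
    # Standard Google Analytics campaign parameters for paid search traffic.
    params = {
        "utm_source": "bing",
        "utm_medium": "cpc",
        "utm_campaign": campaign,
        "utm_content": content,
    }
    separator = "&" if "?" in url else "?"
    return url + separator + urlencode(params)

print(tag_sitelink("http://www.yourdomain.com/football-gear", "sporting-goods", "sitelink-football"))
# http://www.yourdomain.com/football-gear?utm_source=bing&utm_medium=cpc&utm_campaign=sporting-goods&utm_content=sitelink-football

The same idea applies if you use a different analytics package; just swap in its tracking parameters.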

 

How To Tag Your Ad Sitelinks in Bing Ads
Again, there are various ways to include sitelink extensions in your campaigns, from the web UI, to Bing Ads Editor, to the “Import from Google” functionality.  I’ll quickly cover each method below to make sure you know where to apply your tracking parameters.

1.  The Bing Ads Web UI
You can currently apply ad sitelinks at the campaign level in Bing Ads.  When you access a campaign, you can click the “Ad Extensions” tab to include ad sitelinks.  Once there, you can click “Create” to add a new sitelink extension.  If you have other sitelink extensions set up across campaigns, you will see them listed (and you can apply those to your campaign if it makes sense).

Creating Sitelink Extensions Using the Bing Web UI

If you want to add a completely new sitelink extension, then click “Create New”.  When adding the sitelink extension, Bing Ads provides a field for the link text and then a field for the destination URL.  When you add the URL, make sure your tracking parameters are added!  If not, your visits will show up as “Bing Organic” versus “Bing CPC”.  Good for the SEO team, but not so good for the paid search team.  :)

 

Adding Sitelinks Using the Bing Web UI

 

2. Bing Ads Editor
I love Bing Ads Editor.  It’s an awesome way to manage your campaigns locally and then sync with the Bing Ads web UI.  And as you can guess, there is functionality for adding and editing sitelink extensions in Bing Ads Editor.  You can access your sitelink extensions by clicking the “Ad Extensions” tab for any selected campaign.

Once you click the “Ad Extensions” tab, you can add sitelink extensions by clicking the “Create a Sitelink Extension” button from the top menu.  Then similar to the web UI, you can add the link text and the destination URL.  When adding your destination URLs, make sure your tracking parameters are added.

Adding Sitelinks Using the Bing Ads Editor

 

3. Import from Google (in Bing Ads Editor)
As I explained earlier, I love having the ability to import campaigns, changes, etc. from AdWords directly into Bing Ads Editor.  It makes managing campaigns across both platforms much more efficient.  But, I’ve seen advertisers import campaigns from AdWords that have sitelink extensions, but they don’t realize it.  Then they upload their campaigns to Bing Ads and don’t understand that prospective customers are clicking their sitelinks, visiting their sites, etc., but those visits aren’t being tracked correctly.  Again, those visits will show up as “Bing Organic” in your analytics reporting.

When you go through the process of importing your campaigns, make sure you double check the “Ad Extensions” tab for the newly-imported campaign.  You just might find sitelink extensions sitting there.  And yes, they very well could be left untagged.  Make sure you add your tracking parameters before uploading them to Bing Ads (from Bing Ads Editor).

You can also uncheck the “Ad Extensions” option when importing your campaigns from AdWords.  Then you can add your sitelink extensions directly in Bing Ads Editor (via the second method I covered earlier in this post).

Importing Sitelink Extensions in Bing Ads Editor

 

Sitelinks Are Powerful, But Only If They Are Tracked
Sitelink extensions are a great addition to Bing Ads, and they absolutely can yield higher click through rates.  But you need to make sure those clicks are being tracked and attributed to the right source: your Bing Ads campaigns!  I recommend checking your campaigns today to make sure your sitelink extensions have the proper tracking parameters appended.  If not, you can quickly refine those links to make sure all is ok.  And when everything is being tracked properly, you just might see a boost in visits, orders, and revenue being attributed to Bing Ads.  And that’s always a good thing.

GG

 

 

Thursday, March 21st, 2013

How To Properly Demote Sitelinks in Google Webmaster Tools

How to remove sitelinks in Google Webmaster Tools

I’ve received several questions recently about how to remove sitelinks in Google.  If you’re not familiar with sitelinks, they are additional links that Google provides under certain search listings.  Sitelinks enable users to drill deeper into a site directly from the search results.  You typically see sitelinks for branded searches.

For example, here are sitelinks for Amazon:
Sitelinks for Amazon.com

 

And here are sitelinks for the Apple iPad:
Sitelinks for Apple iPad


How Google Determines Sitelinks
Google algorithmically determines sitelinks for a given query/url combination.  This is based on a number of factors that Google takes into account.  For example, Google explains that it analyzes a site’s link structure to determine if there are additional links it can provide in the search results that will save users time (by quickly enabling them to link to core pages on your site).  Remember, Google always wants to connect users with the information they are seeking as fast as possible.

No, Google Doesn’t Always Get It Right
If you are checking your rankings and notice strange sitelinks showing up, you can always demote those links via Google Webmaster Tools.  For example, you might see sitelinks that are irrelevant, too granular, or links that could end up sending users to supporting pages that wouldn’t provide a strong user experience.  Whatever the case, you can take action.

For cases like this, you can use the “Sitelinks” section of Google Webmaster Tools to demote specific sitelinks.  Note, if you don’t have Google Webmaster Tools set up for your site, stop reading this post, and set it up NOW.  You can set up your account and verify your site in just a few minutes, and then you’ll receive a boatload of important data right from Google.

Demoting Sitelinks in Google Webmaster Tools
Once you set up a webmaster tools account, you can access the “Sitelinks” section to begin demoting specific sitelinks.  Below is a step by step tutorial for demoting sitelinks that shouldn’t be showing up below your search listings.

1. Access the Sitelinks Section of Webmaster Tools
Access Google Webmaster Tools and click the “Configuration” tab, and then “Sitelinks” to access the demotion form.

How to access sitelinks in Google Webmaster Tools

2. Choose Wisely When Demoting Sitelinks
There are two text fields you need to concern yourself with in the “Sitelinks” section.  The first is labeled “For this search result:” and it refers to the webpage that shows up in the search results and contains the sitelinks.  I know this is where confusion starts to set in, so let me say that again (and show you what I mean).

The first text field is not for the sitelink URL you want to demote.  It’s for the webpage that the sitelinks show up for.  It’s the URL that’s displayed at the top of the search listing.  Note, if you are demoting a sitelink for your homepage, you can leave this field blank.  It’s also worth noting that Google already provides the root URL of your site in the text field, so you just need to enter the remaining part of the URL, i.e. the path (everything after http://www.yourdomain.com/).  See the quick sketch after these steps if the breakdown isn’t clear.

Enter search result when demoting sitelinks.

For example, if you were the VP of Marketing for Apple, and wanted to remove the “Refurbished iPad” sitelink for the iPad page, then you would enter http://www.apple.com/ipad/ in the first field.

How to remove sitelinks for the ipad search result.


3. Demote the Sitelink URL
The second field is where you will enter the URL of the sitelink you want to demote.  Using our Apple example above, you would enter http://store.apple.com/us/browse/home/specialdeals/ipad in the field to demote the “Refurbished” sitelink for the ipad URL.  That’s the refurbished iPad page on Apple’s site (and it’s where the sitelink in the search results points to).

Once you enter the URL, you can click the red “Demote” button.  After you do, the demoted sitelink will be listed below the form with the search result it applies to, the specific sitelink URL, and a “Remove Demotion” button.  If you ever want to remove the demotion, just access this page again and click “Remove Demotion”.  Then give Google a few days to apply the changes.

Enter the sitelink url to demote.
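If the two fields still feel confusing, here’s a quick Python sketch that breaks the Apple example down into the pieces the form expects.  It simply follows the description above, where Google pre-fills the root of your site in the first field:

from urllib.parse import urlsplit

search_result_url = "http://www.apple.com/ipad/"
sitelink_to_demote = "http://store.apple.com/us/browse/home/specialdeals/ipad"

# Field 1 ("For this search result:"): Google pre-fills the root of your site,
# so you only need the portion after the domain.
print("Field 1 (after the pre-filled root):", urlsplit(search_result_url).path.lstrip("/"))  # ipad/

# Field 2: the full url of the sitelink you want to demote.
print("Field 2 (sitelink to demote):", sitelink_to_demote)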

 

Misc. Sitelink Information
Based on the questions I have received when helping clients demote sitelinks, I figured I would provide some additional information in this post.

1. How Long Does it Take for Google to Demote Sitelinks?
I’ve seen sitelinks get demoted in just a few days.  That said, it definitely varies per site…  I’ve seen it take a little longer in certain cases.  I recommend monitoring the sitelinks for the page in question for at least a week or two after demoting a sitelink.  If you notice that it’s still showing up, then revisit the form to make sure you demoted the right sitelink for the right search result.

2. How Many Sitelinks Can I Demote?
You can demote up to 100 URLs via Google Webmaster Tools.  That should be plenty for most webmasters.  Actually, I’d argue that something is very wrong if you need to demote anywhere near that many sitelinks…  You might want to analyze your internal navigation, including the anchor text, to see if Google is picking up something that it shouldn’t be.

Summary – Demotion Can Be A Good Thing
I hope this tutorial helped you better understand what sitelinks are and how to address the wrong sitelinks showing up in the search results.  If you notice any weird sitelinks showing up in the search results for your site, then visit Google Webmaster Tools and demote those specific sitelinks.  It’s one of the few times that a demotion could be a good thing.

GG

 

Friday, February 15th, 2013

How to Combine Custom Audiences in Facebook Ads to Enhance Your Targeting [Tutorial]

Custom Audiences in Facebook

Facebook recently released a powerful new option for advertisers called Custom Audiences.  Using custom audiences, advertisers can leverage their current in-house list of customers for targeting ads.  By uploading a list of emails, phone numbers, or UIDs, you can create a custom audience that can be used for targeting Facebook campaigns.

In my opinion, this was a brilliant move by Facebook.  It brings a unique targeting capability to the social network, and can be extremely useful on several levels.  For example, are you launching a new product?  Then use your custom audience to make sure your current customers know about the new product by reaching them on Facebook.  Know that a certain group of customers are interested in a given category of products?  Then use a custom audience to target just those customers with specific ads, copy, and calls to action.  The sky is the limit with regard to ideas for targeting your current set of customers, and I’ve been using custom audiences more and more recently.

Using Segmentation to Move Beyond Your One In-house Email List
A business can easily export its in-house email list and upload it to Facebook to create a custom audience.  It’s relatively straightforward to do so, and you can accomplish this via Power Editor.  Once Facebook processes your list, it’s available to use when targeting an audience.  But you shouldn’t stop there…  You can slice and dice your in-house email list and upload several files (if you have criteria for segmenting your list).

For example, do you know which customers are interested in which categories you sell?  Break those out.  Do you know which customers are tied to which purchases?  Sure you do, break those out too.  Once you do, you’ll have several targeted lists of emails that you can combine to hone your targeting.  And who doesn’t like that idea?

Combining Custom Audiences
When using Remarketing in AdWords, there is something called custom combinations.  When advertisers create a custom combination, they can create a remarketing audience that includes one audience, but excludes another.  That’s extremely powerful and provides a lot of flexibility for businesses trying to reach their customers via retargeting efforts.  Well, combining custom audiences in Facebook Ads enables you to do the same thing.

Here’s a simple hypothetical situation.  Let’s say you sold amazing new earphones that are invisible to the naked eye.  You already blasted an email out to your current customers and received some orders.  If your full email list was uploaded to Facebook as a custom audience (which should be done anyway), then you could create a second audience that includes customers that already purchased the new earphones.

Then, when you create a new campaign targeting your in-house email list (promoting your new earphones), you can exclude the list of customers that already purchased them.  This saves you from looking foolish, and it cuts down on wasted impressions, wasted clicks, and wasted budget.  Yes, that’s a simple example, but it shows the power of creating custom combinations in Facebook.
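To make the include/exclude logic concrete, here’s a minimal Python sketch that mimics what the custom combination does: it takes the full in-house list, removes everyone who already purchased, and leaves the audience your ads would actually target (the email addresses are made up):

# Hypothetical email lists (in practice these come from your in-house database).
full_list = {"ann@example.com", "bob@example.com", "cam@example.com", "dee@example.com"}
already_purchased = {"bob@example.com", "dee@example.com"}

# The custom combination: include the full list, exclude the purchasers.
target_audience = full_list - already_purchased
print(sorted(target_audience))  # ['ann@example.com', 'cam@example.com']

Facebook performs this exclusion for you when you add the second audience under “Excluded Audiences” (covered in the steps below), but it helps to sanity-check the expected size of the remaining audience.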

How To Use Custom Combinations with Facebook Ads
Let’s quickly walk through how to set this up in Facebook.  Below, I’m going to explain how to first create a custom audience, and then how to upload and use a second audience (that can be used to hone your targeting).  Let’s create a custom combination using custom audiences in Facebook:

1. Export a straight list of customer emails as a .csv file.  (See the sketch after these steps for a quick way to clean up and deduplicate the list before uploading it.)

Exporting a CSV of emails to create a custom audience.

 

2. Launch Power Editor and click the “Custom Audiences” Tab.
Note, if you’ve never used Power Editor, set that up now, download all of your campaigns, and then revisit this tutorial.

Custom Audience Tab in Facebook Ads

 

3. Click the “Create Audience” button, enter a name and description, and choose the type of list.
For this list, click the “Emails” radio button.  You should also click the “Choose File” button to locate the .csv file we created in step 1.

The Custom Audience Dialog Box in Facebook Ads

 

4. Click “Create” and Facebook will upload your list and create your custom audience.
Note, it could take a few hours for Facebook to process the file, depending on the size of your list.  Remember, Facebook is going to scan the emails and try to match them up to current Facebook users.

 

5. Wait for Facebook to process your custom audience.
The status for the custom audience will say, “Waiting” while Facebook is processing the file.  That will change to “Ready” when the audience is ready to go.
You should also see the audience size (based on the users that Facebook could match up).

Custom Audience Status Message

 

6. Repeat the process in steps 1-5 to create a second custom audience (the hypothetical list of customers that already purchased our killer new earphones).
Make sure you give the new custom audience a descriptive name like “customers-invisible-earphones”.

 

7. Create a new campaign that will be used to target your current customers that have not purchased your new earphones yet.
Simply use the standard process for setting up a new Facebook campaign.

Creating a New Facebook Campaign

 

8. Select your custom audience.

When you create a new ad within your new campaign, you can hop down to the “Audience” tab.  You can click the button labeled “Use Existing Audience”.  Then select your full in-house email list.  That’s the first custom audience we created.

Use Existing Audience in Facebook Ads

 

9. Now select the custom audience to exclude.

Next, click the “Advanced Options” tab under “Audience”.  You will see an option for “Excluded Audiences”.   You can start typing the name of the custom audience containing customers that already purchased your earphones (the second custom audience we created).  The audience name should auto-populate when you start typing.  After selecting the audience, you should see the “Estimated Reach” number drop, based on excluding the new list.

Combining Custom Audiences to Enhance Targeting

 

10. That’s it, you have now used a custom combination to hone your targeting using Custom Audiences!
Your ads will now only be displayed to customers on your email list that have not purchased your new earphones yet.
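As promised in step 1, here’s a minimal Python sketch for cleaning up the email export before you upload it.  It lowercases, trims, and deduplicates the addresses and writes them to a one-column .csv file (the filenames are hypothetical):

import csv

# Read the raw export (assumes email addresses live in the first column).
with open("raw-customer-export.csv", newline="") as f:
    raw_emails = [row[0] for row in csv.reader(f) if row and "@" in row[0]]

# Normalize and deduplicate so Facebook has the best chance of matching users.
clean_emails = sorted({email.strip().lower() for email in raw_emails})

with open("custom-audience-emails.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for email in clean_emails:
        writer.writerow([email])

print(len(clean_emails), "unique emails written to custom-audience-emails.csv")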

Summary – Combine Audiences for Power
As I explained earlier, using custom audiences is a new and powerful way to reach a targeted audience on Facebook.   It combines the power of a current, in-house email list with the flexibility and intelligence of segmenting your audience.  Don’t look foolish, don’t waste clicks, and don’t waste budget.  Use custom combinations to slice and dice your current customer list.  Now go ahead.  Set up your campaign now.  :)

GG

 

Thursday, December 27th, 2012

Introducing SEO Bootcamp Princeton, A Hands-On SEO Training Course in Princeton NJ

SEO Training Topics

I absolutely love getting in front of a group of people to speak about SEO (and always have).  Over the past several years, I’ve led SEO training classes for clients covering a wide range of topics, from technical SEO to keyword research to content optimization to linkbuilding strategy.  Although I’ve really enjoyed leading classes like this, I’ve always wanted to launch a training program that anyone could sign up for, and not just clients.  Well, I finally put the program together, and it’s called SEO Bootcamp Princeton.

SEO Bootcamp Princeton is a three hour, in-person training course being held at the Johnson Education Center (at D&R Greenway) on January 17th, from 9AM to 12PM.  You can register online via EventBrite, and there’s a 20% off, early registration discount running through 12/31/12.  If you register by then, tickets are $145 versus the standard price of $179.

The Target Audience for SEO Bootcamp Princeton
So, what will you learn at SEO Bootcamp Princeton?  Put simply, you’ll learn a lot.  My goal is to make sure attendees can leave the training ready to make changes to their websites.  I’ve crafted the training so it can be valuable for anyone marketing a business, from small business owners to corporate marketers.  SMBs will learn the tactical knowledge necessary to build a solid SEO foundation, while corporate marketers can learn SEO best practices and techniques.

In addition, the training will be extremely valuable for creative professionals, including designers, programmers, copywriters, etc.  I used to work for a large agency in New York City, and I led a similar type of training there.  I can tell you that every creative professional left the training with a stronger understanding of Search Engine Optimization (SEO).  Actually, I know the training changed how some people performed their jobs on a regular basis…

For example, designers and programmers learned about search engine friendly ways to design and code sites, while copywriters learned how to perform keyword research and properly optimize content.  Professionals involved with information architecture (IA) learned how to best structure a navigation, while also learning the best ways to build an internal linking structure.  And everyone in the training learned about the risks of redesigning a website without taking SEO into account.

Those are just a few of the SEO topics you’ll learn more about at SEO Bootcamp Princeton.  Again, my goal is that you leave with a much deeper knowledge of SEO, that you can make changes immediately, and that you take SEO into account whenever working on a website, campaign, or redesign.  You can learn more about the topics I’m going to cover on the SEO Bootcamp Princeton webpage.

SEO Bootcamp Princeton is Job-Agnostic – All Levels and Positions Will Benefit

Technical and Creative Job Titles

Tools and Plugins
SEO is definitely a mix of art and science.  And in order to assist SEO professionals with several core tasks, there are many tools and plugins one can use.  During the training, I will highlight several of the tools and plugins that can make your job easier SEO-wise.  I’ve always said that when you combine the right tools with the right SEO knowledge, great things can happen.  And I’ll make sure to explain some of my favorites along the way.  From Firefox plugins to Chrome extensions to standalone software applications, you’ll leave the training with a list of tools that can help you on a regular basis.

SEO Tools Training

Major Algorithm Updates
I can’t leave this post without touching on a very important topic in SEO that’s affecting many business owners.  Google has launched several important algorithm updates since early 2011, including both Panda and Penguin.   As you can imagine, I receive calls every month from business owners that have gotten hammered by these updates.  During SEO Bootcamp Princeton, I will introduce each major algorithm update, and cover important insights based on helping a range of businesses deal with the aftermath of getting hit.  And more importantly, I can explain the best ways to avoid getting hit in the first place.  You can read several of my case studies about Panda recovery and Penguin recovery if you are interested in learning more.

Panda and Penguin Algorithm Updates

Next Steps, Register Today
In closing, I’m ultra-excited about SEO Bootcamp Princeton.  If you are interested in registering, you can sign up via the EventBrite page.  Again, there’s a 20% off, early registration discount running through 12/31.  After 12/31, the standard pricing will be $179 per seat.  If you have any questions about the training, don’t hesitate to contact me.  It would be great to see you there!

GG