Archive for the ‘tools’ Category

Tuesday, January 7th, 2014

Rap Genius Recovery: Analyzing The Keyword Gains and Losses After The Google Penalty Was Lifted

Rap Genius Recovers From Google Penalty

On Christmas Day, Rap Genius was given a heck of a gift from Google: a penalty that sent their rankings plummeting faster than an anvil off the Eiffel Tower.  The loss in traffic has been documented heavily, as many keywords dropped from page one to page five and beyond.  And many of those keywords used to rank in positions #1 through #3 (prime real estate SEO-wise).  Once the penalty was in place, what followed was a huge decrease in visits from Google organic, since most people don't even venture to page two and beyond.  It's like Siberia for SEO.

Gaming Links
So what happened that Google had to tear itself away from eggnog and a warm fire to penalize a lyrics website on Christmas Day?  Rap Genius was gaming links, and badly.  No, not just badly, but with such disregard for the consequences that they were almost daring Google to take action.  That continued until Matt Cutts learned of the matter and took swift action on Rap Genius.

That was Christmas Day. Ho, ho, ho.  You get coal in your lyrical stocking.   I won't go nuts here explaining the ins and outs of what they were doing.  That's been documented heavily across the web.  In a nutshell, they were exchanging tweets for links.  If bloggers added a list of rich anchor text links to their posts, then Rap Genius would tweet links to their content.  The bloggers got a boatload of traffic and Rap Genius got links (and a lot of them using rich anchor text like {artist} + {song} + lyrics).  Here's a quick screenshot of one page breaking the rules:

Rap Genius Unnatural Links

A 10 Day Penalty – LOL
Now, I help a lot of companies with algorithmic hits and manual actions.  Many of the companies contacting me broke the rules and are seeking help identifying and then rectifying their SEO problems.  Depending on the situation, recovery can take months of hard work (or longer).  From an unnatural links standpoint, you need to analyze the site's link profile, flag unnatural links, remove as many as you can manually, and then disavow the rest.  If you only have 500 links leading to your site, this can happen relatively quickly.  If you have 5 million, it can be a much larger and nastier project.

Rap Genius has 1.5 million links showing in Majestic’s Fresh Index.  And as you start to drill into the anchor text leading to the site, there are many questionable links.  You can reference their own post about the recovery to see examples of what I’m referring to.  Needless to say, they had a lot of work to do in order to recover.

So, you would think that it would take some time to track down, remove, and then disavow the unnatural links that caused them so much grief.  And then they would need to craft a serious reconsideration request documenting how they broke the rules, how they fixed the problem, and of course, offer a sincere apology for what they did (with a guarantee they will never do it again).   Then Google would need to go through the recon request, check all of the removals and hard work, and then decide whether the manual action should be lifted, or if Rap Genius had more work to do.  This should take at least a few weeks, right?  Wrong.  How about 10 days.

Rap Genius Recovers After 10 Days

Only 10 days after receiving a manual action, Rap Genius is back in Google.  As you can guess, the SEO community was not exactly thrilled with the news.  Screams of special treatment rang through the twitterverse, as Rap Genius explained that Google helped them, to some degree, understand how best to tackle the situation and what to target.  Believe me, that's rare.  Really rare…

Process for Removing and Disavowing Links
Rap Genius wrote a post about the recovery on January 4th, which included the detailed process for identifying and then dealing with unnatural links.  They had thousands of links to deal with, beginning with a master list of 178K.  From that master list, they started to drill into specific domains to identify unnatural links.   Once they did, Rap Genius removed what they could and disavowed the rest using Google’s Disavow Tool.   Following their work, Google removed the manual action on January 4th and Rap Genius was back in Google.

But many SEOs wondered how fully they came back, especially since Rap Genius had to nuke thousands of links.  And many of those links were to deeper pages with rich anchor text.  Well, I've been tracking the situation from the start, checking which keywords dropped during the penalty, and now tracking which ones returned to high rankings after the penalty was lifted.  I'll quickly explain the process I used for tracking rankings and then provide my findings.

My Process for Analyzing Rankings (With Some Nuances)
When the penalty was first applied to Rap Genius, I quickly checked SEMRush to view the organic search trending and to identify keywords that were “lost” and ones that “declined”.  Rap Genius ranks for hundreds of thousands of keywords according to SEMRush and its organic search reporting identified a 70K+ keyword loss based on the penalty.

Note, you can't compare third party tools to a website's own analytics reporting, and SEMRush won't cover every keyword leading to the site.  But, for larger sites with a lot of volume, SEMRush is a fantastic tool for viewing the gains and losses for a specific domain.  I've found it to be extremely thorough and accurate.

The lost and declined keywords that SEMRush was reporting lined up with my manual checks.  Those keywords definitely took a plunge, with Rap Genius appearing on page five or beyond.  And as I mentioned earlier, that's basically Siberia for organic search.

When the penalty was lifted, I used the same process for checking keywords, but this time I checked the "new" and "improved" categories.  The reporting shows 43K+ keywords in the "new" category, which means those keywords did not rank the last time SEMRush checked that query.

I also used Advanced Web Ranking to check 500 of the top keywords that were ranking prior to the penalty (and that dropped after the manual action was applied).  The keywords I checked were all ranking in the top ten prior to the penalty.  Once the penalty was lifted, I ran the rankings for those keywords.  I wanted to see how much of an improvement there was for the top 500 keywords.

Then I dug into the data based on both SEMRush and Advanced Web Ranking to see what I could find.  I have provided my findings below.   And yes, this is a fluid situation, so rankings could change.  But we have at least a few days of data now.  Without further ado, here’s what I found.
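
If you want to run a similar before-and-after comparison on your own keyword data, here's a minimal sketch of one way to summarize it.  It assumes two CSV exports from your rank tracker with "keyword" and "position" columns; the filenames and column names here are hypothetical, so adjust them to match your own exports.

```python
# Hedged sketch: compare pre- and post-penalty rankings for a set of keywords.
# Assumes two CSV exports with "keyword" and "position" columns (hypothetical
# filenames and column names -- adjust to match your rank tracker's export).
import csv

def load_rankings(path):
    """Return a dict of keyword -> best ranking position from a CSV export."""
    rankings = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            keyword = row["keyword"].strip().lower()
            position = int(row["position"])
            # Keep the best (lowest) position if a keyword appears more than once.
            rankings[keyword] = min(position, rankings.get(keyword, position))
    return rankings

before = load_rankings("rankings_pre_penalty.csv")
after = load_rankings("rankings_post_penalty.csv")

recovered, still_buried = [], []
for keyword, old_pos in before.items():
    if old_pos <= 10:  # keywords that ranked on page one prior to the penalty
        new_pos = after.get(keyword)
        if new_pos is not None and new_pos <= 10:
            recovered.append((keyword, old_pos, new_pos))
        else:
            still_buried.append((keyword, old_pos, new_pos))

print(f"Back on page one: {len(recovered)}")
print(f"Still off page one (or unranked): {len(still_buried)}")
```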

 

Branded Keywords
This was easy. Branded keywords that were obliterated during the penalty returned quickly with strong rankings.  This was completely expected.  For example, if you search for rap genius, rapgenius, or any variation, the site now ranks at the top of the search results.  And the domain name ranks with sitelinks. No surprises here.

Rap Genius Branded Keywords

Category Keywords
For category keywords, like "rap lyrics", "favorite song lyrics", and "popular song lyrics", I saw mixed results after the recovery.  For example, the site now ranks #1 for "rap lyrics", which makes sense, but does not rank well for "favorite song lyrics" and "popular song lyrics".  And it ranked well for each of those prior to the penalty.  Although specific song lyric queries are a driving force for Rap Genius (covered soon), category keywords can drive a lot of volume.  It's clear that the site didn't recover for a number of key category keywords.

Rap Genius Category Keywords

 

Artist Keywords
I noticed that the site ranked for a lot of artists prior to the penalty (just the artist name with no modifiers).  For example, "kirko bangz", "lil b", etc.  Similar to what I saw with category keywords, I saw mixed results with artists.  Searching for the two artists I listed above does not yield high rankings anymore, even though they both ranked on page one prior to the penalty.  Some increased in rankings, but not to page one.  For example, "2 chainz" ranks #12 after the penalty was lifted.  But it was MIA when the penalty was in effect.  Another example is "Kendrick Lamar", which Rap Genius ranked #8 for prior to the penalty.  The site is not ranking well at all for that query now.  So again, it seems that Rap Genius recovered for some artist queries, but not all.

Rap Genius Artist Keywords

Lyrics Keywords
Based on my research, I could clearly see the power of {song} + lyrics queries for Rap Genius.  It's a driving force for the site.  And Rap Genius is now ranking again for many of those queries.  When the penalty was first lifted, I started checking a number of those queries and saw Rap Genius back on page one, and sometimes #1.  But when I started checking at scale, you could definitely see that not all keywords returned to high rankings.

Rap Genius High Rankings for Lyrics Keywords

For example, "hallelujah lyrics", "little things lyrics", and "roller coaster lyrics" are still off of page one.  Then there are keywords that skyrocketed back up the charts, I mean search rankings.  For example, "swimming pool lyrics", "marvins room lyrics", and "not afraid lyrics" all returned after being buried during the penalty.  So, it seems that many song lyrics keywords returned, but some still rank on page two and beyond.

Rap Genius Low Rankings for Lyrics Keywords

What About Keywords That Were Gamed?
I'm sure some of you are wondering how Rap Genius fared for keywords that were gamed via unnatural links.  For example, "22 two's lyrics" yields extremely strong rankings for Rap Genius, even though it was one of the songs gamed via the link scheme.  Actually, Rap Genius ranks twice in the top 5.  Go figure.

Rap Genius Rankings for Gamed Links - Jay Z

Ditto for “timbaland know bout me”, which was also one of the songs that made its way into the spammy list of links at the end of articles and posts.  Rap Genius ranks #3 right now.

Rap Genius Rankings for Gamed Links - Timbaland

And then there's Justin Bieber, which I can't cover with just one sentence.  Rap Genius currently ranks on page 3 for "Justin Bieber song lyrics", whereas it used to rank #8!  And "Justin Bieber baby lyrics" now ranks #12 on page 2, when it used to rank #8.  But for "Justin Bieber lyrics", Rap Genius is #10, on page one.

Rap Genius Rankings for Justin Bieber Lyrics

Overall, I saw close to 100 Justin Bieber keywords pop back into the top few pages of Google after the penalty was lifted.  But many were no longer on page one… a good number of those keywords now rank on page two or beyond for Rap Genius.  See the screenshot below:

Rap Genius Keywords for Justin Bieber

 

Summary – Rap Genius Recovers, But The Scars Remain
So there you have it.  A rundown of where Rap Genius stands after the penalty was lifted.  Again, I can't see every keyword that was lost or gained during the Christmas Day fiasco, but I could see enough of the data.  It seems that Rap Genius came back strong, but not full-blast.  I saw many keywords return, but a number still remain buried in Google.

But let's face it, a 10 day penalty is a slap on the wrist for Rap Genius.  They now have a clean(er) platform back, and can build on that platform.  That's a lot better than struggling for months (or longer) with horrible rankings.  As I explained earlier, too many business owners aren't as lucky as Rap Genius.  10 days and help from Google can certainly speed up the recovery process.  That's for sure.

I’ll end with one more screenshot to reinforce the fact that Rap Genius is back.  And it’s a fitting query. :)

Rap Genius I'm Sorry

GG

 

 

Monday, August 12th, 2013

Facebook Ads for eCommerce – How To Combine Custom Audiences, Lookalikes, and Unpublished Posts to Target Customers and Similar Users

How to use unpublished posts as Facebook Ads

I used to be extremely critical of Facebook Ads.  But that was before Facebook released a boatload of functionality for enhancing your campaigns.  Sure, marketplace ads, or ads running along the right sidebar, have seen declining engagement over the years, but that's just a fraction of what you can do now with Facebook Ads.  And I'm finding many advertisers don't know about the powerful options available to them.

For example, there’s FBX (or retargeting on Facebook), news feed targeting, mobile-only targeting, promoted posts, custom audiences, lookalike audiences, unpublished posts, etc.  And with this enhanced functionality comes better targeting and performance.  Now, I still think paid search can reach someone who is searching for a specific solution at the exact time they need it, and social advertising can’t do that (yet).  But, using advanced targeting within Facebook can absolutely make an impact, and on multiple levels.

In this post, I’m going to explain one method of using three pieces of functionality in Facebook Ads that might change your view of social advertising.  It has for me, and I’ve been using this technique for some time now.  It leverages unpublished posts, custom audiences, and lookalike audiences to target your current customers, and users similar to your customers, when you are running a specific promotion or sale.  It’s a great way to make the most of your current assets, and at a relatively low cost.

Meet Unpublished Posts
I find many business owners have no idea what unpublished posts are.  If you fit into this category, then today is your lucky day.  Unpublished posts enable page owners to create page updates that don’t get shared with their entire fan base.  In addition, you can run ads based on the unpublished posts and use a wealth of ad targeting to reach the right audience (which can include current customers).  Interesting, right?

Unpublished posts in Facebook

The easiest way to create an unpublished post is to use Power Editor.  And if you’re running Facebook Ads and not using Power Editor, you should start today.  It offers a lot of functionality and targeting options not available in Ads Manager (which is what advertisers use on Facebook’s website).

By clicking “Manage Pages” in Power Editor, you can actually craft a page post.  But since we want an unpublished post, you can create the update and not publish it.  That’s ultra-important, since we want to use the post as an ad, and not an update that’s broadcast to your entire fan base.

Creating an unpublished post in Facebook using Power Editor.

So, if you're an ecommerce provider running a specific sale, you could create an update focusing on that sale, with the understanding that it will reach a very specific audience (and not every fan).  I'll cover how to target specific parts of your customer list soon, including people that are similar to those users.  Once you create your post, you can click your account ID in the left pane to return to your ads dashboard (in Power Editor).

Now we’re ready to talk about custom audiences and lookalikes.

Meet Custom Audiences and Lookalikes
I wrote a post earlier in the year about custom audiences in Facebook.  You should read that post to learn how to set them up.  You’ll need a custom audience in order to use the method I’m covering in this post (since that’s the audience you will target, and it’s also the list you will use to create a lookalike audience).

Custom audiences enable you to upload a list of current customers, based on your in-house email list.  Then, Facebook will match up the list with users on the social network.  Yes, you read that correctly.  That means you can target your in-house email list (or parts of that list) via Facebook Ads.  Awesome, right?

Using Custom Audiences in Facebook

Once your custom audience is created, you can use that list to target current customers with specific promotions and sales.  And you can use unpublished posts to reach them.  Did you catch that?  I said unpublished posts.  That means getting your targeted promotion in front of your current customers (whether they are fans of your page or not).

Great, but what’s a lookalike?
Lookalike audiences enable you to base a new audience (set of Facebook users) on a custom audience (your current customers).  Facebook reviews a number of characteristics about your custom audience (your current customer base), and then finds people similar to your customers.  Yes, once again, eye-opening targeting opportunity ahead.

Imagine you had five custom audiences set up, all containing specific customers for specific categories of products.  Then you could use lookalikes to find similar people (which you can then target via Facebook Ads).  The old days of Facebook ads seem so caveman-like, right?  :)

How To Set Up Lookalikes
Once you have set up a custom audience (following my tutorial), then you can easily select that audience in Power Editor, and choose “Create Similar Audience”.  Choose “Similarity” in the dialog box and Facebook will find users that are similar to your in-house list (based on a number of criteria).  It could take up to 24 hours to create the list, but I’ve seen it take much less time than that (especially for smaller lists).

Using Lookalike Audiences in Facebook

Combining Unpublished Posts, Custom Audiences, and Lookalikes
OK, we have covered unpublished posts that contain targeted messages about new promotions or sales.  We have also covered custom audiences based on our in-house email list.  And, we have covered lookalike audiences, which enable us to target similar people to our own customers.  Now we are ready to tie them together.

1. Create a New Campaign
In Power Editor, you can create a new campaign and set the campaign parameters like name, budget, etc.

Creating a new Facebook campaign in Power Editor.

2. Create a New Ad
Click the “Ads” tab to create your ad.  Under “Type”, choose “Ad”, and then select the radio button labeled “For a Facebook Page Using a Page Post”.  That will enable you to choose an unpublished post for your ad.

Creating an unpublished post ad in Facebook.

3. Choose a Destination
For "Destination", choose your Facebook Page.  Note, your page's image and title will still link users to your Facebook page, but the post itself is where you should place the link to the sale landing page on your own website.  In addition, you should add tracking parameters to the destination urls in your unpublished post (so you can track each campaign via your analytics package).

Choosing an ad destination for unpublished post ad in Facebook.

4. Select An Unpublished Post
Now, choose your unpublished post to use that post as the actual ad.  Note, you can also create your unpublished post at this stage (using Power Editor).  That’s a nice feature that was recently added.

Selecting a page post for an unpublished post ad in Power Editor.

5. Choose Your Placement
OK, how awesome is this?  You get to choose where your unpublished post shows up.  For example, in the News Feed (Desktop and Mobile).  This is the most powerful placement in my opinion.  Your ads will show up directly in someone’s news feed versus along the right side.

Choosing ad placement for unpublished post in Power Editor.

6. Choose Your Targeting
Under “Audience”, you can choose targeting, based on the goals of your campaign.  Note, this is not where you will choose your custom or lookalike audience, although the tab is titled “Audience”.  You can choose location, age, gender, etc. if you want more granular targeting than just the custom audiences we created earlier.

Choosing ad targeting for unpublished post in Power Editor.

7. Choose Your Audience (Yes, this is what we’ve been waiting for.)
Under “Advanced Options”, you’ll notice the first field is titled “Custom Audiences”.  If you start typing in that field, your custom audience should show up (based on what you named the audience when you created it).  Once selected, it should show up in the field.  You can leave the rest of the targeting options located below as-is.

Selecting a custom audience for an unpublished post ad in Power Editor.

Clarification Side Note:
To clarify what we’ve been doing, this ad will target your current customer list.  When you create a second campaign, you can choose your lookalike audience.  Then you can run both campaigns and target both your current customer list and people similar to your current customers.   And since they are in separate campaigns, with separate tracking parameters, you can track performance by audience.  Awesome.

8. Select Your Pricing and Status Options
For this example, let’s choose CPC and enter the desired cost per click.  Facebook will provide a suggested CPC to the right.  Once completed, you’re ready to rock.

How to set pricing for an unpublished post ad in Power Editor.

9. Upload Your Campaign
Click “Upload” in Power Editor and your ad will be uploaded to Facebook, where it will need to be approved.  Once approved, you’ll receive a notification that your unpublished post is live.

Uploading an unpublished post ad using Power Editor.

Why this approach works:

1. Exposure and Sharing
By using this approach, you can get your latest sale or promotion in front of your current customers as they browse Facebook, while also providing a great opportunity for that sale or promotion to get shared.  For example, a current customer might like your update, and it could hit their friends' news feeds, which can provide even more exposure and opportunities to land new customers.

2. Engagement
Even though the unpublished post is technically an ad, it still looks and works like a typical page post update.  That means users can like, share, and comment on the post.  And yes, users often do like and comment on unpublished post ads.  Remember, the unpublished post ad is hitting users’ news feeds (both desktop and mobile), so there is a strong chance they will be exposed to your ad.   And if it’s crafted well, then there’s a chance that a certain percentage of that audience will engage with the post. It’s a great way to engage your current customers, while also engaging similar people (via a lookalike audience).

3. Page Likes
Gaining more page likes is an added benefit to using this approach.  Sure, you want people to click through to your sale landing page and buy, but you probably also want more page likes (so you can reach more people with your organic status updates down the line).  I’ve seen unpublished post ads work extremely well for gaining more page likes (across industries).  For example, a recent campaign I launched increased page likes by 7% during a one week period.  Not bad, when you take into account the other benefits from running the campaign (like exposure, sharing, engagement, and sales – which I’ll cover next).

4. Sales (and other important conversions)
Using this approach can be a low CPA, high ROAS way to increase sales for specific promotions.  I've run campaigns where the CPC was under $0.40 per click, and depending on the specific campaign, return on ad spend (ROAS) can be extremely strong.  For example, 2000 clicks at $0.40 per click is $800.  A conversion rate of 2.0% and an average order value of $75 would yield $3,000 in revenue, a 375% ROAS (or a 275% return once you subtract the ad spend).  That's just a small and quick example, but unpublished page post ads could yield a shot in the arm pretty quickly.
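
If it helps to see that arithmetic laid out, here's a quick sketch using the illustrative numbers above (not results from an actual campaign):

```python
# Quick worked example of the numbers above (values are illustrative only).
clicks = 2000
cost_per_click = 0.40
conversion_rate = 0.02
average_order_value = 75.00

ad_spend = clicks * cost_per_click             # $800
orders = clicks * conversion_rate              # 40 orders
revenue = orders * average_order_value         # $3,000
roas = revenue / ad_spend                      # 3.75, i.e. 375% of spend
net_return = (revenue - ad_spend) / ad_spend   # 2.75, i.e. 275% over spend

print(f"Spend: ${ad_spend:,.0f}  Revenue: ${revenue:,.0f}")
print(f"ROAS: {roas:.0%}  Return over spend: {net_return:.0%}")
```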

And from a B2B standpoint, with average order values typically much higher than B2C, the ROAS could be even greater.  Even a handful of sales could generate thousands (or tens of thousands) of dollars in revenue.  For example, a recent campaign I launched for a client focused on items starting at $1000 (and some were up to $5000 per item).  Even one sale at $5K based on the campaign I mentioned before would yield a strong ROAS.

And let's not forget other important micro-conversions on your website.  For example, newsletter signups (which can be a great driver of revenue for any ecommerce provider), app downloads, requests for more information, etc. all fall under this category and can start forging a relationship between prospective customers and your business.

What’s the Downside?
OK, I love using this approach, but social advertising brings some unique challenges with it.  Since what we’ve covered is an actual page post, and not a straight ad, users can interact with it.  That means both positive and negative interaction can occur.  For example, you might have some unhappy customers post their negative feedback in the unpublished page post ad.  How you deal with that situation is for another post, but I always recommend addressing the problem directly (in the post).  But again, there are several situations that can arise, and I’ll try and address them in a future post.  Just keep in mind that users can comment, and those comments might not always be positive.

The Power of Unpublished Posts, Custom Audiences, and Lookalikes
After reading this post, I hope you better understand the power of using unpublished posts along with custom audiences and lookalike audiences.  Unfortunately, the features and functionality I covered in the post are not readily apparent to many Facebook advertisers.  And that’s a shame, since they can be extremely effective for businesses looking to engage current customers and new audiences, while also increasing sales.  I recommend testing this approach soon to see if it can be effective for your business.

You can start today. Create a custom audience, create a lookalike audience, and use Power Editor to create unpublished post ads.  You may never look back.  :)

GG

 

Wednesday, July 10th, 2013

Avoiding Dirty Sitemaps – How to Download and Crawl XML Sitemaps Using Screaming Frog

Dirty XML Sitemaps

SEO Audits are a core service I provide, including both comprehensive audits and laser-focused audits tied to algorithm updates.  There are times during those audits that I come across strange pages that are indexed, or I see crawl errors for pages not readily apparent on the site itself.  As part of the investigation, it’s smart to analyze and crawl a website’s xml sitemap(s) to determine if that could be part of the problem.  It’s not uncommon for a sitemap to contain old pages, pages leading to 404s, application errors, redirects, etc.  And you definitely don’t want to submit “dirty sitemaps” to the engines.

What’s a Dirty Sitemap?
A dirty sitemap is an xml sitemap that contains 404s, 302s, 500s, etc.  Note, those are header response codes.  A 200 code is OK, while the others signal various errors or redirects.  Since the engines will retrieve your sitemap and crawl your urls, you definitely don't want to feed them errors.  Instead, you want your xml sitemaps to contain canonical urls on your site, and urls that resolve with a 200 code.  Duane Forrester from Bing went on record saying that they have very little tolerance for "dirt in a sitemap".  And Google feels the same way.  Therefore, you should avoid dirty sitemaps so the engines can build trust in that sitemap (versus having the engines encounter 404s, 302s, 500s, etc.).

Indexed to Submitted Ratio
One metric that can help you understand if your xml sitemaps are problematic (or dirty) is the indexed to submitted ratio in Google Webmaster Tools.  When you access the "Sitemaps" section of webmaster tools (under the "Crawl" tab), you will see the number of pages submitted in the sitemap, but also the number indexed.  That ratio should be close(r) to 1:1.  If you see a low indexed to submitted ratio, then that could flag an issue with the urls you are submitting in your sitemap.  For example, you might see 12K pages submitted, but only 6,500 indexed.  That's only 54% of the pages submitted.

Here’s a screenshot of a very low indexed to submitted ratio:
A Low Submitted to Indexed Sitemap Ratio in Google Webmaster Tools

Pages “Linked From” in Google Webmaster Tools
In addition to what I explained above about the indexed to submitted ratio, you might find crawl errors in Google Webmaster Tools for urls that don’t look familiar.  In order to help track down the problematic urls, webmaster tools will show you how it found the urls in question.

If you click the url in the crawl errors report, you will see the error details as the default view.  But, you will also see two additional tabs for "In Sitemaps" and "Linked From".  These tabs will reveal if the urls are contained in a specific sitemap, and if the urls are being linked to from other files on your site.  This is a great way to hunt down problems, and as you can guess, you might find that your xml sitemap is the culprit.

Linked From in Google Webmaster Tools
Crawling XML Sitemaps
If you do see a problematic indexed to submitted ratio, what can you do?  Well, the beautiful part about xml sitemaps is that they are public.  As long as you know where they reside, you can download and crawl them using a tool like Screaming Frog.  I’ve written about Screaming Frog in the past, and it’s a phenomenal tool for crawling websites, flagging errors, analyzing optimization, etc.  I highly recommend using it.

Screaming Frog provides functionality for crawling text files (containing a list of urls), but not an xml file (which is the format of xml sitemaps submitted to the engines).  That’s a problem if you simply download the xml file to your computer.  In order to get that sitemap file into a format that can be crawled by Screaming Frog, you’ll need to first import that file into Excel, and then copy the urls to a text file.  Then you can crawl the file.

And that’s exactly what I’m going to show you in this tutorial.  Once you crawl the xml sitemap, you might find a boatload of issues that can be quickly resolved.  And when you are hunting down problems SEO-wise, any problem you can identify and fix quickly is a win.  Let’s begin.

Quick Note: If you control the creation of your xml sitemaps, then you obviously don’t need to download them from the site.  That said, the sitemaps residing on your website are what the engines crawl.  If your CMS is generating your sitemaps on the fly, then it’s valuable to use the exact sitemaps sitting on your servers.  So even though you might have them locally, I would still go through the process of downloading them from your website via the tutorial below.

How To Download and Crawl Your XML Sitemaps

  1. Download the XML Sitemap(s)
    Enter the URL of your xml sitemap, or the sitemap index file.  A sitemap index file contains the urls of all of your xml sitemaps (if you need to use more than one due to sitemap size limitations).  If you are using a sitemap index file, then you will need to download each xml sitemap separately.  Then you can either crawl each one separately or combine the urls into one master text file.  After the sitemap loads in your browser, click “File”, and then “Save As”.  Then save the file to your hard drive.
    Download XML Sitemaps
  2. Import the Sitemap into Excel
    Next, you'll need to get a straight list of urls to crawl from the sitemap.  In order to do this, I recommend using the "Import XML" functionality in the "Developer" tab in Excel.  Click "Import" and then select the sitemap file you just downloaded.  After selecting your file and clicking the "Import" button, Excel will provide a dialog box about the xml schema.  Just click "OK".  Then Excel will ask you where to place the data.  Leave the default option and click "OK".  You should now see a table containing the urls from your xml sitemap.  And yes, you might already see some problems in the list.  :)
    Import XML Sitemaps into Excel
  3. Copy the URLs to a Text File
    I mentioned earlier that Screaming Frog will only crawl text files with a list of urls in them.  To get the urls into that format, copy all of the urls from column A in your spreadsheet.  Then fire up your text editor of choice (mine is Textpad), and paste the urls.  Make sure you delete the first row, which contains the heading for the column.  Save that file to your computer.  (And if you'd rather script this step, see the short sketch after this list.)
    Copy URLs to a Text File
  4. Unleash the Frog
    Next, we’re ready to crawl the urls in the text file you just created.  Fire up Screaming Frog and click the “Mode” tab.  Select “List”, which enables you to load a text file containing a series of urls.
    List Mode in Screaming Frog
  5. Start the Crawl
    Once you load the urls in Screaming Frog, click “Start” to begin crawling the urls.
    Start a Crawl in Screaming Frog
  6. Analyze the Crawl
    When the crawl is done, you now have a boatload of data about each url listed in the xml sitemap.  The first place I would start is the “Response Codes” tab, which will display the header response codes for each url that was crawled.  You can also use the filter dropdown to isolate 404s, 500s, 302s, etc.  You might be surprised with what you find.
    Analyze a Sitemap Crawl in Screaming Frog
  7. Fix The Problems!
    Once you analyze the crawl, work with your developer or development team to rectify the problems you identified.  The fix can sometimes be handled quickly (in less than a day or two).
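
And if you'd rather script the download and extraction steps (or spot-check the header response codes without a crawler), here's a minimal sketch in Python.  The sitemap URL and output filename are placeholders, and it assumes a standard sitemaps.org-formatted sitemap:

```python
# Minimal sketch for scripting the download/extract steps above. The sitemap
# URL and output filename are placeholders. Requires the third-party
# "requests" package (pip install requests).
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder
NAMESPACE = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

response = requests.get(SITEMAP_URL, timeout=30)
response.raise_for_status()

# Pull every <loc> entry out of the sitemap. Note: if this is a sitemap index
# file, the <loc> entries will be child sitemaps rather than pages.
root = ET.fromstring(response.content)
urls = [loc.text.strip() for loc in root.iter(NAMESPACE + "loc")]

# Write a plain text file that Screaming Frog's list mode can crawl.
with open("sitemap-urls.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(urls))
print(f"Wrote {len(urls)} urls to sitemap-urls.txt")

# Or do a quick header check yourself -- anything other than 200 is worth a look.
for url in urls:
    status = requests.head(url, timeout=30, allow_redirects=False).status_code
    if status != 200:
        print(f"{status}  {url}")
```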

 

Summary – Cleaning Up Dirty Sitemaps
Although XML sitemaps provide an easy way to submit all of your canonical urls to the engines, that ease of setup sometimes leads to serious errors.  If you are seeing strange urls getting indexed, or if you are seeing crawl errors for weird or unfamiliar urls, then you might want to check your own sitemaps to see if they are causing a problem.  Using this tutorial, you can download and crawl your sitemaps quickly, and then flag any errors you find along the way.

Let's face it, quick and easy wins are sometimes hard to come by in SEO.  But finding xml sitemap errors can be a quick and easy win.  And now you know how to find them.  Happy crawling.

GG

 

 

Monday, May 13th, 2013

Robots.txt and Invisible Characters – How One Hidden Character Could Cause SEO Problems

How syntax errors in robots.txt can cause SEO problems.

If you’ve read some of my blog posts in the past, then you know I perform a lot of SEO technical audits.  As one of the checks during SEO audits, I always analyze a client’s robots.txt file to ensure it’s not blocking important directories or files.  If you’re not familiar with robots.txt, it’s a text file that sits in the root directory of your website and should be used to inform the search engine bots which directories or files they should not crawl.  You can also add autodiscovery for your xml sitemaps (which is a smart directive to add to a robots.txt file).

Anyway, I came across an interesting situation recently that I wanted to share.  My hope is that this post can help some companies avoid a potentially serious SEO issue that was not readily apparent.  Actually, the problem could not be detected by the naked eye.  And when a problem impacts your robots.txt file, the bots won’t follow your instructions.  And when the bots don’t follow instructions, they can potentially be unleashed into content that should never get crawled.  Let’s explore this situation in greater detail.

A sample robots.txt file:

Sample Robots.txt File

Technical SEO – Cloaked Danger in a Robots.txt File
During my first check of the robots.txt file, everything looked fine.  There were a number of directories being blocked for all search engines.  Autodiscovery was added, which was great.  All looked good.  Then I checked Google Webmaster Tools to perform some manual checks on various files and directories (based on Google’s “Blocked URLs” functionality).  Unfortunately, there were a number of errors showing within the analysis section.

The first error message started with the User-agent line (the first line in the file).  Googlebot was choking on that line for some reason, but it looked completely fine.  And as you can guess, none of the directives listed in the file were being adhered to.  This meant that potentially thousands of files would be crawled that shouldn’t be crawled, and all because of a problem that was hiding below the surface…  literally.

Blocked URLs reporting in Google Webmaster Tools:

Blocked URLs in Google Webmaster Tools

 

Word Processors and Hidden Characters
So I started checking several robots.txt tools to see what they would return.  Again, the file looked completely fine to me.  The first few checks returned errors, but wouldn’t explain exactly what was wrong.  And then I came across one that revealed more information.  The tool revealed an extra character (hidden character) at the beginning of the robots.txt file.  This hidden character was throwing off the format of the file, and the bots were choking on it.  And based on the robots syntax being thrown off, the bots wouldn’t follow the instructions.  Not good.

Invisible Character in Robots.txt

I immediately sent this off to my client and their dev team tracked down the hidden character, and created a new robots.txt file.  The new file was uploaded pretty quickly (within a few hours).  And all checks are fine now.  The bots are also adhering to the directives included in robots.txt.
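
If you want to check for this kind of problem yourself, a few lines of code can reveal what the naked eye can't.  Here's a minimal sketch that fetches a robots.txt file and flags a UTF-8 byte order mark (BOM) or any other non-printable bytes.  The URL is a placeholder, and a BOM is just one common culprit:

```python
# Minimal sketch: flag invisible characters in a robots.txt file. The URL is a
# placeholder. A UTF-8 BOM is a common culprit, but anything outside printable
# ASCII is worth a closer look. Requires the third-party "requests" package.
import requests

ROBOTS_URL = "http://www.example.com/robots.txt"  # placeholder

raw = requests.get(ROBOTS_URL, timeout=30).content

if raw.startswith(b"\xef\xbb\xbf"):
    print("UTF-8 BOM found at the very start of robots.txt")

for line_number, line in enumerate(raw.splitlines(), start=1):
    for column, byte in enumerate(line, start=1):
        # Allow tabs (9) and printable ASCII (32-126); flag everything else.
        if byte != 9 and not 32 <= byte <= 126:
            print(f"Suspicious byte 0x{byte:02x} at line {line_number}, column {column}")
```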

 

The SEO Problems This Scenario Raises
I think this simple example underscores the fact that there’s not a lot of room for error with technical SEO… it must be precise.  In this case, one hidden character in a robots.txt file unleashed the bots on a lot of content that should never be crawled.  Sure, there are other mechanisms to make sure content doesn’t get indexed, like the proper use of the meta robots tag, but that’s for another post.  For my client, a robots.txt file was created, it looked completely fine, but one character was off (and it was hidden).  And that one character forced the bots to choke on the file.

 

How To Avoid Robots.txt Formatting Issues
I think one person at my client’s company summed up this situation perfectly when she said, “it seems you have little room for error, SEO seems so delicate”.  Yes, she’s right (with technical SEO).  Below, I’m going to list some simple things you can do to avoid this scenario.   If you follow these steps, you could avoid faulty robots.txt files that seem accurate to the naked eye.

1. Text Editors
Always use a text editor when creating your robots.txt file.  Don’t use a word processing application like Microsoft Word.  A text editor is meant to create raw text files, and it won’t throw extra characters into your file by accident.

2. Double and Triple Check Your robots.txt Directives
Make sure each directive does exactly what you think it will do.  If you aren't 100% sure, then ask for help.  Don't upload a robots.txt file that could potentially block a bunch of important content (or vice versa).  (A quick scripted check is sketched after this list.)

3. Test Your robots.txt File in Google Webmaster Tools and Via Third Party Tools
Make sure the syntax of your robots.txt file is correct and that it’s blocking the directories and files you want it to.  Note, Google Webmaster Tools enables you to copy and paste a new robots file into a form and test it out.  I highly recommend you do this BEFORE uploading a new file to your site.

4. Monitor Google Webmaster Tools “Blocked URLs” Reporting
The blocked urls functionality will reveal problems associated with your robots.txt file under the “analysis” section.  Remember, this is where I picked up the problem covered in this post.
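
For point #2 above, a quick scripted sanity check can confirm that your directives block (and allow) the paths you expect.  Here's a minimal sketch using Python's built-in robots.txt parser; the URL and paths are hypothetical, and Googlebot's handling of edge cases can differ slightly, so still verify in Google Webmaster Tools before going live:

```python
# Minimal sketch: verify that robots.txt blocks/allows the paths you expect,
# using Python's built-in parser. The URL and paths below are hypothetical.
from urllib.robotparser import RobotFileParser

SITE = "http://www.example.com"  # placeholder

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()

# Map each path to whether Googlebot *should* be allowed to crawl it.
expectations = {
    "/checkout/": False,         # should be blocked
    "/internal-search/": False,  # should be blocked
    "/products/widget": True,    # should be crawlable
}

for path, should_be_allowed in expectations.items():
    allowed = parser.can_fetch("Googlebot", SITE + path)
    status = "OK" if allowed == should_be_allowed else "MISMATCH"
    print(f"{status}: {path} is {'allowed' if allowed else 'blocked'}")
```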

 

Extra Characters in Robots.txt – Cloaked in Danger
There you have it.  One hidden character bombed a robots.txt file.  The problem was hidden to the naked eye, but the bots were choking on it.  And depending on your specific site, that one character could have led to thousands of pages getting crawled that shouldn’t be.  I hope this post helped you understand that your robots.txt format and syntax are extremely important, that you should double and triple check your file, and that you can test and monitor that file over time.  If the wrong file is uploaded to your website, bad things can happen.  Avoid this scenario.

GG

 

Sunday, April 14th, 2013

You Might Be Losing Out – How To Make Sure Sitelink Extensions in Bing Ads Are Tracked Properly [Tutorial]

Bing Ads released sitelink extensions in October of 2012, a feature that enables advertisers to provide additional links in their text ads.  Google AdWords has had ad sitelinks for some time, so this was a great addition by our friends at Bing Ads.  For example, if you were an ecommerce website selling sporting goods, you could provide ad sitelinks for your top categories, like football, baseball, basketball, etc. right beneath your standard text ad.  Sitelink extensions are great usability-wise, while they also provide a nice advantage in the SERPs (since they take up more real estate).

Here are two examples of sitelink extensions in action (2 Formats):
Example of Sitelink Extensions in Bing Ads for Lucky Jeans

 

Example of Sitelink Extensions in Bing Ads for Adidas

So, let's say you set up sitelink extensions for some of your campaigns, and you're basking in the glory of those beautiful ads (and the click through they are getting).  But, maybe your reporting isn't lining up clicks and visits-wise.  Sure, there are several reasons that could be happening, but maybe it got worse since you launched sitelink extensions.  Well, the reason could very well be the lack of tagging on your ad sitelinks.  If those additional URLs aren't tagged properly, then your analytics package could very well be reporting that traffic as organic search.  And that would be a shame.

In this post, I’m going to walk you through why this could be happening, and how to rectify the situation.  After reading this post, you might just run to Bing Ads today and make changes.  Let’s jump in.

Sitelink Extensions and Tracking Parameters
In Bing Ads, you can include sitelink extensions several ways.  First, you can add them manually via the Bing Ads web UI.  Second, you can use Bing Ads Editor to add them locally, and then upload them to your account.  And third, and possibly the top reason ad sitelinks don’t get tagged, is that you can import them from AdWords via the “Import from Google” functionality.  Note, the import from AdWords functionality is awesome, so don’t get me wrong.  It’s just that it’s easy to import ad sitelinks and not know they are there.  Then you run the risk of uploading untagged sitelink extensions.

How To Create Sitelink Extensions in Bing Ads

So, you need to make sure that your ad sitelinks are tagged properly, based on the analytics package you are using to track campaigns.  For example, if you are using Google Analytics, then you need to make sure that you identify each click coming from your sitelink extensions.  That means you should be appending tracking parameters to your sitelink URLs.  For Google Analytics, you can use URL Builder to tag your landing page URLs.

Tagging Sitelink URLs Using URL Builder
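
If you have a lot of sitelink URLs to tag, it can be less error-prone to generate them with a short script rather than pasting each one into URL Builder.  Here's a minimal sketch; the landing page URL and parameter values are just examples, so use whatever naming convention matches the rest of your paid search tracking:

```python
# Minimal sketch: append Google Analytics utm_* parameters to a sitelink
# destination URL. The example URL and parameter values are placeholders.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_url(url, source, medium, campaign, content=None):
    """Return the url with utm parameters appended, preserving existing params."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    params.update({"utm_source": source, "utm_medium": medium, "utm_campaign": campaign})
    if content:
        params["utm_content"] = content  # handy for identifying the specific sitelink
    return urlunparse(parts._replace(query=urlencode(params)))

print(tag_url(
    "http://www.example.com/sale/football",  # placeholder landing page
    source="bing",
    medium="cpc",
    campaign="spring-sale",
    content="sitelink-football",
))
# -> http://www.example.com/sale/football?utm_source=bing&utm_medium=cpc&utm_campaign=spring-sale&utm_content=sitelink-football
```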

 

How To Tag Your Ad Sitelinks in Bing Ads
Again, there are various ways to include sitelink extensions in your campaigns, from using the web UI, to using Bing Ads Editor, to using the "Import from Google" functionality.  I'll quickly cover each method below to make sure you know where to apply your tracking parameters.

1.  The Bing Ads Web UI
You can currently apply ad sitelinks at the campaign level in Bing Ads.  When you access a campaign, you can click the “Ad Extensions” tab to include ad sitelinks.  Once there, you can click “Create” to add a new sitelink extension.  If you have other sitelink extensions set up across campaigns, you will see them listed (and you can apply those to your campaign if it makes sense).

Creating Sitelink Extensions Using the Bing Web UI

If you want to add a completely new sitelink extension, then click "Create New".  When adding the sitelink extension, Bing Ads provides a field for link text and then a field for the destination URL.  When you add the URL, make sure your tracking parameters are added!  If not, your visits will show up as "Bing Organic" versus "Bing CPC".  Good for the SEO team, but not so good for the paid search team.  :)

 

Adding Sitelinks Using the Bing Web UI

 

2. Bing Ads Editor
I love Bing Ads Editor.  It’s an awesome way to manage your campaigns locally and then sync with the Bing Ads web UI.  And as you can guess, there is functionality for adding and editing sitelink extensions in Bing Ads Editor.  You can access your sitelink extensions by clicking the “Ad Extensions” tab for any selected campaign.

Once you click the “Ad Extensions” tab, you can add sitelink extensions by clicking the “Create a Sitelink Extension” button from the top menu.  Then similar to the web UI, you can add the link text and the destination URL.  When adding your destination URLs, make sure your tracking parameters are added.

Adding Sitelinks Using the Bing Ads Editor

 

3. Import from Google (in Bing Ads Editor)
As I explained earlier, I love having the ability to import campaigns, changes, etc. from AdWords directly into Bing Ads Editor.  It makes managing campaigns across both platforms much more efficient.  But, I’ve seen advertisers import campaigns from AdWords that have sitelink extensions, but they don’t realize it.  Then they upload their campaigns to Bing Ads and don’t understand that prospective customers are clicking their sitelinks, visiting their sites, etc., but those visits aren’t being tracked correctly.  Again, those visits will show up as “Bing Organic” in your analytics reporting.

When you go through the process of importing your campaigns, make sure you double check the “Ad Extensions” tab for the newly-imported campaign.  You just might find sitelink extensions sitting there.  And yes, they very well could be left untagged.  Make sure you add your tracking parameters before uploading them to Bing Ads (from Bing Ads Editor).

You can also uncheck the "Ad Extensions" radio button when importing your campaigns from AdWords.  Then you can add your sitelink extensions directly in Bing Ads Editor (via the second method I covered earlier in this post).

Importing Sitelink Extensions in Bing Ads Editor

 

Sitelinks Are Powerful, But Only If They Are Tracked
Sitelink extensions are a great addition to Bing Ads, and they absolutely can yield higher click through rates.  But, you need to make sure those clicks are being tracked and attributed to the right source – your Bing Ads campaigns!  I recommend checking your campaigns today to make sure your sitelink extensions have the proper tracking parameters appended.  If not, you can quickly refine those links to make sure all is ok.   And when everything is being tracked properly, you just might see a boost in visits, orders, and revenue being attributed to Bing Ads.  And that's always a good thing.

GG

 

 

Thursday, March 21st, 2013

How To Properly Demote Sitelinks in Google Webmaster Tools

How to remove sitelinks in Google Webmaster Tools

I’ve received several questions recently about how to remove sitelinks in Google.  If you’re not familiar with sitelinks, they are additional links that Google provides under certain search listings.  Sitelinks enable users to drill deeper into a site directly from the search results.  You typically see sitelinks for branded searches.

For example, here are sitelinks for Amazon:
Sitelinks for Amazon.com

 

And here are sitelinks for the Apple iPad:
Sitelinks for Apple iPad


How Google Determines Sitelinks
Google algorithmically determines sitelinks for a given query/url combination.  This is based on a number of factors that Google takes into account.  For example, Google explains that it analyzes a site’s link structure to determine if there are additional links it can provide in the search results that will save users time (by quickly enabling them to link to core pages on your site).  Remember, Google always wants to connect users with the information they are seeking as fast as possible.

No, Google Doesn’t Always Get It Right
If you are checking your rankings and notice strange sitelinks showing up, you can always demote those links via Google Webmaster Tools.  For example, you might see sitelinks that are irrelevant, too granular, or links that could end up sending users to supporting pages that wouldn’t provide a strong user experience.  Whatever the case, you can take action.

For cases like this, you can use the “Sitelinks” section of Google Webmaster Tools to demote specific sitelinks.  Note, if you don’t have Google Webmaster Tools set up for your site, stop reading this post, and set it up NOW.  You can set up your account and verify your site in just a few minutes, and then you’ll receive a boatload of important data right from Google.

Demoting Sitelinks in Google Webmaster Tools
Once you set up a webmaster tools account, you can access the Sitelinks section to begin demoting specific sitelinks.  Below is a step by step tutorial for demoting sitelinks that shouldn't be showing up below your search listings.

1. Access the Sitelinks Section of Webmaster Tools
Access Google Webmaster Tools and click the “Configuration” tab, and then “Sitelinks” to access the demotion form.

How to access sitelinks in Google Webmaster Tools

2. Choose Wisely When Demoting Sitelinks
There are two text fields you need to concern yourself with in the “Sitelinks” section.  The first is labeled, “For this search result:” and it refers to the webpage that shows up in the search results that contains sitelinks.  I know this is where confusion starts to set in, so let me say that again (and show you what I mean).

The first text field is not for the sitelink URL you want to demote.  It's for the webpage that the sitelinks show up for.  It's the URL that's displayed at the top of the search listing.  Note, if you are demoting a sitelink for your homepage, you can leave this field blank.  It's also worth noting that Google provides the root URL of your site already in the text field, so you just need to enter the remaining part of the URL, i.e. the path (everything after http://www.yourdomain.com/).

Enter search result when demoting sitelinks.

For example, if you were the VP of Marketing for Apple, and wanted to remove the “Refurbished iPad” sitelink for the iPad page, then you would enter http://www.apple.com/ipad/ in the first field.

How to remove sitelinks for the ipad search result.


3. Demote the Sitelink URL
The second field is where you will enter the URL of the sitelink you want to demote.  Using our Apple example above, you would enter http://store.apple.com/us/browse/home/specialdeals/ipad in the field to demote the “Refurbished” sitelink for the ipad URL.  That’s the refurbished iPad page on Apple’s site (and it’s where the sitelink in the search results points to).

Once you enter the URL, you can click the red “Demote” button.  Once you do, the demoted sitelink will be listed below the form with the search result it applies to, the specific sitelink URL, and a “Remove Demotion” button.  If you ever want to remove the demotion, just access this page again, and click “Remove Demotion”.  Then give Google a few days to apply the changes.

Enter the sitelink url to demote.

 

Misc. Sitelink Information
Based on the questions I have received when helping clients demote sitelinks, I figured I would provide some additional information in this post.

1. How Long Does it Take for Google to Demote Sitelinks?
I’ve seen sitelinks get demoted in just a few days.  That said, it definitely varies per site…  I’ve seen it take a little longer in certain cases.  I recommend monitoring the sitelinks for the page in question for at least a week or two after demoting a sitelink.  If you notice that it’s still showing up, then revisit the form to make sure you demoted the right sitelink for the right search result.

2. How Many Sitelinks Can I Demote?
You can demote up to 100 URLs via Google Webmaster Tools.  That should be plenty for most webmasters.  Actually, I'd argue that something is very wrong if you are demoting too many sitelinks…  You might want to analyze your internal navigation, including the anchor text, to see if Google is picking up something that it shouldn't be.

Summary – Demotion Can Be A Good Thing
I hope this tutorial helped you better understand what sitelinks are and how to address the wrong sitelinks showing up in the search results.  If you notice any weird sitelinks showing up in the search results for your site, then visit Google Webmaster Tools and demote those specific sitelinks.  It’s one of the few times that a demotion could be a good thing.

GG

 

Friday, February 15th, 2013

How to Combine Custom Audiences in Facebook Ads to Enhance Your Targeting [Tutorial]

Custom Audiences in Facebook

Facebook recently released a powerful new option for advertisers called Custom Audiences.  Using custom audiences, advertisers can leverage their current in-house list of customers for targeting ads.  By uploading a list of emails, phone numbers, or UIDs, you can create a custom audience that can be used for targeting Facebook campaigns.

In my opinion, this was a brilliant move by Facebook.  It brings a unique targeting capability to the social network, and can be extremely useful on several levels.  For example, are you launching a new product?  Then use your custom audience to make sure your current customers know about the new product by reaching them on Facebook.  Know that a certain group of customers are interested in a given category of products?  Then use a custom audience to target just those customers with specific ads, copy, and calls to action.  The sky is the limit with regard to ideas for targeting your current set of customers, and I’ve been using custom audiences more and more recently.

Using Segmentation to Move Beyond Your One In-house Email List
A business can easily export its in-house email list and upload it to Facebook to create a custom audience.  It's relatively straightforward to do so, and you can accomplish this via Power Editor.  Once Facebook processes your list, it's available to use when targeting an audience.  But, you shouldn't stop there…  You can slice and dice your in-house email list and upload several files (if you have criteria for segmenting your list).

For example, do you know which customers are interested in which categories you sell?  Break those out.  Do you know which customers are tied to which purchases?  Sure you do, break those out too.  Once you do, you’ll have several targeted lists of emails that you can combine to hone your targeting.  And who doesn’t like that idea?
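
If your email platform won't export per-category lists for you, a short script can do the slicing.  Here's a minimal sketch that splits a single customer export into one file per category, ready to upload as separate custom audiences.  The filename and column names ("email", "category") are hypothetical, so adjust them to match your own export.

```python
# Minimal sketch: split a full in-house email export into one csv per product
# category, so each segment can be uploaded as its own custom audience.
# Assumes a csv with "email" and "category" columns (hypothetical names).
import csv
from collections import defaultdict

segments = defaultdict(set)  # category -> set of emails (a set also dedupes)

with open("customers.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        email = row["email"].strip().lower()
        category = row["category"].strip().lower() or "uncategorized"
        segments[category].add(email)

for category, emails in segments.items():
    filename = f"custom-audience-{category}.csv"
    with open(filename, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for email in sorted(emails):
            writer.writerow([email])
    print(f"{filename}: {len(emails)} emails")
```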

Combining Custom Audiences
When using Remarketing in AdWords, there is something called custom combinations.  When advertisers create a custom combination, they can create a remarketing audience that includes one audience, but excludes another.  That’s extremely powerful and provides a lot of flexibility for businesses trying to reach their customers via retargeting efforts.  Well, combining custom audiences in Facebook Ads enables you to do the same thing.

Here’s a simple hypothetical situation.  Let’s say you sold amazing new earphones that are invisible to the naked eye.  You already blasted an email out to your current customers and received some orders.  If your full email list was uploaded to Facebook as a custom audience (which should be done anyway), then you could create a second audience that includes customers that already purchased the new earphones.

Then, when you create a new campaign targeting your in-house email list (promoting your new earphones), you can exclude the list of customers that already purchased them.  This saves you from looking foolish, cuts down on wasted impressions, wasted clicks, and wasted budget.  Yes, that’s a simple example, but shows the power of creating custom combinations in Facebook.

How To Use Custom Combinations with Facebook Ads
Let’s quickly walk through how to set this up in Facebook.  Below, I’m going to explain how to first create a custom audience, and then how to upload and use a second audience (that can be used to hone your targeting).  Let’s create a custom combination using custom audiences in Facebook:

1. Export a straight list of customer emails as a .csv file.

Exporting a CSV of emails to create a custom audience.

 

2. Launch Power Editor and click the “Custom Audiences” Tab.
Note, if you’ve never used Power Editor, set that up now, download all of your campaigns, and then revisit this tutorial.

Custom Audience Tab in Facebook Ads

 

3. Click the "Create Audience" button, enter the name and description, and choose the type of list.
For this list, click the "Emails" radio button.  Then click the "Choose File" button to locate the csv file we created in the previous step.

The Custom Audience Dialog Box in Facebook Ads

 

4. Click “Create” and Facebook will upload your list and create your custom audience.
Note, it could take a few hours for Facebook to process the file, depending on the size of your list.  Remember, Facebook is going to scan the emails and try to match them up to current Facebook users.

 

5. Wait for Facebook to process your custom audience.
The status for the custom audience will say, “Waiting” while Facebook is processing the file.  That will change to “Ready” when the audience is ready to go.
You should also see the audience size (based on the users that Facebook could match up).

Custom Audience Status Message

 

6. Repeat the process in steps 1-5 to create a second custom audience (the hypothetical list of customers that already purchased our killer new earphones).
Make sure you give the new custom audience a descriptive name like “customers-invisible-earphones”.

 

7. Create a new campaign that will be used to target your current customers that have not purchased your new earphones yet.
Simply use the standard process for setting up a new Facebook campaign.

Creating a New Facebook Campaign

 

8. Select your custom audience.

When you create a new ad within your new campaign, you can hop down to the “Audience” tab and click the button labeled “Use Existing Audience”.  Then select your full in-house email list (the first custom audience we created).

Use Existing Audience in Facebook Ads

 

9. Now select the custom audience to exclude.

Next, click the “Advanced Options” tab under “Audience”.  You will see an option for “Excluded Audiences”.   You can start typing the name of the custom audience containing customers that already purchased your earphones (the second custom audience we created).  The audience name should auto-populate when you start typing.  After selecting the audience, you should see the “Estimated Reach” number drop, based on excluding the new list.

Combining Custom Audiences to Enhance Targeting

 

10. That’s it, you have now used a custom combination to hone your targeting using Custom Audiences!
Your ads will now only be displayed to customers on your email list that have not purchased your new earphones yet.

Summary – Combine Audiences for Power
As I explained earlier, using custom audiences is a new and powerful way to reach a targeted audience on Facebook.   It combines the power of a current, in-house email list with the flexibility and intelligence of segmenting your audience.  Don’t look foolish, don’t waste clicks, and don’t waste budget.  Use custom combinations to slice and dice your current customer list.  Now go ahead and set up your campaign.  :)

GG

 

Thursday, December 27th, 2012

Introducing SEO Bootcamp Princeton, A Hands-On SEO Training Course in Princeton NJ

SEO Training Topics

I absolutely love getting in front of a group of people to speak about SEO (and always have).  Over the past several years, I’ve led SEO training classes for clients covering a wide range of topics, from technical SEO to keyword research to content optimization to linkbuilding strategy.  Although I’ve really enjoyed leading classes like this, I’ve always wanted to launch a training program that anyone could sign up for, and not just clients.  Well, I finally put the program together, and it’s called SEO Bootcamp Princeton.

SEO Bootcamp Princeton is a three-hour, in-person training course being held at the Johnson Education Center (at D&R Greenway) on January 17th, from 9AM to 12PM.  You can register online via EventBrite, and there’s a 20% off early registration discount running through 12/31/12.  If you register by then, tickets are $145 versus the standard price of $179.

The Target Audience for SEO Bootcamp Princeton
So, what will you learn at SEO Bootcamp Princeton?  Put simply, you’ll learn a lot.  My goal is to make sure attendees can leave the training ready to make changes to their websites.  I’ve crafted the training so it can be valuable for any person marketing a business, from small business owners to corporate marketers.  SMB’s will learn the tactical knowledge necessary to build a solid SEO foundation, while corporate marketers can learn SEO best practices and techniques.

In addition, the training will be extremely valuable for creative professionals, including designers, programmers, copywriters, etc.  I used to work for a large agency in New York City, and I led a similar type of training there.  I can tell you that every creative professional left the training with a stronger understanding of Search Engine Optimization (SEO).  Actually, I know the training changed how some people performed their jobs on a regular basis…

For example, designers and programmers learned about search engine friendly ways to design and code sites, while copywriters learned how to perform keyword research and properly optimize content.  Professionals involved with information architecture (IA) learned how to best structure a navigation, while also learning the best ways to build an internal linking structure.  And everyone in the training learned about the risks of redesigning a website without taking SEO into account.

Those are just a few of the SEO topics you’ll learn more about at SEO Bootcamp Princeton.  Again, my goal is that you leave with a much deeper knowledge of SEO, that you can make changes immediately, and that you take SEO into account whenever working on a website, campaign, or redesign.  You can learn more about the topics I’m going to cover on the SEO Bootcamp Princeton webpage.

SEO Bootcamp Princeton is Job-Agnostic – All Levels and Positions Will Benefit

Technical and Creative Job Titles

Tools and Plugins
SEO is definitely a mix of art and science.  And in order to assist SEO professionals with several core tasks, there are many tools and plugins one can use.  During the training, I will highlight several of the tools and plugins that can make your job easier SEO-wise.  I’ve always said that when you combine the right tools with the right SEO knowledge, great things can happen.  And I’ll make sure to explain some of my favorites along the way.  From Firefox plugins to Chrome extensions to standalone software applications, you’ll leave the training with a list of tools that can help you on a regular basis.

SEO Tools Training

Major Algorithm Updates
I can’t leave this post without touching on a very important topic in SEO that’s affecting many business owners.  Google has launched several important algorithm updates since early 2011, including both Panda and Penguin.   As you can imagine, I receive calls every month from business owners who have gotten hammered by these updates.  During SEO Bootcamp Princeton, I will introduce each major algorithm update, and cover important insights based on helping a range of businesses deal with the aftermath of getting hit.  And more importantly, I can explain the best ways to avoid getting hit in the first place.  You can read several of my case studies about Panda recovery and Penguin recovery if you are interested in learning more.

Panda and Penguin Algorithm Updates

Next Steps, Register Today
In closing, I’m ultra-excited about SEO Bootcamp Princeton.  If you are interested in registering, you can sign up via the EventBrite page.  Again, there’s a 20% off early registration discount running through 12/31.  After 12/31, the standard price will be $179 per seat.  If you have any questions about the training, don’t hesitate to contact me.  It would be great to see you there!

GG

 

Monday, December 17th, 2012

Trackbacks in Google Analytics – How To Analyze Inbound Links in GA’s Social Reports

Trackbacks in Google Analytics

In May of 2012, Google Analytics introduced trackbacks in its social reporting.  If you’re not familiar with trackbacks, they enable you to understand when another website links to your content.  So, using Google Analytics and the new trackbacks reporting, you can start to track the inbound links you are building across the web.

Note, if you want to perform advanced-level analysis of your links, you should still use more robust tools like Open Site Explorer or Majestic SEO.  But trackbacks reporting is a quick and easy way to identify backlinks right within Google Analytics.  It can definitely supplement your link analysis efforts.

If you’re in charge of content strategy for your company, or if you are publishing content on a regular basis, then checking trackbacks reporting in GA can quickly help you understand the fruits of your labor.  But since trackbacks reporting isn’t immediately visible, I’ve written this post to explain how you can find trackbacks, and then what you can do with the data once you access the reporting.

Social Reports and Trackbacks
First, if you’re not familiar with social reporting in Google Analytics, you should check out my post from March where I cover how to use the new social reports to analyze content.  Social reports are a great addition to GA, but I still find many marketers either don’t know about them, or don’t know how to use them.  And that’s a shame, since they provide some great insights about the traffic coming from social networks, and the conversations going on there (at least for data hub partners).

Below, I’m going to walk you step by step through the process of finding links to your content via trackbacks reporting.  Once we find them, I’ll explain what you can do with your newly-found link data.

How To Find Trackbacks (Step by Step)
1. Access your Google Analytics reporting, and click “Traffic Sources”, “Social”, and then “Network Referrals”.

Trackback Reporting in Google Analytics

2. Next, click a network referral in the list like Google Plus, Twitter, Facebook, etc. Note, “Network Referral” is new language used by Google Analytics for “Social Network” or “Source”.

Network Referrals in Google Analytics

3. Once you click through a source, you should click the “Activity Stream” tab located near the top of the screen (right above the trending graph).

Activity Stream in Google Analytics Social Reports

4. Once you click the activity stream tab, you’ll need to click the dropdown arrow next to the “Social Network” label at the very top of the screen.  Once you do, you’ll see a link in that list for “Trackbacks”.  Click that link.

Finding Trackbacks in Google Analytics

5. Once you click the “Trackbacks” link, you will see the links to your content that Google Analytics picked up.

Viewing Trackbacks in Google Analytics Social Reports

Congratulations, you found the hidden treasure of trackbacks in Google Analytics!  Not the easiest report to find, is it?  Now let’s find out what you can do with the data.

What You Can Do Once You Find Trackbacks
First, I’ll quickly cover the data provided in the trackbacks reporting.  Google Analytics provides the following information for each trackback it picks up:

  • The date the trackback was picked up.
  • The title and URL of the page linking to your content.
  • The ability to open and view the content on your site that’s receiving the link.
  • And a quick way to isolate that content in your social reports (to view all social activity for that specific page).

Next, I’ll cover four ways you can benefit from analyzing trackbacks data in Google Analytics, including a bonus at the end.  Let’s jump in.

1. Understand the Source of the Trackback (Who Is Linking to You)
Linkbuilding is hard.  So when your content builds links naturally, you definitely want to understand the source of those links.  Trackbacks in Google Analytics provide a quick and easy way to identify links to your content.  But once you build some links, you shouldn’t stop and have a tropical drink with a fancy umbrella as you admire your results.  You should analyze your newly-found inbound links.

For example, you should determine whether the links are strong and relevant, and how much those links will help with your SEO efforts.  You should also determine which authors decided to link to you, what their backgrounds are, and where else they write.

One of the first things you’ll see in trackbacks reporting is the title and URL of the page linking to your content.  At this point, you can click the small arrow icon next to the URL to open the referring page in a new window.  You can also click the “More” button on the right side of the page, and then click “View Activity” to be taken to the page linking to your content.

Viewing Trackbacks in Google Analytics

At this point, you can check out the article or post linking to you, understand who wrote the content and what they focus on, follow links to their social accounts, find their contact information, etc.  Building relationships with quality authors in your niche is a great way to earn links down the line.  Therefore, analyzing the people who already link to your content is low-hanging fruit.  Trackbacks in GA make it easy to find them.

2. Understand Your Content That’s Building Links
When I’m working with content teams, I always get the question, “what should we write about?”  I’m a big believer that a content generation plan should be based on data, and not intuition.  And trackbacks provide another piece of data to analyze.  Let’s face it, the proof is in the pudding from a linkbuilding standpoint.  Either your content builds links or it doesn’t.  If it does, you need to find out why that content built the links it did.  And if it didn’t build links, you need to document that and make sure you don’t make the same mistake again.

As I mentioned earlier, there are some outstanding link analysis tools on the market, like Open Site Explorer and Majestic SEO, and I’m not saying that trackbacks in Google Analytics are the end-all.  But, you can definitely use the reporting to quickly understand which content is building links.

Once you find trackbacks and identify the content that built those links, you can start to analyze and understand what drove interest.  Was it breaking news, evergreen content, how-to’s, industry analysis, etc?  Which topics were hot from a linkbuilding standpoint, and were those the topics you expected to build links?  If you find a subject that worked well in the past, you can build a plan for expanding on that topic.  Also, are the pages linking to you providing ideas for new posts?  Do the comments on the page provide ideas, what did the author mention, etc?  Trackbacks provide a mechanism for supplementing your analysis.

3. Join the Conversation, Engage Influencers
I explained above how you can find the people (and websites) linking to your content.  That’s great, but you shouldn’t stop there.  If there’s a conversation happening on that referring page, then you should join the conversation.  If someone went to the trouble of mentioning and linking to your content, the least you can do is thank them, and provide value to the conversation.

Adding value to the conversation and engaging a targeted audience can help you build more credibility and connect with targeted people in your niche.  And as I mentioned above, you can connect with the author of the post via email or via their social accounts.

4. Understand Linkbuilding Over Time
Using the trending graph in Google Analytics, you can visually understand linkbuilding over time.  The graph at the top of the screen will show you the number of trackbacks earned over the time period you have selected in GA.  I’m not saying that it’s better than using other, dedicated link analysis tools, but this is a quick way to find link data right within Google Analytics.

Trackbacks Trending in Google Analytics

In addition, if you click the “More” button for any specific trackback, and then click “Page Analytics”, you can isolate specific pieces of content receiving links.  Note, I’ve been seeing a test in Google Analytics where “Page Analytics” is replaced by “Filter on this Page”.  Personally, I like “Filter on this Page” since it’s more intuitive.  Regardless, after clicking the link you can trend linkbuilding over time for a specific piece of content.

Viewing Trackbacks for a Specific Page

In addition, you can always compare timeframes to see how links were built during one period versus another.  You might find some interesting things, like a piece of content that built more inbound links months after it was first published.  Then you can dig into the links to find out why…

Bonus: Export The Data!
As with any report in Google Analytics, you can easily export trackbacks data.  If you are viewing any trackbacks report, you can click “Export” at the top of the screen, and then choose a format to quickly export the data for further analysis in Excel.  Then you can slice and dice the data, combine data from other reports, etc.  What you do with the data depends on your own Excel skills.  :)

Exporting Trackback Data in Google Analytics
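
Once you have the export, even a small script can summarize it for you.  Here’s a minimal sketch that assumes the export was saved as trackbacks.csv; the column headers in a Google Analytics export can vary, so the names below are placeholders that you should adjust to match your file:

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    referring_domains = Counter()
    linked_content = Counter()

    with open("trackbacks.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Placeholder column names; check the header row of your own export.
            referring_url = (row.get("Originating URL") or "").strip()
            your_url = (row.get("Shared URL") or "").strip()
            if referring_url:
                referring_domains[urlparse(referring_url).netloc] += 1
            if your_url:
                linked_content[your_url] += 1

    print("Top referring domains:")
    for domain, count in referring_domains.most_common(10):
        print(f"  {count:>3}  {domain}")

    print("Most-linked content:")
    for url, count in linked_content.most_common(10):
        print(f"  {count:>3}  {url}")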

Summary – Quick Link Analysis in Google Analytics
I hope after reading this post you’re ready to jump into Google Analytics to hunt down trackbacks.  Again, Google didn’t necessarily make it super-easy to find trackbacks, but they are there.  Once you do find them, you can analyze those links to glean important insights that can help your future content and linkbuilding efforts.  Although there are more robust link analysis solutions on the market, trackbacks reporting is a quick and easy way to identify and then analyze inbound links.  I recommend checking out the reporting today.  You never know what you’ll find.  :)

GG

 

Wednesday, November 14th, 2012

Hunting False Negatives – How To Avoid False Negatives When Checking Redirects After a Website Redesign or Migration [Screaming Frog Tutorial]

How To Check Redirects Using Screaming Frog

Every webmaster has to deal with a website redesign or migration at some point.  And redesigns and migrations often mean that your URL structure will be impacted.  From an SEO perspective, when URL’s need to change, it’s critically important that you have a solid 301 redirection plan in place.  If you don’t, you can pay dearly SEO-wise.

I wrote a post for my Search Engine Journal column last spring titled “How to Avoid SEO Disaster During a Website Redesign” and implementing a 301 redirection plan was one of the most important topics I covered.  I find many webmasters and marketers don’t understand how SEO power is built URL by URL.  As your URL’s build up inbound links and search equity, it’s important that those URL’s maintain those links and equity.  If you change those URL’s, you must notify the search engines where the old content moved to, and that’s where 301 redirects come into play.

So, when you change URL’s, you run the risk of losing all of the links pointing to the older URL’s, and the search power that the URL’s contained.  That’s unless you 301 redirect the old URL’s to the new ones.  A 301 redirect safely passes PageRank from an old URL to a new one (essentially maintaining its search equity).

Unfortunately, I’ve seen many companies either not set up a redirection plan at all, or botch the plan.  That’s when they end up with a catastrophic SEO problem.  Rankings drop quickly, traffic drops off a cliff, sales drop, and nobody is happy at the company (especially the CMO, CFO, and CEO).

Traffic Drop After Website Redesign

Meet the False Negative Redirect Problem, A Silent Killer During Redesigns or Migrations:
Needless to say, properly setting up your redirects is one of the most important things you can do when redesigning or migrating your website.  That said, even if you address redirects and launch the new site, how do you know that the redirects are in fact working?  Sure, you could manually check some of those URL’s, but that’s not scalable.  In addition, just because an older URL 301 redirects to a new URL doesn’t mean it redirects to the correct URL.  If you don’t follow through and check the destination URL (where the redirect is pointing), then you really don’t know if everything is set up properly.

This is what I like to call the False Negative Redirect Problem.  For SEO’s, a false negative occurs when your test incorrectly shows that the redirects are working properly (they don’t test positive for errors), when in fact, the destination URL’s might not be resolving properly.  Basically, your test shows that the redirects are ok, when they really aren’t.  Checking only the header response code of the old URL can trick webmasters into believing the redesign or migration has gone well SEO-wise, when in reality, the destination URL’s could be 404’ing or throwing application errors.  It’s a silent killer of SEO.

False Negatives can be a Silent SEO Killer

How To Avoid the Silent SEO Killer When Implementing Redirects
The false negative problem I mentioned above is especially dangerous when changing domain names (where you will often implement one directive in .htaccess or ISAPI_Rewrite that takes any request for a URL at one domain and redirects it to the same URL at another domain).  Just because it 301’s doesn’t mean the correct URL resolves.  Think about it, that one directive will 301 every request… but you need to check the destination URL to truly know if the redirects are working the way you need them to.  Unfortunately, many SEO’s only check that the old URL’s 301, but they don’t check the destination URL.  Again, that could be a silent killer of SEO.
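
If you want to spot-check a handful of old URL’s by hand before (or alongside) a full crawl, a short script can surface the problem directly.  Here’s a minimal sketch using Python’s requests library; the URL’s below are placeholders, and this is not a replacement for the full process described next:

    import requests  # third-party library: pip install requests
    from urllib.parse import urljoin

    # Placeholder URL's; in practice, load your old URL's from an analytics export or CMS.
    old_urls = [
        "http://www.olddomain.com/some-old-page/",
        "http://www.olddomain.com/another-old-page/",
    ]

    for url in old_urls:
        # Step 1: request the old URL without following the redirect.
        first = requests.get(url, allow_redirects=False, timeout=10)
        location = first.headers.get("Location")
        if first.status_code != 301 or not location:
            print(f"{url} -> expected a 301, got {first.status_code}")
            continue

        # Step 2: request the destination URL itself.  A 301 that lands on a 404
        # or an error page is exactly the false negative we are hunting for.
        destination = urljoin(url, location)
        final = requests.get(destination, allow_redirects=True, timeout=10)
        status = "ok" if final.status_code == 200 else f"PROBLEM ({final.status_code})"
        print(f"{url} -> {destination} [{status}]")

Anything other than a 200 at the destination is a redirect you need to fix.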

Screaming Frog Hops to the Rescue
I mentioned “scalable” solutions earlier.  Well, Screaming Frog provides a scalable solution for checking redirects during a migration or website redesign.  Note, Screaming Frog is a paid solution, but well worth the $157 annual fee.  Using Screaming Frog, you can import a list of old URL’s from your analytics package or CMS and have it crawl those URL’s and provide reporting.  Running a two-step process for checking redirects and destination URL’s can help you understand if your redirects are truly working.  For example, you might find redirects that lead to 404’s, application errors, etc.  Once you find those errors, you can quickly change them to retain search equity.

Below, I’m going to walk you through the process of exporting your top landing pages from Google Analytics and checking them via Screaming Frog to ensure both the redirects are working and that the destination URL’s are resolving correctly.  Let’s get started.

What You’ll Need and What We’ll Be Doing

  • First, we are going to export our top landing pages from Google Analytics.
  • Second, we’ll use the CONCATENATE function in Excel to build complete URL’s.
  • Next, we’ll add the URL’s to a text file that we can import into Screaming Frog.
  • Then we’ll fire up Screaming Frog and import the text file for crawling.
  • Screaming Frog will crawl and test those URL’s and provide reporting on what it finds.
  • Then we can export the destination URL’s we find so we can make sure they resolve correctly.  Remember, just because the old URL’s 301 redirect doesn’t mean the destination URL’s resolve properly.  We are hunting for false negatives.
  • Last, and most importantly, you can fix any problematic redirects to ensure you maintain search equity.


How To Use Screaming Frog to Hunt Down False Negatives:

  1. Export Top Landing Pages from Google Analytics
    Access your Google Analytics reporting and click the “Content” tab, “Site Content”, and then “Landing Pages”.  Click the dropdown for “Show rows” at the bottom of the report and select the number of rows you want to view.
    Export top landing pages from Google Analytics

    Tip: If you have more than 500 landing pages, you can edit the URL in Google Analytics to display more than 500 rows.   After first selecting a row count from the dropdown, find the parameter named table.rowCount= in the URL.  Simply change the number after the equals sign to 1000, 5000, 10000, or whatever number you need to capture all of the rows.   When you export your report, all of the rows will be included.

  2. Export the Report from Google Analytics
    Click the Export button at the top of the report and choose “CSV”.  The file will download, and then you can open it in Excel.
    Exporting a report from Google Analytics
  3. Use Excel’s CONCATENATE Function to Build a Complete URL
    When the URL’s are exported from Google Analytics, they will not include the protocol or domain name (i.e., the http://www.yourdomain.com portion at the beginning of each URL).  Therefore, you need to add this to your URL’s before you use them in Screaming Frog.  Excel has a powerful function called CONCATENATE, which lets you combine text and cell contents to form a new text string.  We’ll use this function to combine the protocol and domain name with the URL that Google Analytics exported.  (If you would rather script this step, see the sketch after these instructions.)

    Create a new column next to the “Landing Page” column in Excel.  Click the cell next to the first landing page URL and start entering the following: =CONCATENATE("http://www.yourdomain.com", A8).  Note, change “yourdomain.com” to your actual domain name.  Also, A8 is the cell that contains the first URL that was exported from Google Analytics (in my spreadsheet).  If your spreadsheet is different, make sure to change A8 to whichever cell contains the first URL in your sheet.  The resulting text should be the complete URL (combining protocol, domain name, and URL exported from Google Analytics).  Then you can simply copy and paste the contents of that cell (which contains the formula) to the rest of the cells in that column.  The formula will automatically adjust to use the right landing page URL for that row.  Now you have a list of all complete URL’s that you can import into Screaming Frog.

    Using the CONCATENATE function in Excel to build URL’s

  4. Copy all URL’s to a Text File
    Since all we want are the URL’s for Screaming Frog, you can select the entire new column you just created (with the complete URL’s) and copy those URL’s.  Then open a text file and paste the URL’s in the file.  You can use Notepad, Textpad, or whatever text editor you work with.  Save the file.

    Copy the URL list to a text file

  5. Fire Up Screaming Frog
    After launching Screaming Frog, let’s change the mode to “list” so we can upload a list of URL’s.  Under the “Mode” menu at the top of the application, click “List”, which enables you to use a text file of URL’s to crawl.   Click “Select File” and choose the text file we just created, then click “Start” and Screaming Frog will begin to crawl those URL’s.

    Using List Mode to Crawl URL's

  6. Review Header Response Codes From the Crawl
    At this point, you will see a list of the URL’s crawled, the status codes, and the status messages.  Remember, all of the URL’s should be 301 redirecting to new URL’s.  So, you should see a lot of 301’s and “moved permanently” messages.  If you see 404’s at this point, those URL’s didn’t redirect properly.  Yes, you just found some bad URL’s, and you should address those 404’s quickly.  But that’s not a false negative.  It’s good to catch low-hanging fruit, but we’re after more sinister problems.

    Viewing 301 redirects after a Screaming Frog crawl

  7. Find the Destination URL’s for Your Redirects
    Now, just because you see 301 redirects showing up in the main reporting doesn’t mean the destination URL’s resolve correctly.  If you click the “Response Codes” tab, you’ll see the redirect URI (where the 301 actually sends the crawler).  THOSE ARE THE URL’S YOU NEED TO CHECK.    Click the “Export” button at the top of the screen to export the “Response Code” report.  This will include all of the destination URL’s.
    Finding Destination URL's via the Response Code Tab
  8. Copy All Destination URL’s to a Text File
    In Excel, copy the destination URL’s and add them to a text file (similar to what we did earlier). Make sure you save the new file.  We are now going to crawl the destination URL’s just like we crawled the original ones.  But, this process will close the loop for us, and ensure the destination URL’s resolve correctly.  This is where we could find false negatives.

    Exporting all destination URL’s to Excel from Screaming Frog

  9. Import Your New Text File and Crawl the Destination URL’s
    Go back through the process of selecting “List Mode” in Screaming Frog and then import the new text file we just created (the file that contains the destination URL’s).  Click “Start” to crawl the URL’s, and then check the reporting.

    Using List Mode to Crawl URL's

  10. Analyze the Report and Find False Negatives
    You should see a lot of 200 codes (which is good), but you might find some 404’s, application errors, etc.  Those are your false negatives.  At this point, you can address the errors and ensure your old URL’s in fact redirect to the proper destination URL’s.  Disaster avoided.  :)

    Finding and Fixing False Negatives Using Screaming Frog
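
If you would rather script steps 3 and 4 instead of using Excel, here’s a minimal sketch that prepends your domain to the landing page paths from the Google Analytics export and writes the text file for Screaming Frog’s list mode.  The file names, column header, and domain below are placeholders, so adjust them to match your own setup:

    import csv

    DOMAIN = "http://www.yourdomain.com"  # placeholder; use your real protocol and domain name

    # Note: Google Analytics CSV exports sometimes include metadata rows above the
    # data table; remove those first so the column headers sit on the first line.
    full_urls = []
    with open("landing-pages.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            path = (row.get("Landing Page") or "").strip()
            if path.startswith("/"):
                full_urls.append(DOMAIN + path)

    # One URL per line, ready for Screaming Frog's "List" mode.
    with open("urls-for-screaming-frog.txt", "w") as out:
        out.write("\n".join(full_urls) + "\n")

    print(f"Wrote {len(full_urls)} URL's to urls-for-screaming-frog.txt")

The same approach works for step 8, swapping in the exported response codes report and pulling the redirect URI column instead.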


Screaming Frog and Actionable Data: Beat False Negatives
Going through the process I listed above will ensure you accurately check redirects and destination URL’s during a website redesign or migration.  The resulting reports can identify bad redirects, 404’s, application errors, etc.  And those errors could destroy your search power if the problems are widespread.  I highly recommend performing this analysis several times during the redesign or migration to make sure every problem is caught.

Make sure you don’t lose any URL’s, which can result in lost search equity.  And lost search equity translates to lower rankings, less targeted traffic, and lower sales.  Don’t let that happen.  Perform the analysis, quickly fix problems you encounter, and retain your search power.  Redesigns or migrations don’t have to result in disaster.  You just need to look out for the silent SEO killer. :)

GG