Archive for August, 2012

Thursday, August 23rd, 2012

Adjusted Bounce Rate in Google Analytics – One Step Closer to Actual Bounce Rate

Adjusted Bounce Rate in Google Analytics

I’ve written extensively in the past about Bounce Rate, both here on my blog and on Search Engine Journal.  Bounce Rate is an incredibly powerful metric that can help marketers better understand the quality of their traffic and the quality of their content.  If you’re not familiar with Bounce Rate, it’s the percentage of visits that view only one page on your site.  Visitors find your site, view one page, and leave.  As you can guess, that’s usually not a good thing.

Traffic-wise, high bounce rates can raise red flags about the quality of traffic from a given source, campaign, or keyword.  For example, imagine spending $1500 in AdWords driving visitors to your site for a certain category of keywords and seeing a 92% bounce rate.  That should tell you that either the visitor quality is poor or that your landing page is not providing what people are looking for.  Maybe visitors are looking for A, B, and C, but are getting X, Y, and Z from your landing page.  That could very well cause a bounce.

But that’s not the full story for bounce rate.  And I find many people don’t understand how analytics packages calculate the metric.  Here’s a scenario that shows a serious flaw.  What if someone visits your site, spends 15 minutes reading a blog post, and then leaves?  If there’s that much engagement with your content, it probably shouldn’t be a bounce, right?  But it is.  Since the user viewed just one page, Google Analytics has no way to determine user engagement.  Therefore, it’s counted as a bounce.  Yes, it’s a huge problem, and it could be tricking webmasters into making changes they really don’t need to make.

The Problem with Standard Bounce Rate

Actual Bounce Rate
Over the years, there’s been a lot of talk about how Google and Bing use bounce rate as a ranking factor.  For example, if the engines saw a page with a very high bounce rate, maybe they would use that against the page (which could result in lower rankings).  Add the Panda update, which targets low-quality content, and you can see why SEOs became extremely concerned with bounce rate.

In August of last year, I wrote a post on Search Engine Journal about Actual Bounce Rate.  The post explains some of the mechanisms Google can use to determine actual bounce rate, beyond the standard bounce rate that Google Analytics provides: dwell time, toolbar data, Chrome data, etc.  The core point of the post is that Google has access to much more data than you think.  Therefore, don’t focus solely on the standard bounce rate presented in Google Analytics, since the actual bounce rate is what Google could be using to impact rankings.

Actual Bounce Rate Factors


Google Introduces “Adjusted Bounce Rate”
So, given what I just explained, is there a way to gain a better view of actual bounce rate in Google Analytics?  Until recently, the answer was no.  But I’m happy to report that Google Analytics released an update in July that enables you to track “Adjusted Bounce Rate”.  It’s not perfect, but it’s definitely a step closer to understanding actual bounce rate.

The Definition of Adjusted Bounce Rate
By adding a new line of code to your Google Analytics snippet, you can trigger an event when users stay on a page for a minimum amount of time.  That amount of time is determined by you, based on your specific site and content.  So, adjusted bounce rate will provide the percentage of visits that view only one page on your site and stay on that page for less than your target timeframe.  For example, you could set the required time to 20 seconds, and that threshold would be used to calculate the bounce rate numbers in your reporting.  If a user stays less than 20 seconds, it’s a bounce.  If they stay longer than 20 seconds, it’s not a bounce (even if they view just one page).

Changes Needed
As I mentioned above, you need to add one line of code to your Google Analytics snippet.  Here is the revised snippet (from the Google Analytics blog post about adjusted bounce rate):

Google Analytics Snippet for Adjusted Bounce Rate
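In case the image above doesn’t come through, here is the snippet in text form.  It’s the classic asynchronous ga.js snippet with one new setTimeout line added (the property ID ‘UA-XXXXXXX-1’ is a placeholder; use your own):

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);  // placeholder: your property ID
      _gaq.push(['_trackPageview']);

      // The new line: after 15000 milliseconds (15 seconds), push an event.
      // Events count as interactions, so a one-page visit that reaches this
      // point is no longer reported as a bounce.
      setTimeout("_gaq.push(['_trackEvent', '15_seconds', 'read'])", 15000);

      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>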

Note that the new line needs to be added to your Google Analytics snippet on every page of your site.  Also, the 15000 in the code represents the time in milliseconds (15000 equals 15 seconds).  You can adjust that based on your own site and content.  For some sites, you might set that to 30 seconds, 1 minute, or more.  The minimum is 10 seconds.
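For example, here is what a hypothetical 30-second version of that one line would look like.  Both the delay and the event label change (the label, ‘15_seconds’ or ‘30_seconds’, is just a category name that shows up in your event reports, so renaming it simply keeps the reporting readable):

    setTimeout("_gaq.push(['_trackEvent', '30_seconds', 'read'])", 30000);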

Impact, and a Real Example
You might be wondering how this actually impacts your reporting.  Does adding that line of code really impact your bounce rate numbers?  Well, it does, and it can radically change them.  And that’s a good thing, since it gives you a closer look at actual bounce rate now that time on page is factored in.  Remember my example above about a user spending 15 minutes on a post that’s counted as a bounce?  That visit wouldn’t be a bounce if you added this code.  Let’s take a look at an example that demonstrates adjusted bounce rate.

Below, I’ve included the metrics for a page that was showing a 76.7% bounce rate.  Clearly, that’s not a great bounce rate, and it could be driving the webmaster to make changes to the content.  But check out the bounce rate after we started calculating adjusted bounce rate.  Yes, you are seeing that correctly.  It’s now only 28.5%.  That means 71.5% of users are either visiting other pages on the site or staying on the page for longer than 20 seconds (which is the time the company is using to calculate adjusted bounce rate).  And by the way, the new bounce rate is, relatively speaking, 62.8% lower than the original: (76.7 − 28.5) ÷ 76.7 ≈ 62.8%.  That’s a huge swing.

An example of adjusted bounce rate in Google Analytics

What This Means to You & How This Could Be Better
For marketers, bounce rate is an extremely important metric.  But you need an accurate number to rely on if you plan to make changes.  That’s why I wrote about actual bounce rate last summer.  Adjusted bounce rate enables you to add another ingredient to the standard bounce rates calculated by Google Analytics.  By factoring in a minimum time spent on the page, you can gain intelligence about how users are engaging with your content.  Are they hitting your page and immediately leaving, or are they at least spending 20 or 30 seconds reading the content?  There’s a big difference between the two (especially for SEO).

Understanding adjusted bounce rate can help you:

  • Refine the right pages on your site, and not just the ones that show a high standard bounce rate.
  • Better understand the quality of your top landing pages from organic search. Do you have a quality problem, or are users spending a good amount of time with that content?  Adjusted bounce rate can help you understand that.
  • Better understand the quality of campaign traffic.  Seeing a 95% bounce rate is a lot different than 25%.  Sure, you want conversions from campaign traffic, but engagement is important to understand as well.
  • Troubleshoot SEO problems related to Panda.  When a Panda update stomps on your site, you should analyze your content to determine what’s deemed “low quality”.  Adjusted bounce rate is a much stronger metric than standard bounce rate for doing this.

How Adjusted Bounce Rate Could Be Improved – Revealing Dwell Time
I would love to see dwell time added to Google Analytics somehow.  Dwell time is the amount of time someone spends on your page before hitting the back button to return to the search results.  Duane Forrester from Bing has explained that they use dwell time to identify low- and high-quality content.  As an SEO, imagine you could see which pages have high dwell time.  That would be incredible intelligence to use when trying to enhance the content on your site.

Summary – Adjusted is Closer to Actual
Again, I believe this is a great addition by the Google Analytics team.  Adjusted bounce rate can absolutely help you better understand the quality of content on your site, and the quality of traffic you are driving to your site.  I recommend adding the line of code to your Google Analytics snippet today, and then analyzing how your bounce rates change.  I have a feeling you’ll be surprised.

GG


Wednesday, August 8th, 2012

How To Use Index Status in Google Webmaster Tools to Diagnose SEO Problems

Index Status in Google Webmaster Tools

In late July, Google added Index Status to Webmaster Tools to help site owners better understand how many pages are indexed on their websites.  In addition, Index Status can help webmasters diagnose indexation problems, which can be caused by redirects, canonicalization issues, duplicate content, or security problems.  Until now, many webmasters relied on less-than-optimal methods for determining true indexation, like running site: commands against a domain, subdomain, or subdirectory.  This was a maddening exercise for many SEOs, since the number shown could radically change (and quickly).
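For reference, those queries look like this (example.com is a placeholder domain):

    site:example.com               (pages indexed across the entire domain)
    site:blog.example.com          (pages indexed on a specific subdomain)
    site:example.com/category/     (pages indexed within a subdirectory)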

So, Index Status was a welcome addition to Webmaster Tools.  That said, I’m getting a lot of questions about what the reports mean, how to analyze the data, and how to diagnose potential indexation problems.  That’s exactly what I’m going to address in this post.  I’ll introduce the reports and then explain how to use that data to better understand your site’s indexation.  Note, it’s important to understand that Index Status doesn’t necessarily answer questions.  Instead, it might raise red flags and prompt more questions.  Unfortunately, it won’t tell you where the indexation problems reside on your site.  That’s up to you and your team to figure out.

Index Status
The Index Status reports are under the “Health” tab in Google Webmaster Tools.  The default report (or “Basic” report) will show you a trending graph of total pages indexed for the past year.  This report alone can signal potential problems.  For most sites, you should see a steady increase in indexation over time.  For example, this is a normal indexation graph:

Basic Index Status Report in Webmaster Tools

But what about a trending graph that shows spikes and valleys?  If you see something like the graph below, it very well could mean you are experiencing indexation issues.  Notice how the indexation graph spikes, then drops, only to spike again.  There may be legitimate reasons why this is happening, based on changes you made to your site.  But if you have no idea why your indexation is spiking, further site analysis is required to understand what’s going on.  Once again, this is why SEO audits are so powerful.

Trending Spikes in Index Status Basic Reporting

Advanced Report
Now it’s time to dig into the advanced report, which definitely provides more data.  When you click the “Advanced” tab, you’ll see four trending lines in the graph.  The data includes:

  • Total Indexed
  • Ever Crawled
  • Not Selected
  • Blocked by robots

“Total indexed” is the same data we saw in the basic report.  “Ever crawled” shows the total number of pages ever crawled by Google (the cumulative total).  “Not selected” includes the total number of pages that have not been selected for indexing, either because they look extremely similar to other pages or because they redirect to other pages.  I’ll cover “Not selected” in more detail below.  And “Blocked by robots” is just that: pages that you are choosing to block.  Note, those are pages you are hopefully choosing to block…  More about that below.

Advanced Index Status Report in Google Webmaster Tools

What You Can Learn From Index Status
When you analyze the advanced report, you might notice some strange trending right off the bat.  For example, if you see the number of pages blocked by robots.txt spike, then you know someone added new directives.  One of my clients, for instance, had that number jump from 0 to 20,000+ URLs in a short period of time.  Again, if you want this to happen, then that’s totally fine.  But if it surprises you, then you should dig deeper.

Depending on how you structure a robots.txt file, you can easily block important URLs from being crawled and indexed.  It would be smart to analyze your robots.txt directives to make sure they are accurate.  Speak with your developers to better understand the changes that were made, and why.  You never know what you are going to find.
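As a hypothetical illustration (the paths here are made up), one overly broad Disallow directive is all it takes:

    User-agent: *
    Disallow: /search/   # intentional: blocks internal search result pages
    Disallow: /p         # too broad: also blocks /products/, /press/, /privacy, etc.

Since robots.txt rules match URLs by prefix, that second line blocks every URL beginning with /p, not just a single page.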

The Red Flag of “Not Selected”
If you notice a large number of pages that fall under “Not selected”, then that could also signal potential problems.  Note, depending on the type of website you have, it might be completely normal to see a larger number of “Not selected” pages than indexed pages.  It’s natural for Google to run into some redirects and non-canonical URLs while crawling your site.  And that’s especially the case with ecommerce sites or large publishers.

But that number should not be extreme…  For example, if you see the number of pages flagged as “Not selected” suddenly spike to 100K, when you only have 1,500 pages indexed, then you might have a new technical issue on your hands.  Maybe each page on your site is resolving at multiple URLs based on a coding change.  That would yield many “Not selected” pages.  Or maybe you implemented thousands of redirects without realizing it.  Those would fall under “Not selected” as well.
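As a hypothetical example, if a coding change lets the same page resolve at several URLs, adding a rel=canonical tag to each variation tells Google which version should be indexed (the URLs below are made up):

    <!-- The same content resolves at all of these URLs:
         http://www.example.com/products/widget
         http://www.example.com/products/widget?sessionid=123
         http://www.example.com/Products/Widget -->

    <!-- Adding this tag to the <head> of each variation consolidates them: -->
    <link rel="canonical" href="http://www.example.com/products/widget" />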

Security Breach
Index Status can also flag potential hacking scenarios.  If you notice the number of pages indexed spike or drop significantly, then it could mean that someone (or some bot) is adding or deleting pages from your site.  For example, someone might be adding pages to your site that link out to a number of other websites delivering malware.  Or maybe they are inserting rich anchor text links to other risky sites from newly created pages on your site.  You get the picture.

Again, these reports don’t answer your questions; they prompt you to ask more.  Take the data and speak with your developers.  Find out what has changed on the site, and why.  If you are still baffled, then have an SEO audit completed.  As you can guess, these reports would be much more useful if the problematic URLs were listed.  That would provide actionable data right within the Index Status reports in Google Webmaster Tools.  My hope is that Google adds that data some day.

Bonus Tip: Use Annotations to Document Site Changes
For many websites, change is a constant occurrence.  If you are rolling out new changes to your site on a regular basis, then you need a good way to document those changes.  One way of doing this is by using annotations in Google Analytics.  Using annotations, you can add notes for a specific date that are shared across users of the GA profile.  I use them often when changes are made SEO-wise.  Then it’s easier to identify why certain changes in your reporting are happening.  So, if you see strange trending in Index Status, then double check your annotations.  The answer may be sitting right in Google Analytics.  :)

Adding Annotations in Google Analytics

Summary – Analyzing Your Index Status
I think the moral of the story here is that normal trending can indicate strong SEO health.  You want to see gradual increases in indexation over time.  That said, not every site will show that natural increase.  There may be spikes and valleys as technical changes are made to a website.  So, it’s important to analyze the data to better understand the number of pages that are indexed, how many are being blocked by robots.txt, and how many are not selected due to redirects or canonical issues.  What you find might be completely expected, which would be good.  But you might uncover a serious issue that’s inhibiting important pages from being crawled and indexed.  And that can be a killer SEO-wise.

GG