Archive for the ‘web-analytics’ Category

Wednesday, April 16th, 2014

I’m Speaking at the Weber Shandwick Data Salon on April 24th – Learn About Google Algorithm Updates, Manual Penalties, and More

Weber Shandwick Data Salon on April 24, 2014

I’m excited to announce that I’ll be speaking at the Weber Shandwick Data Salon on Thursday, April 24th in New York City (from 6:00PM to 7:30PM).  Each month, Weber Shandwick invites leaders from various areas of digital marketing to speak, spark conversation, and share ideas.  I’m thrilled to be presenting next week on the latest in SEO.

My presentation will cover some extremely important topics that I’m neck deep in on a regular basis, including Google algorithm updates, manual penalties, and the war for organic search traffic that’s going on each day.  I’ll introduce various algorithm updates like Panda and Penguin, explain what manual actions are, and provide case studies along the way.  I’ll also cover the approach Google is using to fight webspam algorithmically, how manual penalties work, how to recover from them, and how to ensure websites stay out of the danger zone.

My goal is to get the audience thinking about content quality, webspam, unnatural links, and webmaster guidelines now, before any risky tactics being employed can get them in trouble.  Unfortunately, I’ve spoken with hundreds of companies over the past few years that were blindsided by algo updates or manual actions simply because they never thought about the repercussions of their tactics, didn’t understand Google’s stance on webspam, or weren’t aware of the various algorithm updates it was crafting.  Many of them learned too late the dangers of pushing the envelope SEO-wise.

So join me next Thursday, April 24th at 6PM for a deep dive on algorithm updates, manual penalties, and more from the dynamic world of SEO.  You can register today via the following link:

Register for Weber Shandwick’s Data Salon on April 24th:
https://www.surveymonkey.com/s/N6G5K5B

Below I have provided the session overview.  I hope to see you there!

Weber Shandwick Data Salon #3
April 24, 2014 from 6:00PM to 7:30PM
Speaker: Glenn Gabe of G-Squared Interactive
Moderator: Kareem Harper of Weber Shandwick
909 Third Avenue, 5th Floor
*Refreshments will be available starting at 6:00pm
  


Front Lines of SEO

The Frontlines of SEO – Google Algorithm Updates, Penalties, and the Implications for Marketers
Explore Google’s war on webspam, learn about key changes and updates occurring in Search right now, and fully understand the implications for digital marketers.

There’s a battle going on every day in Search that many people aren’t aware of.  With millions of dollars in revenue on the line, some businesses are pushing the limits of what’s acceptable from an SEO perspective.  In other words, they’re gaming Google’s algorithm to gain an advantage in the search results.

Google, with its dominant place in Search, is waging war against tactics that attempt to manipulate its algorithm.  From crafting specific algorithm updates that target webspam to applying manual actions to websites, Google has the ability to impact the bottom line of many businesses across the world.  And that includes companies ranging from large brands to small local businesses.  This session will introduce the various methods Google is using to address webspam in order to keep its search results as pure as possible. Specific examples will be presented, including case studies of companies that have dealt with algorithm updates like Panda and Penguin.  Manual penalties will be discussed as well.

Beyond battling webspam, the major search engines have been innovating at an extremely rapid pace.  The smartphone and tablet boom has impacted how consumers search for data (and how companies can be found).  And now the wearable revolution has begun, which will add yet another challenge for marketers looking to reach targeted audiences.   Glenn will introduce several of the key changes taking place and explain how marketers can adapt.  Glenn is also a Glass Explorer and will provide key insights into how Google Glass and other wearables could impact marketing and advertising.

Register today to learn more about Google’s war on webspam, to better understand the future of Search, and to prepare your business for what’s coming next.

You can register online by clicking the following link:
https://www.surveymonkey.com/s/N6G5K5B

 

 

Wednesday, December 18th, 2013

Panda Report – How To Find Low Quality Content By Comparing Top Landing Pages From Google Organic

Top Landing Pages Report in Google Analytics

Note, this tutorial works in conjunction with my Search Engine Watch column, which explains how to analyze the top landing pages from Google Organic prior to, and then after, Panda arrives.  With the amount of confusion circling Panda, I wanted to cover a report webmasters can run today that can help guide them down the right path while on their hunt for low-quality content.

My Search Engine Watch column covers an overview of the situation, why you would want to run the top landing pages report (with comparison), and how to analyze the data.  And my tutorial below covers how to actually create the report.  The posts together comprise a two-headed monster that can help those hit by Panda get on the right track.   In addition, my Search Engine Watch column covers a bonus report from Google Webmaster Tools that can help business owners gather more information about content that was impacted by the mighty Panda.

Why This Report is Important for Panda Victims
The report I’m going to help you create today is important, since it contains the pages that Google was ranking well and driving traffic to prior to a Panda attack.  And that’s where Google was receiving a lot of intelligence about content quality and user engagement.  By analyzing these pages, you can often find glaring Panda-related problems.  For example, thin content, duplicate content, technical problems causing content issues, low-quality affiliate content, hacked content, etc.  It’s a great way to get on the right path, and quickly.

There are several ways to run the report in Google Analytics, and I’ll explain one of those methods below.  And remember, this should not be the only report you run… A well-rounded analysis can help you identify a range of problems from a content quality standpoint.  In other words, pages not receiving a lot of traffic could also be causing Panda-related problems.  But for now, let’s analyze the top landing pages from Google Organic prior to a Panda hit (which were sending Google the most data before Panda arrived).

And remember to visit my Search Engine Watch column after running this report to learn more about why this data is important, how to use it, red flags you can identify, and next steps for websites that were impacted.  Let’s get started.

How To Run a Top Landing Pages Report in Google Analytics (with date comparison): 

  • First, log into Google Analytics and click the “All Traffic” tab under “Acquisition”.  Then click “Google / Organic” to isolate that traffic source.
    Accessing Google Organic Traffic in Google Analytics
  • Next, set your timeframe to the date after Panda arrived and extend that for a decent amount of time (at least a few weeks if you have the data).  If time allows, I like to set the report to 4-6 weeks after Panda hit.  If this is right after an algorithm update, then use whatever data you have (but make sure it’s at least one week).  I’m using a date range after the Phantom update hit (which was May 8th).
    Setting a Timeframe in Google Analytics
  • Your next move is to change the primary dimension to “Landing Page” to view all landing pages from Google organic search traffic.  Click the “Other” link next to “Primary Dimension” and select “Acquisition”, and then “Landing Page”.  Now you will see all landing pages from Google organic during that time period.
    Primary Dimension to Landing Page in Google Analytics
  • Now let’s use some built-in magic from Google Analytics.  In the timeframe calendar, you can click a checkbox for “Compare to” and leave “Previous period” selected.  Once you click “Apply”, you are going to see all of the metrics for each landing page, but with a comparison of the two timeframes.  And you’ll even have a nice trending graph up top to visualize the Panda horror.
    Comparing Timeframes in Google Analytics
  • As you start going down the list of urls, pay particular attention to the “% Change” column.  Warning, profanity may ensue.  When you start seeing pages that lost 30%, 40%, 50% or more traffic when comparing timeframes, then it would be wise to check out those urls in greater detail.  Again, if Google was sending a lot of traffic to those urls, then it had plenty of user engagement data from those visits.  You might just find that those urls are seriously problematic from a content quality standpoint.
    Viewing The Percent Change in Traffic in Google Analytics
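The math behind the “% Change” column above can be sketched in a few lines of Python. This is just an illustration of the comparison the report performs; the landing pages and visit counts below are hypothetical, not from a real account:

```python
# Sketch: comparing Google organic landing-page traffic across two periods,
# mirroring the "Compare to previous period" view in Google Analytics.

def percent_change(previous, current):
    """Percent change from the previous period to the current one.
    Negative values indicate a drop in traffic."""
    return (current - previous) / previous * 100

# visits per landing page: (pre-Panda period, post-Panda period)
landing_pages = {
    "/category/widgets/": (5450, 640),
    "/guides/how-to/": (1200, 1150),
}

for url, (pre, post) in landing_pages.items():
    change = percent_change(pre, post)
    flag = "  <-- investigate" if change <= -30 else ""
    print(f"{url}: {change:.1f}%{flag}")
```

A page that fell from 5,450 to 640 visits shows roughly an 88% drop, which is exactly the kind of url worth digging into.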

 

Bonus 1: Export to Excel for Deeper Analysis

  • It’s ok to stay within Google Analytics to analyze the data, but you would be better off exporting this data to Excel for deeper analysis.  If you scroll to the top of the Google Analytics interface, you will see the “Export” button.  Click that button and then choose “Excel (XLSX)”.  Once the export is complete, it should open in Excel.  Navigate to the “Dataset” worksheet to see your landing page data (which is typically the second worksheet).
    Exporting A Report In Google Analytics
  • At this point, you should clean up your spreadsheet by deleting columns that aren’t critical for this report.  Also, you definitely want to space out each column so you can see the data clearly (and the data headers).
    Clean Up Google Analytics Export in Excel
  • You’ll notice that each url has two rows, one for the current timeframe, and one for the previous timeframe.  This enables you to see all of the data for each url during both timeframes (the comparison).
    Two Rows For Each URL Based on Timeframe
  • That’s nice, but wouldn’t it be great to create a new column that shows the percentage decrease or increase in visits (like we saw in Google Analytics)?  Maybe even with highlighting to show steep decreases in traffic?  Let’s do it.  Create a new column to the right of “Visits” and before “% New Visits”.  I would title this column “% Change” or something similar.
    Creating a New Column for Percent Change in Excel
  • Next, let’s create a formula that provides the percentage change based on the two rows of data for each url.  Find the “Visits” column and the first landing page url (which will have two rows).  Remember, there’s one row for each timeframe.  If your visits data is in column C, then the post-Panda data is in row 2, and the pre-Panda data is in row 3 (see screenshot below).  You can enter the following formula in the first cell of the new “% Change” column: =(C3-C2)/C3.  Again, C3 is the traffic level from the previous timeframe, C2 is the traffic level from the current timeframe (after the Panda hit), and you divide by the previous traffic level to come up with the percentage change.  For example, if a url dropped from 5,450 visits to 640 visits, then your percentage drop would be 88%.  And yes, you would definitely want to investigate that url further!
    Creating a Formula to Calculate Percent Change in Excel
  • Don’t worry about the floating decimal point.  We’ll tackle that soon.  Now we need to copy that formula to the rest of the column (but by twos).  Remember, we have two records for each url, so you’ll need to highlight both cells before double clicking the bottom right corner of the second cell to copy the formula to all rows.  Once you do, Excel automatically copies the two rows to the rest of the cells in that column.  Now you should have percentage drops (or increases) for all the urls you exported.  Note, you can also highlight the two cells, copy them, and then highlight the rest of that column, and click paste.  That will copy the formula to the right cells in the column as well.
    Copying a Formula to All Rows in Excel
  • Now, you will see a long, floating decimal point in our new column labeled “% Change”.  That’s an easy fix, since we want to see the actual percentage instead.  Highlight the column, right click the column, and choose “Format Cells”.  Then choose “Percentage” and click “OK”.  That’s it.  You now have a column containing all top landing pages from Google organic, with the percentage drop after the Panda hit.
    Formatting Cells in Excel
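For those who prefer scripting to spreadsheets, the same two-rows-per-url calculation can be sketched in Python. The rows below are hypothetical stand-ins for the exported “Dataset” worksheet (current-period row first, then previous-period row, matching the export layout described above):

```python
# Sketch of the Excel "% Change" step in plain Python: the export has two
# rows per url, and the drop is computed as (previous - current) / previous,
# matching the =(C3-C2)/C3 formula from the tutorial.

rows = [
    # (landing_page, visits) -- current period row, then previous period row
    ("/category/widgets/", 640),   # current (post-Panda)
    ("/category/widgets/", 5450),  # previous (pre-Panda)
    ("/guides/how-to/", 1150),
    ("/guides/how-to/", 1200),
]

def percent_drop(current, previous):
    """Positive result = traffic lost; mirrors the Excel formula (C3-C2)/C3."""
    return (previous - current) / previous

# walk the rows two at a time, like copying the formula "by twos" in Excel
results = {}
for i in range(0, len(rows), 2):
    url, current = rows[i]
    _, previous = rows[i + 1]
    results[url] = percent_drop(current, previous)

for url, drop in results.items():
    print(f"{url}: {drop:.0%} drop")
```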

 

Bonus 2: Highlight Cells With A Steep Drop in Red

  • If you want the data to “pop” a little more, then you can use conditional formatting to highlight cells that exceed a certain percentage drop in traffic.  That can easily help you and your team quickly identify problematic landing pages.
  • To do that, highlight the new column we created (titled “% Change”), and click the “Conditional Formatting” button in your Home tab in Excel (located in the “Styles” group).  Then select, “Highlight Cells Rules”, and then select, “Greater Than”.  When the dialog box comes up, enter a minimum percentage that you want highlighted.  And don’t forget to add the % symbol!  Choose the color you want to highlight your data with and click “OK”.  Voila, your problematic urls are highlighted for you.  Nice.
    Applying Conditional Formatting in Excel
    Applying Conditional Formatting by Percentage in Excel

 

Summary – Analyzing Panda Data
If you made it through this tutorial, then you should have a killer spreadsheet containing a boatload of important data.  Again, this report will contain the percentage increase or decrease for top landing pages from Google Organic (prior to, and then after, a Panda hit).  This is where Google gathered the most intelligence based on user engagement.  It’s a great place to start your analysis.

Now it’s time to head over to my Search Engine Watch column to take a deeper look at the report, what you should look for, and how to get on the right track with Panda recovery.  Between the tutorial and my Search Engine Watch column, I hope to clear up at least some of the confusion about “content quality” surrounding Panda updates.  Good luck.

GG

 

 

Monday, December 17th, 2012

Trackbacks in Google Analytics – How To Analyze Inbound Links in GA’s Social Reports

Trackbacks in Google Analytics

In May of 2012, Google Analytics introduced trackbacks in its social reporting.  If you’re not familiar with trackbacks, they enable you to understand when another website links to your content.  So, using Google Analytics, and the new trackbacks reporting, you could start to track inbound links you are building from across the web.

Note, if you want to perform advanced-level analysis of your links, you should still use more robust tools like Open Site Explorer or Majestic SEO.  But, trackbacks reporting is a quick and easy way to identify backlinks, and right within Google Analytics.  It can definitely supplement your link analysis efforts.

If you’re in charge of content strategy for your company, or if you are publishing content on a regular basis, then checking trackbacks reporting in GA can quickly help you understand the fruits of your labor.  But since trackbacks reporting isn’t immediately visible, I’ve written this post to explain how you can find trackbacks, and then what you can do with the data once you access the reporting.

Social Reports and Trackbacks
First, if you’re not familiar with social reporting in Google Analytics, you should check out my post from March where I cover how to use the new social reports to analyze content.  Social reports are a great addition to GA, but I still find many marketers either don’t know about them, or don’t know how to use them.  And that’s a shame, since they provide some great insights about the traffic coming from social networks, and the conversations going on there (at least for data hub partners).

Below, I’m going to walk you step by step through the process of finding links to your content via trackbacks reporting.  Once we find them, I’ll explain what you can do with your newly-found link data.

How To Find Trackbacks (Step by Step)
1. Access your Google Analytics reporting, and click “Traffic Sources”, “Social”, and then “Network Referrals”.

Trackback Reporting in Google Analytics

2. Next, click a network referral in the list like Google Plus, Twitter, Facebook, etc. Note, “Network Referral” is new language used by Google Analytics for “Social Network” or “Source”.

Network Referrals in Google Analytics

3. Once you click through a source, you should click the “Activity Stream” tab located near the top of the screen (right above the trending graph).

Activity Stream in Google Analytics Social Reports

4. Once you click the activity stream tab, you’ll need to click the dropdown arrow next to the “Social Network” label at the very top of the screen.  Once you do, you’ll see a link in that list for “Trackbacks”.  Click that link.

Finding Trackbacks in Google Analytics

5. Once you click the “Trackbacks” link, you will see the links to your content that Google Analytics picked up.

Viewing Trackbacks in Google Analytics Social Reports

Congratulations, you found the hidden treasure of trackbacks in Google Analytics!  Not the easiest report to find, is it?  Now let’s find out what you can do with the data.

What You Can Do Once You Find Trackbacks
First, I’ll quickly cover the data provided in the trackbacks reporting.  Google Analytics provides the following information for each trackback it picks up:

  • The date the trackback was picked up.
  • The title and URL of the page linking to your content.
  • The ability to launch and view your content that’s receiving the link.
  • And a quick way to isolate that content in your social reports (to view all social activity for that specific page).

Next, I’ll cover four ways you can benefit from analyzing trackbacks data in Google Analytics, including a bonus at the end.  Let’s jump in.

1. Understand the Source of the Trackback (Who Is Linking to You)
Linkbuilding is hard.  So when your content builds links naturally, you definitely want to understand the source of those links.  Trackbacks in Google Analytics provides an easy and quick way to identify links to your content.  But once you build some links, you shouldn’t stop and have a tropical drink with a fancy umbrella as you admire your results.  You should analyze your newly-found inbound links.

For example, you should determine whether the links are strong and relevant, and how much they will help your SEO efforts.  You should also determine which authors decided to link to you, what their backgrounds are, and where else they write.

One of the first things you’ll see in trackbacks reporting is the title and URL of the page linking to your content.  At this point, you can click the small arrow icon next to the URL to open the referring page in a new window.  You can also click the “More” button on the right side of the page, and then click “View Activity” to be taken to the page linking to your content.

Viewing Trackbacks in Google Analytics

At this point, you can check out the article or post linking to you, understand who wrote the content, what they focus on, link to their social accounts, find their contact information, etc.  Building relationships with quality authors in your niche is a great way to earn links down the line.  Therefore, analyzing the people who already link to your content is low-hanging fruit.  Trackbacks in GA make it easy to find them.

2. Understand Your Content That’s Building Links
When I’m working with content teams, I always get the question, “what should we write about?”  I’m a big believer that a content generation plan should be based on data, and not intuition.  And trackbacks provide another piece of data to analyze.  Let’s face it, the proof is in the pudding from a linkbuilding standpoint.  Either your content builds links or it doesn’t.  If it does, you need to find out why that content built the links it did.  And if it didn’t build links, you need to document that and make sure you don’t make the same mistake again.

As I mentioned earlier, there are some outstanding link analysis tools on the market, like Open Site Explorer and Majestic SEO, and I’m not saying that trackbacks in Google Analytics are the end-all.  But, you can definitely use the reporting to quickly understand which content is building links.

Once you find trackbacks and identify the content that built those links, you can start to analyze and understand what drove interest.  Was it breaking news, evergreen content, how-to’s, industry analysis, etc?  Which topics were hot from a linkbuilding standpoint, and were those the topics you expected to build links?  If you find a subject that worked well in the past, you can build a plan for expanding on that topic.  Also, are the pages linking to you providing ideas for new posts?  Do the comments on the page provide ideas, what did the author mention, etc?  Trackbacks provide a mechanism for supplementing your analysis.

3. Join the Conversation, Engage Influencers
I explained above how you can find the people (and websites) linking to your content.  That’s great, but you shouldn’t stop there.  If there’s a conversation happening on that referring page, then you should join the conversation.  If someone went to the extent to mention and link to your content, the least you can do is thank them, and provide value to the conversation.

Adding value to the conversation and engaging a targeted audience can help you build more credibility and connect with targeted people in your niche.  And as I mentioned above, you can connect with the author of the post via email or via their social accounts.

4. Understand Linkbuilding Over Time
Using the trending graph in Google Analytics, you can visually understand linkbuilding over time.  The graph at the top of the screen will show you the number of trackbacks earned over the time period you have selected in GA.  I’m not saying that it’s better than using other, dedicated link analysis tools, but this is a quick way to find link data right within Google Analytics.

Trackbacks Trending in Google Analytics

In addition, if you click the “More” button for any specific trackback, and then click “Page Analytics”, you can isolate specific pieces of content receiving links.  Note, I’ve been seeing a test in Google Analytics where “Page Analytics” is replaced by “Filter on this Page”.  Personally, I like “Filter on this Page” since it’s more intuitive.  Regardless, after clicking the link you can trend linkbuilding over time for a specific piece of content.

Viewing Trackbacks for a Specific Page

In addition, you can always compare timeframes to see how links were built during one timeframe versus another.  You might find some interesting things, like a piece of content that built more inbound links months after it was first published.  Then you can dig into the links to find out why…

Bonus: Export The Data!
As with any report in Google Analytics, you can easily export trackbacks data.  If you are viewing any trackbacks report, you can click “Export” at the top of the screen, and then choose a format to quickly export the data for further analysis in Excel.  Then you can slice and dice the data, combine data from other reports, etc.  What you do with the data depends on your own Excel skills.  :)

Exporting Trackback Data in Google Analytics
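As one sketch of what that post-export slicing and dicing might look like, here’s a small Python example that counts trackbacks by referring domain. The rows are hypothetical stand-ins for an exported trackbacks report:

```python
# Sketch: tallying exported trackbacks by referring domain to see which
# sites link to your content most often.
from collections import Counter
from urllib.parse import urlparse

# hypothetical (date, linking page URL) rows from an exported report
exported_trackbacks = [
    ("2012-12-01", "http://example-blog.com/post-about-seo"),
    ("2012-12-03", "http://example-blog.com/another-post"),
    ("2012-12-05", "http://news.example.org/industry-roundup"),
]

by_domain = Counter(urlparse(url).netloc for _, url in exported_trackbacks)

for domain, count in by_domain.most_common():
    print(f"{domain}: {count} trackback(s)")
```

The same approach works for trending by week or month — just count by date instead of domain.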

Summary – Quick Link Analysis in Google Analytics
I hope after reading this post you’re ready to jump into Google Analytics to hunt down trackbacks.  Again, Google didn’t necessarily make it super-easy to find trackbacks, but they are there.  Once you do find them, you can analyze those links to glean important insights that can help your future content and linkbuilding efforts.  Although there are more robust link analysis solutions on the market, trackbacks reporting is a quick and easy way to identify and then analyze inbound links.  I recommend checking out the reporting today.  You never know what you’ll find.  :)

GG

 

Tuesday, November 27th, 2012

How Google Analytics *Really* Handles Referring Traffic Sources [Experiment] – Why Clicks and Visits Might Not Match Up

Google Analytics Referrals

Let me walk you through a common scenario in web marketing.  You have a website, and some people visit your site by clicking through links on other websites.  In your web analytics reporting, those visits are categorized as referring visits.  In Google Analytics specifically, those visits show up in your “Referrals” report under “Traffic Sources”.  And when visitors click on an outbound link on your site (a link to another website), your site shows up as a referring source in that website’s referrals report.

That’s pretty straightforward, but what I’m about to cover isn’t.  I find many marketers and webmasters don’t understand how Google Analytics handles that referring traffic during future visits to their websites.  For example, what happens if someone clicks through to your site from sampledomain.com, leaves your site, and then returns the next day?

Do you know how that visit will be categorized in Google Analytics?  There’s a good chance you don’t, and I’m going to cover the topic in detail in this post.

Understanding Referring Visits is Important When Revenue and Cost Are Involved
I believe one of the reasons this topic isn’t understood very well is that it often doesn’t directly impact revenue or cost.  Sure, you definitely want to know how many people are coming from each referring site, but for many webmasters, the exact number doesn’t impact revenue, or payments to other webmasters.

But, for websites that need to track the monetary value of inbound visits and outbound clicks, accurately determining referring visits is extremely important.  For example, imagine you were charging certain partners for traffic you were sending from your site to theirs, or vice versa.  The fact of the matter is that checking referring sources could show different numbers than you think, and could be much different than the outbound clicks you see.  And depending on your own situation, the numbers could be way off…

The Core Disconnect – How Google Analytics Calculates Visits from Referring Sources (or any campaign, search visit, etc.)
Here’s the core disconnect.  When someone clicks through to your site via a referring source, the __utmz cookie is updated with traffic source information.  That cookie will not be overwritten unless another referring source or campaign takes its place.  Direct Traffic will not overwrite this value.  Let me say that again.  Direct Traffic will not overwrite the __utmz cookie value.  That means the __utmz value will remain the referring source of traffic when those visitors return to the site.

Google Analytics __utmz Cookie
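For the curious, here’s a rough sketch of what’s stored inside that cookie. The sample value below is made up, but the field layout (four dot-separated numeric fields followed by pipe-delimited campaign data, where utmcsr is the source and utmcmd is the medium) is the classic first-party GA cookie format:

```python
# Sketch: decoding a __utmz cookie value to see the traffic source that
# Google Analytics will keep reporting for return direct visits.

def parse_utmz(cookie_value):
    """Return the campaign fields stored in a __utmz cookie value."""
    # the campaign data follows four dot-separated numeric fields
    campaign_part = cookie_value.split(".", 4)[4]
    return dict(pair.split("=", 1) for pair in campaign_part.split("|"))

# hypothetical cookie value for a referral from sampledomain.com
utmz = ("173272373.1356000000.2.1."
        "utmcsr=sampledomain.com|utmccn=(referral)|"
        "utmcmd=referral|utmcct=/some-page")

fields = parse_utmz(utmz)
print(fields["utmcsr"])  # the stored referring source
print(fields["utmcmd"])  # the stored medium
```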


What This Means To You

I know what you’re thinking. This guy is telling me about __utmz cookies?? What the heck does that mean to me?  OK, stick with me for a second.  Let’s say you had a partnership set up where another website pays you for traffic.  Maybe you’re both in the same niche and want to leverage each other’s traffic for more exposure.  You check your stats for the previous month and notice that you sent 500 visits to partner A.  Cool, so you contact them to check how the partnership is going and to make sure they are seeing the same number of visits.  They come back and say they’ve seen 700 visits from your site and thank you for the traffic.  The check will be cut soon.

Google Analytics Clicks and Visits Could Be Off

But that 200 visit discrepancy is bothering you.  Why is there a big difference between your partner’s reporting and the numbers you are seeing?  And let’s assume you have a solid setup for tracking clicks out of your website.  For example, maybe you are running outbound clicks to partners through a redirect that captures a number of important metrics.  The redirect then sends the visitor off to the correct URL on the partner site.  Basically, you know you are capturing all outbound clicks to the partner website.

This is where the native handling of referring sources in Google Analytics comes into play.  Sure, you are tracking clicks off your site, but your partner’s analytics package is capturing those clicks plus any return visits that are direct visits.  So, if someone clicks through to your partner’s site, then that’s one visit.  If they leave that site, and return directly (by typing the url directly in their browser or via a bookmark), then the visit will show up as a visit from the original referring source (your website).  That’s now two visits.  And if they do it again, that will be three visits.  That’s until another referring source or campaign overwrites the __utmz cookie.  In this example, there were 3 visits to your 1 outbound click!

An Example of How Google Analytics Handles Referrals

Based on this simple example, you can easily see how over a month’s time, some people would click through to your partner’s site and then revisit their site directly (and possibly a few times).  That would lead to more than one visit per user, and could sway the visit count from your website.
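The inflation effect is easy to simulate. The sketch below applies the attribution rule described above (a direct visit inherits the last non-direct source) to a couple of hypothetical visitor journeys:

```python
# Sketch: simulating how last-non-direct attribution inflates "visits from
# your site" relative to the clicks you actually sent.

def attribute_visits(visit_sequences):
    """Count visits credited to each source when direct visits inherit
    the previously stored non-direct source."""
    credited = {}
    for visits in visit_sequences:
        last_source = "(direct)"
        for source in visits:
            if source != "(direct)":
                last_source = source  # a new referrer overwrites the cookie
            credited[last_source] = credited.get(last_source, 0) + 1
    return credited

sequences = [
    ["yoursite.com", "(direct)", "(direct)"],  # 1 click, 3 credited visits
    ["yoursite.com", "google / organic"],      # search overwrites the cookie
]
credited_totals = attribute_visits(sequences)
print(credited_totals)
```

Two outbound clicks from yoursite.com end up credited with four visits in the partner’s reporting — the discrepancy the scenario above describes.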

Still confused?  Let me clear this up via an experiment below.

Experiment – Calculating Referring Visits in Google Analytics
In the following simple example, I set up a webpage on a second domain that links to a landing page I set up on my website just for this experiment.  I didn’t want to skew the reporting by using an existing page on my site that gets a lot of visits.  Then I used several computers I have here with clean browsers to first visit the referring page that links to my new landing page, and then I clicked through.  The referring source should show up as the domain name of the referring site.  That would be visit #1.

Next, I would leave the new landing page on my site and revisit my website later by typing the exact URL into my browser (what most people would think is a Direct Traffic visit).  In theory, the referring site should show up as the traffic source, even though I’m entering the site as “Direct Traffic”.  Remember, the __utmz cookie will only be overwritten by another referring source or campaign.

Last, I would search for a keyword that my site ranks for, and then click through to the site.  And since this visit was from a search engine, the __utmz cookie would be updated with this new value, and my reporting would show Google as the referring source (along with the keyword I entered).  Let’s find out the results of the experiment below.

The Results
1. First Visit

First, I visited the second domain and clicked through to my website.  Here is the first referring visit showing up in my analytics package:
Referring Sites Experiment - First Visit

2. Second Visit (Directly Visiting the Site)
Next, I left the site and returned via Direct Traffic.  Google Analytics shows the referring site as the source for this traffic, even though I entered via “Direct Traffic”. Also notice it accurately categorizes me as a “return visitor”:
Google Analytics Referral Experiment - Second Visit

3. Third Visit (Again Directly Visiting the Site, but the Next Day)
Just to underscore my point, I left and revisited the site the next day (again via Direct Traffic).  Google Analytics again shows the visit is from the initial referring source:
Google Analytics Referral Experiment - Third Visit

4. Fourth Visit, This Time From Search
Finally, I searched for my name on Google and visited my website.  Now Google Analytics shows the keyword that led to the site (from the traffic source “Google”).  Remember, the __utmz cookie will only be updated when another referring source is identified (versus Direct Traffic).
Google Analytics Referral Experiment - Search Visit

 

So there you have it.  Proof that your visit count by source may not be what you think it is.  Now, if you’re reading this post and are either generating revenue from referring visits or paying partners based on visits, then you might be frantically running to Google Analytics to rerun your reports.  Yes, this could impact things quite a bit.  I’ll leave it up to you how you handle the situation. :)

What You Can Do – The Importance of Clarity
If you do have a partnership where you are either generating revenue by driving traffic, or you are paying for traffic from other websites, then each party needs to clearly understand the arrangement.  Each website involved needs to be clear on the definitions of “traffic”, “clicks”, “visits”, etc.  For example, think about AdWords for a second.  You pay Google for clicks on ads, but don’t pay Google for direct visits back to your site (even though those visits will show up as campaign visits).  And by the way, most partners will not give you access to their reporting anyway… Therefore, you will only know the clicks out from your site.

If you are tracking outbound clicks, you can use event tracking in Google Analytics to track those clicks, including the pages or links where those clicks originate.  If you don’t want to use event tracking, then you can run outbound clicks through a 302 redirect and capture the information you need to accurately track clicks.  If you are receiving traffic, then you can make sure the referring links contain query string parameters so you can understand which partner the traffic is coming from (and that it’s not a standard referral from the site).  There are other ways to handle this, and those are just a few ideas.
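
For the event tracking route, here is a rough sketch using the classic (ga.js) _trackEvent call.  The category and action names are my own examples, not required values:

```javascript
// Hypothetical helper that builds a classic (ga.js) event-tracking
// payload for an outbound click.
function outboundClickEvent(url) {
  return ['_trackEvent', 'Outbound', 'Click', url];
}

// In the page, you would push the payload when a link is clicked:
// link.onclick = function () { _gaq.push(outboundClickEvent(this.href)); };
```

Once those events are flowing, the Event Tracking reports show exactly which outbound links are being clicked, and from where.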

Summary – Understanding Visits in Google Analytics
I hope you found this post explaining how Google Analytics handles referring visits helpful.  I know this topic can be confusing, and experiments always help clear up some of the confusion.  So now you know why visits might be higher or lower than you think, and how the __utmz cookie controls what shows up in your reporting.  I bet you’ll never look at referring sources the same way again.

And let’s hope you’re not on the short end of the stick. :)

Happy Reporting.

GG

 

Wednesday, November 14th, 2012

Hunting False Negatives – How To Avoid False Negatives When Checking Redirects After a Website Redesign or Migration [Screaming Frog Tutorial]

How To Check Redirects Using Screaming Frog

Every webmaster has to deal with a website redesign or migration at some point.  And redesigns and migrations often mean that your URL structure will be impacted.  From an SEO perspective, when URLs need to change, it’s critically important that you have a solid 301 redirection plan in place.  If you don’t, you can pay dearly SEO-wise.

I wrote a post for my Search Engine Journal column last spring titled “How to Avoid SEO Disaster During a Website Redesign”, and implementing a 301 redirection plan was one of the most important topics I covered.  I find many webmasters and marketers don’t understand how SEO power is built URL by URL.  As your URLs build up inbound links and search equity, it’s important that those URLs maintain those links and equity.  If you change those URLs, you must notify the search engines where the old content moved to, and that’s where 301 redirects come into play.

So, when you change URLs, you run the risk of losing all of the links pointing to the older URLs, and the search power those URLs contained.  That is, unless you 301 redirect the old URLs to the new ones.  A 301 redirect safely passes PageRank from an old URL to a new one (essentially maintaining its search equity).

Unfortunately, I’ve seen many companies either not set up a redirection plan at all, or botch the plan.  That’s when they end up with a catastrophic SEO problem.  Rankings drop quickly, traffic drops off a cliff, sales drop, and nobody is happy at the company (especially the CMO, CFO, and CEO).

Traffic Drop After Website Redesign

Meet the False Negative Redirect Problem, A Silent Killer During Redesigns or Migrations:
Needless to say, properly setting up your redirects is one of the most important things you can do when redesigning or migrating your website.  That said, even if you address redirects and launch the new site, how do you know that the redirects are in fact working?  Sure, you could manually check some of those URLs, but that’s not scalable.  In addition, just because an older URL 301 redirects to a new URL doesn’t mean it redirects to the correct URL.  If you don’t follow through and check the destination URL (where the redirect is pointing), then you really don’t know if everything is set up properly.

This is what I like to call the False Negative Redirect Problem.  For SEOs, a false negative occurs when your test incorrectly shows that the redirects are working properly (they don’t test positive for errors), when in fact, the destination URLs might not be resolving properly.  Basically, your test shows that the redirects are ok, when they really aren’t.  Checking only the header response code for the old URLs can trick webmasters into believing the redesign or migration has gone well SEO-wise, when in reality, the destination URLs could be 404’ing or throwing application errors.  It’s a silent killer of SEO.

False Negatives can be a Silent SEO Killer

How To Avoid the Silent SEO Killer When Implementing Redirects
The false negative problem I mentioned above is especially dangerous when changing domain names (where you will often implement one directive in .htaccess or ISAPI_Rewrite that takes any request for a URL at one domain and redirects it to the same URL at another domain).  Just because it 301s doesn’t mean the correct URL resolves.  Think about it, that one directive will 301 every request… but you need to check the destination URL to truly know if the redirects are working the way you need them to.  Unfortunately, many SEOs only check that the old URLs 301, but they don’t check the destination URL.  Again, that could be a silent killer of SEO.
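
To make that concrete, a catch-all domain redirect in .htaccess often looks something like the following sketch (the domain names are placeholders, and you should test any directive like this on a staging server first):

```apache
# Redirect every request for olddomain.com to the same URL at
# www.newdomain.com with a 301 (permanent) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

Note that this directive happily 301s every request, including requests for URLs that don’t exist on the new domain.  That’s exactly why the destination URLs must be checked.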

Screaming Frog Hops to the Rescue
I mentioned “scalable” solutions earlier.  Well, Screaming Frog provides a scalable solution for checking redirects during a migration or website redesign.  Note, Screaming Frog is a paid solution, but well worth the $157 annual fee.  Using Screaming Frog, you can import a list of old URLs from your analytics package or CMS and have it crawl those URLs and provide reporting.  Running a two-step process for checking redirects and destination URLs can help you understand if your redirects are truly working.  For example, you might find redirects that lead to 404s, application errors, etc.  Once you find those errors, you can quickly fix them to retain search equity.

Below, I’m going to walk you through the process of exporting your top landing pages from Google Analytics and checking them via Screaming Frog to ensure both that the redirects are working and that the destination URLs are resolving correctly.  Let’s get started.

What You’ll Need and What We’ll Be Doing

  • First, we are going to export our top landing pages from Google Analytics.
  • Second, we’ll use the CONCATENATE function in Excel to build complete URLs.
  • Next, we’ll add the URLs to a text file that we can import into Screaming Frog.
  • Then we’ll fire up Screaming Frog and import the text file for crawling.
  • Screaming Frog will crawl and test those URLs and provide reporting on what it finds.
  • Then we can export the destination URLs we find so we can make sure they resolve correctly.  Remember, just because the old URLs 301 redirect doesn’t mean the destination URLs resolve properly.  We are hunting for false negatives.
  • Last, and most importantly, you can fix any problematic redirects to ensure you maintain search equity.
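
The two crawls above boil down to one simple rule, which can be sketched in code.  Here, responses is a hypothetical map you would populate from real HTTP requests (or from the Screaming Frog exports):

```javascript
// Returns null if the old URL 301s to a destination that resolves
// with a 200, otherwise a short description of the problem found.
function checkRedirect(responses, oldUrl) {
  var first = responses[oldUrl];
  if (!first || first.status !== 301) {
    return 'no 301';            // caught in the first crawl
  }
  var dest = responses[first.location];
  if (!dest || dest.status !== 200) {
    return 'false negative';    // the 301 looked fine, but the destination is broken
  }
  return null;                  // the redirect truly works
}

var responses = {
  '/old-page/':  { status: 301, location: '/new-page/' },
  '/new-page/':  { status: 200 },
  '/old-about/': { status: 301, location: '/missing/' },
  '/missing/':   { status: 404 }
};
```

In this sample data, /old-page/ passes both checks, while /old-about/ is exactly the kind of false negative a one-step check would miss.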


How To Use Screaming Frog to Hunt Down False Negatives:

  1. Export Top Landing Pages from Google Analytics
    Access your Google Analytics reporting and click the “Content” tab, “Site Content”, and then “Landing Pages”.  Click the dropdown for “Show rows” at the bottom of the report and select the number of rows you want to view.
    Export top landing pages from Google Analytics

    Tip: If you have more than 500 pages, then you can edit the URL in Google Analytics to display more than 500 URLs.  After first selecting a row count from the dropdown, find the parameter named table.rowCount= in the URL.  Simply change the number after the equals sign to 1000, 5000, 10000, or whatever number you need to capture all of the rows.  When you export your report, all of the rows will be included.

  2. Export the Report from Google Analytics
    Click the Export button at the top of the report and choose “CSV”.  The file should be exported and then open in Excel once it downloads.
    Exporting a report from Google Analytics
  3. Use Excel’s CONCATENATE Function to Build a Complete URL
    When the URLs are exported from Google Analytics, they will not include the protocol or domain name.  In other words, the beginning of each URL (http://www.yourdomain.com) is missing.  Therefore, you need to add this to your URLs before you use them in Screaming Frog.  Excel has a powerful function called CONCATENATE, which lets you combine text and cell contents to form a new text string.  We’ll use this function to combine the protocol and domain name with the URL that Google Analytics exported.

    Create a new column next to the “Landing Page” column in Excel.  Click the cell next to the first landing page URL and enter the following: =CONCATENATE(“http://www.yourdomain.com”, A8).  Note, change “yourdomain.com” to your actual domain name.  Also, A8 is the cell that contains the first URL that was exported from Google Analytics (in my spreadsheet).  If your spreadsheet is different, make sure to change A8 to whichever cell contains the first URL in your sheet.  The resulting text should be the complete URL (combining the protocol, domain name, and the URL exported from Google Analytics).  Then you can simply copy and paste the contents of that cell (which contains the formula) to the rest of the cells in that column.  The formula will automatically adjust to use the right landing page URL for each row.  Now you have a list of complete URLs that you can import into Screaming Frog.

    Using the CONCATENATE function in Excel to build URLs
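
    If you prefer scripting to spreadsheets, the same join can be done in a few lines (the domain here is a placeholder for your own):

```javascript
// Prefix each exported landing-page path with the protocol and domain,
// which is exactly what the CONCATENATE formula does cell by cell.
function buildFullUrls(domain, paths) {
  return paths.map(function (path) { return domain + path; });
}

var urls = buildFullUrls('http://www.yourdomain.com', ['/blog/post-1/', '/about/']);
```

    Either way, the output is the same: a list of complete URLs ready for Screaming Frog.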

  4. Copy all URLs to a Text File
    Since all we want are the URLs for Screaming Frog, you can select the entire new column you just created (with the complete URLs) and copy those URLs.  Then open a text file and paste the URLs in the file.  You can use Notepad, TextPad, or whatever text editor you work with.  Save the file.

    Copy the URL list to a text file

  5. Fire Up Screaming Frog
    After launching Screaming Frog, let’s change the mode to “List” so we can upload a list of URLs.  Under the “Mode” menu at the top of the application, click “List”, which enables you to use a text file of URLs to crawl.  Then click “Select File” and choose the text file we just created.  Then you can click “Start” and Screaming Frog will begin to crawl those URLs.

    Using List Mode to Crawl URLs

  6. Review Header Response Codes From the Crawl
    At this point, you will see a list of the URLs crawled, the status codes, and the status messages.  Remember, all of the URLs should be 301 redirecting to new URLs.  So, you should see a lot of 301s and “moved permanently” messages.  If you see 404s at this point, those URLs didn’t redirect properly.  Yes, you just found some bad URLs, and you should address those 404s quickly.  But that’s not a false negative.  It’s good to catch low-hanging fruit, but we’re after more sinister problems.

    Viewing 301 redirects after a Screaming Frog crawl

  7. Find the Destination URLs for Your Redirects
    Now, just because you see 301 redirects showing up in the main reporting doesn’t mean the destination URLs resolve correctly.  If you click the “Response Codes” tab, you’ll see the redirect URI (where the 301 actually sends the crawler).  THOSE ARE THE URLS YOU NEED TO CHECK.  Click the “Export” button at the top of the screen to export the “Response Codes” report.  This will include all of the destination URLs.
    Finding Destination URLs via the Response Codes Tab
  8. Copy All Destination URLs to a Text File
    In Excel, copy the destination URLs and add them to a text file (similar to what we did earlier).  Make sure you save the new file.  We are now going to crawl the destination URLs just like we crawled the original ones.  This process will close the loop for us and ensure the destination URLs resolve correctly.  This is where we could find false negatives.

    Exporting all destination URLs to Excel from Screaming Frog

  9. Import Your New Text File and Crawl the Destination URLs
    Go back through the process of selecting “List” mode in Screaming Frog and then import the new text file we just created (the file that contains the destination URLs).  Click “Start” to crawl the URLs, and then check the reporting.

    Using List Mode to Crawl URLs

  10. Analyze the Report and Find False Negatives
    You should see a lot of 200 codes (which is good), but you might find some 404s, application errors, etc.  Those are your false negatives.  At this point, you can address the errors and ensure your old URLs in fact redirect to the proper destination URLs.  Disaster avoided.  :)

    Finding and Fixing False Negatives Using Screaming Frog


Screaming Frog and Actionable Data: Beat False Negatives
Going through the process I listed above will ensure you accurately check redirects and destination URLs during a website redesign or migration.  The resulting reports can identify bad redirects, 404s, application errors, etc.  And those errors could destroy your search power if the problems are widespread.  I highly recommend performing this analysis several times during the redesign or migration to make sure every problem is caught.

Make sure you don’t lose any URLs, which can result in lost search equity.  And lost search equity translates to lower rankings, less targeted traffic, and lower sales.  Don’t let that happen.  Perform the analysis, quickly fix any problems you encounter, and retain your search power.  Redesigns and migrations don’t have to result in disaster.  You just need to look out for the silent SEO killer. :)

GG

 

Monday, September 10th, 2012

SEM Competitive Analysis – The Power of Understanding Your Competition in Paid Search

SEM Competitive Analysis

There are a lot of moving parts to developing and managing SEM campaigns.  First, you need to develop a strong paid search strategy, perform keyword research, map out a robust structure for your campaigns and ad groups, determine budgets, create effective ads, etc.  After the setup phase, you will be neck deep in ongoing campaign management, which involves refining your campaigns and ad groups based on performance.  That includes refining keywords and ads, creating new ad groups when necessary, pausing ad groups or campaigns that don’t perform well, split testing ads, etc.  This includes managing both Search and Display Network campaigns.  As you can guess, SEM is definitely not for the faint of heart.

Based on all that’s involved with paid search, I think it’s easy for SEMs to keep driving campaigns forward without taking a step back to analyze the competitive landscape.  For example, which companies are you competing against in SEM, which ads are they running, what types of landing pages are they using, how does their pricing stack up, etc.?  That’s where a thorough competitive analysis can pay huge dividends.  There are so many important things you can learn from analyzing the competition that I’m surprised more companies aren’t doing it.

In this post, I’m going to explore five important insights you can learn from performing an SEM competitive analysis.  My hope is that once you read through this post, you’ll be eager to get started on your own analysis.  Let’s get started.

What’s an SEM Competitive Analysis?
Simply put, an SEM competitive analysis enables you to understand the companies also bidding on the same keywords and categories you are targeting in paid search.  Let’s face it, if you are bidding on a set of keywords, it’s important to understand which competitors are targeting the same keywords, where they are driving visitors, how aggressively they are bidding, the pricing they are providing for similar products, etc.  While performing the analysis, there are times you find incredible nuggets of information that can help enhance your own campaigns.  You can also understand why certain competitors might be outperforming your own efforts.

Competitive Analysis Tools
This post isn’t meant to provide a tutorial on how to use the various competitive tools in the industry.  There are many to choose from, and you should test them out to determine which ones fit your needs.  Pricing-wise, some are paid solutions while others are offered for free.  For example, SEMrush and SpyFu are two paid solutions that enable you to view a wealth of competitive SEM data such as keywords, ads, CPCs, volume of traffic, etc.

Competitive Analysis Tools

Google’s Ad Preview Tool is free and enables you to view an unpersonalized SERP, while also enabling you to specify geographic location, mobile vs. desktop, language, etc.  In addition, AdWords recently released Auction Insights, which gives you a view of the companies you are competing with on a keyword level (if there is enough data).  You can view a competitor’s impression share, their average position, the overlap rate, the percentage of times they rank above your own ads, etc.  Again, there are many tools on the market, and my recommendation is to figure out the right combination for your needs.  Many of the paid solutions have free trials, so you can start using them immediately to gauge their effectiveness.

A screenshot from the Google Ad Preview Tool:

AdWords Ad Preview Tool

Analysis Scope
When determining the scope of your analysis, you can either start small and analyze a specific ad group, or you can analyze a larger campaign (or set of campaigns).  If you are just starting out, you might want to start smaller and just focus on an important ad group.  Once you determine the best process to use, along with the right tools, you can expand to other ad groups and larger campaigns.  I recommend choosing an ad group that’s important to your business, but one that might not be performing very well.  You never know, the competitive analysis could reveal why that is…

Let’s take a look at five things you can learn from an SEM competitive analysis that can greatly help your own SEM efforts:

1. Who Are Your *Real* Competitors (in Search)
Whenever I begin helping a new client, I always ask them who their top competitors are.  It’s a trick question, since the standard set of competitors in the industry might not be the same competitors in SEM (or SEO).  Understanding which companies are present in the SERPs for target keywords is extremely important.  For consumers who don’t know which company to do business with and start searching Google, the offline competition might not make a big difference.  That’s why you need to understand your true competitors in SEM.  That’s who prospective customers will be reviewing while researching online.

When I present my findings with regard to true competition, it’s not uncommon for my clients to fall out of their chairs.  Sure, they might find some familiar faces, but they might find some additional companies or websites that surprise them.  For example, say hello to Amazon.com, the biggest and baddest ecommerce retailer on the web.  If you are selling online, Amazon very well could be a core competitor in SEM.  If that’s the case, you better check out pricing on Amazon.com, how often they show up for your target keywords, which third party sellers are providing similar products, etc.  Let’s face it, low pricing and Amazon Prime membership is a killer combination that you’ll have to face and deal with at some point.  And you’re not alone.

You also might find comparison shopping sites, forums, answer-driven sites like Yahoo Answers, personal blogs, etc.  If you do, you might need to form a strategy for monitoring those sites to ensure you are represented (the right ways).  You might find manufacturer websites that provide links to online retailers that offer their products.  Are you listed there?  Should you be?  I think you get the picture.  Understand the real competition, dig deeper, and form a strategy for dealing with those “competitors”.

A list of competitors in AdWords for a target keyword:

Your True Competition in Search

2. Find the Keywords Your Competitors are Running
OK, so now you know which companies you are competing with in SEM.  Your next question might focus on which keywords they are running.  This is important for several reasons.  First, you want to make sure you aren’t missing important keywords or categories that customers are searching for.  Even if you performed keyword research, you might have missed something.  Analyzing keywords your competitors are running could help close the gaps.

Analyzing the keywords a competitor is bidding on:

Competitive Keyword Analysis in SEM

Second, you can start to gauge how much traffic each keyword or keyword category is driving to your competitors’ websites.  For example, if you see a larger percentage of traffic for certain categories, there might be a good reason for that.  Maybe they are seeing outstanding performance from those keywords or categories, and they are allocating more budget to those keywords.

Note: there are many companies not managing SEM correctly, so be careful here…  If you see something stand out while analyzing the keywords that competitors are running, you can, and should, test those yourself.  As long as you have a strong analytics strategy in place, you can easily identify high quality traffic, strong performance, etc.  I guess what I’m saying is that keyword intelligence is great to attain, but nothing compares to actual testing.

3. Competitor Landing Pages
Next on our list are the landing pages that competitors are using.  Let’s say you were running an ad group for an important category.  You are getting a lot of traffic, but not many conversions.  You’re baffled why that is…  Well, analyzing the landing pages that competitors are using can tell you a lot.  Are they driving visitors to product detail pages, campaign landing pages, lead generation pages focused on gaining contact information, mobile landing pages (for mobile traffic), etc.?  All of this can help you better understand why your competitors might be outperforming you in SEM.

Understanding the landing page experience for prospective customers can help you form ideas for your own landing pages.  If you are driving visitors to a product detail page and competitors have set up dedicated campaign landing pages with a wealth of information, images, video, reviews, live chat, etc., you might want to refine your efforts.  Don’t pale in comparison to your competition.  It could be the very reason you are seeing fewer conversions (or none at all).

A sample SEM landing page:

Landing Page Analysis

4. Ads, Ad Extensions, and PLAs
Using competitive tools, you can review the text ads that competitors are running.  When prospective customers are facing a SERP filled with paid ads, it’s important to stand out (for the right reasons).  Are your competitors promoting sales, deals, special offers, etc.?  Are they providing actual pricing in their ads?  Are competitor text ads aligned with the landing pages they are driving visitors to?  All of this can help you understand why your own performance isn’t as strong as it should be.

Viewing competitor text ads:

Analyzing SEM Ads

And let’s not forget about ad extensions and product listing ads.  Are your competitors using sitelinks extensions, product extensions, call extensions, local extensions, social extensions, etc.?  The extra information provided by ad extensions can be extremely valuable to prospective customers.  For example, you can drive visitors deeper into certain sections of your site, to specific product pages, show social connections, provide click-to-call phone numbers, etc.  And if you’re an ecommerce retailer, don’t overlook the power of seller rating extensions.  Those little stars can bring a level of credibility that can mean the difference between revenue and just a click.

An example of sitelinks extensions in AdWords:

Analyzing Ad Extensions in Paid Search

In addition to what I mentioned above, I have to cover the power of product listing ads.  Recently, Google transitioned Google Shopping to a pure paid model.  Product listing ads are an important part of that model, and they are extremely powerful.  They are image-based ads for specific products, based on your Merchant Center feed.  They are CPC-based and can help drive strong performance for ecommerce retailers.  If your competitors are running PLAs, and you aren’t, you better get in the game.  There are times text ads just don’t compare to the image-based PLAs competing for attention in the SERPs.

An example of product listing ads in action:

Analyzing Product Listing Ads

5. Pricing
The final insight I’m going to cover is probably the most important – pricing.  Performing a competitive analysis will reveal the pricing your competition is providing for the same products you are selling.  The power of the internet is a double-edged sword for many sellers.  You can now compete with the big boys, but you will also be compared with every other seller on the web.  And this can happen in mere seconds as people research products via Google, Bing, and Yahoo.

I find this step in a competitive analysis provides incredibly important insights for my clients.  They are sometimes floored by what they are seeing.  Actually, it’s not unusual for some clients to start yelling as I’m presenting my findings.  “How are they providing that pricing?”  “That can’t be right.”  “They are lowballing prospective customers!”  I’ve heard every possible comment under the sun.

Regardless, unless a consumer knows and trusts your company, you are going to have a hard time competing with a competitor selling the same product at a price 20% lower than your own.  Not every person will go with the lowest price (based on a number of credibility factors), but some will.  And when you are paying for every click, it’s important to keep those visitors on your site with the hope of converting them.

Analyzing competitor pricing:

Analyzing Competitor Pricing

 

My recommendation is to analyze each of the competitors for a category, and break down the pricing for each.  Try to determine if that’s the real pricing, how they are providing that pricing, what their shipping costs are, etc., and then form a strategy for dealing with the situation.  By the way, that could mean pausing your ad groups for that category.  If your ROI is pitiful, and your competitors are selling at pricing that makes no sense, then pause your ad groups.  You can find other, more profitable categories to drive…

Summary – Competitive Data is There. Go Analyze It
Are you ready to get rolling with your own competitive analysis?  As I covered above, there’s a lot you can learn.  It’s important that you don’t get so caught up in your own campaigns that you forget to learn what your competition is running, how much they are spending, where they are driving visitors, and what type of landing pages they are using.  You never know, you might end up finding serious gaps in your own campaigns.  And that can lead to more revenue, profit, and a stronger ROI.  Good luck.

GG

Thursday, August 23rd, 2012

Adjusted Bounce Rate in Google Analytics – One Step Closer to Actual Bounce Rate

Adjusted Bounce Rate in Google Analytics

I’ve written extensively in the past about Bounce Rate both here on my blog and on Search Engine Journal.  Bounce Rate is an incredibly powerful metric, and can help marketers better understand the quality of their traffic, and the quality of their content.  If you’re not familiar with Bounce Rate, it’s the percentage of visits that view only one page on your site.  They find your site, view one page, and leave.  As you can guess, that’s usually not a good thing.

Traffic-wise, high bounce rates can raise red flags about the quality of traffic from a given source, campaign, or keyword.  For example, imagine spending $1500 in AdWords driving visitors to your site for a certain category of keywords and seeing a 92% bounce rate.  That should raise a red flag that either the visitor quality is poor or that your landing page is not providing what people are looking for. For example, maybe visitors are looking for A, B, and C, but are getting X, Y, and Z from your landing page.  That could very well cause a bounce.

But, that’s not the full story for bounce rate.  And I find many people don’t understand how analytics packages calculate the metric.  Here’s a scenario that shows a serious flaw.  What if someone visits your site, spends 15 minutes reading a blog post, and then leaves?  If there’s that much engagement with your content, it probably shouldn’t be a bounce, right?  But it is.  Since the user viewed just one page, Google Analytics has no way to determine user engagement.  Therefore, it’s counted as a bounce.  Yes, it’s a huge problem, and could be tricking webmasters into making changes when they really don’t need to.

The Problem with Standard Bounce Rate

Actual Bounce Rate
Over the years, there’s been a lot of talk about how Google and Bing might use bounce rate as a ranking factor.  For example, if the engines saw a page with a very high bounce rate, maybe they would use that against the page (which could result in lower rankings).  Add the Panda update, which targets low quality content, and you can see why SEOs became extremely concerned with bounce rate.

In August of last year, I wrote a post on Search Engine Journal about Actual Bounce Rate.  The post explains some of the mechanisms that Google can use to determine actual bounce rate, and not just the standard bounce rate that Google Analytics provides.  For example, dwell time, toolbar data, Chrome data, etc.  The core point of the post is that Google has access to much more data than you think.  Therefore, don’t focus solely on the standard bounce rate presented in Google Analytics, since the actual bounce rate is what Google could be using to impact rankings.

Actual Bounce Rate Factors


Google Introduces “Adjusted Bounce Rate”
So, given what I just explained, is there a way to gain a better view of actual bounce rate in Google Analytics?  Until recently, the answer was no.  But, I’m happy to announce that Google Analytics released an update in July that enables you to view “Adjusted Bounce Rate”.  It’s not perfect, but it’s definitely a step closer to understanding actual bounce rate.

The Definition of Adjusted Bounce Rate
By adding a new line of code to your Google Analytics snippet, you can trigger an event when users stay for a minimum amount of time.  That amount of time is determined by you, based on your specific site and content.  So, adjusted bounce rate will provide the percentage of visits that view only one page on your site and that stay on that page for less than your target timeframe.  For example, you can set the required time to 20 seconds, and that time would be used to calculate the bounce rate numbers used in your reporting.  If users stayed less than 20 seconds, then it’s a bounce.  If they stayed longer than 20 seconds, it’s not a bounce (even if they visited just one page).
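
The definition can be sketched as a simple calculation over a set of visits.  This is my own illustration, not Google Analytics’ internal code; each visit here records pages viewed and seconds spent on the landing page:

```javascript
// Standard bounce rate counts every one-page visit as a bounce;
// adjusted bounce rate only counts one-page visits shorter than
// the threshold you configured in the tracking snippet.
function bounceRates(visits, thresholdSeconds) {
  var bounces = 0, adjustedBounces = 0;
  visits.forEach(function (v) {
    if (v.pages === 1) {
      bounces++;
      if (v.seconds < thresholdSeconds) adjustedBounces++;
    }
  });
  return {
    standard: bounces / visits.length,
    adjusted: adjustedBounces / visits.length
  };
}

var rates = bounceRates([
  { pages: 1, seconds: 5 },    // a true bounce
  { pages: 1, seconds: 900 },  // 15 minutes reading one post
  { pages: 3, seconds: 40 }    // multi-page visit
], 20);
```

Notice how the visitor who read a post for 15 minutes counts as a bounce under the standard calculation, but not under the adjusted one.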

Changes Needed
As I mentioned above, you need to add one line of code to your Google Analytics snippet.  Here is the revised snippet (from the Google Analytics blog post about adjusted bounce rate):

Google Analytics Snippet for Adjusted Bounce Rate

Note, that new line needs to be added to your Google Analytics snippet on every page of your site.  Also, the piece of code that includes 15000 represents the time in milliseconds.  15000 equals 15 seconds.  You can adjust that based on your own site and content.  For some sites, you might set that to 30 seconds, 1 minute, or more.  The minimum is 10 seconds.
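For reference, here is a sketch of what the classic async (ga.js) snippet looks like with the extra line added.  The property ID is a placeholder, and the event category name is illustrative; check the Google Analytics blog post above for the exact line they recommend:

```javascript
// Classic async ga.js snippet with the adjusted bounce rate line added.
// 'UA-XXXXXXX-1' is a placeholder property ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
_gaq.push(['_trackPageview']);

// The one new line: after 15000 ms (15 seconds), push an event so the
// visit is no longer counted as a bounce, even if only one page is viewed.
setTimeout(function () {
  _gaq.push(['_trackEvent', '15_seconds', 'read']);
}, 15000);
```

On a real page, these lines sit inside the script tags of your existing snippet, alongside the standard loader function that follows them.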

Impact, and a Real Example
You might be wondering how this actually impacts your reporting.  Does adding that line of code really impact your bounce rate numbers?  Well, it does, and it can radically change your bounce rate numbers.  And that’s a good thing, since it gives you a closer look at actual bounce rate now that time on page is factored in.  Think about a user who spends 15 minutes reading a post, but never visits a second page.  With the standard calculation, that visit is counted as a bounce.  With this code added, it wouldn’t be.  Let’s take a look at an example that demonstrates adjusted bounce rate.

Below, I’ve included the metrics for a page that was showing a 76.7% bounce rate.  Clearly, that’s not a great bounce rate, and it could be driving the webmaster to make changes to the content.  But, check out the bounce rate after we started calculating adjusted bounce rate.  Yes, you are seeing that correctly.  It’s now only 28.5%.  That means 71.5% of users are either visiting other pages on the site or staying on the page for longer than 20 seconds (which is the time the company is using to calculate adjusted bounce rate).  And by the way, the new bounce rate is 62.8% lower than the original bounce rate percentage.  That’s a huge swing.

An example of adjusted bounce rate in Google Analytics
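For the record, the “62.8% lower” figure above is a relative change, not a percentage-point difference.  A quick calculation confirms it:

```javascript
// Relative drop between the original and adjusted bounce rates.
var before = 76.7;
var after = 28.5;
var relativeDrop = ((before - after) / before) * 100;
console.log(relativeDrop.toFixed(1) + '%'); // → "62.8%"
```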

What This Means to You & How This Could Be Better
As a marketer, bounce rate is an extremely important metric.  But, you need an accurate number to rely on if you plan to make changes.  That’s why I wrote about actual bounce rate last summer.  Adjusted bounce rate enables you to add another ingredient to the standard bounce rates calculated by Google Analytics.  By understanding the minimum time spent on the page, you can gain intelligence about how users are engaging with your content.  Are they hitting your page and immediately leaving, or are they at least spending 20 or 30 seconds reading the content?  There’s a big difference between the two (especially for SEO).

Understanding adjusted bounce rate can help you:

  • Refine the right pages on your site, and not just the ones that show a high standard bounce rate.
  • Better understand the quality of your top landing pages from organic search. Do you have a quality problem, or are users spending a good amount of time with that content?  Adjusted bounce rate can help you understand that.
  • Better understand the quality of campaign traffic.  Seeing a 95% bounce rate is a lot different than 25%.  Sure, you want conversions from campaign traffic, but engagement is important to understand as well.
  • Troubleshoot SEO problems related to Panda.  When a Panda update stomps on your site, you should analyze your content to determine what’s deemed “low quality”.  Adjusted bounce rate is a much stronger metric than standard bounce rate for doing this.

How Adjusted Bounce Rate Could Be Improved – Revealing Dwell Time
I would love to see dwell time added to Google Analytics somehow.  Dwell time is the amount of time someone spends on your page before hitting the back button to return to the search results.  Duane Forrester from Bing explained that they use dwell time to understand low and high quality content.  As an SEO, imagine you could understand which pages have high dwell time.  That would be incredible intelligence to use when trying to enhance the content on your site.

Summary – Adjusted is Closer to Actual
Again, I believe this is a great addition by the Google Analytics team.  Adjusted bounce rate can absolutely help you better understand the quality of content on your site, and the quality of traffic you are driving to your site.  I recommend adding the line of code to your Google Analytics snippet today, and then analyzing how your bounce rates change.  I have a feeling you’ll be surprised.

GG

 

Wednesday, August 8th, 2012

How To Use Index Status in Google Webmaster Tools to Diagnose SEO Problems

Index Status in Google Webmaster Tools

In late July, Google added Index Status to Webmaster Tools to help site owners better understand how many pages are indexed on their websites.  In addition, Index Status can also help webmasters diagnose indexation problems, which can be caused by redirects, canonicalization issues, duplicate content, or security problems.  Until now, many webmasters relied on less-than-optimal methods for determining true indexation.  For example, running site: commands against a domain, subdomain, subdirectory, etc.  This was a maddening exercise for many SEOs, since the number shown could radically change (and quickly).

So, Google adding Index Status was a welcome addition to Webmaster Tools.  That said, I’m getting a lot of questions about what the reports mean, how to analyze the data, and how to diagnose potential indexation problems.  So that’s exactly what I’m going to address in this post.  I’ll introduce the reports and then explain how to use that data to better understand your site’s indexation. Note, it’s important to understand that Index Status doesn’t necessarily answer questions.  Instead, it might raise red flags and prompt more questions.  Unfortunately, it won’t tell you where the indexation problems reside on your site.  That’s up to you and your team to figure out.

Index Status
The Index Status reports are under the “Health” tab in Google Webmaster Tools.  The default report (or “Basic” report) will show you a trending graph of total pages indexed for the past year.  This report alone can signal potential problems.  For most sites, you should see a steady increase in indexation over time.  For example, this is a normal indexation graph:

Basic Index Status Report in Webmaster Tools

But what about a trending graph that shows spikes and valleys?  If you see something like the graph below, it very well could mean you are experiencing indexation issues.  Notice how the indexation graph spikes, then drops, only to spike again.  There may be legitimate reasons why this is happening, based on changes you made to your site.  But if you have no idea why your indexation is spiking, then further site analysis would be required to understand what’s going on.  Once again, this is why SEO audits are so powerful.

Trending Spikes in Index Status Basic Reporting

Advanced Report
Now it’s time to dig into the advanced report, which definitely provides more data.  When you click the “Advanced” tab, you’ll see four trending lines in the graph.  The data includes:

  • Total Indexed
  • Ever Crawled
  • Not Selected
  • Blocked by Robots

“Total indexed” is the same data we saw in the basic report. “Ever crawled” shows the total number of pages ever crawled by Google (the cumulative total).  “Not selected” includes the total number of pages that have not been selected to be indexed, since they look extremely similar to other pages, or that redirect to other pages.  I’ll cover “Not selected” in more detail below.  And “Blocked by robots” is just that, pages that you are choosing to block.  Note, those are pages you are hopefully choosing to block…  More about that below.

Advanced Index Status Report in Google Webmaster Tools

What You Can Learn From Index Status
When you analyze the advanced report, you might notice some strange trending right off the bat.  For example, if you see the number of pages blocked by robots.txt spike, then you know someone added new directives.  One of my clients had that number jump from 0 to 20,000+ URLs in a short period of time.  Again, if you want this to happen, then that’s totally fine.  But if this surprises you, then you should dig deeper.

Depending on how you structure a robots.txt file, you can easily block important URLs from being crawled and indexed.  It would be smart to analyze your robots.txt directives to make sure they are accurate.  Speak with your developers to better understand the changes that were made, and why.  You never know what you are going to find.
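As a hypothetical illustration (the paths here are made up), even a one-line difference in robots.txt can have an outsized impact:

```
User-agent: *
# Intended: block only the print-friendly duplicate pages.
Disallow: /products/print/

# A broader directive like the one below would instead block the
# entire section, including important product pages:
# Disallow: /products/
```

If your “Blocked by robots” line in Index Status jumps after a change like this, that’s exactly the kind of directive to hunt down.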

The Red Flag of “Not Selected”
If you notice a large number of pages that fall under “Not selected”, then that could also signal potential problems.  Note, depending on the type of website you have, it might be completely normal to see a larger number of “Not selected” than indexed.  It’s natural for Google to run into some redirects and non-canonical URLs while crawling your site.  And that’s especially the case with ecommerce sites or large publishers.

But, that number should not be extreme…  For example, if you see the number of pages flagged as “Not selected” suddenly spike to 100K pages, when you only have 1,500 pages indexed, then you might have a new technical issue on your hands.  Maybe each page on your site is resolving at multiple URL’s based on a coding change.  That would yield many “Not selected” pages.  Or maybe you implemented thousands of redirects without realizing it.  Those would fall under “Not selected” as well.

Security Breach
Index Status can also flag potential hacking scenarios.  If you notice the number of pages indexed spike or drop significantly, then it could mean that someone (or some bot) is adding or deleting pages from your site.  For example, someone might be adding pages to your site that link out to a number of other websites delivering malware.  Or maybe they are inserting rich anchor text links to other risky sites from newly-created pages on your site.  You get the picture.

Again, these reports don’t answer your questions, they prompt you to ask more.  Take the data and speak with your developers.  Find out what has changed on the site, and why.  If you are still baffled, then have an SEO audit completed.  As you can guess, these reports would be much more useful if the problematic URLs were listed.  That would provide actionable data right within the Index Status reports in Google Webmaster Tools.  My hope is that Google adds that data some day.

Bonus Tip: Use Annotations to Document Site Changes
For many websites, change is a constant occurrence.  If you are rolling out new changes to your site on a regular basis, then you need a good way to document those changes.  One way of doing this is by using annotations in Google Analytics.  Using annotations, you can add notes for a specific date that are shared across users of the GA profile.  I use them often when changes are made SEO-wise.  Then it’s easier to identify why certain changes in your reporting are happening.  So, if you see strange trending in Index Status, then double check your annotations.  The answer may be sitting right in Google Analytics.  :)

Adding Annotations in Google Analytics

Summary – Analyzing Your Index Status
I think the moral of the story here is that normal trending can indicate strong SEO health.  You want to see gradual increases in indexation over time.  That said, not every site will show that natural increase.  There may be spikes and valleys as technical changes are made to a website.  So, it’s important to analyze the data to better understand the number of pages that are indexed, how many are being blocked by robots.txt, and how many are not selected based on redirects or canonical issues. What you find might be completely expected, which would be good.  But, you might be uncovering a serious issue that’s inhibiting important pages from being crawled and indexed.  And that can be a killer SEO-wise.

GG

 

Wednesday, July 18th, 2012

How To Use Social Reports in Google Analytics To Analyze Specific Blog Posts or Content [Tutorial]

Social Reports in Google Analytics

In March of this year, Google Analytics released a set of new reports for measuring the effectiveness of traffic from social networks.  It was a great addition and provides some valuable information about how social is affecting your business.  For example, you can view social referrers, see which content received traffic from social networks, view conversations across certain social networks, view conversion data (including last click and assisted attribution), see how social visitors flow through your site, and more.

One question I keep getting from business owners is how to easily analyze a specific piece of content they are tracking.  For example, let’s say a certain blog post went live recently, was heavily shared across social networks, and ended up driving a lot of traffic.  What if you want to isolate that page and view data via GA’s social reports?  Well, you can absolutely do that, and I’m going to walk you through some of the core insights you can glean from the reporting.  Let’s get started.

Isolating a Blog Post or Piece of Content
For this tutorial, I’m going to use a recent post of mine, which ended up being popular within the search marketing industry.  Last month, I attended the Google Agency Summit and found out that the old Google Wonder Wheel’s engine actually drives the Contextual Targeting Tool.  The Wonder Wheel was a great tool for finding related searches, based on actual Google data, and many in my industry loved using it.  Needless to say, search marketers were thrilled to find out the functionality can still be found in the Contextual Targeting Tool.  The post ended up getting shared quite a bit on Twitter, Facebook, and Google+.  Let’s take a look at the social reporting for this post.

You can isolate a page in two ways via social reporting in Google Analytics.  The first way is from the overview page, and the second way is from the Pages report.  Let’s jump to the Pages report, which will list your top content receiving traffic from social networks.  You can access this report by clicking “Traffic Sources”, “Social”, and then “Pages”.

The Pages Report in Google Analytics Social Reports

At this point, you will see a list of pages from your site, along with key metrics like visits, pageviews, time on site, data hub activities, etc.  I’ll cover what data hub partners are in a second.  For now, find the page you want to analyze and click the URL.  For me, I’m going to click the URL for my Google Wonder Wheel post, which had 1,040 visits from social networks from June 20th through June 30th.

After clicking the URL, the Social Referral tab is the default view.  Here, you can view the social networks driving the most traffic to the post, along with viewing trending for all traffic versus trending for social traffic.  In addition, the primary dimension in the report is “Social Network”, which as I mentioned above, will display a list of social networks driving the most traffic to this specific post.  For me, Twitter, Facebook, and Google+ drove the most traffic to this post over the 10 day period.

Social Networks in Social Reports

Social Actions and Data Hub Partners
If you click the “Social Network and Action” dimension, you will see Data Hub Activities for the post. Data Hub partners are social networks that have chosen to share additional information with Google so users of Google Analytics can view that data within Google Analytics reporting.  The activity stream from data hub partners can provide rich information that can be organized and viewed via Social Reports.

Unfortunately, some of the big players in Social are not participating, like Facebook and Twitter.  This means you will only get basic data in your reporting from these networks.  Current Data Hub partners include Google+, Delicious, Blogger, Disqus, Diigo, Pocket, etc.  You can tell which social network is a data hub partner since there will be a data hub icon next to participating networks.  See the icon below.

Data Hub Partners in Social Reports

Back to our example.  If you click the “Social Network and Action” dimension, you can analyze Data Hub activities for specific pieces of content.  For example, you can view Google+ posts, +1’s, reshares, bookmarks from Delicious, Pocket saves, etc.  You can also view a graphical breakdown of the data hub activities to the right.  Again, I wish more social networks were data hub partners, so you could get a full view of activities like tweets, likes, etc. from major networks like Twitter and Facebook.  That said, this is still valuable, and we’ll get more granular next.

Data Hub Activities in Social Reports

Activity Stream and Special Treatment for Data Hub Partners
You can click the Activity Stream tab to view specific data hub activities across social networks.  Sure, it’s cool to see top-level activity like we’ve seen so far, but the activity stream gets much more specific.  When clicking the tab, you will see actual conversations and events from across data hub partners.  The default tab is the Conversations tab, which displays shares and comments from data hub actions.  You will see specific users, what they wrote when sharing the content, and whether they shared, reshared, or commented on a post.  For example, you can view Google+ and Diigo information below for my Wonder Wheel post.

Activity Stream in Social Reports

It’s important to note that while analyzing the activity stream (starting with conversations), you’ll notice some great functionality for Google+ content.  For example, you can click a person’s photo to view their G+ profile, and there are icons that let you know if the person shared an update, reshared someone else’s update, or commented on a G+ update.  Then you can click the dropdown arrow on the far right to view additional information, including the Google+ Ripple for the piece of content, specific shares on G+, etc.  This is awesome data, as you can find influencers, view their posts about your content, view +1’s from other G+ users, etc.

Viewing additional data for data hub partners.

The Power of Ripples
In particular, viewing the Google+ Ripple for a specific URL reveals incredible data.  I’ve written previously about how to analyze G+ Ripples, and you should definitely check out that post.  Ripples enable you to see how your content was shared across Google+, from user to user.  You can also view influencers, sharing sequences, links to each public Google+ post, view shares over time, etc.  Spend some time with Ripples… you can find some incredible information.

Viewing Google Plus Ripples for Specific URLs

Events in Activity Stream
The second dimension in the Activity Stream report is Events.  By clicking this dimension, you can view additional information beyond just the conversations people are having about your content.  For example, you can view data hub partner events like +1’s, delicious bookmarks, pocket saves, trackbacks, etc.  I’ll cover more about trackbacks shortly, but this was a cool addition by Google recently.

Similar to what we did earlier, using the dropdown arrow on the right side enables you to see the actual activity on each social network.  For example, selecting “View Activity” for a Delicious bookmark takes you to the actual bookmark page.  Here, you can view the profile of the person bookmarking your content, view comments, etc.  This is a great way to understand what people are saying about your content, find influencers, connect with similar people, etc.

Events in Social Reports in Google Analytics

Quick Tip:
By clicking the social network logo in the events list for any action, you can link to a page that shows all activity from that specific social network.  For example, by clicking the Delicious icon in the screenshot below, you will be taken to all Delicious events for this specific piece of content.

A Note About Twitter
I mentioned earlier that you can only get advanced-level data from Data Hub Partners.  That’s true (and unfortunate), but there is some additional data you can get from Twitter.  If you click the link for Twitter when viewing social networks in your reporting, you will see a list of t.co links (shortened links from Twitter).  If you move fast enough, you can enter those shortened URLs in Twitter Search to view the actual tweets.  Then you can check out each Twitter user to find influencers, follow them, engage them, etc.  Twitter Search does not go back very far, so you’ll need to move fast.  You can also use a number of third party tools to mine Twitter data, but that’s for another post. :)

Analyzing Tweets via Social Reports in Google Analytics

Trackbacks
If you click back to the Social Referral tab, and click the “Social Referrers” dimension, you might see “Trackbacks” listed in the report.  Note, you might have to use the rows dropdown at the bottom of the report to reveal additional rows to view trackbacks.  If you click the “Trackbacks” link, and then click the Activity Stream tab, you will see inbound links that Google Analytics picked up.  Trackbacks will display links to your content from outside your site (inbound links).

Viewing trackbacks in social reports

From this report, you can view the pages linking to your content by clicking the link icon next to the URL, or by clicking the arrow dropdown and clicking “View Activity”.

Trackbacks are a Great Addition, But Not Perfect
It’s important to understand the links that your content is building on several levels.  First, you can start to understand what people are saying about your content, what types of sites are linking to you, understand the authors of that content, what the comments are saying, etc.  That’s all really useful information.  Second, you want to understand the SEO power of those links.  Are they relevant websites?  Is the content high quality, or is it a spammy website?  Third, you can absolutely use this intelligence to connect with influencers, whether that’s the blog author, or people commenting.  And no, this isn’t as robust as using Open Site Explorer, Majestic SEO Tools, Google and Bing Webmaster Tools, etc., but it’s nice having this data in Google Analytics.

Social Conversion Data for Specific Content
GA’s Social Reports include a valuable conversion report that displays the last click and assisted conversions from social networks. This is important data to analyze, since you can understand how social networks impact conversion (by directly impacting conversion and/or assisting conversion).

But, the social conversion report is not broken down by content.  In order to get that data, you would need to create an advanced segment for social traffic, then view top landing pages with that segment active.  Then you can analyze the conversion impact of visits to that piece of content from social networks.  At a top-level view, it’s great to see conversion data from each social network, but if you are laser focused on a specific piece of content, then the standard social reports won’t really help you.

Summary – Get Social with Google Analytics
As you can see, Social Reporting was a great addition for Google Analytics.  It’s ultra-important to understand the impact of social traffic, what’s being shared across social networks, which influencers are sharing your content, who is engaging that content, etc.  It’s also important to analyze specific pieces of content that are being actively shared across social networks.  I hope this post explained more about how to find and analyze data for a specific post.  But like anything else in digital marketing, you need to test it out for yourself!  So target a piece of content, fire up Google Analytics, and hit the social reports.  Good luck.

GG