6 Questions You Should Ask During a Website Redesign That Can Save Your Search Engine Rankings
So, I decided to write this post to help you stand out as the person who saves the day. The person who flies in with SEO on your chest, swoops down, identifies SEO issues with your redesign, and then corrects a potential disaster in the making.
BTW, these are actual SEO scenarios I have come across. There are many more issues that can pop up, but I decided to focus on these 6 for this post. And don't laugh when you read each item; one of these might be happening as part of your next redesign. :-)
Without further ado, here are 6 questions you can ask during your website redesign that can save your search engine rankings:
1. Are we using Flash in the right ways and only when we need its unique power?
If you know me at all, then you know I'm a big advocate of Flash (having developed with it for over 10 years). But replacing HTML content with full Flash pages, or with a significant amount of Flash, can really cause problems SEO-wise. Run a cache command on a full Flash webpage and you'll see the problem quickly. That is, unless you want to rank for "big blank white space"! ;-)

If you do add more Flash content to your site, then definitely utilize SWFObject 2.0 to provide search-engine-friendly alternative HTML content. I've written an in-depth post about how to use SWFObject 2.0 here. And for those of you saying, "We'll be OK since the engines are now crawling Flash...", please read my other post about Google crawling Flash. There are several variables that can impact how Google and Yahoo (the two engines working with Adobe now) crawl your SWFs. My tests and recommendations were backed up this week at SMX during the Flash and SEO session with Adobe, Google, Yahoo, and Live Search.

What's my rule of thumb with Flash? Use it where you need the unique power of Flash. Do not, I repeat, do not use Flash for your entire site or for entire pages of content. Use it for webpage elements only.
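As a rough sketch of the SWFObject 2.0 approach (the file names and the div ID here are hypothetical), the crawlable HTML lives inside the target div, and SWFObject swaps it out for the SWF only when Flash is available:

```html
<!-- Crawlable fallback: engines index this HTML; visitors with Flash see the SWF. -->
<div id="promoContent">
  <h1>Adidas Running Sneakers</h1>
  <p>Full text description of the product line, visible to search engines
     and to visitors without the Flash plugin.</p>
</div>

<script type="text/javascript" src="swfobject.js"></script>
<script type="text/javascript">
  // Replace #promoContent with promo.swf only if Flash 9+ is installed;
  // otherwise the HTML above stays in place for users and crawlers.
  swfobject.embedSWF("promo.swf", "promoContent", "600", "400", "9.0.0");
</script>
```

The key point is that the alternative content is real, indexable HTML, not an empty placeholder.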
2. Did we analyze the Search Equity of webpages marked for removal?
If you will be removing content from your site, make sure you determine the Search Equity of your pages. Your current rankings are heavily based on the quality and relevance of your inbound links. You've worked hard to build those links, so why would you throw them away? This happens all too often when you don't take into account which pages are important from a Natural Search standpoint.
Campaign landing pages are a great example of this. Let's say you launch a new product and use a wide range of marketing channels to promote the new product and landing page. When the campaign ends, you decide the page isn't needed anymore, so you just delete it. But hold on… if you had taken a look at the Search Equity of the page, you would have realized it had built more than 5,000 links for you, mostly from industry-relevant blogs and websites! It earned a PageRank of 5, and you just threw away all of those links by deleting the page! I hate when I see this happen. Do your homework before deleting pages.
So what should you do? You should either keep the page as-is or 301 redirect the page to a corresponding page on your site. That might be the product category page or a similar product page. 301 redirects are the proper way to pass link power from one URL to another. A 301 is a permanent redirect and tells the engines that Page A has moved permanently to a new location (Page B). Tip: Do not use 302 redirects when you remove a page. 302s are temporary redirects and are not search-engine-friendly. I could write an entire post about redirects, but just remember that 301s are good and 302s are bad.
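On an Apache server, for example, a retired campaign page can pass its link power with a one-line 301 in your .htaccess file (the paths here are hypothetical; substitute your own):

```apache
# Permanently redirect the retired campaign landing page to the
# closest matching product category page. 301 = permanent, so the
# engines transfer the old page's link power to the new URL.
Redirect 301 /campaigns/new-product-launch.html /products/new-product/
```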
3. Are we changing our URL structure during the redesign? If we are, did we make sure the engines know where the old pages will reside on the new website?
Similar to the point above, be careful if you decide to change your URL structure. If you change a URL from abcd.asp to efgh.asp, the engines will treat the page as NEW, even though the same content has been around for a long time (and has built up links and search power). Basically, the new page won't automatically inherit the search power of the original page. Now imagine the impact of changing thousands of URLs, tens of thousands of URLs, or even more.
For example, let's say you decide to include target keywords in your URLs, such as a product name and category. The old URLs that have built up a nice amount of Search Equity will all be changed to your new taxonomy during the redesign. That's great, but again, all of that search power will unfortunately be lost unless you tell the engines where the new URLs are. Based on what I mentioned above, you can probably guess that it's Mr. 301 Redirect to the rescue again. You can redirect your old URLs to your new ones and safely pass their link power. I've seen this overlooked plenty of times, and again, the results can be devastating.
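If the old .asp URLs map cleanly to new keyword-rich URLs, Apache's mod_rewrite can 301 each old address to its new home. A hypothetical .htaccess sketch (your own URL mapping will differ):

```apache
RewriteEngine On

# The old URL abcd.asp now lives at the new keyword-rich address.
# R=301 makes the redirect permanent, so the engines transfer the
# old page's Search Equity to the new URL; L stops further rewriting.
RewriteRule ^abcd\.asp$ /sneakers/adidas-running-sneakers/ [R=301,L]
```

For a large site, you would generate one rule (or a rewrite map) per old URL rather than hand-writing thousands of lines.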
4. Are we using vanity URLs or custom domains for our campaign microsites?
Note: this doesn't fall under something that will crush your current rankings, but it sure can impact how your site builds more power from your hard work.
Let's say you have a new marketing campaign going live soon and someone on your team wants to register a bunch of new domain names for the microsite. You know, something like www.TheBestDarnBagelOnThePlanet.com or something catchy like that… Here's the problem: it will be a brand new domain that needs to build its own search power instead of inheriting the trust of your core domain. That's why I'm a bigger fan of using subdirectories, such as yourdomain.com/campaigntitle. Then your campaign will leverage your trusted domain, rank faster, and help build links for your trusted domain. It's a win-win.
5. Are we replacing keyword-rich text content with images or Flash in order to achieve an aesthetic advantage? AKA, we want things to look pretty…
Your design team went nuts with the redesign, the new site looks incredible, and it uses all sorts of images and Flash content in place of text content. You know, because the standard browser fonts aren't sexy enough. I get that, I really do... but the SEO impact can be serious. For example, taking keyword-rich text content on each page and throwing it into images to get a desired look, or taking your text navigation and placing it in Flash or in images. Again, this happens all too often.

Text links are still the best way to get the bots to all of your content. And by using descriptive anchor text, you can tell the engines what they will find at the other end of the link. For example, a text link with the anchor text Adidas Running Sneakers is much more powerful than an image that holds the text Adidas Running Sneakers. Even if you use alt text with that image, it's a much better idea to use descriptive text links. And if you use Flash, then you'll run into even more problems, which is why you should use SWFObject to provide an HTML version of your navigation.

For those of you who are saying, "I'll just provide an XML sitemap to the engines and I'll be fine", keep in mind that the optimal way to get the engines to your pages is via a traditional crawl (as noted by a Google engineer at SMX this week). :) XML sitemaps are a great supplement and help with more than just content discovery, but they don't replace text links and navigation as the best way to get the bots to your website pages.
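To make the difference concrete, here is the crawler-friendly version next to the image-only version (the URLs and file names are hypothetical):

```html
<!-- Good: descriptive anchor text tells the engines exactly what
     the target page is about, and passes that relevance along. -->
<a href="/sneakers/adidas-running-sneakers/">Adidas Running Sneakers</a>

<!-- Weaker: the alt text helps a little, but an image link carries
     far less ranking signal than descriptive anchor text. -->
<a href="/sneakers/adidas-running-sneakers/">
  <img src="/images/nav-adidas-running.gif" alt="Adidas Running Sneakers" />
</a>
```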
6. Did we do such a good job of coding that we essentially removed key pages from our website? i.e., one page now handles the equivalent of 10 pages. The URL doesn't change, but the content does, big time!
Your developers did a great job of streamlining your code. They did such a good job that 10 pages of content can now be handled dynamically by just one page. That one page posts back to itself and dynamically provides the content of 10 pages from your old site. Code-wise, this might be outstanding; SEO-wise, it's a nightmare. Beyond removing 10 pages from your site that might have built up Search Equity, you cannot optimize a page for each of the 10 items that will be presented on the fly. You are going to have a heck of a time getting those products to rank if they cannot be crawled! In addition, you cannot optimize the typical HTML elements like you normally would (the title tag, h1, h2, body copy, inline links, etc.), since the information will be loaded dynamically. Coming from a development background, I totally understand why you would want to code this way. However, from an SEO standpoint, it can cause all sorts of issues. I would make sure you can present each of the 10 pieces of content in an optimized webpage with a distinct URL. You can still use code to streamline the process and delivery, but try not to handle everything at one URL.
A quick example would be a category page that dynamically presents each product within that category when you click each product image (and this all happens at one URL). The engines would only see one URL and crawl the initial content. Not good.
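The fix is to give each product its own crawlable URL and link to it with plain text links, rather than posting everything back to a single page. A hypothetical category page sketch:

```html
<!-- One indexable, individually optimizable URL per product,
     instead of one postback URL that hides all 10 products. -->
<ul>
  <li><a href="/sneakers/adidas-running-sneakers/">Adidas Running Sneakers</a></li>
  <li><a href="/sneakers/adidas-trail-sneakers/">Adidas Trail Sneakers</a></li>
  <li><a href="/sneakers/adidas-track-spikes/">Adidas Track Spikes</a></li>
</ul>
```

Each destination page then gets its own title tag, h1, body copy, and inbound links, so every product can build its own Search Equity.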
So there you have it: 6 ways you can save the day during your next website redesign or website update. Keep in mind that you will probably have a challenging time when you first introduce these questions. There will be pushback and requests to back up your recommendations. But once you do, and everyone involved starts to understand SEO best practices, the problems I mentioned will be less likely to occur. If they are less likely to occur, then you have a better chance of keeping your organic search power. If you keep your organic search power, then you can keep driving natural search traffic to your site. And if you keep driving natural search traffic to your site, then you can reap the benefits of that traffic, which can be increased exposure, customers, and revenue.
So don't be afraid to speak up!
If you need help with your online marketing projects, then contact Glenn Gabe today.