If you have an established website with good search engine rankings and traffic, the greatest challenge of a website redesign is meeting or exceeding the ranking signals the current site has earned so you don’t lose those rankings and traffic.

That’s usually not as easy as it sounds, for several reasons. Some of the reasons are technical, and some come down to how well the web developer and site owner understand the crucial SEO elements of a site redesign.

The technical issues tend to fall into two major groups:

1) Content and the onsite optimization of the top entrance pages, and

2) Site architecture such as changes to the navigation, main menu items or the renaming of URLs and domains.

Here’s a checklist of the 10 most common SEO mistakes to avoid with website redesigns – covering both content and site structure – that should be given full consideration if you don’t want to lose your current rankings and traffic when the new site goes live.


SEO Content of the Main Entrance Pages

1) People tend to take their Google rankings for granted until they lose them

Too often, people mistakenly believe their search engine rankings are much more automatic than they actually are. In fact, SEO can be a “one strike and you’re out” environment! You can do a hundred things right – but get one crucial element wrong and it can undo everything else you do.

Google publishes its Webmaster Guidelines to help people build websites that work for both human visitors and search engine spiders like Googlebot, the crawler Google uses to index sites for ranking. It comes as a complete shock to most people, however, that the overwhelming majority of web developers have never read them – and fewer still follow them.

Ignore Google’s Webmaster Guidelines and Google will ignore you. It’s just that simple!

Unlike print publishing, which is “distributed,” Internet publishing is “found” as a result of a web search. That means you have to jump over someone to get their spot in the rankings.

You can’t just “do your own thing” with web publishing and still get noticed the way you can with print publishing. To meet the most basic criterion of even being seen, Internet publishing means out-algorithming the competition and taking their spot in the rankings.

Effective marketing requires starting with your ideal customer and working backwards to develop an effective strategy to reach them.  For Internet marketing, that means first identifying and actually clicking on the top ranked sites Google is rewarding with top results for the searches you want to compete for – the sites your customers are actually clicking on in the real world – and working backwards from there.

This means first studying the onsite factors that are easy to see on these websites: how their meta descriptions and URLs display in search results, as well as the text length of the pages you arrive at when clicking on those results – an important ranking factor along with keyword choice, position, quantity and density.

To get a more complete picture of how these top rankings were actually achieved, you also need to consider the offsite factors contributing to them. The Link Authority of links pointing from other websites to the site – which can be studied at MajesticSEO.com – explains why certain sites can achieve rankings with weaker onsite factors, such as shorter text, that sites with fewer links cannot.

The definition of professional is “getting predictable results.” If this essential market research – studying the sites that are achieving top results for the searches you want to compete for – is not done, you are simply shooting in the dark as to whether your site will be able to outrank them and take their spot in the rankings.

The biggest enemy of maintaining existing Google rankings is the web developer who takes a “code it and forget it” attitude and doesn’t feel the need to check how the work is actually performing on search engines. Humans can see only a fraction of the SEO environment spiders see and compute for your rankings, and it simply is not possible, using human eyes alone, to know whether a website will work on search engines or not.

One of Google’s most important guidelines is to know how your content management system (CMS) is actually performing on search engines – because a CMS does all kinds of things humans never see that greatly influence how spiders crawl your site, and that can have a major impact on search rankings.

Critical search rankings can be lost very quickly if the work is not checked religiously. Take fifteen seconds to spider test redirects with a redirect checker such as InternetOfficer.com to see that they actually work as expected. Check Google Search Console for emerging issues such as a sharp increase in 404 “not found” errors or messages from Google saying the site isn’t working. And run a simple diagnostic search such as site:yoururl, clicking Google’s “cached” link, to confirm the website is still being crawled properly. Skipping these checks can knock a website right out of the rankings.

You have to actually look at Google’s index and study how your pages are displaying in search results to see whether changes to the site are working as expected. According to Google’s Matt Cutts, the number one SEO mistake he sees is pages not being crawlable and indexed properly – and the site owners are not aware of the problem because they haven’t looked.

Unless search engine results are actually checked to confirm pages are crawlable and being indexed properly – not just a third-party report showing ranking position – there is no way of knowing whether search engines can crawl a page as hoped and whether traffic is being lost, which happens far more often than people realize!

If you don’t check your work by actually looking at Google’s index, how would you know it works? Although that seems obvious, far more often than not people don’t feel they have to actually look at Google to confirm their assumption that things are working as expected – and they make very expensive SEO mistakes by not doing so!
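
Much of this checking can be scripted. As a rough illustration of the idea – a minimal sketch assuming the third-party requests and beautifulsoup4 packages, with example.com standing in for your own URLs – the following prints the signals a spider actually sees on each page: the HTTP status code, the final URL after any redirects, any robots meta tag, and the X-Robots-Tag header.

```python
# Minimal crawlability check: HTTP status, final URL, robots meta tag, X-Robots-Tag header.
# Requires the third-party "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",           # replace with your key entrance pages
    "https://www.example.com/services/",
]

for url in URLS:
    resp = requests.get(url, timeout=10, headers={"User-Agent": "crawl-check/1.0"})
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  status:       ", resp.status_code)                                # anything but 200 needs a look
    print("  final URL:    ", resp.url)                                        # reveals unexpected redirects
    print("  meta robots:  ", meta.get("content", "none") if meta else "none") # watch for a stray "noindex"
    print("  X-Robots-Tag: ", resp.headers.get("X-Robots-Tag", "none"))
```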

The costs of SEO website redesign mistakes are extremely high. Dropping from number one to number two for a search can cost half the click-through traffic. The difference between the top of page one on Google and the top of page two can be thirty-five times the traffic!

This steep fall off in click through rates can make your Google Analytics traffic reports look like your traffic has driven off a cliff if certain key ranking elements are lost in the site redesign – and you inadvertently move backwards in search results.

Track your search engine rankings closely before the new site goes live so you have a clear baseline for comparison after launch. Checking your rankings in the first 18 hours after a major change is also crucial: Google will sometimes give you a temporary, transient boost that may or may not last past that period, but it is a useful “heads up” that you have done something right.

More frequently than people realize, new site launches need to be aborted because, although the site works for human visitors, it does not work for search engines – for rankings and for people being able to find it on the web. This is why you always have to be prepared to revert to a backup of the current site as a reset if something doesn’t go as expected with a new site launch.

 

2) The focus of website redesigns is primarily on what is going to be added to the site – what’s overlooked is auditing the SEO elements that are achieving current rankings, so as not to lose them

Search engine spiders that crawl your site for ranking often view it very differently than humans do. Googlebot is just a robot bean counter that counts your keywords. It can’t see images or how great your site looks, which is why, for the most part, Google is very text based.

Rewriting copy so it is more effective at converting customers is a good thing that everyone is in favor of. But you don’t want that to come at the expense of fewer visitors to the site. If you drastically reduce the number of URLs on a site, or reduce the amount of copy on a specific landing page, you can also very easily reduce the ranking ability that the former site’s rankings were dependent upon.

Too often, the elements that are achieving current rankings are lost in the creative process.  This can result in a disastrous drop in search engine traffic. Being proactive and not allowing this SEO mistake to occur should be the highest priority with a website redesign.

Rankings are always a combination of onsite factors such as keyword densities, and offsite factors such as the appearance of keywords in the link text of links pointing to a site.  The higher your Link Authority and rankings resulting from offsite links, the less dependent you are upon onsite factors to hold your rankings.

However, never forget that only Googlebot knows the algorithm for sure. Onsite factors such as body text are the changes you have one hundred percent control over, so you always want to do as much as you can there, because it all adds up in your quality score for ranking.

At the end of the day, Google always has the last say. Most people will judge the website redesign a success or a failure by whether they see the site optimized and traffic increased – or the website devalued and traffic down. That’s why it’s so important to involve an SEO from the very beginning of the website redesign process, to be certain it is built search engine friendly from the foundation up.

 

3) Thin Content

Like it or not, Google’s ranking algorithms are very text dominated. GoogleBot is blind and can’t see pictures or text embedded in images.

Web pages with more text by word count (up to a point) can, as a matter of simple math, compete for a wider range of keywords without exceeding the keyword densities Google rewards with top rankings. Sites with a higher ratio of text to images on the page also tend to do better in the rankings.

Whether a page counts as thin content is defined by the competition you have to jump over to get their spot in the rankings. All other things being equal, Google rewards information-rich sites, and web pages with more relevant, on-topic text content will outrank thin-content pages with fewer words.

Many web designers come to Internet publishing from a primarily graphic arts background without understanding the fundamentals of SEO, which is essentially bean counting. When developers say they want a “clean design” – big pictures and thin text content – in practical terms this may mean people won’t be able to find you in search results.

What frequently happens during site redesigns in which the images are made larger and the amount of text on a page is reduced is that, on a purely statistical basis, the page will compete for fewer searches. Reducing the amount of text and keywords on a page usually reduces the number of keywords the page can compete for and, ultimately, web traffic.
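
One way to keep this measurable is to compare the visible word count of the proposed page against the pages currently ranking for the searches you want. Here is a rough sketch, assuming the requests and beautifulsoup4 packages and placeholder URLs standing in for your own and a competitor’s page:

```python
# Compare visible word counts of your draft page against top-ranking competitor pages.
# Requires "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup

PAGES = {
    "our redesigned page": "https://staging.example.com/services/",           # placeholder URLs
    "competitor ranked #1": "https://www.competitor-example.com/services/",
}

def visible_word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):   # drop markup spiders don't count as body text
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

for label, url in PAGES.items():
    print(f"{label}: {visible_word_count(url)} words")
```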

Often, information that is keyword targeted and crucial for rankings gets buried on a secondary “About” subpage – a page that, unlike the homepage, has no links pointing to it from other sites to help it compete. A fundamental difference between how humans experience websites and how spiders rank them is that humans click through to other pages, while spiders crawl and rank you, for the most part, one page at a time.

Another issue with word count is that the longer people stay on a page reading it, the more Google views the site as information rich, since length of stay on the page is measured. Look at the rankings and note that Google tends to favor text-rich pages written in complete, grammatically correct sentences over short bullet-point phrases.

Trying to make up for thin content with duplicate content is a major SEO mistake.

In most instances, duplicate content is a filter, not a penalty. When Googlebot has to decide in a millisecond which version of duplicate content to display, Google says it usually does a good job of selecting the most important version, but “It may not be the version you expect.”

That’s an understatement! It feels like a penalty when a page you ranked for is lost because a duplicate content page with no links pointing to it is mistakenly chosen instead.

Google usually does do a good job displaying the most important version, but not always. As long as duplicate content is on a site, it is simply impossible to get professional, predictable rankings because it becomes a roll of the dice which version Google selects.

If a page is filled with substantially duplicate content seen on other URLs or domains, the odds are it will be filtered from search results, as Google doesn’t want its search results filled with duplicate content. This may not happen immediately upon publication, but over time as Google’s algorithms more fully compute the page’s quality score.
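
A quick way to spot near-duplicates before Google does is to compare the visible text of two URLs for similarity. Here is a minimal sketch using Python’s standard difflib, assuming requests and beautifulsoup4 and placeholder URLs; a very high score is a signal the page needs rewriting or a canonical/noindex decision:

```python
# Rough near-duplicate check: similarity ratio between the visible text of two pages.
# Requires "requests" and "beautifulsoup4"; difflib is in the standard library.
import difflib
import requests
from bs4 import BeautifulSoup

def page_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split()).lower()

a = page_text("https://www.example.com/widgets/")        # placeholder URLs
b = page_text("https://www.example.com/widgets-copy/")

ratio = difflib.SequenceMatcher(None, a, b).ratio()
print(f"similarity: {ratio:.0%}")    # anything very high (e.g. above ~80%) is worth a closer look
```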

Intentionally creating duplicate content to make up for thin content, however, is another matter, as it can be viewed as webspam intended to manipulate search rankings. This is what Google’s Panda updates are about. Since 2011 Google has said “It only takes a few pages of poor quality or duplicated content to hold down traffic on an otherwise solid site,” and recommends such pages be “removed, blocked from being indexed by the search engine, or rewritten.”

Google’s Matt Cutts says if your content is important enough to rank it should be important enough to write original content to earn those rankings. Once intentionally created duplicate content reaches a certain threshold only the Googlebot knows, Panda penalties can be very time consuming and expensive to get out of.

Once a Panda penalty has been imposed, Matt Cutts says simply rewriting the duplicate content so it is original content may not be enough at that point to get out of a Panda Penalty.  Don’t even think about intentionally created duplicate content as an attempt to make up for thin content!

Splash pages – pages with very little text that visitors land on and that ask them to click something, such as a language choice, to get to the main site content – are an absolute disaster for SEO and search engine rankings because they are thin content. There simply isn’t enough text on these pages for meaningful rankings – and spiders don’t click. They crawl and rank you, for the most part, one page at a time.

Because Google sees the thin-content splash page as the root domain homepage, splash pages are the opposite of search engine friendly. You want your keyword content and maximum PageRank – which usually resides in the homepage as a result of links pointing to the website – focused on one page for maximum ranking strength.

Splash pages simply lack enough text content for rankings and are a waste of the website’s Link Authority when applied to the homepage – the page that has the most external links pointing to it on most sites.

Always have a “Plan B” backup plan in place if homepage text is reduced in case rankings take a dip after publication.

 

4) Failure to do an SEO keyword audit of the most important entrance page content

Use Google Analytics – “Content” under the main menu – to see which pages are receiving the most rankings and traffic and are the most important entrance pages to the site. Make these pages your top SEO priority: meet or exceed what they are doing now – or lose traffic.

This means taking into consideration page titles, body text keyword densities and the number of words on the page – as well as more behind-the-scenes elements like heading tags and image alternative text.

Whenever text is being dropped from an important entrance page of an existing site, you should always first do a keyword audit, counting your keywords to see exactly what is being lost. It is a major SEO website redesign mistake to remove keywords that current rankings depend upon without first counting them – otherwise you are working blind.

Once you know that figure, ask what text will be put on the site in its place to compensate for the loss and meet or exceed the current keyword counts. Changing or deleting entrance page text without knowing what you are losing in terms of keyword mentions can easily change your rankings.
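
A keyword audit can be as simple as counting target keyword mentions in the old copy versus the proposed copy. Here is a minimal sketch; the keyword list and the two text files are placeholders standing in for your own exports:

```python
# Keyword audit: count target keyword mentions in the current copy vs. the proposed copy.
import re
from pathlib import Path

KEYWORDS = ["widget repair", "widget", "repair service"]   # placeholder target phrases

def keyword_counts(text, keywords):
    text = text.lower()
    # simple non-overlapping counts; short phrases also match inside longer ones
    return {kw: len(re.findall(re.escape(kw), text)) for kw in keywords}

old_copy = Path("old_homepage.txt").read_text(encoding="utf-8")   # exported current page text
new_copy = Path("new_homepage.txt").read_text(encoding="utf-8")   # proposed redesign text

old_counts = keyword_counts(old_copy, KEYWORDS)
new_counts = keyword_counts(new_copy, KEYWORDS)

print(f"{'keyword':<20}{'old':>6}{'new':>6}")
for kw in KEYWORDS:
    flag = "  <-- losing mentions" if new_counts[kw] < old_counts[kw] else ""
    print(f"{kw:<20}{old_counts[kw]:>6}{new_counts[kw]:>6}{flag}")

print("total words old/new:", len(old_copy.split()), "/", len(new_copy.split()))
```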

Far too frequently, marketing changes copy without any consideration of whether or not the former copy was optimized and bringing in keyword targeted traffic.

 

5) Converting text to images spiders can’t see for ranking

Because Googlebot is blind and can’t see images or text embedded in images, converting text to images spiders can’t see is an SEO website redesign mistake that frequently reduces rankings.  Although image alternative text is an important ranking factor on all the major search engines, it doesn’t carry as much ranking weight as plain text.

Follow Google’s Webmaster Guidelines, “Avoid converting text to images,” whenever possible in a website redesign.

Google’s guidelines are also very clear that image alternative text – the text in a webpage’s code that describes the content of images for search engine spiders and visually impaired visitors – must be short, accurate and descriptive. This is not the place for keyword stuffing! Keyword stuffing image alts is a beginner’s mistake that only fools humans who don’t see the code – not spiders – and it pulls penalties affecting the entire site when detected.
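
A simple audit of a page’s images – flagging any with no alt text at all, and any alt text long enough to suggest stuffing – can be scripted. A sketch, assuming requests and beautifulsoup4 and a placeholder URL:

```python
# Audit image alt text on a page: flag missing alts and suspiciously long (stuffed) alts.
# Requires "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"        # placeholder URL
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    src = img.get("src", "?")
    if not alt:
        print(f"MISSING ALT : {src}")
    elif len(alt.split()) > 15:          # arbitrary threshold; long alts invite keyword stuffing
        print(f"LONG ALT    : {src} -> {alt[:60]}...")
```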

 

6) Loading Speed

Loading speed is now part of Google’s algorithm. A site redesign is the perfect opportunity to re-code, condense externally referenced files, and achieve faster load times. Be careful not to inadvertently increase loading times by adding images and objects without checking how quickly each one loads.

Check loading speeds across a wide range of connections, including satellite, as many loading problems multiply hugely at slower speeds and under heavy server loads you have no control over. Check your website’s average loading speed under “Performance” in Google Search Console. Google strongly recommends webmasters monitor site performance using tools such as PageSpeed, YSlow, and WebPagetest.
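
As a quick supplement to those tools, raw fetch times can be sampled in a few lines of Python. A minimal sketch, with placeholder URLs, that times how long each page takes to download:

```python
# Rough load-time sample: time the full HTML download of each key page.
# Requires "requests". Measures server response + transfer only, not browser rendering.
import time
import requests

URLS = [
    "https://www.example.com/",           # placeholder URLs - use your key entrance pages
    "https://www.example.com/services/",
]

for url in URLS:
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    size_kb = len(resp.content) / 1024
    print(f"{url}  {resp.status_code}  {elapsed:.2f}s  {size_kb:.0f} KB")
```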

 

7) Be sure to keep spiders off the work-in-progress site

Keep spiders off the work-in-progress site either with 1) a password-protected staging site, or 2) a noindex, nofollow meta tag plus a disallow rule in the robots.txt file to keep spiders from crawling it. Google Search Console will provide you this coding under “Crawler access.”
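
It is worth spider checking that the block is actually in place before the staging site is shared anywhere. Here is a minimal verification sketch, assuming requests and beautifulsoup4 and a placeholder staging domain: it confirms the staging site either demands a password (HTTP 401/403), disallows crawling in robots.txt, or serves a noindex signal.

```python
# Verify a staging site is actually blocked from spiders:
# password protection (401/403), a robots.txt Disallow, or a noindex signal.
# Requires "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup

STAGING = "https://staging.example.com"     # placeholder staging domain

resp = requests.get(STAGING, timeout=10)
robots = requests.get(STAGING + "/robots.txt", timeout=10)

password_protected = resp.status_code in (401, 403)
robots_blocked = robots.status_code == 200 and "disallow: /" in robots.text.lower()

meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
noindexed = ("noindex" in (meta.get("content", "").lower() if meta else "")
             or "noindex" in resp.headers.get("X-Robots-Tag", "").lower())

print("password protected:", password_protected)
print("robots.txt blocks :", robots_blocked)
print("noindex signal    :", noindexed)
if not (password_protected or robots_blocked or noindexed):
    print("WARNING: staging site appears crawlable and indexable!")
```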

It is a major SEO website redesign mistake to allow the staging site to be crawled and enter Google’s index. The result can be GoogleBot seeing the content first on the work-in-progress site and viewing the beta site as the authoring site. It then views the final publishing of the website on the main domain as duplicate content of secondary importance.

What happens is that just when the release of the new website is announced, no one can find it on Google because it’s been knocked out of the rankings by the duplicate content URLs of the work-in-progress site. This SEO mistake can take weeks or months to straighten out, depending upon the crawl rate of the site, unless the URLs where Google first viewed the content are removed from the index using Google’s URL removal tool.

It’s also crucially important to keep spiders off any temporary page you put up that reads “Under Construction” or “Site Undergoing Maintenance.” It is always safest to assume spiders are just waiting to ambush you. Putting these types of pages up for even a few minutes is extremely high risk: if they get crawled and enter Google’s index, they are not keyword targeted, and it can sometimes take weeks to straighten out the problems this creates.

SEO of Site Architecture, Main Menu Navigation and Renaming URLs

8) Changing the main menu website navigation and URL structure can have major short term and long term SEO consequences

The website navigation and URL structure can change for several reasons – including a change of content management system that has the effect of rewriting URLs – and this will impact your rankings if not managed properly. If URLs change in the redesign, search engines will usually need time to index the new URLs and give them the same ranking weight the former URLs had earned through age strength.

Changing the site’s main navigation system – for example, converting your menu to JavaScript image-based pull-downs, fancy hover effects or Flash navigation – can make your website look completely different to search engines if they are unable to see and follow the dynamic links. This happens much more frequently than people realize if the work is not spider checked.

Never assume you can tell whether spiders can see elements of your menu by looking at the site with human eyes alone. On published pages, spider check whether search engines can actually recognize and follow the menu by clicking the “Text-only version” link in the upper right corner of Google’s cache of the page. For work-in-progress pages that have not yet been published, Google recommends using a Lynx text-based browser to see whether spiders can see crucial aspects of the website.

Another common SEO website redesign problem is homepage link inflation, which results from expanding the number of pages the homepage links to. How many pages the homepage links to can affect the rankings of the subpages, because Google PageRank – which primarily resides in the homepage as a result of the links pointing to it – is dispersed from the homepage to the subpages of the site via these links.

Google recommends having a site with a clear site hierarchy.  Link pages of primary importance from the homepage – pages of secondary importance from secondary pages. Google’s SEO Starter Guide says not to link every page of a site to every other page – this is the opposite of clear site hierarchy.

Google’s Matt Cutts says regarding how many links per page,  “So how might Google treat pages with well over a hundred links? If you end up with hundreds of links on a page, Google might choose not to follow or to index all those links. At any rate, you’re dividing the PageRank of that page between hundreds of links, so each link is only going to pass along a minuscule amount of PageRank anyway.”

How many links per page?

The crucial SEO element to grasp is that PageRank is finite, so keep links from the homepage to a reasonable number. Increasing the number of links in the main menu decreases the amount of PageRank each link receives – which can inadvertently decrease the rankings of your main entrance pages.
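
Counting the links on the homepage before and after the redesign is a five-minute check. Here is a minimal sketch, assuming requests and beautifulsoup4 and a placeholder homepage URL, that tallies internal versus external links on a page:

```python
# Count links on a page, split into internal and external, to watch for homepage link inflation.
# Requires "requests" and "beautifulsoup4".
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"            # placeholder homepage
host = urlparse(URL).netloc

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
internal, external = set(), set()

for a in soup.find_all("a", href=True):
    href = urljoin(URL, a["href"])          # resolve relative links
    if href.startswith(("http://", "https://")):
        (internal if urlparse(href).netloc == host else external).add(href)

print(f"internal links: {len(internal)}")
print(f"external links: {len(external)}")
print(f"total links   : {len(internal) + len(external)}")   # compare old vs. new homepage
```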

For “SEO Siloing” – which is focusing your PageRank vertically instead of spreading it too thin horizontally – link pages of primary importance from the homepage. Link pages of secondary importance from secondary pages.  This is to prevent spreading your PageRank too thin via the links from the homepage across secondary pages that don’t need it because they are not competing for rankings.

Here’s Google’s Matt Cutts video explaining how PageRank flows from the homepage, and that this is a secondary level of optimization after first focusing on content:

Increasing the size of the main menu can not only reduce the rankings of subpages, as PageRank is spread more thinly across more pages, but it can also change the keyword densities of every page the menu appears on and push the keywords you want to rank for lower in the layout.

Humans never see menus as a whole the way spiders do, but rather as single tabs or pull-downs at a time. Although Google has gotten better and better over time at recognizing menus depending upon how they are coded – such as with the menu tag in HTML5 – to spiders these menu items look like same-site link text and affect the keyword densities of the other body text on the page.

All search engines give the most ranking weight to what comes at the top of the page. Expanding the main overhead menu has the effect of pushing the keyword-targeted text you want to rank for – the text humans see – further down in the layout, reducing the ranking strength of those keywords.

Because menus are usually one-to-three-word items, they are inherently difficult to optimize – there’s usually not much in the main menu at the top of the page to optimize – so expanding the main menu without thinking through the SEO impact carefully can reduce your rankings. There is an SEO cost to each link, and Google says to keep links on a page to a reasonable number.

Every link from the homepage can help focus PageRank on your most important pages for top rankings – or it can devalue PageRank site wide. Homepage link inflation can devalue a website’s rankings in much the same way a currency is devalued when too much money is printed against a fixed underlying value. PageRank is finite – and there is an SEO cost to each link from the homepage, where most PageRank usually resides.

 

9) Renaming URLs without individually 301 redirecting the former URLs to each new one

Ideally, 301-permanently redirect the old URLs to the new ones. If this isn’t possible on a large site, make the top entry pages you’ve identified in Google Analytics the highest priority.

These 301s need to be put in place immediately after going live, before the spiders show up. A 301 is the only type of redirect that conveys the PageRank of the previous page.

Because humans can’t tell by watching a redirect resolve in a browser window whether it is also working for spiders as expected, always run a spider test on redirects to make sure they are working properly. Not spider checking these redirects can inadvertently result in big ranking losses.

Enter “Redirect Test” into the Google search box and test the redirect at one of the results, such as www.internetofficer.com/seo-tool/redirect-check.

Be sure to run a link validator test on your website to be certain all internal links are working properly after the website redesign.

Don’t delete files from the former site – 301 redirect them to manage 404s most effectively. After publishing, check Google Search Console under “Coverage” for 404s and be certain 301s are in place for any page returning a sizable number of “not found” errors. Always keep copies of the former files for backup.
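
A batch check of the former site’s top URLs right after launch catches most of these problems. Here is a minimal sketch, assuming requests and a plain text file of old URLs exported from Google Analytics, that reports which ones now return 404 instead of a 301:

```python
# Check the former site's top URLs after launch: each should 301 to a live page, not 404.
# Requires "requests"; old_urls.txt is a plain list of former URLs, one per line.
import requests

with open("old_urls.txt", encoding="utf-8") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    first_hop = resp.history[0].status_code if resp.history else None
    if resp.status_code == 404 and not resp.history:
        print(f"404, NO REDIRECT : {url}")
    elif resp.status_code == 404:
        print(f"REDIRECTS TO 404 : {url} -> {resp.url}")
    elif first_hop != 301:
        print(f"NOT A 301        : {url} (first hop {first_hop}, final {resp.status_code})")
    else:
        print(f"OK               : {url} -> {resp.url}")
```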

While 301 redirects sometimes show right away in Google Rankings, they can sometimes take two to five weeks for Google’s algorithms to assess and finally display with full ranking weight. Be prepared for a dip in traffic depending upon how many 301’s are in place.

Best whenever possible to keep former top content pages at the same URLs. Because website redesigns don’t always go as expected on Google, and any type of problem solving involves isolating variables, it is always best, if possible, to make changes in URL structure separate from changes in content, and to make any change of IP address a separate step. When too many things change all at once, it can be extremely difficult to trace the source of problems when they occur.

Another major issue with 301 redirects on site redesigns, whenever URLs are changed, is the inadvertent creation of redirect chains. Redirect chains – redirects to redirects – create problems for search engine spiders due to latency: a backlog of extra server handshakes that each require extra time to execute. Once certain thresholds are reached – which can vary greatly under different server loads – problems increase exponentially, squeezing the redirects through the data stream in delays humans barely detect but spiders don’t stick around for.

Listen closely to what Matt Cutts says in this video regarding latency and redirect chains, which work at one or two hops – maybe three. Past that point you are in the danger zone, and at five or six the odds are close to zero that spiders can follow it. The redirect chains discussion begins at the 2:45 mark:

The problem that frequently happens, particularly on large corporate sites, is legacy redirect rules placed in the .htaccess file and forgotten. If there are a lot of links pointing to these former URLs and the chain reaches three hops, it becomes only a maybe that spiders can follow it during heavy server loads. When redirect chains don’t work, the link juice they convey from the former URLs is lost.

The results can be disastrous with rankings wiped out overnight.

Another problem is that the extra server load and latency of redirects slows everything down – and site speed is a ranking factor. When pages don’t load quickly enough due to latency from redirect chains, they don’t get indexed, or don’t get indexed consistently, and lose age strength.

One way to search for redirect chains is to go into Google Analytics, find the top-ranking pages from the former URLs, and put them through a redirect checker such as Internet Officer’s to see whether a redirect chain has been created.
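
The same idea can be scripted: follow each former URL and count the hops. Here is a minimal sketch, assuming requests and placeholder former URLs, that flags chains of two or more hops:

```python
# Detect redirect chains: follow each former URL and count the redirect hops.
# Requires "requests".
import requests

OLD_URLS = [
    "http://www.example.com/old-products.html",     # placeholder former URLs
    "http://www.example.com/old-services.html",
]

for url in OLD_URLS:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = [f"{r.status_code} {r.url}" for r in resp.history]
    print(url)
    print(f"  hops: {len(hops)}  final: {resp.status_code} {resp.url}")
    if len(hops) >= 2:                               # 2+ hops is already a chain worth flattening
        for hop in hops:
            print("   ", hop)
        print("  WARNING: redirect chain - point the old URL directly at the final destination")
```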

 

10) Changing Domain Names

The age strength of a domain is a very important ranking factor. Change your domain name without carefully managing the SEO and you can very easily lose your Age Strength and Link Authority.

New domains on Google – ones that do not have a powerful former domain’s links being 301 redirected to them to give them Age Strength – go through what is known as the Google Sandbox for immature sites, where you won’t get rankings for competitive searches for 3–14 months, depending upon how competitive the search is. During that time you will continue to get rankings for less competitive searches such as your company name.

Not understanding that changing your domain name can kill your Google traffic – because of the Google Sandbox for immature sites and/or the loss of Age Strength and Link Authority – is a major SEO website redesign mistake that can do very serious damage to a company if not handled properly. Links are the most important part of Internet marketing!

Google Search Console provides a domain forwarding (“Change of Address”) feature with which you can inform Google that a previous domain is being forwarded to a new one, to help maintain the Link Authority of links from the former domain to the new one.

The domain forwarding feature in Google Search Console can help you buy time by maintaining the integrity of your link strength in the short term. However, the best websites run spiders on their own sites to check for redirecting or dead links – and once site owners realize that links on their site now redirect to a different domain than the one shown in the link text, or that the links have gone dead because the former domain is no longer redirecting, they pull them from their site.

This happens much more quickly than most people realize, and within a month a site can see a major loss in its Link Authority that will not be compensated for by the forwarding feature in Google Search Console, because the old links simply no longer exist. Links that took years to acquire can be lost very quickly, and it can take companies years to recover in the rankings.

When changing domains, it’s crucial to do the critical outreach to the top sites linking to you, asking them to update their links, to make your best effort to retain Link Authority – or you will see a decline in rankings when site owners pull their old links to you, which you have no control over.

Whenever old domains are redirected to a new domain as part of a site redesign – which happens, for example, with company rebrandings – it is imperative to be certain the old domain is redirected only after Google has crawled the new domain and is showing it in the index you hope to rank with going forward. If Google sees the new site redesign via the old domain name first – as a result of crawling the redirect instead of first indexing the content on the new domain – it will view wherever it saw the content first as the primary authoring site. This can take weeks to straighten out and for search engine spiders to get right.

The problem here usually results from the webmaster putting all the redirects from the old domain to the new domain in place at once, rather than first checking to be certain the new domain is showing in Google’s index. The crucial point to understand is that pointing two links that read as different URLs at the same content creates duplicate content on search engines, and Google will only show one – usually whichever one it saw the content on first.

Never forget SEO is a “one strike and you’re out” environment. Make any one of these top ten SEO mistakes with a website redesign and it can undo everything else you hoped to accomplish!

 

When The New Website Goes Live

When the website redesign goes live, begin checking your Google cache to make sure the site is being crawled properly. This is critical because you can never assume it is crawlable by spiders using human eyeballs alone – not until the work is actually crawled and appears in Google’s index.

If you don’t spider check your work, how do you know it works? You do that by moving your cursor to the right of your Google listing so the “Cached” link appears, clicking it, and checking the cached date. Also, use the URL Inspection tool in Google Search Console to make sure a page is live, accessible and crawlable by search engine spiders.

The reason this is so important is that when problems arise at this point – and this can go wrong for a wide range of reasons far more frequently than most people realize – Google only gives you so many hours to correct them before you fall out of the rankings.

Google’s algorithms are time based and Google doesn’t send people to websites that don’t work on Google.  It is very easy to time out and fall out of the rankings trying to fix problems that emerge at the point of a new site launch. When that happens, it’s right at the moment when notices have gone out to “Check out our new website” and no one can find it on Google!

If you don’t spider check your work to make sure it is entering the search engines properly, then when problems occur and you time out and fall off Google, you lose the ability to spider check your work in real time and figure out what the problem is.

You must always be prepared to revert to the last published version to restore rankings and traffic first, then attempt the new site launch again more carefully once the domain is back in the search engines, making sure any problems that arise are dealt with immediately before you time out again.

Once the new site is appearing in Google’s cache, click back and forth between the “Text-only version” and “Full version” links in the top left hand corner of Google’s cached page. Is everything humans see in the full version also visible in the text-only version search engine spiders see for ranking? You will only receive rankings for what spiders can see in the text-only version. If any portion of your site’s text is not visible when clicking the “Text-only version” link, you need to correct the problem immediately!

Don’t forget to transfer your existing Google Analytics code to the new site to maintain the historical continuity of your traffic data. Google Analytics is one of your most important SEO tools because it tracks the search traffic Google is actually sending you. Together with Google Search Console, which provides your site’s diagnostics and shows how it is appearing on Google, these are invaluable for figuring out what happened if you overlooked anything on this checklist and find your site redesign has resulted in a serious loss of rankings and traffic.
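
One quick way to confirm the tracking carried over is to check that the same measurement ID appears in the source of the new pages. A minimal sketch, assuming requests and a placeholder Analytics ID and URL list:

```python
# Confirm the existing Google Analytics measurement ID is present on the new pages.
# Requires "requests". Replace the ID and URLs with your own.
import requests

GA_ID = "UA-12345678-1"          # placeholder: your existing Analytics property ID
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    status = "found" if GA_ID in html else "MISSING"
    print(f"{status:>7}  {url}")
```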

You always want to get an SEO involved at the earliest stages of a site redesign to make sure it is search engine friendly from the bottom up. Never forget that Google always has the last say in whether people view the website redesign as optimized and building traffic – or devalued and losing traffic.

This is why following Google’s Webmaster Guidelines and knowing what Google is rewarding in the rankings is so crucial to avoid SEO website redesign mistakes that lose search engine rankings and traffic.