SEO

What is Search Engine Optimization or SEO?

SEO is the process of making a website that works for both human visitors and the search engine spiders that crawl and rank your site so people can find you on the internet. SEO is a crucial success factor for internet publishing: it brings free, well-targeted traffic to your website by earning the top rankings it deserves.

While print publishing is “distributed”, internet publishing is “found” as the result of a search. There are only ten spots on Page One – and over 80% of web searchers don’t go past the first page of search results – making this a very competitive environment.

Most websites have issues holding them back from the top rankings they could achieve. This is because much of the SEO environment is invisible to human eyes – such as link strength or the use of correct redirects – and it takes special software, knowledge and insight to perceive and fully optimize a site’s ranking potential.

Don’t ever expect your web developer – who is primarily concerned with the look and feel of your site and its daily operation – to also be a search expert able to keep up with Google’s latest algorithm updates. Like law or medicine, SEO is very much a “practice” where insight is gained over time, with an extremely expensive learning curve in what can feel like a “one strike and you’re out” ranking environment!

SEO requires more specialized knowledge and a different software base than your web developer should be expected to provide at a high professional level while also keeping up with everything else on your site! Google is constantly changing – it’s a full-time job to keep up with the changes, which is why Google hired an SEO specialist for its own site. Keeping up with the latest algorithm updates is more than most web developers can find time to do.

How does SEO work?

SEO involves extensive on-site technical and content analysis, including keyword research, content development and optimization, and technical fixes for issues like loading speed or internal linking within the site – as well as off-site SEO such as links from other websites and listings in important trade directories.

Google publishes webmaster guidelines to help you build a website that works on search engines – which can be very different from what works for human visitors. Many SEO problems result from Google’s guidelines not being followed. Search engine optimization therefore begins with an SEO audit to determine how closely your site was built in alignment with the guidelines that must be followed to achieve top rankings.

How long does it take to see results from SEO?

That depends upon the specific strategy being attempted. It can also depend upon how competitive the targeted search term is and how old your website is.

On-site changes like making page titles, headlines and copy keyword-targeted can often show dramatic results right away on websites that have not previously been optimized. However, the very thing that makes Google the world’s most trusted search engine also makes it the hardest to spam, and Google doesn’t hand you all your final rankings at once.
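
To make that concrete, here is a minimal Python sketch – assuming the third-party requests library, and a hypothetical URL and phrase – of how one might quickly check whether a page’s title tag and main headline actually contain the keyword phrase being targeted:

```python
# A rough on-page targeting check. The URL and phrase are hypothetical
# placeholders; swap in your own page and target keyword phrase.
import re
import requests

def check_on_page_targeting(url: str, phrase: str) -> None:
    html = requests.get(url, timeout=10).text
    # Regex extraction is crude but fine for a quick audit of one tag each.
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    for label, match in (("title", title), ("h1", h1)):
        text = match.group(1).strip() if match else ""
        found = phrase.lower() in text.lower()
        print(f"{label}: {'contains' if found else 'MISSING'} {phrase!r} -> {text!r}")

check_on_page_targeting("https://www.example.com/", "seattle wedding photographer")
```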

With many types of SEO initiatives such as content marketing or link building, the final results often take about five months to fully mature and show in the rankings.

For very competitive search terms, remember you are competing with websites that have spent years building reputation and external links from other websites, which is not easy to overcome in a short period of time. So results also depend upon how competitive the search term is and what the competition – whose spot on Page One you must take – is doing.

How do you track SEO results?

The two most important tools for tracking SEO traffic and diagnosing SEO performance are Google Analytics and Google Search Console. Both are free from Google and can be set up on a site relatively easily by your web developer.

These two programs present different types of information and numbers, with Google Analytics primarily focused upon traffic – including traffic from all sources and other search engines – and Google Search Console being more of a diagnostic program that tells you how you are performing on Google only.

No matter what other tracking programs you have on your site, these two Google traffic and diagnostic programs are the most important for SEO because they tell us how Google sees the site – which no other program can do.

How much does SEO cost?

Every website has a unique history and size, with varying degrees of optimization already in place. The single greatest determinant of how much time needs to be spent on your site is how closely Google’s Webmaster Guidelines were followed when it was created. Google publishes these guidelines to help people build sites that work on search engines, and they must be followed for a site to reach its full potential.

For an initial SEO audit and recommendations, think in terms of a $1,000 commitment to get started – ten hours of work on your site at a rate of $100 an hour. How we proceed from there can be determined as we go, based upon your specific needs and budget, after the most important technical SEO issues are covered in the initial review of your site.

Google never stops changing, and the top competition never stops working to win the ten most valued spots. SEO should be viewed as an ongoing process to meet or beat the competition.

How are SEO and Pay Per Click or PPC different?

One way to think of the difference between organic search results and Pay Per Click sponsored advertising is the difference between the editorial content and the advertisements in a newspaper. Which do you trust more? How many Pay Per Click ads have you clicked on in the last month?

Pay Per Click is the only way to guarantee a position in the search results. There can be several reasons for doing Pay Per Click, including trying to stay one spot above a competitor who appears higher than you in the results for a search on your own company name – because they are buying those search results to capture visitors looking for you.

It can also be valuable for filling in the gaps: important keywords you would like to rank for in your field but – at least for now – cannot effectively compete for, given your organic ranking strength compared to the competition.

However, Pay Per Click is expensive, and as soon as you stop paying for the clicks the traffic stops. Pay Per Click also requires a lot of ongoing attention, because the most money is usually made not with the top bid in the top spot or the bottom bid in a lower spot, but at a sweet spot in between – which can take a fair amount of time to monitor and maintain situational awareness of.

What’s more, Google admits that over 15% of PPC clicks are fraudulent – usually clicks made by the competition to run up your costs and drive you out of business.

There are even “click-a-minute” sweatshops in Uganda fraudulently clicking targeted PPC campaigns: for a teenage boy just mustered out of a rebel army who can read English and click links at the rate of one a minute, it’s more profitable than working in the fields under an equatorial sun.

Dollar for dollar, properly applied organic SEO is a far better investment because it lasts year after year and builds upon itself, earning trust for your website through top organic rankings.

What is Local SEO?

Local SEO is optimization to rank for a location-specific search, such as a restaurant or a type of business in a particular city. Local SEO is a very effective way of reaching local customers, and Google’s research says, “50% of consumers who conducted a local search on their smartphone visited a store within a day, and 34% who searched on computer/tablet did the same.” Local search has a direct impact on in-store traffic.

Local SEO involves getting your business listed in Google My Business and making sure your profile is complete and accurate, with your name, address and phone number consistent across all local search channels such as Yelp and many others. It also involves using local business schema markup to enhance your listing and make it stand out from the competition.
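
As an illustration, here is a minimal Python sketch that generates LocalBusiness schema markup (JSON-LD, per schema.org) for a hypothetical business – the name, address and phone number below are placeholders:

```python
# Generate a LocalBusiness JSON-LD block to paste into the page's <head>.
# All business details below are hypothetical placeholders.
import json

business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(business, indent=2))
print("</script>")
```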

What are backlinks?

Backlinks are links from other websites pointing to your site. Search engines treat them as votes of confidence, and the link juice they convey is a major factor in rankings – which is why preserving them with correct redirects, as covered below, matters so much.

How do I choose the right keywords?

The right keywords for you may not be the keywords with the highest search volumes. And no matter how many years you have worked in a field, what search terms people are actually typing into the Internet versus what insiders believe their customers are looking for almost always holds surprises!

You rarely want to optimize for a single keyword, as that is almost always too broad an approach for effective SEO. Keyword phrases are what you want to optimize for.

It can take a site that has been around a long time and gained a lot of external link authority to compete for the most competitive search terms. However, sites that have not yet achieved that ranking strength can often compete very successfully by making a realistic assessment of their ranking ability and targeting less competitive “long tail” searches – longer phrases of three or more words that are more specific and less fought over.
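
For example, a toy Python sketch of sorting a hypothetical keyword list into broad head terms and the long-tail phrases a newer site can realistically target:

```python
# Keep only long-tail candidates: phrases of three or more words.
candidates = [
    "shoes",                                      # broad head term
    "running shoes",                              # still very competitive
    "waterproof trail running shoes for women",   # long tail
    "best running shoes for flat feet",           # long tail
]

long_tail = [kw for kw in candidates if len(kw.split()) >= 3]
print(long_tail)
```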

What is duplicate content?

Duplicate content is the enemy of consistent professional rankings! Duplicate content is when more than one version of your content occurs in Google’s index. When this happens, search engines must choose which version to display, because they do not want their search results filled with duplicate versions of the same result.

While Google says they usually do a good job of determining which version is most important, they warn “It may not be the one you expect.” That’s one of Google’s biggest understatements!

It’s important to understand that in most instances duplicate content is a filter – not a penalty, unless it was intentionally created to spam search engines. But it can feel like a penalty when the page you have earned top rankings with suddenly falls back to position one hundred because Google mistakenly selected the wrong version!

It happens all the time – and usually strikes without warning, when you can least afford to lose your rankings!

Duplicate content makes consistent professional rankings impossible because it becomes a roll of the dice which version Google selects – and Google may pick a copy with no external links pointing to it instead of your usual page, which has dozens of links pointing to it and is ranking for precisely that reason.

The way to prevent duplicate versions of your material from entering Google’s index is to clearly understand the danger duplicate content poses to Internet publishing and simply not allow it to exist on your website.

One way to check for duplicate content is to take a unique snippet of text from a page and put it in quotation marks (“ ”), which tells Google “same words in same order”. Enter it into the Google search box: if more than one result shows up, there is duplicate content in the index that can potentially hurt your rankings.
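
A minimal Python sketch of building that quoted “same words in same order” search, using a hypothetical snippet – the printed URL can be pasted into a browser:

```python
# Build a quoted-phrase Google search URL for a duplicate content check.
from urllib.parse import quote_plus

snippet = "a unique sentence copied from one of your pages"  # hypothetical
query = quote_plus(f'"{snippet}"')
print(f"https://www.google.com/search?q={query}")
```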

Does website loading speed impact SEO?

Yes. But as with many things in SEO, the causes of slow loading speed and how it can affect SEO and rankings are more complicated than simply having a fast or slow site.

Google has released several algorithm updates over the years targeting website loading speed. However, Google usually clarifies after these launches that unless your site is the slowest of the slow, you probably won’t see a drop in rankings – and fast sites won’t see a ranking increase from getting even faster.

However, here are some ways slow loading speed problems often creep up on site owners without their knowing it.

If you are on a shared server, the hosting company’s interest in making more money can lead to piling on too many users, whose combined load can peak at unexpected times. Look under Site Speed in your Google Analytics: if there are spikes in your loading times when nothing unusual was happening on your own site, then perhaps someone else on your shared server got favorable press coverage, their usage surged and the server’s load spiked.

If this happens occasionally, it usually will not impact rankings. However, if sudden unexpected loading speed spikes start showing up on a regular basis – and on dates you see your search engine traffic tanking – the two may be related, and your hosting company probably doesn’t even know it! Tell them and take appropriate action.

Another way slow loading speed sneaks up on site owners is failing to check their speed – and to check it repeatedly, not just once. There can be quite a variance at different times depending upon server loads and how different elements happen to load in the data stream, which is pretty much a roll of the dice.

Google offers PageSpeed Insights. Another good loading speed tool is tools.pingdom.com, which provides free waterfall speed tests that identify individual elements’ loading times, such as images and CSS.
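
PageSpeed Insights also has a public API. Here is a minimal Python sketch – assuming the requests library and the v5 endpoint; treat the response field names as illustrative, since they can change – of fetching a mobile performance score:

```python
# Query the PageSpeed Insights v5 API for a (hypothetical) URL.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(
    API,
    params={"url": "https://www.example.com/", "strategy": "mobile"},
    timeout=60,
)
data = resp.json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```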

Loading speed issues can also occur when converting a site to HTTPS, because an SSL security certificate requires more server work to execute and can greatly slow already slow sites. Many sites experience ranking declines after converting to HTTPS because the added server load was not adequately planned for.

What is HTTPS SSL Security Certificate conversion?

The “S” in HTTPS stands for “secure”, and its purpose is to make website communication more secure.

At first glance it sounds relatively simple: install a Secure Sockets Layer (SSL) certificate on your server. And in a perfect world HTTPS has many important advantages for your site.

However, from an SEO standpoint the problem is that converting your site from HTTP to HTTPS means discarding every single URL your current rankings depend upon and moving to a new address. In search engines’ eyes this is a completely different, new web address – and the site may or may not convert smoothly depending upon how carefully the task is undertaken.

Web developers frequently underestimate the complexity of what must be done for a site to consistently convert on search engines, which takes a lot of time and attention to details like recoding every former HTTP mention in the code to HTTPS. This can be a very long and tedious task, and if not done correctly the website simply never fully converts on search engines or regains its former rankings.

Even if everything goes perfectly, expect it to take about two and a half months for Google to fully restore the rankings and traffic you formerly had – which is why this should be done during low-traffic times of year, not during peak season. Google Search Console properties for both the HTTP and HTTPS versions must be watched closely to see whether the site is actually converting as hoped.

This cannot be discerned with human eyes alone. Failing to watch Google Search Console closely – making sure the HTTPS sitemap shows a steadily increasing number of pages indexed, and that the index coverage report continues to grow and convert to HTTPS – can lead to doom!

What regularly goes wrong with HTTPS conversions is that not all code is rewritten to reflect the new HTTPS addresses, and redirect chains are inadvertently created – redirects pointing to redirects pointing to redirects.

Redirects stop working reliably on search engines after two hops, and it’s very easy for coders to take shortcuts converting HTTP to HTTPS that create redirect chains – which effectively kill the link juice to a page because spiders can’t follow the chain. HTTPS conversion requires rigorous testing to make sure no redirect chains are inadvertently created.
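
One way to test this is to follow a URL the way a spider would and count the hops. A minimal Python sketch, assuming the requests library and a hypothetical URL:

```python
# Follow redirects and flag chains longer than two hops.
import requests

def check_redirect_chain(url: str) -> None:
    r = requests.get(url, allow_redirects=True, timeout=10)
    for hop in r.history:  # each intermediate redirect response
        print(f"{hop.status_code} -> {hop.headers.get('Location')}")
    print(f"Final: {r.status_code} {r.url}")
    if len(r.history) > 2:
        print("WARNING: redirect chain longer than two hops - spiders may give up.")

check_redirect_chain("http://www.example.com/old-page")  # hypothetical URL
```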

Because HTTPS requires more “handshakes” with the server, it increases server load and can slow down servers not ready for the increase – which, if not monitored closely, can cause rankings to be lost. Problems with redirect chains and the added HTTPS load show up as spikes under Site Speed in Google Analytics. Once certain tipping points in server load or redirect timing are reached, latency sets in and slows everything down exponentially – not arithmetically.

Bottom line: converting HTTP to HTTPS often requires much more time and attention to get everything right and working on search engines than is usually anticipated, budgeted and planned for – and falling short can lead to disastrous results.

It’s not as simple as putting an SSL certificate on your server and keeping your search engine rankings and traffic!

Should SEO be done in stages?

In most instances the answer is yes, for several reasons. One is that any time you change the content of a website you are changing the ranking signals it sends – which can work for good or for bad depending upon the results.

Most SEO elements, like keyword densities or the appearance of certain phrases in link text, work in your favor up to a certain point. Past that point – which can often be very difficult to discern in advance – anything laid on too thick, such as a keyword density, can work against you in the rankings big time.

When this happens, it can make your traffic report look like someone suddenly drove off a cliff!
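
As an illustration of how such a threshold can be checked in stages, here is a toy Python sketch computing keyword-phrase density – the 3% warning level is purely an illustrative assumption, not a published Google number:

```python
# Compute what percentage of the copy is occupied by the target phrase.
def phrase_density(text: str, phrase: str) -> float:
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

copy = ("Our wedding photographer captures candid moments. "
        "As a wedding photographer in Boise, we travel statewide.")
density = phrase_density(copy, "wedding photographer")
print(f"{density:.1f}%" + ("  (may be laid on too thick)" if density > 3 else ""))
```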

Getting these SEO elements just right is far more nuanced than most people realize and is best done in steps. One of the most important reasons SEO should be done in steps is to check and confirm every assumption about what we think is good for SEO – and then verify it by actually checking the search engine results.

At the end of the day, Google always has the last say. Results must be checked because this environment is simply too complex to make every SEO call with 100% accuracy 100% of the time, and if anything doesn’t go as expected, a mid-course correction is required to stay on the good side of the ranking algorithms.

Another reason SEO should be done in stages is that SEO is an ongoing practice – something never entirely finished on a site. We give it our best shot, watch the results, fine-tune our approach and amplify what’s working best. That is rarely done in a single sitting and should be expected to proceed in stages.

What are 301 and 302 Redirects?

Redirects are one of the most important parts of SEO, and understanding the specific commands these two different redirects give to search engine spiders is essential to having a site that works on Google as expected.

301 redirects are permanent redirects and are the only type of redirect that conveys link juice through your site. Being permanent, a 301 tells search engine spiders to no longer come back to the old address.

If a 301 redirect is applied and later turns out to be a mistake, it can take considerable time for Google to come back and see the change, disrupting rankings and traffic. Using the URL Inspection tool in Google Search Console and requesting a crawl can speed up how quickly Google sees the corrected redirect once it is in place.

A 302 redirect is a temporary redirect: it tells spiders this is a temporary change and to keep coming back to the original address. 302 redirects do not convey link juice. So if a 302 redirect is applied to a page whose search engine rankings were built on links pointing to it, the flow of link juice those rankings depended upon is cut off – destroying the former rankings and the traffic they brought.

Misusing 302 redirects this way is a common SEO mistake.

Correctly coding redirects is among the most important parts of SEO because redirects preserve or destroy the flow of link juice that a site’s rankings depend upon. Not understanding this crucial difference between 301 and 302 redirects can destroy rankings in the blink of an eye. It happens all the time!

It is not possible for human eyes alone to know whether a redirect is properly coded and working for search engine spiders as well as for people. Always use a redirect checker (Google “redirect checker”) and be sure to spider-test redirects as soon as they go live, so you do not do irreparable damage to a link profile that took years to acquire.
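
A minimal Python spider-test sketch, assuming the requests library: it checks a single hop without following it, so you can confirm the redirect is a link-juice-passing 301 rather than a 302 (the URL is hypothetical):

```python
# Inspect one redirect hop without following it.
import requests

def check_redirect_type(url: str) -> None:
    r = requests.head(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "(none)")
    print(f"{url} -> {r.status_code} -> {location}")
    if r.status_code == 302:
        print("WARNING: temporary redirect - no link juice will flow.")

check_redirect_type("https://www.example.com/old-page")  # hypothetical URL
```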

Never change a URL that has external links pointing to it without 301 permanently redirecting the former URL to the new URL, or you will destroy the link juice that page’s rankings depend upon. Best SEO practice is to always 301 redirect old pages rather than delete them, preserving the link juice from other websites.

The best sites that link to you run spiders on their own sites to spot and eliminate bad links that no longer work. If your redirects are not properly coded and checked as soon as they go live, you can lose valuable link juice very quickly!

When that happens, links that took decades to build can be lost very quickly, because the best sites that link to you don’t want to lose their own rankings by having dead links on their pages. These links usually can never be recovered – and they are often the best, most important links to the site!

What are Sitemaps and why are they important?

There are two types of sitemaps: 1) HTML sitemaps, which are often published as a page on the website that humans can see and search engine spiders can follow, and 2) XML sitemaps, which use a different format and are filed directly with search engines, such as in Google Search Console.

Sitemaps help communicate your URL structure to search engines and tell them which pages to crawl. By submitting one, you will see in Google Search Console how many pages of your sitemap are indexed – important to study to make sure your entire website is being seen by search engines.

The sitemap protocol is case sensitive. Your URLs and the sitemap entries must match exactly, character for character and case for case, or the entry will return a 404 Not Found to search engines. For this reason it is always best to follow sitemap protocol and keep all URLs lower case to avoid errors.
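
For reference, here is a minimal Python sketch that writes a bare-bones XML sitemap following the sitemap protocol, forcing every URL to lower case as advised above (the page list is hypothetical):

```python
# Write a minimal sitemap.xml; the URL list is a hypothetical placeholder.
urls = [
    "https://www.example.com/",
    "https://www.example.com/services/seo",
]

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for u in urls:
    lines.append(f"  <url><loc>{u.lower()}</loc></url>")  # keep URLs lower case
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))
```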

Sitemaps are recommendations, not commands. There can be a number of reasons why URLs submitted in a sitemap in Google Search Console are not indexed. A frequent mistake is filing a non-www sitemap for a website that is www, because these are viewed as completely different websites.

It’s very important to monitor in Google Search Console whether the URLs in your sitemap are actually being indexed, to make sure the site is working properly for SEO. This is the only tool that will tell you that, and it requires filing a sitemap to learn whether what you hope Google is indexing is actually working on search engines for rankings.

What is White Hat and Black Hat SEO?

White Hat SEO is ethical SEO that follows Google’s and other search engines’ guidelines. Integrity on the Internet is often defined in terms of whether what you see is what you get: complete transparency and fairness. It’s about not trying to game the system for personal advantage at the expense of others who are playing fairly.

Black Hat SEO uses unethical “tricks” to game the system – like creating unnatural link schemes, or hidden text that search engine spiders can see but human eyes cannot – which may produce short-term gains and long-term damage!

Every time you get one of those “We’ll make you #1 on Google” unsolicited emails or telephone calls, the first thing you should do before giving the offer any serious consideration is try to find them on Google. If they are nowhere to be found in the rankings, it is very possible they have been banned from Google for their methods!

Don’t take their word for anything – ask Google whether they are able to rank themselves before believing they will get you rankings!

If people promise you pie-in-the-sky offers like “We’ll put your link building on autopilot”, they will do far more harm than good to your website. Don’t even think about Black Hat SEO!

Ethical SEOs will report you to Google – and feel very good doing it – so that cheaters who hurt legitimate sites’ rankings are called out.

What is Google Panda?

Google Panda was the first of several algorithm updates Google gave cute animal names that could nonetheless have devastating consequences for sites on the wrong side of what the update targeted. Panda first rolled out in February 2011, targeting “low-quality sites” or “thin sites” – otherwise known as content farms and scraper sites – with the intention of raising the rankings of higher-quality sites.

This update affected the rankings of almost 12 percent of queries at the time and was followed by several more Panda updates over the years. Google provides a list of 23 questions it uses to evaluate “What counts as a high-quality site?” Google Panda affected the rankings of the entire site – not just individual pages on the site.

What is Google Penguin?

Penguin was the next cute-animal-named algorithm update. It rolled out in April 2012 and targeted sites using what is now deemed unnatural link building. Before Penguin, sites could get top rankings by amassing the most links that read the keyword in the link text.

With Penguin, if you got too many links reading the same keywords in the link text – links from websites not related to your field – the site would draw what Google first called an “over-optimization penalty”. It was quickly pointed out: doesn’t optimum mean “ideal”, and how would you know if you were overly “ideal”? And what would you do to correct the situation?

Once SEOs all over the world put their heads together and compared their findings, the consensus was that no more than about 5% of your links from other websites should be keyword-targeted link text from sites unrelated to your field. You could have more than this percentage if the links came from quality sites related to your field.

This required monitoring your website’s external links with a link-checking tool to get an accurate sense of your link profile. If you found links pointing to your site that you wished to disavow – wiping them from your link profile after this change in how Google perceived such links – you could do so with the Disavow tool in Google Search Console.
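
As a sketch of the kind of monitoring involved – with hypothetical backlink data exported from whatever link-checking tool you use, and the ~5% figure being the rule of thumb described above, not an official threshold:

```python
# Estimate what share of backlinks use the same exact-match anchor text.
backlinks = [  # (anchor text, linking domain) - hypothetical sample data
    ("acme widgets", "blog.example.org"),
    ("acme widgets", "news.example.net"),
    ("Acme Inc", "partner.example.com"),
    ("click here", "forum.example.io"),
]

target = "acme widgets"
share = 100.0 * sum(a.lower() == target for a, _ in backlinks) / len(backlinks)
print(f"Exact-match anchors: {share:.0f}%"
      + ("  (over the ~5% rule of thumb)" if share > 5 else ""))
```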

The problem was that for many years the penalty would only be lifted when the next Google Penguin update occurred – and those were many months apart. In late 2016 Google announced that Penguin was now part of its core algorithm and so was being continuously updated.

One of the positive effects of the Penguin updates has been to help ensure that high-quality content people link to naturally rises to the top of search results, instead of sites with spammy links.

What is Google Hummingbird?

The Hummingbird update came in late 2013 and placed a greater emphasis on natural speech queries as mobile devices came to the forefront. This made content marketing that answers the frequently asked questions (FAQs) of any given field in spoken form – such as “What is…”, “How do you…”, “Why are…”, “Where is…” – much more important to optimize for.

It also changed Google’s ranking algorithms to look more deeply into the structure of a website – beyond the homepage, which usually has the most external links pointing to it – to find the most relevant content on the site overall.

While keyword mentions continued to be important, Hummingbird paid more attention to natural writing than to forced keyword mentions by improving Google’s ability to interpret synonyms and long-tail keyword phrases.

Is SEO ever finished?

The simple answer is no. Google isn’t going to stop evolving anytime soon, and if you ignore Google’s algorithm changes, Google will ignore you!

It’s not that Google is intentionally making this hard for you. Google is constantly trying to make its rankings better, and the competition is constantly working on their sites to climb in the rankings, so what works today simply may not work as well tomorrow.

Every website is a work in progress, never fully finished, and SEO is the same. In fact, search engine algorithms reward you for improving your content and SEO, making ongoing, continuous effort a key ingredient of successful SEO.

SEO has become increasingly competitive over the years, and there are only ten spots on Page One for any given search. If your competition keeps improving their SEO over time, you need to as well – or you will lose your rankings to competitors who are steadily improving theirs and keeping a pulse on what Google rewards with top rankings.