SEO ethics concern your willingness to abide by the guidelines for webmasters set by the major search engines. If you don’t follow them, there is a good chance that your website will not rank high in a search; in an extreme case your site might be removed from the index altogether.
White Hat or Black Hat?
“White Hat SEO” means ethical SEO; “Black Hat SEO” means unethical SEO; however, there is a grey area aptly labelled “Grey Hat SEO”. Unethical in this context means fooling the search engines, or performing tricks that show one thing to a spider and something else to the human visitor. Some operators do not use such labels; they believe that anything the search engines do not catch is OK. We do not recommend Black Hat SEO. More on White Hat or Black Hat.
Seeing the big picture
The first task in performing SEO is to understand the end game for the website. What does the customer (or your boss or you) want to achieve?
There is always some reason why people build a website. Whether the purpose is commercial or not, a major objective is to attract eyeballs: targeted traffic, not random people. Typical conversion goals include:
- Sale – purchase something online
- Complete an enquiry form (lead generation)
- Sign up – subscribe to a magazine
Website owners always want the best for themselves, and they are paying you because you (their SEO) can satisfy that need. You need to know their “money terms”: the keyphrases that will bring them the most qualified traffic, the traffic that will convert into sales. You cannot optimise to reach a specific position (e.g. the Number 1 rank) for everything; you do your best based on your experience and learning, but then you have to hope for the best. The customer needs to understand this in a legal context.
The latest SEO slogan is “Ranking is dead”. Some American states are already seeing a version of Google Search with a “Customised for [location], [state]” result. Behavioural targeting will become more common, and it is possible that no two people will see the same sequence of search results. This makes it impossible to deliver an SEO service based on rankings alone, but it does not mean that we can ignore rankings. With the right preparation you can get a website to rank higher for numerous keyphrases, and therefore the website will get more targeted traffic.
Increasingly, websites that contain “engagement objects” will do well on the web. These can be one or more features that engage the reader, such as video clips, podcasts, forums, polls, surveys, forms, comments, ratings, reviews and so on. Most sites can, and should, implement at least one of these.
No two unoptimised sites pose the same SEO challenge. Some contain coding errors that don’t break the page but are nonetheless undesirable, as the page may render incorrectly in a future browser.
The following is an example of code bloat on a popular website that makes the page a little slower to load.
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Cache-Control" content="no-cache" />
This forces the server to serve a fresh copy of each page even if the visitor has a cached copy in their browser. Why tax the server? I’d omit these lines.
<meta name="ROBOTS" content="NOARCHIVE" />
This tells well-behaved search engine robots to not keep an archive copy of each page. Why? I’d omit this line.
<meta name="COPYRIGHT" content="© 1980-2008 ACP Magazines Limited" />
It is unnecessary to assert copyright in a meta tag. Mention it in the footer of the home page just once.
<meta name="KEYWORDS" content="XP Windows Linux Mac
Vista OS X iPhone APC Australian Personal Computer Australia
PC hardware dual-boot dualboot tech news" />
Never list keywords that do not appear on the page in question. Limit the tag to about 8 phrases or 15 words. Only Yahoo takes any note of this line, so it is not essential.
<meta name="AUTHOR" content="APC Magazine" />
I’d omit this line. No search engine uses it.
<meta http-equiv="CONTENT-LANGUAGE" content="en-US" />
This is not bloat, but en-au is the appropriate value for Australian websites.
<meta name="revisit-after" content="7" />
<meta name="keywords" content="" />
Duplicated empty tags. Remove.
<meta name="description" content="" />
Keep but fill with some compelling description.
This code isn’t âbadâ but wastes time and computer resources on a large site. The ârevisit-afterâ tag is copied blindly by many many ignorant web designers â in fact its presence is an indicator that the site needs SEO. It was created solely by and for a local Vancouver search engine called SearchBC whose spider VWBot_K uses that tag to schedule its crawls of websites within British Columbia. It is worthless for the rest of the world.
You can’t measure progress or otherwise unless you have a starting point and a suitable tool that measures changes from that point.
Page Element Checklist
The following checklist (applied to each page) is ordered by importance. It is not necessary to load the lesser-scored items onto every page, nor to contrive to load the heavier items on every page:
Question & Score
- Are there inbound links using keywords as anchor text? e.g. an external page linking to the home page with the text “London Accommodation” is better than the words “Click Here” linked to our home page (if we want to boost the ranking for “London Accommodation”). Similarly, text on internal pages of our site can have the words “London Accommodation” linking to our home page, but only if it looks OK. 10 points.
- Does the Title element (tag) start with the desired keyphrase for the page? e.g. “Web hosting specialists” suits a page about “web hosting”. 10
- Does the domain name contain the keyphrase? e.g. “londonaccommodation.co.uk” is a good* domain name for targeting the keyphrase “London Accommodation” (*requires backlinks that use this keyphrase as the anchor text). 7
- Does each page (or product) have just one URL on the site? 5
- Is there a Heading Level 1 (H1) containing the main keyphrase? 5
- Does the first sentence contain the main keyphrase? 5
- Does the pathname (directory) contain the keyphrase? e.g. /web-hosting/filename.php 4
- Does the filename contain the keyphrase? e.g. /domain-names/renew.php 4
- Are the desired keyphrases proximate to one another? 4
- Are other keyphrases towards the beginning of sentences? 1.5
- Are important keyphrases in italics or bold? 1
- Are other keyphrases present in the body text (not at beginning of sentences)? 1
- Where an image is used as a link are Alt tags present? 0.5
- Does the meta Description tag contain the main keyphrase? 0.5
- Does the meta Keywords tag contain the main keyphrase first followed by others? 0.25
- Are there keyphrases in the last paragraph of the page? 0.2
- Are there outbound links to major authority sites? 0.1
- Are title attributes (not the Title tag) used, e.g. on link, img, table, etc.? 0
- Is there a meta title (not Title tag) in the header? 0
- Are there keyphrases in comment tags? (not good) -5
Add your scores!
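Adding the scores up is simple enough to automate. The sketch below encodes the checklist weights above; the item names and the idea of feeding it a dict of yes/no answers are my own convenience, not part of any standard tool. You would still judge each answer by hand.

```python
# Checklist weights taken from the list above; the key names are
# illustrative shorthand for each question.
CHECKLIST = {
    "keyword_anchor_inbound_links": 10,
    "title_starts_with_keyphrase": 10,
    "keyphrase_in_domain": 7,
    "one_url_per_page": 5,
    "h1_contains_keyphrase": 5,
    "first_sentence_contains_keyphrase": 5,
    "keyphrase_in_pathname": 4,
    "keyphrase_in_filename": 4,
    "keyphrases_proximate": 4,
    "keyphrases_at_sentence_starts": 1.5,
    "keyphrases_bold_or_italic": 1,
    "keyphrases_elsewhere_in_body": 1,
    "alt_text_on_image_links": 0.5,
    "keyphrase_in_meta_description": 0.5,
    "keyphrase_first_in_meta_keywords": 0.25,
    "keyphrases_in_last_paragraph": 0.2,
    "outbound_links_to_authorities": 0.1,
    "keyphrases_in_comment_tags": -5,  # penalty: a spam signal
}

def score_page(checks):
    """Sum the weights of every checklist item answered True for the page."""
    return sum(weight for item, weight in CHECKLIST.items() if checks.get(item))
```

For example, a page whose Title starts with the keyphrase and which has a matching H1 scores 10 + 5 = 15.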
Finding Hidden Text
Press Ctrl+A to select everything on a page; selecting will reveal hidden text. Usually such text is at the bottom of a page, and a giveaway clue is a seemingly large bottom margin. Any hidden text must be removed.
Ctrl+A reveals the spam.
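The commonest trick behind such spam is text coloured the same as the page background. A crude automated check, sketched below under the assumption that the styling is inline and the background colour is known, flags any inline `color` that matches the background (it will miss external stylesheets, so Ctrl+A remains the reliable test):

```python
import re

def find_suspect_spans(html, background="#ffffff"):
    """Flag inline-styled elements whose text colour matches the page
    background -- a crude heuristic for the hidden text described above.
    The negative lookbehind stops 'background-color' from matching."""
    pattern = re.compile(r'style="[^"]*?(?<![-\w])color\s*:\s*([^;"]+)',
                         re.IGNORECASE)
    suspects = []
    for match in pattern.finditer(html):
        colour = match.group(1).strip().lower()
        if colour == background.lower():
            suspects.append(match.group(0))
    return suspects
```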
For basic traffic measurement use the statistics tool supplied in the hosting control panel. If none is installed then you should ask the website owner to install the free Google Analytics available at http://www.google.com/analytics/.
Search Engine Analysis
If you have an existing site you should perform some basic checks on the three major search engines:
- Number of inbound links (backlinks)
- Number of pages in the index
- Ranking for desired keyphrases
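The first two checks use the engines' site: and link: query operators; the third is simply searching each keyphrase and noting the site's position. A small helper, sketched below (the function names and the Google search URL pattern are my own assumptions), assembles the query strings:

```python
from urllib.parse import quote_plus

def check_queries(domain, keyphrases):
    """Build the query strings for the three basic checks. Paste each into
    an engine's search box, or URL-encode it with as_url() below."""
    return {
        "pages_indexed": f"site:{domain}",
        "backlinks": f"link:{domain}",
        # Search each phrase manually and note where the site ranks.
        "rankings": list(keyphrases),
    }

def as_url(query, base="http://www.google.com/search?q="):
    """Turn a query string into a search URL (base URL is an assumption)."""
    return base + quote_plus(query)
```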
Tools for Analysing a Site
The quickest way is to use the SEOpen extension for Firefox 2.0, which has a convenient option called Mass Check; tick all three options. If you use Firefox 3.0, use the SearchStatus extension available from quirk.biz.
SEOpen will tell you:
- The number of inbound links (IBLs), bearing in mind that Google shows a tiny sample whereas Yahoo and Live Search show more. This figure includes deep links.
- The number of pages indexed.
- The number of IBLs pointing to the root domain. This is fewer than the total number of IBLs.
You can also check Google’s Webmaster Central for Google-specific issues provided that the sitemap has been submitted to it.
Now we have some idea of the scope of the problem.
You should certainly identify and study competitor websites:
- Read their robots.txt file to see if they have anything useful in the excluded directories.
- Study their Keywords meta tags as this may reveal what they are targeting and give you some new keyphrases you had not thought of.
- Use the link: operator to find IBLs and their anchor text. Note those sites for contacting later when you need similar links. If you notice a pattern emerging in the anchor text you will know which terms are important to the competitors.
- Take note of internal links and their anchor text.
- View the pages to see if they use tables or CSS-P (table-less positioning with CSS). If they use tables, you may want to move your client’s site to CSS-P, if feasible, to make it leaner and improve keyword density and the text-to-tag ratio.
- Read the Google cache for their key pages to see if they differ from what you see on the page. If so, cloaking could be in use.
- If their web designer or SEO has given themselves a link check out their sites and their portfolio of customers in case you find tactics to copy.
- Don’t assume that all the competitor tactics are helping them to rank. Some might get them demoted the next time Google updates its algorithm.
Fixing Accessibility Issues
While you must address human accessibility errors, mainly for legal compliance in many countries, you must also consider accessibility to spiders.
Use a free tool such as Xenu to find broken links on an existing site. Broken links not only frustrate visitors but also stop search engines from seeing parts of your site. Fix all broken links! More on broken links.
Xenu finds broken links.
Functionality of Interactive Elements
Any content that sits behind a search box should be considered for exposure, provided that this is commercially desirable and technically feasible. For example, a directory may have millions of records behind a search box, and the business managers may now want to make them visible via search engines.
This could be a simple matter of removing a restriction in the robots.txt file, or it may need additional reworking of the code to ensure that the search engines can display something meaningful to the searcher, i.e. not just some navigation text.
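In the simple case, the fix can be as small as deleting one rule. As a hypothetical example, a robots.txt like this hides the entire results directory (the path is invented for illustration):

```
User-agent: *
Disallow: /search-results/
```

Removing (or narrowing) that Disallow line lets spiders crawl the record pages, but only do so once each record renders a meaningful, indexable page of its own.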
Speeding Up the Site
Various speed-up tactics can make the site more accessible to spiders, and humans will also appreciate a fast site:
- Get a faster, more reliable web server that will have less downtime.
- Use the Firefox 2.0 web browser with the Firebug and YSlow extensions to find speed tips.
YSlow analysis of a page.