When you are doing search engine optimization there are basically only three universal rules you need to keep in mind, while designing, developing or generating content:
- seo rule #1: use your head!
- seo rule #2: follow the standards and specifications.
- seo rule #3: search engines read the text and the structure of your web pages – and nothing else.
If you want to learn more about the three most important rules of search engine optimization check out this article: the 3 golden rules of search engine optimization (seo).
But in the end you can boil down the much glorified discipline of search engine optimization (seo) to a list of rather simple web development tasks you need to execute bit by bit in order to get good results.
There is no such thing as “the ultimate SEO trick” you need to implement on your web site in order to improve your search engine ranking.
But every little fragment from the SEO-checklist below will improve your chances for a higher search engine ranking.
No invalid HTML elements
Try to keep your invalid HTML to a bare minimum in order to ensure that everybody interprets your HTML code the same way.
(You can use a Firefox Add-on for this task: Html Validator for Firefox).
If you have HTML Validator installed you will see a tiny icon in your Firefox status-bar that indicates whether the HTML of the page you are currently viewing is valid or not.
You will be surprised how many big sites have invalid HTML code on their pages.
And even though it is not absolutely necessary to be 100% W3C compliant, the “zero errors / zero warnings” message from HTML Validator is what you should aim for – warnings are not pretty, but they are not so bad either.
But I must admit that this is more of a “web designer’s code of honor” thing than a real factor for a good search engine ranking.
→ Valid HTML code is more of an issue for the usability and design of your web sites.
Assign descriptive and meaningful ALT texts
Make sure all <img> tags (as well as <area> elements) are equipped with a descriptive and meaningful ALT text that supports the goals of the web page by adding some useful information.
But also keep in mind that the text assigned to the ALT attribute should not exceed 100 characters.
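Putting that into markup, an image tag with a descriptive ALT text could look like this (the file name and wording are illustrative, reusing the “Product XY” keyword from the SEO keyword-list example further down):

```html
<img src="/images/product-xy-front.jpg" alt="Product XY – front view of the blue model" />
```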
Don’t worry too much about invalid HTML attributes
Invalid or unsupported attributes don’t hurt your search engine ranking (e.g. “autocomplete”), but try to avoid them as well if possible.
Run an accessibility checker on your web pages
Install the WAVE (web accessibility toolbar) Add-on for Firefox and run an accessibility test. The tips the tool gives you are generally good points to start your search engine optimization with. Many of the criteria applied by accessibility checkers are also relevant for SEO.
And besides, the tool offers some great features (Structure/Order view, Text-only view, disable styles & document outline) which we will reuse for the search engine optimization tasks further down this checklist.
Create a list of keywords you would like to optimize your web pages for.
What the SEO keyword-list should capture is:
- The priority of the keyword or search query (0 is the highest).
- The keyword itself (e.g. “Product XY”).
- The current search rank (Google should suffice for starters).
- The current rank in image searches (if the respective search is relevant for image-searches).
- The URL of the desired landing page for this keyword.
- The URL of the actual landing page (if your web site is in the search index at all).
- One or more categories which help to group similar search queries.
The list of keywords for your personal search engine optimization process could look something like this one:
This list will do two important things for you:
- Write down the status quo of your current search engine rankings.
- Define a set of goals you want to accomplish during the process of optimizing your web site for search engines.
Note: Whenever you are modifying content on a web page, changing the meta-description of a document, assigning a new document title or creating a new web site structure you should keep this list in the back of your head.
With this list on your desk you will now be able to identify the starting position for your search engine optimization efforts. Order the keyword list by your personal priority and then open a clean Browser instance (no cookies, no cache), Google the term and write down on which position you are listed for each of them, and the URL of the page that is indexed.
Make sure you are not logged in and you clear your cache and cookies before you perform a new search, because otherwise Google might just show you personalized search results.
If your desired page does not show on the first five or ten pages, you can see if your page is indexed at all by entering a search query with this syntax into the search field:
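The “site:” operator restricts a Google search to a single domain, so a query like this one (with your own domain and keyword substituted) shows whether your pages are indexed:

```
site:www.example.com "Product XY"
```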
Write down “not indexed” next to your keyword and move on to the next phrase.
If your web site is not listed for one or more terms, Don’t Panic. We are here to fix that.
If you have the feeling that the information on your web pages will not suffice to attract new users from search engines, consider writing some new content that relates to the keywords you are trying to optimize your web site for right away. The time you invest into writing new and unique content is always well spent, because search engines love new content (almost regardless of the structural problems you might have on your web site that would normally scare away search engines).
Creating such a SEO keyword-list is a lot of work and certainly not easy – you have to think hard about every single keyword and search query on it. But if you don’t create this list, you will have no base to evaluate the success or failure of your SEO actions. And your content editors have no guidelines they can follow while creating new content.
Train your content editors
In order to optimize your content for the list of keywords you have just created for your web site, you must make sure that every single content editor and author on your web site knows about your SEO keyword list and understands how they can adapt their writing to reach your SEO goals.
Optimize your content for the target list of keywords:
Inform your authors and editors about your keyword-priorities and make sure they have time to create good content – it is worth it.
Tips & tricks for creating SEO friendly content:
- Make sure all images have descriptive ALT texts
- Go through your SEO keyword list and check if the text mentions the relevant keywords often enough and if there is some good information attached to them.
- View the pages in “plain-text-mode” and check that the usage of headlines makes sense and forms a coherent outline (e.g. one H1-tag, followed by a few H2-tags, explaining the page’s topic starting with general information and then moving on to more specific information).
Optimize your web site’s search result presentation
Once you’ve made the leap into the search results of Google, Yahoo and Bing, you must make sure that your web pages are displayed in an optimal way. Even if you are displayed within the first ten search results for a specific keyword, you can still lose a lot of clicks by screwing up your web site’s search-result signature:
The search-result signature of a web page must convince a web user within milliseconds that the current web page is delivering the information he or she is looking for.
The signature is generally defined by three basic information items every web developer, editor or author can easily influence:
- The page title
- The META-description text
- The URL of a web page
Title tag optimization
The text you put into the <title> tag of your HTML pages has a tremendous influence on your Click-Through-Rate (CTR), because the document title is always displayed at the same prominent position in the search results (for all search engines).
A document with a title that contains the keyword the user has searched for (and includes the keyword in the META-description text) appears automatically much more reliable than a result that is ranked a little bit higher, but doesn’t name the search-term in the title and/or the META-description text.
As you can see in this screenshot of a Google search result page, it is important that you include the most important keyword of a web page in its title tag, because this is the first thing web users will notice while scanning through the search results – pages which don’t include the user’s search query in the document title appear less relevant (even though Google placed them into the first ten search results):
Tips for SEO-friendly title-tags:
- Mention your main keywords.
- Don’t include text which does not serve the goals of your SEO keyword-list – they would just create noise and distraction.
- Don’t use pointless marketing gibberish – use descriptive words and phrases that help users to understand the focus of the page.
- Keep your title text short. On search result pages there is only space for about 60 characters. So choose your words wisely.
- Keep in mind that many users will judge the relevance of your page by its title text. Inform them and don’t scare them away.
- Don’t cheat! If the content of your page is about something totally different than announced by the title, your page will certainly not show up in the search results.
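Putting these tips together, a title tag for the hypothetical “Product XY” page might look like this (the shop name and wording are, of course, made up):

```html
<title>Product XY – features, pricing and reviews | Example Shop</title>
```

At roughly 55 characters it still fits completely into a search result listing.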
The meta description text (often referred to as “snippet”) is the second important component of your web pages’ search result presentation.
You can use the meta description to write a short and meaningful summary of your document’s content, giving search engine users all the information they need to decide whether the current page might hold the information they are looking for.
Considering that your Click-Through-Rate (CTR) might drop considerably if you don’t manage to interest users for your content, you should think twice about what text to put into a web pages’ meta description.
Meaningless marketing claims such as “Get it now!”, “Best Product ever” or “Go green” might work for TV spots, but certainly not on a search result page, where a user decides within milliseconds whether a page looks promising or not.
Tips for creating useful meta-descriptions/snippets:
- If it is a product page, describe the product and its core-features.
- If it is an article, tutorial or a help-page – write a short summary highlighting the text’s quintessence.
- Include the main keywords from your SEO keyword list for the page.
- Be frugal with your words: A short description shouldn’t be longer (but also not shorter) than a Twitter message (approximately 140 characters; about 25-30 words).
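A meta description following these tips could look like the following sketch (the product and its features are purely hypothetical):

```html
<meta name="description" content="Product XY is a compact widget with a 10-hour battery and a two-year warranty. Compare models, check prices and read customer reviews." />
```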
Use human readable, search engine friendly URLs
URLs are an important part of your documents and shouldn’t be generic. Generic URLs are bad for your users and will hurt your search engine ranking (→ text is king!).
Make sure all the URLs you want search engines to index have a clear hierarchy, are human-readable, are descriptive and include some of your major keywords you want to optimize your site for.
I think I created a quite sensible and SEO-friendly URL structure / architecture for my blog here:
- /andykdocs/document/list/by-category/Knowledge Base
- /andykdocs/document/list/by-author/Andreas Koch
- /andykdocs/document/list/by-author/Andreas Koch/page:2
- /andykdocs/shortmessage/Andreas Koch/2010-08-16/FlGGf
Maybe you can find a similar URL format for your web site; one that your visitors can understand and which helps your SEO goals.
Tips & tricks for SEO-friendly URLs:
- Create a “URL Format Guide” for your developers and content editors, so that they are aware of the overall URL policy of your web site. → It is quite hard to revise mistakes in your URL structure once they are established.
- If you must change your URL format, make sure that you have redirects in place which lead returning visitors and search engines to the new URLs.
- Try to avoid URL parameters, except for internal search engines (e.g. “http://www.example.com/file.aspx?id=33142”, “http://www.example.com/document.php?alias=lorem-ipsum”). They are not descriptive enough and are often the cause of problems with duplicate content.
- Keep your URL structure as simple as possible → keep the URLs as short as possible.
- Use hyphens “-” to separate words from each other (search engines treat hyphens as word separators, while underscores “_” are not reliably interpreted that way).
- Try to reuse some of your SEO keyword-list terms in your documents’ URLs → URLs are the third pillar for your site’s search result presentation.
- Read what Google has to say about SEO-friendly URL formats: Google Webmaster Central: URL Structure
Optimize your heading tag usage
Making sure that your web pages are clearly structured with semantic markup (e.g. heading elements <H1>, …, <H6>) is of great importance for both your normal web users and search engines.
Using heading elements as semantic markup for your documents helps web users and search engines understand the structure of your content better than they would without it.
But you must be 100% confident that you are using the heading elements properly, because otherwise you will only confuse your visitors and the search engines:
- You should have only one <H1> heading tag per page (you could have more than one, but there are not many cases in which this would make sense).
- The text of the <H1> heading tag should be similar to the contents of the page’s <title> tag and should summarize the subject of the web page (try to include one of the keywords from your SEO keyword-list for that specific page).
- Use informative text for your heading tags → marketing gibberish won’t help.
- Keep it short! Don’t stuff too much text into your heading tags.
- Use the secondary heading tags (H2-H6) to create an outline for your web page which helps web users to understand its structure and find the information they searched for.
- Don’t use too many heading tags (H2-H6), because this will weaken their relevance for the document’s structure.
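As an illustration, a document outline following these rules might use heading tags like this (the topic is hypothetical; indentation is only added for readability):

```html
<h1>Product XY</h1>
  <h2>Features</h2>
  <h2>Technical specifications</h2>
    <h3>Dimensions and weight</h3>
  <h2>Pricing and availability</h2>
```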
You can easily validate your document’s outline if you have the WAVE toolbar for Firefox installed.
Just click on the “Outline” icon and you will see the outline of any given web page:
Specify the language of your content
Another useful hint you can give search engines is the content language of your web pages. Since most web users will expect search results in their respective mother tongue, the content-language can be used by search engines to decide whether your content is relevant to a user with certain language preferences in their browser, or not.
You can specify your web page’s language using three different techniques:
- The “lang” and “xml:lang” attributes of your html tag:
<html lang="en" xml:lang="en"> ... </html>
- You can add a META tag specifying the language of the web page:
<meta http-equiv="Content-Language" content="en">
- And/or you can specify the content-language in the HTTP header that is sent with your web page:
HTTP/1.1 200 OK Content-Language: en
The language codes you use for all three types of Content-Language flags must comply with the ISO 639-1 standard:
- English → en
- German → de
- French → fr
- Swedish → sv
But you are allowed to specify a list of content languages if your content mixes multiple languages:
<meta http-equiv="Content-Language" content="de, en">
My advice is to set the content-language in the html tag and with the Content-Language META tag if you can, because search engines might make mistakes when detecting the language of your content.
But don’t worry if it is too much work or just not possible for you to specifically name the content’s language – I think search engines don’t care as much about the Content-Language nowadays, because they are being fooled too often and are able to detect the language themselves.
Watch this video to get an impression about how advanced Google’s translation capabilities are:
→ But it certainly doesn’t hurt your SEO ranking if you specify your content-language.
Links about the Content-Language meta information:
- Wikipedia article: ISO 639-1 Standard Language Codes
- W3C: HTTP and meta for language information
- Read Google’s Webmaster central article on “Multi-regional and multilingual sites“
- Google Webmaster central on “Geo targeting“
- Google Webmaster Central Blog: Unifying content under multilingual templates
Use descriptive anchor texts
In order to inform search engines about the (internal) links you are setting to other parts of your web site, you must use anchor texts which actually describe the content of the document the link is pointing to.
This seems obvious, but it is often done wrong by web developers and content editors – and you will certainly find more than one bad example on this web site.
Here is a list of commonly used but very bad examples of anchor texts which don’t give search engines any clues about the content found at the link’s destination (because unlike humans, search engines cannot understand the context in which the link is being used):
<a href="..."> Read more </a>
<a href="..."> Click here </a>
<a href="..."> Next </a>
<a href="..."> Video </a>
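In contrast, a descriptive anchor text for the same kind of link could look like this (the URL and wording are illustrative):

```html
<a href="/products/product-xy">Read the full technical specifications of Product XY</a>
```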
Tips for writing anchor texts:
- Summarize the content of the link’s target document (or at least give hints about the content).
- Don’t use generic text like “learn more” or “click here”.
- Use some of the SEO list keywords that the target document is about.
- Be consistent with your anchor texts: always use the same anchor text for a link (if feasible) → this increases the relevance of your (internal) links.
→ Basically you can apply the same rules to writing anchor texts as would apply for headings.
Make use of internal linking
Taking into account how search engines work, you must set internal links to the documents you want to promote from as many pages of your web site as possible.
→ The reputation/weight of a certain web page grows (in comparison to all other web pages within your site), the more often this specific page is linked from other pages on your web site.
Tips & Tricks for improving your internal linkage:
- If you are optimizing your web site for five specific products, link these products from as many pages as possible.
- The higher up these links appear in your source code, the more weight search engines tend to give them.
- Make sure you name these internal links consistently throughout your web site (↑ Optimizing anchor texts).
- Don’t cheat with internal links! Don’t think you can put your whole site map on every page of your website in order to increase your internal linkage. Search engines will only follow the first hundred or so links on every page – so many of your links might be overlooked if you have too many of them. Or even worse, the search engine might detect that you are trying to cheat and ban your site from the search index altogether. → When placing additional links on your web site in order to increase the internal linking, you need to find the right balance between what is beneficial for your search engine ranking AND for your users.
This way search engines can crawl your page, flow out through these links, index the linked pages and relate them to your web site.
On the other hand you can also use this technique to keep certain text snippets or links away from Google’s search index.
One scenario might be a link you want to have on all your web pages that leads the user to a login-page or an area of your web site which requires the user to be logged in.
These links are not relevant to search engines and hence don’t need to be represented in the HTML source code of your web site:
In this concrete example (↑) I have the problem that Google is trying to index these login-page links and reports “Soft 404 errors” in the Google Webmaster Tools “Crawling errors” section – even though I do have a meta noindex on these pages:
function generating this link:
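Such a link-generating function could be sketched like this; the function name, the login URL and the injection target are hypothetical, not the actual code used on this site:

```javascript
// Hypothetical sketch: build the login link as a string on the client
// side, so it never appears in the crawlable HTML source.
// The "/login" URL and the link text are assumptions; rel="nofollow"
// is an extra precaution for crawlers that execute JavaScript.
function buildLoginLink(url, label) {
  return '<a href="' + url + '" rel="nofollow">' + label + '</a>';
}

// In the browser the result would be injected after page load, e.g.:
//   document.getElementById("user-menu").innerHTML =
//       buildLoginLink("/login", "Log in");
```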
Hidden Links & Hidden Content
If you have HTML elements (especially links) which you hide via CSS (display: none), you can be sure that the Google Bot will follow them. → Just be honest and don’t be sneaky about your hidden text. Either you have a good reason to hide certain elements on your page or you don’t.
Must-reads about hidden content:
- Google Webmaster central article on Hidden text and links
- Google’s Matt Cutts blogging about Hidden links
Avoid broken links – keep your content up-to-date
Removing or fixing broken links is more about keeping the user experience of your site intact, than it is about improving your search engine ranking.
- If users read an article, product description or blog post on your web site that contains broken links, this might destroy their user experience and they might never come back.
- If you have many internal links that are broken, that will certainly not help search engines crawl your web site.
- If you have many outgoing links on your site that lead to non-existing pages, this won’t hurt the ranking of the page containing the broken links; but it won’t help either. And it will most certainly create a poor user experience.
- If you have many incoming links to your site which lead to a 404-page you should consider creating a redirect for these broken links – because otherwise you are missing a lot of potentially valuable traffic.
There are a lot of crappy tools around the net to check your web site for broken links, but none of them is really good (at least I don’t know any that is).
My advice is to use the “Crawl errors” section of the Google Webmaster Tools.
In the Google Webmaster Tools’ Crawl errors section you can easily see when a broken link was detected by the Google Bot and from which (and how many) pages this link is referenced:
And you don’t have to actively run a tool which takes hours to crawl all your web pages, looking for broken links – every week or so.
→ The Google Webmaster Tools are just great!
Prevent search engines from indexing pages which should not be indexed
On (almost) every web site there are certain pages which are available to everybody, but which you don’t want to appear in the search results before your actual landing page when somebody is searching for your brand name. Or you have pages on your site which you know you don’t want indexed:
- Unpublished web pages
- Certain types of marketing campaigns
- Legal information
- Error pages
- Maintenance pages
Don’t hesitate to disallow search engines from crawling these pages or areas of your web site if you think they will hurt your SEO goals.
My site here isn’t a very good example but my robots.txt looks something like this:
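A robots.txt along these lines might look like the following sketch (the paths are illustrative, not the actual ones used on this site):

```
User-agent: *
Disallow: /login
Disallow: /maintenance
Disallow: /error
```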
And here is another example where I use the “noindex” META tag element to prevent search engines from indexing my login pages:
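The standard form of this META tag, placed in the head of the page, is:

```html
<meta name="robots" content="noindex, follow">
```

“noindex, follow” keeps the page out of the index while still allowing search engines to follow the links on it.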
Useful articles about robot exclusion:
- A Standard for Robot Exclusion
- Block or remove pages using a robots.txt file
- Using meta tags to block access to your site
Avoid duplicate content
Forget valid HTML and content optimization for a moment – duplicate content is the worst thing that can happen to your web site’s ranking. Not because search engines will ban you from their index if you have a lot of copied content on your web site, but because the content of your web site becomes generic and therefore uninteresting to users and search engines.
Duplicate content causes several problems:
- Dilution of back links
- Google picks the wrong “right” URL
- Crawling gets inefficient
- Loss of reputation
But don’t panic: if your site is not a spam site you probably won’t be punished by Google, Bing or Yahoo if you have, for example, the same product description as many other sites. But your chances of being found will of course go down, simply because there are more pages which fit the same search query.
If you are running an e-commerce web site that has almost the same product descriptions as many other web sites, you shouldn’t make it worse by creating duplicate content on your own web site.
Links and articles explaining what duplicate content is:
- Understanding how Google handles duplicate content
- Bing Community: Optimizing Your Very Large Site For Search
Don’t place the same content under multiple domain names (Addon domains)
Registering your web site under multiple different domain names won’t help your search engine ranking! Search engines will consider the web sites as duplicate content and will link the “wrong” version and/or rank them lower than necessary.
Useful articles and videos on duplicate content:
Redirects / There can be only one … URL to a document!
Use redirects to normalize (or canonicalize) your URL structure!
Make sure you have all the redirects in place that you need to rigorously enforce your web site’s URL architecture – you cannot allow any exceptions to your URL structure/architecture, and there certainly cannot be two URLs to the same resource or web page.
But you can’t always control how people link to your web site, and so you must anticipate the most common errors and prepare permanent (301) redirects that lead these users (and search engines) to the correct URL.
And why would you do that? Your Content Management System probably shows the correct page either way.
Yes, you are right: web users don’t care if they can access the same content under two (slightly) different URLs.
But search engines do care!
When search engines see two URLs and get an HTTP 200 status code, they will “think” they have two different pages at hand.
Eventually they will discover that both URLs lead to the same content, but until then the damage might already have been done:
- 180 external sites linking to your top-seller product page with this URL with a trailing slash (/):
- 250 external sites linking to your top-seller product page with this URL without the trailing slash:
- 120 more external sites linking to your top-seller product page without the www sub-domain in the URL because you forgot to create a non-www to www redirect:
- And 10 external sites linking to your top-seller product page with the non-human-readable, internal URL version using product IDs:
And all the reputation and trust your page got from these external sources will now be split into 4 pieces – just because you forgot to install proper redirects.
→ What’s better? Having four links in the top 40 search results or having one link on the first page?
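With Apache’s mod_rewrite, the non-www to www redirect from the example above could be sketched in an .htaccess file like this (“example.com” stands in for your own domain):

```apache
# Permanently (301) redirect non-www requests to the www host name,
# so all incoming links consolidate on a single canonical host:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```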
Helpful links and resources about creating redirects:
- Apache Module: mod_rewrite
- ISAPI_Rewrite 3 - Apache .htaccess mod_rewrite compatible module for IIS
- www or non-www: Setting your preferred domain in the Google Webmaster Tools
Make use of the Canonical Link Element
The canonical link element is a rather new method to reduce the number of duplicate content issues by informing search engines (Google, Bing and Yahoo) about the correct URL of a web page.
An example usage for the canonical link element could be the landing page of your web site. If you are using campaign IDs to track the success of a promotion or your advertisements, you will certainly find several different URLs to your web site on the web:
… and many more. And even though you know that all these links are linking to the very same web page, Google and other search engines will have a hard time figuring out which URL is the official one.
But you can ease the duplicate content pain by giving the search engines a hint which one of the URLs you chose as the canonical one:
<html> <head> <link rel="canonical" href="http://www.example.com/start" /> </head> <body> ... </body> </html>
→ Choosing one URL as the canonical will help unify scattered reputation of all the links to your web page.
Read more about the Canonical Link Element:
- Google Webmaster Tool Video about the Canonical Link Element
- Google Webmaster Central Blog post: Specify your canonical
Avoid pages with similar content
If you have two web pages with (almost) identical content you should ask yourself what the hell went wrong:
- If you are running a blog, there is no valid reason for repeating yourself.
For example: Make sure that printable versions of your documents are marked with a NOINDEX (or a canonical link element), so that they don’t compete with your main content.
- If you are selling products online which are quite similar to each other, you should work out some helpful text which specifically points out the differences between the two products (instead of just repeating the description of the first one) – your customers will certainly value your efforts.
Again, the danger with duplicate (or similar) content is not that you will be actively penalized by search engines, but that you will lose control over which version of a page search engines keep in their index – because they will certainly only keep one version if they think two pages are pretty much alike (→ dilution of your reputation).
Watch these Google Webmaster Central YouTube Videos to understand what duplicate content is and what it is not:
Give your content editors time and clearance for coming up with good text!
If you are running a commercial web site and you only invest in your platform and don’t invest into your content you are doomed (unless you are Apple; they can sell about anything).
And make sure they know the keywords you are optimizing your site for.
Improve your Website’s Performance
Increasing your web site’s performance is another component of search engine optimization which will not directly translate into a better ranking, but will increase the usability and help lower your bounce rates.
Nowadays, web users aren’t willing to wait for slow web pages. If your web site doesn’t render within the first 2-3 seconds, chances are high that visitors will leave immediately and try the next search result.
But as Matt Cutts explained in a recent Google Webmaster Central Help Video (Should I be worried about ad servers affecting my site’s speed and ranking?),
you don’t have to worry too much about your web site’s performance, because it is just one of many other factors which influence your search engine ranking. Optimize your content first and your performance second.
But read Google’s official statement on site speed and decide for yourself if it worth the trouble for the search engine optimization of your web site:
Using site speed in web search ranking.
If you decide to invest into your web site’s performance you can do so by working off the issues reported by Yahoo! YSlow and Google Page Speed (both tools are add-ons to Firebug for Firefox).
Both, YSlow and Page Speed, are really great tools for analyzing your web site’s performance and will generate a list of actions you can take in order to improve the performance.
And for starters you can read Yahoo!’s Best Practices for Speeding Up Your Web Site and Google Page Speed’s Web Performance Best Practices. The YSlow and Page Speed performance guides will help you
understand which actions can be done to improve a web site’s performance.
My advice for web site performance optimization is:
Go for the low-hanging fruits first!
On the performance-optimization lists created by Page Speed and YSlow there are always items that can be easily implemented and will help your web site’s performance tremendously.
Measure the performance of your web pages (page load time)
First of all you need to identify your web site’s “performance status quo”. You might think that your web site is fast enough, but you need to know it for sure. You must measure and write down the actual page load times in order to improve them – or even begin to understand that your web site might be too slow.
So, how is the “page load time” defined?
And the answer is straightforward: “Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser.” (source: Google Webmaster Central – Site performance)
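As a very rough first approximation you can time things yourself with a few lines of JavaScript. Note that this sketch only measures an arbitrary function, not the full page load time as defined above – in newer browsers, the Navigation Timing API (performance.timing) gives you the real numbers:

```javascript
// Rough sketch: measure how long a function takes, in milliseconds.
// This is NOT the full page load time (network, rendering, images);
// in the browser you would read the Navigation Timing API instead:
//   performance.timing.loadEventEnd - performance.timing.navigationStart
function timeIt(fn) {
  var start = Date.now();
  fn();
  return Date.now() - start;
}
```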
There are a couple of great tools that can help you measure your web site’s page load times (PLT) on a per-page basis:
Firebug [Firefox]
Of course, the almighty Firebug has the ability to measure page load times. Just install it into your Firefox instance, click on the little “bug” in the lower left corner of your browser window, click on the “Net” tab, select “All” and see the page load times in seconds on the network activity monitor.
HttpWatch [Firefox, Internet Explorer]
Another great tool for measuring your web site’s performance and page load times is HttpWatch. HttpWatch is an integrated HTTP sniffer for Internet Explorer (IE) and Firefox that allows you, just like Firebug, to measure the page load times of your web pages.
Chrome/Webkit Developer Tools [Chrome]
The Google Chrome Developer Tools (DevTools) have quickly become my second favorite tool suite for web development. The Chrome DevTools have a network activity monitoring feature similar to Firebug’s, which allows you to easily measure the page load time of your web pages.
Online services for web site speed analysis
And there are numerous online tools which allow you to enter your web site’s URL and let them measure the page load time of the URL for you:
These tools and services for measuring your web site speed help you get an outside view on your web site’s performance, but they cannot substitute a detailed performance analysis.
Choose a tool that you like (Firebug, HttpWatch, Chrome Developer Tools, …) and write down the page load times for every page type of your web site. Make sure you don’t have the page in your browser cache while testing! And test at least one heavy (content rich, many images) and one light (mostly text) version of each page type and write down their page load times into an Excel sheet like the one displayed below (↓):
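If you just want a rough number without firing up any of these tools, the download time of a page’s raw HTML can be timed with a few lines of Python – a minimal sketch, not a replacement for Firebug & Co., since it measures only the HTML document itself and not the images, styles and scripts a browser fetches on top:

```python
import time
import urllib.request

def measure_fetch_time(url, timeout=10):
    """Time the download of a page's raw HTML.

    Note: this only covers the HTML document itself - a browser
    additionally fetches images, stylesheets and scripts, so the
    real page load time (as measured by Firebug, HttpWatch or the
    Chrome DevTools) will be higher.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        body = response.read()
    return time.perf_counter() - start, len(body)

# Example:
#   seconds, size = measure_fetch_time("http://www.example.com/")
#   print(f"{size} bytes in {seconds:.2f}s")
```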
If you want to know the average page load time of all the pages of your web site, and you are too lazy to measure them yourself, you can just use the Google Webmaster Tools’ “Site performance” feature, which shows you a graph of your web site’s average page load times over the past few months:
And if you wonder how Google is collecting the page load times of your web pages (I would have guessed that the data is coming right out of the GoogleBot, but apparently that is not the case) – they are collecting the performance data from users of the Google Toolbar who have the PageRank feature enabled (source: Google Webmaster Central - Site performance).
Blog posts, articles and videos on the impact of page load times on the search engine ranking:
- Using site speed in web search ranking
- Google Webmaster Central Video-Answer: Do site load times have an impact on Google rankings?
Measure the size of your web pages (page weight)
Another important factor for your web site’s performance is the size of your web pages. Even though there isn’t a hard limit a web page shouldn’t exceed, you should consider your average visitor’s bandwidth and keep in mind that every extra kilobyte will slow down your page load times.
Considering the theoretical download time (in seconds) for only 500 KB, you should be alarmed if your pages are heavier than one megabyte (> 1 MB). Everything below 500 KB should be fine. But in terms of page size, less is always better.
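To get a feeling for these numbers, here is the arithmetic behind the theoretical download time as a tiny Python sketch (the 1 Mbit/s line speed is just an assumed example value):

```python
def download_time_seconds(page_kb, bandwidth_kbit_s):
    # page size in kilobytes * 8 bits per byte, divided by line speed in kbit/s
    return (page_kb * 8) / bandwidth_kbit_s

# A 500 KB page over an assumed 1 Mbit/s (1000 kbit/s) connection:
print(download_time_seconds(500, 1000))  # → 4.0 seconds
```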
Optimize your Images
The biggest impact on your page size can be achieved by optimizing your image usage. On many web pages, images are the biggest chunk of data that needs to be downloaded by web users.
- Make sure your PNGs are optimized for the web – unoptimized PNGs can be extremely large.
- For design elements: Use PNGs instead of GIFs
- For photos or screenshots: Use JPEGs instead of PNGs if you don’t need transparency.
- Optimize your JPEGs for the web: A JPEG-Quality level of 70% is in many cases good enough.
→ find a balance between file size and quality.
- Use image optimization tools to reduce the file size of your images. For example you could use Smush.it – Smush.it is an online service from Yahoo! which applies a number of clever image optimization techniques on your images that are capable of reducing the file size of normal JPEGs and PNGs considerably.
Enable Gzip Compression
Activating GZIP compression for static content (*.html, *.js, *.css) reduces the page size by 50 to 70 percent – and is therefore one of the best things you can do to reduce the size of your web pages. It’s just a configuration you have to activate on your Apache or IIS web server and it will just work.
If you want to learn more about GZIP compression for web servers you can read these articles:
- How To Optimize Your Site With GZIP Compression
- Yahoo!’s Best Practices for Speeding Up Your Web Site – Gzip Components
- Google Page Speed: Performance Best Practices – Enable compression
- Apache HTTP Server Version 2.0 Documentation: mod_deflate
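How much Gzip actually saves is easy to verify yourself. The following Python sketch compresses a chunk of repetitive markup with the standard library’s gzip module – real pages usually land in the 50–70% range quoted above; this artificial, highly repetitive sample compresses even better:

```python
import gzip

# A chunk of repetitive markup - HTML, CSS and JavaScript compress very
# well because they are full of repeated tags, class names and keywords.
html = ("<div class='post'><h2>Title</h2><p>Some text ...</p></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
saved = 100 - len(compressed) * 100 // len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({saved}% saved)")
```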
Minify your JavaScript and CSS files
Minifying your JavaScript and CSS files won’t help as much as turning the Gzip compression on, but it is a good thing to do anyway.
→ Every little bit helps.
- YUI Compressor
- Yahoo! UI Library: YUI Compressor for .Net
Use browser caching – Set proper expires headers
Leverage browser caching by setting the expires headers of your content wisely.
If you are worried that some users might see a messed-up version of your web site after you deploy a UI update, because they have an old version of the styles or scripts in their browser cache – you can use file names which contain a hash value of the file’s content.
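One way to implement such content-based file names is to embed a hash of the file’s content into the name. A minimal Python sketch – the styles.&lt;hash&gt;.css naming scheme is just one possible convention:

```python
import hashlib
import os

def hashed_name(path):
    """Return a cache-busting file name like 'styles.5d41402a.css'.

    The hash is derived from the file's content: whenever the file
    changes, its name (and therefore its URL) changes as well, so the
    file can be served with a far-future Expires header without users
    ever seeing a stale copy from their browser cache.
    """
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()[:8]
    root, ext = os.path.splitext(path)
    return f"{root}.{digest}{ext}"
```

Appending the hash as a query string is another common variant, but some proxies refuse to cache URLs containing query strings, so putting the hash into the file name itself is the safer choice.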
Links about browser caching and expires headers:
- Page Speed Performance Best Practices – Leverage browser caching
- Add an Expires or a Cache-Control Header
- Cache Control Directives Demystified
Inform search engines about your content
Normally you don’t have to actively inform search engines about your content – if your web site is linked anywhere
on the web and you have a good internal link architecture on your site, search engines will find their way to your web site and crawl all the links they can find, sooner or later.
If you don’t want to take your chances and (maybe) wait weeks for a search bot to index your site, you can give the search engines a helping hand
and speed up the indexing process.
Create a XML Sitemap for your web site
XML sitemaps are an easy, machine-readable format for listing all the links on a web site.
All you need to do to add XML sitemap support to your web site is:
- Create an XML file called “sitemap.xml” in your domain-root (e.g. “http://www.example.com/sitemap.xml")
- Fill it with your links following this syntax:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/index.html</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/product-xy.html</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    ...
  </url>
</urlset>
When you are done the XML sitemap could look something like this:
The Sitemaps protocol is currently supported by Google, Bing, Yahoo and Ask (although I am not sure if Yahoo really got it right, because sometimes they are returning the XML sitemap itself in their search results).
But note, XML sitemaps aren’t the ultimate way of announcing your content:
(At least) Google doesn’t guarantee that they are indexing all the links in your XML Sitemap, so you should also have an HTML sitemap that is useful for both your web users and search engines.
Link the HTML sitemap in the footer of all your web pages and search engines will certainly crawl all the links listed in your HTML sitemap (if there aren’t too many).
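If your site doesn’t run on a CMS that generates the sitemap for you, a sitemap.xml following the syntax above can be produced with a few lines of code. A minimal Python sketch – in a real setup the URL list would of course come from your database or file system:

```python
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """Build a minimal sitemap.xml document from (loc, lastmod) pairs."""
    entries = []
    for loc, lastmod in pages:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries) + "\n"
        "</urlset>\n"
    )

sitemap = build_sitemap([
    ("http://www.example.com/index.html", "2010-01-01"),
    ("http://www.example.com/product-xy.html", "2010-01-15"),
])
print(sitemap)
```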
Resources and articles on XML sitemaps:
Increase the number of external links to your content
Search engines will not automatically crawl all the links you specify in your XML sitemap.
To increase the chances of your web pages being crawled by search engines, you must have as many external (trusted) sites as possible linking to your content.
Let the search engines know that you have created new content
If you already have an XML sitemap and you want the search engines to crawl your recently published content, you can actively submit (ping) your sitemap URL to the different search engines.
Example URLs for submitting an XML Sitemap to the different search engines:
Submit a sitemap to Google:
Submit a sitemap to Bing:
Submit a sitemap to Yahoo:
Submit a sitemap to Ask.com:
Help pages on how to submit XML sitemaps to search engines:
- Submitting Sitemaps to Google
- Submit a Sitemap to Bing
- Submit a Sitemap or Webpage to Yahoo
- Submit a Sitemap to Ask.com
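Submitting the sitemap is nothing more than an HTTP GET request against the search engine’s ping URL, with your sitemap URL attached as a URL-encoded parameter. A small Python sketch – the endpoint URLs shown were the documented ones at the time of writing, so double-check them against the help pages listed above before relying on them:

```python
import urllib.parse
import urllib.request

# Ping endpoints as documented at the time of writing - verify them
# against the search engines' help pages before relying on them.
PING_ENDPOINTS = {
    "google": "http://www.google.com/ping?sitemap=",
    "bing": "http://www.bing.com/ping?sitemap=",
}

def ping_url(engine, sitemap_url):
    """Build the submission URL: the sitemap URL goes in URL-encoded."""
    return PING_ENDPOINTS[engine] + urllib.parse.quote(sitemap_url, safe="")

def submit_sitemap(engine, sitemap_url):
    """Send the actual ping - a plain HTTP GET request."""
    with urllib.request.urlopen(ping_url(engine, sitemap_url)) as response:
        return response.getcode() == 200
```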
Monitor your SEO-Status
In the last couple of years, all major search engines (Google, Bing, Yahoo) have created several opportunities for webmasters to
get some analytical data out of their databases (→ Search Engine Analytics / SEO Traffic Analytics).
And in order to really understand how your web site performs on the different search engines you must take advantage of these tools.
Make use of the webmaster tools provided by Google, Bing and Yahoo
Google Webmaster Tools
The Google Webmaster Tools are a great tool for monitoring your SEO efforts and contain a variety of really useful features:
- The “Manage XML sitemaps” feature lets you add, remove or re-submit XML sitemaps.
- The core strength of the Google Webmaster Tools is the “search queries” feature. The page provides detailed information about the search queries that returned pages of your web site:
- You see all search queries that returned at least one of your web pages in the result set (→ Queries).
- You can see all of your pages that were returned for a specific search query and their approximate position in the search results (→ Pages per Query).
- You can view the number of search queries that returned at least one of your web pages (→ Query count).
- You can see the number of clicks you got from a specific search query (→ Click count).
- You can see the average index position of your web pages for a specific query (→ Average Position).
- And you can see the number of times pages from your site were viewed in the search results (→ Impressions).
- The “Links to your site” feature shows you all external links to your web site that GoogleBot knows.
- The “Keywords” page lists the most significant keywords Google found when crawling your web site.
- The “Internal links” page lists all internal links of your web site.
- The “Subscriber statistics” page lists all your web site feeds (RSS, Atom) and shows you how many people have subscribed to these feeds (via Google Reader, iGoogle and Orkut).
- The “Malware” section notifies you if Google detected malware on your website.
- The “Crawl errors” section informs you if the GoogleBot had problems with some of your web pages. → This is an exceptionally useful feature, because the GoogleBot sometimes “sees” more than you or your other tools do.
- The “Crawl statistics” section shows you charts on the number of pages crawled per day, the amount of kilobytes downloaded per day and the time spent downloading a page. → These crawling statistics can give you a good impression of how active the GoogleBot is on your web site.
- Another extremely useful feature hides on the “HTML suggestions” page of the Google Webmaster Tools. On this page you will find hints about problems (duplicates, too long texts, too short texts, non-informative text) with your meta-descriptions and title-tags.
- The very last useful feature of the Google Webmaster Tools I would like to mention here is the “Site performance” page. On this page you can see a chart which shows the average page load times of your web site for the last six months.
Settings and administration
- If Google generated sitelinks for your web site you can view them in the Sitelinks-section and block them if you don’t want certain pages to be listed as a sitelink.
- The “Change of address” feature lets you move your whole web site to another domain.
- You can set a geographic target (e.g. Germany) for your web site.
- You can choose your web site’s preferred domain name notation (e.g. “www.example.com” vs. “example.com”)
- The Google Webmaster Tools allow you to set a custom crawl rate for your domain, if the GoogleBot overstrains your servers.
- The “(URL) parameter handling” section lets you define URL parameters which should be ignored by Google.
- “Test robots.txt” lets you check your robots.txt for configuration errors and shows you when the robots.txt was last downloaded.
- The “robots.txt generator” helps you with your robots.txt file configuration.
- The Google Webmaster Tools have a component for managing your “Removal requests”. If you want a URL removed from the Google index you can submit the URL here and monitor the progress.
And for every single feature you will find a ton of examples and explanations in the Google Webmaster Tools Help Center.
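Two of the features above deal with the robots.txt file. For reference, a minimal robots.txt might look like this (the disallowed path is just a placeholder) – and it can also point crawlers at your XML sitemap via the Sitemap directive from the Sitemaps protocol:

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```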
The only big disadvantage of the Google Webmaster Tools is that they only provide data for the last month. So don’t forget to download the data every month!
Bing Webmaster Tools
The Bing Webmaster Tools are not as good as the Google Webmaster Tools but have been improved just recently:
They added a nice-looking and usable Silverlight interface which allows you to submit XML sitemaps, view charts and explore your search-related statistics:
- Pages Crawled: Shows the number of pages crawled by the BingBot (formerly known as MSNBot) for the last six months.
- Pages with Crawl Errors: Shows the number of pages with crawling errors (404) detected by the BingBot for the last six months.
- Pages Indexed: Shows the number of pages in the Bing search index (on a daily basis).
- Impressions and Clicks: This chart lets you analyze the traffic Bing brought to your web site. The chart shows the number of impressions, clicks and the click-through rate per search query over time.
- I must admit that I hardly use the Bing Webmaster Tools, because Bing’s share of my overall search engine traffic is (as of today) only 0.72%.
Yahoo Site Explorer
The Yahoo Site Explorer allows webmasters to submit XML sitemaps, view crawl errors, explore the search queries and view the web site statistics.
Just like the Bing Webmaster Tools, the Yahoo Site Explorer is just a bad clone of the Google Webmaster Tools … and (at least for my web sites) mostly irrelevant, because Yahoo’s share of my overall search engine traffic is only 0.47% – even less than Bing.
But even if Yahoo is not bringing much traffic to your site, you should keep an eye on your SEO status using the Yahoo Site Explorer.
Make use of Web Analytics
There is no way to do search engine optimization without a properly configured Web Analytics provider. There is no other way to understand how people are using your web site.
Web Analytics providers like Google Analytics or eTracker can deliver valuable insights about
- referring sites and their target pages,
- the amount and distribution of search engine and direct traffic on your site,
- the keywords and search queries that brought visitors to your web site,
- the regions and countries users come from
and many things more.
Web Analytics is a science of its own – but one you cannot afford to neglect.
→ Web Analytics is your only source for collecting the empirical behavioural data that forms the basis for all your optimization efforts.
Monitor your up-/down-times
If you are not running an online business you might not always be aware of your web site’s downtimes.
But even though (short) downtimes won’t affect your web site’s ranking, you should try to keep them to a bare minimum.
If your content is good, the GoogleBot and the other search engine bots will always come back another time if a web site is temporarily unavailable. But I am pretty sure that the availability of your web site is at least one “signal” that search engines include when calculating their search index.
Why would a search engine place a web site that is known to be a shaky candidate at position one of the search results, if the next best web site is content-wise just as good as the first one – but more reliable?
Search engines are always trying to find the best result for a given query.
And since web users don’t like web pages in their search results that are temporarily unavailable (or responding slowly), search engines will care about a web site’s availability (and performance).
A good (and free) service for website monitoring I can recommend is AreMySitesUp.com. They will check your web sites for downtimes and timeouts and send you an email and/or sms if a site causes trouble.
→ This way you can address issues right away and reduce downtimes.
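If you prefer rolling your own monitoring, the core of such a check is small. A minimal Python sketch of a single availability check – hooking it up to a cron job and an email alert is left out:

```python
import time
import urllib.error
import urllib.request

def is_up(status_code):
    """A page counts as 'up' when it answers with a 2xx or 3xx status."""
    return status_code is not None and 200 <= status_code < 400

def check_site(url, timeout=10):
    """Fetch a URL once and report (up, status_code, response_time_in_seconds)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            code = response.getcode()
            return is_up(code), code, time.perf_counter() - start
    except urllib.error.URLError:
        # Covers DNS failures, timeouts and HTTP error responses (4xx/5xx).
        return False, None, time.perf_counter() - start

# Run check_site() for each of your sites from a cron job and send
# yourself an email whenever a check fails.
```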
Staying on top of the Game – Getting the latest SEO News
Your knowledge about the web and search engines might suffice today, but you certainly need to know what’s going on to survive in the long run.
Watch the “Google Webmaster Central YouTube-Channel” videos
The Google Webmaster Central Help Channel contains hundreds of knowledgeable videos giving informative answers to every conceivable SEO-related question.
This YouTube-Channel is without a doubt the best source for search engine optimization tips and tricks … and is fun to watch.
Follow the updates posted on the official developer blogs of Google, Yahoo and Bing
The search engine providers are constantly improving and bug fixing their search algorithms to deliver better results (e.g. Google’s State of the Index 2009) – and you should inform yourself about these changes:
Constantly evaluate your SEO status
And you should constantly check your web sites for crawling errors, broken links and performance issues.
Conclusion – Final thoughts on Search Engine Optimization
Search engine optimization is simply a two-stage process – once you have done your “homework” you can spend your time creating unique and useful content:
- Preliminary stage: “Doing your Homework”
- Defining Goals
- Preparing your web site architecture and cleaning up you HTML code
- Training your web designers, content editors and architects
- Main stage: “Content Optimization Routine”
The actual search-engine optimization is an ever-repeating process of three simple steps:
- Defining (or refining) your SEO keyword lists
- Creating new content
- Analyzing the results
So the essence of what I am trying to say, to those of you who want to do Search Engine Optimization, can be boiled down to these few bullet points:
Ask yourself what the goals for your web site(s) are.
→ Defining measurable goals is often the hardest part of the SEO process.
Read “Google’s SEO report card” in order to understand what the usual actions of a search engine optimization process are.
→ The SEO report card is a fantastic show case for search engine optimization – illustrating the most important SEO issues on the well-known Google products many of us use every day.
Read “Google’s Search Engine Optimization Starter Guide” – a more detailed and general tutorial for web developers and content editors, explaining search engine optimization from A to Z with many examples.
→ All the topics addressed in the Google’s Search Engine Optimization Starter Guide and the Google’s SEO report card are certainly issues you should address in all of your search engine optimization projects.
- Don’t spend too much time on tweaking your HTML code for search engines either – instead invest this time wisely and create some new content.
- Don’t spend your money on search-engine-optimization consultants before you have done your “homework” (→ Worked yourself through the preliminary stage of the SEO process, set SEO goals for your website and read Google’s free SEO Guides ↑).
Once you have done your homework for on-site SEO, you will have a much better understanding of
what to expect from search engine optimization in general and consultants supporting you with off-site search engine optimization in particular.
On-site SEO is in most cases something that you have to do yourself.
Search Engine Optimization in a Nutshell
Every bit helps. Keep your code clean and your site well-architected; think of your web pages as text documents – not magazine ads; picture your customers searching for answers; be honest; don’t cheat – having good content will help the most. 😉
– Andreas Koch