A checklist for (on-site) Search Engine Optimization

In this article I will give you a short list of actions you can take in order to make your web site's HTML structure cleaner, more accessible, more relevant and more useful to your audience – and, as a result of all these actions, SEO-friendly.
Created by Andreas Koch on 2010-10-06

When you are doing search engine optimization there are basically only three universal rules you need to keep in mind while designing, developing or generating content:

If you want to learn more about the three most important rules of search engine optimization, check out this article: the 3 golden rules of search engine optimization (SEO).

But in the end you can boil down the much-glorified discipline of search engine optimization (SEO) to a list of rather simple web development tasks you need to execute bit by bit in order to get good results.

There is no such thing as “the ultimate SEO trick” you need to implement on your web site in order to improve your search engine ranking.

But every little fragment from the SEO-checklist below will improve your chances for a higher search engine ranking.

Valid HTML

No invalid HTML elements

Try to keep invalid HTML to a bare minimum in order to ensure that everybody interprets your HTML code the same way
(you can use a Firefox add-on for this task: HTML Validator for Firefox).

If you have HTML Validator installed you will see a tiny icon in your Firefox status-bar that indicates whether the HTML of the page you are currently viewing is valid or not.
You will be surprised how many big sites have invalid HTML code on their pages.

And even though it is not absolutely necessary to be 100% W3C compliant, the “zero errors / zero warnings” message from HTML Validator is what you should aim for – warnings are not pretty, but they are not so bad either.

But I must admit that this is more of a “web designer’s code of honor” thing than a real factor for a good search engine ranking.
→ Valid HTML code is more of an issue for the usability and design of your web sites.

Interesting links about the impact of valid HTML code on the search engine ranking:

Google Webmaster Central: Why doesn't Google.com validate?

Assign descriptive and meaningful ALT texts

Make sure all <img> tags (as well as <input> and <area> elements) are equipped with a descriptive and meaningful ALT text that supports the goals of the web page by adding some useful information.

But also keep in mind that the text assigned to the ALT attribute should not exceed 100 characters.
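A minimal sketch of what descriptive ALT texts could look like (the file names and texts here are made up for illustration):

<img src="/images/seo-checklist-cover.png" alt="Cover page of the on-site SEO checklist" />
<input type="image" src="/images/search-button.png" alt="Start the search" />
<area shape="rect" coords="0,0,80,20" href="/contact" alt="Contact page" />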

Don’t worry too much about invalid HTML attributes

Invalid or unsupported attributes (e.g. “autocomplete”) don’t hurt your search engine ranking, but try to avoid them as well if possible.

Run an accessibility checker on your web pages

Install the WAVE (web accessibility toolbar) Add-on for Firefox and run an accessibility test. The tips the tool gives you are generally good points to start your search engine optimization with. Many of the criteria applied by accessibility checkers are also relevant for SEO.

And besides, the tool offers some great features (Structure/Order view, Text-only view, disable styles & document outline) that we will reuse for the search engine optimization tasks further down this checklist.

Content Optimization

Create a list of keywords you would like to optimize your web pages for.

What the SEO keyword-list should capture is:

The list of keywords for your personal search engine optimization process could look something like this one:

This list will do two important things for you:

Note: Whenever you are modifying content on a web page, changing the meta-description of a document, assigning a new document title or creating a new web site structure you should keep this list in the back of your head.

With this list on your desk you will now be able to identify the starting position for your search engine optimization efforts. Order the keyword list by your personal priority, then open a clean browser instance (no cookies, no cache), Google each term, and write down the position you are listed at for each of them, along with the URL of the page that is indexed.

Make sure you are not logged in and you clear your cache and cookies before you perform a new search, because otherwise Google might just show you personalized search results.
If your desired page does not show on the first five or ten pages, you can see if your page is indexed at all by entering a search query with this syntax into the search field:

YourSearchTerm site:YourDomainName.com

Write down “not indexed” next to your keyword and move on to the next phrase.
If your web site is not listed for one or more terms, Don’t Panic. We are here to fix that.

If you have the feeling that the information on your web pages will not suffice to attract new users from search engines, consider writing some new content that relates to the keywords you are trying to optimize your web site for right away. The time you invest in writing new and unique content is always well spent, because search engines love new content (almost regardless of the structural problems you might have on your web site that would normally scare away search engines).

Creating such an SEO keyword list is a lot of work and certainly not easy – you have to think hard about every single keyword and search query on it. But if you don’t create this list, you will have no baseline against which to evaluate the success or failure of your SEO actions, and your content editors will have no guidelines they can follow while creating new content.

Train your content editors

In order to optimize your content for the list of keywords you have just created for your web site, you must make sure that every single content editor and author on your web site knows about your SEO keyword list and understands how they can adapt their writing to reach your SEO goals.

Optimize your content for the target list of keywords:
Inform your authors and editors about your keyword-priorities and make sure they have time to create good content – it is worth it.

Tips & tricks for creating SEO friendly content:

Optimize your web site’s search result presentation

Once you’ve made the leap into the search results of Google, Yahoo and Bing, you must make sure that your web pages are displayed in an optimal way. Even if you are displayed within the first ten search results for a specific keyword, you can still lose a lot of clicks by screwing up your web site’s search-result signature:

The search-result signature of a web page must convince a web user within milliseconds that the current web page is delivering the information he or she is looking for.
The signature is generally defined by three basic information items every web developer, editor or author can easily influence:

Title tag optimization

The text you put into the <title> tag of your HTML pages has a tremendous influence on your Click-Through-Rate (CTR), because the document title is always displayed at the same prominent position in the search results (for all search engines).

A document with a title that contains the keyword the user has searched for (and includes the keyword in the META description text) automatically appears much more reliable than a result that is ranked a little bit higher but doesn’t name the search term in the title and/or the META description text.
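To illustrate, here is a hypothetical title tag that leads with the page’s most important keyword and keeps the branding at the end:

<title>SEO Checklist – On-Site Search Engine Optimization for Web Developers | example.com</title>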

As you can see in this screenshot of a Google search results page, it is important that you include the most important keyword of a web page in its title tag, because this is the first thing web users will notice while scanning through the search results – pages which don’t include the user’s search query in the document title appear less relevant (even though Google placed them into the first ten search results):

Tips for SEO-friendly title-tags:

Meta-Description

The meta description text (often referred to as “snippet”) is the second important component of your web pages’ search result presentation.

You can use the meta description to write a short and meaningful summary of your document’s content, giving search engine users all the information they need to decide whether the current page might hold the information they are looking for or not.

Considering that your Click-Through-Rate (CTR) might drop considerably if you don’t manage to interest users in your content, you should think twice about what text to put into a web page’s meta description.

Meaningless marketing claims such as “Get it now!”, “Best product ever” or “Go green” might work for TV spots, but certainly not on a search results page on which a user decides within milliseconds whether a page looks promising or not.
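Here is a sketch of a meta description that summarizes the page instead of shouting a marketing claim (the text is a made-up example):

<meta name="description" content="A step-by-step checklist for on-site search engine optimization: valid HTML, content optimization, redirects, XML sitemaps and page performance.">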

Tips for creating useful meta-descriptions/snippets:

Use human readable, search engine friendly URLs

URLs are an important part of your documents and shouldn’t be generic. Generic URLs are both bad for your users and harmful to your search engine ranking (→ text is king!).
Make sure all the URLs you want search engines to index have a clear hierarchy, are human-readable and descriptive, and include some of the major keywords you want to optimize your site for.

I think I created a quite sensible and SEO-friendly URL structure / architecture for my blog here:

Maybe you can find a similar URL format for your web site; one that your visitors can understand and which helps your SEO goals.
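To illustrate the difference, here is a made-up example of a generic URL and a human-readable alternative for the very same document:

Generic:     http://www.example.com/index.php?id=82&cat=7
Descriptive: http://www.example.com/document/html/seo-checklist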

Tips & tricks for SEO-friendly URLs:

Optimize your heading tag usage

Making sure that your web pages are clearly structured with semantic markup (e.g. heading elements: <H1>, <H2>, …, <H6>) is of great importance for both your normal web users and search engines.
Using heading elements as semantic markup helps web users and search engines understand the structure of your content much better than they would without it.
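A minimal sketch of a clean heading hierarchy, using this very article as the example (the indentation only visualizes the document outline):

<h1>A checklist for (on-site) Search Engine Optimization</h1>
	<h2>Valid HTML</h2>
		<h3>Assign descriptive and meaningful ALT texts</h3>
	<h2>Content Optimization</h2>
		<h3>Train your content editors</h3>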

But you must be 100% confident that you are using the heading elements properly, because otherwise you will only confuse your visitors and the search engines:

You can easily validate your document’s outline if you have the WAVE toolbar for Firefox installed.
Just click on the “Outline” icon and you will see the outline of any given web page:

Tips and tricks about using heading tags on your web pages:

Google Webmaster Central YouTube-Video: Multiple H1 tags on a page

Content-Language

Another useful hint you can give search engines is your web page’s content language. Since most web users expect search results in their respective mother tongue, the content language can be used by search engines to decide whether your content is relevant to a user with certain language preferences in their browser or not.

You can specify your web page’s language using three different techniques:

<html lang="en" xml:lang="en">
...
</html>

<meta http-equiv="Content-Language" content="en">

HTTP/1.1 200 OK
Content-Language: en

The language codes you must use for all three types of Content-Language flags must comply with the ISO 639-1 standard:

But you are allowed to specify a list of content languages if your content mixes multiple languages:

<meta http-equiv="Content-Language" content="de, en">

My advice is to set the content-language in the html tag and with the Content-Language META tag if you can, because search engines might make mistakes detecting the language of your content.

But don’t worry if it is too much work or just not possible for you to specifically name the content’s language – I think search engines don’t care as much about the Content-Language nowadays, because they have been fooled too often and are able to detect the language themselves.

Watch this video to get an impression about how advanced Google’s translation capabilities are:

Inside Google Translate

→ But it certainly doesn’t hurt your SEO ranking if you specify your content-language.

Links about the Content-Language meta information:

Link Optimization

Use descriptive anchor texts

In order to inform search engines about the (internal) links you are setting to other parts of your web site, you must use anchor texts which actually describe the content of the document the link is pointing to.

This seems obvious, but it is often enough done wrong by web developers and content editors – and you will certainly find more than one bad example on this web site.

Here is a list of commonly used but very bad examples of anchor texts which don’t give search engines any clues about the content found at the link’s destination (because unlike humans, search engines cannot understand the context in which the link is being used):

Tips for writing anchor texts:

→ Basically you can apply the same rules to writing anchor texts as would apply to headings.
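For illustration, here is the same (hypothetical) link once with a meaningless anchor text and once with a descriptive one:

<!-- Bad: tells search engines nothing about the link target -->
<a href="/document/html/seo-checklist">Click here</a>

<!-- Better: describes the content of the linked document -->
<a href="/document/html/seo-checklist">SEO checklist for web developers</a>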

Make use of internal linking

Taking into account how search engines work, you must set internal links to the documents you want to promote from as many pages of your web site as possible.
→ The reputation/weight of a certain web page grows (in comparison to all other web pages within your site), the more often this specific page is linked from other pages on your web site.

Tips & Tricks for improving your internal linkage:

Google's Matt Cutts explaining how you can emphasize a certain set of pages on your site

Make sure that important links are not generated by JavaScript code

If you have dynamic content elements on your web site, make sure that they are built from HTML code that already exists on your pages. If you base rich user interface components such as a country selector, a product slider or a language switcher on data that is delivered by JavaScript code, you’ll lose many potentially valuable links that would otherwise have improved your site’s ranking.

Here is a bad example of a web site using a dynamic fly-out menu as the country-selector, pointing users to the different country-portals, that is solely based on JavaScript code and hence invisible to search engines – none of the links is mentioned in the HTML code of the web page:

→ It would have been smarter to put the links of the country-selector into the HTML source code of the web site and leave just the interactive part, the formatting and the styling to the JavaScript code.
This way search engines can crawl your page, flow out through these links, index the linked pages and relate them to your web site.

On the other hand you can also use this technique to keep certain text snippets or links away from Google’s search index.

One scenario might be a link you want to have on all your web pages that leads the user to a login-page or an area of your web site which requires the user to be logged in.
These links are not relevant to search engines and hence don’t need to be represented in the HTML source code of your web site:

In this concrete example (↑) I have the problem that Google is trying to index these login-page links and reports “Soft 404 errors” in the Google Webmaster Tools “Crawl errors” section – even though I do have a meta noindex on these pages:

What I could do to solve this problem is move this link (which is only relevant to content editors and certainly not to search engines) from the HTML source code of my web site to a simple JavaScript function generating this link:

<script type="text/javascript">
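// Note: this snippet assumes that jQuery is loaded and that an element with the id "loginLink" exists in the page's markup.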
var loginUrl = "http://andykdocs.de/andykdocs/login.php?redirect=/document/list/latest&login=1&minimumaccesslevel=1";
$('#loginLink').html('<a href="' + loginUrl + '">Login</a>');
</script>

Long story short: If you have links and content that shall not be indexed – generate it with JavaScript.

Hidden Links & Hidden Content

If you turn a hidden list of links or hidden parts of your page’s content into an interactive UI element (drop-down menu, fly-out menu or a TabView control) via JavaScript, I think you don’t have to worry about being banned from the search index.

If you have HTML elements (especially links) which you hide via CSS (display: none), you can be sure that the Google Bot will still find and follow them. → Just be honest and don’t be sneaky about your hidden text. Either you have a good reason to hide certain elements on your page or you don’t.

Must-reads about hidden content:

Google Webmaster Central YouTube-Channel: How not to hide text

Avoid broken links – keep your content up-to-date

Removing or fixing broken links is more about keeping the user experience of your site intact, than it is about improving your search engine ranking.

There are a lot of crappy tools around the net to check your web site for broken links, but none of them is really good (at least I don’t know any of them).

My advice is to use the “Crawl errors” section of the Google Webmaster Tools.

In the Google Webmaster Tools “Crawl errors” section you can easily see when a broken link was detected by the Google Bot and from which (and how many) pages this link is referenced:

And you don’t have to actively run a tool every week or so that takes hours to crawl all your web pages looking for broken links.

→ The Google Webmaster Tools are just great!

Prevent search engines from indexing pages which should not be indexed

On (almost) every web site there are certain pages which are available to everybody, but which you don’t want to appear in the search results before your actual landing page when somebody is searching for your brand name. Or you have pages on your site which you simply don’t want indexed:

Don’t hesitate to disallow search engines from crawling these pages or areas of your web site if you think they will hurt your SEO goals.
My site here isn’t a very good example, but my robots.txt looks something like this:
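(A simplified sketch – the disallowed paths here are placeholders, not my actual ones:)

User-agent: *
Disallow: /login/
Disallow: /search/

Sitemap: http://www.example.com/sitemap.xml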

And here is another example where I use the “noindex” META tag element to prevent search engines from indexing my login pages:
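The standard syntax for such a robots META tag in the <head> of a page is:

<meta name="robots" content="noindex">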

Useful articles about robot exclusion:

Duplicate Content

Forget valid HTML and content optimization for a moment – duplicate content is the worst thing that can happen to your web site’s ranking. Not because search engines will ban you from their index if you have a lot of copied content on your web site, but because the content of your web site is generic and therefore not interesting to users and search engines.

But don’t panic: if your site is not a spam site you probably won’t be punished by Google, Bing or Yahoo if you have, for example, the same product description as many other sites. But your chances of being found will of course go down, simply because there are more pages which fit the same search query.

If you run an e-commerce web site that has almost the same product descriptions as many other web sites, you shouldn’t make it worse by creating duplicate content on your own web site.

Links and articles explaining what duplicate content is:

Don’t place the same content under multiple domain names (Addon domains)

Registering your web site under many different domain names won’t help your search engine ranking! Search engines will consider the web sites duplicate content and will link to the “wrong” version and/or rank them lower than necessary.

Useful articles and videos on duplicate content:

Redirects / There can be only one … URL to a document!

Use redirects to normalize (or canonicalize) your URL structure!

Make sure you have all the redirects in place that you need to rigorously enforce your web site’s URL architecture – you cannot allow any exceptions to your URL structure/architecture, and there certainly cannot be two URLs for the same resource or web page.

But you can’t always control how people link to your web site, and so you must anticipate the most common errors and prepare permanent (301) redirects that lead these users (and search engines) to the correct URL.
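A minimal sketch of such a permanent redirect, assuming an Apache web server with mod_rewrite enabled (here: redirecting the bare domain to the www host):

# .htaccess – send example.com to www.example.com with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]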

And why would you do that? Your Content Management System probably shows the correct page either way.
Yes, you are right: web users don’t care if they can access the same content under two (slightly) different URLs.
But search engines do care!
When search engines see two URLs and get an HTTP 200 status code for both, they will “think” they have two different pages at hand.
Eventually they will discover that both URLs lead to the same content, but until then the damage might already have been done:

And all the reputation and trust your page got from these external sources will now be split into four pieces – just because you forgot to install proper redirects.

→ What’s better? Having four links in the top 40 search results or having one link on the first page?

Helpful links and resources about creating redirects:

Make use of the Canonical Link Element

The canonical link element is a rather new method to reduce the number of duplicate content issues by informing search engines (Google, Bing and Yahoo) about the correct URL of a web page.

An example usage for the canonical link element could be the landing page of your web site. If you are using campaign IDs to track the success of a promotion or your advertisements, you will certainly find several different URLs to your web site on the web:

… and many more. And even though you know that all these links are linking to the very same web page, Google and other search engines will have a hard time figuring out which URL is the official one.

But you can ease the duplicate content pain by giving the search engines a hint about which one of the URLs you chose as the canonical one:

<html>
<head>
	<link rel="canonical" href="http://www.example.com/start" />
</head>
<body>
	...
</body>
</html>

→ Choosing one URL as the canonical one will help unify the scattered reputation of all the links to your web page.

Read more about the Canonical Link Element:

Avoid pages with similar content

If you have two web pages with (almost) identical content you should ask yourself what the hell went wrong:

Watch these Google Webmaster Central YouTube Videos to understand what duplicate content is and what it is not:

How are duplicate shopping pages with different currencies handled in Google?
Duplicate Content & Multiple Site Issues

Give your content editors time and clearance for coming up with good text!

If you are running a commercial web site and you only invest in your platform and don’t invest in your content, you are doomed (unless you are Apple; they can sell just about anything).

And make sure they know the keywords you are optimizing your site for.

Improve your Website’s Performance

Increasing your web site’s performance is another component of search engine optimization which will not directly translate into a better ranking, but it will increase usability and help lower your bounce rates.

Nowadays, web users aren’t willing to wait for slow web pages. If your web site doesn’t render within the first 2-3 seconds, chances are high that visitors will leave immediately and try the next search result.

But as Matt Cutts explained in a recent Google Webmaster Central Help Video (Should I be worried about ad servers affecting my site’s speed and ranking?), you don’t have to worry too much about your web site’s performance, because it is just one of many factors which influence your search engine ranking. Optimize your content first and your performance second.

But read Google’s official statement on site speed and decide for yourself whether it is worth the trouble for the search engine optimization of your web site:
Using site speed in web search ranking.

If you decide to invest into your web site’s performance you can do so by working off the issues reported by Yahoo! YSlow and Google Page Speed (both tools are add-ons to Firebug for Firefox).

Both YSlow and Page Speed are really great tools for analyzing your web site’s performance and will generate a list of actions you can take in order to improve it.

And for starters you can read Yahoo!’s Best Practices for Speeding Up Your Web Site and Google Page Speed’s Web Performance Best Practices. The YSlow and Page Speed performance guides will help you understand which actions can be taken to improve a web site’s performance.

My advice for web site performance optimization is:
Go for the low-hanging fruit first!
On the performance-optimization lists created by Page Speed and YSlow there are always items that can be easily implemented and will help your web site’s performance tremendously.

Measure the performance of your web pages (page load time)

First of all you need to identify your web site’s “performance status quo”. You might think that your web site is fast enough, but you need to know for sure. You must measure and write down the actual page load times in order to improve them – or even to begin to understand that your web site might be too slow.

So, how is the “page load time” defined?
The answer is straightforward: “Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser.” (source: Google Webmaster Central - Site performance)

There are a couple of great tools that can help you measure your web site’s page load times (PLT) on a per-page basis:

Firebug [Firefox]

Of course, the almighty Firebug has the ability to measure page load times. Just install it in your Firefox instance, click on the little “bug” in the lower left corner of your browser window, click on the “Net” tab, select “All” and see the page load time in seconds in the network activity monitor.

HttpWatch [Firefox, Internet Explorer]

Another great tool for measuring your web site’s performance and page load times is HttpWatch. HttpWatch is an integrated HTTP sniffer for Internet Explorer (IE) and Firefox that allows you, just like Firebug, to measure the page load times of your web pages.

Chrome/Webkit Developer Tools [Chrome]

The Google Chrome Developer Tools (DevTools) have quickly become my second favorite tool suite for web development. The Chrome DevTools have a network activity monitoring feature similar to Firebug’s which allows you to easily measure the page load time of your web pages.

Online services for web site speed analysis

And there are numerous online tools which allow you to enter your web site’s URL and let them measure the page load time of the URL for you:

These tools and services for measuring your web site’s speed help you get an outside view on your web site’s performance, but they cannot substitute for a detailed performance analysis.

Choose a tool that you like (Firebug, HttpWatch, Chrome Developer Tools, …) and write down the page load times for every page type of your web site. Make sure you don’t have the page in your browser cache while testing! And test at least one heavy (content-rich, many images) and one light (mostly text) version of each page type and write their page load times into an Excel sheet like the one displayed below (↓):

If you want to know the average page load time of all the pages within a web site, and you are too lazy to measure them yourself, you can just use the Google Webmaster Tools’ “Site performance” feature which shows you a graph of your web site’s average page load times over the past few months:

And if you wonder how Google collects the page load times of your web pages (I would have guessed that the data comes right out of the Google Bot, but apparently that is not the case) – they collect the performance data from users of the Google Toolbar who have the PageRank feature enabled (source: Google Webmaster Central - Site performance).

Blog posts, articles and videos on the impact of page load times on the search engine ranking:

Measure the size of your web pages (page weight)

Another important factor in your web site’s performance is the size of your web pages. Even though there isn’t a hard limit a web page shouldn’t exceed, you should consider your average visitor’s bandwidth and keep in mind that every extra kilobyte will slow down your page load times.

Considering the theoretical download time (in seconds) for only 500 KB, you should be alarmed if your pages are heavier than one megabyte (> 1 MB). Everything below 500 KB should be fine. But in terms of page size, less is always better.
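A quick back-of-the-envelope calculation: 500 KB are roughly 4,000,000 bits, so on a 1 Mbit/s connection the theoretical download time is already about 4 seconds – and a 2 MB page would take around 16 seconds on the same line.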

If you want to reduce the overall size of your web pages (HTML, Images, JavaScript, CSS) you can basically do three things which will have a significant impact on your page size (in the order they are listed here):

Optimize your Images

The biggest impact on your page size can be achieved by optimizing your image usage. On many web pages, images are the biggest chunk of data that needs to be downloaded by web users.

Enable Gzip Compression

Activating GZIP compression for static content (*.html, *.js, *.css) reduces the page size by between 50 and 70 percent – and is therefore one of the best things you can do to reduce the size of your web pages. It’s just a configuration you have to activate on your Apache or IIS web server and it will just work.
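On Apache, for example, activating the compression can be as simple as this (a sketch assuming Apache 2.x with mod_deflate enabled):

# compress HTML, CSS and JavaScript responses on the fly
AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript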

You can easily see the advantages of using server-side GZIP compression when you have YSlow installed. The “Components” tab of YSlow contrasts the uncompressed file sizes of HTML, JavaScript and CSS files with the compressed file sizes – the advantages of using GZIP compression are just incredible.

If you want to learn more about GZIP compression for web servers you can read these articles:

Minify your CSS & JavaScript

If Gzip compression isn’t enough for your web site, you can reduce the file sizes of your JavaScript and CSS files even further if you minify and pack them.

JavaScript and CSS minification takes your existing *.js and *.css files, removes all unnecessary characters (line breaks, tabs, white-space) from the source files and replaces long variable names with short ones – where possible.
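A tiny before/after example of what minification does to a CSS rule:

/* before minification */
.article-header {
    margin-top:    10px;
    margin-bottom: 10px;
}

/* after minification */
.article-header{margin-top:10px;margin-bottom:10px}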

If you already have GZIP compression activated, minification won’t save as much as GZIP compression itself does, but it is a good thing to do anyway.
→ Every little bit helps.

Tools for CSS and JavaScript minification:

Use browser caching – Set proper expires headers

Leverage browser caching by setting the expires headers of your content wisely.
If you have resources like JavaScript, CSS or image files which you know won’t change any time soon, you should definitely set an expires header which lies far in the future.

If you are worried that some users might see a broken version of your web site after you deploy a UI update, because they have an old version of the styles or scripts in their browser cache, you can use file names which contain a hash value of the file’s content.
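A sketch of far-future expires headers, again assuming an Apache web server with mod_expires enabled:

# cache static resources for a long time
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"

With content-hashed file names (e.g. a hypothetical styles.3f7a9c.css) you can then cache such resources for a whole year and simply reference a new file name after each deployment.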

Links about browser caching and expires headers:

Inform search engines about your content

Normally you don’t have to actively inform search engines about your content – if your web site is linked anywhere on the web and you have a good internal link architecture on your site, search engines will find their way to your web site sooner or later and crawl all the links they can find from the linked page.

If you don’t want to take your chances and (maybe) wait weeks for a search bot to index your site, you can give the search engines a helping hand and speed up the indexing process.

Create an XML Sitemap for your web site

XML sitemaps are an easy, machine-readable format for listing all the links on a web site.
All you need to do to add XML sitemap support to your web site is:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

	<url>
		<loc>http://www.example.com/index.html</loc>
		<lastmod>2005-01-01</lastmod>
		<changefreq>daily</changefreq>
		<priority>0.8</priority>
	</url>

	<url>
		<loc>http://www.example.com/product-xy.html</loc>
		<lastmod>2005-01-01</lastmod>
		<changefreq>monthly</changefreq>
		<priority>0.9</priority>
	</url>

	<url>
		...
	</url>

</urlset>

When you are done the XML sitemap could look something like this:

The Sitemaps protocol is currently supported by Google, Bing, Yahoo and Ask (although I am not sure if Yahoo really got it right, because sometimes they return the XML sitemap itself in their search results).

But note that XML sitemaps aren’t the ultimate way of announcing your content:
(At least) Google doesn’t guarantee that it will index all the links in your XML Sitemap, so you should also have an HTML sitemap that is useful for both your web users and search engines.

Google Webmaster Central Video-Answer – Which is better: an HTML site map or XML Sitemap?

Link the HTML sitemap in the footer of all your web pages and search engines will certainly crawl all the links listed in your HTML sitemap (if there aren’t too many).

Resources and articles on XML sitemaps:

Increase the number of external links to your content

Search engines will not automatically crawl all the links you specify in your XML sitemap.
To increase the chances of your web pages being crawled by search engines, you must have as many external (trusted) sites as possible linking to your content.

How can I get Google to index more of my Sitemap URLs?

Let the search engines know that you have created new content

If you already have an XML sitemap and you want the search engines to crawl recently published content on your web site, you can actively submit (ping) your sitemap URL to the different search engines.

Example URLs for submitting an XML sitemap to the different search engines:
Submit a sitemap to Google:
http://www.google.com/webmasters/tools/ping?sitemap=http://andykdocs.de/sitemap.xml

Submit a sitemap to Bing:
http://www.bing.com/webmaster/ping.aspx?sitemap=http://andykdocs.de/sitemap.xml

Submit a sitemap to Yahoo:
http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap=http://andykdocs.de/sitemap.xml

Submit a sitemap to Ask.com:
http://submissions.ask.com/ping?sitemap=http://andykdocs.de/sitemap.xml

Help pages on how to submit XML sitemaps to search engines:

Monitor your SEO-Status

In the last couple of years, all major search engines (Google, Bing, Yahoo) have created several opportunities for webmasters to get some analytical data out of their databases (→ Search Engine Analytics / SEO Traffic Analytics).

And in order to really understand how your web site performs on the different search engines you must take advantage of these tools.

Make use of the webmaster tools provided by Google, Bing and Yahoo

Google Webmaster Tools

The Google Webmaster Tools are a great tool for monitoring your SEO efforts and contain a variety of really useful features:

SEO Monitoring
Settings and administration
Tools

And for every single feature you will find a ton of examples and explanations in the Google Webmaster Tools Help Center.

The only big disadvantage of the Google Webmaster Tools is that they only provide data for the last month. So don’t forget to download the data every month!

Bing Webmaster Tools

The Bing Webmaster Tools are not as good as the Google Webmaster Tools but have been improved just recently:

They added a nice-looking and usable Silverlight interface which allows you to submit XML sitemaps, view charts and explore your search-related statistics:

Yahoo Site Explorer

The Yahoo Site Explorer allows webmasters to submit XML sitemaps, view crawl errors, explore the search queries and view the web site statistics.

Just like the Bing Webmaster Tools, the Yahoo Site Explorer is just a bad clone of the Google Webmaster Tools … and (at least for my web sites) mostly irrelevant, because Yahoo’s share of my overall search engine traffic is only 0.47% – even less than Bing’s.

But even if Yahoo is not bringing much traffic to your site, you should keep an eye on your SEO status using the Yahoo Site Explorer.

Make use of Web Analytics

There is no way to do search engine optimization without a properly configured web analytics provider – there is no other way to understand how people are using your web site.

Web analytics providers like Google Analytics or eTracker can deliver valuable insights about

and many more things.

Web analytics is a science of its own – but one you cannot afford to neglect.

→ Web analytics is your only source for collecting the empirical behavioural data that builds the base for all your optimization efforts.

Monitor your up-/down-times

If you are not running an online business you might not always be aware of your web site’s downtimes.

But even though (short) downtimes won’t affect your web site’s ranking, you should try to keep them to a bare minimum.

If your content is good, the GoogleBot and the other search engine bots will always come back another time if your web site is temporarily unavailable. But I am pretty sure that the availability of your web pages is at least one “signal” that search engines include when building their search index.

Why would a search engine place a web site that is known to be a shaky candidate on position one of the search results, if the next best web site is content-wise just as good as the first one – but more reliable?

Search engines are always trying to find the best result for a given query.
And since web users don’t like web pages in their search results that are temporarily unavailable (or responding slowly), search engines will care about a web site’s availability (and performance).

A good (and free) service for website monitoring I can recommend is AreMySitesUp.com. They will check your web sites for downtimes and timeouts and send you an email and/or SMS if a site causes trouble.
→ This way you can address issues right away and reduce downtimes.

Staying on top of the Game – Getting the latest SEO News

Your knowledge about the web and search engines might suffice today, but you certainly need to keep up with what’s going on in order to survive in the long run.

Watch the “Google Webmaster Central YouTube-Channel” videos

The Google Webmaster Central Help Channel contains hundreds of knowledgeable videos giving informative answers to every conceivable SEO-related question.

This YouTube-Channel is without a doubt the best source for search engine optimization tips and tricks … and is fun to watch.

Follow the updates posted on the official developer blogs of Google, Yahoo and Bing

The search engine providers are constantly improving and bug fixing their search algorithms to deliver better results (e.g. Google’s State of the Index 2009) – and you should inform yourself about these changes:

Constantly evaluate your SEO status

And you should constantly check your web sites for crawling errors, broken links and performance issues.

Conclusion – Final thoughts on Search Engine Optimization

Search engine optimization is simply a two-stage process – once you have done your “homework” you can spend your time creating unique and useful content:

So the essence of what I am trying to say to those of you who want to do search engine optimization can be boiled down to these few bullet points:

This SEO Guide is a more detailed and general tutorial for web developers and content editors, explaining search engine optimization from A to Z with many examples.

→ All the topics addressed in Google’s Search Engine Optimization Starter Guide and Google’s SEO Report Card are certainly issues you should address in all of your search engine optimization projects.

Once you have done your homework for on-site SEO, you will have a much better understanding of what to expect from search engine optimization in general and from consultants supporting you with off-site search engine optimization in particular.

On-site SEO is in most cases something that you have to do yourself.

Search Engine Optimization in a Nutshell

Every bit helps. Keep your code clean and your site well architected; think of your web pages as text documents - not magazine ads; picture your customers searching for answers; be honest; don’t cheat – having good content will help the most. 😉

Mahalo
– Andreas Koch
