We at Zyxware have been working on several Drupal projects for more than two years now, and we have learned a lot in the process. Two of our internal projects, www.zyxware.com and www.ubuntumanual.org, have served as testing grounds for different SEO tactics. There are a number of really good Drupal modules that come in handy when you are optimizing a Drupal website for search engines. Here are a select few that really helped in attracting more traffic to our Drupal projects.
No, I am not kidding here. Though I won't recommend SEO Checklist to experienced Drupal users, it really is a good tool for those starting out with Drupal. This module maintains a checklist with which you can track which SEO measures you have already taken and what else can still be done. Really helpful if you are new to Drupal and not yet accustomed to the technical jargon that surrounds it.
A small "Digg this" button at bottom of your article. Digg this module can enhance the reach of your article to a much larger audience, provided your blog has quality content. A good alternative is Service Link module.
A much larger array of choices for users who want to bookmark your articles. Add To Any covers almost all the popular services, including Digg, del.icio.us, Slashdot, StumbleUpon and many more. I recommend using it alongside the Digg This module.
This module allows you to set meta tags for each node, view or Panels page. Meta tags tell search engines which keywords matter in your content, which in turn helps your site rank better in search results.
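As a rough illustration of what such a module does under the hood, here is a minimal sketch using Drupal 6's drupal_set_html_head(); the module name "example" and the tag contents are placeholders, not the module's actual code:

```php
<?php
// Minimal sketch (Drupal 6 APIs, hypothetical "example" module):
// push description and keywords meta tags into the page head.
function example_preprocess_page(&$variables) {
  drupal_set_html_head('<meta name="description" content="A short summary of this page." />');
  drupal_set_html_head('<meta name="keywords" content="drupal, seo, modules" />');
  // Refresh the head markup so the new tags are actually printed.
  $variables['head'] = drupal_get_html_head();
}
?>
```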
Adds the Google Analytics web statistics tracking system to your website. Using Google Analytics you can track all kinds of traffic to your site and its patterns, along with a whole lot of other statistics. Google Analytics is a must if you are really serious about getting more traffic.
This is a really useful tool that submits your site's sitemap to different search engines. Search engine crawlers get an overview of your website instantly, and a fully indexed site looks more authentic among search results, which may bring in even more traffic.
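The submission itself boils down to pinging the search engine with your sitemap's URL. A sketch of the idea, assuming Drupal 6's drupal_http_request(); the ping endpoint shown is Google's, and Yahoo! offers a similar one:

```php
<?php
// Sketch: tell Google where the sitemap lives.
$sitemap = url('sitemap.xml', array('absolute' => TRUE));
$response = drupal_http_request('http://www.google.com/webmasters/tools/ping?sitemap=' . urlencode($sitemap));
if ($response->code == 200) {
  drupal_set_message(t('Sitemap submitted to Google.'));
}
?>
```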
Instead of showing the standard "page not found" error, Search 404 performs a search within your site using the keywords taken from the requested URL, which in most cases turns up the page the visitor was actually looking for.
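A sketch of the idea, not Search 404's actual code: turn the path that triggered the 404 into keywords and hand them to Drupal's core search. The function and its caller are hypothetical:

```php
<?php
// Hypothetical helper: convert a failed path into a site search.
function example_search_404($failed_path) {
  // e.g. 'blog/drupal-seo-modules' becomes 'blog drupal seo modules'.
  $keywords = trim(preg_replace('/[\/_\-\.]+/', ' ', $failed_path));
  drupal_goto('search/node/' . drupal_urlencode($keywords));
}
?>
```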
The Alinks module replaces chosen terms in the content body with links without changing the stored node body; the replacement is performed on display only.
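A sketch of display-only term linking in the spirit of Alinks; the stored body is never touched, and the $terms map is a placeholder:

```php
<?php
// Inject links into the rendered output, leaving storage untouched.
function example_link_terms($text) {
  $terms = array(
    'Drupal' => 'http://drupal.org',
    'SEO'    => 'http://en.wikipedia.org/wiki/Search_engine_optimization',
  );
  foreach ($terms as $term => $url) {
    // Link only the first whole-word occurrence of each term.
    $pattern = '/\b' . preg_quote($term, '/') . '\b/';
    $text = preg_replace($pattern, '<a href="' . $url . '">' . $term . '</a>', $text, 1);
  }
  return $text;
}
?>
```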
The Pathauto module automatically generates path aliases for various kinds of content without requiring users to specify the alias manually. Also enable Clean URLs, which is a natural companion to Pathauto. Clean URLs can be switched on at Administer > Site configuration > Clean URLs, provided your server supports it; the rewrite rules shown below must be in effect.
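On Apache with mod_rewrite, these are the rules Drupal already ships in its .htaccess; they silently map clean paths back onto index.php:

```
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
```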
This module allows you to specify a redirect from one path to another path or an external URL, using any HTTP redirect status.
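Under the hood a redirect with a specific status is a one-liner in Drupal 6; a sketch using drupal_goto(), where 'old-page' and 'node/42' are placeholders (Path Redirect manages such mappings for you through the admin UI):

```php
<?php
// Issue a permanent (301) redirect from one path to another.
if ($_GET['q'] == 'old-page') {
  drupal_goto('node/42', NULL, NULL, 301);
}
?>
```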
The Link checker module extracts links from your content when it is saved and periodically tries to detect broken hypertext links by checking the remote sites and evaluating the HTTP response codes. It shows all broken links in the reports/logs section and on the content edit page if a link check has failed.
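The basic check is simple; a sketch assuming drupal_http_request(), treating 4xx/5xx responses and network errors as broken (Link checker does this in batches on cron):

```php
<?php
// Return TRUE if the URL looks broken.
function example_link_is_broken($url) {
  $response = drupal_http_request($url, array(), 'HEAD');
  $code = isset($response->code) ? (int) $response->code : 0;
  // Drupal reports network failures as zero or negative codes.
  return $code <= 0 || $code >= 400;
}
?>
```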
Pathologic is an input filter that can correct paths in links and images in your Drupal content in situations that would otherwise cause them to "break", for example when the URL of the site changes or the content is moved to a different server.
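A rough sketch of what such a filter does: rebase root-relative src/href attributes onto the site's current base URL so they survive a domain move. Pathologic handles far more cases than this:

```php
<?php
// Rewrite root-relative paths against the current base URL.
function example_rebase_paths($text) {
  global $base_url;
  return preg_replace('/(src|href)="\/([^"]+)"/', '$1="' . $base_url . '/$2"', $text);
}
?>
```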
Global Redirect checks whether the Clean URLs feature is enabled and, if so, whether the current URL is being accessed via the clean method rather than the 'unclean' ?q= method; if not, it redirects to the clean version. It also checks access to the URL: if the user does not have access to the path, no redirect is done, which avoids exposing the aliases of private nodes.
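A loose sketch of the redirect half of that logic, as a hypothetical hook_init() in a module named "example"; the real module also verifies access before redirecting:

```php
<?php
// If clean URLs are on but the request arrived as ?q=...,
// send a permanent redirect to the clean form of the same path.
function example_init() {
  if (variable_get('clean_url', 0) && strpos(request_uri(), '?q=') !== FALSE) {
    drupal_goto($_GET['q'], NULL, NULL, 301);
  }
}
?>
```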
Creates a list of node URLs at /?q=urllist.txt (or /urllist.txt with Clean URLs) for submitting to search engines through Google Webmaster Tools or Yahoo! Site Explorer.
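A sketch of what such a page outputs, using the Drupal 6 database API: one absolute URL per published node, as plain text:

```php
<?php
// Print an absolute URL for every published node, one per line.
function example_urllist() {
  drupal_set_header('Content-Type: text/plain; charset=utf-8');
  $result = db_query("SELECT nid FROM {node} WHERE status = 1");
  while ($row = db_fetch_object($result)) {
    echo url('node/' . $row->nid, array('absolute' => TRUE)) . "\n";
  }
}
?>
```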
Use this module when you are running multiple Drupal sites from a single code base and you need a different robots.txt file for each one. This module generates the robots.txt file dynamically and gives you the chance to edit it, on a per-site basis, from the web UI.
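A sketch of the technique, assuming Drupal 6's hook_menu(): register 'robots.txt' as a menu path and print per-site rules stored in a variable. The physical robots.txt file must be removed so Drupal gets to answer the request; all names here are placeholders:

```php
<?php
// Register robots.txt as a Drupal path.
function example_menu() {
  $items['robots.txt'] = array(
    'page callback' => 'example_robots_page',
    'access arguments' => array('access content'),
    'type' => MENU_CALLBACK,
  );
  return $items;
}

// Serve the per-site rules as plain text.
function example_robots_page() {
  drupal_set_header('Content-Type: text/plain; charset=utf-8');
  print variable_get('example_robots_txt', "User-agent: *\nDisallow: /admin/\n");
  exit;
}
?>
```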