The robots.txt file is not blocking the directories that I want to be indexed, and the sitemap.xml file is defined correctly.
The .htaccess file is correct and is not blocking anything.
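One quick sanity check, if you want to rule robots.txt out entirely: Python's standard-library `urllib.robotparser` will tell you whether a given user agent may fetch a given path. This is a minimal sketch using a hypothetical robots.txt and example.com URLs; substitute your site's actual file and paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- paste in the site's real file.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot has no dedicated group here, so it falls back to the
# "User-agent: *" rules: /private/ is blocked, everything else allowed.
print(rp.can_fetch("Googlebot", "http://www.example.com/products/"))   # True
print(rp.can_fetch("Googlebot", "http://www.example.com/private/x"))   # False
```

If every URL from your sitemap comes back `True` for Googlebot, robots.txt really is off the suspect list.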
All major pages on the site are easily accessible through the main menus, through the left-side navigation, and through a Site Map page that is linked from the footer of every page on the site.
There are no meta tags blocking the bots. I am using:

&lt;meta name="robots" content="index,follow"&gt;
&lt;meta name="googlebot" content="index,follow"&gt;
In Google Webmaster Tools, the correct sitemap.xml file is verified and no errors are reported for it. It was most recently downloaded two days ago.
However, out of the 163 URLs in the sitemap.xml file (this is not a large site), only 57 are shown as "URLs in web index". I do not understand why the indexed number is so small.
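When comparing the sitemap against what the index reports, it can help to pull the exact URL list out of sitemap.xml so you can spot-check individual pages with direct queries. A minimal sketch with the standard library, using a hypothetical three-URL sitemap in place of the real 163-URL file:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap fragment -- in practice, read the live sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/products</loc></url>
</urlset>
"""

# Sitemap elements live in the sitemaps.org namespace, so register it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(len(urls))   # total URLs submitted in the sitemap
for url in urls:
    print(url)
```

Running each extracted URL through an exact-match search (or checking it individually in Webmaster Tools) narrows down *which* 106 pages are missing, which is usually more informative than the aggregate count.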
Anything else I can check?
NOTE: pages on this site that I can't find in Google, I can easily find in Bing. A site:www.domain.com search on Bing shows "243 results"; the same search on Google shows 99.
Edited by qwerty, 17 January 2011 - 01:04 PM.
Removed domain name per poster's request