Friday 16 September 2011

Selective indexing by Google - why does it index some pages but not others?

We have a consumer web site with a large number of content pages (they have the same structure but varying content) that was launched about a month ago. We improved SEO about a week ago. Now some pages show up in 1st place or on the 1st page, but it appears that the vast majority of pages are not indexed at all, because they do not show up anywhere, even at low ranks. Does anybody have an idea what the reasons could be and how to fix this issue?

We have submitted a sitemap to Google and we also found out that out of some 100,000 pages only about 400 are indexed. At this speed, it would take years to index all pages. The page content also changes slightly from time to time.

You have 2 problems.

1) The internal linking is not correct. You need a site map that links the entire website. I recommend having the entire website linked internally (a rough example follows this list). Many web builders only have a few pages indexed, but what if a person searching is looking for information that is not indexed?



2) The meta tags have similar keywords or titles. Make each page unique and each page will get indexed.
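For point 1, here is a minimal sketch of what an internally linked site map page could look like; the file names and URLs below are made up for illustration:

    <!-- /sitemap.html: a plain HTML site map page linking every section -->
    <ul>
      <li><a href="/widgets/">Widgets</a>
        <ul>
          <li><a href="/widgets/blue-widget.html">Blue Widget</a></li>
          <li><a href="/widgets/red-widget.html">Red Widget</a></li>
        </ul>
      </li>
      <li><a href="/gadgets/">Gadgets</a></li>
    </ul>

    <!-- and every page's footer links back to it, e.g.: -->
    <a href="/sitemap.html">Site map</a>

With a page like this linked from every footer, no content page is more than a couple of clicks away from the home page.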
Companies pay Google to show up higher in searches when people look for certain things.
It seems you need expert guidance. What I suggest, from my own experience, is to go to this site for a guide and email the owner of the site for help:

https://paydotcom.com/r/25065/easyanswer…
The first problem that stood out to me after reading this question was that you submitted a sitemap with 100,000 URLs. The XML sitemap standard says sitemaps should be no larger than 10MB (10,485,760 bytes) and can contain a maximum of 50,000 URLs. After 50,000 URLs, crawlers (including Googlebot) stop reading your XML sitemap. Just create two sitemaps; you can submit multiple sitemaps to search engines.
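A rough sketch of how that could look, following the sitemaps.org protocol (the file names here are hypothetical): split the URLs across two sitemap files and submit a sitemap index that points at both.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap_index.xml: points at the two split sitemaps (hypothetical names) -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap1.xml</loc>
        <lastmod>2011-09-16</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap2.xml</loc>
        <lastmod>2011-09-16</lastmod>
      </sitemap>
    </sitemapindex>

Each of the two sitemap files then stays under the 50,000-URL / 10MB limits, and you can submit the single index file instead of submitting each sitemap separately.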



I also want to mention that another reason you could be getting poor crawling and indexing is your internal link structure. Internal linking is "vital" to proper indexing. All web pages should somehow be internally linked to one another at least once.



Unique title and meta description tags can also be useful. Sometimes search engines will not index multiple pages that share the same title and meta description. SEO-wise, it's also recommended to have unique title and description tags.
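For example (the page and product names are made up), each content page's head section would carry its own tags instead of one shared boilerplate set:

    <!-- blue-widget.html: page-specific title and description -->
    <head>
      <title>Blue Widget - Specs, Price and Reviews | Example.com</title>
      <meta name="description"
            content="Detailed specifications, current price and user reviews for the Blue Widget.">
    </head>

If the pages share a template, generate the title and description from each page's own content (product name, category, etc.) rather than hard-coding them in the template.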



Consider *NOT* indexing all your web pages. I'm sure many sections of your website don't even need to be indexed - most likely more than 20%. Only "important content/item-based" pages need to be indexed; shopping carts, about us, feedback, etc. don't really need to be. Correct usage of the robots.txt file can stop crawlers from WASTING time on your site by blocking un-needed sections of your website.
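A minimal robots.txt sketch along those lines; the paths are only examples, so match them to your own site's structure:

    # robots.txt - keep crawlers out of low-value sections
    User-agent: *
    Disallow: /cart/
    Disallow: /feedback/
    Disallow: /search/
    # everything not listed above stays crawlable

Blocked sections stop eating crawl time, which can leave more of it for the 100,000 content pages you actually want indexed.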



http://www.sitemaps.org/faq.php#faq_site…

http://www.robotstxt.org

http://www.civicseo.com/content/6/43/en/…