I have an SEO system in place for my PHP-based site. It automatically rewrites the PHP links to appear as .html links and is sophisticated enough to avoid presenting duplicate .html links when the site is being scanned by robots/spiders.
Unfortunately, it appears that my XML-Sitemaps program is not scanning my site as a spider, so it is mapping many of these links, resulting in duplicate URLs appearing in the sitemap files.
I apologize if I am not explaining this very well. Bottom line: when a spider scans my site, these different links all resolve to the same .html link, but when XML-Sitemaps scans it, they appear to be different, so my sitemap files end up listing multiple .html links that are, in fact, all pointing to the same product page.
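To give a rough idea of what I mean by the code acting uniquely for spiders: the detection is essentially a user-agent check, something along these lines (a simplified sketch, not my actual code):

```php
<?php
// Simplified sketch of the user-agent check (illustration only, not my exact code).
// The real bot list is longer; these are just common crawler signatures.
$botSignatures = array('Googlebot', 'Bingbot', 'Slurp', 'Baiduspider');

$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isSpider  = false;

foreach ($botSignatures as $signature) {
    if (stripos($userAgent, $signature) !== false) {
        $isSpider = true;
        break;
    }
}

// When $isSpider is true, the SEO layer emits a single canonical .html URL per
// product, so crawlers never see the duplicate parameter-based variants.
```

So if the XML-Sitemaps crawler identified itself the way the search-engine spiders do (or if I knew what user-agent string it sends so I could add it to the list), the same canonical links would be generated for it.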
Is there a way to make XML-Sitemaps scan my site the way a spider would, so that the PHP code I have that behaves differently for spiders is triggered?
Thanks for any help.
James