0

Let's say I have a listing website with thousands of pages. I don't want to make all those pages (items) reachable by crawling links, unless a visitor searches for those items. But I would still like to have all those pages indexed in Google. Please note that those pages do exist and are not broken links; there is simply no direct link to them unless one uses the search function of the website.

How can I achieve that? One thing that comes to mind is to create a complete sitemap with all pages included and submit it to Google for indexing. Would it work? Or would Google complain that it cannot follow a path of links from the home page to those (unlinked) pages?
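For reference, a sitemap submitted through Search Console is just an XML file in the sitemaps.org format, one `<url>`/`<loc>` entry per page. A minimal sketch of generating one, assuming hypothetical item URLs under example.com:

```python
# Minimal sketch: build a sitemap.xml for pages that have no internal
# links. The item URLs below are hypothetical placeholders.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # The root <urlset> must declare the sitemaps.org namespace.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical listing pages reachable only through on-site search.
items = [f"https://example.com/item/{i}" for i in range(1, 4)]
xml = build_sitemap(items)
print(xml)
```

A file like this can be submitted in Search Console or referenced from robots.txt with a `Sitemap:` line; Google does not require the listed URLs to be linked from anywhere, though unlinked ("orphan") pages may be crawled and ranked less readily.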

Thanks

cybergeek654

1 Answer

0

Besides the risk of a penalty, when visitors cannot find any route to pages that are not reachable through a category or an index, you may lose some page visits.

Also, performing a "site:example.com" search in Google will reveal which URLs are indexed.

In addition, if your sitemap sits at a conventional location, anyone could find and use it simply by typing the address and filename, like "example.com/sitemap.xml".

With that in mind, if you are looking to protect your data against piracy, you can register your site with the search engines, add meta tags declaring the content's author, or use tools such as Google Alerts, Copyscape, and so on, as described here.

And if you want to hide your sitemap, there are some suggestions, like: do not use frequently used addresses/filenames, check the User-Agent of the bots you want to allow, and so on, as described here.
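On the User-Agent check: the User-Agent string alone is trivially spoofed, so Google documents pairing it with a reverse-then-forward DNS lookup when verifying Googlebot. A rough sketch of that check (the function name is my own; the domain and DNS steps follow Google's documented verification procedure):

```python
import socket

def is_verified_googlebot(ip, user_agent):
    # Cheap check first: the claimed User-Agent must mention Googlebot.
    if "Googlebot" not in user_agent:
        return False
    try:
        # Reverse DNS: the PTR hostname should end in googlebot.com
        # or google.com for a genuine Google crawler.
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the PTR record itself could be faked.
        return socket.gethostbyname(host) == ip
    except socket.herror:    # no PTR record for the IP
        return False
    except socket.gaierror:  # hostname did not resolve
        return False

# A spoofed request from a non-Google IP is rejected.
print(is_verified_googlebot("127.0.0.1", "Googlebot/2.1"))
```

A server could serve the sitemap (or the orphan pages) only when this returns True, though that adds a DNS lookup per request, so real deployments usually cache the result per IP.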

Omid PD