
I have a website with a lot of dynamic URLs, so I want Googlebot to revisit my sitemap every few days. However, my sitemap has not been crawled for a long time now, and no new pages from it have been indexed.
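For context, the entries in my sitemap look roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per dynamic page; example.com stands in for my domain -->
  <url>
    <loc>https://example.com/listing/12345</loc>
    <!-- lastmod is the freshness hint search engines read from a sitemap -->
    <lastmod>2024-01-19</lastmod>
  </url>
</urlset>
```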

server-sitemap.xml was last crawled on Jan 19, when I published the site. I removed the sitemap and re-added it in Search Console, hoping the crawl date would change to today's, but it hasn't.


Only 17 pages are being indexed, and Search Console says those URLs are not in the sitemap, even though the sitemap contains more than 40 URLs. What am I doing wrong here?


Search Console also reports that the site is not crawlable due to some issue.
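For reference, a minimal robots.txt that allows crawling and points the crawler at the sitemap would look like this (example.com stands in for my domain):

```
# allow all crawlers to fetch everything
User-agent: *
Allow: /

# tell crawlers where the sitemap lives
Sitemap: https://example.com/server-sitemap.xml
```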



1 Answer


A sitemap is not a generally useful signal, despite what classical SEO advice claims.

If you want pages indexed, you first have to make sure they are discoverable organically. Bots often ignore sitemap entries that they can't also reach by following links within the site, as in the sketch below.
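For example, every URL listed in the sitemap should also be reachable through ordinary anchor links somewhere on the site; a minimal sketch (the paths are hypothetical):

```html
<!-- e.g. on the home page or a category page: plain <a> links
     give crawlers an organic path to every dynamic URL -->
<nav>
  <a href="/listing/12345">Listing 12345</a>
  <a href="/listing/12346">Listing 12346</a>
</nav>
```

Links rendered only by client-side JavaScript, or reachable only through search forms, are not discovered nearly as reliably as plain server-rendered anchors.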

You should think of a sitemap as a limiting factor, not an empowering one. I often ask clients to delete their sitemaps and proceed without them while they're struggling to set things up properly, and yours is exactly the kind of sitemap I would suggest removing.
