
I have a website that I recently migrated to a new system, keeping the same domain (e.g. example.com). The upgrade is complete and I want search engines to index all the page URLs on my website. I've submitted sitemap.xml to Google Search Console, but only 1 URL is discovered. Why? How do I get all of my page URLs (articles, blog posts, products, etc.) indexed?

(Screenshot: Google Search Console showing the submitted sitemap with only 1 discovered URL)

sitemap.xml

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2018-06-04</lastmod>
  </url>
</urlset>
Stephen Ostermiller
fikfe

1 Answer


The XML sitemap you submitted to Google only has one URL in it. It shouldn't be a surprise that Google reports only discovering one URL in your sitemap. A sitemap should have multiple <url>s in it, one for each page on your site.

Google ignores the <lastmod> in XML sitemaps because very few sites keep it up to date appropriately. It is an optional field, and I recommend dropping it.

In the end, your sitemap should look more like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url><loc>http://example.com/</loc></url>
<url><loc>http://example.com/page-one.html</loc></url>
<url><loc>http://example.com/page-two.html</loc></url>
<url><loc>http://example.com/page-three.html</loc></url>
</urlset>

XML sitemaps are limited to at most 50,000 URLs per file. Since your site has hundreds of thousands of URLs, you are going to need many XML sitemap files. You can either submit each of them separately or create a sitemap index file that lists all the sitemaps and submit only the index. See Google's "Manage Large or Multiple Sitemaps" documentation.
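A sitemap index file is itself a small XML file that points at the individual sitemap files. A minimal sketch, assuming the sitemap files are named sitemap-1.xml, sitemap-2.xml, and so on (placeholder names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://example.com/sitemap-1.xml</loc></sitemap>
  <sitemap><loc>http://example.com/sitemap-2.xml</loc></sitemap>
  <sitemap><loc>http://example.com/sitemap-3.xml</loc></sitemap>
</sitemapindex>
```

You then submit just this index file in Search Console and Google fetches the listed sitemaps from it.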

It looks like you have already submitted text sitemaps with all your URLs. You don't need both text and XML sitemaps: if your text sitemaps are complete and current, there is no need to submit an XML sitemap as well; if they are out of date, they should be removed.
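For reference, a text sitemap is simply a UTF-8 text file with one absolute URL per line and nothing else:

```text
http://example.com/
http://example.com/page-one.html
http://example.com/page-two.html
```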

The other thing that you should know is that sitemaps have little bearing on which pages Google chooses to index or how the pages are ranked. See The Sitemap Paradox. If you want Google to index and rank all your pages, each page should have links pointing to it from other pages. That is a big part of why this site has the "related questions" section on this page.

If you recently changed all your URLs, you should redirect each old URL to its equivalent new URL using 301 (permanent) redirects. Redirects are the best way to preserve the rankings your old pages had.
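How you set up the redirects depends on your server. As one example, on an Apache server they could go in an .htaccess file; the old and new paths here are placeholders, and the exact mapping depends on how your URL structure changed:

```apache
# mod_alias: map old URLs to their new equivalents with a permanent redirect
Redirect 301 /old-article.php http://example.com/blog/old-article.html
Redirect 301 /old-product/123 http://example.com/products/123
```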

To get rid of the old sitemap, delete the file from your server. Google will then soon report "Couldn't fetch" for that sitemap.

Removing old out-of-date sitemaps won't cause the URLs in them to get de-indexed. Sitemaps don't control which pages are indexed and Google typically indexes pages that are no longer in a sitemap. It would be a good idea to include any URLs from your old sitemap that are still good in your new sitemap.

The usual way to generate a sitemap for a large site is to query the database, make a list of pages from it, and write their URLs into a sitemap file. The specific mechanics of that depend on your site. If you are using an off-the-shelf content management system (CMS) there are likely existing tools or plugins that you can use to generate your sitemap. If you have a custom built site, you'll have to write your own sitemap generator.
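As a minimal sketch of a custom generator: the page list is hard-coded here for illustration, but on a real site it would come from a database query, and you would split the output into files of at most 50,000 URLs each. The example.com URLs are placeholders.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return the XML text of a sitemap containing the given URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # escape() handles &, <, and > if they appear in a URL
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return '\n'.join(lines)

# In practice this list would be built from your database,
# chunked into groups of 50,000 per sitemap file.
pages = ['http://example.com/',
         'http://example.com/page-one.html',
         'http://example.com/page-two.html']
print(build_sitemap(pages))
```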

Stephen Ostermiller