I never understood why people would use a crawler to build their XML sitemap file. I suppose if you don't have access to your CMS's database you might do that, but it seems so inefficient. John Mueller of Google said the same on Reddit.
John Mueller said “Automate it on your backend (generate the files based on your local database). That way you can ping sitemap files immediately when something changes, and you have an exact last-modification date. Don’t crawl your own site, Google already does that.”
So make sure you rebuild your XML sitemap file directly from what your database records. Do not crawl your own site to generate the sitemap, because (1) you may miss pages and (2) it puts unnecessary load on your server.
Be efficient and smart with your resources.
The good thing is that most CMS platforms these days already do this.
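For platforms that don't, the database-driven approach Mueller describes is simple to roll yourself. Here is a minimal sketch in Python, assuming a hypothetical `pages` table with `url` and `updated_at` columns (an in-memory SQLite database stands in for your CMS's real database); the point is that `<lastmod>` comes straight from the stored modification date, with no crawl involved.

```python
import sqlite3
from xml.sax.saxutils import escape

# Stand-in for your CMS database: a hypothetical `pages` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (url TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?)",
    [
        ("https://example.com/", "2023-01-15"),
        ("https://example.com/about", "2023-02-01"),
    ],
)

def build_sitemap(conn):
    """Build sitemap.xml straight from the database, so <lastmod>
    reflects the exact last-modification date of each page."""
    rows = conn.execute("SELECT url, updated_at FROM pages").fetchall()
    entries = "".join(
        f"  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"  </url>\n"
        for url, lastmod in rows
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}"
        "</urlset>\n"
    )

print(build_sitemap(conn))
```

Because the sitemap is rebuilt from the database, you can regenerate it the moment a page changes, which is exactly the "ping immediately when something changes" workflow Mueller recommends.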
Forum discussion at Reddit.
Author: Barry Schwartz