XML Sitemap
An XML sitemap is a file that lists a website's important URLs to help search engines discover and crawl content efficiently.
An XML sitemap acts as a roadmap for search engine crawlers, listing every URL that should be indexed along with metadata like last modification date, change frequency, and priority. This is especially important for large sites where internal linking alone may not surface all content.
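The sitemaps.org protocol defines this format. A minimal sitemap with the metadata fields mentioned above looks like the following (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/post-1</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints that crawlers may weigh differently.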
Sitemaps should be dynamic, automatically updating as content is published, modified, or removed. Stale sitemaps that include deleted pages or exclude new content waste crawl budget and delay indexing of fresh material.
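Dynamic generation usually means rebuilding the sitemap from the content database on publish or on request, rather than editing a static file. A minimal sketch in Python, assuming `pages` is a list of `(url, lastmod)` pairs that would normally come from a CMS query:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Render (url, lastmod) pairs as a sitemap XML string.

    In a real system `pages` would be queried from the content store,
    so deleted pages drop out and new pages appear automatically.
    """
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; URLs are placeholders.
sitemap = build_sitemap([
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/blog/post-1", "2024-01-10"),
])
print(sitemap)
```

Serving this from a route such as `/sitemap.xml` keeps the file in sync with the content store on every request.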
How GenGrowth Helps
GenGrowth generates and maintains XML sitemaps automatically, including separate sitemap indexes for different content types (blog posts, glossary terms, feature pages). The platform submits updated sitemaps to search engines via their APIs, ensuring new content gets crawled as quickly as possible.
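A sitemap index is itself an XML file that points to the per-type sitemaps. A sketch of what such an index could look like (file names and domain are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-glossary.xml</loc>
  </sitemap>
</sitemapindex>
```

Splitting by content type also keeps each file under the protocol's limits of 50,000 URLs and 50 MB uncompressed per sitemap.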
Related Terms
Robots.txt is a text file at a website's root that instructs search engine crawlers which pages or sections they are allowed or disallowed from accessing.
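Robots.txt is also the conventional place to advertise a sitemap's location via the `Sitemap:` directive, for example (URL is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```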
Structured Data is a standardized format (typically JSON-LD) for providing explicit information about a page's content to search engines.