# This robots.txt file controls how search engines crawl this website
# User-agent: * means these rules apply to all search engine bots/crawlers
# Allow: / means allow crawling of all pages under the root path

User-agent: *
Allow: /

Sitemap: https://web.landeed.com/sitemap.xml