Hey team, here's a puzzle for you. I currently have 2 domains that are tied back to the same Gamma website (i.e. Beyondthebid.co; GOFirst.blog). The question I have is: how can I tell Google Search Console to inspect all the pages on these websites without me having to manually ask Google to inspect each page every time I make an update, add a blog, etc.? I know creating an XML sitemap may solve this, but I'm not sure how to do that in Gamma just yet. Any insight/advice would be appreciated. Screenshot for reference below.
Hello! I'm not a Google indexing expert, so I can't speak to that side of it. I can confirm that both of the domains you listed have a <meta name="robots" content="index, follow"> tag set in their headers. However, we do not currently support adding your own sitemap to your Gamma sites.
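For what it's worth, if sitemap support were available, a sitemap is just a small XML file following the sitemaps.org protocol — something like the sketch below. The specific URLs and dates here are illustrative, not real pages from your site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; <lastmod> is optional -->
  <url>
    <loc>https://beyondthebid.co/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://beyondthebid.co/blog/example-post</loc>
  </url>
</urlset>
```

You'd then submit the sitemap's URL under the Sitemaps section of Search Console, which lets Google discover new and updated pages on its own instead of you inspecting each URL manually. The catch, as noted above, is that the file has to be served by the host as actual XML, which is where Gamma's lack of sitemap support comes in.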
What behavior beyond that gets Google to index a page automatically for you is not something I'm personally knowledgeable about. My guess is that Google would eventually reindex your site as long as it's linked to from elsewhere.
Thanks for the help and for looking into this, Dae-Ho C., I appreciate it. That makes sense, and it explains why I'm running into issues when I add my XML sitemap as a code block on one of the pages: it's being served as an HTML page rather than as XML. I wish Google auto-crawled the pages, but it seems I have to manually ask Search Console to inspect each URL in order for it to index the page.
