Suddenly I am having this problem; I only noticed it because Google Search Console shows my site has not been crawled since October 23rd.
I'm on 3.34.1 (this is my url) and having issues overriding the default robots.txt by adding my own file to the theme.
Hello. A client of mine is going through a bit of a crisis: a developer (on their end) added Disallow: / to the robots.txt file.
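Why that one line is a crisis: under `User-agent: *`, `Disallow: /` tells every well-behaved crawler to stay away from the whole site. A minimal sketch using Python's stdlib `urllib.robotparser` (the domain and paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt containing the problematic rule.
# "Disallow: /" under "User-agent: *" blocks every compliant
# crawler from the entire site.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot (matched by "*") may not fetch anything:
print(parser.can_fetch("Googlebot", "https://example.com/"))       # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/"))  # False
```

Removing the `Disallow: /` line (or changing it to `Disallow:` with an empty value) restores crawl access, though re-indexing can take a while.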
Our robots.txt field is populated properly, but the rendered file includes random characters and duplicates our sitemap.
I've searched the forums trying to find an answer to the issue I'm having, where robots are generating multiple abandoned carts a day, but can't find any ...
So I have no idea where this robots.txt file is generated from, where it's located, or how this is happening. It's a ghost file! I even overwrote ...
We've achieved this simply by building the blog functionality on Bubble and migrating from Ghost. See what we got here: No-code blog by Zeroqode.
Over the last few weeks, my search results have dropped off since changing hosts. On the robots.txt tester. ... Disallow: /ghost/. Disallow: /p/.
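For reference, the rules quoted above match what a Ghost install typically generates by default. A sketch of such a file, based only on the lines shown here (the exact output varies by Ghost version, and the sitemap URL is a placeholder):

```
User-agent: *
Sitemap: https://example.com/sitemap.xml
Disallow: /ghost/
Disallow: /p/
```

`/ghost/` is the admin panel and `/p/` holds post previews, so blocking them is expected; a ranking drop usually points at extra rules beyond these, not at the defaults themselves.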
I'm trying to set up robots.txt, but I'm not ... My bad, I want these pages not to be crawled by robots. My question is whether the sitemap is set up ...
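One point worth knowing here: `Disallow` and `Sitemap` are independent directives, so a sitemap declared in robots.txt stays visible to crawlers even when some paths are blocked. A small check with Python's stdlib `urllib.robotparser` (the domain and paths are hypothetical; `site_maps()` requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks a private section but still
# advertises the sitemap. The two directives do not interact.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Sitemap: https://example.com/sitemap.xml",
]

parser = RobotFileParser()
parser.parse(rules)

# The declared sitemap is still reported:
print(parser.site_maps())  # ['https://example.com/sitemap.xml']

# Blocked and allowed paths behave as expected:
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/post/"))   # True
```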
I am worried about the Squarespace default settings having a negative impact on our ranking. (How) can I manually change the settings? Or is ...