I am doing an SEO audit for a client and found that the robots.txt is generated with JavaScript, so the file looks very different depending on whether JS is turned on or off. Does anybody know whether this is good practice?
Also, the robots.txt looks different depending on location. For example, the file my colleague saw in Hong Kong is different from the one I saw in mainland China. Is that good practice?
Thanks for helping!
(In both cases, the difference is that one version disallows something while the other disallows nothing.)
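To confirm the two versions really behave differently for crawlers, I checked them with Python's built-in `urllib.robotparser` (the contents and URL below are simplified placeholders standing in for what I actually saw):

```python
from urllib import robotparser

# Hypothetical stand-ins for the two versions of the file I observed:
# one disallows a path, the other disallows nothing.
version_a = "User-agent: *\nDisallow: /private/\n"
version_b = "User-agent: *\nDisallow:\n"

def allowed(robots_txt, url):
    """Return True if a crawler matching '*' may fetch the given URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("*", url)

print(allowed(version_a, "https://example.com/private/page"))  # False
print(allowed(version_b, "https://example.com/private/page"))  # True
```

So depending on which version a crawler receives, the same URL is either blocked or crawlable, which is why the inconsistency worries me.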