I have a .NET Framework MVC web application. I added a robots.txt file to disallow any URL that contains the key "somewords".
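
The rule looks roughly like this (a sketch, not my exact file; note that the * wildcard is honoured by major crawlers such as Googlebot and Bingbot but is not part of the original robots.txt standard):

    User-agent: *
    Disallow: /*somewords
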
How can I test, in my local and test environments, that a crawler is not allowed to access that page?
I cannot use any third-party tool in my environment, since I would need to get permission to use one.
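
For illustration, the only check I can think of so far is a plain HTTP request that confirms the file is actually served with the rule in it (a rough sketch using only built-in .NET Framework classes; the localhost URL is a placeholder for my real site address). It does not tell me whether a crawler would actually honour the rule, which is what I really want to verify:

    using System;
    using System.Net;

    class RobotsTxtCheck
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // Placeholder URL: replace with the real local/test site address.
                string robots = client.DownloadString("http://localhost:5000/robots.txt");
                Console.WriteLine(robots);

                // "somewords" stands in for the real key in the Disallow rule.
                Console.WriteLine("Disallow rule present: " + robots.Contains("somewords"));
            }
        }
    }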