A robots.txt file is a plain-text file placed at the root of a website that tells web crawlers and search engines which pages or sections of the site they should or should not crawl and index.
Block all crawlers from the entire site:

User-agent: *
Disallow: /
Block a specific page for a specific crawler (here, Googlebot):

User-agent: Googlebot
Disallow: /no-google/blocked-page.html
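The rules can also be checked programmatically before deploying them. Below is a minimal sketch using Python's standard-library urllib.robotparser, assuming the second example above is the robots.txt served at https://example.com/robots.txt (a placeholder domain used only for illustration):

from urllib.robotparser import RobotFileParser

# Placeholder URL; replace with the site whose robots.txt you want to test.
ROBOTS_URL = "https://example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # download and parse the robots.txt file

# Ask whether a given user agent is allowed to fetch a given URL.
# With the Googlebot example above, the first call should return False
# and the second should return True.
print(parser.can_fetch("Googlebot", "https://example.com/no-google/blocked-page.html"))
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.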