Deny search engines access to your site - robots.txt

Thursday, August 4, 2016

Denying search engines access to your site

  1. Create a plain text file called robots.txt (all lowercase — crawlers request /robots.txt, and filenames are case-sensitive on most servers).
  2. Add the following:
    User-agent: *
    Disallow: /
  3. Upload the file to the root directory of your site, so it is reachable at /robots.txt. A quick way to confirm it is being served is sketched below.
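
To confirm the file is being served from your root, you can fetch it directly. A minimal sketch using Python's standard library, with example.com standing in as a placeholder for your own domain:

    # Fetch the live robots.txt to confirm it is reachable at the site root.
    # example.com is a placeholder; substitute your own domain.
    from urllib.request import urlopen

    with urlopen("https://example.com/robots.txt") as response:
        print(response.read().decode("utf-8"))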

 

When search engines first attempt to crawl your site, they look for the robots.txt file. Within this file you can tell a specific search engine, or all of them (*), which directories on the site they are not allowed to access. Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not enforce access control.
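
Python's standard urllib.robotparser module implements this same matching logic, so it is a convenient way to see how a compliant crawler would read the file. A minimal sketch, again with example.com as a placeholder:

    # Ask whether a given user-agent may fetch a URL under the site's rules.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # download and parse the live file

    # With "User-agent: *" and "Disallow: /", every path is off limits.
    print(rp.can_fetch("*", "https://example.com/index.html"))  # False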

 

Specific search engines: You may also want to exclude a single search engine, or the other way around: allow one search engine while excluding all others.

-- Allow only Bing to index (Bing's crawler identifies itself as Bingbot)

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /

-- Exclude only Bing

User-agent: Bingbot
Disallow: /
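
A crawler obeys the group whose User-agent line matches its own token and ignores the rest, which is why the two groups in the first example do not conflict. A sketch of that behaviour, feeding the allow-only-Bing rules to urllib.robotparser as a list of lines:

    # Parse rules from a list of lines instead of fetching a URL.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: Bingbot",
        "Disallow:",
        "",
        "User-agent: *",
        "Disallow: /",
    ]
    rp = RobotFileParser()
    rp.parse(rules)

    # Bingbot matches its own (unrestricted) group; everyone else matches *.
    print(rp.can_fetch("Bingbot", "/anything.html"))    # True
    print(rp.can_fetch("Googlebot", "/anything.html"))  # False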

Exclude specifics: Using the same idea as above, you can allow and deny specific directories and files on your server.

User-agent: *
Disallow: /~some/coolThings/
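
A Disallow value matches URLs by path prefix, so the rule above blocks everything under /~some/coolThings/ while leaving sibling paths crawlable. A quick sketch of that behaviour (the second path is a hypothetical sibling added for illustration):

    # Disallow matches by path prefix: only URLs under the listed
    # directory are blocked; sibling paths remain crawlable.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.parse(["User-agent: *", "Disallow: /~some/coolThings/"])

    print(rp.can_fetch("*", "/~some/coolThings/page.html"))   # False
    print(rp.can_fetch("*", "/~some/otherThings/page.html"))  # True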