The best way to ensure content is not indexed by the search engines is not to put it on the publicly accessible web in the first place. If you don’t want personal details of your love life to be found, don’t put them on the internet!

If you want to publish information for a select group (perhaps a new design), simply don’t link to the new pages from anywhere. Spiders discover pages by following links, so they will not be able to find unlinked pages.

You can put password protection on the pages; the search engines will not enter passwords, so they will never see the content to index it.
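As a rough sketch, assuming an Apache server, HTTP basic authentication can be switched on with an .htaccess file like the following (the AuthName label and the .htpasswd path are illustrative):

    # Require a valid username and password before serving this directory
    AuthType Basic
    AuthName "Private area"
    AuthUserFile /home/user/.htpasswd
    Require valid-user

The matching .htpasswd file holds the usernames and hashed passwords, and can be created with Apache’s htpasswd utility.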

Blocking SEO spiders

You can add a line to the robots.txt file blocking access to these pages (a good tutorial is here). Visitors can still read the pages, but compliant spiders will not crawl them. This is probably also the best way to block whole directories. One downside is that robots.txt is itself publicly readable, so it makes clear to anyone where your protected content is.
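As a minimal sketch (the /private/ path is illustrative), a robots.txt entry blocking a directory for all compliant spiders looks like this:

    # Applies to every spider; disallow crawling of the /private/ directory
    User-agent: *
    Disallow: /private/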

The robots meta tag is also available; just put a line such as the following in the <head> section of each page:
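    <meta name="robots" content="noindex">

A content value of "noindex,nofollow" is also common, telling spiders neither to index the page nor follow its links.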

The spiders will still download the page, but upon finding the meta tag they will not index it. Avoid this method where you don’t want the spiders to download the page at all, for example because of resource usage. More information on the robots meta tag is available.

The rel="nofollow" attribute is only recognised by Google at present, and so doesn’t protect content as well as the methods above. I mainly use it on text links I don’t want to be found for (links saying ‘home’, for example).
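For instance, a nofollow’d ‘home’ link would look something like this (the URL is illustrative):

    <a href="/" rel="nofollow">home</a>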