
Robots.txt or Not?

I’ve been wondering lately whether I should have a robots.txt file or not. In the early days I didn’t have one, but then I read that we are supposed to have one to tell search engine spiders which links to crawl and which to ignore, so I uploaded one to all my WordPress blogs.
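For anyone who has not seen one, robots.txt is just a plain text file that sits at the root of your site. A typical WordPress example looks something like this (the wp-admin and wp-includes paths are the common suggestions you see around, not necessarily what your own file needs):

    # Applies to all crawlers
    User-agent: *
    # Keep spiders out of the WordPress admin and core directories
    Disallow: /wp-admin/
    Disallow: /wp-includes/

Each Disallow line tells every spider (the * means all of them) to skip that directory when crawling.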

Lately I have been getting crawl errors that are not really errors at all: they just mean my robots.txt is telling the spiders not to crawl those links. I hope you are following me!

However, I am sick of seeing those errors. I don’t remember whether I used to get them, but they are adding up lately!

I asked a few people who have been hosting their own blogs longer than I have, and they said they do not use robots.txt at all; they just let the spiders crawl ALL links. Maybe that is how it should be, though I have read that it puts you at risk of duplicate content or something. Hmm, it makes one wonder whether we should use robots.txt or not.
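For comparison, the "crawl everything" approach they describe does not even need a file, but if you prefer to spell it out, an empty Disallow rule allows all crawling:

    # Applies to all crawlers; an empty Disallow blocks nothing
    User-agent: *
    Disallow:

With no file (or a file like this) nothing gets blocked, so those "restricted by robots.txt" messages go away, but you also give up the easy way of keeping spiders out of things like archive and category pages that repeat your post content.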

What do you think?