robots

  • Thread starter: ME

ME

I've just added a custom 404 error page to our site, as there are quite a
few links in various search engines that point to pages that no longer
exist. I've been reading here a little about robots.txt. I'm not sure
whether I need to set up a robots.txt file to keep search engines from
indexing the custom 404 page, or if I should just not worry about it.
 
Hi,
On a Windows server (not sure about *nix), a custom 404 page is often
returned with a 200 OK status rather than a 404, so the search engines will
actually see it as a normal page. If you have several pages indexed that
are now 404s, I can see the search engines penalising you, because all of
those links effectively point to the same page.
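A quick sketch of the problem, using Python's standard-library HTTP server rather than anything server-specific: the same friendly error page can be sent with either a 200 or a 404 status line, and only the status line tells a crawler the page is actually missing. The paths and page text below are made up for the demo.

```python
import http.server
import threading
import urllib.error
import urllib.request

PAGE = b"<html><body>Sorry, that page no longer exists.</body></html>"

class SoftErrorHandler(http.server.BaseHTTPRequestHandler):
    """Custom error page sent with 200 OK -- a 'soft 404'."""
    status = 200

    def do_GET(self):
        self.send_response(self.status)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):  # keep the demo quiet
        pass

class HardErrorHandler(SoftErrorHandler):
    """Same page body, but with the correct 404 status."""
    status = 404

def start_server(handler):
    server = http.server.HTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def fetch_status(port):
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/gone") as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # urllib raises on 4xx/5xx responses

soft = start_server(SoftErrorHandler)
hard = start_server(HardErrorHandler)
soft_status = fetch_status(soft.server_address[1])
hard_status = fetch_status(hard.server_address[1])
print(soft_status, hard_status)  # identical body, different status lines
soft.shutdown()
hard.shutdown()
```

To a browser the two look the same; to a crawler, only the second one says "drop this URL".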

It would be better to use a 301 (permanent) redirect to point the search
engine to the correct page location; this will also remove your old URLs
from the index over time.
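The 301 approach can be sketched the same way, again with the standard-library server and placeholder URLs: the old URL answers with a permanent redirect to the new location, and clients (and crawlers) follow it and update accordingly.

```python
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # Permanent redirect: tells crawlers to replace the old URL
            # with /new-page in their index
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            body = b"the moved content"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the redirect automatically and lands on the new URL
with urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page") as resp:
    final_url = resp.geturl()
    body = resp.read()
server.shutdown()
print(final_url)
```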
 