Robots.txt

From Encyclopedia Dramatica

Robots.txt is a text file that can be placed at the root of a server, and the file basically tells spiderbots (the Internet Wayback Machine's crawler, search engines like Jewgle, and website archiving software) to fuck off.
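What that fucking-off looks like in practice: the file lives at the root of the domain (e.g. example.com/robots.txt), and a maximally hostile one is just two lines:

```
User-agent: *
Disallow: /
```

`User-agent: *` matches every bot, and `Disallow: /` covers every page on the site. Note that compliance is entirely voluntary: well-behaved crawlers like Googlebot and the Wayback Machine honor it, but nothing actually enforces it.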

In simpler terms, if this file were placed on ED's server, Google and other search engines couldn't cache copies of its webpages, the Wayback Machine couldn't archive them, and you couldn't use some webpage archiving program to do it yourself.

This file is usually put on websites that don't want to shit themselves because some faggot posted something embarrassing. With no internet record left behind, they can quietly take the page down and put it back up minus the incriminating/embarrassing shit.

However, this does not stop you from taking screencaps, so take those if you find a website with one of these.

If you don't know whether a website has one of these, go to Google, find the page, and click the "Cached" link under the result. If it tells you something like "this page cannot be archived due to robots.txt", you'll have to save copies of the page by hand or take screencaps. You can also punch the address into the Internet Wayback Machine, which will tell you the same thing if the server has this file.
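You can also check without clicking around. A minimal sketch using Python's standard-library `urllib.robotparser` (the rules string and user-agent name below are just examples; `ia_archiver` is the Wayback Machine's crawler name):

```python
# Check whether a site's robots.txt blocks a given bot from a given path.
# Uses only the Python standard library.
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt: str, agent: str, path: str) -> bool:
    """Return True if `agent` is disallowed from fetching `path`
    according to the robots.txt rules in `robots_txt`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())  # parse rules from a string
    return not rp.can_fetch(agent, path)

# A robots.txt that tells everyone to fuck off:
rules = "User-agent: *\nDisallow: /\n"
print(is_blocked(rules, "ia_archiver", "/lulzy-page.html"))  # True
```

In real use you'd fetch the live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` instead of parsing a local string; if the function returns True, start screencapping.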

Websites with this file and how to save the lulz

  • has this file, so if someone you want to troll posted fanfics that are fucked up or lulzy, save copies of the stories they posted (especially if that's why you're trolling them) and repost them somewhere like a pastebin. While they can't delete their accounts, they can blank their profile pages (take screencaps if those have lulzy content), and they can delete all their story posts and story comments.
Robots.txt is part of a series on Softwarez.

Visit the Softwarez Portal for complete coverage.