How to find secret web site pages and content

By Sidharth | Ideas


Website owners hide their web pages using commands in robots.txt. Robots.txt is a text file located in the root directory of a site. It is used to control which web pages get indexed by a robot, i.e. you can disallow a particular web page or piece of content from being spidered by search engine robots. By using the ‘Disallow’ directive you can block any URL of your blog from reaching search engines.
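To give you an idea of the format (the paths below are invented purely for illustration), a typical robots.txt might look like this:

User-agent: *
Disallow: /private/
Disallow: /drafts/secret-page.html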

We will use the robots.txt file to find these hidden website pages and content.

Step 1 – Go to Google and type this in the search box

“robots.txt” “disallow:” filetype:txt

Hit Enter and you will be presented with loads of robots.txt files from websites that contain a Disallow command.
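You don't have to rely on Google's results alone. Since robots.txt always sits in a site's root directory, you can also open it directly for any site you are curious about, for example:

http://www.example.com/robots.txt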


Step 2 – From the thousands of results, choose any website. For example, I will open Microsoft's robots.txt file, which appears on the first page of results. After opening the file, it looks like this:


These are the pages and content that Microsoft doesn’t want search engine spiders to index. Now copy any line after the word Disallow:

For example, we will copy this line:

/communities/blogs/PortalResults.mspx
Remember to copy the slash at the beginning of the line.


Step 3 – In your browser's address bar, type the main website URL followed by the line you copied in Step 2.

After combining the main website URL and the copied line, hit Enter:

Main URL – http://www.microsoft.com

Line – /communities/blogs/PortalResults.mspx

Combination – http://www.microsoft.com/communities/blogs/PortalResults.mspx


This was the page Microsoft had hidden from the search engine!

This was just one example; you can easily find more interesting web pages and other hidden content. Go ahead and try it!
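If you would rather automate the steps above, here is a minimal Python sketch (my own addition, not part of the original trick; the domain in the example is just an illustration) that downloads a site's robots.txt, extracts the Disallow lines, and prints the combined URLs so you can open them in your browser:

# robots_peek.py -- fetch a site's robots.txt, extract the Disallow
# paths, and print the combined URLs, mirroring Steps 2 and 3 above.
from urllib.parse import urljoin
from urllib.request import urlopen

def disallowed_urls(base_url):
    """Return the full URLs for every Disallow entry in base_url's robots.txt."""
    robots_url = urljoin(base_url, "/robots.txt")
    with urlopen(robots_url) as response:
        lines = response.read().decode("utf-8", errors="replace").splitlines()

    urls = []
    for line in lines:
        line = line.strip()
        # Entries look like "Disallow: /some/path" (prefix is case-insensitive).
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                urls.append(urljoin(base_url, path))
    return urls

if __name__ == "__main__":
    for url in disallowed_urls("https://www.microsoft.com"):
        print(url)

Keep in mind that some Disallow entries contain wildcards (*) or apply only to certain user agents, so not every printed URL will correspond to a real page.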

About the Author

Hi, I am Sidharth. Full-time blogger. Editor of Blogote. And a self-proclaimed geek!


(9) comments

shaiksha February 8, 2009

yes this is great trick to view the hidden site in search engine….can you explain me how this process is going on in other websites ……if i want to know the hide information of some any other website then what i can do…..tell me…..

Neel January 19, 2009

cool stuff! lots to learn in this SEO world,, : (

Nitesh August 21, 2008


Jakub June 19, 2008

Hey this is cool feature, but I think that google should prevent it. Sometimes it may contain sensitive information!

But anyway, great post and I will be careful about it on my pages.

Diane June 19, 2008

Sensitive information shouldn’t be on the internet! Companies have a responsibility to keep their data safe. There’s a new story about data being lost or stolen every week!

Avenues June 18, 2008

WOW.. this is so cool.. easy to find the hidden files from search engines..

stratosg June 17, 2008

funny result using your tip on google. never seen that page :) nice tip

web design company June 17, 2008

A great way to find articles and web site web pages which are hidden from search engine…

Steven Finch June 17, 2008

Great Post.
