Hi all,
just wondering...
As I've been building my website, I knew it would be slow going, so I decided to block all robots from crawling my site in my robots.txt file. I didn't want to get listed on Google until I was ready. I guess now I'm ready to open it up, even though the site will always go through revisions. But I was wondering: do you suggest opening a site for all bots to index it, or do you list only certain ones that you want? Aren't some bots not so nice?
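In case it matters, what I have now is just the standard block-everything rule. I'm guessing an allow-certain-bots version would look something like the second part below (Googlebot just as an example, I haven't tested this):

```
# What I have now: block all crawlers from everything
User-agent: *
Disallow: /

# What I think allowing only certain bots would look like:
# (an empty Disallow means "allow everything" for that bot)
#
# User-agent: Googlebot
# Disallow:
#
# User-agent: *
# Disallow: /
```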
Also, I was wondering: can bots find pages that are hidden or isolated, ones that aren't linked from the rest of the site? I have my resume on one page that no one should be able to find unless I personally give them the link. I also have a ton of old pages that are older versions of pages, trials, and experiments. I just keep them around because I didn't want to delete them.
thanks mucho
Eric