Bots, Blogs and News Aggregators (http://www.BotsBlogs.com/) is a keynote presentation that I have been delivering for the last several years. Much of my information comes from the extensive research I have completed during this time into the "invisible" or "deep" web. The deep web comprises in the vicinity of one trillion pages of information spread across the World Wide Web in files and formats that current Internet search engines cannot find or have difficulty accessing. By comparison, today's search engines locate and provide access to hundreds of billions of pages.
In the last several years, some of the more comprehensive search engines have written algorithms to search the deeper portions of the World Wide Web by attempting to find files such as .pdf, .doc, .xls, .ppt, .ps, and others. These files are predominantly used by businesses to communicate information within their organizations or to disseminate it to the outside world. Searching for this information using deeper search techniques and the latest algorithms allows researchers to obtain a vast amount of corporate information that was previously unavailable or inaccessible. Research has also shown that even deeper information can be obtained from these files by searching and accessing the "properties" metadata embedded in them!
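As one illustration of the file-type search technique described above, the major search engines expose a `filetype:` operator that restricts results to a given document format. The sketch below simply builds such query strings; the operator syntax is the engines' documented convention, while the function name and defaults are my own illustrative choices.

```python
# A minimal sketch of composing filetype-restricted search queries,
# one per document format of interest.
def filetype_queries(terms, extensions=("pdf", "doc", "xls", "ppt", "ps")):
    """Return one query string per file extension, e.g.
    'annual report filetype:pdf'."""
    base = " ".join(terms)
    return [f"{base} filetype:{ext}" for ext in extensions]

print(filetype_queries(["annual report"], ("pdf", "xls")))
# → ['annual report filetype:pdf', 'annual report filetype:xls']
```

Each query can then be submitted to an engine that supports the operator; iterating over formats is what surfaces the business documents that ordinary keyword searches miss.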
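The "properties" metadata mentioned above can be read directly from many office documents. For example, Office Open XML files (.docx, .xlsx, .pptx) are ZIP archives whose `docProps/core.xml` part records author, title, and revision details. The sketch below, using only the Python standard library, shows the idea; the sample archive and its property values are fabricated stand-ins so the example is self-contained.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Namespaces used by the Office Open XML core-properties part.
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def core_properties(data: bytes) -> dict:
    """Extract creator and title from an Office Open XML file's bytes."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    return {
        "creator": root.findtext("dc:creator", namespaces=NS),
        "title": root.findtext("dc:title", namespaces=NS),
    }

# Build a minimal stand-in archive so the sketch runs on its own;
# a real .docx retrieved from the web would be passed in the same way.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "docProps/core.xml",
        '<cp:coreProperties '
        'xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties" '
        'xmlns:dc="http://purl.org/dc/elements/1.1/">'
        '<dc:creator>Jane Analyst</dc:creator>'
        '<dc:title>Q3 Forecast</dc:title>'
        '</cp:coreProperties>',
    )

print(core_properties(buf.getvalue()))
# → {'creator': 'Jane Analyst', 'title': 'Q3 Forecast'}
```

This is why the properties information is such a rich seam for researchers: the author names, titles, and edit history travel with the file even when the visible page never mentions them.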
This report and guide is designed to give you the resources you need to better understand the history of deep web research, as well as the various classes of resources that let you search the currently available web and find those key nuggets of information discoverable only by knowing how to search the "deep web".