"I wonder about how it lets us search. Google allows us to look for things we never would have been able to, or even thought of, before. I don’t even think twice now about searching for phrases buried deep within documents, knowing I’m searching over 2 billion Web pages, using strategies that would have been punishingly expensive and time-consuming on Dialog. That’s potent stuff; but at the same time, it doesn’t allow anything approaching proximity searching and actively discourages Boolean searching—techniques we know are very powerful but which must be used with care and take time to learn and that we consider to be professional signatures. Will this have an impact on our searching skills and performance?"
"Will that also mean that the kinds of professional-level resources we rely on—ProQuest, Lexis/Nexis, EbscoHost, and their kind—will then be less used, even though we pay big bucks and make them available for our communities?"
Links to an interesting paper by Google's founders, Sergey Brin and Larry Page ("The Anatomy of a Large-Scale Hypertextual Web Search Engine"), presented at the Seventh International World Wide Web Conference (WWW7).
http://www.ala.org/alonline/netlib/il1002.html
" The truth is, nobody knows how wide the Web is. Some say 5 billion pages, some 8 billion, some even more. Anyway, what's definite is that the major search engines (SEs) index only a fraction of the "publicly indexable Web". Moreover, every SE indexes different Web pages, which means if you use only one SE you will miss relevant results that can be found in other search engines." Lists about 20 "very good" search engines and about twice as many others.
http://www.llrx.com/features/metasearch.htm
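The practical consequence of that quote is metasearch: query several engines and merge the deduplicated union of their results. Here is a rough sketch of the merge step only; the engine names, canned result lists, and the fetch_results stand-in are all hypothetical, and real metasearch tools also handle ranking, timeouts, and per-engine query syntax.

# Hypothetical sketch of merging results from several engines.
# fetch_results() is a stand-in for real per-engine query code;
# the engine names and URLs below are invented for illustration.

def fetch_results(engine, query):
    # Placeholder: pretend each engine returns a ranked list of URLs.
    canned = {
        "engine_a": ["http://example.com/1", "http://example.com/2"],
        "engine_b": ["http://example.com/2", "http://example.com/3"],
        "engine_c": ["http://example.com/4"],
    }
    return canned.get(engine, [])

def metasearch(query, engines):
    # Merge in round-robin rank order, deduplicating by URL, so the
    # union covers pages any single engine would have missed.
    seen, merged = set(), []
    ranked = [fetch_results(e, query) for e in engines]
    for rank in range(max((len(r) for r in ranked), default=0)):
        for results in ranked:
            if rank < len(results) and results[rank] not in seen:
                seen.add(results[rank])
                merged.append(results[rank])
    return merged

print(metasearch("invisible web", ["engine_a", "engine_b", "engine_c"]))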