Then there are web search combiners (meta-search tools). The one I used to know and work with back on Windows was Copernic. It bundled a collection of search engines (much as browsers do these days), ran your query through all of them at the same time, and then listed the merged results in a pleasant, colourful list. Fantastic.
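The fan-out-and-merge idea behind a combiner like that can be sketched in a few lines of shell. This is a hypothetical illustration, not Copernic's actual code: the three "engines" are stub functions, and `metasearch` is a made-up name.

```shell
# Hypothetical sketch of a Copernic-style meta-search: fan the same query
# out to several "engines" (stubbed here as local functions), run them in
# parallel, and merge the answers into one combined listing.
search_alpha() { printf '%s\n' "alpha: result for $1"; }
search_beta()  { printf '%s\n' "beta: result for $1"; }
search_gamma() { printf '%s\n' "gamma: result for $1"; }

metasearch() {
    query=$1
    tmp=$(mktemp -d)
    # Launch every engine query concurrently, one output file each.
    search_alpha "$query" > "$tmp/alpha" &
    search_beta  "$query" > "$tmp/beta"  &
    search_gamma "$query" > "$tmp/gamma" &
    wait                 # block until all engines have answered
    sort "$tmp"/*        # merge everything into one sorted list
    rm -rf "$tmp"
}

metasearch "gopher"
```

A real combiner would replace the stubs with HTTP requests and do result ranking and de-duplication, but the concurrency-plus-merge shape is the same.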
There's a problem of principle with internet search engines: they don't show you the real web. They show you the database accumulated by their own web crawlers. Wherever the crawler has not been, you don't see.
What do you do? Do you check results in multiple search engines to give yourself the sense of doing proper research, or are you the "I'm Feeling Lucky!" type?
Google still has the problem of indexing sites that are closed to most people but allow the Google crawler bot access.
Isn't Gopher comparable to searching one or a few sites using a mechanism internal to them? Like searching a forum without Google.
Surfraw provides a fast unix command line interface to a variety of popular WWW search engines and other artifacts of power. It reclaims google, altavista, babelfish, dejanews, freshmeat, research index, slashdot and many others from the false-prophet, pox-infested heathen lands of html-forms, placing these wonders where they belong, deep in unix heartland, as god loving extensions to the shell. [...]

Global options are common to all Surfraw elvi (clients). You can get a list of the currently installed elvi by typing surfraw -elvi. All elvi have useful low calorie help, for example:

$ sr rhyme -help

[...]

Surfrawize the soul of your favourite internet wonder. Join the Shell Users' Revolutionary Front Against the WWW by submitting code. Reclaim heathen lands. Bear witness to the truth. Its love will set you free.
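At its core, an elvi boils down to building a search URL from your arguments and handing it to a browser. Here is a rough, hypothetical sketch of that idea (`my_elvi` and the single-purpose `urlencode` are invented for illustration; real elvi handle far more options and escaping):

```shell
# Rough sketch of what a Surfraw elvi essentially does: percent-encode
# the query terms and construct the engine's search URL. A real elvi
# would then hand the URL to $BROWSER; we just print it here.
urlencode() { printf '%s' "$1" | sed 's/ /%20/g'; }   # spaces only, for the sketch

my_elvi() {
    printf 'https://duckduckgo.com/?q=%s\n' "$(urlencode "$*")"
}

my_elvi unix shell search
# prints https://duckduckgo.com/?q=unix%20shell%20search
```

The appeal of the design is that each engine needs only a tiny script like this, and the shell's usual composition (pipes, aliases, history) comes for free.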