Sample Searches Used to Test the Sites
We used four sample searches to test each of the sites. The strategy was to address both professional information needs and consumer health information needs. In each category we identified a topic we would expect to be well covered in any resource, as well as a topic of more recent interest or more technical scope, and thus less likely to be covered across the board. We also wanted the searches to include single terms, phrases, simple concepts, and multi-part concepts. Topics were selected from a pool suggested by clinicians and medical reference librarians. The search topics used were:
Standard Topic: Breast Cancer
Cutting Edge: Fibromyalgia
Standard Topic: Sleep Apnea Diagnosis
Technical/Specific Topic: Hyperbaric Oxygen for Treatment of Stroke
Search engines were also checked for common features such as support for truncation and Boolean operators. Truncation allows the searcher to enter part of a word or term and still get valid results. We tested truncation by entering the truncated form "cholest" for the term "cholesterol."
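In concept, truncation is simple prefix matching: a stem such as "cholest" should retrieve any indexed term that begins with it. The following sketch illustrates the idea; the function, the sample index, and the term list are hypothetical and not drawn from any of the reviewed sites.

```python
def truncation_match(stem: str, terms: list[str]) -> list[str]:
    """Return every indexed term that begins with the given stem,
    ignoring case -- the behavior a truncation search should give."""
    stem = stem.lower()
    return [t for t in terms if t.lower().startswith(stem)]

# Hypothetical mini-index for illustration.
index = ["cholesterol", "cholestasis", "cholera", "stroke"]
print(truncation_match("cholest", index))
# -> ['cholesterol', 'cholestasis']  ("cholera" does not match the stem)
```

A site that supports truncation would behave like this sketch; a site that does not would treat "cholest" as a literal term and likely return nothing.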
The search engines were tested using whatever defaults were presented to a novice user entering the site, on the assumption that the typical user would not take the time to explore links to "Power Search" pages or help files. The accuracy of the actual search results was ranked from 1 to 6, with 1 as the highest:
1 = relevant results, most relevant listed first
2 = some relevant results, some not, ranked
3 = some relevant results, some not, no ranking
4 = possibly relevant results, must dig to find
5 = mostly irrelevant results, top ranked results not relevant
6 = no hits or no relevant topics on first page
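For tallying results, the six-point scale above can be captured as a simple lookup table. The sketch below is purely illustrative (the dictionary and helper function are our own, not part of the original scoring procedure); the descriptions are taken verbatim from the scale.

```python
# Illustrative encoding of the 1-6 relevance scale (1 = best).
RELEVANCE_SCALE = {
    1: "relevant results, most relevant listed first",
    2: "some relevant results, some not, ranked",
    3: "some relevant results, some not, no ranking",
    4: "possibly relevant results, must dig to find",
    5: "mostly irrelevant results, top ranked results not relevant",
    6: "no hits or no relevant topics on first page",
}

def describe(score: int) -> str:
    """Map a numeric relevance score to its description."""
    return RELEVANCE_SCALE.get(score, "unknown score")

print(describe(1))  # relevant results, most relevant listed first
```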
To broaden the comparison, we tested not only the search engines of sites specializing in health information, but also general search engines such as Yahoo and Hotbot. Because not all of the sites had search engines, we could not include in this comparison every site we had reviewed for design and interface issues. If a previously reviewed site lacked a search engine but had a browsable structure that allowed us to find information on the designated search topics, we included its results. For some sites, such as Martindales, we could not easily determine how to locate information on our topics, so Martindales and similar sites were removed from this set of results. We also excluded sites that charged for searching, such as MDConsult and Internet Medicine. A few of the sites reviewed in the earlier section qualified as general search engines, so for the purpose of comparing search results they were not counted among the health sciences sites. Ultimately, we compared results from 26 health sciences search engines and 26 general search engines.