Results of User Tests of Index/Search Packages

Three form-based user "scripts" were employed to exercise the products' different features against a common set of Penn data. One recorded user experiences with Thunderstone's Webinator, one with Alta Vista, and one asked for comparative impressions of the two products.

Roughly 26 sets of forms were completed against the common universe of test data. (One or two testers did not complete one or more of the three forms.) The form-generated responses were statistically tabulated for frequencies, and the free-text comment fields were appended. The results are listed in three documents.

Summary

Testers were asked, in direct comparison, to rate each product on a scale of 1 to 5 (ascending order of quality). 13 respondents rated Alta Vista higher, 6 rated Webinator higher, and 4 rated them equal. The average rating for Alta Vista was 3.6; for Webinator, 3.2. This indicates a broadly held, roughly 2-to-1 preference for Alta Vista among the (small) testing population, though the modest gap in average ratings suggests the preference is not strongly held.
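As a quick check on the roughly 2-to-1 characterization, using only the counts reported above (ties set aside), the comparative responses break down as:

\[
\frac{\text{prefer Alta Vista}}{\text{prefer Webinator}} = \frac{13}{6} \approx 2.2,
\qquad
\frac{13}{23} \approx 57\%, \quad \frac{6}{23} \approx 26\%, \quad \frac{4}{23} \approx 17\%,
\]

where 23 = 13 + 6 + 4 is the number of testers who completed the comparative form.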

Testers remarked favorably on Alta Vista's speed, its simple interface, its consistent display of the number of hits in response to each query, its inclusion of the URL of each hit in its initial display, and the default "tips". Many noted that their prior experience with Alta Vista helped them calibrate expectations. Testers remarked unfavorably on the limited query refinement and proximity control and on the consistently large number of hits returned.

Testers remarked favorably on Webinator's powerful query refinement, its "more like this" ordering of results, its proximity control, and the rich context available for hits. Many found the operators unfamiliar and potentially daunting; they faulted the lack of URLs on the first display of hits and the inability to search outside Penn from the same interface. Several also felt it to be slower.

Conclusions

The data show a clear preference among the testers for Alta Vista. The data also confirm the frequent comment that both tools were acceptable. (Many remarked that they liked Webinator better than they expected.)

A technical and administrative evaluation chart compares the products according to technical criteria, including usability features not amenable to user testing, such as the ability to offer and manage multiple indices. These results, together with the user test results, will be used to develop a recommendation of a single product to the Penn Web Steering Committee.

