Appl Clin Inform 2014; 05(03): 731-745
DOI: 10.4338/ACI-2014-03-RA-0021
Research Article
Schattauer GmbH

Evaluating a federated medical search engine

Tailoring the methodology and reporting the evaluation outcomes
D. Saparova¹, J. Belden², J. Williams³, B. Richardson¹, K. Schuster¹

1   School of Information Science and Learning Technologies, University of Missouri, Columbia, MO 65211
2   Department of Family and Community Medicine, University of Missouri, Columbia, MO 65212
3   MedSocket, Columbia, MO 65211

Publication History

received: 26 March 2014

accepted: 02 July 2014

Publication date:
19 December 2017 (online)

Summary

Background: Federated medical search engines are health information systems that provide a single point of access to different types of information. Their efficiency as clinical decision support tools has been demonstrated in numerous evaluations. Despite their rigor, few of these studies report holistic evaluations of medical search engines, and even fewer base their evaluations on established evaluation frameworks.

Objectives: To evaluate a federated medical search engine, MedSocket, for its potential net benefits in an established clinical setting.

Methods: This study applied the Human, Organization, and Technology fit (HOT-fit) evaluation framework to evaluate MedSocket. The hierarchical structure of the HOT factors allowed a combination of efficiency metrics to be identified. Human fit was evaluated through user satisfaction and patterns of system use; technology fit was evaluated through measurements of time-on-task and the accuracy of found answers; and organization fit was evaluated from the perspective of the system's fit to the existing organizational structure.
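The mapping from HOT-fit dimensions to concrete metrics described above can be sketched as a small tabulation. This is a hypothetical illustration only: the dimension names follow the HOT-fit framework, but the metric names, data structure, and values are assumptions, not the study's actual instruments or data.

```python
# Hypothetical sketch: organizing evaluation metrics by HOT-fit dimension.
# Dimension names follow the HOT-fit framework; the metrics and values
# below are illustrative assumptions, not the study's actual data.
from statistics import mean

# Each observation: (dimension, metric, value).
observations = [
    ("human", "satisfaction_1_to_5", 4.2),
    ("human", "sessions_per_week", 3.0),
    ("technology", "time_on_task_seconds", 95.0),
    ("technology", "answer_accuracy", 0.78),
    ("organization", "workflow_fit_1_to_5", 3.5),
]

def summarize(obs):
    """Group metric values by HOT-fit dimension and average each metric."""
    grouped = {}
    for dimension, metric, value in obs:
        grouped.setdefault(dimension, {}).setdefault(metric, []).append(value)
    return {
        dim: {metric: mean(vals) for metric, vals in metrics.items()}
        for dim, metrics in grouped.items()
    }

print(summarize(observations))
```

A per-dimension summary of this kind makes it easy to see, as the Results note, how weakness in one dimension (here, technology fit) can dominate the overall picture.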

Results: The evaluation produced mixed results and suggested several opportunities for system improvement. On average, participants were satisfied with MedSocket searches and confident in the accuracy of the retrieved answers. However, MedSocket did not meet participants' expectations in terms of download speed, access to information, and relevance of the search results. These mixed results led to the conclusion that, in the case of MedSocket, technology fit had a significant influence on human and organization fit. Hence, improving the system's technological capabilities is critical before its net benefits can become noticeable.

Conclusions: The HOT-fit evaluation framework was instrumental in tailoring the methodology for a comprehensive evaluation of the search engine. This multidimensional evaluation resulted in concrete recommendations for system improvement.

Citation: Saparova D, Belden J, Williams J, Richardson B, Schuster K. Evaluating a federated medical search engine: Tailoring the methodology and reporting the evaluation outcomes. Appl Clin Inf 2014; 5: 731–745

http://dx.doi.org/10.4338/ACI-2014-03-RA-0021
