Effective web-based clinical practice guidelines resources: recommendations from a mixed methods usability study



Clinical practice guidelines (CPGs) are an important knowledge translation resource that helps clinicians stay up to date with relevant clinical knowledge. Effective communication of guidelines, including their format, facilitates their implementation. Despite the digitalization of healthcare, there is little literature to guide CPG website creation for effective dissemination and implementation. Our aim was to assess the effectiveness of the content and format of the Diabetes Canada CPG website, and to use our results to inform recommendations for other CPG websites.


Fourteen clinicians (family physicians, nurses, pharmacists, and dieticians) involved in diabetes care across Canada participated in this mixed-methods study (questionnaires, usability testing, and interviews). Participants “thought aloud” while completing eight usability tasks on the CPG website. Outcomes included task success rate, completion time, clicks per task, resources used, paths taken, search attempts and success rate, and error types. Participants were then interviewed.


The Diabetes Canada CPG website was found to be usable. Participants had a high overall task success rate of 79% and took a mean of 144 seconds (standard deviation (SD) = 152) and 4.6 clicks (SD = 3.9) per task. Interactive tools were used most frequently, compared to the full guidelines and static tools. Misinterpretation accounted for 48% of usability errors. Overall, participants found the website intuitive, with effective content and design elements.


Offering different versions of CPG information (e.g. interactive tools, quick reference guides, static tools) can help clinicians answer clinical questions more quickly. Effective web design should be assessed during CPG website creation to support guideline dissemination and implementation.



Background

Clinicians require up-to-date and reliable resources to maintain their knowledge. Clinical practice guidelines (CPGs) are an important knowledge translation tool that can address this need, but uptake of CPGs can be inconsistent [1, 2]. Previous research has identified that, aside from the content itself, effective communication is also required to improve guideline implementation [2, 3]. The format of CPGs, for example, is identified as one of six key domains of implementability by Kastner and colleagues [3]. However, less than half of available guideline assessment tools address evaluating the format of guidelines [4]. Notably, even tools that do evaluate format, such as GLIA and CAN-IMPLEMENT, focus on the formatting of content within a document, e.g. using bold text for recommendations or including visual algorithms [5, 6]. These recommendations do not address the full scope of formatting considerations, since many CPGs now extend beyond a single document.

Increasingly, guideline organizations are creating multiple versions of their CPGs, which has been thought to improve guideline dissemination [7]. Short summary versions of guidelines and interactive tools based on guideline content may be utilized more easily at the bedside than the full text version. The website modality is particularly suited for keeping these different versions of content in one accessible location. As the internet is already starting to replace traditional sources of information like books and colleagues, CPG websites will only become more common [8, 9]. However, to our knowledge, none of the current guideline assessment tools make recommendations specific to CPG website creation.

We performed a mixed methods usability study of the Diabetes Canada (DC) CPG website, a comprehensive CPG website with a wide reach to both clinicians and the general public [10,11,12,13,14]. Usability testing has been widely used to assess and improve health technologies including websites [12]. Our aim was to assess the effectiveness of the content and format of DC Guidelines website, using our results to inform recommendations for this and other CPG websites.


Methods

We used mixed methods consisting of (a) a questionnaire, (b) usability testing, and (c) semi-structured interviews to assess the live version of the Diabetes Canada Clinical Practice Guidelines website, updated April 10, 2018 [10].


The Diabetes Canada CPG website has a wide reach with health care providers (HCPs) as well as the general population [11]. The content of the website can be divided into 4 categories. Its contents and main components are described in Tables 1 and 2.

Table 1 Content of the DC Guidelines Website
Table 2 DC Guideline website components and descriptions

Participants and setting

We recruited clinicians from four professions frequently involved in diabetes care: family physicians, nurses, pharmacists, and dieticians. Investigators performed convenience sampling by emailing their own personal and professional networks of health care providers from across Canada. From there, snowball sampling was used to elicit further contacts [15]. To ensure an adequate number of dieticians were represented, the publicly available Dieticians of Canada registry was used to cold-email potential participants [16]. All participants were assigned a randomly generated numeric code for data collection.


Outcomes

We assessed the following outcomes for the usability tasks:

Task success rate

Successful completion was defined based on whether the response given by the user matched an acceptable answer from the investigators’ answer key.

Task completion time

The time from the initiation of the first movement on the screen after instructions were given until either (a) a complete answer was given or (b) the user asked to move on to the next task. If the user made further attempts for exploration purposes after initially completing the task, the extra time was not counted.

Clicks per task

The total number of clicks users made until the task was completed or aborted.

Resource used to complete task

The type of resource on the CPG website (full guidelines, a static tool, or interactive tool) participants used to complete a task successfully or unsuccessfully. If an attempt was aborted, the last page the participant landed on was considered to be the resource accessed.


Path taken

The links and pages accessed by the user, in the order in which they accessed them, from the beginning of a task until the task was completed or aborted.

Start of path

The section of the website (quick buttons on the home page, the links of the navigation menu, or the search function) that led users to the resource used to complete the task. As users could explore multiple resources before arriving at the final answer, we defined the start of the path as the one most proximal to the final answer. For example, if a user followed the path [home page button ➔ page A ➔ back ➔ navigation menu link ➔ page B], the start of the path for the task would be the navigation menu, as it is most proximal to page B, the resource used to complete the task.
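The rule above can be sketched as a short function that scans a recorded path backwards for the last entry point passed through before the final resource. This is a hypothetical illustration only; the step labels, entry-point set, and function name are ours, not part of the study's tooling.

```python
# Hypothetical sketch of the "start of path" rule described above.
# A recorded path is a list of steps; entry points are the home-page
# quick buttons, navigation-menu links, and the search function.

ENTRY_POINTS = {"home page button", "navigation menu", "search"}

def start_of_path(path):
    """Return the entry point most proximal to the final resource.

    Scans the recorded path backwards from the page used to complete
    the task and returns the last entry point passed through.
    """
    for step in reversed(path):
        if step in ENTRY_POINTS:
            return step
    return None  # no entry point recorded in the path

# The worked example from the text:
path = ["home page button", "page A", "back", "navigation menu", "page B"]
print(start_of_path(path))  # prints: navigation menu
```

Scanning backwards rather than forwards is what makes the result "most proximal" to the final answer: earlier entry points abandoned via the back button are ignored.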

Search attempts and success rate

The number of times users tried to use the website’s built-in search function and whether they succeeded in arriving at the correct resource.

Error type

A coding scheme was developed during data analysis (described below) to categorize unsuccessful task attempts into distinct error types.

Data collection

Demographic and practice characteristics questionnaire

Participants completed a sixteen-item electronic questionnaire via email prior to the usability testing sessions. This included questions about participants’ age, practice type and setting, and prior familiarity with the Diabetes Canada CPG website (Additional File 1).

Usability testing

Usability testing was conducted remotely over the internet using proprietary video conferencing software (GoToMeeting 2019 version) [17]. The software allowed participants to share their computer screen and communicate by audio with the investigator (WW). Sessions were approximately 1 hour in length. The sessions were recorded in video format with corresponding audio.

Each participant completed eight usability testing tasks in random order. Instructions for the tasks were provided in writing using the text chat feature of the conferencing software.

Participants vocalized their thought process aloud using the “Think Aloud” methodology [18, 19].

Audio was transcribed to text in non-strict verbatim format, with filled pauses (e.g. “um”), repeated words, and stutters removed. Video was used to transcribe the pathways participants took from the start to the completion of each task into text format.


Usability tasks

The eight usability tasks (Table 3 and Additional File 2) were developed to reflect commonly encountered clinical situations in the diagnosis and treatment of diabetes. Some of these scenarios were used in the usability testing of the previous iteration of the Diabetes Canada CPG website. We included scenarios that could be answered using five of the six most accessed tools on the previous iteration of the Diabetes Canada website, in order to ensure the most popular tools were being assessed [11]. Five other clinicians uninvolved in the study were asked for input to further refine the tasks.

Table 3 Usability testing tasks. T2DM = type 2 diabetes mellitus

The usability tasks were divided into two types: information retrieval and resource retrieval. A previous study of the 2013 version of the website found that users likely entered the website to use a specific resource or to answer a specific clinical question [11].

The five information retrieval tasks asked participants to find the answer to a clinical question in a hypothetical clinical scenario (Table 3). For example, a task would describe a patient with diabetes and ask the clinician how often they should monitor their blood glucose. Resource retrieval tasks asked participants to locate a tool on the website, such as a patient handout. An answer key was developed with predetermined acceptable correct answers. All tasks could be completed using at least two resources on the website (e.g. full guideline chapter and quick reference guide).

Semi-structured interviews

Immediately after completing usability tasks, participants were interviewed to answer thirteen open-ended questions pertaining to the strengths and weaknesses of the website. The questions specifically asked for feedback on the website content as well as the format (see Additional File 3 for interview guide).

Data analysis

Demographic and practice characteristics questionnaire

Results were entered into a spreadsheet and descriptive statistics were performed.

Usability testing

Transcripts and path texts were analyzed to extract the outcomes defined above for the usability tasks. Descriptive statistics were performed.

One investigator (WW) reviewed the recordings and think-aloud transcripts of all unsuccessful attempts and developed a preliminary coding scheme for type of error made (Table 4). This was then applied by a second investigator (DC) independently to recode the errors. All disagreements were reviewed and settled by discussion.

Table 4 Type of errors

Semi-structured interviews

We performed qualitative content analysis of the transcripts of the semi-structured interviews using grounded theory, similar to previous studies in usability [20]. Two investigators (WW, DC) independently performed open coding of the transcripts to derive the initial themes, sorted under the categories of positive comments, negative comments, and suggestions for improvement [15]. We then performed axial coding of these initial themes to identify strengths and weaknesses of the website in terms of content and format.


Results

User characteristics

Fourteen clinicians participated in the study (Table 5). Most of the participants were young (age 20–39) and female. All participants had heard of the 2018 DC CPG website and 12 of 14 (85.7%) had used the website previously.

Table 5 User characteristics

Success rate, completion time, clicks taken

Overall, users had a 79% success rate across all tasks and took a mean of 144 (± 152) seconds and 4.6 (± 3.9) clicks per task (Table 6). Resource retrieval tasks had a higher success rate than information retrieval tasks.

Table 6 Success rate, time taken, and clicks for usability tasks

Task 7 had the lowest completion rate of 43%. Users also spent the longest time and most clicks on this task. Task 1 had the second lowest completion rate of 57% but was the shortest task at 86 (± 41) seconds.

Resource used and path taken

Interactive tools were more frequently used to complete tasks successfully than the full guidelines and static tools (Table 7). In unsuccessful attempts, the three types of resources were used about equally. The Quick Reference Guide was the most frequently used among static tools.

Table 7 Resource accessed when completing information retrieval tasks

The homepage quick buttons leading directly to interactive tools were almost as frequently the start of final paths as links on the navigation bar (Table 8).

Table 8 Start of path when completing information retrieval tasks (n = 70)

Search function

Although the search function was not used to complete information retrieval tasks (Table 8), some users did use search unsuccessfully before changing their strategy, while others used it to complete resource retrieval tasks. In total, the website’s built-in search was accessed 13 times and successfully found an answer or resource in 2 of the 13 attempts (15.4%).


Error types

Interpretation errors accounted for 48% of total errors (Table 9). Task 1 alone accounted for over half of these, with all six unsuccessful attempts being interpretation errors. These errors were made by three of the four users who accessed the quick reference guide, which contains an algorithm, and three of the nine users who accessed the screening interactive tool. Users gave up most frequently (3 of 8 instances) in task 7, on pregnancy and diabetes.

Table 9 Types of errors committed by task

(Task 2 and Task 8 had perfect completion and are not shown on this table.)

Content-related themes

The top strengths and weaknesses pertaining to website content are shown in Table 10.

Table 10 Top strengths and weaknesses of website content identified through content analysis and their representative quotes

Users showed a preference for interactive tools, static tools, and quick reference guides, while the full guidelines text was perceived as less useful. Users liked that the interactive tools were “quick” in generating answers.

Several themes addressed the presentation of information within the content. Users wanted to see more visual representations of content, such as tables and flowcharts. Some users pointed out there was too much medical jargon on the website, which could make it less useful for patients.

Users had mixed opinions about the breadth of content. Several users praised the website for being comprehensive, but a similar number of users thought content was sparse in certain clinical areas.

Format of the website

The top strengths and weaknesses pertaining to website format are shown in Table 11.

Table 11 Top strengths and weaknesses of website format identified through content analysis and their representative quotes

Navigation elements were the most highly rated aspects of the website design. Most users found the left-hand navigation bar useful. Almost half of users found the links to the tools on the main page helpful. Being able to access the same information in multiple ways was also seen as a positive.

Many users still found it difficult to find information. Links that were formatted as lists, rather than as buttons, were challenging to use for most users. About a third of users thought the search function did not work well.


Discussion

Overall usability

Within the context of our chosen tasks, the Diabetes Canada CPG website had good usability with a task success rate for both information and resource retrieval of 79%. Users brought up many positive qualities of the website in the interview, including praise for the individual content on the website as well as design elements. A third of users praised the website for being intuitive.

Effective content

The different versions of guideline content on the Diabetes Canada CPG website facilitated task completion. For the information-retrieval tasks, the majority of successful attempts (90%) were completed using the interactive tools, static tools, or quick reference guide, rather than the full guideline chapters. Users also considered the alternate versions to be strengths of the website, while full guideline chapters were perceived to be less useful.

The semi-structured interviews elicited “comprehensiveness” as a strength and “not enough content” as a weakness of our website, which may seem contrary to the barriers of time and information overload. One explanation is that CPGs can serve other purposes, such as dedicated learning outside of clinical care, in which there are fewer time constraints. Guideline use can be improved by having alternate versions for different purposes [7]. Clinicians have significant point-of-care information needs, raising 0.18 to 1.5 questions per patient on average, most commonly regarding treatment and diagnosis [21]. However, time and information overload are important barriers to seeking information [21,22,23]. Even at the point of care, clinicians can have different preferences regarding volume of information, with “competing demands of brevity and comprehensiveness” [20, 24]. While full-text guideline documents are more comprehensive, alternate versions contain targeted information which can be used more quickly at the point of care. Thus, creating websites with multiple versions of the guidelines in one place gives users the choice of how, and how much, CPG information they want to use, and may ultimately improve uptake.

Users also wanted to see more visual content such as tables and algorithms on the website. Visual presentation can enhance information delivery and has been recommended by multiple authors [3, 25, 26]. Interestingly, we found that while visual presentation may speed up information delivery, it does not guarantee accuracy: task 1 took users the least time to complete, yet it was the task with the second most errors. Usability testing should be considered for elements like algorithms, to ensure accurate information delivery.

Effective format

Existing recommendations for CPG format include using boxes to highlight recommendations, algorithms and tables, different versions, and layering [5, 6, 26, 27]. However, few recommendations exist in the CPG literature pertaining to web design.

Most users brought up the fixed navigation bar on the website as a strength, as it allowed them to jump between different sections without using the back button. Many users also liked the quick buttons on the home page that linked to the interactive tools. In contrast, the organization of links in other parts of the website was rated poorly, as was the search function, with only 15% of search attempts generating the desired result. A major weakness of the search function we identified was the default search mode of “exact phrase” for search terms. This could lead to omission of desired results if the phrase is worded differently on the webpage or if the user makes a spelling mistake. The ranking algorithm may also be flawed, as we sometimes observed less relevant webpages containing the search terms prioritized over more relevant webpages in the results. The World Wide Web Consortium’s guidelines on web accessibility recommend implementing search functions that can generate results that account for spelling errors, different endings on search terms (stemming), and use of synonyms [28]. They also recommend using techniques like meta tags to optimize results. Our next steps for improving the website include re-designing the search function to reflect these recommendations.
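The difference between exact-phrase matching and the more forgiving token-based matching recommended above can be illustrated with a minimal sketch. This is an assumption-laden toy (the crude suffix stemmer and function names are ours, not the Diabetes Canada implementation or the W3C's), but it shows why exact-phrase defaults miss reworded pages:

```python
# Toy comparison of exact-phrase vs token-based search matching.
# The stemmer below is a deliberately crude illustration, not a
# production algorithm (real sites would use e.g. Porter stemming).
import re

def exact_phrase_match(query, page):
    # Fails whenever the page words the phrase differently.
    return query.lower() in page.lower()

def stem(word):
    # Very crude: strip a few common English endings.
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def tokenized_match(query, page):
    # Match if every stemmed query token appears among the page's tokens,
    # regardless of word order.
    def tokens(text):
        return {stem(w) for w in re.findall(r"[a-z]+", text.lower())}
    return tokens(query) <= tokens(page)

page = "Screening for diabetes in adults"
print(exact_phrase_match("diabetes screening", page))  # word order differs: False
print(tokenized_match("diabetes screening", page))     # stems match: True
```

Tokenizing with stemming rescues queries whose word order or inflection differs from the page text; handling spelling errors and synonyms, as the W3C technique suggests, would require additional machinery (e.g. edit-distance matching or a synonym index).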

In our study, the average amount of time taken to complete information retrieval tasks was only 160 seconds. This is comparable to the amount of time actual users spent per session on the previous iteration of the Diabetes Canada CPG website (180 seconds) and less than the amount of time internal medicine physicians have been observed searching online sources while working (252 seconds) [11, 29]. The task that took much longer to complete (task 7, at 352 seconds) also had the lowest success rate (43%), with some users giving up after not finding information quickly, highlighting the importance of intuitive navigation.

The strengths and weaknesses identified in our study of a CPG website are echoed in existing website design guidelines not specific to clinician use. For instance, the use of navigation menus (left-hand positioning specifically for primary menus) is recommended in the Research-Based Web Design & Usability Guidelines published by the U.S. Department of Health and Human Services (HHS) [30]. Both the HHS and International Organization for Standardization (ISO) guidelines recommend allowing access to important options on the home page and providing an effective search function [30, 31].

Recommendations for CPG website design

Based on our findings and literature review, we have synthesized a list of recommendations for guideline developers and disseminators (Table 12) that can be adapted or added to existing guideline implementation planning checklists and tools [32]. Incorporating web design recommendations in the assessment of CPG websites could lead to improvements in their usability and uptake.

Table 12 Recommendations for the design of Clinical Practice Guideline websites based on our findings and literature review

Study limitations & strengths

Limitations include convenience sampling of participants, lack of iterative testing, and the inability of usability tasks to completely replicate real-world use of the website. Our study demographic overrepresented younger, female users (64% younger than age 40, 64% female) and experienced users. This may have affected some of the study results, although we know from real-world data of our website that the actual end users also skew towards younger, female users (64% younger than age 44, 70% female). Younger users are more likely to be comfortable with technology and may be more adept at completing tasks. Sex has not been found to influence user experience of websites in other studies [33]. Previous experience may have led to better completion rates and speed, but studies show that experience affects these metrics minimally when web designs are highly effective [34]. Strengths of the study include the variety of usability tasks chosen, inclusion of participants with differing clinical backgrounds, mixed-methods design, and independent coders.


Conclusions

Multiple versions of CPGs (e.g. interactive tools, static tools, summaries) can be used to answer clinical questions more quickly. Usability testing can be used to identify previously unknown issues with CPG content. Effective web design should be assessed in the creation of CPG websites.

Availability of data and materials

All data generated or analysed during this study are included in this published article and its supplementary information files.



Abbreviations

CPG: Clinical practice guidelines

DC: Diabetes Canada

HCP: Health care providers

SMBG: Self-monitoring of blood glucose


References

  1. Gupta S, Rai N, Bhattacharrya O, Cheng AYY, Connelly KA, Boulet L-P, et al. Optimizing the language and format of guidelines to improve guideline uptake. CMAJ. 2016;188(14):E362–E8.

  2. Kastner M, Estey E, Hayden L, et al. The development of a guideline implementability tool (GUIDE-IT): a qualitative study of family physician perspectives. BMC Fam Pract. 2014;15:19.

  3. Kastner M, Bhattacharyya O, Hayden L, et al. Guideline uptake is influenced by six implementability domains for creating and communicating guidelines: a realist review. J Clin Epidemiol. 2015;68(5):498–509.

  4. Siering U, Eikermann M, Hausner E, Hoffmann-Eßer W, Neugebauer EA. Appraisal tools for clinical practice guidelines: a systematic review. PLoS One. 2013;8(12):e82915.

  5. Shiffman RN, Dixon J, Brandt C, Essaihi A, Hsiao A, Michel G. The GuideLine Implementability appraisal (GLIA): development of an instrument to identify obstacles to guideline implementation. BMC Med Inform Decis Mak. 2005;5:23.

  6. Harrison MB, Graham ID, van den Hoek J, Dogherty EJ, Carley ME, Angus V. Guideline adaptation and implementation planning: a prospective observational study. Implement Sci. 2013;8:49.

  7. Gagliardi AR, Brouwers MC, Palda VA, Lemieux-Charles L, Grimshaw JM. How can we improve guideline use? A conceptual framework of implementability. Implement Sci. 2011;6:26.

  8. Le JV, Pedersen LB, Riisgaard H, Lykkegaard J, Nexøe J, Lemmergaard J, et al. Variation in general practitioners’ information-seeking behaviour – a cross-sectional study on the influence of gender, age and practice form. Scand J Prim Health Care. 2016;34(4):327–35.

  9. Bernard E, Arnould M, Saint-Lary O, Duhot D, Hebbrecht G. Internet use for information seeking in clinical practice: a cross-sectional survey among French general practitioners. Int J Med Inform. 2012;81(7):493–9.

  10. Diabetes Canada. Diabetes Canada Clinical Practice Guidelines; 2018. Available at: Accessed Oct 2019.

  11. Yu CH, Gall Casey C, Ke C, Lebovic G, Straus SE. Process evaluation of the diabetes Canada guidelines dissemination strategy using the reach effectiveness adoption implementation maintenance (RE-AIM) framework. Can J Diabetes. 2019;43(4):263–70.e9.

  12. Nichols J, Shah BR, Pequeno P, Gall Casey C, Yu CH. Impact of a comprehensive guideline dissemination strategy on diabetes diagnostic test rates: an interrupted time series. J Gen Intern Med. 2020;35(9):2662–7.

  13. Yu CH, Lillie E, Mascarenhas-Johnson A, Gall Casey C, Straus SE. Impact of the Canadian Diabetes Association guideline dissemination strategy on clinician knowledge and behaviour change outcomes. Diabetes Res Clin Pract. 2018;140:314–23.

  14. Rigobon AV, Kalia S, Nichols J, Aliarzadeh B, Greiver M, Moineddin R, et al. Impact of the diabetes Canada guideline dissemination strategy on the prescription of vascular protective medications: a retrospective cohort study, 2010-2015. Diabetes Care. 2019;42(1):148–56.

  15. Berg B. Qualitative research methods for the social sciences. 4th ed. Boston: Allyn and Bacon; 2001.

  16. Dieticians of Canada. Find a Dietician; 2019. Available from: Accessed Oct 2019.

  17. LogMeIn. GoToMeeting; 2019. Available from: Accessed May 2019.

  18. Yu CH, Parsons JA, Hall S, Newton D, Jovicic A, Lottridge D, et al. User-centered design of a web-based self-management site for individuals with type 2 diabetes – providing a sense of control and community. BMC Med Inform Decis Mak. 2014;14:60.

  19. Jaspers MW. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform. 2009;78(5):340–53.

  20. Cook DA, Sorensen KJ, Wilkinson JM, Berger RA. Barriers and decisions when answering clinical questions at the point of care: a grounded theory study. JAMA Intern Med. 2013;173(21):1962–9.

  21. Daei A, Soleymani MR, Ashrafi-Rizi H, Zargham-Boroujeni A, Kelishadi R. Clinical information seeking behavior of physicians: a systematic review. Int J Med Inform. 2020;139:104144.

  22. Clarke MA, Belden JL, Koopman RJ, Steege LM, Moore JL, Canfield SM, et al. Information needs and information-seeking behaviour analysis of primary care physicians and nurses: a literature review. Health Inf Libr J. 2013;30(3):178–90.

  23. Davies KS. Physicians and their use of information: a survey comparison between the United States, Canada, and the United Kingdom. J Med Libr Assoc. 2011;99(1):88–91.

  24. Aakre CA, Maggio LA, Fiol GD, Cook DA. Barriers and facilitators to clinical information seeking: a systematic review. J Am Med Inform Assoc. 2019;26(10):1129–40.

  25. Versloot J, Grudniewicz A, Chatterjee A, Hayden L, Kastner M, Bhattacharyya O. Format guidelines to make them vivid, intuitive, and visual: use simple formatting rules to optimize usability and accessibility of clinical practice guidelines. Int J Evid Based Healthc. 2015;13(2):52–7.

  26. Fearns N, Kelly J, Callaghan M, Graham K, Loudon K, Harbour R, et al. What do patients and the public know about clinical practice guidelines and what do they want from them? A qualitative study. BMC Health Serv Res. 2016;16:74.

  27. Kristiansen A, Brandt L, Alonso-Coello P, Agoritsas T, Akl EA, Conboy T, et al. Development of a novel, multilayered presentation format for clinical practice guidelines. Chest. 2015;147(3):754–63.

  28. W3C Working Group. G161: Providing a search function to help users find content. In: Techniques and Failures for Web Content Accessibility Guidelines 2.0. 2016. Available from: Cited 2022 Dec 17.

  29. Hoogendam A, Stalenhoef AF, Robbé PF, Overbeke AJ. Answers to questions posed during daily patient care are more likely to be answered by UpToDate than PubMed. J Med Internet Res. 2008;10(4):e29.

  30. Leavitt MO, Shneiderman B, Baily RW, Barnum C, Bosley J, Chaparro B, et al. Research-based web design & usability guidelines. Washington DC: United States Govt Printing Office; 2006.

  31. Bevan N. Guidelines and standards for web usability. Proc HCI Int. 2005;2005:407–19.

  32. Schünemann HJ, Wiercioch W, Etxeandia I, Falavigna M, Santesso N, Mustafa R, et al. Guidelines 2.0: systematic development of a comprehensive checklist for a successful guideline enterprise. CMAJ. 2014;186(3):E123–42.

  33. Aufderhaar K, Schrepp M, Thomaschewski J. Do women and men perceive user experience differently? Int J Interactive Multimedia Artificial Intell. 2019;5(6):63.

  34. Khodambashi S, Wang Z, Nytrø Ø. Reality versus user’s perception in finding answer to clinical questions in published national guidelines on the web: an empirical study. Procedia Comput Sci. 2015;63:268–75.



Acknowledgements

Not applicable.


Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations



Contributions

WW conducted data collection and data analysis and drafted the manuscript. DC conducted data analysis. CHY conceived the study, designed the study, and supervised the research. All authors contributed to the manuscript and data analysis, and have approved the final manuscript.

Corresponding author

Correspondence to Catherine H. Yu.

Ethics declarations

Ethics approval and consent to participate

All methods were performed in accordance with the relevant guidelines and regulations.

Ethics approval for the study was obtained through the Unity Health Toronto Research Ethics Board (REB 14–085).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.


Supplementary Information

Additional file 1.

Baseline questionnaire.

Additional file 2.

Task and accepted answers.

Additional file 3.

Semi-structured interview guide for usability testing.


About this article

Cite this article

Wang, W., Choi, D. & Yu, C.H. Effective web-based clinical practice guidelines resources: recommendations from a mixed methods usability study. BMC Prim. Care 24, 29 (2023).
