
The implementation of health promotion in primary and community care: a qualitative analysis of the ‘Prescribe Vida Saludable’ strategy



Background
The impact of lifestyle on health is undeniable, and effective interventions to promote healthy lifestyles do exist. However, such promotion is not yet a fundamental part of routine primary care clinical practice. We describe factors that determined changes in the performance of primary health care centers involved in piloting phase II of the health promotion innovation ‘Prescribe Vida Saludable’ (PVS).


Methods
We engaged four primary health care centers of the Basque Healthcare Service in an action research project aimed at changing preventive health practices. Prescribe Healthy Life (PVS, from the Spanish ‘Prescribe Vida Saludable’) focuses on designing, planning, implementing, and evaluating innovative programs to promote multiple healthy habits that are feasible under routine primary health care conditions. After 2 years of piloting, centers were categorized as having high, medium, or low implementation effectiveness. We completed qualitative inductive and deductive analysis of five focus groups with the staff of the centers. Themes generated through consensual grounded qualitative analysis were compared between centers to identify the dimensions that explain the variation in actual implementation of PVS, and were retrospectively organized and assessed against the Consolidated Framework for Implementation Research (CFIR).


Results
Of the 36 CFIR constructs, 11 were directly related to the level of implementation performance: intervention source, evidence strength and quality, adaptability, design quality and packaging, tension for change, learning climate, self-efficacy, planning, champions, executing, and reflecting and evaluating, with organizational tracking added as a new sub-construct. Another seven constructs emerged in the participants’ discourse but were not related to center performance: relative advantage, complexity, patients’ needs and resources, external policy and incentives, structural characteristics, available resources, and formally appointed internal implementation leaders. Our findings indicate that implementation success is associated with three components: the context, the implementation process, and collaborative modeling.


Conclusions
Identifying barriers and enablers is useful for designing implementation strategies for health promotion in primary health care centers that are essential for innovation success. An implementation model is proposed to highlight the relationships between the CFIR constructs in the context of health promotion in primary care.

Peer Review reports


Background
The impact of health behaviors and lifestyles on health outcomes is undisputed [1], and their individual, social, environmental, and cultural determinants are well known [2]. Primary health care (PHC) professionals have many opportunities to promote healthy behaviors in patients with effective interventions [3, 4]. However, healthy lifestyle promotion is far from being integrated into routine primary care practice [5–7]. Our own research group has contributed to generating evidence on the effectiveness of clinical interventions for health promotion through several clinical trials [8, 9]. Nevertheless, the delivery of primary care interventions tends to stop after PHC professionals participate in such trials [10]. This lack of integration of evidence-based interventions may be explained by weaknesses in implementation strategies [11].

Implementation strategies can be defined as sets of methods, techniques, and interventions used to enhance the adoption and integration of evidence-based innovations into usual care [12]. In order to build strong implementation strategies, we need to identify factors that determine change in practice, namely, barriers and enablers of change [13]. This requires learning from the experiences of all innovation agents. However, when a new intervention or practice is being implemented, most of the people who can participate in such research have little or no experience of changing practice and, instead of reporting determinants of change, they report what determines their current performance. In the ‘Prescribe Vida Saludable’ (PVS) innovative project for health promotion by PHC professionals, we had the opportunity to explore the experiences of individuals who were actively involved in changing clinical practice in health promotion.

The aim of this inquiry is to identify core factors associated with success and barriers in the implementation of the PVS intervention and assess factors associated with better performance in its piloting phase. We describe the experience of the PHC professionals who have already been involved in innovation and how they assess the successes and challenges of the implementation of PVS. The analysis is based on Damschroder et al.’s Consolidated Framework for Implementation Research (CFIR) [14] which is outlined in the methods section as part of the analytical strategy. Identifying core constructs associated with implementation should strengthen future dissemination and inform the modeling process of sound and effective implementation strategies for PHC professionals practice redesign.


Methods
A qualitative comparative analysis design [15, 16] was used to comprehensively explore PVS implementation barriers and enablers. In addition to a comparative analysis of ongoing process indicators for each primary health care (PHC) center, the qualitative evaluation consisted of focus groups with PHC professionals involved. The study protocol was approved by the Primary Care Research Committee of the Basque Healthcare Services and by the Basque Country Clinical Research Ethics Committee.


The four participating PHC centers were a convenience sample, selected by the medical directors of the primary care districts of the Basque Healthcare Service on the basis of their previous involvement in health promotion programs or preventive practice optimization initiatives. The PHC professional teams initiated an action research process to design the local PVS implementation strategy for each center. In brief, a bottom-up decision-making process was initiated in the four participating centers to select actions to be included in the implementation strategy, based on discussion and consensus meetings among a multi-professional primary care team and community members. We refer to this method as collaborative modeling facilitated by the research team. A coordinator at each center was the liaison with the research team and led the process at the local level. The Department of Public Health supported the teams, with a district public health department representative attending the monthly meetings.

The PVS project emerged as an initiative facilitated by the Primary Care Research Unit of Bizkaia (UIAPB) of the Basque Health Service (Osakidetza). The research team provided external facilitation for changing clinical practice. Other participants in the project were community-based organizations, including nine local municipality departments, six schools, four sports facilities, and two manufacturing companies, as well as local councils and senior management of several Osakidetza departments (information technology, primary care, and public health).

PVS innovation

The first phase of PVS consisted of a collaborative modeling process to adapt evidence-based health promotion interventions to the specific contexts of the PHC centers and communities, and simultaneously to change PHC professional organization (see Fig. 1). Most of the staff (80%) of the four participating PHC centers were actively involved in this process, and they selected the 5 A’s evidence-based clinical intervention (A1: Assess, A2: Advise, A3: Agree, A4: Assist, and A5: Arrange Follow-Up) to address three healthy lifestyles: healthy diet, physical activity, and smoking cessation [3, 4, 17]. The second phase of PVS was the pilot implementation to evaluate the feasibility of the strategy. This pilot study included all 22,459 patients aged 10 to 65 years who had attended a healthcare appointment over the 2-year study period. Physical activity, diet, and smoking were assessed in 52% of cases; 33% of them received advice at least once on changing their behavior, and 10% were prescribed lifestyle changes with a personalized plan (see accompanying paper by Sanchez et al.).

Fig. 1
figure 1

PVS research steps

The Primary Care Research Unit provided PHC centers with a monthly progress report outlining the results of their activities, which included indicators of assessment of habits (A1), counseling and advising (A2), and prescription of changes in behavior with tailored plans (A4). These monthly reports were a cornerstone of the audit and feedback strategy. The research team met monthly with the coordinators of the healthcare teams to evaluate the program, discuss the implementation indicators, identify ways of increasing the effectiveness of the intervention, and overall reflect on what the teams were learning about the implementation process. The coordinators organized monthly meetings in their PHC centers with the same implementation improvement objectives.

Implementation strategy

The PVS implementation strategy operated at various levels: community involvement, top-down support from managers, bottom-up primary care organizational change, and the development of innovative e-health information and communication technologies. The study protocol has been published previously [18]. Through a collaborative modeling process, the participants adapted the 5 A’s strategy to the functioning of each community health center plus at least one collaborative partner in the community. The research team facilitated a bottom-up decision-making process, based on 8–10 discussion and consensus meetings among a multi-professional primary care team and community members. The intervention components did not all reside in the community health centers; A1 (Ask), for instance, could be accomplished at a school, a workplace, through a PVS website link, or by administrative personnel at the health center in advance of the primary care consultation. A2 (Advise) and A3 (Agree) were tasks mostly assigned to the family physicians, while A4 (Assist) was often a role accomplished by nurses, who prescribed and helped plan for health behavior modification. All the healthcare center professionals were involved in A5 (Arrange Follow-Up), with client services playing an important role too.

New e-health tools were designed, tested, and incorporated into the electronic health record (EHR). Screening for healthy behaviors was performed with paper-based self-report questionnaires, entered into the EHR by administrative staff, and with an electronic version publicly available on the Basque Healthcare Service web portal for individual self-assessment, with online transmission of data to the EHR through a secure web service developed for this purpose (see accompanying paper by Sanchez et al.). This screening provided automatic feedback to individuals and generated reminders in the EHR that prompted clinicians to implement health promotion activities guided by the PVS application integrated into the EHR information system. These reminders helped primary care staff to easily identify individuals whose lifestyles had not been assessed; to assess the physical activity, diet, and smoking status of attendees; to identify those not meeting recommendations; and to select high-priority populations for interventions based on data recorded in the EHR. Further, the PVS application guided clinicians in providing individually tailored advice based on effective communication of the risks and benefits associated with lifestyle. It also facilitated the prescribing of plans for lifestyle modification, providing algorithms, evidence-based support, recommendations, warnings, timetables, and other information about community resources, and it facilitated the monitoring of patients over a follow-up period. The PVS application integrated all this information and made it available to all PHC center staff, thereby making it easier to track patients.

The fidelity and maintenance of the strategy were supported by monthly audit and feedback meetings, at which indicators of patients’ results were analyzed.

Implementation performance

Considering the 2-year process indicators of the PVS second-phase pilot study, participating centers were categorized as having high, medium, or low implementation performance, based on their adoption and implementation of the PVS activities (see Fig. 1). One center was categorized as having high and one as having low implementation performance, while the other two were assigned to the medium category. Another paper [see accompanying paper by Sanchez et al.] and Table 1 provide further details of the PVS intervention during the 2-year pilot implementation process.

Table 1 Characteristics and implementation indicators of the primary care centers participating in this PVS pilot

Qualitative data collection

Staff of each of the four PHC centers were invited to participate in five focus groups (the largest center requiring two groups), which were completed over 5 weeks (see Fig. 1). Specifically, the coordinators of the participating centers were asked to invite all the PVS implementation participants, and 75 physicians, nurses, and administrative personnel took part. An average of 15 participants attended each group (SD = 3.54), from a potential pool of 109 (mean = 21.8 per group; SD = 4.49). Participants consented to the publication of their contributions.

One researcher facilitated all five focus groups at the PHC centers. The research team selected this type of group interview because one objective was for all participants from each center to take part jointly, in order to grasp the point of view of each individual within the group. The interviewer was selected to ensure that she had no previous direct relationship with, or knowledge of, the participants. The discussion was guided by a set of open-ended questions to identify which factors were associated with the degree to which the PVS program had been adopted: What are your thoughts regarding the implementation of PVS in this health center? What has been the impact of PVS on daily work? How do you interpret the indicators (participants being shown charts and graphs)? What factors explain the wide variation in implementation of PVS between centers and between different members of staff within each center? What is your assessment of how patients are being reached? What may cause problems in the implementation, and can you suggest any solutions? What are the barriers to implementation of PVS?

During the interviews, data on indicators for the sixth month of the project, taken from the monthly report, were shared with participants. These data provided a clear picture of three cases with different degrees of implementation success. An observer, a representative of the public health department in the PVS efforts, accompanied the facilitator in the interviews and wrote an ethnographic report with her own analysis of the focus groups. The data analysis was ongoing, iterative, and informed by the research team’s observations. The interviews were audio-recorded and transcribed for analysis.

Data analysis

A consensual qualitative research approach comprising two data analysis strategies was used to guide the systematic analysis of the barriers and enablers influencing the implementation. The two strategies included inductive and deductive data analysis to ensure not only triangulation of the data sources but also their trustworthiness.

The inductive methodology consisted of basic grounded thematic analysis. Basic themes were extracted from the transcripts analyzed separately by research team members and then consensually validated in team meetings. The ethnographic notes prepared by the observers of the group interviews also aided this thematic analysis and helped the organization of the transcribed material.

On completion of the first thematic analysis, the research team concluded that deductive analysis using the CFIR model could be useful for handling the wealth of data and for focusing on the task of assessing implementation dimensions. This coding framework was chosen because it offers a complete taxonomy of operationally defined constructs that can influence the adoption of complex programs. CFIR constructs are organized into five major domains [14]: (1) the characteristics of the intervention; (2) the outer setting, including patients’ needs and resources; (3) the inner setting (e.g., how compatible the program is with existing practice); (4) the process employed to implement the program; and (5) the characteristics of the individuals involved.

Each transcript was treated as a case, and two members of the research team separately completed exhaustive within-case coding. To reach consensus, the team carried out reviews of each focus group. This within-case analysis was completed for each group, and audits of the analysis were conducted to ensure consistency. After completion of the within-case analysis, common themes across cases were identified and a rating value (valence) was assigned to each code. These data reflect the attributes the participants associated with each of these concepts. We used the same criteria as Damschroder and Lowery [15] for assigning these ratings: valences from +2 to −2 reflect a positive or negative influence of each construct on the organization, work processes, and/or implementation efforts [15]. The rating was completed with the same consensual data analysis as that used in the coding of the transcripts. The analysis required continuous comparison across groups and the examination of patterns in the data. To enhance the trustworthiness of the qualitative data analysis, one of the researchers performed the analysis without knowing the quantitative outcomes of the health center in question. At the end of the analysis, vignettes were selected to exemplify the CFIR dimensions and to highlight their association with the actual PVS outcomes at each center. The research team shared the analysis with the coordinators of the PHC centers to feed back the findings.
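As a rough illustration of this cross-case comparison, the consensual valences per construct can be tabulated by center and checked against the performance ordering. The construct names below follow the CFIR, but all valence values and the monotonic-trend criterion are invented for the sketch; they are not the study’s actual data or decision rule.

```python
# Hypothetical valence table: construct -> valence (-2..+2) per center,
# with centers ordered from low to high implementation performance.
# All numbers are invented for illustration.
valences = {
    "tension for change":  {"low": -1, "medium-1": 0, "medium-2": 1, "high": 2},
    "learning climate":    {"low": -2, "medium-1": 0, "medium-2": 1, "high": 2},
    "available resources": {"low": -1, "medium-1": -1, "medium-2": -1, "high": -1},
}

ORDER = ["low", "medium-1", "medium-2", "high"]  # performance ranking


def tracks_performance(construct: str) -> bool:
    """A construct 'tracks' performance if its valences never decrease as
    implementation performance increases, and actually rise overall."""
    vals = [valences[construct][center] for center in ORDER]
    non_decreasing = all(a <= b for a, b in zip(vals, vals[1:]))
    return non_decreasing and vals[0] < vals[-1]


for construct in valences:
    verdict = "associated" if tracks_performance(construct) else "not associated"
    print(f"{construct}: {verdict} with performance")
```

Under these invented values, tension for change and learning climate would fall into the performance-associated group, while available resources (uniformly negative across centers) would not, mirroring the pattern reported in the Results.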


Results
What factors are associated with community health promotion and primary care innovation?

In our study, 18 of the 36 CFIR constructs proposed by Damschroder et al. yielded data of high trustworthiness in the research team’s analysis. According to the positive (+2, +1), neutral (0), or negative (−1, −2) valences assigned, these constructs can be divided into two groups. One group comprises 11 constructs associated with the actual level of success in implementation performance: across centers, the valences show a trend that correlates with the level of success or failure in implementation. The other set of seven constructs appears to be unrelated to the actual level of implementation performance (see Table 2).

Table 2 CFIR constructs associated with actual implementation performance

For the remaining 18 CFIR constructs, insufficient data emerged to assess their potential association with performance. They can therefore be considered less important for designing strategies for change in public PHC professional services. Despite the lack of data on these dimensions, they should be studied further to assess their relation to the implementation performance of healthy lifestyle interventions. For example, the lack of engagement of external change agents appeared in only one of the centers, the one that showed the highest implementation rating, where it appeared to be negatively valued (see Table 2).

Tables 3 and 4 include quotes that exemplify the dimensions that were or were not associated with implementation performance; the valences assigned to each set of quotes are also included in each cell. The CFIR constructs not associated with the level of performance capture important barriers to and enablers of the implementation in general, but the positive and negative values assigned to them by staff show no relation to the actual implementation observed in each center. They include: relative advantage, complexity, patients’ needs and resources, external policy and incentives, structural characteristics, available resources, and formally appointed internal implementation leaders. The CFIR constructs that distinguished between health teams with low, medium, and high implementation performance included: intervention source, evidence strength and quality, design quality and packaging, adaptability, tension for change, learning climate, self-efficacy, champions, reflecting and evaluating, planning, and executing.

Table 3 CFIR constructs associated with PVS performance
Table 4 CFIR constructs not associated with PVS performance

Our emerging conceptual implementation model for health promotion interventions synthesizes the findings and highlights the associations of the CFIR constructs with the implementation of innovation in community health and primary care practices (see Fig. 2). Successful implementation appears to be associated with three main components: the context, the implementation process, and collaborative modeling. We highlight the relationships between these dimensions and potential linkages that require further research. The model includes the CFIR constructs that appeared to be associated with actual implementation, as well as others not necessarily associated with performance [15, 19]. Even though the latter were not useful for discriminating between high and low performance, they remain important for the implementation model; these variables appear in lowercase in Fig. 2.

Fig. 2
figure 2

Implementation model for health promotion in primary and community health care


Organizational conditions
The organizational conditions associated with implementation performance include tension for change, learning climate, and self-efficacy [20]. These constructs are interrelated: a positive learning climate nurtures a sense of self-efficacy and the development and impact of effective leaders, and stronger feelings of self-efficacy, in turn, made more resources available for achieving better outcomes. These organizational conditions shape, and are influenced by, the bottom-up approach to designing and modeling the PVS intervention. The organizational climate is, therefore, the result of the characteristics of individuals, the inner setting, and the overall organizational process, dimensions that reinforce one another. Overall, the organizational conditions are in a dialectical relationship with the context, shaping it as they evolve. Several organizational constructs appear not to be associated with performance: teams perceive, for instance, a lack of resources as a severe obstacle to change and innovation, and other constructs included in the diagram show a similar pattern. However, these constructs may not be as relevant when making systematic decisions about which teams may be more motivated to change their practices.

Implementation process

The constructs champions, planning, executing, and reflecting and evaluating are shaped by the aforementioned organizational conditions and drive the collaborative modeling of the implementation. Local leaders ensure the engagement of appropriate team members and community stakeholders and draw resources into the project; in centers where the local leader was highly valued, performance was also higher. According to the dimensions we analyzed, a positive set of organizational conditions, together with the practice facilitation provided by the research team, shapes successful planning and execution. The ongoing feedback provided by the researchers on the performance of the PHC professional teams, plus time for learning-based discussion of these data, may produce ambivalent feelings, and the PHC center with the highest implementation rating demonstrated this tension most intensely. To avoid negative feelings associated with organizational tracking, the research team should not give feedback to the community teams without first assessing the teams’ own needs and their particular decision-making process.

Collaborative modeling

In the first phase of PVS, the research team worked collaboratively and transparently with the PHC centers to adapt the 5 A’s intervention (which comprises five major steps to develop healthy habits) to the local context, guided by action research principles. This engagement included discussions of epidemiological data, community demographics, evidence on healthy behavior interventions, and the need for an ecological and community approach to prevent and ameliorate chronic illnesses. The degree to which the different teams perceived a need for change varied. However, the introduction of new information and the evolving consensus on the need to change practices may lead highly motivated teams to higher levels of implementation. Without an external research team helping to articulate this process, however, the implementation would stall. The research team facilitates the process, but the ownership of the intervention and its characteristics lies in the real local context [21]. The strength of the evidence, therefore, was built on this shared understanding to motivate higher performance among community health teams. The impact of the intervention, the origins of the intervention, the design of support tools, and the adaptation to the local context nurture the organizational conditions for successful implementation.


Discussion
Eleven of the thirty-six CFIR constructs were directly related to the level of implementation performance. From the start of the inductive qualitative analysis, we realized that the construct reflecting and evaluating is difficult to assess, since the same PHC professionals valued two distinct aspects during the focus groups. On the one hand, the teams most engaged with the implementation process feedback were those with higher levels of implementation effectiveness; in these cases, reflecting and evaluating was positively received as appropriate reflection on action, and the way the teams welcomed the coaching and feedback was closely connected to their field experience. Among teams in which the implementation had lower intensity, by contrast, the impact of the evaluation process was not mentioned. On the other hand, in these cases, reflecting and evaluating generated negative feelings associated with external pressure and a perception that the external evaluators were judging performance as well as imposing some of the cumbersome requirements typical of research processes. This reaction appeared very strongly in the focus groups. We have reserved a section in Table 2 for this aspect, which we call “organizational tracking”. It is similar to goals and feedback, but we have not categorized it in that way because the evaluation is external to the health center.

Our findings can be compared with those of other studies that have applied the CFIR to evaluate the implementation of health promotion programs in other settings. In line with our results, Damschroder and Lowery [15] found that tension for change, learning climate, planning, and reflecting and evaluating clearly distinguished between centers with different levels of effectiveness in implementing the MOVE! weight management program in US Veterans Health Administration PHC centers. Those researchers did not find intervention source to be associated with implementation performance, MOVE! being an externally developed program. In our study, by contrast, the participants’ feeling of ownership of the intervention emerged as a characteristic strongly associated with implementation performance and, along with evidence strength and quality, strongly distinguished between high and low implementation performance centers. Adaptability and design quality and packaging were not useful for distinguishing between low and high implementation effectiveness of MOVE!, as materials and support tools were consistently considered helpful by all the participating centers. In the PVS pilot, however, a number of failures detected in the new information and communication technology support tools integrated into the electronic health records, and delays in fixing them, had a negative influence on the implementation efforts of the staff of the center with the highest implementation effectiveness, while medium-to-low implementation performance centers were especially appreciative of these support tools. Unlike in the case of MOVE!, the PHC centers piloting PVS designed their own implementation strategy and agreed on well-defined milestones and standard performance measures. Specifically, process constructs were negatively valued in centers with low implementation performance and positively valued, or not mentioned, in centers with high implementation performance.
See Table 3 for quotes that exemplify the constructs differentiating between centers with different levels of implementation performance.

PHC professionals’ perception of the intervention source as internal can be fostered by involving them in discussion, consensus, and decision making about priorities in each center according to its specific context, and about workflow and the roles and contributions of the different members of the PHC professional team. This bottom-up process may lead to a greater sense of ownership of, and commitment to adhering to, the program [22, 23]. Collaborative modeling of the specific implementation strategy for each PHC center in turn favors other constructs associated with implementation in our study, namely adaptability and learning climate.

In this pilot study, we identified two distinct responses to the monthly provision of clinical performance indicators. In one, PHC professionals valued it as a positive contribution because it encouraged reflection and prompted team discussions to identify problems and look for solutions. In the other, the continuous assessment and evaluation implied by the provision of indicators was perceived as stressful, because the desired changes depend in part on factors beyond the control of the healthcare providers, e.g., the design and maintenance of the information technology support tools, or requirements imposed by the research protocol associated with the implementation effort. The effectiveness of audit and feedback seems to depend on how the feedback is provided, and this requires further investigation [24, 25].


Limitations
Our research design has limitations inherent to a cross-sectional study of healthcare professionals’ perceptions of what explains performance: the factors associated with positive or negative implementation outcomes are based on the teams’ perceptions after the fact. The associations between CFIR constructs and performance could be bidirectional, the constructs are not mutually exclusive categories, and they are in continuous evolution. The focus group guide was a set of open-ended questions and was not developed using the CFIR constructs; the CFIR model oriented the analysis only after the initial stage of the grounded consensual qualitative analysis of the transcripts. As a result, the lack of data on some of the constructs may reflect bias in the data collection process rather than a lack of significance of these constructs in the participants’ experience. Our semi-structured data gathering and initial grounded data analysis, however, may have prevented a confirmation bias in favor of the CFIR framework, since the constructs were not employed in the interview design. Further research is required to establish the direction of these associations. Organizational readiness for change in community health centers could be assessed through surveys informed by a systematic review of applications of the CFIR.

Further, the omission of some of the CFIR constructs may, in some cases, be related to the specific cultural and organizational dynamics of health service delivery in the Basque Country. Other omitted constructs are embedded in, or directly affected by, dimensions that the groups did discuss. For instance, leadership engagement is a core aspect of the learning climate, a dimension that participants found significant.


Strong implementation strategies are required to influence the multiple factors associated with innovation in health-promoting practices by PHC professionals. This study identifies a set of factors associated with the implementation of the PVS program (see Fig. 2). Such strategies should be linked to specific actions, techniques, and processes that foster change and tackle barriers [13]. Partnership between clinicians and researchers is required from the design stage onwards. The first step is engaging the majority of the members of each PHC center in a collaborative activity, led by PHC professionals who are appropriately informed about the epidemiological and clinical evidence on health promotion interventions, and evolving over time through pilot cycles within a learning organization [26].

The findings and recommendations are relevant to primary care practice because they show how primary care could be strengthened to highlight the relevance of preventive measures. In many countries, primary care aims to integrate the preventive effort. Our trials were intended to contribute knowledge on how to intentionally include a preventive dimension in family practice. We believe the paper makes a contribution that bridges implementation research and health services research in the context of primary care practice.


Abbreviations

5 A’s: Ask, Advise, Agree, Assist, and Arrange follow-up

CFIR: Consolidated Framework for Implementation Research

EHR: Electronic health record

PHC: Primary Health Care

PVS: Prescribe Vida Saludable (Prescribe Healthy Life)

The Primary Care Research Unit of Bizkaia


References

  1. Lim SS, Vos T, Flaxman AD, Danaei G, Shibuya K, Adair-Rohani H, et al. A comparative risk assessment of burden of disease and injury attributable to 67 risk factors and risk factor clusters in 21 regions, 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet. 2012;380(9859):2224–60.

  2. Dahlgren G, Whitehead M. Policies and strategies to promote social equity in health. Stockholm: Institute for Future Studies; 1991.

  3. Whitlock EP, Orleans CT, Pender N, Allan J. Evaluating primary care behavioral counseling interventions: an evidence-based approach. Am J Prev Med. 2002;22(4):267–84.

  4. Glasgow RE, Goldstein MG, Ockene JK, Pronk NP. Translating what we have learned into practice. Principles and hypotheses for interventions addressing multiple behaviors in primary care. Am J Prev Med. 2004;27(2 Suppl):88–101.

  5. Grandes G, Sanchez A, Cortada JM, Balague L, Calderon C, Arrazola A, et al. Is integration of healthy lifestyle promotion into primary care feasible? Discussion and consensus sessions between clinicians and researchers. BMC Health Serv Res. 2008;8(1):213.

  6. Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.

  7. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–45.

  8. Grandes G, Sanchez A, Sanchez-Pinilla R, et al. Effectiveness of physical activity advice and prescription by physicians in routine primary care: a cluster randomized trial. Arch Intern Med. 2009;169(7):694–701.

  9. Grandes G, Cortada JM, Arrazola A, Laka JP. Predictors of long-term outcome of a smoking cessation programme in primary care. Br J Gen Pract. 2003;53(487):101–7.

  10. Grandes G, Sanchez G, Cortada JM, Calderon C, Balague L, Millan E. Useful strategies for promoting healthy lifestyles in primary care [Spanish]. Vitoria-Gasteiz: Servicio Central de Publicaciones del Gobierno Vasco; 2008. p. 122.

  11. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362(9391):1225–30.

  12. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57.

  13. French SD, Green SE, O’Connor DA, McKenzie JE, Francis JJ, Michie S, et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci. 2012;7(1):38.

  14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  15. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8(1):51.

  16. Schneider CQ, Wagemann C. Set-Theoretic Methods for the Social Sciences: A Guide to Qualitative Comparative Analysis. Cambridge: Cambridge University Press; 2012.

  17. Spring B, Ockene JK, Gidding SS, Mozaffarian D, Moore S, Rosal MC, et al. Better population health through behavior change in adults: a call to action. Circulation. 2013;128(19):2169–76.

  18. Sanchez A, Grandes G, Cortada JM, Pombo H, Balague L, Calderon C. Modelling innovative interventions for optimising healthy lifestyle promotion in primary health care: “Prescribe Vida Saludable” phase I research protocol. BMC Health Serv Res. 2009;9:103.

  19. Williams EC, et al. Strategies to implement alcohol screening and brief intervention in primary care settings: a structured literature review. Psychol Addict Behav. 2011;25(2):206–14.

  20. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford: Oxford University Press; 2012.

  21. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.

  22. Lomas J. Making clinical policy explicit: legislative policy making and lessons for developing practice guidelines. Int J Technol Assess Health Care. 1993;9(1):11–25.

  23. Leykum LK, Pugh JA, Lanham HJ, Harmon J, McDaniel RR. Implementation research design: integrating participatory action research into randomized controlled trials. Implement Sci. 2009;4(1):69.

  24. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259.

  25. Foy R, Eccles MP, Jamtvedt G, Young J, Grimshaw JM, Baker R. What do we know about how to do audit and feedback? Pitfalls in applying evidence from a systematic review. BMC Health Serv Res. 2005;5(1):50.

  26. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q. 2011;89(2):167–205.



Acknowledgements

We gratefully acknowledge all the PVS participants who contributed to the focus groups and the representatives of the public health department who accompanied the facilitator in the interviews (see Additional file 1). The authors wish to especially thank Laura J. Damschroder, MS, MPH, from the Ann Arbor VA Center for Clinical Management Research for her advice in the interpretation of results and for reviewing the manuscript.


Funding

The PVS project was funded by the Carlos III Health Institute of the Spanish Ministry of Economy and Competitiveness, co-financed by the European Regional Development Fund (research grant PS09/01461), the Health Department of the Basque Government (research grants 2007111009, 2009111072, 2011111145), the Basque Foundation for Social and Health Care Innovation (grant CA-2012-086), the Basque Research Center in Chronicity-Kronikgune (grant 11/056), and the Spanish Primary Care Research Network for Prevention and Health Promotion (redIAPP RD06/0018/0018 and RD12/0005/0010). The Basque Foundation for Science (Ikerbasque) funded Dr. Bacigalupe. The funding bodies had no role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Availability of data and materials

Since the data supporting the present study consist of routine data retrieved from the electronic health records of the Basque Health Service-Osakidetza and data collected in focus groups, they will be shared only upon justified request to the study guarantors.

Authors’ contributions

GG is the principal investigator and guarantor of the scientific quality of the PVS project and along with AS conceived the idea and designed the implementation research project. JMC, HP, CM, and PB contributed to the PVS project design, obtaining funding and coordinating the fieldwork. CM designed this qualitative evaluation, and organized and conducted the focus groups. GB, CM and JMC carried out the qualitative data analysis and triangulation. GG collaborated in and supervised quantitative and qualitative analyses. CM, GB, JMC and GG drafted the initial version and oversaw revisions of this paper. All the authors critically read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

The study protocol was approved by the Primary Care Research Committee of the Basque Health Service, Osakidetza, and by the Basque Country Clinical Research Ethics Committee (Ref: 6/2009). Health care professionals who committed to participate also gave written consent for the anonymous management and publication of data pertaining both to patients assigned to their practices and to indicators of their health care delivery activity.

Author information


Corresponding author

Correspondence to Gonzalo Grandes.

Additional file

Additional file 1:

Members of the PVS group. (PDF 193 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


Cite this article

Martinez, C., Bacigalupe, G., Cortada, J.M. et al. The implementation of health promotion in primary and community care: a qualitative analysis of the ‘Prescribe Vida Saludable’ strategy. BMC Fam Pract 18, 23 (2017).



Keywords

  • Implementation Research
  • Program Evaluation
  • Qualitative Research
  • Organizational Innovation
  • Primary Health Care
  • Community Health Services
  • Health Promotion
  • Complex Interventions
  • Pilot Implementation