
Int J Health Care Qual Assur. Author manuscript; available in PMC 2017 Oct 18.

PMCID: PMC5646166

NIHMSID: NIHMS912094

Employing continuous quality improvement in community-based substance abuse programs

Matthew Chinman

RAND Corporation, Pittsburgh, PA, 412 683-2300

Sarah Hunter

RAND Corporation, Santa Monica, CA, 310 393-0411

Patricia Ebener

RAND Corporation, Santa Monica, CA, 310 393-0411

Abstract

Purpose

This article describes Continuous Quality Improvement (CQI) for substance abuse prevention and treatment programs in a community-based organization setting.

Method

Continuous Quality Improvement methods (e.g., Plan-Do-Study-Act cycles) applied in healthcare and industry were adapted for substance abuse prevention and treatment programs in a community setting. We assessed the resources needed and the feasibility and acceptability of CQI for ten programs by evaluating CQI training workshops with program staff and conducting a series of three qualitative interviews with participating staff over a nine-month implementation period. The CQI activities, PDSA cycle progress, effort, enthusiasm, benefits and challenges were examined.

Limitations

The study was conducted on a small number of programs. It did not assess CQI's impact on service quality or intended program outcomes.

Findings

Results indicated that CQI was feasible and acceptable for community-based substance abuse prevention and treatment programs; however, some notable resource challenges remain. Future studies should examine CQI impact on service quality and intended program outcomes.

Implications for research, practice and/or society

This project shows that it is feasible to adapt CQI techniques and processes for community-based substance abuse prevention and treatment programs. These techniques may help community-based program managers to improve service quality and achieve program outcomes.

Value

One of the first studies to adapt traditional CQI techniques for community-based settings delivering substance abuse prevention and treatment programs.

Keywords: Substance abuse, Organizational change, Quality improvement, PDSA, United States

Introduction

Alcohol and other drug use among youth exacts a high toll on communities, as it is linked to increased violence, accidents and crime. Evidence-based, community-oriented substance abuse prevention and treatment programs can improve these outcomes and recoup treatment costs (NIDA, 1997). To reap these benefits, however, programs need to be comprehensive and implemented well (Backer, 2001). One way to improve program quality is to use evaluation data. While substance abuse program practitioners may have process and/or outcome evaluation data, they often lack the capacity to use them to improve programming. Putting evaluation results to work in this manner is a core tenet of Continuous Quality Improvement (CQI). While practices similar to CQI have been associated with better outcomes in researcher-led evaluations of community-based substance abuse programs (Durlak and DuPre, 2008), communities remain challenged by evaluation tasks and by how to apply data to systematically improve programs. Given CQI's positive impact in industry and healthcare, it would appear beneficial to build capacity among community-based organizations to use similar practices and improve programming quality. In this article, we assess the acceptability and feasibility of a quality improvement process in ten community-based substance abuse prevention and treatment programs.

Continuous Quality Improvement

Continuous quality improvement can be defined as a planned approach to transforming organizations by evaluating and improving systems to achieve better outcomes (Colton, 2000). In the US, CQI began historically in manufacturing, building on Shewhart's ideas (Colton, 2000). In the 1980s and 1990s, CQI was adapted for healthcare, stimulated by multiple factors: Medicaid cost containment (Burda, 1988); the National Demonstration Project in Quality Improvement in Healthcare (McLaughlin and Simpson, 1994); and the Joint Commission (1996) requiring hospital managers to have CQI programs in place.

Many CQI models have been used in industry and healthcare to provide systematic and ongoing feedback. For example, Balanced Scorecards (BSC - Kaplan and Norton, 1996) were developed to assess company performance in financial, customer, internal business process, learning and business expansion domains. Whole System Measures (Martin et al., 2007) are similar metrics developed in healthcare to measure quality and performance using relatively few high-level indicators. Dashboards, which update performance data in real time and aggregate them at individual, program and organization levels, are increasingly popular. Another common CQI model, developed by Shewhart, refined by Deming (2000) and recently popularized by the Institute for Healthcare Improvement (IHI), is the Plan-Do-Study-Act (PDSA) cycle - a method in which individuals drawn from the organization make small, repeated and rapid changes to organizational function, test their impact and then decide whether to incorporate the change permanently. Recently, PDSA cycles were tested in the Network for the Improvement of Addiction Treatment (NIATx), an initiative involving small, community-based organizations providing substance abuse treatment. In the NIATx effort, tools and technical support were provided to help improve access and retention in substance abuse treatment settings. NIATx published several case studies demonstrating PDSA's feasibility when ongoing technical assistance and support are provided (McCarty et al., 2007; 2009). The CQI effort we describe differs from NIATx, which involved only treatment programs, dictated the CQI work focus and included only programs that met stringent criteria. In this CQI project, most programs were prevention programs, the staff chose the CQI focus and all program staff wanting to participate were included. All QI approaches share three common principles (Miller et al., 2008): determining baseline effectiveness; engaging in deliberate practice such as setting objectives; and providing direct feedback on performance to make adjustments and set new objectives.
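To make the PDSA mechanics concrete, the following minimal Python sketch models a single CQI action moving through the four stages. All names and structures here are illustrative assumptions for this article, not part of NIATx, the IHI materials or the study described below.

```python
# Illustrative sketch of a PDSA (Plan-Do-Study-Act) cycle record.
# All identifiers are hypothetical; the article describes no software.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Stage(Enum):
    PLAN = 1
    DO = 2
    STUDY = 3
    ACT = 4


@dataclass
class CQIAction:
    """One small, rapid change tested by a program team."""
    description: str
    stage: Stage = Stage.PLAN
    keep_change: Optional[bool] = None  # decided at the ACT stage

    def advance(self) -> None:
        """Move to the next PDSA stage; cycles restart after ACT."""
        if self.stage is Stage.ACT:
            self.stage = Stage.PLAN   # begin another rapid cycle
        else:
            self.stage = Stage(self.stage.value + 1)


# Example: a hypothetical action from a prevention program.
action = CQIAction("Add open-ended questions to the exit survey")
action.advance()            # PLAN -> DO: implement the change
action.advance()            # DO -> STUDY: examine its impact
action.advance()            # STUDY -> ACT
action.keep_change = True   # decide to incorporate the change permanently
```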

Purpose and method

We demonstrate one CQI approach - PDSA cycles - in substance abuse prevention and treatment programs typical of community-based organizations. CQI techniques from the manufacturing industry were previously adapted to make them appropriate for traditional healthcare settings (Colton, 2000). The same effort is needed to adapt CQI for community-based settings, which typically lack resources to collect and manage quantitative outcome data. We describe how CQI was developed for community settings and present evaluative data from CQI workshops with program staff and three waves of qualitative interviews with participating staff to assess feasibility, acceptability and the resources required in community-based settings.

Study site

We studied a community-based substance abuse service organization located in Santa Barbara, CA - The Council of Alcoholism and Drug Abuse ('Council') - which operates 16 adult and adolescent programs targeting substance abuse prevention and treatment. In this demonstration, staff from ten adolescent programs participated. Staff had participated in a previous project called 'Getting To Outcomes' (GTO; Chinman et al., 2008), specifically designed to address community practitioners' needs when implementing substance abuse prevention programs. The GTO process includes ten critical steps: planning and using evidence-based strategies (steps 1–6); implementation and outcome evaluation (steps 7 and 8); and improvement and program sustainability (steps 9 and 10). Each step is accompanied by tools and worksheets that assist practitioners in implementing, maintaining and self-evaluating their programs. Previous GTO work focused on process and outcome evaluation (GTO Steps 7 and 8). Here we describe how a CQI process was developed to use evaluation data for program improvement purposes. The ten participating programs were:

  1. A positive youth development program engaging middle- and high-school youth with training, service projects and an annual youth leadership conference.

  2. A semester-long, evidence-based alcohol and drug prevention program targeting high-risk middle-school youth.

  3. A student assistance program, whose campus-based personnel delivered universal and selective prevention programs, coordinated campus-wide drug-free events and provided one-on-one counseling and referrals in middle- and high-schools.

  4. A teen court serving first-time youth offenders diverted from the criminal justice system, using peer juries to impose sentences including alcohol and drug education or community service.

  5. A second teen court implemented in a different part of the county.

  6. An ongoing mentoring program that matched high-risk elementary school children with adult mentors.

  7. An evidence-based brief intervention designed to prevent marijuana use among adolescents (Motivational Enhancement Therapy/Cognitive Behavioral Therapy-5; Sampl and Kadden, 2001).

  8. An educational program for parents experiencing challenges raising children aged 10–18 years, including six interactive classes and an ongoing parent support group.

  9. A four-week educational group for parents and their teenage children focusing on increasing parental involvement, drug education, family communication skills and discussions about the legal, psychological and physical aspects of substance use.

  10. An outpatient, adolescent substance abuse treatment program.

Developing CQI

The CQI process described below resulted from study investigators collaborating with Council staff and incorporating concepts and tools popularized by the Institute for Healthcare Improvement (i.e., PDSA); RAND's work improving practice guideline implementation in the US Army's medical system (Nicholas et al., 2001); and RAND's collaborative quality improvement projects with Phoenix House, a national substance abuse treatment organization. However, these concepts and tools needed to be adapted because, in community-based organizations, measuring quality - a task central to CQI success - was challenging: process and outcome indicators were more difficult to identify and track. In community-based organizations delivering substance abuse prevention services, outputs are services delivered rather than tangible manufactured products or specific medical procedures. Despite these challenges, many techniques and tools developed in business and healthcare were applicable (with some adaptation). Although many Council programs had developed process and outcome measures using the GTO model in a previous study (Chinman et al., 2008), most still faced challenges when using evaluation data to make systematic improvements. With Council leaders, we developed a broad CQI process: CQI activities were developed and planned during semi-annual CQI Workshops, and the time between Workshops (called Implementation Periods) was used to implement the strategies (Figure 1).

Figure 1 (image not available)

Starting in November 2006, a CQI Workshop Planning Committee met eight times before the first CQI Workshop (Spring 2007). The Planning Committee included representation from over half the participating programs' staff and worked collaboratively on developing the workshop agenda, activities and CQI tools - worksheets created by modifying previously developed instruments (Phoenix House materials, Ishikawa or fishbone diagrams) and by creating new ones. All were designed to help program staff progress through the PDSA cycle. A key Planning Committee decision was to ask program staff to summarize their process and outcome evaluations for the CQI Workshops, and some tools were developed to assist program staff in summarizing evaluation information. The Planning Committee reported progress and gathered input at regularly scheduled monthly meetings with all ten participating program directors.

Study investigators served as CQI coaches, helping to refine processes and tools. At the first CQI Workshop, study investigators, Planning Committee members and the Executive Director for the organization all played key roles explaining CQI's value. Then staff, in small break-out groups, answered questions about CQI and how it could be implemented and sustained. Using their evaluation summaries, staff from each program developed specific CQI actions to implement using the PDSA cycle with input from the study investigators and executive director. After this workshop, the first implementation period got underway when program staff started their CQI actions. The executive director asked participants for CQI updates at monthly staff meetings.

Council staff asked study investigators to facilitate a second, half-day CQI workshop (Fall 2007) to educate seventeen new staff; previous workshop participants were not required to attend. This workshop focused on developing or refining CQI actions rather than on didactic GTO or CQI presentations. Afterwards, program staff continued their revised CQI actions (i.e., the second implementation period). In May 2008, a third, full-day CQI workshop was conducted for all nineteen participating program staff. Participant numbers varied over time as budgets and staffing changed from year to year. After a CQI refresher, cross-program groups were formed and developed an organization-wide CQI action to improve referrals between programs, based on staff requests to work on an issue that cut across programs. Staff also refined old CQI actions or developed new, program-specific ones. The study ended after the third workshop. Additionally, two study investigators attended monthly staff meetings to provide further coaching and to answer questions. Following the first workshop, study investigators contacted program staff quarterly to administer an interview protocol and provide additional technical assistance (e.g., tips to improve program recruitment, easy-to-use techniques for coding data from open-ended survey questions). These calls lasted 45 to 90 minutes.

There are key differences between our CQI process and those used in other studies. For example, many studies screened and then selected participants (or CQI proposals) with higher capacity; provided participants with significant additional funding to carry out CQI activities; and/or stated a priori what organization staff would work on (e.g., McCarty et al., 2007; Rubenstein et al., 2006). In our study, all programs in the organization that wished to participate were included, no additional funding was provided and research staff (as coaches) worked collaboratively with program staff to focus their CQI efforts. We took this approach because we were interested in assessing the degree to which CQI could be undertaken in real-world situations (i.e., without additional funding) and because we believed that, consistent with organizational change theories (Green et al., 1980), extensive collaboration and tailoring to meet local staff needs would promote greater uptake of the CQI process.

Measures

A ten-item measure based on the reliable (Cronbach's alpha = 0.95), 22-item Texas Christian University Workshop Evaluation (WEVAL) (Bartholomew et al., 2007) was used to assess each CQI workshop. All items had four-point response scales (e.g., very dissatisfied = 1 to very satisfied = 4; Table I). Given CQI's novelty in these settings, we attempted to characterize the CQI work accomplished, the resources required and CQI's feasibility and acceptability using both open- and closed-ended interview questions with participating staff. Question topics were: the Nature of the CQI Actions that program staff developed (open-ended); Progress within the PDSA Cycle (closed-ended, defined by whether they reached each PDSA stage, rated by the interviewer); Effort, comprising the Staff Involved in the project (closed), Resources Required for CQI Actions (open), Collaborations Required for CQI Actions (closed, defined as whether or not a collaboration with personnel outside the participating program was needed) and Hours Spent on the CQI project in the last three months (closed); Enthusiasm (closed, 1-10 scale, 10 indicating highest enthusiasm); and the Benefits and Challenges encountered (open). All interview items were developed for this study.
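As an illustration of the internal-consistency statistic cited above, the following minimal Python sketch computes Cronbach's alpha from an item-response matrix. The ratings below are fabricated for the example; they are not data from the study.

```python
# Minimal sketch of Cronbach's alpha for a respondents-by-items matrix
# of 1-4 ratings. The data are simulated purely for illustration.
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: (respondents x items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


rng = np.random.default_rng(0)
# Simulate 24 respondents x 10 items with correlated ratings:
# a per-respondent baseline plus small item-level noise.
base = rng.integers(1, 5, size=(24, 1))
ratings = np.clip(base + rng.integers(-1, 2, size=(24, 10)), 1, 4)
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```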

Table I

Workshop evaluation

Items Mean (sd) within three data collection waves
W1 (n=24) W2 (n=17) W3 (n=19)
1. Information Quality 3.25 (.91) 3.35 (1.00) 3.53 (.61)
2. Information Relevance 3.25 (.88) 3.35 (.94) 3.53 (.92)
3. Workshop Organization 3.58 (.72) 3.18 (.95) 3.42 (.84)
4. Presenter Sensitivity 3.58 (.79) 3.53 (.80) 3.42 (.90)
5. Opportunity for questions/discussion 3.46 (.99) 3.53 (.72) 3.37 (.76)
6. Handout Quality 3.26 (.83) 3.47 (1.01) 3.16 (.90)
7. Time allocated to tasks 3.21 (.69) 3.19 (1.11) 3.29 (.78)
8. Overall satisfaction 3.50 (.70) 3.47 (.62) 3.37 (.68)
9. Opportunity to develop new ideas 3.30 (.94) 2.82 (1.01) 2.89 (.94)
10. Likely to use plans you developed here 3.74 (.55) 3.38 (.62) 3.63 (.50)

Data collection

The workshop evaluation survey was administered to all participants at each of the three workshops (n = 24, 17 and 19). The CQI interview about progress on CQI actions was conducted with the directors of the ten participating programs at three, six and nine months following the initial CQI workshop. All program directors gave consent for the project, which was approved by the RAND Corporation's Institutional Review Board. For efficiency, two authors were assigned three programs each and one author was assigned four programs, based on the programs with which each author was most familiar. Interviews were conducted over the telephone, with a separate note taker taking detailed notes for each interview. Interview notes were provided to each author, who then reviewed and revised them for accuracy and completeness.

Data analysis

Descriptive statistics were computed for the workshop evaluation and closed-ended interview responses at each wave. To preserve anonymity, responses were not linked over time but evaluated as a cohort at each wave. The open-ended responses were analyzed by beginning with the interview protocol to establish the general themes (e.g., CQI actions) while allowing additional themes to emerge, consistent with grounded theory (Glaser and Strauss, 1964; Strauss and Corbin, 1990). From the detailed notes, a research assistant familiar with GTO but not otherwise connected to the project listed responses under the relevant protocol questions for each program. The senior member of the GTO team reviewed the detailed listings and developed written narratives summarizing responses across all programs, following the interview protocol questions. The GTO team then reviewed the detailed interview summaries and prepared a final narrative across all the interviews based on consensus discussion.
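For readers wanting to reproduce this kind of per-wave summary (as in Table I), a minimal pandas sketch follows. The column labels and ratings are hypothetical stand-ins, not the study's data.

```python
# Sketch of per-wave descriptive statistics (mean and sd per item),
# analogous to the Table I cells; all values are made up.
import pandas as pd

# Long-format responses: one row per respondent per item.
df = pd.DataFrame({
    "wave":   ["W1", "W1", "W2", "W2", "W3", "W3"],
    "item":   ["Overall satisfaction"] * 6,
    "rating": [4, 3, 3, 4, 3, 4],
})

# Mean and standard deviation for each item within each wave.
summary = df.groupby(["wave", "item"])["rating"].agg(["mean", "std"])
print(summary.round(2))
```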

Results

CQI workshop evaluations

Reactions to each CQI Workshop were positive across nearly all dimensions: all items were rated above three on a four-point scale (four being the highest; see Table I). The exception was 'opportunity to develop new ideas', which was rated lower after the second and third workshops (x̄ = 2.82 and 2.89, respectively).

CQI interviews

Across the ten programs, 19 CQI actions were developed over nine months (Figure 1). The 'cross-program' CQI actions were not counted in this total and were not tracked, owing to the study's timeline. Five strategies involved adding or significantly revising a program component (e.g., a new video, a new aftercare component, a new or revised curriculum). Four involved improving staff competencies, for example by adding new staff education or improving existing education. Three made improvements to existing evaluation plans by either adding new data collection (open-ended, qualitative questions added to an existing survey) or modifying existing survey questions. Three involved enacting changes to improve communication across programs and program referrals, or to better coordinate services for clients enrolled in more than one program. Two involved improving program recruitment; one involved devising ways to increase funding; and one used the CQI framework to implement program strategies already planned (a youth leadership conference). The changes mostly targeted program staff, with only two activities directly affecting clients (e.g., improving recruitment).

PDSA cycle progress

To evaluate feasibility, we wanted to understand the degree to which programs could accomplish each PDSA step. Figure 2 shows how many CQI actions reached each PDSA step, organized by CQI action type. Of the 19 planned CQI actions, 14 (74%) were implemented (i.e., DO), 13 (68%) were assessed in some manner (i.e., STUDY) and 12 (63%) reached the ACT phase, meaning that an official decision was made to modify, continue or discontinue the CQI action. Eleven of those 12 were continued and one was discontinued (improving staff competency). Strategies varied in the extent to which they progressed through the PDSA cycle (Figure 2). As a group, the five strategies attempting to add or significantly revise a program component faced the most difficulty; only one (revising an existing curriculum) reached the ACT phase. Three of the four strategies designed to improve staff competencies were attempted, studied and acted upon; the fourth was not attempted because the staff member leading the effort left the program. Two of the three strategies aiming to improve communication across programs progressed to the end of the PDSA cycle; the third was not attempted because it was viewed as too difficult. All strategies involving increasing program recruitment, using the CQI process to implement program strategies already planned, securing additional funding and improving/expanding evaluation plans successfully progressed through all PDSA cycle stages.

Figure 2: CQI actions reaching each PDSA step (image not available)
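The percentages reported above follow directly from the stage counts; the short Python sketch below re-derives them. The counts per furthest stage are taken from the text; the tally code itself is illustrative.

```python
# Re-derive the PDSA progress tallies reported in the text:
# 14/19 reached DO (74%), 13/19 STUDY (68%), 12/19 ACT (63%).
from collections import Counter

# Furthest PDSA stage reached by each of the 19 CQI actions
# (5 stopped at PLAN, 1 at DO, 1 at STUDY, 12 reached ACT).
furthest = ["PLAN"] * 5 + ["DO"] * 1 + ["STUDY"] * 1 + ["ACT"] * 12

order = ["PLAN", "DO", "STUDY", "ACT"]
counts = Counter(furthest)
for i, stage in enumerate(order):
    # An action "reached" a stage if its furthest stage is this one or later.
    reached = sum(counts[s] for s in order[i:])
    print(f"{stage}: {reached}/19 ({reached / 19:.0%})")
```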

Effort

To evaluate Council members' efforts, we examined the following domains across all strategies over nine months, except for hours spent, which is reported separately for each of the three quarters corresponding to the three interview waves:

  • Staff involved: All but one CQI action involved the entire program staff (developing and distributing new written training materials to improve staff competencies was completed by just the program director).

  • Resources required: Half the CQI actions were reportedly completed by existing staff; the other half required additional resources (e.g., to support new training or to assist with newly established evaluation activities).

  • Collaborations required: Five CQI actions (about one-quarter) required collaboration between different programs within the Council. For example, three CQI actions involved establishing procedures to recruit youth and families from among individuals served by other Council programs; the other two involved staff from one program providing data management training and assistance to staff in other programs. Two further actions forged collaborations with organizations outside the Council: one received assistance developing new training materials and the other involved receiving donations for a training conference.

  • Hours spent on CQI varied between 1 and 80 (x̄ = 28.1, sd = 31.9) in Quarter 1; between 1 and 240 (x̄ = 45.7, sd = 74.1) in Quarter 2; and between 0 and 180 (x̄ = 33.7, sd = 58.0) in Quarter 3. We did not count the hours that staff in one program spent using the CQI structure to conduct programming they were already doing (reported as 20 hours a week for three months).

Enthusiasm

Staff were enthusiastic about working on CQI activities in Quarter 1 (x̄ = 8.6, range = 5 to 10), but enthusiasm declined slightly across the three waves: in Quarter 2, x̄ = 8.0, range = 7 to 10; in Quarter 3, x̄ = 7.8, range = 5 to 10.

Benefits

Staff reported that the CQI structure assisted their programs by prompting them to review performance data and pursue improvements. Staff reporting on five CQI actions mentioned that the processes and tools helped 'keep them on track' and gave them a 'to do list' for improving their programs. Staff working on other actions stated that using the CQI process held them more accountable than they had been before the CQI launch. Staff from two other programs indicated that CQI motivated them to improve their programs. These themes were reported consistently across the three time points.

Challenges

The most commonly cited challenges across all three time points were the time, funds and staff availability needed to carry out CQI actions. Half the program directors stated they had insufficient time and staff to meet service demands and that additional CQI activity had to be conducted in that context. Staff turnover directly caused two CQI actions to go unexecuted in the first quarter and slowed CQI implementation in other programs. During the first implementation period, we learned that the complexity of conducting evaluation (a focus of three CQI strategies) prompted staff to request assistance with analyzing qualitative and quantitative data; developing designs to capture more survey respondents; developing new survey tools; and modifying existing tools to better fit their needs. Staff from only one program stated that the CQI process, which involves studying and assessing changes, was unfamiliar. At the last time point, two program directors mentioned that CQI methods, while initially overwhelming, got easier with experience and training. At this point, a few directors stated that program staff had not supported the planned changes developed during CQI actions, which impeded implementation; in these cases, staff told the directors that the changes were not helping (e.g., staff perceived that the competency improvement efforts were not building competency) or involved too much work for the time available.

Discussion

We assessed the feasibility of using traditional CQI techniques in community-based substance abuse prevention and treatment settings. While there were challenges, we found that it is feasible for staff to use CQI techniques when supported. Staff in the ten participating programs developed 19 CQI actions, involved most program staff, secured additional resources for half the CQI actions, forged collaborations with other Council programs or outside entities in about a quarter of the CQI actions and spent on average between 2.33 and 3.75 hours a week on CQI activities over the nine-month period. Roughly two-thirds completed their CQI actions, i.e., progressed through the entire PDSA cycle. Although progression is not equivalent to program impact, we argue that it is an important proxy (necessary, but not sufficient) for program impact to occur. Program staff were generally positive about using CQI; enthusiasm, while declining over the nine months, remained high overall. Staff ratings of opportunities to develop new ideas declined in the second and third workshops. This finding is consistent with how the workshops were run: all of the first workshop was spent generating new ideas, whereas the subsequent workshops split time between generating new ideas and refining old ones. Qualitative data showed that program staff felt the CQI process helped them become more organized and accountable. As such, the results indicate that these CQI methods can be a bridge between conducting self-evaluation and making concrete changes to improve programming.

We learned several lessons that have implications for using CQI in community-based settings. Much of the progress in getting CQI off the ground followed significant support from Council leaders, which grew out of a multi-year relationship with Council staff in which the value of data and improvement was discussed often, through use of the GTO model. From there, CQI champions within the organization emerged and played a key role shepherding the process. Having local organizational support (Colton, 2000; Heller and Arozullah, 2001) and champions present (Shortell et al., 2004) have been cited as key factors in adopting quality improvement activities. Outside facilitators (GTO staff) were critical to engaging and supporting Council staff and helping to launch the CQI effort. At the outset, Council staff asked for assistance in using their evaluation data to improve programming, signaling a readiness to begin a CQI process, but they needed help getting started. In this case, GTO staff played an important role engaging program staff in CQI development and planning for their organization, training Council staff in CQI fundamentals, establishing the basic CQI structure (i.e., repeated workshops with intervening implementation periods) and providing ongoing support. Such technical assistance or facilitation is becoming widely accepted as necessary for translating research findings into practice (Stetler et al., 2006), and several successful CQI studies in various domains have included this support (McCarty et al., 2007; McBride et al., 2000; Ockene et al., 1996; Carney et al., 1992).

The type of CQI activity undertaken may affect its likelihood of success in these settings. CQI actions that were larger in scope (i.e., changing a program significantly) were completed at a lower rate than smaller strategies. After reviewing 55 CQI studies across several domains, Shortell et al. (1998) similarly found that negative results were more common when CQI was applied to more complex care processes. Choosing to alter programs significantly may undercut a key component of the PDSA cycle: making small, low-risk changes in the clinical setting (Powell et al., 2008). Choosing large-scale changes put staff in situations where they needed significantly more resources, beyond their own ideas and ingenuity, to implement those changes successfully. Establishing CQI activities also involves making decisions about issues on which program staff often have strong opinions. Many implementation theories suggest that failing to incorporate local stakeholder input can undermine new practice implementation (Fixsen et al., 2005), and we submit that starting CQI is no different. Therefore, we believed it was critical to collaborate with Council staff and leaders throughout the CQI implementation. Although study investigators broadly outlined the CQI process, Council staff made important contributions to many of the CQI tools and workshops. This facilitated 'buy-in' from most staff and ensured the resulting process and tools were tailored to fit their organization.

Conclusions, limitations and future research

Given the study's limited scope, these findings should be interpreted cautiously: our study involved ten programs in one organization, and the CQI research instruments' psychometric properties are unknown given that they were developed for this study. Research assessing CQI feasibility across more community-based programs is needed. It is also unclear whether the changes affected overall service quality or their intended outcomes. This is not uncommon, as much CQI research focuses on implementation (Shojania and Grimshaw, 2005), and this study falls into that category. Future research should focus on CQI outcomes, which will require studies over longer time frames and with resources to collect and analyze both implementation and outcome data.

Staff in the ten participating programs served different populations, and the CQI actions executed varied in focus and scope. While these circumstances may suggest that CQI is possible in many kinds of community-based programs, the variation in programming also introduces potential confounders that make interpretation more difficult. More rigorous studies, such as randomized controlled trials (RCTs), are needed to better understand CQI impact in these settings. This raises a tension inherent in CQI research: many comment that local adaptation is an important factor in attempting and sustaining CQI (e.g., Powell et al., 2008), while maintaining high internal validity - or the degree to which one can be confident that the results were actually caused by the CQI work - requires more standardization, as would typically be found in an RCT. One solution in future CQI studies could be to compare conditions in which the CQI targets and processes are prescribed against approaches that allow CQI to be developed locally. Finally, given the study's short, nine-month duration, we are not able to comment on the CQI programs' long-term sustainability. To assess whether community-based programs can 'go it alone', studies are needed in which assessments are made after the initial training and support are discontinued.

Continuous quality improvement methods have a long history in manufacturing and, more recently, in healthcare. These techniques could greatly benefit community-based prevention and treatment practitioners, helping them use evaluation data to improve their programs. In collaboration with staff in one community-based prevention and treatment organization, we have demonstrated that adapting these techniques to community settings is feasible. Our study is an encouraging first step; more research is needed on the impacts such approaches have on service quality and intended program outcomes.


References

  • Backer TE. Finding the Balance: Program Fidelity and Adaptation in Substance Abuse Prevention. CSAP; Washington, DC: 2001.
  • Bartholomew NG, Joe GW, Rowan-Szal GA, Simpson DD. Counselor assessments of training and adoption barriers. Journal of Substance Abuse Treatment. 2007;33(2):193–199.
  • Burda D. Providers look to industry for quality models. Modern Healthcare. 1988;18(29):24–32.
  • Carney PA, Dietrich AJ, Keller A, Landgraf J, O'Connor GT. Tools, teamwork and tenacity: an office system for cancer prevention. Journal of Family Practice. 1992;35(4):388–394.
  • Chinman M, Hunter S, Ebener P, Paddock S, Stillman L, Imm P, Wandersman A. The Getting To Outcomes demonstration and evaluation: an illustration of the Prevention Support System. American Journal of Community Psychology. 2008;41:206–224.
  • Colton D. Quality improvement in health care: conceptual and historical foundations. Evaluation and the Health Professions. 2000;23(1):7–42.
  • Deming WE. Out of the Crisis. MIT Press; Cambridge, MA: 2000.
  • Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology. 2008;41(2–3):327–350.
  • Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation Research: A Synthesis of the Literature. FMHI publication no. 231. University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; Tampa, FL: 2005.
  • Glaser BG, Strauss AL. Awareness contexts and social interaction. American Sociological Review. 1964;29(5):669–679.
  • Green LW, Kreuter MW, Deeds SG, Partridge KB. Health Education Planning: A Diagnostic Approach. Mayfield; Palo Alto, CA: 1980.
  • Heller C, Arozullah A. Implementing change: it's as hard as it looks. Disease Management and Health Outcomes. 2001;9(10):551–563.
  • Joint Commission on the Accreditation of Healthcare Organizations. 1997–98 Comprehensive Accreditation Manual for Behavioral Health Care. Joint Commission on the Accreditation of Healthcare Organizations; Oakbrook Terrace, IL: 1996.
  • Kaplan RS, Norton DP. Balanced Scorecard: Translating Strategy into Action. Harvard Business School Press; Cambridge, MA: 1996.
  • Martin LA, Nelson EC, Lloyd RC, Nolan TW. Whole System Measures. IHI Innovation Series White Paper. Institute for Healthcare Improvement; Cambridge, MA: 2007.
  • McBride P, Underbakke G, Plane MB, Massoth K, Brown RL, Solberg LI, Ellis L, Schrott HG, Smith K, Swanson T, Spencer E, Pfeifer G, Knox A. Improving prevention systems in primary care practices: the Health Education and Research Trial (HEART). Journal of Family Practice. 2000;49(2):115–125.
  • McCarty D, Gustafson D, Capoccia VA, Cotter F. Improving care for the treatment of alcohol and drug disorders. The Journal of Behavioral Health Services and Research. 2009;36(1):52–60.
  • McCarty D, Gustafson DH, Wisdom JP, Ford J, Choi D, Molfenter T, Capoccia VA, Cotter F. The Network for the Improvement of Addiction Treatment (NIATx): enhancing access and retention. Drug and Alcohol Dependence. 2007;88(2–3):138–145.
  • McLaughlin CP, Simpson KN. Does TQM/CQI work in health care? In: McLaughlin CP, Kaluzny AD, editors. Continuous Quality Improvement in Health Care: Theory, Implementation, and Applications. Aspen; Gaithersburg, MD: 1994.
  • Miller SD, Hubble MA, Duncan BL. Supershrinks. What's the secret of their success? Psychotherapy in Australia. 2008;14(4):14–22.
  • NIDA. Preventing Drug Use among Children and Adolescents: A Research-Based Guide. NIH publication no. 97-4212. Washington, DC: 1997.
  • Nicholas W, Farley DO, Vaiana ME, Cretin S. Putting Practice Guidelines to Work in the Department of Defense Medical System: A Guide for Action. MR-1267-A. Arroyo Center: Center for Military Health Policy Research; Santa Monica, CA: 2001. Available at http://www.rand.org/pubs/monograph_reports/MR1267/ (accessed 10th October 2006).
  • Ockene IS, Hebert JR, Ockene J, Merriam PA, Hurley TG, Saperia GM. Effect of training and a structured office practice on physician-delivered nutrition counseling: the Worcester-area trial for counseling in hyperlipidemia (WATCH). American Journal of Preventive Medicine. 1996;12(4):252–258.
  • Powell AE, Rushmer RK, Davies HTO. A Systematic Narrative Review of Quality Improvement Models in Health Care. Social Dimensions of Health Institute at the Universities of Dundee and St Andrews; Dundee: 2008.
  • Rubenstein LV, Meredith LS, Parker LE, Gordon NP, Hickey SC, Oken C, Lee ML. Impacts of evidence-based quality improvement on depression in primary care: a randomized experiment. Journal of General Internal Medicine. 2006;21(10):1027–1035.
  • Sampl S, Kadden R. Motivational Enhancement Therapy and Cognitive Behavioral Therapy for Adolescent Cannabis Users: 5 Sessions. Cannabis Youth Treatment (CYT) Series, Vol. 1. Center for Substance Abuse Treatment, Substance Abuse and Mental Health Services Administration; Rockville, MD: 2001.
  • Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Affairs. 2005;24(1):138–150.
  • Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what will it take to accelerate progress. Milbank Quarterly. 1998;76(4):593–624.
  • Shortell SM, Marsteller J, Lin M, Pearson M, Wu S, Mendel P, Cretin S, Rosen M. The role of team effectiveness in improving chronic illness care. Medical Care. 2004;42(11):1040–1048.
  • Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Kimmel B, Sharp ND, Smith JL. Role of 'external facilitation' in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science. 2006;1:23. Available at http://www.implementationscience.com/content/1/1/23 (accessed 5th December 2006).
  • Strauss A, Corbin JM. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Sage Publications, Inc; Thousand Oaks, CA: 1990.

Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5646166/
