
Better Improvement Research


Slide Presentation from the AHRQ 2008 Annual Conference


On September 9, 2008, John Ovretveit made this presentation at the AHRQ 2008 Annual Conference. The original PowerPoint® presentation (1.9 MB) is available for download.


Slide 1

Better Improvement Research

John Ovretveit,
Director of Research and Professor, Karolinska Medical Management Centre,
Sweden, and Professor of Health Management, Faculty of Medicine, Bergen University

Resources can be downloaded from: http://homepage.mac.com/johnovr/FileSharing2.html

Slide 2

Recognition of AHRQ & researchers: You are making a difference...

The slide shows a screenshot of a page entitled "Health Care Innovations Exchange" from AHRQ's Web site.
Just some achievements:

  • Shojania (ed.), 2001: 700-page review of safety interventions
  • Quality and safety indicators
  • Culture survey
  • TeamSTEPPS™ & other tools
  • Innovations exchange

Notes:

  • A "Thank you from Europe"
    AHRQ and researchers funded by them

Slide 3

Achievements

Notable research funded by AHRQ:


Slide 4

Acknowledge also:

QUERI series, Mittman et al., e.g., Yano 2008

  • ...employed to foster progress through QUERI's six-step process. We report on how explicit integration of the evaluation of organizational factors into QUERI planning has informed the design of more effective care delivery system interventions and enabled their improved "fit" to individual VA facilities or practices. We examine the value and challenges in conducting organizational...


Slide 5

Achievements

The slide shows a black screen.

  • Questions are the answer

Slide 6

Excellence shown, but now challenges

  • #1: Is it effective? (for many types of QSI)
  • #2: Why? A causal model
  • #3: Who cares anyway? Be more useful to research users
    • 3a: How to implement it?
    • 3b: Researcher-user interaction: use knowledge translation research/knowledge to shape questions and enable users to apply findings
  • = Exciting opportunity for research innovation
  • But silos

Slide 7

My subject: interventions to providers/organizations, not patients; evaluating non-standardizable complex interventions and implementation strategies

Not treatments, such as BBs after AMI (beta-blockers after acute myocardial infarction). But:

  • Intervention to get BBs given appropriately (e.g., education, guidelines, CDS, audit)
  • Intervention to spread chronic disease management (CDM); e.g., Breakthrough collaborative
  • Rapid Response Team (RRT) or Crew Resource Management (CRM)
  • Development programme to lead improvement
  • Pay-for-Performance (P4P) for quality and safety
  • Accreditation: benefits for costs compared to alternatives?

Slide 8

Distinguish

The slide shows a three-column table comparing interventions, implementation strategies, and context, using a gardening analogy.

  • Intervention ("Seed"):
    • Clinical QSI (e.g., prescribe BBs)
    • Organizational QSI (e.g., care management; RRT)
  • Implementation Strategies ("Planting"):
    • Education; guidelines; audit and feedback; academic detailing (for clinical QSI)
    • Breakthrough collaborative (for organizational QSI)
    • Which strategies are effective for which intervention? Classification of strategies?
  • Context ("Soil and Climate"):
    • Organizational structure; culture; systems; financial system?
    • Which features help and hinder which strategies/support which interventions?

Slide 9

Themes

  • Horses for courses:
    • Match method to question and type of QSI
    • More flexibility and innovation
  • It's not the camera, but what's behind and in front of it, that makes a quality picture
  • It's not the intervention, but the context and the beneficiaries, that make the impact

Slide 10

  • Evaluation method depends on: How context dependent is the intervention?
    More complex = more dependent on context for implementation
  • Level of the target of the intervention:
    • Individual
    • Team
    • Department
    • Hospital
    • Regional health system
    • National health system
  • Likely effect of context on implementation and on the effects of the intervention:
    • Drug on patient: context independent
    • Health promotion intervention: context dependent

Slide 11

Next: 4 challenges and resolutions

  • Useful research
  • Efficacy
  • Effectiveness/generalization
  • Translation
  • Examples: RRT; CRM; Transition interventions; Accreditation.

Slide 12

The table presents "Challenges" and "Resolutions."

  1. Decisionmakers' information needs:
    • Their hierarchy of evidence
  2. Proof of efficacy:
    • RCT/CT priceless; ...for all else, strengthen observational studies; parallel process evaluation; reporting
  3. Effectiveness research for generalization:
    • Pragmatic trials—variations; case study; theory-based research; action evaluation learning cycle
  4. Faster, wider use:
    • Content; process; structure; culture? Silos?

Slide 13

#1 challenge: decisionmakers' information needs

  • Go/no-go decision—pilot or full-scale?
  • Implementer's guidance: adapt and progress it?
  • Install update?
    Needs: useful, credible information, now, about:
    • Costs, savings, benefits, risks—for our organization
    • Implementation to maximize success
      • Don't even think about it unless....
    • Utility not purity: "good enough validity" & some attention to bias
  • Researcher response?
    • No compromise—publication and promotion

Slide 14

#1 challenge: decisionmakers' information needs

"Many QIs have small to moderate effect"

  • Research design limitations?
    Does the quantitative RCT/CT design:
    1. Fail to measure enough intermediate or ultimate outcomes?
    2. Obscure extremes, where context is important?
    3. Require prescribed implementation, when iterative adaptation is necessary?

Slide 15

#1 challenge: decisionmakers' information needs

Resolution by decisionmakers:

  • Hierarchy of evidence:
    1. Face validity/make sense?—Try it on a small scale
    2. Steve or Jane's experience in Kansas
    3. Institute for Healthcare Improvement (IHI) practitioner reports: O1 > I  >  O2 data (Before>Intervention>After)
    4. Published practitioner-scientist study
    5. High-church medical journal publication
  • Proportionality of proof—cost/ease, risk, benefit

Slide 16

#2 Challenge: Efficacy proof

  • Does it work—anywhere?
    • Maximize certainty of attribution of outcomes to the intervention
    • Causal assumptions: why/how does it work?
  • Resolutions:
    • Paradigm: O1 > I > O2 quantitative experimental black box
      • Is there a difference?
    • Better:
      O1 > I   >  O2     Bigger difference?
      O1 > ?  >  O2
      • Other explanations for the difference?
      • Control, randomize, compare; hygiene to avoid contamination by confounders (a worked sketch follows below)
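
To make the comparison-group logic concrete, here is a minimal sketch of the O1 > I > O2 versus O1 > ? > O2 design as a difference-in-differences calculation, in Python. The outcome measure and all numbers are hypothetical, not from the presentation.

```python
# Controlled before-after (O1 > I > O2 vs. O1 > ? > O2) as a
# difference-in-differences estimate. All numbers are hypothetical.

# Mean outcome, e.g., % of eligible AMI patients prescribed beta-blockers.
intervention_before, intervention_after = 62.0, 78.0   # O1 > I > O2
control_before, control_after = 61.0, 66.0             # O1 > ? > O2

change_intervention = intervention_after - intervention_before  # 16.0 points
change_control = control_after - control_before                 # 5.0 points (secular trend)

# The comparison group absorbs the "other explanations for the difference";
# what remains is the change attributable to the intervention.
effect = change_intervention - change_control
print(f"Difference-in-differences estimate: {effect:.1f} percentage points")
```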

Slide 17

Disconnect between

  1. Linear, sequential intervention-to-outcome assumptions underlying research designs and explanations, and
  2. Sophisticated systems understanding of causes:
    • Outcomes are the result of a number of causes
    • Causes interact with each other and with influences outside the boundary of the system
  • Note: e.g., Senge's archetypes (latent predisposing factors/active "cause"); ref. Anderson et al. 2005

Slide 18

#2 Resolutions to increase proof of Efficacy

  • Strengths
    • √ For specifiable, controllable interventions, like a drug
    • = √ If unchanging: control known confounders and randomize the others; 2-3 measures are all you need
  • Limitations
    • Absence of the above. Works for whom?—Multiple perspectives. Unintended consequences—study more outcomes
    • Decisionmakers' translation—the information they need in addition

Slide 19

#2 Resolutions to increase proof of Efficacy

  • Strengthening:
    • Parallel process evaluation
    • Reporting ("SQUIRE" etc.)
      • (Labels for what was implemented, not the brand)
    • Attribution "steroids" for observational studies (a sketch follows below):
      • Sensitivity analyses to assess results; propensity score (Johnson et al. 2006) and instrumental variable (Harless and Mark 2006) methods
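
As an illustration of one of these strengthening methods, below is a minimal sketch of propensity-score adjustment using inverse-probability weighting. The simulated data, variable names, and use of scikit-learn are illustrative assumptions, not methods detailed in the presentation.

```python
# Minimal sketch: propensity-score adjustment for an observational study.
# Simulated data and scikit-learn usage are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 2))                               # hypothetical confounders
treated = (X @ [0.8, -0.5] + rng.normal(size=n)) > 0      # exposure depends on confounders
y = 2.0 * treated + X @ [1.0, 1.0] + rng.normal(size=n)   # outcome; true effect = 2.0

# 1. Model the probability of treatment given confounders (the propensity score).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Inverse-probability weighting: reweight so the groups are comparable.
w = np.where(treated, 1 / ps, 1 / (1 - ps))
effect = (np.average(y[treated], weights=w[treated])
          - np.average(y[~treated], weights=w[~treated]))
print(f"IPW-adjusted effect estimate: {effect:.2f} (true value: 2.0)")
```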

Slide 20

#3 Challenge: effectiveness research for generalization

  • Effectiveness in different situations?
  • Issues:
    • Many interventions are sensitive to context:
      • Implementable only if changed to suit the context
      • Evolve in interaction with a changing context—a journey/story
    • That is: the efficacy guarantee is violated by user adaptation of some interventions
    • For others: failure is guaranteed if you do not adapt
      • Or buy installation and a 3-year guarantee

Slide 21

#3 Resolutions: generalizable effectiveness research

  • R1: Maintain paradigm: "pragmatic trials"
    • Minimize loss of attribution with time series, stepped-wedge designs, and SPC (but these increase cost and time; see the control chart sketch below)
    • Some √ for routine practice feedback
    • Generalizable to similar situations and interventions
    • Add more situations and variations of the intervention
      • Compare many pragmatic trials and assess what works best where
      • Invite trials in X situations?
      • Improve reporting (standardization and details)
  • —ve: no answer to why?
    • —Explanation helps adaptation, and contributes to science
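
To illustrate the SPC option, here is a minimal control chart sketch in Python; the measure, the monthly rates, and the control limits are hypothetical.

```python
# Minimal sketch of an SPC control chart tracking an outcome through an
# intervention in routine practice. All data are hypothetical.
import numpy as np

monthly_rate = np.array([4.1, 3.8, 4.4, 4.0, 4.3, 3.9,   # baseline period
                         3.2, 2.9, 2.6, 2.8, 2.4, 2.5])  # after the intervention
baseline = monthly_rate[:6]

# Control limits computed from the baseline period only (mean +/- 3 sigma).
center = baseline.mean()
sigma = baseline.std(ddof=1)
lcl, ucl = center - 3 * sigma, center + 3 * sigma

for month, rate in enumerate(monthly_rate, start=1):
    flag = "  <-- special cause" if not lcl <= rate <= ucl else ""
    print(f"Month {month:2d}: {rate:.1f}{flag}")
# A sustained shift below the lower control limit (months 7-12) signals a
# change that routine variation is unlikely to explain.
```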

Slide 22

#3 Resolutions: context sensitive generalizable effectiveness research

  • R2: Case study research
    • √ Describes intervention as it evolves & context helpers and hinderers
    • √ Assesses intermediate changes
    • √ Links these to ultimate patient/cost outcomes, if possible
    • Multiple case studies in selected situations (e.g., Dopson 2002)
  • NEXT: What we have learned in doing this research

Slide 23

What we have learned in doing this research

  • The research:
    • 12 Action evaluation case studies of innovation implementation in Swedish health care
    • Variety of "research into practice" implementation and change studies

Slide 24

L2: Distinguish

  • Safer clinical practices:
    • Changed provider behavior = reduced adverse events?
  • Safer organization and processes: "the seed"
    • Support changes in provider behavior and address latent causes
  • Implementation actions to achieve the above: "planting"
    • At team, organization, system, and national levels
  • External context helpers and hinderers: "soil & climate"
  • Note: (Is a MET/RRT a safe clinical practice or a "safer organization or process" change, or both?)

Slide 25

Blank Slide

Slide 26

L3: Theory essential—of the intervention pathway to outcomes

  • To decide which data to gather (a minimal sketch follows below)
  • To provide explanations to test
  • To give to implementers to help them adapt
  • Note: (Program theory: Weiss 1972, 1997; Rog & Fournier 1997. Logic model: Wholey 1979. Theory-driven evaluation: Chen 1990; Sidani & Braden 1998. Realist evaluation: Henry et al. 1998; Pawson & Tilley 1997. Theories: Grol et al. 2007)
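
As an illustration of how an explicit program theory drives data gathering, here is a minimal sketch of a logic model for a hypothetical RRT intervention; the content is invented for illustration, not taken from the presentation.

```python
# Minimal sketch: a program theory ("logic model") for a hypothetical
# rapid response team (RRT) intervention. All content is illustrative.
logic_model = {
    "intervention": "Rapid response team available 24/7",
    "mechanisms": ["earlier recognition of deterioration",
                   "faster escalation of care"],
    "intermediate_outcomes": ["RRT calls per 1,000 admissions",
                              "time from trigger to review"],
    "ultimate_outcomes": ["unplanned ICU admissions",
                          "in-hospital cardiac arrests"],
}

# The theory decides which data to gather: each mechanism and outcome named
# above becomes a measurement in the evaluation plan.
for stage in ("mechanisms", "intermediate_outcomes", "ultimate_outcomes"):
    for item in logic_model[stage]:
        print(f"measure [{stage}]: {item}")
```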

Slide 27

L4: Action evaluation learning cycle

  • Feed back findings during implementation:
    • Pluses and minuses for science
  • Assess effect of researcher on implementation and results
  • Helps develop intervention during the implementation journey
  • Increases cooperation and access to data
  • Partnership, but distinct roles
  • Study how implementers use knowledge and help use more

Slide 28

#4 Challenge: use—faster, wider

  • Demand?—Real men don't need research.
  • Supply?—Real researchers don't write exec summaries.
    • Make sure it is unusable, with "throw it over the fence" delivery
  • Closing the research/practice gap

Slide 29

Translation in QSI HSR

  • Evidence > Test > Package:
    • User > Adapt > Implement/Adjust
  • Development Translation 1:
    • (Intervention development and testing)
  • Implementation Translation 2:
    • (Adoption/spread)
  • What is the intervention?
  • Where do you draw the boundary?

Slide 30

#4 Resolutions—our experience

  • Use KT/KM literature—what works?
  • Content: accessibility and relevance
    • Service implications; many examples; 3:20:appendix report formats; ghost writers and mediator authors
    • Engage emotionally: patient describes experience or video
  • Process: interact with users at each stage
  • Structure: forums, networks, joint appointments, brokers

Slide 31

The table presents "Challenges" and "Resolutions."

  • Decisionmakers' information needs:
    • Their hierarchy of evidence
  • Proof of efficacy:
    • Randomized Controlled Trial (RCT)/CT priceless; ...for all else, strengthen observational studies; parallel process evaluation; reporting
  • Effectiveness research for generalization:
    • Pragmatic trials—variations; case study; theory-based research; action evaluation learning cycle
  • Faster, wider use:
    • Content; process; structure; culture? Silos?

Slide 32

Questions

  • Efficacy and causality:
    • Systems thinking in research—causality explanations and data gathering
    • Always a trade-off between internal/external validity?
  • Generalizable effectiveness research:
    • Journey/story approach—unique?
  • Use: faster, wider:
    • Extend the researcher role?
    • Increase demand?
    • Effect of the action role?

Current as of January 2009


Internet Citation:

Better Improvement Research. Slide Presentation from the AHRQ 2008 Annual Conference (Text Version). December 2008. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/about/annualmtg08/090908slides/Ovretveit.htm


 
