The initiative described in this quality improvement report took all of eight years and two months, from idea generation to publication, which is oftentimes the only acceptable endpoint of a project in academic medicine. The project was meaningful work, and the results and lessons learned were well worth the effort. In this article, I would like to share some leadership lessons learned on a personal and professional level that did not have a place in the publication. These are in hindsight, which, as they say, is 20/20. The disclaimer is that I am not representing my co-authors’ views or those of the institutions involved in the initiative.

1. You need to be invited to and be at the table. During the week in which 1115 waiver project proposals were due in December 2012, the chief medical officer at the time invited me, as an infectious diseases expert, to weigh in on whether sepsis mortality reduction was a worthwhile project to pursue. I happened to sit next to the officer in charge of writing and submitting the proposals to the Centers for Medicare and Medicaid Services (CMS) via the state. She had the list of outcomes (central line infections, urinary tract infections, falls, medication errors, expanding access, etc.) that CMS was interested in. When I asked whether we could package the healthcare-associated infections (HAIs) and the sepsis improvements together as one project, her reaction was, “of course yes!” If infection prevention as a department had wanted to submit project proposals and budget requests to executive leadership for each of those outcomes individually, it might have taken years. Pitching work on outcomes that were not among the publicly reportable infections might have been a hard sell.

2. Thinking quickly in opportune moments helps. This point doesn’t need elaboration.

3. It is important to work with bosses and colleagues who trust you. Things do move at the speed of trust. The initial project proposal was written and submitted within a week of that meeting.

4. Having the necessary subject matter knowledge and a good sense of organizational needs is helpful. CMS provided a list of suggested interventions that we could pick and choose from and fit into different frameworks: the quality improvement framework of Structures – Processes – Outcomes; the framework described in the healthcare epidemiology literature of the day, which suggested a balanced combination of technical and adaptive, vertical and horizontal approaches; and the Agency for Healthcare Research and Quality concept of “raise the floor,” based on the idea that “your organization is only as strong as your weakest link.” We took a quick inventory of the local context and needs, picked targets we thought were achievable, and chose interventions we thought might help us reach those targets.

5. We get to improve our patient outcomes with the data we have, not the data we would like to have. In healthcare, we cannot always get the data we want. We are constrained by access only to secondary data, by insufficient personnel for data collection, and sometimes by logistics that are simply too complicated. For example, we wanted to corroborate hand hygiene adherence data from direct observations with the volume of hand sanitizer used, which presented another set of challenges. To overcome these challenges, we converted the hand hygiene direct observation data to defects per million opportunities, and we obtained purchase data from the purchasing department and standardized it to liters purchased per month.
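For readers unfamiliar with the conversion, here is a minimal sketch of the arithmetic behind defects per million opportunities (DPMO); the function name and the numbers are illustrative only and are not from the project’s data.

```python
# Illustrative sketch of the DPMO conversion used for hand hygiene data.
# "Defects" are missed hand hygiene opportunities; "opportunities" are all
# observed moments when hand hygiene was indicated.
def dpmo(missed_opportunities: int, total_opportunities: int) -> float:
    """Defects per million opportunities: (defects / opportunities) * 1,000,000."""
    return missed_opportunities / total_opportunities * 1_000_000

# Hypothetical example: 120 missed opportunities out of 1,500 observed
# (92% adherence) corresponds to 80,000 DPMO.
print(dpmo(120, 1500))  # 80000.0
```

Expressing adherence this way put the observation data on a scale that could sit alongside other defect-rate measures, while the purchasing data, standardized to liters per month, served as an independent check on the observed trend.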

6. It is helpful to have room for learning and figuring things out along the way. We knew clinician engagement was important enough to include as an intervention, but when we researched best practices and frameworks to use, we didn’t find many that were helpful at the time. So we used an adapted version of discovery and action dialogues, with questions I thought were pertinent to the project. I conducted ninety-four hour-long interviews with leaders of all strata and some frontline clinicians, using a questionnaire. The interviews helped me understand their readiness and availability to contribute to the initiative, their response to me as a project leader, and their original ideas on how to achieve improvement.

7. Contributions, big and small, enabled project success. During our clinician engagement efforts, our mantra was, “we’ll take what we can get.” Leaders and stakeholders were allowed latitude in what they chose to contribute. One example was the medical school office of quality and safety education sponsoring the clinical safety and effectiveness course (a nine-day course in clinical quality improvement) for project participants and stakeholders free of charge.

8. Initial success generated enthusiasm and increased team morale. Meeting project goals in the first year motivated our teams to keep going. Our teams and team leaders were already tired from putting out regulatory fires; this project was proactive and energizing, as opposed to reactive responses to regulatory findings that drained our energy. My job changed into one of channeling the enthusiasm toward what was important to project success.

9. Framing the results and lessons learned from the initiative in their proper context is important, even if it is messy. Change in healthcare systems is complex, with multiple moving parts. As we examined the study results, we recognized that concurrent improvements in the culture of safety and hand hygiene were very important to project success, in addition to the menu of interventions we implemented as part of the initiative. Sometimes this “messiness” is a deterrent to academic publication, and you don’t get academic credit until the data are published.

10. Last but not least, improvement work is like scaling mountains. As the title of a book on Paul Farmer goes, there are mountains beyond mountains. We can spend future energy achieving similar improvements in infection-related outcomes in other health systems, and scaling these efforts to other types of improvements in healthcare delivery. Embarking on large, complex initiatives involves calculated risk-taking. We learn by doing.