“Wait! Before you start, there are some things you should know…” How often have you used these words to stop your students or colleagues from getting started on a project which has not yet been fully planned or for which they are not fully prepared? You know that there are critical steps and considerations that must be addressed for projects to move forward successfully, and that initial enthusiasm, while helpful, is no substitute for careful preparation. Implementing technology initiatives in schools and classrooms requires the same careful planning. This Research in Brief reviews the literature behind such implementation to provide some guidance to teams charged with getting an initiative started.
The National Implementation Research Network (NIRN) defines implementation as “a specific set of activities designed to put into practice an activity or program” (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005, p. 5). NIRN has researched program implementation across disciplines, including social services, business, engineering, and education, providing a broad overview of the challenges and facilitators. Its review highlights major problems in implementation practice, such as relying solely on implementation “by edict” or training alone, or implementing a new intervention without fidelity, without a broad enough scale to effect change, or without a plan for sustainability. The information in this Research in Brief will help you avoid these mistakes, traps, and false starts.
NIRN's work is based on a cross-disciplinary literature review, which identified a sequence of stages that implementation efforts must address in order to be successful: exploration and adoption, program installation, initial implementation, full implementation, innovation, and sustainability. These stages often must be addressed iteratively as efforts are reassessed or reevaluated in light of new realities. Each stage is described below, and examples from CITEd’s work with schools and districts across the country illustrate the process of technology implementation.
NIRN estimates that the first four stages described here take systems two to six years to complete, so pace your planning knowing that early work is an investment in implementing an effective solution. In addition, NIRN emphasizes the importance of aligning the goals of the immediate setting with the layers of settings that connect it to the larger community: the district, the community, and the state. Starting with the goal in mind is essential to the integrity and evaluation plans of implementation efforts. As the saying goes, if you don’t know where you are going, any road will get you there; planning for a clear goal, on the other hand, helps chart a particular course with specific choices, consequences, and intended outcomes. CITEd’s EdTech Locator is a tool that can help school-based teams discuss how their uses of technology align with their goals for teaching and learning. Consult the Locator with a team of colleagues to chart a course toward deeper technology integration in the service of fuller inclusion and higher expectations for all students.
In fact, the importance of a team approach is emphasized in the NIRN work. A single champion cannot carry out the system-level work necessary to effect real change. See the companion Research in Brief article, Implementation Teams, to learn more.
This stage is the initial process of problem articulation and solution identification. What is the problem that requires focused remediation? For example, “meeting AYP” is too vague; “boosting math scores among English language learners who, according to district test data, seem to be struggling” is a much more narrowly defined problem.
Getting to the heart of an identified problem is not easy. Many layers of perceptions, assumptions, and habits often have to be peeled away. One administrator shared that it took several months of data analysis for her team members to “gather the courage” to ask the hard questions that pinpointed areas of systemic weakness.
The CREATER model of problem solving, drawn from the findings of OSEP-funded technical assistance centers, provides guidance in this problem-identification process (Havelock & Hamilton, 2004). This model emphasizes the importance of gathering a team of champions to establish priorities, define the heart of a problem, and identify a path toward solutions. The importance of this early definitional and social marketing work cannot be overestimated as an investment in the long-term success of the implementation effort. Learn more about the CREATER model and its background.
Once a problem is identified and well defined, the team needs to explore the evidence-based solutions available to address it. Where can you find programs and solutions that have evidence of effectiveness, and how can you determine which programs best fit your setting? Several U.S. Department of Education-sponsored databases collect and validate the evidence on curricula and school reform models. The What Works Clearinghouse has evaluated the research behind curricula and interventions. Identifying effective teacher strategies and interventions is the focus of the Doing What Works site. Both are good databases to examine for evidence-based practices. You can find and compare educational and assistive technology solutions in the TechMatrix, an online database of tools, software, and research in the areas of assistive technology, reading, writing, and mathematics, co-developed by CITEd and the National Center for Technology Innovation (NCTI). Search the TechMatrix by learning support (such as providing practice and reinforcement or providing alternative access to print), feature (such as text-to-speech or progress monitoring), or specific product. A clearinghouse of assistive technology and mobility products is available at www.ableproject.org, where you can search by product category, role, or age of end user.
Bringing practitioners and stakeholders together around the solution is a challenge that must be faced in this early stage. Implementation teams need to engage in dialogue with colleagues and the larger community to ensure that the problem and solution are aligned with the community’s broad goals.
This stage focuses on the system that is being altered in order to take on the process of implementing a new program or solution. Dean Fixsen, co-director of NIRN, notes that this stage is the most often overlooked in education. Schools and practitioners, it seems, do not share the organizational habit, common among community-based organizations and commercial enterprises, of “building out” an existing system in advance of starting a new program. Rather, schools often adopt a new program or initiative as an add-on to existing staff, time, equipment, and commitments. Too often, the result is a disappointment for all involved.
Ideally, this stage of implementation involves taking stock of existing resources—human, physical, and financial—for possible reassignment, as well as addressing resource gaps for the planned new program. Some questions to ask at this stage include:
- Do new policies need to be written to reflect the upcoming changes?
- Do existing staff members have the expertise to implement the identified solution or are new hires necessary?
- Who will deliver the training, and when and where? How will staff and stakeholders be paid or compensated for attending the training?
- How will the effectiveness of the training be measured and how will ongoing learning be supported?
- Are our physical space, infrastructure, and equipment adequate?
- Can additional funding streams be sought to cover the long-term costs?
- What outcome measures will be watched and how will progress be tracked? What are the benchmarks? What metrics will be used?
Paying attention to issues such as these demonstrates a commitment by the system and its leaders to ensure that a program or project and its practitioners will be supported through the implementation.
Here is an example of how one school district faced just this situation. The district received a grant for an influx of new classroom technologies. Over the summer holidays, the district arranged for the installation of interactive whiteboards, ceiling-mounted projectors, and a wireless network throughout its buildings, and upgraded the teachers’ computers with presentation software. A demonstration project of technology-enhanced teaching methods was scheduled for the fall. Teachers returned to school to find themselves committed to a new teaching project involving equipment they had not learned to use. When teachers’ use of the equipment did not take off as hoped, administrators realized they had not paid enough attention to training teachers and building out human infrastructure during the installation phase. They delayed the start of the demonstration project in order to focus the available professional development time on helping teachers become comfortable with the new equipment in their rooms and on establishing a teaching and learning community among the teachers.
While program installation focuses on the alterations to the system, initial implementation focuses on the changes that must occur in practice. Resistance may arise when practitioners experience the uncomfortable sensation of changing their daily practice—using new language, equipment, routines, or documentation. Supporting practitioners through this stage is critical, NIRN cautions, “when the program is struggling to begin and when confidence in the decision to adopt the program is being tested” (Fixsen et al., 2005, p. 16). This is not a pilot test of the program, but the initial roll-out with practitioners.
Schools and districts can get stuck in this mode of initial implementation, trying one solution after another but not persisting through the initial resistance until practitioners incorporate the program into their practice. When this is coupled with a lack of attention to preparing the system during the program installation phase, the intervention enters the system on very shaky ground indeed; it is no surprise in these circumstances that practitioners, left without additional support, resist rather than try to accommodate a new program. It’s no wonder, then, to hear teachers say something like the following: “This year it’s critical thinking, last year it was cooperative learning. I’ve stopped listening to the superintendent’s annual back-to-school message” (Zorfass, 2001, p. 89).
Factors found to be key supports to program implementation include:
- professional development that is planned, intentional, and part of an ongoing, school-wide effort;
- leadership, in which school leaders serve as role models, cheerleaders, and facilitators, and ensure that teachers have the needed resources;
- organization and a structure that supports, encourages, and recognizes change efforts; and
- resources and support, internal and external, such as tech support, administrative leadership, buy-in from the community, and partnerships with other educational organizations.
Learn more about these factors and the research behind them in the companion Research in Brief, Technology Implementation in Schools: Key Factors to Consider.
The transition from initial implementation to full implementation draws attention to the process of “scaling up.” This is more than increasing the number of practitioners charged with using the intervention. Learn more about the issues involved in the companion Research in Brief, Scaling Up Technology Initiatives.
Once an evidence-based program or practice has been scaled up—adopted in a system as standard practice—it can be considered in the full implementation phase. Practitioners become reflective and more skillful in their practice of the program. The system reflects the program’s core elements of practice, language, and outcome measures. Benefits of the program begin to show in the data. This phase reflects the feedback loops from practitioners, stakeholders, and collected data to managers and administrators that help solidify the full implementation of the program.
One large district, for example, began a computer-loaner program to address the digital divide they saw opening across socioeconomic lines in their area. They implemented the program over several years, drawing on community engagement, corporate partnerships (to secure the equipment), and teacher training (to embed deeper utilization of the technology in the curriculum as well as homework assignments). Access to computers at home has extended the school day and has become an expectation in the district rather than an issue of equity. Through this effort, they report that their district’s technology plan has come into tighter alignment with the curriculum and instruction plan, and that families have become more involved in the students’ schooling.
This phase is sensitive to the balances in the system that got it started. As staff shift, in roles and responsibilities as well as in number, the core team of champions is likely to change, and the original training grows diffuse among the team. Changes in leadership roles, in particular, expose the implementation process to setbacks and detours. NIRN’s model of Core Implementation Components (see figure below) illustrates how these inevitable shifts can be counterbalanced with planning and a focus on program-centered rather than practitioner-centered practice (see Chapter 4 of the NIRN literature review for a full explanation of the Components). For example, over time, recruitment criteria may relax, bringing less experienced staff into the system; one remedy is to make pre-service training more rigorous to fill in the gaps for new practitioners. Similarly, if administrative supports are withdrawn or lost through attrition, other components will need to be strengthened to keep the implementation moving forward, perhaps through stronger, more expert coaching on site.
Systems in this phase of an implementation are ready to research the effectiveness of their efforts and revisit alignment with the broader goals of the community, which may have shifted since the program was developed. Those involved might work with the program developers or others to determine the effectiveness of a program functioning in place compared to the evidence base behind the program. Questions to consider include:
- Are the outcome measures at the expected level of performance? Why or why not?
- Are the expected performance standards still appropriate?
- Have the goals of the community and district shifted since the program began? Is the program still aligned to the larger goals?
- What does a cost analysis of training and other investments show as the return on investment for this program?
- What do our practitioners think of this program? Has it become expected practice and spread beyond the champions?
These types of questions go beyond annual evaluation data to a more in-depth evaluation of the process from its inception.
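For the cost-analysis question above, a team might start with a simple tally of program costs against an estimated dollar value of benefits. The sketch below is purely illustrative: the cost categories, dollar figures, and benefit estimate are hypothetical placeholders a team would replace with its own data, and the ratio shown is only one of many ways to frame return on investment.

```python
# Hypothetical cost-vs-benefit tally for a program's training investment.
# All names and figures below are illustrative, not data from any district.

def simple_roi(total_cost, estimated_benefit):
    """Return the net benefit per dollar invested."""
    return (estimated_benefit - total_cost) / total_cost

# Example cost categories a team might track over several years
costs = {
    "initial_training": 45_000,
    "ongoing_coaching": 30_000,
    "equipment_refresh": 25_000,
}
total_cost = sum(costs.values())

# The benefit figure is a team's judgment call (e.g., the value assigned
# to reduced remediation or avoided program restarts), not a measurement.
estimated_benefit = 130_000

print(f"Total investment: ${total_cost:,}")
print(f"Net return per dollar invested: {simple_roi(total_cost, estimated_benefit):.2f}")
```

With these placeholder figures, the program returns 0.30 net per dollar invested; the value of the exercise lies less in the number itself than in forcing the team to name its cost categories and make its benefit assumptions explicit.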
Sites and teams should attempt to implement a program with as much fidelity as possible from the early stages in order to collect and evaluate outcome data and compare results against the evidence base. This will help establish the credibility of the implementation effort and validate the hard work practitioners invest in implementing a new program.
However, each site will need to make some adaptations to any external evidence-based program it adopts. Unique community, population, or personnel factors make some shift from the pure design inevitable.
How can your team identify the elements that characterize a program and are key to the successful outcomes reported in the literature? What can change, and what must be adopted as is? These questions are real challenges for practitioners far removed from the original researchers and program designers. One solution is to engage the designers or researchers in dialogue or consultation about your site’s adaptations. These individuals may be more receptive than you think.
Another option is to work with an intermediary organization that has experience with the model and previous implementation efforts. The federally funded network of regional resource, comprehensive, and technical assistance and dissemination centers, of which CITEd is one, may be able to broker conversations with researchers or offer suggestions and resources for innovations that maintain the integrity of the program. See the map of U.S. Department of Education-funded centers and contact one that is convenient to you or that focuses on your area of interest. Any adaptation to an evidence-based program needs to be carefully planned to retain program fidelity and achieve results. Continued data collection and evaluation are important for tracking the success of the innovation.
Adapting a research-based model was a challenge for one school. A school-based team spent a significant amount of time researching potential programs to meet their needs and identified Looking at Student Work, a professional development model derived from OSEP-funded technical assistance research (the STAR Tech model). This model creates a teaching and learning community among teachers in a school through a structured professional development routine and conversation. Technology solutions are an integral component of the solutions teachers identify to address classroom-based goals and needs. Adapting the model to this school’s unique circumstances, in which teachers could not meet face-to-face on a regular basis, required conversations with an intermediary (CITEd technical assistance liaisons) as well as with the designers of the program at the Education Development Center, who were pleased to discuss elements that could be adjusted without threatening the effectiveness of the model. The result was a unique innovation that met the needs of the school team and drew on the research base of the model.
Planning with the ends in mind means that sustainability, far from the last stage on the list, is an integral part of the implementation process as a whole. We have all seen implementation efforts led by a single champion or team that have struggled and withered when one member of the team is reassigned or the external coaching support is withdrawn. Long-term vision should be an integral part of the entire implementation model, requiring that teams focus on support, scaling up, and sustainability from the inception of the planning process.
As a team, how can you establish and sustain a community that thrives on teaching and learning? Addressing the factors related to implementation, carefully planning for each of the stages, and supporting evidence-based sustainability models will set the stage for successful efforts. As a teacher, sharing your learning is critical to reducing isolation and expanding your repertoire of solutions. Find ideas for teaching and learning communities that support sustainability in technology initiatives in the Research in Brief article, Sustaining Technology Implementation.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature . Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
Havelock, R., & Hamilton, J. (2004). Guiding change in special education: How to help schools with new ideas and practices . Thousand Oaks, CA: Corwin Press.
Zorfass, J. (2001). Sustaining a curriculum innovation: Cases of Make It Happen! In J. Woodward & L. Cuban (Eds.), Technology, curriculum and professional development: Adapting schools to meet the needs of students with disabilities (pp. 87-114). Thousand Oaks, CA: Corwin Press.