Like evidence-based medicine, evidence-based education seeks to produce sound evidence of impact that can be used to intervene successfully in the future. The function of educational innovations, however, is much less well understood than the physical mechanisms of action of medical treatments, which makes the production, interpretation, and use of educational impact evidence difficult. Critiques of medical education experiments highlight a need for such studies to do a better job of deepening understanding of learning in context; conclusions that "it worked" often precede scrutiny of what "it" was. The authors unpack the problem of representing educational innovation in a conceptually meaningful way. They propose the more fundamental questions of "What is the intended intervention?" and "Did that intervention, in fact, occur?" as alternatives to the ubiquitous evaluative question of "Did it work?" The authors excavate the layers of an intervention (techniques at the surface, principle in the middle, and philosophy at the core) and propose layered analysis as a way of examining an innovation's intended function in context. They then use problem-based learning to illustrate how layered analysis can promote meaningful understanding of impact through specification of what was tried, under what circumstances, and with what result. Layered analysis should support innovation design and evaluation by illuminating what principled adaptation of educational technique to local context could look like. It also promotes theory development by enabling more precise description of the learning conditions at work in a given implementation and of how those conditions may evolve with broader adoption.