- Introduction: The Case for a Hybrid Innovation Model: Navigating Wicked Problems with Lean Action Research
- Part I: Deconstructing the Core Methodologies
- Part II: Synthesis – The Lean Action Research (LEARN) Framework
- Chapter 4: Forging the Hybrid: Principles of the LEARN Framework
- Chapter 5: The LEARN Cycle: A New Iterative Process
- Chapter 6: The Minimum Viable Intervention (MVI): From Product to Practice
- Chapter 7: Impact Accounting: Beyond Vanity Metrics in Social Change
- Chapter 8: Cultivating the Community for Lean Action Research
- Chapter 9: The Practitioner’s Toolkit: Navigating the LEARN Cycle
- Chapter 10: The Critical Decision Point: To Pivot or Persevere in Practice
- Conclusion: The Future of Collaborative, Evidence-Based Problem-Solving
Introduction: The Case for a Hybrid Innovation Model: Navigating Wicked Problems with Lean Action Research
The Challenge of “Wicked Problems”
In the contemporary landscape of public service, education, healthcare, and social development, leaders and practitioners are increasingly confronted with challenges that defy simple, linear solutions. These are often termed “wicked problems”—complex, systemic issues characterized by high degrees of uncertainty, interconnectedness, and ambiguity.1 Problems such as climate change mitigation, chronic disease management, educational inequity, and urban poverty are not merely complicated; they are dynamic systems where the problem itself is hard to define, and any attempted solution can have unforeseen consequences.1 Traditional top-down, plan-and-implement models of change frequently fail in this environment. They are often too slow, too rigid, and disconnected from the nuanced, contextual realities of practice.4 The result is a history of public innovations that are often “accidental and episodic,” rather than systematic and sustained, leaving a gap between the scale of the problems we face and our capacity to solve them.1
Introducing the LEARN Framework
This report addresses this gap by proposing a new, integrated approach: the Lean Action Research (LEARN) Framework. This hybrid model is engineered to provide a structured, yet adaptive, methodology for solving practical problems within complex human systems. It achieves this by synthesizing three powerful, evidence-based methodologies into a single, cohesive framework. It integrates the rapid, experimental rigor of the Lean Startup methodology, which excels at navigating extreme uncertainty through validated learning.6 It grounds this speed in the contextual, practitioner-led inquiry of Action Research, a process that empowers those closest to the problem to become co-researchers in improving their own practice.7 Finally, it houses this entire process within the sustainable, collaborative structure of a Community of Practice (CoP), creating a social engine for continuous learning and knowledge stewardship.9 The LEARN framework is therefore presented not as a universal panacea, but as a purpose-built methodology for groups of practitioners aiming to generate meaningful, evidence-based, and scalable solutions to the wicked problems they face daily.
Roadmap of the Report
This report is structured to guide the reader from foundational understanding to practical application. Part I provides an in-depth deconstruction of the three core methodologies—Lean Startup, Action Research, and Community of Practice—to establish a nuanced understanding of their individual strengths and philosophies. Part II presents the synthesis, detailing the principles, processes, and critical concepts of the integrated LEARN framework itself. This section introduces the central operational tools of the model: the LEARN cycle, the Minimum Viable Intervention (MVI), and a sophisticated approach to measurement called Impact Accounting. Finally, Part III serves as a strategic and practical guide for implementation, offering toolkits and frameworks for a Community of Practice to cultivate, navigate, and sustain this powerful approach to collaborative innovation.
Part I: Deconstructing the Core Methodologies
Chapter 1: The Engine of Acceleration – The Lean Startup Methodology
1.1. Core Philosophy: Eliminating Uncertainty and Waste
The Lean Startup methodology, popularized by Eric Ries, is fundamentally a scientific approach to creating and managing new ventures under conditions of extreme uncertainty.6 Its primary purpose is not merely to build products, but to put a rigorous process around the development of a sustainable enterprise when the path forward is unknown.6 This approach counters the traditional method of extensive planning and long development cycles, which often results in building something that customers do not want.6
Derived from the principles of lean manufacturing pioneered by Toyota, the methodology is centered on the systematic elimination of waste.12 In this context, waste is defined as any activity, effort, or resource that does not create value for the end customer.14 The philosophy is not simply about “failing fast, failing cheap” or spending less money; it is about working smarter by focusing all efforts on answering the most critical questions facing a new venture: “Should this product be built?” and “Can we build a sustainable business around this set of products and services?”.6 By framing a startup as a grand experiment, the methodology provides the tools to test a vision continuously, creating order out of the inherent chaos of innovation.6
A deeper analysis reveals that the Build-Measure-Learn loop functions not merely as a development process, but as a strategic framework for managing risk. Startups operate under “extreme uncertainty,” which is synonymous with high risk.6 Given that the primary cause of failure is building products nobody wants 11, the loop’s explicit purpose is to test fundamental “leap-of-faith assumptions” with minimal resource expenditure.12 Each cycle, therefore, serves to buy information, systematically reducing uncertainty and de-risking subsequent, larger investments of time and capital. This reframes it from a “development” cycle to a “learning and de-risking” cycle.
1.2. The Build-Measure-Learn Feedback Loop: The Heart of the Methodology

The core component of the Lean Startup methodology is the Build-Measure-Learn feedback loop, a continuous cycle designed to accelerate learning.6 This loop is the fundamental activity of a startup, guiding the process of turning ideas into validated, market-ready products.11
- Build: This phase begins with an idea or a hypothesis. The team builds a Minimum Viable Product (MVP)—the simplest version of the product that allows for a full turn of the feedback loop with the least amount of effort.13 The MVP is not a smaller version of the final product; it is an experiment designed to test a core assumption.13
- Measure: Once the MVP is in the hands of early customers, the next step is to measure their behavior and gather feedback. This phase is about collecting data—both quantitative (e.g., usage rates, conversion metrics) and qualitative (e.g., customer interviews, feedback)—to see how customers actually respond.12 The focus must be on actionable metrics that provide clear signals, rather than vanity metrics that can be misleading.13
- Learn: This is the critical phase where the collected data is analyzed to generate validated learning. The team determines whether their initial hypotheses were correct. This learning then informs the crucial decision of whether to pivot (make a significant change in strategy) or persevere (continue on the current path with further refinements).6
The ultimate goal of this process is to minimize the total time through the loop.6 Some practitioners even advocate for reversing the loop’s planning sequence to Learn-Measure-Build. This alternative approach starts by first identifying the most critical learning goal, then determining how to measure it, and only then building the smallest possible experiment to get that data, ensuring that every action is maximally focused and efficient.19
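The three phases above can be modeled as a small program. The sketch below is purely illustrative; every name (`Experiment`, `success_threshold`, the 0.05 signal value) is a hypothetical stand-in, not part of the Lean Startup canon. Its one substantive point, drawn from the text, is that the success criterion is fixed *before* the experiment runs, so the Learn phase is a comparison against pre-committed evidence rather than a post-hoc judgment.

```python
from dataclasses import dataclass


@dataclass
class Experiment:
    """One turn of the Build-Measure-Learn loop; all names are illustrative."""
    hypothesis: str           # the leap-of-faith assumption under test
    success_threshold: float  # decided before the experiment, not after


def build(exp: Experiment) -> str:
    # Build the smallest artifact (MVP) that can test the hypothesis.
    return f"MVP testing: {exp.hypothesis}"


def measure(observed_signal: float) -> float:
    # In practice: collect actionable metrics from real users of the MVP.
    # Here we simply pass through an observed signal for illustration.
    return observed_signal


def learn(exp: Experiment, signal: float) -> bool:
    # Validated learning: did the evidence clear the pre-set threshold?
    return signal >= exp.success_threshold


exp = Experiment(hypothesis="people want cloud file storage",
                 success_threshold=0.05)
mvp = build(exp)
validated = learn(exp, measure(observed_signal=0.08))
```

Note that the Learn-Measure-Build reversal mentioned above changes only the planning order, not this runtime order: the threshold and measurement plan are written first, and `build` produces only what that plan requires.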
1.3. Validated Learning: The True Unit of Progress
In traditional management, progress is measured by the production of goods or the hitting of milestones. The Lean Startup proposes a different unit of progress: validated learning.6 Validated learning is the rigorous, empirical process of demonstrating that a team has discovered valuable truths about a venture’s present and future prospects.14 It is a scientific method for tracking progress in a context of extreme uncertainty, where traditional metrics are often meaningless.6
Startups exist not just to make things or serve customers, but to learn how to build a sustainable business.6 Every initiative, every feature, and every marketing campaign should be viewed as an experiment designed to test a fundamental business hypothesis, often referred to as a “leap-of-faith assumption”.12 For example, Dropbox famously used a simple explainer video as an MVP to test the hypothesis that people wanted a cloud-based file storage solution, generating 75,000 sign-ups overnight and validating their core idea before writing much of the code.11 This learning—that the market demand was real—was far more valuable at that stage than a perfectly engineered product. By embracing validated learning, entrepreneurs can significantly shorten the development process, adapting their plans incrementally based on real evidence rather than waiting months for a beta launch to discover they were on the wrong path.6
1.4. Innovation Accounting and Actionable Metrics: Measuring What Matters
To support validated learning, the Lean Startup introduces Innovation Accounting, a new form of accounting tailored for startups. It is designed to measure progress and create learning milestones when traditional financial metrics like revenue and profit are zero or negligible.6 This framework holds entrepreneurs accountable to a different, more appropriate set of standards focused on learning and de-risking the business model.6
A critical component of innovation accounting is the disciplined distinction between actionable metrics and vanity metrics.13
- Vanity Metrics are measurements that look good on paper but often obscure the truth and do not provide a basis for action. Examples include total number of downloads or raw page views.13 These metrics can be dangerously misleading because they give the “rosiest picture possible” without reflecting the key drivers of a sustainable business.13 A high number of new users per day, for instance, is a vanity metric if the cost to acquire each user far exceeds the revenue they generate.13
- Actionable Metrics, in contrast, demonstrate a clear cause-and-effect relationship and lead to informed business decisions.13 According to Ries, they should follow the “three A’s”: they must be Actionable (show clear cause and effect), Accessible (be understood by everyone), and Auditable (the data is credible and can be verified).22 For example, tracking the conversion rate of a specific user cohort that was exposed to a new feature is an actionable metric, as it directly links the feature (cause) to user behavior (effect). A powerful tool for focusing on what matters is the “One Metric That Matters” (OMTM), where a team concentrates on improving a single, key metric that aligns with its current stage and goals, as Dropbox did by focusing on user referrals to drive viral growth.11
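The arithmetic behind these two distinctions can be made concrete. The sketch below uses entirely hypothetical numbers and function names; it shows why a cohort comparison is actionable (it isolates cause and effect) while a raw signup count is not (it ignores unit economics).

```python
def cohort_conversion_rate(exposed_users: int, converted_users: int) -> float:
    """Actionable: links a specific cohort (cause) to behavior (effect)."""
    return converted_users / exposed_users if exposed_users else 0.0


def unit_economics_healthy(acquisition_cost: float,
                           revenue_per_user: float) -> bool:
    """A raw signup count is a vanity metric if each new user
    costs more to acquire than they return in revenue."""
    return revenue_per_user > acquisition_cost


# Hypothetical numbers for illustration only.
feature_cohort = cohort_conversion_rate(exposed_users=400, converted_users=48)
control_cohort = cohort_conversion_rate(exposed_users=400, converted_users=32)
lift = feature_cohort - control_cohort  # did the new feature cause a change?

# Many daily signups, but each costs $6.00 to acquire and returns $4.50:
healthy = unit_economics_healthy(acquisition_cost=6.0, revenue_per_user=4.5)
```

In this toy example the feature cohort converts at 12% against the control's 8%, an actionable signal, while the impressive-looking signup volume fails the unit-economics check and should be treated as a vanity metric.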
1.5. The Pivot or Persevere Decision
Every cycle through the Build-Measure-Learn loop culminates in a critical decision point: to pivot or persevere.11 This is the moment of accountability where the team must look at the evidence gathered and decide if they are making sufficient progress to justify staying the course.21
- Persevere: If the data from the experiments validates the core hypotheses and the actionable metrics are moving in the right direction, the team should persevere. This means continuing with the current strategy while making small optimizations and refinements in subsequent loops.12
- Pivot: If the experiments consistently invalidate the hypotheses and the team is not making progress toward a sustainable business model, a pivot is necessary. A pivot is a “structured course correction designed to test a new fundamental hypothesis about the product, strategy, and engine of growth”.12 It is not an admission of failure but a courageous, strategic decision to change direction based on what has been learned. A pivot maintains the original vision but changes the strategy to achieve it.27 Ries outlines several types of pivots, such as a “zoom-in pivot,” where a single feature becomes the entire product, or a “customer segment pivot,” where the product is repositioned for a different audience.11
This decision must be data-driven and based on the validated learning from the feedback loop, not on emotion or gut feeling.21 Scheduling regular “pivot or persevere” meetings forces the team to confront the data and make these hard choices, ensuring that the venture remains agile and responsive to reality.25
The principle that “Entrepreneurs are Everywhere” is a radical call for democratizing innovation within any institution, not just tech startups.6 This implies that the methodology is inherently suited for practitioner-led change within established organizations. Ries defines an entrepreneur as anyone operating within a “human institution designed to create a new product or service under conditions of extreme uncertainty” 6, a definition untethered to company size or sector. He explicitly includes managers of new initiatives within large companies as entrepreneurs.22 Case studies confirm the application of Lean principles in diverse environments like GE, government, and healthcare.29 This directly supports the idea that a teacher trying a new pedagogy, a nurse improving a patient care process, or a community organizer testing a new outreach method are all acting as entrepreneurs. The Lean Startup framework, therefore, provides a pre-existing language and process for these “intrapreneurs,” making it a natural fit for practitioner-led action research.
Chapter 2: The Engine of Inquiry – The Action Research Cycle
2.1. Core Philosophy: Learning by Doing and Improving Practice
Action Research is a family of research methodologies that unites two essential aims: to take action to solve an immediate, practical problem and to study that action to generate knowledge.32 It is fundamentally a process of “learning by doing”.7 A group of practitioners, facing a concrete challenge in their own context, systematically investigates the problem, devises and implements an intervention, and evaluates its effectiveness to improve their practice.8 Unlike traditional research that often seeks to produce generalizable theoretical knowledge, the primary goal of action research is to generate practical knowledge that is useful to people in the everyday conduct of their lives.8
A defining characteristic of this approach is its participatory and collaborative nature. It actively seeks to break down the hierarchy between researcher and subject, transforming the people involved—such as teachers, nurses, or community members—into co-researchers who investigate their own situations.7 This philosophy is rooted in the belief that people learn best and are more willing to apply what they have learned when they are active participants in the knowledge creation process.7 It bridges the often-cited gap between theory and practice by ensuring that research is not something done to practitioners, but something done by and for them.32
2.2. The Plan-Act-Observe-Reflect Cycle: A Spiral of Inquiry

The operational core of action research is a cyclical, or spiral, process.36 While models vary slightly, they are most commonly represented by the four-stage cycle developed by Kurt Lewin and later refined by Stephen Kemmis and others: Plan, Act, Observe, and Reflect.7 This cycle is not a rigid, linear path but an iterative spiral, where each loop informs the next, leading to continuous refinement and deeper understanding.33
- Plan: The cycle begins with identifying and diagnosing a practical problem. Practitioners collectively reflect on the issue and develop a plan of action or an intervention designed to bring about improvement.33 This stage involves clarifying the goals and anticipating the steps needed to address the challenge.
- Act: The team implements the planned intervention in their real-world setting.33 This is the “doing” phase, where the theoretical plan is put into practice. Simultaneously, the process of data collection begins.
- Observe: During and after the action, the practitioner-researchers systematically observe the effects and consequences of the intervention.33 This involves collecting data through various mechanisms, such as questionnaires, interviews, diaries, or case studies, to gather evidence about the change.8
- Reflect: In this crucial stage, the team analyzes the collected data and reflects critically on the outcomes.33 They evaluate what worked, what did not, and why. This reflection generates new insights and learning, which then forms the basis for planning the next cycle of the spiral.36
This process is often described as inherently “messy” and flexible, valuing responsiveness to the situation over rigid adherence to a pre-set plan.8 The emergent nature of the research allows the process to adapt as the practitioners’ understanding of the problem deepens with each cycle.36
The cyclical processes of Action Research (Plan-Act-Observe-Reflect) and Lean Startup (Build-Measure-Learn) are methodological cousins, but they operate with different “clockspeeds” and levels of formality, presenting both a challenge and an opportunity for integration. Both are fundamentally iterative learning loops.6 However, Lean Startup emphasizes maximum acceleration and speed, with some tech companies deploying new code up to 50 times a day.6 Its cycles are designed to be rapid. In contrast, Action Research cycles are often more deliberate, extending over longer periods to allow for deep reflection on complex social dynamics.36 The process is valued for being emergent and sometimes “messy”.8 The opportunity in a hybrid model is to use the Lean mindset to inject urgency and a focus on rapid testing into the Action Research cycle, while using the Action Research mindset to ensure the “learning” is deep, reflective, and contextually rich, not just a superficial data point. This creates a powerful, variable-speed engine for change.
2.3. Guiding Principles of Action Research
Beyond its cyclical process, action research is defined by a set of core principles that shape its character and purpose:
- Reflexive and Dialectical Critique: This principle requires participants to reflect on their own implicit assumptions, biases, and interpretations.7 It acknowledges that in a social setting, truth is not absolute but is consensually validated and constructed through dialogue. This critical self-reflection is essential for generating theoretical insights from practical accounts.7
- Collaborative Resource: Action research democratizes the research process by presupposing that each participant’s ideas are equally significant as potential resources for analysis.7 It strives to avoid giving undue weight to ideas based on the prior status of the person who holds them, ensuring that insights can emerge from any member of the group.
- Context-Based Knowledge: Knowledge in action research is not abstract or decontextualized; it is created through action and application within a specific, lived situation.8 The findings are not intended to be broadly generalizable in a statistical sense, but they are deeply relevant to the context in which they were generated and can offer valuable, transferable insights to others in similar situations.34
- Theory and Practice Transformation: Action research embodies a continuous, symbiotic relationship between theory and practice. People’s actions are based on implicit theories, and the observed results of those actions serve to enhance and refine theoretical knowledge in a continuous transformation.7
2.4. Connection to Improvement Science
Action research shares a strong conceptual and methodological kinship with the field of Improvement Science. Improvement science has emerged, particularly in healthcare and education, to provide a rigorous framework for research focused on determining which improvement strategies are effective in complex systems.39 Its primary goal is to ensure that quality improvement efforts are themselves based on evidence.39
The most direct link between the two fields is the shared use of iterative cycles for testing changes. The Plan-Do-Study-Act (PDSA) cycle, a cornerstone of improvement science, is functionally identical to the action research cycle.41 This model guides practitioners to plan a change, do (implement) it, study the results, and act on what is learned to refine the change.43 This parallel reinforces the validity of using such cycles as a systematic, disciplined approach to practitioner-led inquiry and evidence-based improvement. Both fields are explicitly designed to accelerate “learning-by-doing” in a problem-centered way.41
The principle of “collaborative resource” in action research, which positions all participants as co-researchers, provides the necessary philosophical and ethical foundation for applying a Lean-style experimental approach in sensitive social contexts, such as with vulnerable populations.7 While a purely Lean approach might be perceived as extractive, focusing on the “customer” primarily as a source of data 45, action research—and particularly its variant, Participatory Action Research (PAR)—is built on principles of power-sharing, mutual trust, and co-learning.32 It recognizes community members as experts in their own lived experiences. By integrating this ethos, the experimental loop is transformed. The community does not merely provide feedback on a pre-designed intervention; they become partners in designing the experiment, defining the metrics of success, and co-interpreting the results. This transforms the Lean feedback loop from a transactional data-gathering exercise into a deeply collaborative and ethically robust inquiry process.
Chapter 3: The Engine of Collaboration – The Community of Practice (CoP)
3.1. Defining a Community of Practice: More Than a Team
A Community of Practice (CoP) is a concept that describes a unique and powerful form of collective learning. Etienne Wenger, a key theorist in this field, defines a CoP as a group of people “who share a concern or a passion for something they do and learn how to do it better as they interact regularly”.10 This is more than just a network of connections or a formal project team. It is a social learning system where practitioners are bound together by a shared interest and a commitment to deepen their knowledge and expertise in a particular domain.9
In this view, learning is not an individual acquisition of abstract knowledge but a process of social participation.49 Individuals learn by actively engaging in the practices of a community, which in turn shapes their identity within that group. The CoP acts as a “living curriculum” for its members, particularly in contexts like apprenticeships, where novices learn at the periphery and gradually become more central to the community’s practice.10 These communities can be formal or informal, small or large, and can exist within or across organizations, but they are all brought together by mutual engagement in a shared endeavor.10
3.2. The Three Pillars of a CoP
For a group to be considered a Community of Practice, it must possess three crucial and interdependent characteristics: the Domain, the Community, and the Practice.9 It is the combination and parallel development of these three elements that constitutes a CoP.50
- The Domain: This is the shared domain of interest that gives the community its identity and defines its purpose. It is not just a casual interest but implies a commitment and a level of shared competence that distinguishes members from non-members.9 The domain provides the common ground, guides the learning, and gives meaning to the members’ actions.52 For example, a group of engineers working on similar problems or a group of nurses sharing insights on patient care are defined by their respective domains.9
- The Community: This refers to the social fabric of the CoP. In pursuing their interest in the domain, members engage in joint activities, discussions, and information sharing. They build relationships that enable them to learn from one another.9 This mutual engagement is critical; simply having the same job title or being on the same distribution list does not create a community. Members must interact and learn together, fostering a social environment that encourages a willingness to share ideas, ask for help, and build trust.9
- The Practice: Members of a CoP are practitioners. Over time, through sustained interaction, they develop a shared repertoire of resources. This includes experiences, stories, tools, routines, and established ways of addressing recurring problems.10 This shared practice is the specific focus around which the community develops and maintains its collective knowledge. It is the tangible and intangible set of resources that carries the accumulated wisdom of the group, representing not just technical skill but shared ways of doing and approaching things.10
A Community of Practice is the ideal organizational “container” for a Lean Action Research process because its structure is inherently designed for the long-term, sustained interaction and knowledge-building required to move beyond single experiments. Both Lean Startup and Action Research are iterative methodologies; a single cycle is rarely sufficient, and they both depend on the principle of perseverance to achieve meaningful results.25 Traditional project teams, however, are often temporary structures that dissolve upon project completion, leading to a significant loss of institutional memory and momentum. In contrast, a CoP is defined by its sustained interaction over time around a shared domain.10 Its fundamental purpose is ongoing learning and the development of a “shared repertoire” of practice.51 This makes the CoP the perfect permanent social infrastructure to house multiple LEARN cycles. The “practice” of the CoP becomes the continuous process of running these experiments, and the “shared repertoire” becomes the accumulated, validated learnings that result. This structure effectively solves the sustainability problem that plagues many one-off innovation projects.
3.3. Mapping How the AR Cycle Animates the CoP

The integration of the AR and CoP frameworks can be visualized by mapping the phases of the Action Research cycle directly onto the three pillars of a Community of Practice. This mapping reveals how the AR cycle animates the CoP, turning it into a dynamic system for continuous improvement.
- PLAN: The cycle begins within the Community. Members collaboratively identify a pressing problem or a compelling question that is relevant to their shared Domain. This planning phase involves collective sense-making, reviewing the existing knowledge contained within their shared Practice, and developing a joint plan for action. This process of working together on a shared goal inherently strengthens the Community pillar.
- ACT: Members of the Community then implement the planned intervention or change within their own contexts. This coordinated action is a direct contribution to the evolution of the group’s shared Practice.
- OBSERVE: The Community works together to systematically gather data on the effects of their action. This observation is focused on understanding the action’s impact on the Practice and, by extension, on the Domain itself.
- REFLECT: The Community reconvenes to engage in collective reflection. This is the heart of the learning process, where members analyze the data, interpret the findings, and engage in the kind of reflexive and dialectical critique central to AR. The new insights and knowledge generated during this phase are then explicitly integrated back into the shared Practice—for example, as a new tool, a refined process, a documented best practice, or a compelling story. This reflection may also lead the community to re-evaluate and redefine its Domain, setting the stage for the next spiral of the cycle.
This integrated process directly addresses the primary vulnerability of a CoP: stagnation. The iterative, data-driven nature of the AR cycle provides a formal, structured process for a CoP to critically evaluate and evolve its own “Practice”. The mandatory ‘Reflect’ phase, based on empirical evidence gathered during the ‘Observe’ phase, compels the community to move beyond anecdote and shared assumptions. It institutionalizes the AR principle of ‘Reflexive Critique’ within the CoP’s operations. This systematic, evidence-based challenge prevents the shared ‘Practice’ from becoming rigid dogma and ensures it remains a living, evolving body of knowledge. In this way, the Action Research cycle serves as the perfect antidote to a CoP’s potential for defensiveness and inertia.
3.4. The Human Element: Mentoring and Critical Friendships
The success of a community of practice hinges on the quality of the human relationships within it. Two concepts are particularly vital: critical friendship and non-hierarchical mentoring. A “critical friend” is a trusted colleague who provides both support and challenge, asking provocative questions, offering different perspectives, and providing feedback within a relationship of mutual trust. Within a CoP, members can serve as critical friends for one another, ensuring that inquiry remains rigorous and reflective.
Similarly, mentoring within a CoP is best understood as a non-hierarchical and reciprocal process of co-learning. Rather than a top-down transfer of knowledge from expert to novice, mentoring becomes a collaborative activity where all members, regardless of experience, are positioned as “co-constructors of knowledge”.37 This approach aligns perfectly with the AR principle of “Collaborative Resource,” which values all participants’ contributions equally and fosters a culture of mutual respect and learning.
3.5. How CoPs Function and Evolve
Communities of Practice are not static entities; they are living, dynamic systems that evolve over time. Cultivating a successful CoP requires understanding and supporting this natural evolution rather than imposing a rigid, pre-defined structure.54
- Designing for Evolution: Because CoPs are organic, designing them is more akin to shepherding their development than creating them from a blueprint.54 The design elements should act as catalysts. It is often best to start with a simple structure—such as regular meetings—and then introduce other elements like a website, shared documents, or specific projects as the community engages and builds relationships.54 The design must be flexible, as the community’s focus may shift as new members join or as the external environment changes.55
- Levels of Participation: A healthy CoP accommodates and values different levels of participation.49 Wenger identifies three main levels: a core group that participates intensely and often takes on leadership roles; an active group that participates regularly but less intensely; and a larger peripheral group of more passive participants who still learn from their observation and access to the community’s resources. All three levels are essential for the community’s vitality and reach.49
- Value Creation: The ultimate success of a CoP depends on its ability to generate enough value, relevance, and excitement to attract and engage its members, especially since participation is often voluntary.54 This value is created through a variety of activities, such as collaborative problem-solving (“Can we brainstorm some ideas on this?”), sharing experiences (“Has anyone dealt with this situation before?”), reusing assets, documenting projects, and building collective confidence.9
The three pillars of a CoP—Domain, Community, and Practice—provide a direct and powerful mapping for the key elements of the proposed hybrid LEARN framework. The Domain clearly defines the “wicked problem” that the group is committed to solving, setting the boundaries for their work and providing a shared identity.9 This aligns perfectly with the “Plan” phase of Action Research and the initial “Problem/Solution Fit” exploration in Lean Startup. The Community pillar creates the essential collaborative and trust-based environment.9 This social fabric is a prerequisite for the participatory ethos of Action Research and the “fail forward” culture of psychological safety needed for the rapid experimentation of Lean Startup.5 Finally, the Practice is the tangible, iterative work of the LEARN cycle itself. CoP members are practitioners who are not just discussing the problem but are actively developing, testing, and refining solutions, thereby building their shared repertoire of tools, stories, and methods.10 This ensures the CoP functions as an active “do-tank,” not just a passive “think-tank.”
Part II: Synthesis – The Lean Action Research (LEARN) Framework
Chapter 4: Forging the Hybrid: Principles of the LEARN Framework
4.1. From Three Methodologies to One Integrated Framework
The preceding chapters have deconstructed the individual strengths of Lean Startup, Action Research, and Communities of Practice. While each is powerful in its own right, their true potential for solving complex practical problems is unlocked when they are synthesized into a single, integrated methodology. This report proposes the Lean Action Research (LEARN) Framework as that synthesis. The LEARN framework is designed to create a system that is simultaneously fast and reflective, data-driven and human-centered, experimental and sustainable. It is not merely a collection of tools from each parent methodology but a coherent new model with its own guiding principles and operational logic.
This type of synthesis is not without precedent. Hybrid models are emerging in various fields as practitioners seek to combine the best features of different approaches. In healthcare, for example, “Lean Action Research Learning Collaboratives” have been formed to use lean as an operating system for continuous improvement, supported by objective evaluation and feedback loops.30 Researchers in manufacturing and education have also explored the integration of lean principles with action research methodologies to drive change.56 Furthermore, the Lean Startup itself has been combined with other frameworks like Design Thinking and Agile to enhance new product development processes.59 The LEARN framework builds on this trend, creating a purpose-built model for practitioner-led innovation within a collaborative community structure.
4.2. The Four Core Principles of LEARN
The LEARN framework is built upon four foundational principles that integrate the core philosophies of its parent methodologies. These principles guide the mindset and actions of the Community of Practice as it engages in its work.
- Practitioner-Led Experimentation: The work of the LEARN framework is driven by the practitioners who are closest to the problem. They are not subjects of research but are empowered as co-researchers, innovators, and agents of change. This principle grounds the framework in the authentic, lived reality of practice. It draws directly from the core philosophy of Action Research, which champions participatory, practitioner-led inquiry 7, and fuses it with the Lean Startup’s ethos of “Entrepreneurs are Everywhere,” which recognizes that innovative individuals can be found in any role within any institution.22
- Rapid, Iterative Inquiry: The framework prioritizes speed, learning, and adaptation through short, iterative cycles. This principle addresses the need for agility in complex and uncertain environments. It injects the urgency and efficiency of the Lean Startup’s Build-Measure-Learn loop 6 and the adaptive nature of Agile methodologies 62 into the traditionally more deliberate and paced Action Research cycle. The goal is to learn and adjust quickly, avoiding long periods of investment in ineffective strategies.
- Contextual Validation: Within the LEARN framework, success is not defined by universal, one-size-fits-all metrics. Instead, it is defined by validated evidence of positive change within the specific context of the community’s practice. This principle adapts the concept of “Validated Learning” from Lean Startup, which demands rigorous, empirical evidence of progress.6 It refines this concept through the lens of Action Research’s emphasis on context-specificity 8 and the idea of “social validity,” which assesses an intervention’s significance and acceptability to the community members themselves.64
- Collective Sense-Making and Knowledge Stewardship: Learning is understood as a social and collaborative process, not an individual one. The framework is housed within a Community of Practice to ensure that insights generated from experiments are not lost. The CoP structure provides the forum for members to collectively interpret data, make sense of findings, and document what they have learned. This process transforms individual insights into a durable, shared repertoire of practice, effectively making the community a steward of its own evolving knowledge.9
A fundamental contribution of the LEARN framework is its redefinition of “failure.” In many public and social sector organizations, failure is culturally unacceptable and often carries significant professional risk, particularly when funders expect guaranteed outcomes.5 This creates a powerful disincentive for the very experimentation that is necessary for innovation. The Lean Startup, in contrast, explicitly embraces failure as an essential opportunity to learn and pivot.12 This presents a significant cultural mismatch. The LEARN framework bridges this gap. The “Reflect” stage of Action Research already provides a structured process for learning from what did not work.33 By integrating this with the Lean mindset within the trusting and collaborative environment of a CoP, the framework creates a “safe-to-fail” space. The use of low-cost, rapid Minimum Viable Interventions (MVIs) lowers the stakes of any single experiment. Consequently, a negative result is reframed. It is no longer a “failed project” but a “successful experiment that invalidated a hypothesis”—a crucial form of validated learning that represents progress. This cultural reframing is essential for unlocking genuine, practitioner-led innovation.
Chapter 5: The LEARN Cycle: A New Iterative Process
5.1. Merging the Cycles: From B-M-L and P-A-O-R to a Unified Loop

The operational core of the LEARN framework is its integrated iterative process, the LEARN Cycle. This cycle synthesizes the key stages of its parent methodologies—Lean Startup’s Build-Measure-Learn (B-M-L) loop and Action Research’s Plan-Act-Observe-Reflect (P-A-O-R) cycle—into a new, four-stage loop that is both rigorous and agile. The proposed cycle is:
Frame → Test → Assess → Synthesize
This structure provides a clear, repeatable process for the Community of Practice to move from problem identification to validated learning. It honors the intent of each parent methodology: the upfront planning and hypothesizing of Action Research, the rapid building and acting shared by both, the disciplined measurement of Lean Startup, the deep observation of Action Research, and the reflective learning central to both.
5.2. Detailed Breakdown of the Four Stages
Each stage of the LEARN cycle has a distinct purpose, a set of guiding activities, and a clear output that feeds into the next stage.
- Stage 1: FRAME (Plan & Hypothesize)
- Purpose: To collectively define the specific problem to be addressed in the cycle, clarify the group’s underlying assumptions and theories of change, and formulate a precise, falsifiable hypothesis that can be empirically tested.
- Activities: The CoP begins with deep dialogue and reflexive critique to ensure a shared understanding of the problem’s context and root causes, potentially using techniques like the “Five Whys”.6 The group then maps out its assumptions about a potential solution, using tools like a simplified Lean Canvas or Business Model Canvas to make these “leap-of-faith” assumptions explicit.22 From this, they define a clear, measurable learning goal for the cycle and formulate a specific, testable hypothesis.20
- Output: A documented, falsifiable hypothesis and a clear plan for the Minimum Viable Intervention (MVI) that will be used to test it.
- Stage 2: TEST (Build & Act)
- Purpose: To build and deploy the planned intervention in the real-world context of the practitioners’ work.
- Activities: This stage merges the “Build” phase of Lean Startup with the “Act” phase of Action Research. The CoP develops and implements the MVI they designed in the FRAME stage.12 The emphasis is on speed and resource efficiency—doing the smallest thing possible to generate a valid signal and begin the learning process.67 This could be a new classroom routine, a modified patient checklist, or a new community outreach script.
- Output: The MVI is deployed and actively interacting with the target population, process, or system.
- Stage 3: ASSESS (Measure & Observe)
- Purpose: To systematically and rigorously collect both quantitative and qualitative data regarding the MVI’s performance, effects, and reception.
- Activities: This stage is a powerful fusion of Lean’s “Measure” phase and Action Research’s “Observe” phase. The team collects pre-defined actionable metrics to track behavioral changes and test the quantitative aspects of the hypothesis.13 Simultaneously, they gather rich qualitative data through methods such as direct observation, interviews, participant diaries, and focus groups to understand the context and the “why” behind the numbers.8 This mixed-methods approach ensures a holistic understanding of the intervention’s impact.
- Output: A rich dataset containing both quantitative metrics and qualitative evidence.
- Stage 4: SYNTHESIZE (Learn & Reflect)
- Purpose: To collectively analyze the full dataset, generate validated learning by comparing the results to the initial hypothesis, and make a structured decision about the path forward.
- Activities: This final stage combines Lean’s “Learn” with Action Research’s “Reflect.” The CoP comes together to analyze the data from the ASSESS stage. They engage in a structured dialogue to interpret the findings, discuss surprises and unintended consequences, and formally determine whether their hypothesis was validated or invalidated.7 This collective sense-making process culminates in the critical “Pivot or Persevere” decision.11
- Output: A documented “validated learning” (e.g., “Our hypothesis that X would lead to Y was invalidated; we learned that Z is the actual barrier”), a clear decision to pivot or persevere, and the insights needed to begin the next FRAME stage.
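The hand-offs between the four stages can be sketched as a single pass through the cycle. This is a minimal illustrative sketch: the function names and the `Decision` labels are my own shorthand, not part of the framework’s formal vocabulary, and each callable stands in for the collaborative work the CoP actually performs at that stage.

```python
from enum import Enum

class Decision(Enum):
    PERSEVERE = "persevere"  # refine the validated intervention in the next cycle
    PIVOT = "pivot"          # return to FRAME with a new hypothesis and a new MVI
    KILL = "kill"            # archive the validated learning; abandon this avenue

LEARN_STAGES = ["FRAME", "TEST", "ASSESS", "SYNTHESIZE"]

def run_learn_cycle(frame, test, assess, synthesize):
    """Run one pass through the LEARN cycle.

    Each argument is a callable supplied by the CoP:
      frame()                         -> a documented, falsifiable hypothesis
      test(hypothesis)                -> the deployed MVI
      assess(mvi)                     -> dataset (quantitative + qualitative)
      synthesize(hypothesis, dataset) -> (Decision, validated_learning)
    """
    hypothesis = frame()
    mvi = test(hypothesis)
    dataset = assess(mvi)
    decision, learning = synthesize(hypothesis, dataset)
    return decision, learning
```

In use, the output of each cycle (the decision plus the validated learning) seeds the next FRAME stage, which is what makes the loop iterative rather than linear.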
The LEARN cycle is engineered to be a “double-loop learning” engine.69 Single-loop learning involves correcting errors to better achieve a pre-defined goal, asking, “Are we doing things right?” Double-loop learning goes deeper, questioning the goal itself and the underlying assumptions of the strategy, asking, “Are we doing the right things?”.69 The FRAME stage of the LEARN cycle forces the team to make its assumptions and theory of change explicit. The SYNTHESIZE stage then compels them to confront whether those assumptions were correct based on real-world evidence from the ASSESS stage. The “Pivot” decision is the ultimate expression of double-loop learning—it is a formal acknowledgment that the initial theory was flawed and requires fundamental rethinking.12 Therefore, the LEARN cycle is not merely a project management tool for improving an intervention; it is a strategic learning process that enables the Community of Practice to continuously challenge, refine, and improve its own mental models of the complex problem it seeks to solve.
Chapter 6: The Minimum Viable Intervention (MVI): From Product to Practice
6.1. Adapting the MVP for Social and Practical Problems
The Minimum Viable Product (MVP) is a cornerstone of the Lean Startup methodology. It is defined as that version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort.11 The purpose of an MVP is not to be a feature-reduced final product, but to be a tool for testing a fundamental hypothesis as quickly and cheaply as possible.19
For the LEARN framework, which operates in social, educational, and other practical domains rather than commercial markets, this concept must be adapted. We introduce the Minimum Viable Intervention (MVI) as the core experimental tool of the LEARN cycle. The MVI is defined as:
The smallest, fastest, and most resource-efficient action, practice, tool, or programmatic change that can be implemented by a practitioner or group of practitioners to rigorously test a specific hypothesis about improving a practical problem.
This concept draws not only on the MVP but also on related ideas from other fields, such as the “Minimal Intervention Needed to Produce Change” (MINC) in healthcare research, which seeks the lowest level of intervention intensity required to achieve a clinically significant improvement.70 It also aligns with concepts like “Minimum Viable Research” (MVR) and “Minimum Viable Task” (MVT), which emphasize breaking down large efforts into the smallest possible actions to gain feedback and overcome inertia.67
6.2. Characteristics and Examples of MVIs
An MVI is fundamentally a learning tool, not a delivery mechanism. Its primary job is to generate data to test a hypothesis from the FRAME stage of the LEARN cycle. It should be intentionally incomplete, focusing only on what is necessary for the experiment.
Examples of MVIs across different sectors illustrate this principle:
- Education: A CoP of teachers hypothesizes that a new visual method for explaining fractions will improve student comprehension.
- Traditional Approach: Develop a full 6-week curriculum unit, create all materials, train teachers, and roll it out.
- MVI Approach: Create a single 15-minute lesson plan using the new visual method. One teacher delivers it to one class. The hypothesis is tested by comparing pre- and post-lesson quiz scores and conducting brief student interviews immediately after.66 The effort is minimal, but the learning is rapid and direct.
- Healthcare: A hospital CoP hypothesizes that a new communication script during patient handoffs will reduce information errors.
- Traditional Approach: Form a committee, design a new hospital-wide protocol, create training materials, and schedule mandatory training sessions for all nursing staff.
- MVI Approach: The CoP drafts the script on a single-page checklist. Two nurses on one unit agree to use it for all their handoffs for a single shift. They track errors and provide qualitative feedback at the end of the shift.73 The test is contained, fast, and provides immediate, actionable data.
- Community Development: A nonprofit CoP hypothesizes that community residents are not attending financial literacy workshops because the messaging is intimidating.
- Traditional Approach: Redesign the entire workshop series, rebrand it, and launch a new marketing campaign.
- MVI Approach: Create two different landing pages (an A/B test), each with different messaging—one technical, one focused on personal stories. Drive a small amount of traffic to each via social media ads and measure the click-through and sign-up rates.19 This is analogous to the famous Zappos MVP, where the founder took photos of shoes in local stores and posted them online to test the hypothesis that people would buy shoes online, without building a single warehouse or buying any inventory.13
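The education example’s quick test reduces to a simple paired comparison of matched pre- and post-lesson quiz scores. The sketch below shows that arithmetic; the scores and the 0–10 scale are invented for illustration, not drawn from any real classroom.

```python
def mean(xs):
    return sum(xs) / len(xs)

def pre_post_gain(pre_scores, post_scores):
    """Average per-student gain on matched pre/post quizzes."""
    assert len(pre_scores) == len(post_scores), "scores must be paired by student"
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return mean(gains)

# Invented data: one 15-minute lesson, six students, quizzes scored 0-10.
pre  = [4, 5, 3, 6, 4, 5]
post = [6, 7, 5, 7, 5, 8]
gain = pre_post_gain(pre, post)
```

A positive average gain alone does not validate the hypothesis; per the ASSESS stage, it would be read alongside the brief student interviews before any decision is made.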
The concept of the MVI forces a profound cognitive shift away from “solutioneering” and toward “hypothesis-testing.” Practitioners across many fields are professionally trained and culturally conditioned to deliver complete, polished, and high-quality solutions. There is an inherent pressure to avoid releasing something that feels “unfinished” or “half-baked.” The MVP/MVI concept can therefore seem counter-intuitive and even professionally risky, as it is intentionally incomplete.13 Its goal is not to deliver value in the traditional sense, but to learn as quickly and efficiently as possible.19
The LEARN framework makes this shift more palatable by framing the MVI explicitly as an “experiment” or a “test,” language that is central to both Action Research and Improvement Science. The MVI is not positioned as a “poorly delivered service” but as a “rigorous test of a change idea.” This crucial reframing moves the Community of Practice’s focus from the question, “Are we delivering a perfect program?” to the more strategic question, “Are we learning what we need to learn to build an effective program?” This process de-risks innovation, lowers the psychological barrier to experimentation, and empowers practitioners to test new ideas without fear of failure.
Chapter 7: Impact Accounting: Beyond Vanity Metrics in Social Change
7.1. The Problem with Business Metrics in the Social Sector
One of the most significant challenges in adapting business-inspired methodologies to the social, public, and educational sectors is measurement. In a commercial context, the ultimate metric of success is profit. In the social sector, however, there is often no direct “price signal” to indicate value, and the goals are far more complex and long-term.29 Applying business metrics without careful adaptation can lead to dysfunctional incentives, encouraging organizations to focus on what is easily measurable rather than what is truly meaningful.77
This leads to the “Vanity Metrics Trap” in a social context. Organizations may be tempted to report on easily countable but superficial numbers that create a misleading sense of success.13 Examples include the number of people who attended a workshop, the number of website hits an awareness campaign received, or the number of brochures distributed.80 These metrics are seductive because they are easy to collect and often produce large, impressive-looking numbers. However, they are fundamentally flawed because they measure outputs (the activities performed) rather than outcomes (the actual change that occurred as a result).81 They fail to answer the critical question: did the intervention make a difference?
7.2. Introducing “Impact Accounting”: A Hybrid Measurement Framework
To address this challenge, the LEARN framework proposes a sophisticated measurement system called Impact Accounting. This approach adapts the principles of Lean Startup’s “Innovation Accounting” 6 for a non-commercial context. Its purpose is to provide the Community of Practice with a holistic, evidence-based view of its progress by integrating three distinct types of data for every LEARN cycle.
- Quantitative Actionable Metrics (from Lean Startup): These are numerical measures that are directly tied to the MVI’s hypothesis and track a change in behavior. They must demonstrate a clear cause-and-effect link. Instead of “number of attendees,” an actionable metric would be “the percentage of first-time attendees who return for a second session,” as this measures the program’s ability to engage participants. These metrics provide hard, objective evidence of whether the intervention is having its intended effect.13
- Qualitative Evidence (from Action Research): Numbers alone cannot explain why a change is or is not happening. This component involves the systematic collection of rich, contextual data through methods like interviews, direct observation, focus groups, and participant stories.8 This qualitative evidence is essential for interpreting the quantitative metrics and understanding the lived experience of those affected by the intervention. It answers the “why” behind the “what.”
- Social Validity Assessment (from Social Science): This component, drawn from fields like applied behavior analysis, directly assesses the social significance of the intervention from the perspective of the beneficiaries and key stakeholders.64 It seeks to answer three questions: Are the goals of the intervention important and relevant to the community? Are the procedures and methods used acceptable and respectful? Are the outcomes produced meaningful and valuable to those who experience them?.64 This ensures that the intervention is not just technically effective but also appropriate and welcome.
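One minimal way to keep the three evidence types together is a single record per LEARN cycle. The sketch below assumes the CoP stores its Impact Accounting data in roughly this shape; every field name is illustrative rather than prescribed by the framework.

```python
from dataclasses import dataclass, field

@dataclass
class ImpactRecord:
    """One LEARN cycle's Impact Accounting evidence (field names are illustrative)."""
    hypothesis: str
    # Quantitative actionable metrics tied to the hypothesis,
    # e.g. {"return_rate": 0.42} rather than a raw attendance count.
    actionable_metrics: dict
    # Qualitative evidence: interview themes, observations, participant stories.
    qualitative_evidence: list = field(default_factory=list)
    # Social validity: stakeholder assessments of goals, procedures, outcomes.
    social_validity: dict = field(default_factory=dict)

    def is_complete(self) -> bool:
        """A cycle's evidence base should include all three data types."""
        return bool(self.actionable_metrics
                    and self.qualitative_evidence
                    and self.social_validity)
```

The `is_complete` check encodes the chapter’s core discipline: a cycle reported on metrics alone, without qualitative evidence and a social validity assessment, is not yet a full Impact Accounting record.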
Part III: Implementation – A Strategic Guide for the Community of Practice
Chapter 8: Cultivating the Community for Lean Action Research
8.1. Establishing the CoP: The Foundation for LEARN
The successful implementation of the LEARN framework is entirely dependent on the health and functionality of the Community of Practice that houses it. Establishing this CoP requires an intentional, structured approach focused on its three foundational pillars.
- Step 1: Identify the Domain. The first and most critical step is to define the CoP’s shared domain of interest. This should be a specific, meaningful, and persistent practical problem that members are passionate about solving (e.g., “improving math literacy for 7th graders in our district,” “reducing patient wait times in our emergency department,” or “increasing civic engagement among young adults in our city”).9 A well-defined domain provides the CoP with a clear identity, a shared purpose, and a North Star to guide all its activities. It is the “what” that binds the community together.53
- Step 2: Recruit the Initial Community. With the domain defined, the next step is to build the community itself. It is advisable to start small, recruiting a core group of committed practitioners who experience the problem defined by the domain in their daily work.54 These individuals will be the initial co-researchers and innovators. The recruitment effort should focus on those who have a vested interest in the problem and a willingness to engage in a collaborative, experimental process. This core group forms the initial social fabric of the CoP.9
- Step 3: Define the Initial Practice. The “practice” of this specific CoP is the active and disciplined use of the LEARN cycle to address the problem in their domain. The initial commitment of the core group is not to a specific solution, but to this shared methodology. Their shared repertoire will grow over time as they conduct experiments and accumulate validated learnings, but the foundational practice is the process of inquiry itself.51
8.2. Designing for Evolution and Sustained Engagement
A CoP is a living system, not a static project team. To cultivate a thriving community capable of sustained innovation, leaders should apply principles designed to foster organic growth and long-term engagement, as outlined by Wenger and others.54
- Design for Evolution: The structure of the CoP should be emergent, not imposed. It is critical to avoid over-engineering the process from the outset.54 Begin with a simple structure, such as a commitment to regular meetings to run the LEARN cycle. As the community builds trust and momentum, other elements—like a shared digital space, specific project teams, or public showcases of learning—can be introduced one at a time. The CoP’s structure should co-evolve with its learning journey.54
- Focus on Value: Participation in the CoP, especially for busy practitioners, must be valuable. The work of the LEARN cycles should directly address the real-world challenges members face, providing them with practical insights and effective solutions they can apply in their work. The CoP succeeds when it becomes the most effective and efficient way for members to solve their most pressing problems.9
- Create a Rhythm: Establishing a regular cadence for CoP activities is essential for building momentum and making the work a habit. This includes scheduling regular meetings for each stage of the LEARN cycle, such as weekly “Test” check-ins, monthly “Synthesize” sessions where the pivot-or-persevere decision is made, and quarterly reviews of overall progress. This rhythm makes the process predictable and sustainable.54
- Invite Different Levels of Participation: A healthy CoP has multiple ways to engage. There should be a small core group that actively leads and participates in every LEARN cycle. An active group should be invited to participate in specific stages, such as providing feedback during the ASSESS stage. Finally, a wider peripheral group can be kept informed of the CoP’s progress and learnings through newsletters, presentations, or a shared repository of knowledge. This tiered structure allows the CoP’s influence to spread while respecting varying levels of member capacity and interest.49
Chapter 9: The Practitioner’s Toolkit: Navigating the LEARN Cycle
This chapter provides a practical toolkit with specific methods and templates that a Community of Practice can use to navigate each stage of the LEARN cycle effectively.
9.1. Tools for the FRAME Stage
The goal of this stage is to move from a general problem to a specific, testable hypothesis.
- Problem Definition: The “Five Whys” technique, derived from the Toyota Production System, is a simple yet powerful tool for getting to the root cause of a problem. By repeatedly asking “Why?” (typically five times), the team can move beyond surface-level symptoms to uncover the underlying systemic issues that need to be addressed.6
- Assumption Mapping: To make all “leap-of-faith” assumptions explicit, the CoP can use a visual tool like the Lean Canvas or the Business Model Canvas.22 For non-commercial projects, a simplified version focusing on “Problem,” “Beneficiaries,” “Proposed Solution (Intervention),” and “Key Assumptions” can be highly effective. This exercise ensures the team knows exactly what beliefs need to be tested.
- Hypothesis Formulation: A clear, falsifiable hypothesis is the cornerstone of a good experiment. The CoP should use a structured template, such as: “We believe that by [implementing this specific MVI], for [this specific group of beneficiaries/practitioners], we will see [this measurable change in behavior]. We will know we have succeeded when we observe [this quantitative metric] and hear [this type of qualitative feedback].”
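The hypothesis template can be rendered mechanically so that every cycle’s hypothesis is stated in the same falsifiable form. The helper below is a sketch; its name and the example values (drawn loosely from the handoff-checklist scenario in Chapter 6) are my own, not part of the toolkit.

```python
def format_hypothesis(mvi, beneficiaries, behavior_change, metric, feedback):
    """Render the FRAME-stage hypothesis template as a single statement."""
    return (
        f"We believe that by {mvi}, for {beneficiaries}, "
        f"we will see {behavior_change}. We will know we have succeeded "
        f"when we observe {metric} and hear {feedback}."
    )

# Illustrative values only.
statement = format_hypothesis(
    mvi="using a single-page handoff checklist",
    beneficiaries="nurses on one unit",
    behavior_change="fewer information errors per shift",
    metric="a drop in logged handoff errors",
    feedback="reports that the checklist fits the existing workflow",
)
```

Forcing every hypothesis through the same template keeps the success signals explicit before the TEST stage begins, which is exactly the discipline the “set criteria in advance” guidance in Chapter 10 depends on.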
9.2. Tools for the TEST Stage
The goal of this stage is to build and deploy the MVI as quickly and efficiently as possible.
- MVI Design Brainstorm: The CoP should brainstorm a range of low-fidelity MVI options to find the one that requires the least effort for the most learning. This could include process-based MVIs (a new checklist, a different meeting format) or prototype-based MVIs (a simple brochure, a storyboard, a low-fidelity wireframe, or an explainer video).19
- Agile Project Management: Simple Agile tools can help manage the work of creating and deploying the MVI. A Kanban board, with columns for “To Do,” “Doing,” and “Done,” is a highly effective visual tool for tracking tasks and ensuring the team stays focused and coordinated.84
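A three-column Kanban board is simple enough to sketch directly. The class below is a minimal illustrative model of the “To Do / Doing / Done” board described above, not a recommendation of any particular tool.

```python
class KanbanBoard:
    """A minimal three-column Kanban board for tracking MVI tasks (illustrative)."""
    COLUMNS = ("To Do", "Doing", "Done")

    def __init__(self):
        self.columns = {name: [] for name in self.COLUMNS}

    def add(self, task):
        """New tasks always enter the board in 'To Do'."""
        self.columns["To Do"].append(task)

    def move(self, task, to_column):
        """Move a task to another column, wherever it currently sits."""
        if to_column not in self.columns:
            raise ValueError(f"unknown column: {to_column}")
        for tasks in self.columns.values():
            if task in tasks:
                tasks.remove(task)
                self.columns[to_column].append(task)
                return
        raise ValueError(f"unknown task: {task}")

board = KanbanBoard()
board.add("Draft handoff script")
board.move("Draft handoff script", "Doing")
```

Even a shared spreadsheet or whiteboard implementing these three columns serves the same purpose: making the state of the MVI work visible to the whole CoP at a glance.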
9.3. Tools for the ASSESS Stage
The goal of this stage is to gather a rich, mixed-methods dataset.
- Quantitative Data Collection: For tracking actionable metrics, the team can use simple tools like online surveys, analytics dashboards, or even manual tally sheets. For comparative tests, an A/B test, where two versions of an intervention are offered to different groups simultaneously, is a powerful way to measure the impact of a specific change.13
- Qualitative Data Collection: The CoP should be trained in basic qualitative methods. Key tools include semi-structured beneficiary interviews to understand user experience 16, ethnographic observation to see how the intervention is used in its natural context 59, and focus groups to facilitate group discussion and uncover shared perspectives.
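For an A/B test of the kind described above, the two sign-up rates can be compared with a standard two-proportion z-test. The sketch below uses only the standard library; the counts are invented, and the normal approximation it relies on assumes both groups are reasonably large.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test for an A/B comparison.

    Returns (z, p_value). Uses the pooled-proportion standard error;
    this is a normal approximation, so both samples should be large.
    """
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF: Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented sign-up counts for two landing-page variants (200 visitors each).
z, p = two_proportion_z(successes_a=18, n_a=200, successes_b=35, n_b=200)
```

A small p-value here would suggest the messaging difference, not chance, explains the gap in sign-ups; per the Impact Accounting approach, the team would still pair this number with qualitative evidence before deciding anything.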
9.4. Tools for the SYNTHESIZE Stage
The goal of this stage is to turn raw data into validated learning and a clear decision.
- Data Analysis: The team can use thematic analysis to identify recurring patterns and themes in their qualitative data (interview notes, observations). For quantitative data, simple bar charts or line graphs can visualize trends and compare results against the pre-defined success metrics.
- Structured Reflection Protocols: The “Pivot or Persevere” meeting should be a structured and facilitated process. A protocol could include: 1) Re-stating the original hypothesis. 2) Presenting the quantitative data. 3) Sharing key themes and quotes from the qualitative data. 4) A round-robin discussion where each member shares their interpretation. 5) A formal vote or consensus-building process to make the final decision.25
Chapter 10: The Critical Decision Point: To Pivot or Persevere in Practice
10.1. Re-contextualizing the Decision for Practitioners
In the corporate world of Lean Startup, the “pivot or persevere” decision is often a high-stakes choice with significant financial implications. For a Community of Practice using the LEARN framework, the context is different, but the decision is no less critical. It is not a question of business survival, but a strategic choice about the stewardship of the community’s most valuable resources: their time, energy, and credibility. The central question shifts from “Will this make money?” to “Is this intervention creating the desired practical improvement, and are we learning enough to justify continuing down this path?”.26 It is a decision about how to best allocate collective effort to maximize social or practical impact.
10.2. A Framework for the “Pivot or Persevere” Meeting
The “Pivot or Persevere” meeting is the primary governance mechanism of the LEARN framework. It ensures accountability and transforms the CoP from a discussion group into a decision-making body. To be effective, this meeting should follow a structured, data-informed process.
- Set Criteria in Advance: A key discipline of the process is for the CoP to define its success and failure conditions before running the experiment in the TEST stage.25 During the FRAME stage, when defining the hypothesis, the group should agree on the specific Impact Accounting metrics that will signal success or failure. For example: “We will persevere if we see a 20% increase in the retention rate and receive predominantly positive qualitative feedback. We will pivot if the retention rate is flat or negative.” This prevents emotional decision-making or the shifting of goalposts after the results are in.
- The Persevere Path: The decision to persevere is made when the data from the ASSESS stage shows positive momentum and validates the core hypothesis of the MVI. The quantitative metrics have met or are trending toward the success criteria, and the qualitative feedback is largely supportive. Persevering does not mean the work is done; it means the team commits to another LEARN cycle to refine, optimize, or scale the now-validated intervention. The next FRAME stage will focus on a new hypothesis aimed at improving the existing approach.25
- The Pivot Path: The decision to pivot is made when the data clearly invalidates the hypothesis or shows a lack of meaningful impact. The metrics are flat or negative, and/or the qualitative feedback reveals a fundamental flaw in the intervention’s logic. A pivot is not a sign of failure; it is a sign of successful learning. It prevents the community from wasting further resources on an ineffective path. The team formally acknowledges that the hypothesis was wrong and, based on the rich learning from the “failed” experiment, returns to the FRAME stage to formulate a new hypothesis and design a new MVI that takes a different strategic approach to the problem.12
- The “Kill” Option: In some cases, the learning from an experiment may be that a particular avenue of inquiry is a complete dead end. In a practitioner context, “killing” an idea is an important act of knowledge stewardship. It involves formally documenting that a specific strategy was rigorously tested and found to be ineffective. This validated learning is then archived and shared within the CoP to ensure the community does not waste time revisiting the same failed path in the future.25
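The discipline of setting criteria in advance can be captured in a small decision helper. This is a sketch under stated assumptions: the metric names and thresholds are invented, the qualitative signal is reduced to a single flag for brevity, and the “kill” option is deliberately left to the group’s judgment rather than automated.

```python
def pivot_or_persevere(metrics, criteria, qualitative_positive):
    """Apply success criteria that were agreed in the FRAME stage.

    metrics:  measured values from ASSESS, e.g. {"retention_increase": 0.22}
    criteria: minimum thresholds set BEFORE the experiment ran,
              e.g. {"retention_increase": 0.20}
    qualitative_positive: True if qualitative feedback was predominantly positive
    """
    metrics_met = all(metrics.get(name, float("-inf")) >= threshold
                      for name, threshold in criteria.items())
    if metrics_met and qualitative_positive:
        return "persevere"
    return "pivot"

decision = pivot_or_persevere(
    metrics={"retention_increase": 0.22},
    criteria={"retention_increase": 0.20},
    qualitative_positive=True,
)
```

The point of the helper is not automation but commitment: because `criteria` is fixed before the TEST stage, the SYNTHESIZE meeting argues about interpretation of the evidence, not about where the goalposts should sit.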
The “Pivot or Persevere” meeting is arguably the most critical ritual for a CoP engaged in Lean Action Research. A common failure mode for well-intentioned communities is to devolve into a “talk shop,” where ideas are discussed but no action or accountability follows.9 The Lean Startup methodology introduces the “pivot or persevere” decision as a mandatory, high-stakes moment of truth.25 By adopting this as a formal, recurring meeting with pre-defined criteria, the CoP institutionalizes a culture of accountability. It forces the group to make a concrete, evidence-based judgment on its own work. This regular, structured decision-making process builds the CoP’s collective capacity for action and reinforces its identity as a group that does things, not just discusses them. It is the engine that drives the CoP’s “Practice.”
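The discipline of setting criteria in advance can be made mechanical, which is part of its value: the decision rule is agreed in the FRAME stage, so the “Pivot or Persevere” meeting debates the evidence rather than the thresholds. The sketch below is purely illustrative—the names (`DecisionCriteria`, `pivot_or_persevere`), the 20% retention threshold, and the “kill” cutoff are assumptions drawn from the example criteria above, not part of the LEARN framework itself.

```python
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    PERSEVERE = "persevere"  # criteria met: run another cycle to refine or scale
    PIVOT = "pivot"          # hypothesis invalidated: return to FRAME with a new approach
    KILL = "kill"            # dead end: document and archive the validated learning


@dataclass
class DecisionCriteria:
    """Pre-registered thresholds, agreed during FRAME before the TEST stage runs.

    Values here are hypothetical examples, e.g. persevere_min_lift=0.20
    for the '20% increase in retention' criterion in the text.
    """
    persevere_min_lift: float  # minimum relative improvement to persevere
    kill_max_lift: float       # at or below this, the avenue is archived as a dead end


def pivot_or_persevere(baseline: float, observed: float,
                       criteria: DecisionCriteria,
                       qualitative_positive: bool) -> Decision:
    """Apply the pre-registered criteria to the ASSESS-stage results."""
    lift = (observed - baseline) / baseline
    if lift >= criteria.persevere_min_lift and qualitative_positive:
        return Decision.PERSEVERE
    if lift <= criteria.kill_max_lift:
        return Decision.KILL
    return Decision.PIVOT
```

For example, with `DecisionCriteria(persevere_min_lift=0.20, kill_max_lift=-0.10)`, a retention rate moving from 0.50 to 0.62 (a 24% lift) with supportive qualitative feedback yields `Decision.PERSEVERE`, while a flat rate yields `Decision.PIVOT`. Encoding the rule this way is one way to make goalpost-shifting visible: changing the thresholds after the results are in requires editing the pre-registered criteria, not just reinterpreting them.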
Conclusion: The Future of Collaborative, Evidence-Based Problem-Solving
The complex, systemic challenges that define our modern social, public, and educational landscapes demand new ways of working. Linear, top-down models of change are proving insufficient for the dynamic and uncertain nature of these “wicked problems.” This report has proposed the Lean Action Research (LEARN) Framework as a robust, integrated alternative, designed specifically for practitioner-led communities seeking to drive meaningful and sustainable improvement.
The LEARN framework derives its unique power from the synthesis of three proven methodologies. It harnesses the speed and experimental rigor of the Lean Startup, transforming problem-solving into a series of rapid, evidence-driven inquiries. It grounds this agility in the contextual depth and participatory ethos of Action Research, ensuring that solutions are developed with and by the people who will use them, not imposed upon them. Finally, it provides a sustainable, collaborative structure through the Community of Practice, creating a social engine that can steward knowledge, maintain momentum, and foster a culture of continuous learning long after any single project is complete.
The core components of the framework—the iterative FRAME-TEST-ASSESS-SYNTHESIZE cycle, the concept of the Minimum Viable Intervention (MVI), the discipline of Impact Accounting, and the critical Pivot or Persevere decision point—provide a comprehensive toolkit for any group of practitioners. This model offers a powerful alternative to both slow, bureaucratic, top-down initiatives and isolated, often unscalable, grassroots efforts. It represents a “middle way”: the structure and discipline needed for scalable change, combined with the flexibility and practitioner wisdom needed for genuine, context-sensitive innovation.87
The future of effective social innovation lies in our ability to move beyond “accidental and episodic” improvements toward intentional, collaborative, and continuous learning.1 The LEARN framework offers a clear and actionable pathway for this journey. It is a call to action for leaders, funders, and practitioners to embrace a new paradigm of problem-solving—one that is disciplined yet agile, data-informed yet human-centered, and grounded in the collective wisdom and creativity of those working on the front lines of change. By fostering a culture of rigorous, collaborative inquiry, we can build the capacity to not only face our most complex challenges but to learn our way through them, together.
This report was generated by Google Gemini 2.5 Deep Research using the prompt “Research how the Lean Startup Methodology could be applied to conducting action research and developing a community of practice to solve a particular practical problem”. It was reviewed for accuracy by Dr. Andrew Sears, who also added the diagrams and made slight modifications.
Works cited
- Collaborative innovation in the public sector, accessed June 13, 2025, https://epe.lac-bac.gc.ca/100/201/300/innovation_journal/2021/www.innovation.cc/volumes-issues/2012_17_1_1_eva_sorensen_torfing_intro.pdf?nodisclaimer=1
- Full article: Collaborative innovation in a local authority – ‘local economic development-by-project’? – Taylor & Francis Online: Peer-reviewed Journals, accessed June 13, 2025, https://www.tandfonline.com/doi/full/10.1080/14719037.2023.2185663
- Collaborative Innovation in the Public Sector – New Perspectives on the Role of Citizens, accessed June 13, 2025, https://www.researchgate.net/publication/320064546_Collaborative_Innovation_in_the_Public_Sector_-_New_Perspectives_on_the_Role_of_Citizens
- 1/30 learning’s from “The Lean Startup” – definition of startup, build learn cycle, why startup’s fail? – Reddit, accessed June 13, 2025, https://www.reddit.com/r/startups/comments/1crskvi/130_learnings_from_the_lean_startup_definition_of/
- Is Your Nonprofit Really Ready to Use the Lean Startup?, accessed June 13, 2025, https://ssir.org/articles/entry/is_your_nonprofit_really_ready_to_use_the_lean_startup
- Methodology – The Lean Startup, accessed June 13, 2025, https://theleanstartup.com/principles
- Overview of Action Research Methodology – Web Networks, accessed June 13, 2025, https://homepages.web.net/~robrien/papers/arfinal.html
- What is Action Research for Classroom Teachers?, accessed June 13, 2025, https://kstatelibraries.pressbooks.pub/gradactionresearch/chapter/chapt1/
- Introduction to communities of practice – wenger-trayner, accessed June 13, 2025, https://www.wenger-trayner.com/introduction-to-communities-of-practice/
- Jean Lave, Etienne Wenger and communities of practice – Infed.org, accessed June 13, 2025, https://infed.org/mobi/jean-lave-etienne-wenger-and-communities-of-practice/
- The 10 Core Principles of a Lean Startup, accessed June 13, 2025, https://www.waveon.io/en/blog/Core-principles-of-Lean-Startup
- What is the Lean Startup Methodology? – Viima, accessed June 13, 2025, https://www.viima.com/blog/lean-startup
- Lean startup – Wikipedia, accessed June 13, 2025, https://en.wikipedia.org/wiki/Lean_startup
- The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses – Amazon.com, accessed June 13, 2025, https://www.amazon.com/Lean-Startup-Entrepreneurs-Continuous-Innovation/dp/0307887898
- Pursuing Perfection: Case Studies Examining Lean Manufacturing Strategies, Pollution Prevention, and Environmental Regulatory Ma, accessed June 13, 2025, https://www.epa.gov/sites/default/files/2013-11/documents/perfection.pdf
- Embracing the Lean Startup Method – Doodle, accessed June 13, 2025, https://doodle.com/en/embracing-the-lean-startup-method/
- Guiding Early Stage Development with the Build-Measure-Learn Loop – Thinslices, accessed June 13, 2025, https://www.thinslices.com/insights/early-stage-development-build-measure-learn-loop
- How to Use the Build, Measure, Learn Loop In The Product Development Process, accessed June 13, 2025, https://userpilot.com/blog/build-measure-learn/
- Don’t Build When You Build-Measure-Learn – Strategyzer, accessed June 13, 2025, https://www.strategyzer.com/library/dont-build-when-you-build-measure-learn
- Build Measure Learn – Why The Lean Startup Loop is Backwards – Kromatic, accessed June 13, 2025, https://kromatic.com/blog/build-measure-learn-vs-learn-measure-build/
- Lean startup methodology: How to make it work for your startup – DigitalOcean, accessed June 13, 2025, https://www.digitalocean.com/resources/articles/lean-startup-methodology
- Lean Startup Model: Key Principles and Stages – Shopify, accessed June 13, 2025, https://www.shopify.com/blog/lean-startup-model
- Vanity Metrics: Discover What They Are and Why to Avoid Them – Reportei, accessed June 13, 2025, https://reportei.com/en/vanity-metrics-discover-what-they-are-and-why-to-avoid-them/
- The Problem With Vanity Metrics In Social Media – Brighter Messaging, accessed June 13, 2025, https://brightermessaging.com/blog/vanity-metrics-social-media/
- How to Make “Pivot or Persevere” Decisions in Your Innovation Accounting – Kromatic, accessed June 13, 2025, https://kromatic.com/blog/how-to-make-pivot-or-persevere-decisions-in-your-innovation-accounting/
- Pivot or Persevere? How to Make Startup Decisions That Drive Success – Thinslices, accessed June 13, 2025, https://www.thinslices.com/insights/pivot-or-persevere-make-startup-decisions-that-drive-success
- To pivot or persevere: interpreting the results of your experiments – Innoway, accessed June 13, 2025, https://innoway.me/pivot-or-persevere/
- Improving the impact of remote Playing Lean workshops through action inquiry and critical reflexivity | Emerald Insight, accessed June 13, 2025, https://www.emerald.com/insight/content/doi/10.1108/ijlss-04-2022-0088/full/html
- Using Lean Startup for Greater Social Impact – 6sigma, accessed June 13, 2025, https://6sigma.com/using-lean-startup-for-greater-social-impact/
- Lean Action Research Collaborative – CLEAR, accessed June 13, 2025, https://clear.berkeley.edu/our-work/lean-action-research-collaborative
- Case Study: Performance Management and Lean Process Improvement – Results Washington, An Operational Excellence in Government Success Story | Data-Smart City Solutions, accessed June 13, 2025, https://datasmart.hks.harvard.edu/opex/research/case-study-performance-management-and-lean-process-improvement-results-washington
- What Is Action Research? | Definition & Examples – Scribbr, accessed June 13, 2025, https://www.scribbr.com/methodology/action-research/
- 8 Key Principles of Action Research: Insights from Kurt Lewin – Teachers Institute, accessed June 13, 2025, https://teachers.institute/growth-and-development-of-educational-management/key-principles-action-research-kurt-lewin/
- ctl.oregonstate.edu, accessed June 13, 2025, https://ctl.oregonstate.edu/sites/ctl.oregonstate.edu/files/action-research.pdf
- Action Research: McNiff, Jean – Books – Amazon.com, accessed June 13, 2025, https://www.amazon.com/Action-Research-Principles-Jean-McNiff/dp/0415535263
- Beginners’ guide to action research, accessed June 13, 2025, http://www.aral.com.au/resources/guide.html
- 6. The Action Research Cycle – University of Nottingham, accessed June 13, 2025, https://www.nottingham.ac.uk/helmopen/rlos/research-evidence-based-practice/designing-research/types-of-study/introducing-action-research/section06.html
- About Action Research | Action Research Tutorials | CCAR, accessed June 13, 2025, https://www.actionresearchtutorials.org/1-about-action-research
- What Is Improvement Science?, accessed June 13, 2025, https://iims.uthscsa.edu/isrn/home/about-us/what-is-improvement-science/
- What is improvement science, and what makes it different? An outline of the field and its frontiers – PubMed Central, accessed June 13, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC11884242/
- Our Ideas | Carnegie Foundation for the Advancement of Teaching, accessed June 13, 2025, https://www.carnegiefoundation.org/our-ideas/
- Collaborative Teacher Action Research: Improvement Science as Professional Development? – ISU ReD, accessed June 13, 2025, https://ir.library.illinoisstate.edu/cgi/viewcontent.cgi?article=2397&context=etd
- Continuous Improvement in Education: A Toolkit for Schools and Districts, accessed June 13, 2025, https://ies.ed.gov/ies/2025/01/continuous-improvement-education-toolkit-schools-and-districts
- The school research lead and PDSA cycles – what’s the evidence, accessed June 13, 2025, https://www.garyrjones.com/blog/2019/1/26/the-school-research-lead-and-pdsa-cycles-whats-the-evidence-
- Challenges With The Lean Startup Methodology – Reforge, accessed June 13, 2025, https://www.reforge.com/blog/lean-startup-methodology-problems
- What’s wrong with the Lean Startup methodology? – Quora, accessed June 13, 2025, https://www.quora.com/What%E2%80%99s-wrong-with-the-Lean-Startup-methodology
- How Community Leads the Way with Participatory Action Research (PAR), accessed June 13, 2025, https://collectiveimpactforum.org/resource/how-community-leads-the-way-with-participatory-action-research-par/
- Community-based Participatory Research (CBPR): Towards Equitable Involvement of Community in Psychology Research – PMC – PubMed Central, accessed June 13, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6054913/
- Community of practice – Wikipedia, accessed June 13, 2025, https://en.wikipedia.org/wiki/Community_of_practice
- Communities of Practice – 21CSLA, accessed June 13, 2025, https://21cslacenter.berkeley.edu/communities-practice
- Communities of practice, accessed June 13, 2025, https://mwcasc.umn.edu/sites/mwcasc.umn.edu/files/2024-12/communities_practice_intro_wenger.pdf
- What is a Community of Practice? – COD Library – College of DuPage, accessed June 13, 2025, https://library.cod.edu/c.php?g=1227428&p=8980997
- What is a community of practice? – Community of Practice, accessed June 13, 2025, https://www.communityofpractice.ca/background/what-is-a-community-of-practice/
- (PDF) Seven Principles for Cultivating Communities of Practice – ResearchGate, accessed June 13, 2025, https://www.researchgate.net/publication/265678077_Seven_Principles_for_Cultivating_Communities_of_Practice
- Seven Principles for Cultivating Communities of Practice – Clearwater, accessed June 13, 2025, https://www.clearwatervic.com.au/user-data/resource-files/7Principles_Community-of-Practice.pdf
- (PDF) Implementing Lean Manufacturing by means of action …, accessed June 13, 2025, https://www.researchgate.net/publication/259810632_Implementing_Lean_Manufacturing_by_means_of_action_research_methodology_A_case_study_in_the_aeronautics_industry
- Improving the impact of remote Playing Lean workshops through action inquiry and critical reflexivity – Emerald Insight, accessed June 13, 2025, https://www.emerald.com/insight/content/doi/10.1108/ijlss-04-2022-0088/full/pdf
- Lean Research Field Guide | MIT D-Lab, accessed June 13, 2025, https://d-lab.mit.edu/resources/publications/lean-research-field-guide
- (PDF) DESIGN THINKING VS. LEAN STARTUP: A COMPARISON …, accessed June 13, 2025, https://www.researchgate.net/publication/234066097_DESIGN_THINKING_VS_LEAN_STARTUP_A_COMPARISON_OF_TWO_USER-DRIVEN_INNOVATION_STRATEGIES
- Managers can use the Hybrid Model Matrix to decide when to use design thinking, Lean Startup, or Agile with Stage-Gate to boost new product development. – ResearchGate, accessed June 13, 2025, https://www.researchgate.net/publication/354029770_The_Hybrid_Model_MatrixEnhancing_Stage-Gate_with_Design_Thinking_Lean_Startup_and_Agile_Managers_can_use_the_Hybrid_Model_Matrix_to_decide_when_to_use_design_thinking_Lean_Startup_or_Agile_with_Stage-
- Lean Startup Concepts & How To Become A Lean Startup – BMC Software, accessed June 13, 2025, https://www.bmc.com/blogs/lean-startup-enterprise/
- What Is Agile Project Management? | APM Methodology & Definition, accessed June 13, 2025, https://www.apm.org.uk/resources/find-a-resource/agile-project-management/
- What is agile methodology? (16 examples with pros and cons), accessed June 13, 2025, https://www.aha.io/roadmapping/guide/agile/development-methodologies
- Social Validity in Behavioral Research: A Selective Review – PMC, accessed June 13, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC10050507/
- 10 tips for practicing lean impact in a time of rapid change, accessed June 13, 2025, https://mediaimpactfunders.org/10-tips-for-practicing-lean-impact-in-a-time-of-rapid-change/
- Full article: Evaluating education innovations rapidly with build-measure-learn: Applying lean startup to health professions education, accessed June 13, 2025, https://www.tandfonline.com/doi/full/10.1080/0142159X.2022.2118038
- Minimum viable action – how to advance things that are stuck – Eficode.com, accessed June 13, 2025, https://www.eficode.com/blog/minimum-viable-action-how-to-advance-things-that-are-stuck
- How To Pivot Or Persevere Based On Your Learnings – FasterCapital, accessed June 13, 2025, https://fastercapital.com/topics/how-to-pivot-or-persevere-based-on-your-learnings.html
- Minimum viable transformation | Deloitte Insights | Business Trends, accessed June 13, 2025, https://www2.deloitte.com/us/en/insights/focus/business-trends/2015/minimum-viable-business-model-transformation-business-trends.html
- Minimal intervention needed for change: definition, use, and value for improving health and health research – PMC, accessed June 13, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC3958586/
- Minimum Viable Research: A Practical Guide – User Interviews, accessed June 13, 2025, https://www.userinterviews.com/blog/minimum-viable-research-practical-guide
- What is minimum viable task? – Focuskeeper Glossary, accessed June 13, 2025, https://focuskeeper.co/glossary/what-is-minimum-viable-task
- Improving Care Delivery Through Lean: Implementation Case Studies, accessed June 13, 2025, https://www.ahrq.gov/practiceimprovement/systemdesign/leancasestudies/lean-case1.html
- Agile in Healthcare: Benefits and Applications in Practice – Invensis Learning, accessed June 13, 2025, https://www.invensislearning.com/blog/role-of-agile-healthcare/
- Lean Startup for Social Innovation – Collaborate Up, accessed June 13, 2025, https://collaborateup.com/lean-startup-for-social-innovation/
- Measuring Social Impact: Approaches, Challenges, and Best Practices – Resonance Global, accessed June 13, 2025, https://www.resonanceglobal.com/blog/measuring-social-impact-approaches-challenges-and-best-practices
- Revisiting: Measuring Societal Impact or, Meet the New Metric, Same as the Old Metric, accessed June 13, 2025, https://scholarlykitchen.sspnet.org/2025/02/04/revisiting-measuring-societal-impact-or-meet-the-new-metric-same-as-the-old-metric/
- Ten Reasons Not to Measure Impact—and What to Do Instead, accessed June 13, 2025, https://ssir.org/articles/entry/ten_reasons_not_to_measure_impact_and_what_to_do_instead
- Measuring social impact is complicated and may create dysfunctional incentives – LSE Blogs, accessed June 13, 2025, https://blogs.lse.ac.uk/businessreview/2016/10/19/measuring-social-impact-is-complicated-and-may-create-dysfunctional-incentives/
- Beyond Vanity Metrics: Measuring What Truly Counts in Social Impact Marketing, accessed June 13, 2025, https://communitysymbol.com/beyond-vanity-metrics-measuring-what-truly-counts-social-impact-marketing/
- Social Impact Metrics Guide – Sopact, accessed June 13, 2025, https://www.sopact.com/guides/social-impact-metrics
- 5 Steps to Creating an Agile Community of Practice to Make Your Teams Awesome!, accessed June 13, 2025, https://blog.iil.com/5-steps-to-creating-an-agile-community-of-practice-to-make-your-teams-awesome/
- IDEAS (Integrate, Design, Assess, and Share): A Framework and Toolkit of Strategies for the Development of More Effective Digital Interventions to Change Health Behavior – PMC – PubMed Central, accessed June 13, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC5203679/
- Agile Methodologies: A Beginner’s Guide – Planview, accessed June 13, 2025, https://www.planview.com/resources/guide/agile-methodologies-a-beginners-guide/
- Agile Methodology: Advantages and Disadvantages – College of Continuing and Professional Studies, accessed June 13, 2025, https://ccaps.umn.edu/story/agile-methodology-advantages-and-disadvantages
- Question: Pivot or Persevere? – Startups.com, accessed June 13, 2025, https://www.startups.com/questions/2485/pivot-or-persevere
- The Future of Innovation Is Collective – Stanford Social Innovation Review, accessed June 13, 2025, https://ssir.org/articles/entry/collective-action-approach-social-innovation
- Social innovation – the last and next decade | Nesta, accessed June 13, 2025, https://www.nesta.org.uk/blog/social-innovation-the-last-and-next-decade/