International development policymakers and practitioners periodically relearn that technical fixes rarely, by themselves, lead to reliable solutions. Standing in the way of input-driven activities producing predictable results are the potentially derailing effects of a host of situation-specific factors often referred to as the context. Converting the truism that context matters into something useful for policy or practice calls for, first, identifying which aspects of the context are relevant to achieving a development objective and, second, learning what to do in response.
Among the tools for scanning the context is political economy analysis (PEA), a valuable approach for assessing the local systems where international development actors seek to intervene. PEA focuses on how political and economic power are distributed and contested, and it identifies the implications for development activities and outcomes, making it a powerful tool for improving the effectiveness of development programming (Booth et al., 2009). Yet a dizzying plethora of PEA approaches and tools exists: PEA, applied PEA, Context Analysis, Everyday PEA, Power Analysis, Drivers of Change, Problem-Driven Iterative Adaptation, and many others. This glut of tools and guidance raises several questions:
What works effectively for PEA in practice?
In which situations does one approach work versus another?
How can PEA usefully inform learning and adaptation?
What is the path from PEA to thinking and working politically (TWP)?
This research report offers insights that provide some answers to these questions, based on our survey of recent PEAs that RTI International conducted. We reviewed nine PEAs implemented between 2017 and 2019 to explore the following:
what the analyses focused on
how they were carried out
what factors appear to have influenced their successful conduct
where challenges arose
what value was added to implementation teams in the field
Based on these nine PEAs, we trace some preliminary patterns across the cases and then conclude with lessons learned from RTI’s experience—successes, challenges, and adaptations taken in response—and propose several recommendations for future exploration, learning, and research related to PEA and TWP. We begin, first, with a short background on PEA and some key points from the literature.
The roots of most theories, concepts, methodologies, and tools associated with international development extend farther back into the past than is commonly recognized. Among the antecedents of today’s interest in context and PEA is the recognition of the sustainability gap: in the 1980s donor agencies and practitioners discovered that projects’ benefits in many cases failed to outlast their initial investment periods. Studies searching for answers identified the impacts of social, economic, and political factors in project environments as helping to explain the weak sustainability of donor-funded investments. These findings led designers to consider how systemic elements of contexts might be addressed to increase the chances of project success. Central to this search was the need for ways to identify and understand these context elements.
The first generation of PEAs consisted of macro-level country studies that aimed to paint broad-brush pictures of the political forces and trends in a country to inform donor agencies’ country strategies and programming and promote awareness of political dynamics. Among the pioneers were the UK Department for International Development’s (DFID’s) “drivers of change” studies (UK Department for International Development, 2003). Second- generation PEAs narrowed the country-level analytic focus, first to sectoral political assessments, and next to problem-specific analysis aimed at understanding the political and institutional dynamics associated with a particular project or policy issue (Booth et al., 2009; Fritz et al., 2009). The problem-specific orientation connected PEAs more closely with identifying individual project bottlenecks and constraints and with supporting project learning and adaptive management to cope with those constraints (Brinkerhoff et al., 2018).
Currently, donors and practitioners use both generations of PEA approaches and tools, though the labels they use vary and can contribute to the confusion we noted in the Introduction. The US Agency for International Development, for example, refers to a Baseline PEA as an analysis at the country, sector, subsector, or project level, depending on the needs of the mission and team (United States Agency for International Development (USAID), 2018). Practitioners engaged in project implementation have also developed analytic tools tailored to ongoing analysis by field teams under the label “everyday political analysis” (Hudson et al., 2016), also known in the field as everyday PEA. The expanding PEA toolkit has added new topical emphases, such as gender equity and social inclusion (GESI), and stronger connections to flexible and adaptive project implementation, for example, integration with USAID’s collaboration, learning, and adaptation (CLA) model.
Recent discussions have focused not simply on the growing PEA toolkit but on how those tools inform development practice in the field. This focus is encapsulated in the attention being paid to connecting the two elements of TWP: thinking politically and working politically (United States Agency for International Development (USAID), 2018). Several overviews of PEA and TWP suggest that effective practice derives less from increasingly sophisticated analytic tools than from politically savvy practitioners with deep contextual knowledge who anticipate and adapt during implementation (Booth et al., 2016; Laws & Marquette, 2018). Thus, PEA’s primary value derives from how practitioners with the right skills and aptitudes can use the insights the PEA provides and incorporate this approach into ongoing implementation.
As PEA has become widely applied, both practitioners and academics have identified challenges with the tools and approaches. A primary challenge concerns the bureaucratic barriers within donor agencies that impede the effective use of PEA findings and recommendations and undermine incentives for TWP, neatly captured in the subtitle of Carothers’ and de Gramont’s book on donors and politics, Development Aid Confronts Politics: The Almost Revolution (Carothers & de Gramont, 2013). Donor emphasis on results-based management calls for demonstrable evidence of how PEA and TWP contribute to better results and outcomes.
Other challenges include a gap between field staff’s perceptions of the utility of PEA and who should pay attention to study findings, leading to a disconnect between everyday management practices and analysis-heavy PEA. Finally, some consider that PEA as used in donor agency programming tends to underplay the complexity of politics and power, essentially turning PEA into another tool in the service of technical fixes (Fisher & Marquette, 2014).
We analyzed nine PEAs completed between 2017 and 2019 through a combination of document review and interviews. RTI International conducted these PEAs in seven countries (Haiti, Nigeria, the Philippines, Senegal, Tanzania, Uganda, and Zambia) and four sectors (governance, water and sanitation, wildlife conservation, and higher education). Table 1 summarizes the PEA studies we reviewed.
Our survey surfaced several results that were shared across most of the projects we examined:
PEAs can make positive contributions to technical interventions.
Engaging project staff in PEAs increases the likelihood that they will be open to thinking and working politically.
Inclusion of GESI in PEAs helps to uncover hidden power dynamics.
Each PEA involved process and logistical challenges that required the team to adapt.
Explicitly connecting PEA findings to project implementation facilitates adaptive management.
We discuss these below.
PEA Findings Can Provide Evidence Valuable for Technical Strategy
In all nine cases, field teams used PEA findings to inform technical strategy and design of current activities, to uncover hidden political dynamics, and to contribute to future project design. Sometimes PEAs revealed new information and insights. However, in other cases the findings were not news to the project, but the PEA enabled teams to gather relevant information to provide a clear, coherent picture of a political or institutional issue and, more importantly, to indicate a potential path forward. Documenting and presenting the findings in a coherent, evidence-based way was compelling to project decision-makers and often their country counterparts as well. The evidence from the PEAs enabled activities to be more politically informed and more likely to be feasible than those developed based on technical criteria alone.
Applied PEA can influence the direction of a project in minor and major ways. For example, in Zambia, input from the Accountable Governance for Improved Service Delivery (AGIS) PEA contributed to capacity development plans for local governments that were better tailored to needs and more responsive to local priorities, with better ideas for leveraging existing resources. The project team remarked that the GESI questions helped them to understand the perspectives of diverse populations and how policies and potential project activities could impact the populations differently. The findings of the PEAs often uncovered hidden challenges or risks, or questioned project assumptions, which required the team to develop a mitigation plan. These PEAs enabled teams to triangulate perspectives and solidify how to prioritize actor engagement and which actions to pursue.
In another example, the Effective Water, Sanitation, and Hygiene Services (E-WASH) PEA in Nigeria revealed that the private sector providing water and sanitation services, which was not closely engaged in the early stages of the project, played a more vital role in the water distribution system than initially understood by the project team and USAID. It was clear that the project team needed to more intentionally engage private sector stakeholders in developing solutions to key sector challenges. The PEA resulted in adaptations to project activities and updates to service improvement plans in each state. It led to significant changes to the approach in two states where the PEA uncovered management, structural, and contextual issues that were hindering progress. After the process, the E-WASH team and RTI home office technical leadership remarked that it would have been valuable to do the PEAs as part of project start-up, but they were under significant time constraints.
“The PEA process has affected our project trajectory (introduced an emergency action plan), [and] influenced our messaging, re-emphasizing the need for a paradigm shift from Roads to Water.”
State Team Leader, E-WASH
In Senegal, the Governance for Local Development (GoLD) PEA results regarding fiscal and financial decentralization revealed a programmatic gap: improved service delivery at the local level required a stronger link with national government actors, civil society, donors, and others. The box below provides a more detailed example from Haiti.
Engaging Project Staff in PEAs Increases the Likelihood of TWP
PEAs often pushed staff to think differently about how their projects should approach the implementation process and to integrate knowledge regarding local systems. We found that PEAs helped to develop some basic TWP skills, such as the following:
identifying key stakeholders and assessing their likelihood to support or block project activities
considering how project activities and results can affect local power distributions and stakeholder incentives to support or block change
questioning underlying assumptions in the project or within specific activities
examining heretofore hidden or overlooked GESI factors
As the box below illustrates, team members learned to take the time to consider the underlying factors at play in any situation; to dig deeper and understand why things weren’t working as designed; to consider the incentives created by formal and informal rules; and to look for entry points to bring about change. The PEA team in the Uganda Governance, Accountability, Participation, and Performance (GAPP) project, co-funded by USAID and DFID, found that the PEA deepened their ability to see an institution not as one homogeneous body but as a group of “varied actors at very different points on the motivation continuum,” where there were both champions for change and persons with little interest in improving the institution, and in some cases disincentives to do so.
“The PEA provided us with a much deeper understanding of the real challenges of the Ministry which could not be solved superficially.”
Chief of Party, Uganda GAPP
As team members continually apply these skills, they have extended a TWP mindset to other areas of their work. Ongoing support from home office PEA experts, PEA champions within the project team, and reinforcing messages from project leadership all make a big difference. Many teams have learned to embed a TWP approach into their own internal processes. The USAID/Zambia AGIS project reports that the PEA with a GESI lens “helped the team to analyze issues and come up with workable solutions. For instance, when dealing with sensitive issues … the team was able to navigate through the situation.” Building on the example in the box, the E-WASH team’s next Pause & Reflect session after their PEA experience was full of enthusiastic debate on political economy developments in their technical areas and on adjustments made to planned activities.
Including GESI Helps to Uncover Hidden Power Dynamics
RTI intentionally inserted a GESI lens into its PEA methodology and training. As a result, many of the PEAs in our survey added one or more GESI-related questions. Incorporating GESI ensured that the PEA teams were inclusive and stakeholder discussions included diverse voices. Five of the nine PEAs reviewed explicitly included GESI, while one was a separate GESI assessment carried out in collaboration with the rapid PEA. Another PEA, undertaken before RTI added a GESI lens, uncovered a major GESI finding, which likely could have been further explored if the PEA had used a GESI lens more explicitly to investigate the situation. The research suggests that when GESI is not explicitly included, consistently referenced, and robustly considered, PEAs can miss out on important findings. Because of this experience, RTI home office staff adapted the PEA training and methodology to include specific attention to GESI within the PEA process.
A GESI assessment can be conducted independently in parallel with PEA or included in the PEA as a lens. For example, during the GERÉ project PEA in Haiti, a rapid GESI assessment was conducted simultaneously. The GESI workshop and field work mirrored the process of the PEA and led to discrete findings on the GESI assessment topic “How do local governments take gender equity and social inclusion into consideration and consider the needs of marginalized citizens?” The Nigeria E-WASH team reported that the PEA with a GESI lens taught them to consistently discuss and consider GESI internally and with their partners. A GESI lens also calls for ensuring diverse PEA teams, identifying specific GESI stakeholders, considering the GESI implications in the choice of PEA topics and the design of interview questions, and soliciting a wider set of interviewees and more diverse perspectives. Interviewers discuss their own biases, how to ask questions openly, and how to probe on topics of inclusivity. During sessions to synthesize study findings, teams identify the GESI implications as an integral component of the synthesis.
“GESI has been firmly grounded in the project activities alongside [public financial management] activities, and the overall response from the decision makers has been very positive. The project was able to influence inclusion of gender activities in planning process for the two sectors [education and health] during the recent planning launch.”
Team Leader, AGIS project
We found that diverse stakeholders, including interviewees, community organizations, and others, appreciated incorporating GESI directly in the PEA process. Doing so made them feel consulted, enriching all discussions and contributing significant nuance to the findings. Overall, the teams found that including GESI fundamentally strengthens the results of the PEA, deepening the exploration of power dynamics, questioning the context or project assumptions, and uncovering implications for various societal groups and individuals. Project teams appreciated the exposure to GESI; the Zambia AGIS team used a PEA as an impetus for developing a gender strategy. The box below illustrates how the Zambia team integrated applied PEA and GESI into their capacity assessments.
Connecting PEA Findings to Project Implementation Facilitates Adaptive Management
While the applied PEA process can be an intense learning experience for teams, it won’t have a long-term impact unless client/donor counterparts continue to support PEA and TWP and unless project teams are vigilant about following up on the evidence-based recommended actions. Our survey found that many teams found it challenging to turn the PEA results into politically informed actions, implement those actions, and measure the results. Sometimes, findings suggest interventions that are beyond the scope or abilities of a project, which may push donors, government counterparts, and other stakeholders to advocate for larger-scale changes. For example, several findings from the applied PEA with USAID’s Governance for Local Development (GoLD) project in Senegal were beyond the manageable interests of the project team, such as the competition between key actors in decentralization, a lack of coordination between donors, and the political influence of the fiscal transfer system. These high-level challenges were therefore shared with USAID and government of Senegal counterparts for consideration of possible action. Although it took some time, the USAID GoLD project is being adapted to add a new national component and key activities that grew out of the PEA findings and recommendations.
Experience shows that it is important that the team dedicate enough time to synthesize findings, develop clear next steps, and construct a work plan, including both technical and operational actions, to implement the PEA recommendations. For all PEAs, it is helpful to leverage existing structures, meetings, and deliverables (e.g., technical staff meetings, Pause & Reflect sessions, and the annual work plan) to hold the team accountable for the planned actions. Following are examples of methods some of the studied projects used for ensuring follow-through on their applied PEA findings:
USAID/E-WASH in Nigeria developed a tracking system for the recommendations and named a PEA champion who takes responsibility for maintaining a focus on the recommendations during staff meetings and Pause & Reflect sessions.
USAID/GoLD in Senegal developed an outreach and communication strategy to present findings to different audiences, e.g., client vs. government counterparts vs. civil society. This strategy also enabled the team to be selective about the recommended actions to undertake, understanding that quality is more important than quantity.
USAID’s Science, Technology, Research, and Innovation for Development (STRIDE) project in the Philippines set aside time in future Pause & Reflect sessions and work planning to reflect on PEA issues and potential responses/actions.
USAID’s Leadership, Empowerment, Advocacy, and Development (LEAD) project, USAID’s E-WASH in Nigeria, and USAID’s Landscape Conservation in Western Tanzania (LCWT) project used the PEA findings to question underlying assumptions, agree on specific actions with the client and key stakeholders, and update their work plan accordingly.
Challenges Abound—But So Do Solutions
Every PEA process had procedural, logistical, and implementation challenges given the sensitive topics and complex operating environments. Yet in each case, the PEA team pushed itself to think innovatively, whether by adapting the implementation plan, seeking additional sources of information, or adjusting the strategy.
In several instances among the PEAs surveyed, project leadership or the client grew impatient with what they perceived as lengthy processes or a significant investment of staff time. The PEA teams worked to explain the importance of the process and the value of the forthcoming results and to make adjustments according to leadership or client recommendations. In all the cases cited, project teams reported that the results were worth the investment of time because the findings and actions that grew out of the PEAs changed the direction of activities—in a couple of cases, fundamentally.
Project staff face many time-sensitive demands, from client meetings to work planning to monitoring visits to data gathering to reporting. In several instances, these demands pulled key staff away from the PEA process; on two occasions this occurred while the report was being finalized. In planning for subsequent PEAs, the home office staff supporting the PEAs worked to clarify the time commitment up front and to build the schedule around staff availability. In addition, RTI greatly streamlined its process to make it more efficient while retaining rigor.
Our survey led us to propose some lessons learned and operational recommendations.
Consider the Timing Carefully
If conducted at project start-up, applied PEA can result in improvements in project work plans that can lead to more effective and possibly earlier results. Mid-course PEAs enable project teams to adapt to changing contexts. PEAs conducted toward the end of a project’s lifecycle have a limited ability to contribute to adaptive management but can influence future programming. For example, one field team struggled during the applied PEA information synthesis process to differentiate between politically informed actions and the technical actions in the existing work plan. The timing forced them to try to reverse-engineer the rationale for their existing plan without fully incorporating the learning from the PEA findings.
In the case of Nigeria LEAD, the team undertook an applied PEA process in the last year of the project, as requested by USAID. This left little time for the team to act on most of the findings, yet they made some minor adaptations during implementation of final project activities. However, the PEA findings and recommendations appear to have informed USAID’s design of the follow-on project.
Be Cognizant of the Political Economy of PEA Team Members
Experts on the local context and political environment are often valuable additions to an applied PEA team. However, consider each team member carefully, and be aware of their political affiliations, existing relationships, and reputation to avoid potentially biased results. For example, one applied PEA team learned late in the process that a local expert harbored political ambitions, which made him reluctant to document politically sensitive results. The other team members therefore had to review the PEA findings and rewrite the draft report to ensure that all key information was accurately captured.
Be aware of how the involvement of international staff or experts may change the interview process and results. Their presence could discourage participants from sharing sensitive information that paints a negative picture of country systems, practices, or other sensitive information. However, in other instances, foreigners may be able to ask background or probing questions that may be harder for a local national team member to pose.
Recognize the Workload Burden on Field Staff
Engaging project staff in planning and conducting PEAs yields clear benefits: better work planning, more effective implementation, and a greater ability to think and work politically than a process driven by home office staff or consultants. However, project staff are already busy fulfilling their various management and implementation responsibilities. If participating in the PEA process pulls them away from existing project responsibilities for too long, they may become frustrated or lose interest. As one staff member commented after a lengthy baseline PEA process, “That was really interesting, but I never want to do it again. It was too much time!” Thus, finding ways to streamline the applied PEA process is important to keep project staff as active participants. The challenge is to develop rapid PEA processes that retain analytic integrity and result in meaningful findings and recommendations. It is also important to look for ways to embed PEA questions into ongoing project implementation activities.
Logistics Really Matter
Good logistics are critical to ensure effective discussions and enable daily reviews and final synthesis of the results. For example, one PEA team did not have the correct address for a critical meeting with a government entity, so they arrived an hour late to the discussion. This resulted in a truncated discussion with that counterpart and his team and reduced the credibility of the project team and process. In another instance, the local team organized too many interviews in one day, some of which had to be rescheduled. As with any study and data collection, adequate planning and coordination are essential to success.
Start on a Positive Note
Set the right tone at the beginning of a PEA discussion by asking what is working well and who is championing positive movement forward. This positive tone helps the team to understand the context as consisting of supportive as well as constraining factors and helps to establish rapport and a willingness to discuss difficult challenges. As a result of this approach, the PEA teams found that counterparts offered positive feedback and suggested solutions even for the most challenging topics.
Conclusions and Recommendations
We recognize that our survey of nine PEAs constitutes a limited sample of experience. However, we offer the following thoughts that highlight the promise of PEA as well as the need for continued learning and research to address the challenges we outlined at the start of this brief. We return to several of the challenges noted at the outset.
No Single PEA Approach Serves All Purposes
Our sample illustrates the use of various PEA methodologies: baseline and rapid studies, GESI, conflict analysis integration, and embedding political economy questioning into a Pause & Reflect session or into other technical assessments. These PEA varieties were conducted at different stages in the project cycle: during the technical design, during a project inception period, as part of start-up work planning, at mid-project review and adaptation, and at project conclusion to inform future programming. The variation in methods and project-cycle application reinforces the fact that there is no single “best” approach to PEAs. The PEA label is a broad one that subsumes numerous ways of exploring PE issues for a range of purposes. As with most tools that support adaptation, the successes and contributions speak to the strength and breadth of the PEA approach as well as the flexibility and value that different PEA methodologies can bring.
PEA Is a Critical Tool in the Adaptive Management Toolbox
Previous research has demonstrated the power of PEAs to inform learning and adapting. A successful applied PEA is problem-driven, specific, and participatory, and it generates recommendations that are realistic and actionable. During implementation, almost every team struggled with challenges—such as skepticism from project leadership or technical staff, difficulty obtaining stakeholder buy-in, or significant logistical issues—that required the team to adapt the PEA topic, approach, or timeline. Our survey also suggests that PE findings are only as valuable as the activities planned and implemented in response to them. Ensuring follow-up implementation of evidence-based actions is the true application of an applied PEA. As adaptive programming and management become more widespread, PEA can be a go-to tool for inspiring learning-focused, data-driven, flexible programming.
More Effective if in the Hands of Project Teams
RTI’s applied PEA examples were primarily driven by local project teams with technical support from the home office. As a result, we believe the analyses went beyond surface-level findings, and the recommendations were realistic given the team members’ in-depth knowledge of the sector, actors, constraints, and opportunities. The recommendations also had a better chance of implementation given this lead role than would a process driven either by the home office or by consultants.
Balancing Analytical Rigor with Flexibility
As indicated, project teams in developing countries where RTI operates are busy implementing ambitious project work plans. Every RTI project team remarked on how interesting and insightful the applied PEA process was but added that the time demands were challenging. It is critical to find ways to ensure rigor while streamlining the process to continue to engage local project teams.
Continued Demand from Clients
Applied PEA is increasingly included in solicitations from donors, particularly USAID and DFID. This enables organizations like RTI that implement projects to push the envelope on applied PEA approaches, findings, and recommendations. In all nine examples, the client was very interested in the findings and recommended solutions and supported the project teams to adapt project activities as a result. In two cases, the results were most likely used to inform follow-on programming. The practice will continue to grow if this encouragement continues. Simultaneously, implementers need to include this activity in proposals to expand the number of sector teams on the donor side who are exposed to the programmatic insights and benefits PEA can provide.
More Documentation and Learning Are Needed
Our experience and research suggest that applied PEAs provide valuable evidence for strengthening evidence-based, adaptive international development programming. It will be important to continue experimentation and learning on various approaches. RTI found aligning PEA with GESI and conflict assessments useful. Further testing and research could include embedding PEA in a value chain assessment, market systems study, or public financial management activity to increase learning on its value in the economic growth arena. As PEA practice evolves, documenting and sharing learning will help the international development community to implement more effective, data-driven development programming.
The authors gratefully acknowledge financial support for this study from an RTI independent research and development grant. The nine political economy analyses discussed in this paper were project funded. We appreciate the helpful comments from two anonymous reviewers. The views expressed are solely those of the authors.
Cover photo credit: Patrick Adams for RTI International