

Backed by a team of experienced professionals, our strategic services meet the needs of clients of all types and sizes, from small startups to large development actors, and deliver lasting change with measurable growth. Please get in touch with us today to learn how PEA Consultancy can partner and collaborate with you.

The Evaluation Context.

Development contexts continue to respond to transitory needs through various layers of integration. Acute disparities in gender and settlement status call for focused programming founded on accurate data and evidence.

Why Evaluate with PEA.

Evaluations aim to identify and quantify the value, merit and/or worth of activities, projects and programmes. In defining the boundaries of an evaluation, PEA Consultancy goes beyond the usual rubric of "What is…" to accommodate the "So what?" and "What now?" questions while gathering data on activities, projects and programmes designed and implemented by our partners and collaborators, so as to create and sustain real change in the communities we work with.
Every evaluation exercise we take on is grounded in our goal of "making data and evidence play a critical role in changing lives". Ultimately, we work toward supporting stakeholder learning, innovation and growth strategies, presenting robust and rigorous methodologies bounded by defined questions that answer to your specific requirements as a client, partner and collaborator.

Our Approach to Evaluations.

Development programming requires actions that are responsive to the gaps and needs of the communities we work with. This is our overarching hypothesis.  

PEA specialises in deploying theory-based process and outcome evaluations to best support our clients and partners.

Operationalising these evaluations entails a hybrid of realist evaluation (context-mechanism-outcome (CMO) patterns) and theory-of-change (ToC) evaluation (contribution analysis).

The delivery frameworks are participatory, referred to in our proposal documents as the Collaborative Outcome Monitoring/Reporting (COM/R) approach.


PEA implements the respective evaluation activities through the following six steps:

  1. Scoping and planning workshop. At the onset of every evaluation, all evaluation stakeholders attend a workshop to go over the scope of the evaluation, define its boundaries, and identify specific evaluation questions, data requirements and data collection methods. The responsible implementation map and logical framework are then reviewed and clarified, followed by the identification of secondary data sources and repositories and the development of a workable evaluation plan. Through this workshop, the evaluation also creates a technical working group (TWG) to agree on the scope of the evaluation. The process culminates in the creation of a shared desk-review folder; the development and refinement of COM/R guidelines; a summary of the evaluation methodology; the development of data extraction sheets, a key stakeholder mapping tool and key informant interview guides; and the creation of a performance story reporting template and a final case/country reporting template.

  2. Desk reviews (data trawl). After the scoping and planning workshop, its deliverables allow PEA to conduct desk reviews. The overall objective of this data trawl is to identify key themes around context characteristics that require programme adaptations, review implementation mechanisms, and map out initiative outcomes (highlighting cross-cutting challenges and facilitating factors, as well as best practices that can be adopted and adapted to the implementation frames of the activity, project or programme under evaluation).

  3. In-depth reviews (key informant interviews) through a social inquiry. Using the key themes and information gaps identified in the desk review, PEA refines the scope and content of the social inquiry. Our consultants deploy key informant data collection instruments at this stage.

  4. Data analysis and integration. We create both quantitative and qualitative repositories. The associated analyses produce descriptive and inferential statistics (point estimates) to qualify and quantify performance against the goals of the activity, project or programme under evaluation.

  5. Outcomes panel. PEA proposes the creation of stakeholder panels to facilitate a deeper understanding of the scope of the evaluation, the data requirements and the tools. Ultimately, PEA uses the same panels to validate outcomes and define appropriate programme improvement recommendations.

  6. Summit workshop. Our evaluation team presents key findings and recommendations at a large workshop involving broad participation by activity, project or programme stakeholders.
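To make the analysis in step 4 concrete, here is a minimal sketch of producing a point estimate with a normal-approximation confidence interval. The `point_estimate` helper and the indicator values are hypothetical illustrations, not part of PEA's actual toolkit:

```python
import math
import statistics

def point_estimate(sample: list[float], z: float = 1.96) -> tuple[float, float, float]:
    """Return the sample mean and an approximate 95% confidence interval.

    Uses a normal approximation; for very small samples a t-interval
    would be more appropriate.
    """
    mean = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(len(sample))  # standard error of the mean
    return mean, mean - z * se, mean + z * se

# Hypothetical indicator values, e.g. scores against a programme target
scores = [6.0, 7.5, 5.5, 8.0, 6.5, 7.0, 6.0, 7.5]
mean, low, high = point_estimate(scores)
print(f"Point estimate: {mean:.2f} (95% CI {low:.2f} to {high:.2f})")
```

The interval around the estimate is what lets an evaluation report performance against a goal with a stated degree of confidence rather than as a bare number.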

We are deliberate about evaluative thinking.
