01

How we see research

Research begins where instruments fail. The gap between the question a researcher writes and the question a respondent actually answers is where most measurement error lives. Survey design assumes shared frames of reference. Field reality is messier — a question about “household income” lands differently in a household where income is irregular, shared with extended kin, or earned in kind.

We build instruments by iterating with frontline workers and community members before enumerator training, not after. The point is not to make data “better” in the abstract — it is to narrow the distance between the question we wrote and the question respondents heard.

The measurement gap — where most error hides.
02

The size of the practice

Pinpoint Ventures is small on purpose. We take a limited number of engagements each year because each one needs the kind of attention that scales poorly. A research design we revise once is a research design we have not lived with long enough.

Working at this size means each project shapes the others. The aquaculture welfare work informs how we think about indicator vocabularies in emergent sectors. The climate-health work shapes how we read administrative data. The discrimination work refines how we measure what is officially uncounted. The cumulative effect — not the volume — is what makes the practice useful.

Depth scales poorly — and that is the point.
03

From field to frame to policy

The path from a finding to a policy decision is rarely a straight line. The most useful research outputs are not findings but frames — ways of seeing a problem that survive after the report is filed.

Our work moves through five stages: field, method, evidence, framework, policy. Each stage shapes the next, and the framework usually loops back to refine what counts as field. The “learning layer” is the part most evaluation contracts ignore — and the part that determines whether the work actually changes anything.

Field → method → evidence → framework → policy, with a learning loop back.
04

Frameworks that last

Most evaluation reports are read once. Most theories of change are pinned to office walls and forgotten. Both are normal — and both reflect how the development sector mistakes deliverables for outcomes.

A useful framework outlives the engagement. It survives staff turnover. It informs the next funding round. It changes the way an organisation argues about its own work. We design MEL systems, indicator architectures, and learning layers with this longevity in mind. If our deliverable is useful only while we are in the room, we have failed.

What gets filed vs. what gets used.
05

Methodologically plural

The fight between qualitative and quantitative research is mostly an artefact of academic departments. In the field, the question decides the method.

We use RCTs and quasi-experimental designs where the counterfactual matters. We use photovoice and life-history interviews where lived experience cannot be aggregated into a number. We use Human-Centred Design for instrument iteration. We use historical and policy analysis to read structural conditions. The discipline is in choosing — not in defending one method against another.

One question. Many ways in.

Want to see this in action?

Our work is where the philosophy meets the field.
