What we do
Producing the evidence that education systems need to make better decisions — and translating it into practical guidance that drives action.
Education systems make better decisions when they have better evidence. In English language teaching, the evidence base is growing — but it remains unevenly distributed, often disconnected from the policy questions that matter most, and rarely translated into formats that decision-makers can act on. Ministries and institutions frequently invest in programmes without baseline data, evaluate impact using instruments that measure activity rather than outcomes, and struggle to connect monitoring data to programme improvement in real time.
Closing this gap requires more than academic research. It requires applied research designed for decision-making: needs analyses that inform programme design, monitoring frameworks that track implementation quality, impact evaluations that distinguish what changed from what was always going to happen, and evidence syntheses that make existing knowledge usable. The challenge is producing research that is rigorous enough to be credible and practical enough to be used. TELT’s research practice sits at that intersection — generating evidence that strengthens policy, improves programme design, and supports education systems to learn from their own implementation.
Designing MEL frameworks that track implementation quality, measure outcomes, and feed learning back into programme management in real time.
Conducting rigorous impact evaluations that isolate programme effects from external factors — using mixed-methods designs appropriate to the context and evidence needs.
Producing research that directly informs education policy — from landscape analyses and benchmarking studies to targeted briefs for ministry decision-makers.
Conducting national and institutional needs analyses that map the current state of English language education and identify priorities for intervention.
Reviewing and synthesising existing research to make the current evidence base accessible and actionable for programme designers and policymakers.
Assessing the viability of proposed programmes, approaches, or technologies before full-scale investment — examining institutional capacity, infrastructure, cost, and risk.
Designing data collection and analysis systems that track learner progress, teacher development outcomes, and programme effectiveness across digital and face-to-face delivery.
We design and deliver mixed-methods research that generates actionable evidence for policy and practice. Our work ranges from large-scale national evaluations to focused scoping studies, feasibility assessments, and rapid needs analyses. In every case, the research question is shaped by the decision it needs to inform — not by methodological preference.
Our expertise includes monitoring, evaluation, and learning (MEL) framework design, baseline and endline studies, curriculum and programme evaluations, gender-focused research, impact assessments, and learning analytics. We combine quantitative data with qualitative insight to illuminate both measurable outcomes and the more nuanced changes in attitudes, behaviour, and institutional culture that determine whether reforms take hold.
Our research is not an academic exercise. Findings are translated into practical recommendations, strategic guidance, and capacity-building processes that strengthen decision-making at classroom, institutional, and system level. Where education systems need to build their own evaluation capacity, we design and deliver training in research methods, data analysis, and evidence-informed planning.
Case Studies