Data Analytics · Higher Education

Measuring What Matters: Analytics-Driven Programme Evaluation for a Leading Postgraduate Institution

A multi-engagement data analytics mandate spanning four years, three student cohorts, and two parallel evaluation tracks — delivering validated impact evidence and structured development roadmaps for a large-scale postgraduate education institution.

2 · Parallel Engagements
3 · Student Cohorts Analysed
5 · Competency Domains
4 · Structured Deliverables
The Engagements

TWO MANDATES. ONE ANALYTICAL FRAMEWORK.

The institution commissioned two concurrent but distinct evaluation engagements — one focused on supervisor competency development, the other on student programme experience. Hinc Group designed a unified analytical architecture that served both tracks while maintaining methodological rigour across each.

Engagement 1 · 2023 – 2025

Cross-Sectional Cohort Analysis of Supervisory Support Programme Impacts

A multi-phase, evidence-based evaluation of supervisory capacity-building programmes delivered to postgraduate research supervisors across three consecutive cohorts. The engagement combined systematic literature review, validated survey design, and cohort-level statistical analysis to produce an actionable competence-building roadmap.

Engagement 2 · 2022 – 2024

End-of-Programme Retrospective Student Cohort Evaluations — M&D Research Capacitation

A retrospective evaluation of Masters and Doctoral research capacitation programmes across three student cohorts. The engagement assessed programme effectiveness from the student perspective, identifying gaps and informing the development of an evidence-based M&D capacitation improvement plan.

The Challenge

COMPLEX DATA. DUAL STAKEHOLDERS. REAL DECISIONS.

Programme evaluation in higher education is rarely straightforward. The institution faced four compounding analytical challenges that demanded a purpose-built data strategy rather than a standard survey approach.

Multi-Cohort Data Complexity

Three consecutive cohorts of programme participants generated heterogeneous data sets spanning different programme iterations, delivery formats, and participant profiles — requiring a unified analytical framework.

Response-Shift Bias

Traditional pre-post evaluation designs fail to account for the evolution of participants' understanding of competency definitions through programme participation, systematically understating programme impact.

Dual Stakeholder Perspectives

The institution required simultaneous insight into both supervisor-side competency development and student-side programme experience — two distinct analytical tracks requiring coordinated methodology.

Actionable Output Requirement

Raw findings were insufficient. The institution required structured competence-building plans and programme improvement roadmaps that could be directly operationalised by programme administrators.

Methodology

A SIX-PHASE ANALYTICAL PROCESS

From literature review to actionable roadmap — every phase was designed to build on the last, ensuring that the final deliverables were grounded in validated data and directly usable by programme administrators.

01

Literature Review & Framework Development

Weeks 1–2

Conducted a systematic review of postgraduate supervision literature to identify the competencies most strongly associated with successful supervisory practice. Five validated competency domains emerged as the analytical foundation.

Deliverable: Competency Domain Framework
02

Survey Instrument Design & Validation

Weeks 3–4

Designed a four-section survey instrument employing a retrospective post-then-pre methodology. Underwent two-stage validation: expert panel review for content validity, followed by pilot testing with a non-study sample to verify question clarity, flow, and technical functionality.

Deliverable: Validated Survey Instrument
03

Ethical Clearance & Deployment

Week 5

Completed ethical clearance in compliance with institutional data protection policies and POPIA. Deployed the survey via a secure, institution-approved platform with full participant anonymisation protocols in place.

Deliverable: Ethical Clearance Certificate
04

Data Collection & Quality Assurance

Weeks 6–9

Implemented a four-week data collection window with a structured two-reminder protocol targeting ≥60% response rate. Automated validation checks flagged straight-line responses and inconsistencies; systematic data cleaning addressed missing values and outliers.

Deliverable: Clean, Validated Dataset
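The quality-assurance checks described in this phase can be sketched in a few lines of pandas. The column names, thresholds, and toy data below are illustrative assumptions, not the engagement's actual instrument:

```python
import pandas as pd

# Toy responses: one row per participant, 1-5 Likert ratings on three
# illustrative items (the real instrument covered five competency domains).
responses = pd.DataFrame({
    "participant_id": ["p1", "p2", "p3", "p4"],
    "item_1": [4.0, 3.0, 3.0, None],
    "item_2": [4.0, 5.0, 3.0, 2.0],
    "item_3": [4.0, 2.0, 3.0, 5.0],
})
item_cols = ["item_1", "item_2", "item_3"]

# Straight-line check: identical rating on every answered item.
responses["straight_line"] = responses[item_cols].nunique(axis=1).eq(1)

# Completeness check: share of core items answered by each participant.
responses["completeness"] = responses[item_cols].notna().mean(axis=1)

# Flag rows needing manual review before the dataset is declared clean.
flagged = responses[responses["straight_line"] | (responses["completeness"] < 0.95)]
print(flagged["participant_id"].tolist())  # → ['p1', 'p3', 'p4']
```

In practice the flagged rows would be routed to the systematic cleaning step rather than dropped automatically.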
05

Cohort Analysis & Statistical Modelling

Weeks 10–12

Conducted cross-sectional and longitudinal cohort analyses across the five competency domains. Quantified self-perceived competency gains using the post-then-pre design, disaggregated by cohort year, programme type, and demographic variables.

Deliverable: Cohort Analysis Report
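The disaggregation step amounts to grouping participant-level gain scores by cohort and domain. A minimal sketch, with hypothetical labels and values invented for the example:

```python
import pandas as pd

# Hypothetical participant-level gain scores (post minus retrospective pre),
# labelled by cohort year and competency domain; all values are invented.
gains = pd.DataFrame({
    "cohort": [2023, 2023, 2024, 2024, 2025, 2025],
    "domain": ["pedagogical", "relational"] * 3,
    "gain": [1.2, 0.8, 1.0, 1.1, 0.9, 1.4],
})

# Cross-sectional view: mean gain per cohort year, one column per domain.
summary = gains.groupby(["cohort", "domain"])["gain"].mean().unstack()
print(summary.round(2))
```

The same pattern extends to the other disaggregation variables (programme type, demographics) by adding them to the grouping keys.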
06

Competence-Building Plan & Reporting

Weeks 13–16

Synthesised quantitative findings and qualitative open-ended feedback into structured competence-building plans for supervisors and programme improvement roadmaps for M&D capacitation initiatives. All deliverables formatted for direct operationalisation.

Deliverable: Competence-Building & Improvement Plans
Analytical Innovation

THE RETROSPECTIVE POST-THEN-PRE DESIGN

Standard pre-post evaluation designs introduce a systematic measurement error: participants' understanding of competency definitions changes through programme participation, making their pre-programme self-assessment retrospectively inaccurate. This phenomenon — response-shift bias — causes conventional designs to understate programme impact.

Hinc Group deployed a retrospective post-then-pre design: participants first rated their current (post-programme) competency level, then retrospectively rated their pre-programme level from the same conceptual vantage point. This eliminates response-shift bias and produces a more accurate, defensible measure of programme-induced competency gain.

Step 1 · Rate Current Competency
The participant rates their competency level now, after programme participation, from a fully informed conceptual baseline.

Step 2 · Retrospectively Rate Pre-Programme Level
Using the same conceptual understanding, the participant recalls and rates their competency level before the programme.

Step 3 · Calculate Gain Score
The difference between the Step 1 and Step 2 ratings represents the true perceived competency gain, free from response-shift bias.
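The three steps reduce to a simple difference score. A minimal sketch, assuming pandas and hypothetical ratings on a 1–5 scale:

```python
import pandas as pd

# Hypothetical ratings on a 1-5 scale; "post" is collected first (Step 1),
# "retro_pre" afterwards from the same informed vantage point (Step 2).
ratings = pd.DataFrame({
    "participant_id": ["p1", "p2", "p3"],
    "post": [4, 5, 4],
    "retro_pre": [2, 3, 4],
})

# Step 3: the gain score is the simple difference, free of response-shift
# bias because both ratings share the post-programme frame of reference.
ratings["gain"] = ratings["post"] - ratings["retro_pre"]
print(ratings["gain"].tolist())  # → [2, 2, 0]
```

Collecting both ratings in a single sitting is what keeps the two self-assessments on a common conceptual scale.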
Analytical Framework

FIVE SUPERVISORY COMPETENCY DOMAINS

Derived from a systematic review of postgraduate supervision literature and validated through expert consultation, these five domains formed the analytical backbone of both evaluation tracks.

Domain 01

Pedagogical Competencies

Teaching, learning, and assessment skills in the context of research supervision — including constructive feedback, adaptive instruction, and student learning facilitation.

Domain 02

Relational Competencies

Establishing and sustaining productive supervisor-student relationships through communication, emotional support, and the creation of a motivating environment.

Domain 03

Functional Competencies

Administrative, procedural, and project management knowledge — including institutional policy, research ethics procedures, and supervision workflow management.

Domain 04

Domain & Methodological Knowledge

Subject matter expertise and research design proficiency enabling supervisors to guide students through advanced disciplinary research techniques.

Domain 05

Cultural & Social Competencies

Cultural sensitivity and inclusive supervision practices enabling effective engagement with diverse student populations across social and cultural contexts.

Validation Standard

Two-Stage Validation Process

Expert panel content validity review
Pilot testing with non-study sample
Automated response quality checks
Systematic data cleaning protocols
Quality Standards

RIGOROUS DATA QUALITY BENCHMARKS

Every engagement is governed by defined quality thresholds. These metrics are not aspirational targets — they are contractual commitments built into the project execution plan.

≥60% · Target Response Rate (census approach: all programme participants invited)
>95% · Data Completeness (core competency assessment items)
<5% · Data Cleaning Rate (inconsistent responses requiring intervention)
4 Weeks · Collection Window (structured two-reminder protocol)
Deliverables

FOUR STRUCTURED DELIVERABLES

Each deliverable is designed for direct operational use — not as a report to be filed, but as a working instrument that drives programme decisions.

Cross-Sectional Impact Report

Comprehensive report identifying direct and indirect impacts of supervisory capacity training on self-perceived competence across all five domains, disaggregated by cohort year.

Postgraduate Supervisor Competence-Building Plan

Structured intervention roadmap guiding future supervisory development initiatives — identifying priority competency gaps and recommended programme focus areas.

M&D Student Cohort Evaluation Report

Retrospective evaluation report covering three student cohorts (2022–2024), assessing programme effectiveness from the student perspective with cohort-level trend analysis.

M&D Capacitation Improvement Plan

Evidence-based programme improvement plan guiding future development of Masters and Doctoral research capacitation support initiatives.

Data Governance

POPIA-COMPLIANT. INSTITUTION-APPROVED. FULLY ANONYMISED.

All data collection, storage, and processing procedures were designed in strict compliance with the Protection of Personal Information Act (POPIA) and the institution's own data governance policies. Participant confidentiality was protected through immediate anonymisation of all collected data before any analysis commenced.

The survey platform was institution-approved, password-protected, and restricted to authorised project team members. No personally identifiable information was included in any report, presentation, or publication arising from the engagement.

POPIA Compliance
All procedures aligned with the Protection of Personal Information Act
Ethical Clearance
Formal ethical clearance obtained before data collection commenced
Secure Storage
Password-protected servers with access restricted to authorised personnel
Immediate Anonymisation
All data anonymised before analysis — no PII in any deliverable
Commission an Engagement

DOES YOUR INSTITUTION NEED EVIDENCE-BASED PROGRAMME EVALUATION?

Hinc Group brings validated analytical methodology, sector-specific expertise, and a commitment to actionable outputs to every higher education data analytics engagement. Whether you need a single-track evaluation or a multi-cohort longitudinal study, we design the right framework for your institutional context.
