A multi-engagement data analytics mandate spanning five years, three student cohorts, and two parallel evaluation tracks — delivering validated impact evidence and structured development roadmaps for a large-scale postgraduate education institution.
The institution commissioned two concurrent but distinct evaluation engagements — one focused on supervisor competency development, the other on student programme experience. Hinc Group designed a unified analytical architecture that served both tracks while maintaining methodological rigour across each.
A multi-phase, evidence-based evaluation of supervisory capacity-building programmes delivered to postgraduate research supervisors across three consecutive cohorts. The engagement combined systematic literature review, validated survey design, and cohort-level statistical analysis to produce an actionable competence-building roadmap.
A retrospective evaluation of Masters and Doctoral research capacitation programmes across three student cohorts. The engagement assessed programme effectiveness from the student perspective, identifying gaps and informing the development of an evidence-based M&D capacitation improvement plan.
Programme evaluation in higher education is rarely straightforward. The institution faced four compounding analytical challenges that demanded a purpose-built data strategy rather than a standard survey approach.
Three consecutive cohorts of programme participants generated heterogeneous data sets spanning different programme iterations, delivery formats, and participant profiles — requiring a unified analytical framework.
Traditional pre-post evaluation designs fail to account for the evolution of participants' understanding of competency definitions through programme participation, systematically understating programme impact.
The institution required simultaneous insight into both supervisor-side competency development and student-side programme experience — two distinct analytical tracks requiring coordinated methodology.
Raw findings were insufficient. The institution required structured competence-building plans and programme improvement roadmaps that could be directly operationalised by programme administrators.
From literature review to actionable roadmap — every phase was designed to build on the last, ensuring that the final deliverables were grounded in validated data and directly usable by programme administrators.
Conducted a systematic review of postgraduate supervision literature to identify the competencies most strongly associated with successful supervisory practice. Five validated competency domains emerged as the analytical foundation.
Designed a four-section survey instrument employing a retrospective post-then-pre methodology. The instrument underwent two-stage validation: expert panel review for content validity, followed by pilot testing with a non-study sample to verify question clarity, flow, and technical functionality.
Completed ethical clearance in compliance with institutional data protection policies and POPIA. Deployed the survey via a secure, institution-approved platform with full participant anonymisation protocols in place.
Implemented a four-week data collection window with a structured two-reminder protocol targeting ≥60% response rate. Automated validation checks flagged straight-line responses and inconsistencies; systematic data cleaning addressed missing values and outliers.
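Straight-line detection of this kind is typically a simple column-wise uniqueness check. The sketch below illustrates one way to flag such responses in pandas; the item names, scale, and sample values are hypothetical, not the actual instrument's.

```python
import pandas as pd

# Hypothetical Likert-item responses (1-5 scale) for three survey items.
# Column names are illustrative only, not the engagement's instrument.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "item_1": [4, 3, 5, 3],
    "item_2": [4, 5, 2, 3],
    "item_3": [4, 2, 4, 3],
})

likert_cols = ["item_1", "item_2", "item_3"]

# Flag straight-lining: an identical answer on every Likert item.
responses["straight_line"] = responses[likert_cols].nunique(axis=1) == 1

# Flag missing values so cleaning rules can be applied consistently.
responses["has_missing"] = responses[likert_cols].isna().any(axis=1)

flagged = responses[responses["straight_line"]]
```

In practice a flag like this would feed a review step rather than automatic deletion, since uniform answers can occasionally be genuine.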
Conducted cross-sectional and longitudinal cohort analyses across the five competency domains. Quantified self-perceived competency gains using the post-then-pre design, disaggregated by cohort year, programme type, and demographic variables.
Synthesised quantitative findings and qualitative open-ended feedback into structured competence-building plans for supervisors and programme improvement roadmaps for M&D capacitation initiatives. All deliverables formatted for direct operationalisation.
Standard pre-post evaluation designs introduce a systematic measurement error: participants' understanding of competency definitions changes through programme participation, so the baseline self-assessment is made against a different internal standard than the post-programme one, and the two ratings are not directly comparable. This phenomenon — response-shift bias — causes conventional designs to understate programme impact.
Hinc Group deployed a retrospective post-then-pre design: participants first rated their current (post-programme) competency level, then retrospectively rated their pre-programme level from the same conceptual vantage point. This counters response-shift bias and produces a more accurate, defensible measure of programme-induced competency gain.
Derived from a systematic review of postgraduate supervision literature and validated through expert consultation, these five domains formed the analytical backbone of both evaluation tracks.
Teaching, learning, and assessment skills in the context of research supervision — including constructive feedback, adaptive instruction, and student learning facilitation.
Establishing and sustaining productive supervisor-student relationships through communication, emotional support, and motivational environment creation.
Administrative, procedural, and project management knowledge — including institutional policy, research ethics procedures, and supervision workflow management.
Subject matter expertise and research design proficiency enabling supervisors to guide students through advanced disciplinary research techniques.
Cultural sensitivity and inclusive supervision practices enabling effective engagement with diverse student populations across social and cultural contexts.
Every engagement is governed by defined quality thresholds. These metrics are not aspirational targets — they are contractual commitments built into the project execution plan.
Each deliverable is designed for direct operational use — not as a report to be filed, but as a working instrument that drives programme decisions.
Comprehensive report identifying direct and indirect impacts of supervisory capacity training on self-perceived competence across all five domains, disaggregated by cohort year.
Structured intervention roadmap guiding future supervisory development initiatives — identifying priority competency gaps and recommended programme focus areas.
Retrospective evaluation report covering three student cohorts (2022–2024), assessing programme effectiveness from the student perspective with cohort-level trend analysis.
Evidence-based programme improvement plan guiding future development of Masters and Doctoral research capacitation support initiatives.
All data collection, storage, and processing procedures were designed in strict compliance with the Protection of Personal Information Act (POPIA) and the institution's own data governance policies. Participant confidentiality was protected through immediate anonymisation of all collected data before any analysis commenced.
The survey platform was institution-approved, password-protected, and restricted to authorised project team members. No personally identifiable information was included in any report, presentation, or publication arising from the engagement.
Hinc Group brings validated analytical methodology, sector-specific expertise, and a commitment to actionable outputs to every higher education data analytics engagement. Whether you need a single-track evaluation or a multi-cohort longitudinal study, we design the right framework for your institutional context.