2026 RESEARCH WHITEPAPER  ·  HIGHER EDUCATION

Beyond the Response Rate

How higher education can finally close the loop between surveys and action. Based on qualitative interviews with 15 IR professionals at 15 U.S. institutions.

15 Institutions
15 IR Professionals
20+ Pages

Free  ·  No credit card  ·  20+ pages

Inside the Whitepaper

Across 15 U.S. institutions, IR teams of 2 to 5 people are managing anywhere from 5 to 50+ surveys a year while response rates have quietly dropped from 25–35% a decade ago to roughly 15% today. The problem isn't effort. It's that the ecosystem is still optimised for collection, not for what comes after.

This whitepaper draws on in-depth interviews with 15 IR and assessment professionals to document what "after" actually looks like: the analysis that consumes weeks, the reports that don't drive decisions, and the AI tools that help with the wrong things.

What the whitepaper explores:
  • The hidden costs of manual data analysis workflows across different team sizes
  • Why response rates have declined structurally and why technical fixes don't reach the root cause
  • How survey fatigue becomes an institution's own coordination failure
  • Where analyst time actually goes and why the analysis phase is the real bottleneck
  • Why AI is being used at the wrong layer of the workflow

Who This Research Is For

Three audiences will find something different in this whitepaper — and something they've been waiting for someone to say out loud.

IR Professionals & Assessment Practitioners

Seeking validation, vocabulary, and frameworks for the challenges they navigate on a daily basis — survey overload, analysis bottlenecks, and reports that don't drive decisions.

Institutional Leadership

Vice provosts, chief data officers, and CIOs who commission surveys and want to understand the full picture of what makes survey programmes succeed — and why so many don't.

Survey Platform Designers & EdTech Developers

A ground-level view of where current tools fail, what practitioners actually need from the next generation of survey infrastructure, and where product investment will have the most impact.

By the Numbers

15%
Average response rate today
Down from 25–35% a decade ago at the same institutions running the same surveys
10×
Hours on post-collection work
For every hour spent designing a survey — the hidden cost nobody budgets for
2–5
Staff on IR teams
At most institutions, regardless of whether they manage 5 or 50+ surveys per year
0
AI natively embedded
Every AI interaction is a manual export and context switch — nobody has it built in

Six patterns across every institution

Every institution uses different tools, serves different populations, runs at a different scale. Their frustrations are identical. That uniformity is the finding.

Response Rates
Structural decline with no technical fix

Participation has collapsed across all institution types. Every strategy — incentives, personalised invitations, QR codes — has been tried. None produced lasting improvement. It's a trust problem, not a delivery problem.

Survey Fatigue
An institution's own coordination failure

Multiple departments send surveys in the same week. Students receive overlapping questions from different offices. No shared calendar, no deduplication layer, no cross-department authority. The problem is structural.

Analysis Burden
Where capacity actually disappears

Open-text cleaning, longitudinal data stacking, and report generation consume the majority of analyst time. Current platforms optimise for survey construction. The post-collection phase is largely unautomated.

Platform Gaps
Built for design, not workflow

Data tied to personal accounts is a governance choice. Manual longitudinal stacking is a tool gap. When a respondent drops off midway, the partial data disappears. None of this is inevitable.

AI Adoption
Informal, widespread, and at the wrong layer

Every participant uses AI for question drafting and report writing. None have AI embedded in their survey tool. Every AI interaction requires exporting data and switching context. It adds friction rather than reducing it.

Data to Action
Reports produced. Decisions rarely follow.

Surveys are run, reports are filed, and institutional change rarely results. This erodes respondent trust and suppresses future participation. Closing this loop requires leadership commitment — not better software.

How the Research Was Conducted

Semi-structured, sixty-minute conversations with IR and assessment professionals across five institution types — liberal arts colleges, community colleges, mid-size regional universities, large research universities, and a higher education consultancy.

15
Institutions Interviewed
~60
Minutes per Interview
5
Institution Types
8
U.S. States Represented

Transcripts were reviewed and coded thematically. Findings are organised into universal patterns (present in all or nearly all interviews), emerging signals, and distinctive observations — highly specific findings with structural significance. All participants and institutions are fully anonymised.

Voices from the Field

"For every hour we spend designing a survey, we spend ten on cleaning, coding, and getting it into a format anyone can read."

— Research Analyst · Mid-Size University

"The open-text cleaning is the single biggest pain point. Thousands of employer name spellings. All manual. Every single cycle."

— Data Analyst · Research University

"Low response rates are structural now. Most of us have moved from trying to solve it to managing expectations around it."

— IR Director · Liberal Arts Institution

Five Recommendations

These are not speculative futures. They are informed by the people who live inside survey workflows every day.

Post-Collection Automation

Build analysis, cleaning, and reporting directly into the survey workflow. The design phase is not where time is lost — every hour in design costs ten in post-collection.

Native AI, Not a Bolt-On

Assist with question logic, phrasing, and open-text analysis in-platform. Every AI interaction that requires an export is friction, not a feature.

Institutional Survey Calendars

Coordinate timing across departments to cut fatigue and redundant outreach. Survey overload is an emergent property of decentralised authority — it requires a coordination layer.

Mobile-First, In-Context Delivery

Meet respondents inside LMS and SSO environments to lift response rates. Surveys built for desktop and distributed by email are structurally misaligned with how students live.

Departmental Accounts & Question Banks

Shared, validated question banks keep longitudinal institutional data intact. Data tied to personal accounts is a governance choice — one that costs institutions their history.

Research Interview — $50 Gift Card

Help us build the future of institutional surveys

We are speaking with IR professionals to understand how data collection really works in higher ed today — the tools, the process, and the gaps. It's a 30-minute remote conversation, and you will receive a $50 gift card as a thank-you.

$50 gift card  ·  30 minutes  ·  Remote  ·  No sales pitch

Read What Your Peers Won't Say Out Loud

Get the complete research with actual data, team-size analysis, and actionable recommendations for IR teams.

Free download  ·  No credit card  ·  20+ pages of research