Choosing the Right Online Tutoring Platform for Your School: A Practical Procurement Checklist
A procurement-ready checklist for choosing online tutoring platforms, with scoring rubrics for safeguarding, curriculum fit, ROI, and scale.
For MAT leaders, headteachers, and school business managers, choosing an online tutoring provider is no longer a simple “which website looks best?” decision. It is a procurement question that affects attainment, safeguarding, workload, and budget accountability. The strongest schools now evaluate tutoring partners the same way they evaluate any high-stakes service: with criteria, evidence, scoring, and a clear exit plan. That is especially important in a market where options range from specialist providers such as Third Space Learning to school-partnership models like MyTutor, and where “value for money” must be proven, not assumed.
This guide turns the familiar “best tutoring websites” review into a procurement-ready framework you can use for vendor selection. It includes a practical checklist, scoring rubrics for safeguarding, curriculum alignment, ROI, and scalability, plus a comparison table and questions to ask before you sign. If you are also reviewing wider school systems, you may find our guides on making linked pages more visible in AI search and vetting an organisation like an investor useful for building a disciplined decision process.
Key idea: the best platform is not the one with the lowest hourly rate. It is the one that delivers measurable progress, fits your curriculum, satisfies safeguarding expectations, and scales in a way your staff can actually manage. That is the same logic used in rigorous procurement across sectors, from compliant cloud architecture to guardrailed document workflows.
1. Start with the school problem, not the provider brochure
Define the intervention need precisely
Before you compare platforms, define what problem you are buying a solution for. Is the goal to close a specific attainment gap in Year 6 maths, support GCSE resits, or provide intervention for disadvantaged pupils who need consistent feedback? The right platform for a large MAT running Third Space Learning across multiple schools may be very different from one chosen by a single secondary school needing small-batch exam support. As with any procurement, a crisp problem statement prevents you from paying for features you do not need.
Start by writing a short specification: subject, key stage, volume of learners, delivery model, safeguarding requirements, and desired outcome. A school buying 40 one-to-one sessions per week needs different controls from one seeking unlimited, curriculum-led practice. If you are exploring the operational side of school technology, our article on maintaining security systems offers a useful analogy: systems only work when the setup matches the risk.
Separate intervention from enrichment
Some platforms are designed for general academic support, while others are purpose-built for intervention. That distinction matters because intervention must show impact quickly and reliably. For example, a GCSE English support package should be judged on progress measures, exam-board alignment, and tutor expertise, not simply on whether the website has a broad subject menu. Broad generalists can be useful, but they often introduce more variability in lesson quality and reporting.
It helps to ask whether you are buying instruction, revision support, subject-specific remediation, or capacity. Capacity matters for MATs that need to deploy tutoring across many schools without creating admin overload. If your team already manages high-volume operations, the planning mindset described in deploying technology in the field is a helpful parallel: a scalable rollout depends on standardisation, not improvisation.
Write a measurable outcome statement
Every procurement should have an outcome statement. A strong one sounds like this: “By the end of the spring term, pupils targeted for KS2 maths intervention will improve mastery in fraction concepts, with at least 70% demonstrating secure performance on aligned diagnostic checks.” That is a far better objective than “improve confidence.” Confidence matters, but it should sit alongside observable progress. This simple framing helps you compare vendors on evidence rather than marketing language.
In your specification, include baseline diagnostics, frequency of sessions, reporting cadence, and what success will look like at 6, 12, and 24 weeks. This keeps the focus on measurable outcomes and avoids drifting into vague satisfaction metrics. For schools that want to understand how educational support can be bundled and priced efficiently, our piece on high-capacity buying decisions offers a surprisingly relevant lesson: capacity should be defined by use case, not by maximum size.
2. Build a procurement scorecard that schools can actually use
Use weighted criteria, not gut feel
A procurement scorecard turns subjective comparison into a structured decision. For online tutoring, we recommend four core categories: safeguarding, curriculum alignment, ROI/value for money, and scalability. You can add usability, reporting quality, and implementation support if they matter to your trust. The important thing is consistency: every bidder should be scored by the same people using the same criteria and evidence. That protects the school, improves transparency, and gives governors confidence that the decision is defensible.
Below is a practical scoring framework you can adapt. Use a 1–5 scale where 1 is unacceptable and 5 is excellent. Multiply each score by the weight, then compare total weighted scores. For high-stakes procurement, require each bidder to submit written evidence against every criterion, not just a sales presentation.
| Criterion | Weight | What 1 looks like | What 3 looks like | What 5 looks like |
|---|---|---|---|---|
| Safeguarding | 30% | Weak vetting, unclear reporting, no DSL process | Standard checks, some procedures, partial reporting | Robust vetting, enhanced DBS and identity checks, escalation and school liaison |
| Curriculum alignment | 25% | Generic tutoring with little UK curriculum fit | Some subject alignment, limited mapping to schemes | Clear scheme-of-work alignment and assessment mapping |
| ROI / value for money | 25% | Unclear pricing and no impact evidence | Reasonable cost, limited outcome reporting | Transparent pricing, measurable impact, strong efficiency |
| Scalability | 20% | Manual onboarding, hard to expand | Can scale with support | Seamless rollout across multiple cohorts or schools |
This approach is useful because it forces schools to define what matters most. If safeguarding is your biggest concern, increase its weight. If you are a MAT buying at scale, scalability and reporting may deserve equal importance. The goal is not a perfect formula; it is a repeatable one. For a broader view of how transparent evaluation builds trust, see how organisations disclose AI responsibly.
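The weighted scoring described above is simple enough to sketch in a few lines. The sketch below assumes the example weights from the table; the vendor names and scores are hypothetical, included only to show how the totals are compared.

```python
# Minimal sketch of the weighted scorecard above.
# Weights come from the example table; vendor scores are illustrative.
WEIGHTS = {
    "safeguarding": 0.30,
    "curriculum_alignment": 0.25,
    "roi_value": 0.25,
    "scalability": 0.20,
}

def weighted_total(scores: dict) -> float:
    """Multiply each 1-5 score by its weight and sum the results."""
    for criterion, score in scores.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{criterion}: scores must be 1-5, got {score}")
    return round(sum(WEIGHTS[c] * s for c, s in scores.items()), 2)

# Hypothetical panel scores for two bidders.
vendor_a = {"safeguarding": 5, "curriculum_alignment": 4, "roi_value": 3, "scalability": 3}
vendor_b = {"safeguarding": 3, "curriculum_alignment": 5, "roi_value": 4, "scalability": 4}

print(weighted_total(vendor_a))  # 3.85
print(weighted_total(vendor_b))  # 3.95
```

Note that vendor B edges ahead overall despite the weaker safeguarding score, which is exactly why pass/fail gates belong in front of the arithmetic: a disqualifying safeguarding answer should stop a bidder before any total is computed.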
Split the scorecard into “must-haves” and “nice-to-haves”
Not every feature should be treated as equal. A good procurement checklist separates non-negotiables from extras. For example, enhanced safeguarding checks, school-level reporting, and data privacy controls may be must-haves, while custom branding or advanced dashboards may be nice-to-haves. This prevents a flashy feature set from distracting the panel from the essentials. It also makes it easier to reject a vendor that scores well on presentation but poorly on compliance.
You can also add pass/fail gates before scoring. For example, if a provider cannot evidence DBS checks, cannot explain how school DSLs are involved, or cannot produce a clear data processing agreement, they should not move to final scoring. That is analogous to risk control in other industries, where some criteria are disqualifying rather than graded.
Keep evidence in one procurement pack
For each bidder, request a standard evidence pack: tutor vetting policy, safeguarding policy, data protection statement, curriculum mapping example, sample progress report, implementation timeline, pricing schedule, and references from comparable schools. Once collected, store everything in a single comparison file so governors and senior leaders can review it in one place. This is especially valuable in MATs, where decisions often span multiple schools and stakeholders. A neat evidence trail reduces debate and speeds up approval.
To support your evidence gathering, our article on vetting organisations systematically can help leaders think beyond first impressions. Procurement is not about finding the most persuasive sales pitch; it is about reducing uncertainty.
3. Safeguarding is the first filter, not the final checkbox
Ask how tutors are recruited, vetted, and monitored
In school procurement, safeguarding should be treated as a foundational requirement. Ask every supplier to explain exactly how tutors are screened, what identity checks are completed, whether enhanced DBS checks are in place, and how ongoing monitoring works. A provider that cannot show a clear process for vetting, training, and escalation is not ready for school use, regardless of subject quality. This is particularly important in one-to-one environments where a tutor interacts directly with a child.
Look for a written safeguarding policy, named safeguarding leads, and a route for staff to report concerns quickly. If a platform supports school partnerships, ask how it liaises with the DSL or pastoral lead when a concern arises. Strong providers do not just say “we are safe”; they show the workflow. For schools that value structured risk controls, our guide on identity controls that actually work illustrates the same principle in another high-trust environment.
Check online lesson controls and communication boundaries
Safeguarding in online tutoring also includes platform design. Can sessions be recorded or monitored? Are messages between tutor and pupil moderated? Are lessons accessible to school staff if needed? Can a school insist that communication stays within the platform rather than moving to personal email or chat apps? These questions matter because poor communication boundaries often create the greatest safeguarding risk. A platform with clear controls and auditable interactions is far easier to manage.
Also ask about session supervision, emergency procedures, and the handling of disclosures. Good suppliers can explain what happens if a tutor notices a concerning pattern, such as emotional distress or a pupil repeatedly missing sessions. The best answers are practical, not generic. They should sound like a real incident-response plan, not a policy slogan.
Use a safeguarding score rubric
A useful rubric might score 1 for basic compliance documents, 3 for documented vetting and staff training, and 5 for full operational safeguarding: enhanced checks, staff training, school liaison, audit trails, and platform controls. Schools often underestimate the value of simple but strong operational safeguards. A provider that quietly builds these into the user journey can reduce school workload and increase confidence. That is why safeguarding should carry a high weight in the scorecard, especially for younger pupils.
Pro Tip: if a supplier’s safeguarding answer is mostly marketing language, keep digging. The safest providers are usually the ones that can explain their processes clearly, without hand-waving.
4. Curriculum alignment determines whether tutoring actually moves attainment
Map tutoring content to your schemes of work
One of the most common procurement mistakes is buying subject expertise without curriculum fit. A platform may have excellent tutors, but if they cannot align teaching to your school’s sequence, interventions become disconnected from classroom learning. That is especially problematic in maths and core subjects, where misconceptions build quickly. Schools should ask for examples of curriculum mapping against the exact key stage, topic sequence, and exam board they teach.
Ask vendors to show how their tutoring links to prior learning, current class content, and future assessment. For example, in primary maths, a session on fractions should connect to number sense, equivalence, and manipulatives if needed. In secondary English, support should reflect the text, assessment objective, and mark scheme. Curriculum coherence is what makes tutoring feel like an extension of teaching, not a separate activity. If you are building this kind of structured learning pathway, the discipline seen in best practices for AI-enabled content workflows can be a useful analogy: alignment matters more than volume.
Assess diagnostics and progress reporting
Curriculum alignment is not just about delivery; it is about diagnosis and feedback. A strong platform should identify gaps quickly, track progress over time, and show whether pupils are moving toward the intended standard. That might include pre-assessments, session notes, topic mastery reports, and recommendations for follow-up. Leaders should ask to see a sample report before purchase, and they should check whether reports are understandable to teachers, not just data specialists.
Reporting should tell you what was taught, how the pupil responded, and what to do next. This makes tutoring actionable and helps class teachers reinforce gains in lessons. It also improves the credibility of the programme with governors, because you can show not just activity, but response to intervention. For teams that need to manage performance data responsibly, our article on structured content visibility shows how clarity improves decision-making.
Prefer providers that support school context, not just subject knowledge
The strongest providers understand the realities of school timetables, exams, attendance, and pastoral pressures. They know that interventions must work around assemblies, mocks, trips, and staffing constraints. This is where platforms like Third Space Learning and MyTutor often enter the conversation, because they offer different models of school partnership and delivery. What matters is not brand familiarity, but whether the provider understands your calendar and reporting needs.
If the supplier can tailor interventions to your year groups and exam windows, that is a strong sign of curriculum sensitivity. If not, you may end up with sessions that are useful in theory but poorly timed in practice. Procurement should therefore reward providers that are flexible without being vague.
5. Value for money must be judged against impact, not just price
Compare cost per outcome, not cost per hour
Price comparisons are easy to make and often misleading. A platform charging a lower hourly rate may be expensive if outcomes are weak, admin is heavy, or sessions are too generic to move attainment. Conversely, a fixed-price model may look high at first but deliver better value if it provides unlimited support, efficient scheduling, and strong reporting. The better question is: what does it cost to move one pupil one step forward?
That means you should calculate cost per learner, cost per successful outcome, and cost per staff hour saved. For example, a provider with a transparent annual subscription may be attractive to MATs that need budget predictability. A per-hour model may suit a smaller school that wants tactical support for a short exam cycle. For more thinking on value and procurement trade-offs, our guide on avoiding add-on costs offers a useful consumer analogy: the headline price is rarely the full story.
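The cost-per-outcome comparison can be made concrete with a small calculation. All figures in the sketch below are hypothetical, chosen only to show how a lower hourly rate can still produce a higher cost per successful outcome.

```python
# Illustrative cost-per-outcome comparison; all prices and rates are hypothetical.
def cost_per_outcome(total_cost: float, pupils: int, success_rate: float) -> float:
    """Cost of moving one pupil to the target standard."""
    successful = pupils * success_rate
    if successful == 0:
        raise ValueError("no successful outcomes to divide by")
    return round(total_cost / successful, 2)

# Hourly model: 40 sessions/week at £30 over a 12-week term, 40 pupils, 55% reach target.
hourly = cost_per_outcome(40 * 30 * 12, pupils=40, success_rate=0.55)
# Fixed subscription: £18,000 per term, 60 pupils, 70% reach target.
fixed = cost_per_outcome(18_000, pupils=60, success_rate=0.70)

print(hourly)  # 654.55
print(fixed)   # 428.57
```

In this hypothetical, the "cheaper" hourly model costs roughly 50% more per successful outcome, which is the comparison the hourly rate alone never reveals.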
Look for evidence of impact, not just testimonials
Testimonials are useful, but they are not enough. Ask for case studies with baseline data, intervention length, and measurable improvement. Ideally, the provider should be able to show outcomes from schools similar to yours, not just generic praise. Strong evidence includes attendance rates, topic mastery improvements, teacher feedback, and examples of how the platform changed intervention planning.
If a vendor claims strong impact, probe the methodology. Were the pupils preselected by need? Was the intervention short-term or sustained? Were gains measured against a control or baseline? A provider that can answer these questions clearly is usually more trustworthy. The lesson mirrors what strong analysts do in other fields, such as in statistical analysis with real sample data: claims should be backed by a method.
Model hidden costs before you sign
Some costs never appear on the quote. These include onboarding time, staff training, coordination with parents, data exports, and internal oversight. If a platform requires your school to do much of the administration manually, the real cost may be much higher than the quote suggests. This is why procurement teams should ask for a full implementation outline, including who does what in week one, week four, and week twelve.
Be especially alert to costs linked to scale. A cheap pilot can become costly if it cannot expand across year groups or schools. Equally, a premium provider can be good value if it reduces internal admin and allows leaders to standardise provision. The key is to compare total cost of ownership, not just the invoice line.
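Total cost of ownership can be sketched the same way. The model below is deliberately simple and every figure is an assumption: it adds the quoted price to the internal staff time the platform consumes, priced at a notional hourly rate.

```python
# Hedged sketch of total cost of ownership over a term; all figures hypothetical.
def total_cost_of_ownership(quote: float, onboarding_hours: float,
                            weekly_admin_hours: float, weeks: int,
                            staff_hourly_rate: float = 25.0) -> float:
    """Quoted price plus the cost of internal staff time the platform consumes."""
    staff_cost = (onboarding_hours + weekly_admin_hours * weeks) * staff_hourly_rate
    return quote + staff_cost

# A "cheap" pilot with heavy manual admin vs a dearer platform that automates it.
pilot = total_cost_of_ownership(quote=5_000, onboarding_hours=20,
                                weekly_admin_hours=6, weeks=12)
platform = total_cost_of_ownership(quote=8_000, onboarding_hours=8,
                                   weekly_admin_hours=1, weeks=12)

print(pilot)     # 7300.0
print(platform)  # 8500.0
```

In this hypothetical, a £3,000 gap on the quote shrinks to £1,200 once staff time is costed, which is the kind of comparison governors should see before signing.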
6. Scalability is essential for MATs and multi-site schools
Test whether the platform can grow with your trust
Scalability is more than adding more pupils. It includes onboarding multiple schools, maintaining quality at higher volumes, and preserving reporting consistency across sites. MATs should ask whether the platform can support different schools with shared oversight, common reporting templates, and trust-wide governance. A strong provider will be able to describe a rollout model that works from pilot to full deployment.
Scalability becomes especially important when tutoring is used across several subjects or year groups. A provider that is excellent in a single-school pilot may struggle when the trust expands to six campuses. That is why procurement should include the question: what happens when demand doubles? For a broader systems view, our article on why capacity plans fail when they are too rigid is a useful reminder that flexibility is a strategic asset.
Assess onboarding, support, and governance
Good scaling depends on operational support. Ask how new pupils are added, how timetables are set, how schools are trained, and who your named contact will be. You should also ask how exceptions are handled: missed sessions, pupil absences, changes in target groups, and urgent safeguarding issues. If the vendor cannot describe the support model in practical terms, expect friction once you move beyond the pilot.
For MATs, governance matters too. Can the provider give trust-level dashboards while preserving school-level detail? Can leaders see usage by cohort, subject, and site? Can the system export data in a format your trust already uses? These are the questions that determine whether the platform becomes a strategic tool or just another dashboard nobody opens.
Plan for seasonal peaks
School demand is not flat. It rises before mocks, SATs, and GCSEs, and it can shift suddenly if attendance or staffing changes. The best tutoring suppliers know how to handle these peaks without sacrificing quality. Ask for their capacity model, tutor availability assumptions, and what happens if you add cohorts late in the year. A provider with genuine scale should be able to tell you how they absorb demand spikes while keeping service consistent.
This is where the difference between a boutique supplier and a platform provider becomes visible. A niche specialist may deliver deep quality but limited volume. A larger provider may offer breadth but require tighter governance from your side. Your scorecard should capture that trade-off instead of pretending all scaling models are equal.
7. Shortlist comparison: how to interpret common provider types
Specialist providers, general marketplaces, and school partners
In the current market, providers often fall into three broad groups. Specialist school partners focus on structured intervention, curriculum-fit, and school reporting. General marketplaces offer breadth, flexibility, and quick access to many tutors. Hybrid models try to balance both. None is automatically better, but each has different strengths and risks. If you understand the category, you will ask better questions and avoid mismatched expectations.
For example, a school partnership model like MyTutor may be attractive for GCSE and A level support where human tutor quality and subject range matter. A more tightly structured model may suit schools wanting consistency, standardised data, and less admin. Marketplaces can be excellent for breadth but may require more school oversight to ensure quality. The procurement lens should focus on which model best supports your teaching and safeguarding standards.
Use a decision matrix, not a popularity contest
When comparing vendors, ask each supplier to answer the same set of questions in writing. Include evidence for tutor selection, safeguarding, curriculum alignment, reporting, onboarding, and pricing. Then score independently before discussing as a panel. This reduces anchoring bias, where the first provider seen sets the tone for the entire process. It also protects against being swayed by brand recognition or a persuasive demo.
Below is a practical comparison matrix you can adapt internally. The point is to compare the model, not just the website.
| Provider type | Best fit | Key strengths | Possible limitations | Procurement questions to ask |
|---|---|---|---|---|
| Specialist school intervention | Primary/secondary core subject support | Curriculum focus, reporting, consistency | Narrower subject range | How well does it map to our schemes? |
| School partnership platform | GCSE/A level, targeted attainment | School liaison, subject expertise, oversight | May cost more per hour | How strong are tutor vetting and DSL escalation? |
| General tutor marketplace | Flexible multi-subject needs | Broad coverage, fast availability | Variable quality and admin burden | What quality control and reporting do we get? |
| Fixed-price subscription | High-volume intervention | Predictable budgeting, scale | May be less bespoke | Can we adapt to changing cohorts? |
| Per-session model | Targeted short interventions | Simple to start, lower commitment | Costs can rise with scale | What is the total cost over a term? |
Watch for overpromising in platform demos
Demos are useful, but they are designed to sell a vision. Procurement teams should therefore test the platform against their actual workflow: pupil allocation, parent consent, lesson scheduling, reporting, and escalation. Ask to see the admin journey, not just the tutoring experience. That is where a lot of value is won or lost. A slick demo that hides operational complexity is a warning sign, not a selling point.
If the vendor can make your internal process simpler, that is a major advantage. If it adds new steps at every stage, the time cost may outweigh the academic benefit. The best schools choose the platform that makes implementation feel boring, in a good way, because boring is often what scalable systems look like when they are working.
8. A practical procurement checklist for MATs and headteachers
Before the demo
Write down your outcomes, budgets, cohort size, and safeguarding threshold. Decide what will count as a pass/fail condition and what will simply be scored. Identify the people who need to be involved in the decision: subject leaders, DSL, SENCO, finance lead, and a governor or trust representative. If you do this up front, your demo meetings will be far more focused.
Ask the supplier to provide evidence in advance: policies, sample reports, tutor vetting details, pricing structure, and references. A provider that resists this request may not be ready for school procurement. Also ask how they support communication with parents, because engagement and consent can become a hidden bottleneck. For teams that want to improve internal communication quality, the framing in high-risk email workflows is a reminder that process design matters.
During the evaluation
Score each criterion independently. Do not let one impressive feature override a weak safeguarding answer. Compare total scores, but also review the comments: sometimes a vendor scores similarly overall but wins on the one factor that matters most to your school. Make sure the panel includes a practitioner who understands the day-to-day realities of interventions, not just procurement or finance.
Ask for a pilot only if it is structured. A pilot without a baseline, success criteria, and reporting plan often tells you very little. A good pilot should be short, purposeful, and measurable. It should answer your key risk questions, not simply create a sense of momentum.
Before signature
Confirm contract length, cancellation terms, data ownership, reporting rights, and support response times. Check who owns the pupil data and whether it can be exported in a usable format. Ensure the data processing agreement is aligned to your school and trust obligations. Then verify that safeguarding commitments are in the contract rather than only in sales materials. This is where procurement becomes protection.
If possible, get a named implementation lead and service-level expectations in writing. The first 30 days are often where a tutoring programme succeeds or stalls. Clear ownership at that stage reduces friction and makes the programme easier to scale later.
9. What “good” looks like in practice
A mini case study for a MAT
Imagine a MAT with six primaries and two secondaries, each with a different need. The primary phase needs a structured maths intervention, while the secondary phase needs GCSE English and science support. The trust chooses to run a two-stage procurement process: first a safeguarding and compliance gate, then a weighted scorecard. After shortlisting, the trust selects one provider for maths at primary level and another for GCSE support, because no single vendor is strongest everywhere.
That is not a failure of procurement. It is a sign of maturity. In complex organisations, the best decision is often a portfolio, not a single winner. The trust now uses common reporting templates across both suppliers and reviews progress monthly. Because the process was evidence-led, leaders can explain exactly why each provider was chosen and what outcome it is expected to deliver.
How to judge whether the investment is working
Look for three layers of evidence. First, utilisation: are pupils attending and engaging? Second, attainment: are the targeted skills improving? Third, transfer: are classroom teachers seeing the gains reflected in lessons and assessments? If the answer to the first is yes but the others are weak, you may have a good attendance system but a weak tutoring fit.
It is also worth checking staff workload. If teachers report that the platform saves time on planning, diagnosis, or reporting, that is part of the ROI. Schools often overlook this because the savings are indirect. Yet in a high-pressure environment, time saved is real value. That is why procurement should measure both academic and operational outcomes.
When to walk away
Walk away if the vendor cannot clearly explain safeguarding, cannot show curriculum fit, refuses to provide meaningful reporting examples, or cannot justify its pricing in relation to outcomes. Also walk away if the platform is too rigid for your timetable or too complex for your staff to adopt. In school procurement, the cost of a bad fit grows over time because interventions affect teaching schedules, leadership attention, and pupil confidence. It is better to delay a purchase than to buy a mismatch.
That principle applies whether you are buying a tutoring platform, a digital system, or any service that carries trust and compliance risk. The more evidence you gather, the better your final decision will be.
10. Final checklist: the questions every school should ask
Safeguarding
Are tutors vetted with enhanced DBS and identity checks? Is there a clear safeguarding policy, escalation route, and school liaison process? Can staff supervise or audit communication? If the answer to any of these is unclear, keep the provider on hold.
Curriculum alignment
Can the provider map content to our curriculum, exam board, or scheme of work? Does it offer diagnostics and reports that teachers can use? Does it fit the age, subject, and intervention length we need? Alignment is what turns tutoring into teaching support.
ROI and scalability
What is the total cost, including setup and staff time? What evidence shows impact in schools like ours? Can the provider scale across more pupils, schools, or subjects without losing quality? If you cannot answer these confidently, the risk is probably too high.
Pro Tip: the right platform should reduce uncertainty, not create it. If procurement becomes clearer after the demo, that is a strong sign. If it becomes more confusing, keep comparing.
FAQ
How do we compare online tutoring providers fairly?
Use a weighted scorecard with fixed criteria, such as safeguarding, curriculum alignment, ROI, and scalability. Require written evidence from every provider, and score independently before group discussion. This creates a defensible process and reduces the chance of choosing based on brand familiarity or presentation style.
What safeguarding checks should we expect from an online tutoring platform?
At minimum, ask for enhanced DBS checks where relevant, identity verification, tutor vetting, safeguarding training, and a clear escalation process. You should also check how the provider handles communication, session monitoring, and concerns raised by tutors or school staff. The best providers make this operationally clear.
Is a fixed-price tutoring model better value than hourly pricing?
Not always. Fixed pricing can be excellent for large-scale or ongoing intervention because it gives budget certainty and can lower the cost per learner. Hourly pricing may suit short, targeted use cases. The real question is which model delivers the best cost per outcome for your specific cohort.
How important is curriculum alignment compared with tutor quality?
Both matter, but curriculum alignment is often what determines whether the tutoring transfers into classroom progress. A highly capable tutor who works off-sequence may not help pupils as much as a slightly less experienced tutor who follows your curriculum closely. For school use, alignment and quality should be evaluated together.
What should MATs look for in a scalable tutoring solution?
MATs should look for trust-level reporting, consistent onboarding, manageable admin, and the ability to expand across schools without losing quality. They should also ask how the provider handles peaks in demand, changes in cohorts, and consistent safeguarding oversight across sites. Scalability should be tested, not assumed.
Related Reading
- 7 Best Online Tutoring Websites For UK Schools: 2026 - A useful starting point for understanding the current market landscape.
- How to Make Your Linked Pages More Visible in AI Search - Helpful if you want to see how structured content improves discovery.
- How to Vet a Charity Like an Investor Vetting a Syndicator - A strong analogy for evidence-led supplier evaluation.
- Designing HIPAA-Ready Cloud Storage Architectures for Large Health Systems - Useful for understanding compliance-driven procurement thinking.
- Why Five-Year Capacity Plans Fail in AI-Driven Warehouses - A practical reminder to build flexibility into scaling decisions.
Daniel Mercer
Senior SEO Editor
Senior editor and content strategist. Writing about technology, design, and the future of digital media. Follow along for deep dives into the industry's moving parts.