Peer-Reviewed Evidence

The Research Behind Think10x.ai

Every design decision traces back to peer-reviewed studies. Here's the evidence that shaped how we build.

Evidence and studies

Study 01 (supports)

Interactive Video Enhances Student Learning Outcomes

A systematic review of 105 randomized controlled trials with 7,776 students found that adding video to existing teaching methods produces large learning gains. Swapping video for existing methods showed smaller but still positive effects. Interactive video dramatically outperformed passive formats.

Effect size (video added): g = 0.80
Effect size (video swapped): g = 0.28
Randomized trials: 105
Noetel et al. (2021)
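For readers less familiar with effect sizes: Hedges' g (like Cohen's d, reported in Study 04) is a standardized mean difference, roughly

g ≈ (intervention group mean - comparison group mean) / pooled standard deviation

By the usual benchmarks, 0.2 is a small effect, 0.5 is medium, and 0.8 is large, so g = 0.80 means the average student taught with added video outscored the average comparison student by about 0.8 standard deviations.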
Study 02 (warns)

Unguarded AI Tools Harm Learning

A large-scale field experiment with nearly 1,000 high school math students compared two AI tutors: one mimicking standard ChatGPT and one with pedagogical safeguards. Students with unrestricted AI access performed 17% worse on exams when the tool was removed. The safeguarded version eliminated this harm entirely.

Exam scores (unguarded AI): -17%
Harm with safeguards: 0%
Students tested: ~1,000
Bastani et al. (2025)
Study 03 (supports)

AI Tutor Outperformed Active Learning Classrooms

In a randomized controlled trial with 194 Harvard physics students, a carefully designed AI tutor helped students learn more than twice as much as peers in an active learning classroom, while spending less time. Students also reported feeling more engaged and motivated.

More learning vs. active learning classroom: 2x
Harvard physics students: 194
Institution: Harvard
Kestin et al. (2025)
Study 04 (supports)

Step-Based AI Tutors Nearly Match Human Tutors

A review of more than 60 studies comparing human tutors, intelligent tutoring systems, and standard instruction found that step-based tutoring systems—those that check understanding throughout a problem, not just the final answer—were nearly as effective as expert human tutors.

Intelligent tutoring system effect: d = 0.76
Human tutor effect: d = 0.79
Studies reviewed: 62+
VanLehn (2011)
Study 05 (warns)

Text-Only Socratic AI Increased Engagement, Not Learning

A randomized experiment with 122 students aged 14-16 found that Socratic AI chatbots created more engagement and richer dialogue than direct-answer chatbots, but produced no measurable improvement in learning or retention.

Learning improvement vs. non-Socratic AI: 0
Students (ages 14-16): 122
School setting: K-12
Blasco & Charisi (2025)
Why It Matters

What this means for Think10x.ai

We use video, not just text, because interactive video produces the largest learning gains (Noetel, 2021).
We never give away answers because unguarded AI harms learning (Bastani, 2025).
We pair AI with video and diagrams because that combination outperforms active learning classrooms (Kestin, 2025).
We scaffold step by step because step-level tutoring nearly matches human tutors (VanLehn, 2011).
We don't rely on text-only chat because Socratic dialogue alone doesn't move the needle in K-12 (Blasco, 2025).
Every feature we ship is grounded in evidence, not hype.

Try Think10x.ai

See the research in action. Upload a question and watch it become a video.
