Socratic Used Extensive User Testing to Build Powerful Free “Digital Tutor” App

Artificial Intelligence and User Testing Power a New App That Helps Students With Many Academic Subjects

Last month we at Socratic debuted our namesake free education app for high school students. It represents a new model of education app: our AI engine combines machine learning with educator expertise to give high school students instant learning help when they are stuck on homework questions.

Before Socratic, when students needed independent help with their schoolwork, they had two options: search Google and land on sites with low-quality answers, like Yahoo Answers, or pay for tutoring services. Either way, they had to accept low quality or high cost.
 
We wanted to give high school students a third option: a way for them to get high quality content at no cost that helps them learn the lessons intended and finish their assignments efficiently.
 
With the app, a student takes a picture of a question, and our AI figures out what concepts are required to answer it. It then shows the student high-quality mobile-native content created by our community of educators that we believe makes learning easier than most textbooks and websites. We supplement that content with videos, definitions, and the best results from the web.
 
At the heart of the development of our app is the testing we did and continue to do with students, to understand why they get stuck, and how they learn best. This testing took on various forms during the development of the app.

Extensive User Testing 

Beginning in late 2015, we personally user-tested the app with hundreds of high-school students. We set up relationships with local schools and community centers, and through those we had a regular weekly flow of new students visiting our office after school. We showed them various versions of the app, asked them to solve homework problems using it, and asked them questions to reveal what they thought about the app, all while taking detailed notes and discussing findings with the team.
 
To test the app in more real-world conditions, we released a stealth beta app in the App Store. That app has been used by over 150,000 students, and we tracked various metrics about their usage. We also recorded and tagged various sessions so our product and design teams could observe students using certain features first hand, or could observe how the app failed for users.
 
We learned many important lessons through user testing and watching user sessions. Early on, we saw students taking photos of screens, because their homework was assigned to them online. Our technology for reading text from images couldn’t handle screens, so we knew we had to invest more engineering effort in it. We then noticed that many students did not swipe past the first result because they did not realize there were multiple results, so we updated our design to make it more obvious that there are more results to explore. Finally, we saw students quit the app when results took too long to load. To solve this, we sped up results overall, prioritized faster pages, and updated our code to load first whichever result the user had swiped to.

Socratic is Empowered With Artificial Intelligence

The outcome of all this testing and research is that we're the first education company to use AI to take students from a specific question to the concepts required to answer that question, something a tutor would usually do. So for a question like: “A balloon has a volume of 2.9 L at 320 Kelvin. If the temperature is raised to 343 Kelvin, what will its volume be?”, a tutor would teach the student about “Charles’s Law”, the relevant concept.
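For the balloon question above, the answer itself is a one-line application of Charles’s Law, which states that volume and absolute temperature are proportional at constant pressure. A quick worked calculation:

```python
# Charles's Law: V1 / T1 = V2 / T2 (pressure held constant)
v1 = 2.9    # initial volume in liters
t1 = 320.0  # initial temperature in kelvin
t2 = 343.0  # final temperature in kelvin

v2 = v1 * t2 / t1
print(f"New volume: {v2:.2f} L")  # → New volume: 3.11 L
```

The point of the app, though, is not just to produce the number 3.11 L, but to surface the concept that makes it a one-liner.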
 
Our AI functionality works just like that. Educators and experts pored over thousands of questions submitted by students and categorized them by their core concepts. Then, we fed those questions into our machine learning algorithms and spent months training and refining the system until it could look at a new question and accurately predict which concepts were required to solve it. That means our users get content that allows for in-depth understanding, not just a specific answer to a particular question.
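The workflow described above, labeled questions in, concept predictions out, is a standard text-classification setup. The following is a minimal illustrative sketch, not Socratic’s actual system: the training questions, concept labels, and naive Bayes model here are all invented for the example.

```python
import math
from collections import Counter, defaultdict

# Tiny labeled set of (question, concept) pairs -- invented for illustration.
TRAINING = [
    ("A balloon has a volume of 2.9 L at 320 K. What is its volume at 343 K?", "Charles's Law"),
    ("A gas occupies 1.5 L at 300 K. Find its volume when heated to 350 K.", "Charles's Law"),
    ("A gas at 2 atm occupies 4 L. What is its volume at 1 atm?", "Boyle's Law"),
    ("If the pressure on 3 L of gas doubles, what is the new volume?", "Boyle's Law"),
]

def tokenize(text):
    return [w.strip(".,?;").lower() for w in text.split()]

class NaiveBayesConceptClassifier:
    def __init__(self, examples):
        self.word_counts = defaultdict(Counter)  # per-concept word frequencies
        self.label_counts = Counter()            # how many questions per concept
        self.vocab = set()
        for text, label in examples:
            self.label_counts[label] += 1
            for word in tokenize(text):
                self.word_counts[label][word] += 1
                self.vocab.add(word)

    def predict(self, text):
        def log_score(label):
            total = sum(self.word_counts[label].values())
            s = math.log(self.label_counts[label] / sum(self.label_counts.values()))
            for word in tokenize(text):
                # Laplace smoothing so unseen words don't zero out the score.
                s += math.log((self.word_counts[label][word] + 1) / (total + len(self.vocab)))
            return s
        return max(self.label_counts, key=log_score)

clf = NaiveBayesConceptClassifier(TRAINING)
print(clf.predict("A balloon of gas is heated from 310 K to 400 K; find the new volume."))
# → Charles's Law
```

A production system would use far more training data and a stronger model, but the shape of the pipeline, labeled questions in, a concept prediction out, is the same.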
 
We felt confident that the app would deliver a lot of value to the students who use it, and that belief has been supported by the amazing reactions we’ve gotten from students. In addition to our 5-star rating in the App Store, we’ve received many messages from students telling us that this is the most useful homework app they’ve used.
