Providing a solution for teachers in analogue classrooms

The use of technology in the classroom has positively impacted education. The inclusion of convertible laptops and interactive whiteboards has unlocked new ways and opportunities to learn. However, technology is not present equally in classrooms around the world and still needs to be supplemented by analogue materials. This is how EF Class answered that need.

presentation.gif

Background

At EF Class we take a data-centric approach when it comes to making decisions and updates to the product. Following a data workshop, we understood that the most appropriate “North Star” metric for EF Class was the number of Weekly / Fortnightly / Monthly teaching teachers, as it is both a proxy for the delivery of value to our teachers and a leading indicator of revenue for the business (the two key criteria for any North Star).
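To make the metric concrete, here is a minimal sketch of how a count like this could be derived from lesson events; the LessonEvent shape and countTeachingTeachers helper below are illustrative assumptions, not our actual reporting code.

```typescript
// Illustrative sketch only: the event shape and helper are assumptions,
// not EF Class's real reporting pipeline.
interface LessonEvent {
  teacherId: string;
  startedAt: Date; // when the teacher started teaching a lesson
}

// Count distinct teachers who taught at least one lesson inside the window.
function countTeachingTeachers(
  events: LessonEvent[],
  windowStart: Date,
  windowEnd: Date
): number {
  const teachers = new Set<string>();
  for (const event of events) {
    const t = event.startedAt.getTime();
    if (t >= windowStart.getTime() && t < windowEnd.getTime()) {
      teachers.add(event.teacherId);
    }
  }
  return teachers.size;
}

// Running the same helper over weekly, fortnightly, or monthly windows gives
// the Weekly / Fortnightly / Monthly teaching-teacher counts.
```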

Complication
We were only able to track this metric if teachers actually tapped / clicked a specific button (“Start lesson”) and selected a class. However, on both Web and iOS apps, teachers had the ability to run entire EF Class lessons without doing this, through a combination of activity previews and the ability to project or mirror their screens. For some teachers without access to the relevant hardware, this was, in fact, the only way they had to use EF Class at the time.

Resolution
Although we might never have known the full extent to which this was happening, we needed to find a way of incentivising teachers to indicate that they were using a projected mode, for the following reasons:

  • We needed to understand how widespread this use-case was, so that we could decide whether to “formally” recognise it and potentially tweak the product in recognition of it;

  • We needed to capture these lessons as part of our North Star reporting (see the instrumentation sketch below).
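For illustration, a minimal instrumentation sketch might look like the following; the event name, payload fields, and track helper are hypothetical and only show the idea of recording the lesson mode alongside the start event.

```typescript
// Hypothetical sketch: the event name, payload fields, and track() helper are
// assumptions for illustration, not EF Class's actual analytics code.
type LessonMode = 'connected' | 'presentation';

function track(eventName: string, payload: Record<string, unknown>): void {
  // In a real app this would forward the event to the analytics pipeline.
  console.log(eventName, payload);
}

// Fired when a teacher explicitly starts a lesson in either mode, so that
// presentation-mode lessons can feed the same North Star reporting.
function trackLessonStarted(teacherId: string, lessonId: string, mode: LessonMode): void {
  track('lesson_started', {
    teacherId,
    lessonId,
    mode,
    startedAt: new Date().toISOString(),
  });
}
```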

We sent out surveys, and the results showed that at least 56% of teachers projected at least some of their lessons or activities.

In addition to the previous points, we had been aware – for the previous three years – that technology is not present equally in classrooms around the world and still needs to be supplemented by analogue materials. It was clear to us that we had to provide a proper solution to this issue, as well as solve our internal problems.

 

Assembling a squad to work on this problem

At EF Class we followed the Shape Up approach from Basecamp and worked in fixed-length cycles so that we could develop solutions with shared responsibilities across the team. This way, everyone felt involved throughout the process, not just updated when features went live.

How does it work?
In a nutshell, we assemble a small team with a representative from each area of expertise, responsible for shaping, building, testing, and delivering the feature. We start by understanding how long it will take to deliver something meaningful from start to finish, scoped so it can be delivered within six to twelve weeks. The timeline always depends on the feature and the team; for this feature we set it at nine weeks in total.

The first four weeks would be spent in the “Shaping phase” where we:

  • Gathered research, assumptions, and questions about the problem to be solved; 

  • Sketched solutions at a high level of abstraction;

  • Explored a wide range of options;

  • Defined actionable ideas;

  • Solved the problem within the appetite, without working out the finer details.

At the end of these four weeks we bet on a solution to build. The next five weeks were spent in the building phase, where we:

  • Examined solutions for holes or unanswered questions;

  • Amended solutions, or tackled tricky spots upfront;

  • Carried out a slow-motion walkthrough;

  • Fenced off use cases;

  • Reduced the risk by solving open problems upfront;

  • Avoided opening rabbit holes or tangled interdependencies.

 

Opportunity statement

With the team assembled we started to think: “How might we redesign EF Class to ensure that projection teachers will actively inform our system that they are using projection mode?”

We had a few assumptions when elaborating solutions; for example, hiding teacher notes could be used as the “incentive” to select projection mode.

And a few requirements:

  1. Solutions should cater to the fact that teachers would teach lessons directly from the Lesson plan overview via activity previews;

  2. Solutions should aim to capture existing projection teachers only and minimize the number of connected teachers switching over to projection;

  3. Solutions should work from Discover, Library, and Taught lesson views;

  4. Solutions should be designed so that we could be reasonably sure a real lesson was being taught, minimising the chance that projection mode was simply switched on and left on (a possible heuristic is sketched below).
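As an illustration of the fourth requirement, one possible heuristic (purely a sketch with invented thresholds, not necessarily what we shipped) is to count a presentation session as a taught lesson only once the teacher has actually navigated through it for a while:

```typescript
// Illustrative heuristic only; the thresholds and NavigationEvent shape are
// assumptions for this sketch, not the rule that was actually shipped.
interface NavigationEvent {
  occurredAt: Date; // teacher moved to another activity/slide in presentation mode
}

const MIN_NAVIGATION_EVENTS = 3; // assumed threshold
const MIN_SESSION_MINUTES = 10;  // assumed threshold

// A presentation session counts as a "real" lesson only if the teacher kept
// interacting with it, rather than switching projection on and leaving it.
function isRealPresentationLesson(events: NavigationEvent[]): boolean {
  if (events.length < MIN_NAVIGATION_EVENTS) {
    return false;
  }
  const times = events.map((e) => e.occurredAt.getTime());
  const durationMinutes = (Math.max(...times) - Math.min(...times)) / 60_000;
  return durationMinutes >= MIN_SESSION_MINUTES;
}
```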

 

Gathering research findings

We started by revisiting some research we had done in the past months and putting some user information together.
Teachers interviewed:

Persona file examples created test by test


“Students are masters of distraction. With books, they look out of the window. With laptops, they open other websites. I project my lessons to stop them getting distracted!”

“I only use the projector every now and then, but it can be very useful when I need to make sure students are looking at the same thing, at the same time.”

“As a teacher you need to work hard to keep the lesson alive, and the projector is one way of helping you do that.”

We quickly understood a few opportunities:

  • Teachers already used projectors for a number of reasons:

    • Students rarely all have access to headphones;

    • Video watching is much easier;

    • Teachers like the feeling of control and keeping a class together for the Intro / Warm up / Lead in.

  • A teacher often starts the lesson with a video, and always uses a projector to do this. The video usually has to be shown twice for students to really grasp it. Projecting gives an element of control the teacher wouldn’t otherwise have: “I can fast-forward, go back, pause it – it’s at my own pace”;

  • Even for student productions, a teacher will use the projector to show the model answers;

  • A teacher will often have students read out the exercises – those at the front from the smartboard/projector, those at the back from their own screens;

  • Throughout the entire lesson, the smartboard/projector will usually reflect exactly what the students see on their own screens.

 

User flows and UX definition

With all this information (and more), we felt ready to start exploring designs and user flows:

user-flow-presentation.jpg

For the first release the presentation view itself was simple, because we already had it available for live lessons. We simply had to remove the distracting elements from the UI, such as the activity menu containing Teacher notes, lesson controls, etc., and allow teachers to navigate out of presentation mode (a rough sketch of this follows the image below).

lesson-presentation.png
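To give a sense of the kind of change involved, here is a minimal React/TypeScript sketch of hiding teacher-only UI in presentation mode; the component and prop names are invented for the example and are not EF Class's actual code.

```tsx
// Minimal sketch; component and prop names are invented for illustration only.
import React from 'react';

interface LessonViewProps {
  mode: 'connected' | 'presentation';
  activity: React.ReactNode;
  teacherNotes: React.ReactNode;
  lessonControls: React.ReactNode;
  onExitPresentation: () => void;
}

// In presentation mode only the activity content is projected; teacher notes
// and lesson controls are hidden, and a single control lets the teacher leave.
export function LessonView(props: LessonViewProps) {
  if (props.mode === 'presentation') {
    return (
      <div className="lesson-presentation">
        {props.activity}
        <button onClick={props.onExitPresentation}>Exit presentation</button>
      </div>
    );
  }
  return (
    <div className="lesson-connected">
      {props.activity}
      {props.teacherNotes}
      {props.lessonControls}
    </div>
  );
}
```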

The challenge for this feature centred on the discoverability and affordance of this new option. We also weren’t sure whether our terminology was clear to teachers when it came to picking a lesson to either teach or set as an assignment.

starting-point-presentation2.png

We knew we would have to test the feature heavily and do a lot of iterations to get this menu right. After all, this is the most important menu in the app when it comes to user conversions. We had to get it right!

Team members from every area of expertise – design, development, marketing, and academic – gathered to map the scope of the feature so we could understand how long it would take and how much effort it would need to go live.

non-connected-classroom.jpg

We rolled up our sleeves and started listing our options to test:

On the design side we had some iconography challenges to solve. We needed to be sure which icon to use in three different contexts:

  • Start a connected lesson;

  • Set an assignment;

  • Start a presentation.

Iconography exploration


Final options to test per lesson type.


 

On the copywriting side, we had some options to think about and had to decide which ones to test with teachers:

Copywriting explorations


Final options to test per lesson type


First round of usability tests

In the initial tests we wanted to gather qualitative data from teachers. We wanted to understand what they felt and perceived from the different terminology and iconography versions. To achieve this we created two prototypes so we could test Version A and Version B with three teachers each, gathering the same amount of feedback for both.

usability-test.png

With this information, we were able to understand which copy we had to review, as well as which icons would work or be completely misunderstood by teachers. 

 

Second round of usability tests

In this second round of testing we needed a higher volume of data and preferences when it came to deciding which icons to choose.

We also understood that we would never fully know how teachers perceived the button terminology unless we asked them, and then placed them in a scenario and asked them to pick which solution they would choose.

Using Typeform to help us test

These tests were created with Typeform because it gave us the flexibility to mix and match different types of questions and data gathering.

We kept the A/B testing structure for half of the questions – mainly for the copywriting challenge – and asked everyone the same questions for the iconography and scenario placement. We recruited 23 teachers and sent them the following test:

Expectation based on terminology understanding


Double checking if there was any problem or misunderstanding with the text


 
Iconography preference


Scenario placement


The findings:
When looking at the survey answers, we had clear winners for both iconography and text perception.

final-combination.png
 

“It’s easy to understand and see that the image contains a screen, teacher, and pupils.”

Feedback on the first button to “Start a connected lesson”.

“The text explains well that I can send assignments to my students.”

Feedback on the second button to “Set an assignment”.

We felt more confident and reassured by the test results, and moved forward with the implementation phase. We released the feature to production within the nine-week limit, at the end of December. We had clear plans for improvements in the near future, but first we wanted to gather data and user feedback on the feature.

 

Outcome

At the time of release we couldn’t have imagined the impact this feature would have. After a few months of usage we looked into the teaching conversions and were surprised to find that this method was used even more than one-off connected lessons.

North star deconstructed monthly.png
  • Total lessons: 211

  • Regular lessons: 135 (64%)

  • Presentation lessons: 47 (22%)

  • Quick-start lessons: 29 (14%)

lessons-data.png

We also found that it helped increase our North Star metric compared to the previous year.

north-star-data.png
 

We can now start planning improvements, so we can bring even more value to the teachers and students using this feature.

Date: January, 2019
Product: EF Class