
Usability Testing and Reflection

 

        Designing and organizing this World History course felt very similar to teaching a brand-new group of middle schoolers during the first week of school: you think everything is labeled clearly, the routines are obvious, and the expectations make perfect sense, until a student clicks the wrong assignment three times in a row and asks, “Is this for a grade?” That’s exactly why I approached this usability testing the way I did. What seems obvious to the teacher is not always obvious to the learner.

 

To make the testing authentic, I recruited the best group of stakeholders I could: my senior students who had me for World History last year. They already knew my teaching style, my Google Classroom patterns, and my old course layout, which meant they could tell me immediately whether my new Start Here section and redesigned modules were actually clearer. These students were honest, comfortable enough to tell me when something didn’t make sense, and experienced enough to compare the new organization to what they remembered from last year.

 

I planned each session to take about 20 minutes, enough time for students to work through the Start Here Checklist in World History/Module 1, review the Welcome Post, the Course Overview, the “Welcome to World History” video, and the “How to Succeed in World History” quiz, and then explore the first module. Their task was simple: move through the Start Here section as if they were brand-new learners, then examine Module 2 (First Humans) to see whether the vocabulary, videos, notes, Edpuzzle activities, and review games followed a logical sequence. A short Google Form captured their responses.

 

Because Google Classroom is my LMS, the testing happened inside a familiar platform, but it still revealed important insights. Google Classroom scrolls vertically, so the order of assignments matters more than I realized. A few seniors mentioned that the videos in one module felt buried in the middle of everything instead of following the instructions, which made me realize I needed to reorder items for clarity. On the other hand, many commented that the modules were “organized,” “easy to navigate,” or “on topic,” and several said the games (Gimkit/Blooket) were their favorite part of the whole experience. Of course they did; they all love playing games and competing against each other. Those responses showed me that my interactive elements are doing exactly what they’re supposed to do: increasing engagement and helping students reinforce vocabulary and content.

 

One big lesson was the value of consistency. Because each module follows a predictable pattern (vocabulary → notebook → instructional video → game → quiz → reflection), students said they felt more confident moving through the content. Another important takeaway was how essential instructions are. Even though I write directions carefully, students still asked for clearer sequencing in a few spots. As Krug (2020) reminds us, if learners have to stop and think about what to click next, the design isn’t done yet. Their feedback reinforced that I should make explicit what I had assumed was implied.

 

The Form responses also helped me reflect on alignment. Content-wise, everything was already in the right place: the modules built from First Humans to Civilizations, then to Mesopotamia, Egypt, India, and China. But the usability test helped me ensure that the delivery of the content matched the intended learning outcomes. The more streamlined and predictable the layout is, the easier it is for students to understand the “why” behind each activity, especially in a blended environment where they may work independently.

 

Looking ahead, I plan to make a few important changes:

  • Reorder videos and assignments to match the exact flow I want students to follow

  • Add clearer instructions, especially where multiple tools (like Edpuzzle, notebooks, or games) appear

  • Create quick-start guides or short walkthrough videos for tools like Gimkit and Edpuzzle

  • Strengthen the Start Here section so it truly prepares students for the modules

 

I will also build more infrastructure guidance into the course. Students shouldn’t be stuck on “Where’s the link?” or “What do I click next?” when they’re trying to learn world history.

 

Overall, this usability testing experience was humbling, encouraging, and incredibly helpful. My seniors gave me genuine, honest insights that helped me refine the learning path and strengthen the student experience. This process confirmed that while my course content is strong, the presentation can always become clearer, friendlier, and more intuitive. In the end, usability testing didn’t just improve my course; it reminded me that even well-designed lessons need real users before they become truly student-centered.

 

References

 

Krug, S. (2020, May 6). Usability Test Demo [Video]. YouTube. https://www.youtube.com/watch?v=1UCDUOB_aS8
