

Usability Testing & Reflection
While creating my Google Classroom course on Banner Automation for Financial Aid professionals, I knew it was important to involve those who would be directly impacted by the training. I reached out to colleagues who work with Banner in financial aid every day to get their honest input (Martin & Bolliger, 2018). Their feedback helped me see what would actually be useful and where challenges were likely to arise. Conducting usability testing with these stakeholders let me watch the course flow in real time and identify areas of confusion and opportunities for improvement (Molenda, 2015). This testing phase proved just as important as creating the course content itself. It gave me confidence that when staff log in for the first time, they will find the course practical, clear, and ready to help them streamline their work.
Completing the instructional design and implementation overview assignments provided a clear roadmap for building my course, outlining its structure, objectives, and delivery strategies. That groundwork led into usability testing, which allowed me to evaluate how effectively the designed components supported learner navigation, understanding, and engagement in real-world use.
Course Reflection
Conducting usability testing for my Banner Automation course for Financial Aid professionals was an eye-opening and affirming experience. At first, I felt confident about the course’s design because I built it through the lens of the daily tasks and pain points we all encounter in Banner. However, observing participants navigate the modules highlighted nuances I had overlooked. For example, while my instructions for automated packaging setup seemed clear to me, testers requested more visuals and step-by-step screenshots to build their confidence before performing live actions in Banner.
The Start Here section was intuitive for most, but one tester noted that adding an overview video explaining how automation will transform staff workloads could enhance buy-in. I also noticed that testers appreciated the embedded reflective checkpoints after major topics, which validated their learning and clarified misconceptions immediately (Clark & Mayer, 2016). This feedback reminded me of Krug’s principle that “you are not your user,” which emphasizes the importance of authentic user perspectives (Krug, 2014).
Overall, the usability testing reaffirmed that my content is meaningful but must remain grounded in clarity, simplicity, and relevance to staff members’ real Banner processes. Moving forward, I will continue to embed testing opportunities to ensure the course remains a tool that empowers staff with confidence, efficiency, and a sense of progress.
References
Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning (4th ed.). Wiley.
Krug, S. (2014). Don’t make me think, revisited: A common sense approach to web usability (3rd ed.). New Riders.
Martin, F., & Bolliger, D. U. (2018). Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning, 22(1), 205–222. https://doi.org/10.24059/olj.v22i1.1092
Molenda, M. (2015). In search of the elusive ADDIE model. Performance Improvement, 54(2), 40–42. https://doi.org/10.1002/pfi.21461

