MOOC pretesting: the importance, the process, and the experience

Authors: Shokoofeh Motamedi,¹ Rashidat George,¹ Vesna Belogaska,² and Eila Burns¹

Abstract

Massive Open Online Courses (MOOCs), the modern form of educational broadcasting in the digital era, have gained widespread popularity in recent years. To achieve user satisfaction, it is important to consider the usability of a newly designed MOOC by arranging a user pretesting procedure during its creation. In this article, we review the background on the importance of user testing, describe the assessment process that was followed, and share the lessons learned as pre-testers. This work belongs to JAMK's Future Factory activities and was offered through the "Pedagogy Lab" process at the School of Professional Teacher Education.

Introduction

These days, attending Massive Open Online Courses (MOOCs) and e-courses is becoming more and more common all around the world. Educational institutions use MOOCs to enhance and adopt strategies for online teaching and methods of open education, combining the contexts of online learning and openness. However, different MOOCs reflect different philosophies and purposes.

When designing a new MOOC, it is important to consider aspects such as the platform software, the number and length of video lectures, computer-marked assignments, peer assessment possibilities, user applicability, autonomy, interactivity, and diversity (Bates, 2015). Before publishing a new MOOC, it is important to perform usability testing, and especially pedagogical usability testing, on the content to make sure it is satisfactory for its target end-users.

Usability testing involves methods that examine user-product interaction and identify problems (Chrum, 2012). Usability evaluation is important for understanding how a product is actually used and for improving the content before the final version of the design. An effective and efficient MOOC enables accurate user-product interaction, which leads to user satisfaction. Usability in an educational learning environment requires a well-designed visual and functional user interface, with tools for each learning activity and for the user-system interaction process (Mifsud, 2015).

In the context of online courses, the usability of a MOOC involves learnability, efficiency, memorability, errors, and satisfaction (Jirgensons, 2015). Satisfaction is the main factor of pedagogical usability, focusing on users' motivation and the level of enjoyment they get from completing the course (Jirgensons, 2015). Poor usability has been shown to significantly affect learners' learning experience (Abuhlfaia & De Quincey, 2019).

It is important to note that the assessment process of a MOOC can involve usability testing of both the MOOC platform and the MOOC content. For assessing the MOOC platform, a variety of usability testing tools and methods has been proposed in the literature, such as heuristic evaluation, laboratory tests, questionnaires, and thinking aloud (Zardari et al., 2020). A comprehensive checklist has also been proposed for use as an adaptive usability evaluation instrument for MOOC platforms (Johansson & Frolov, 2014).
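The cited works do not prescribe a single questionnaire, so as one hedged illustration of questionnaire-based platform testing, the sketch below scores the widely used System Usability Scale (SUS). The function and the sample ratings are illustrative only; SUS was not necessarily the instrument used in this project.

```python
def sus_score(responses):
    """Score one participant's System Usability Scale questionnaire.

    SUS has ten statements rated 1-5. Odd-numbered items are positively
    worded (contribution = rating - 1); even-numbered items are negatively
    worded (contribution = 5 - rating). The sum is scaled by 2.5 to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings on a 1-5 scale")
    # Index 0 corresponds to item 1, so even indices are the odd-numbered items.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical ratings from one tester (invented, not project data):
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```

Scores above roughly 68 are conventionally read as above-average usability, which makes the single number easy to compare across testers or platform versions.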

The usability testing of MOOC content can also be considered a quality assessment of the online course. For this purpose, many factors need to be assessed, including the target group and users; learning objectives, the learning process, and pedagogical solutions; assignments; contents and materials; the online tools employed; means of interaction; guidance and feedback; evaluation; development; usability and visuals; and support services (Varonen & Hohenthal, 2017). One recommended method for assessing the quality of an online course is to use professionally designed questionnaires or rubrics (Illinois Online Network, 2018; Löfström et al., 2006). The results will be more accurate if the questionnaire is filled in by both educators and students, reflecting the views of both the producer and the consumer of the product, as the sketch below illustrates.
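As a minimal sketch of that producer/consumer comparison, assuming 1-5 ratings per quality criterion; the criterion names, thresholds, and sample ratings are illustrative assumptions, not the project's actual instrument:

```python
from statistics import mean

def summarise_rubric(educator_ratings, student_ratings, low=3.0, gap=1.0):
    """Average per-criterion 1-5 ratings from both groups and flag
    criteria that score low or where the two perspectives diverge.

    Both arguments map a criterion name to a list of ratings; the
    criteria are assumed to be the same in both dictionaries.
    """
    report = {}
    for criterion, ratings in educator_ratings.items():
        e, s = mean(ratings), mean(student_ratings[criterion])
        flags = []
        if min(e, s) < low:
            flags.append("needs improvement")
        if abs(e - s) >= gap:
            flags.append("producer/consumer views diverge")
        report[criterion] = (round(e, 2), round(s, 2), flags)
    return report

# Invented example ratings, not data from this project:
educators = {"learning objectives": [4, 5], "guidance and feedback": [4, 4]}
students = {"learning objectives": [4, 4], "guidance and feedback": [2, 3]}
for name, (e, s, flags) in summarise_rubric(educators, students).items():
    print(f"{name}: educators {e}, students {s} {flags}")
```

Flagging divergence separately from low scores matters: a criterion educators rate highly but students rate poorly points to a gap between the designer's intent and the learner's experience.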

MOOC platform and content evaluation process

In this project, two student teachers volunteered to be pre-testers of a MOOC and to share their experiences. The MOOC used in this pedagogical project is called "Video-Supported Collaborative Learning (VSCL)"; it was developed by the ViSuAL Knowledge Alliance in the UK and is offered on the IRIS Connect website. The pre-testers were asked to evaluate whether the course design supports a passive or a generative role for the learner.

Before starting the course, the pre-testers familiarised themselves with the concept of design-based research and with literature on quality criteria for online courses (Alturkistani et al., 2020; Barnes & Hanna, 2017; Löfström et al., 2006; Varonen & Hohenthal, 2017). They were also provided with a short explanation from the MOOC organizer about the aspects on which feedback was most wanted, including course registration, interacting on the platform, the flow of the content, the balance of information and learning activities, and the opportunities for collaboration (albeit asynchronous).

VSCL was offered as a three-module course comprising various exercises, peer discussions, and reflective sections. The content of the VSCL MOOC included a welcome and introductory session, which familiarised the user with the research model and gave a brief introduction from the MOOC organizer to the platform and the learning activities; module 1, changing roles of teachers; module 2, video technologies for VSCL; and module 3, planning a learning scenario, followed by closing remarks. As such, the user oversaw their own learning. The course took a broad learning approach with many moderately demanding exercises, each requiring time, logic, and reasoning to execute. However, it was valuable for initiating discussions and self-reflection.

After getting access to the course, the pre-testers went through the lessons one by one and took notes step by step. Since the project involved pretesting both the MOOC content and the MOOC platform, it was time-consuming to check all the criteria for both the course and the platform while still keeping up with the course material and following the subject. After completing the course, the pre-testers filled in the quality assessment questionnaire suggested by Löfström et al. (2006). In addition, each pre-tester prepared an individual feedback report consisting of their notes, their experiences and thoughts during the course, and suggestions for improving the content or the platform. The feedback reports and completed questionnaires were sent to both the MOOC designers and the supervisor.
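To give a concrete picture of what such step-by-step note-taking could look like in a structured form, here is a minimal sketch; the record fields and the sample note are hypothetical, not the format the pre-testers actually used.

```python
from dataclasses import dataclass

@dataclass
class PretestNote:
    """One observation recorded while working through a lesson."""
    lesson: str
    concerns: str        # "content" or "platform"
    observation: str
    suggestion: str = ""

def to_report(notes):
    """Render the collected notes as a plain-text feedback report."""
    lines = []
    for note in notes:
        lines.append(f"[{note.lesson} / {note.concerns}] {note.observation}")
        if note.suggestion:
            lines.append(f"  Suggestion: {note.suggestion}")
    return "\n".join(lines)

# Hypothetical note, not taken from the project's actual reports:
notes = [PretestNote("Module 2", "platform",
                     "The video upload step is not clearly signposted",
                     "Add a short on-screen hint before the upload form")]
print(to_report(notes))
```

Separating content observations from platform observations at note-taking time makes it straightforward to route each item to the course designers or the platform team later.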

After completion of the course and the pretesting process, the MOOC organizers reviewed the feedback reports and discussed the content with the pre-testers over several email exchanges. A certificate of participation was also issued to each pre-tester as a reward.

MOOC platform provider’s experiences  

As a collaborative professional learning platform and a pioneer in video-based teacher professional development since 2008, IRIS Connect had previously been considered a potential next-generation MOOC platform.

Although the platform was not designed for hosting a MOOC, the advantages of some of its features (e.g., interactivity, security, and always-on mode in an open-source context) and the international community of educators it already engages outweighed the limitations of currently lacking some features found in conventional MOOC platforms (e.g., course facilitation and progress tracking).

Being actively involved in the project, from designing and facilitating experiments on technology-enhanced pedagogy to co-creating the resources for video-supported collaborative learning with the pedagogical experts from the HEI project partners, further influenced the IRIS Connect team's decision to host the ViSuAL project's MOOC. The learning design process of the MOOC, from both the content and the user interface perspective, was a rich and fulfilling experience in combining the MOOC creators' diverse strengths and competencies in pedagogy, instructional design, interactive educational technology, and blended learning: a great example of sustainable HEI-business collaboration. A vital part of that process was the usability testing of the MOOC and the pretesting feedback, both the quality assessment questionnaire and especially the individual feedback reports.

The comprehensive, qualitative feedback from the student pre-testers was invaluable not only for enhancing the ViSuAL MOOC and therefore its participants' experience; it went beyond that by providing user insight that will inform our approach in the further development of the IRIS Connect platform from both the content and the interface perspective. The collaborative process of co-creating and pretesting the MOOC also inspired a commitment to incorporate such collaboration with education stakeholders in further developments of IRIS Connect, in order to fulfil our vision of being a hub for education practitioners in the education ecosystem and of bridging the gap between theory and practice.

MOOC pre-testers’ experiences  

Both volunteers had previously attended online courses, but this was their first experience of pretesting a MOOC. They found the literature suggested for study before starting the test quite insightful and informative. The literature review helped the pre-testers pay closer attention to details such as the pedagogical approach employed in the course, the collaboration opportunities, the importance of functional links and resources, the flow of the course, and so on.

The MOOC pre-testers joined randomly formed small groups for peer assessment and engaged with supporting materials, shared comments, discussions, and other users' experiences to gain different perspectives, as well as with learning analytics and variations in MOOC design for different user needs. From the pre-testers' point of view, it was sometimes challenging to decide whether a given piece of feedback was merely a personal opinion or a standard requirement of online courses. During the assessment process, they needed to stop from time to time and think about the content, the way it was presented, and the learning objective it was supposed to support.

The user interface must be intuitive and the navigation easy and self-explanatory; in essence, the learning activities and the educational process need to work smoothly. Having used the VSCL MOOC, the pre-testers found that its structure enables a stronger connection between work and learning, facilitates just-in-time learning, and empowers professional development for teachers and change agents in education systems. Given how much of a MOOC's quality rests on user satisfaction and applicability, going through the entire process to improve the usability of the interface was worthwhile.

During this project, the pre-testers gained an appreciation of the complexity of designing a quality online course, full of implementation and pedagogical details. They also learned more about the production process of online courses and about the impact of the MOOC platform on the course design phase.

All in all, from the pre-testers' point of view, it was an exciting as well as educational and informative project, and they would be glad to take part in similar projects in the future. They found the pretesting experience very valuable and encourage everybody, especially teachers, to join similar projects to learn about designing quality content and to become familiar with the related challenges and opportunities.

Final thoughts

The digital revolution has raised awareness of MOOCs and enabled their openness. Like any other product, a MOOC should undergo usability testing before it is published. In this article, a pretesting process was explored and the experiences of two pre-testers and of the MOOC platform provider were presented. From the pre-testers' perspective, the fluency and intuitive design of the MOOC interface have the highest impact on user satisfaction. The learning activities should also be aligned with the educational goals and should support the learning process. On the platform side, innovative features that promote interactivity, security, and always-on mode can enhance user satisfaction significantly.

The results of the MOOC usability testing, gathered through the questionnaires and the qualitative feedback reports, provided valuable user insights into both the content and the interface. These insights can be used to improve the MOOC before its final publication and can also be taken into account when developing new MOOCs. Based on the experiences shared in this project, all MOOC designers are strongly recommended to conduct structured user testing to find possible pitfalls, to hear feedback from users, and to address the issues before final publication.

Authors

Shokoofeh Motamedi,¹ Rashidat George,¹ Vesna Belogaska,² and Eila Burns¹

¹ JAMK University of Applied Sciences, School of Professional Teacher Education

² Director of International Development, IRIS Connect, https://www.irisconnect.com/uk/

 

References

  • Abuhlfaia, K., & De Quincey, E. (2019). Evaluating the Usability of an E-Learning Platform within Higher Education from a Student Perspective. https://doi.org/10.1145/3371647.3371661
  • Alturkistani, A., Lam, C., Foley, K., Stenfors, T., Blum, E. R., Van Velthoven, M. H., & Meinert, E. (2020). Massive Open Online Course Evaluation Methods: Systematic Review. Journal of Medical Internet Research, 22(4). https://doi.org/10.2196/13851
  • Barnes, C., & Hanna, M. (2017). An Analysis of Student Perceptions of the Quality and Course Satisfaction of Online Courses. Retrieved 12 December 2020 from https://www.researchgate.net/publication/317138073_an_analysis_of_student_perceptions_of_the_quality_and_course_satisfaction_of_online_courses
  • Bates, A. W. (2015). Teaching in a Digital Age. Tony Bates Associates Ltd. Retrieved 12 December 2020 from https://opentextbc.ca/teachinginadigitalage/
  • Chrum, T. (2012). An Introduction To Website Usability Testing. Usability Geek. https://usabilitygeek.com/an-introduction-to-website-usability-testing/
  • Illinois Online Network. (2018). Quality Online Course Initiative Rubric. University of Illinois Springfield. https://uofi.app.box.com/s/afuyc0e34commxbfn9x6wsvvyk1fql8p
  • Johansson, S., & Frolov, I. (2014). An Adaptable Usability Checklist for MOOCs: A usability evaluation instrument for Massive Open Online Courses. Retrieved 12 December 2020 from http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-90330
  • Löfström, E., Kanerva, K., Tuuttila, L., Lehtinen, A., & Nevgi, A. (2006). Quality Teaching in Web-Based Environments: Handbook for University Teachers.
  • Mifsud, J. (2015). Usability Metrics—A Guide To Quantify The Usability Of Any System. Usability Geek. https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/
  • Varonen, M., & Hohenthal, T. (2017). Quality Criteria for Online Implementations. eAMK. https://eamk.fi/globalassets/tutkimus-ja-kehitys–research-and-development/tki-projektien-lohkot-ja-tiedostot/eamk/teema-1/laatukriteerit/kuvat-en/eamk_quality_criteria_for_online_implementations.pdf
  • Zardari, B. A., Hussain, Z., Arain, A. A., Rizvi, W. H., & Vighio, M. S. (2020). QUEST e-learning portal: Applying heuristic evaluation, usability testing and eye tracking. Universal Access in the Information Society, 1–13. https://doi.org/10.1007/s10209-020-00774-z

URN: http://urn.fi/urn:nbn:fi:jamk-issn-2489-2386-13