Troubleshoot to learn

TYPE: Bachelor thesis, IIT Guwahati (India)

ROLE: Interaction designer

DURATION: 9 months (August 2012 - April 2013)

Introduction

Traditional learning methods and curriculum design at schools seldom foster critical thinking. Interventions that bring back the 'why' have never been more relevant than in this era of unfiltered information sources (the internet).

During our Bachelor's graduation project, in a team of two, we set out to make a mark on the world of e-learning: we designed an educational platform where students study their curriculum by inquiring rather than consuming information!

Keywords:
e-Learning, Interaction design, Mid-fi prototyping, Wizard of Oz, Effectiveness testing, Critical thinking, Problem-based learning

Goals

Given our easy access to them in our engineering school, we chose Mechanical Engineering undergraduates as our target group, and their machine mechanics course as the target curriculum.

Primary goal: Design a pedagogical intervention that demands critical thinking to make progress; based on our research, solving an ill-defined problem triggers many aspects of critical thinking.

Secondary goal: Overcome the lack of applied mechanical engineering skills; our early research showed that theoretical learning alone was insufficient for succeeding at campus placements.

Design outcome

An e-learning platform that challenges you to troubleshoot a machine and learn about its working mechanisms by inquiring and actively tinkering with it.

The platform provides you with tools like: 

- An interactive model of the machine for inquiring about specific information
- Hypothesis-building and testing features
- A game-like reward feature for engagement
- Dynamic scaffolding to support learning while keeping it challenging

Iterative design process

Utilizing several design research methods, we took an iterative approach to build this platform.


Steps towards the final outcome:


- Sharpened the problem definition: Moving back and forth between literature review and problem search in the institute environment, we analyzed several target groups and curricula, ranging from well-defined math problems to ill-defined design problems.

- Understood the learners' most pressing needs: Task analysis in a Wizard of Oz fashion to uncover information needs in the learning process, plus interviews with students and a domain expert (a professor) to understand perceived gaps between theoretical and applied machine knowledge.

- Sketched the mental model: Through the Wizard of Oz method, we also mapped the flow of solving a challenge. We took learner psychology into account; e.g., designing to keep cognitive load low enough that the learner can grasp new concepts easily, while at the same time keeping it high enough to challenge them, push their limits, and keep them engaged. Dynamic scaffolding adapted the learner's load to the task and their performance.
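The dynamic scaffolding idea can be illustrated with a minimal sketch: raise the level of support when the learner struggles, lower it when they coast, so cognitive load stays in a productive range. This is a hypothetical illustration, not code from the actual prototype; the function name, outcome representation, and thresholds are all invented.

```python
# Hypothetical sketch of dynamic scaffolding. The scaffold (hint) level
# rises when recent attempts mostly fail and falls when they mostly
# succeed, keeping the challenge neither trivial nor overwhelming.
# Thresholds (0.4 / 0.8) and level bounds are illustrative assumptions.

def adjust_scaffold(level, recent_outcomes, min_level=0, max_level=3):
    """Return the next scaffold level given recent attempt outcomes
    (a list of booleans, True = successful attempt)."""
    if not recent_outcomes:
        return level
    success_rate = sum(recent_outcomes) / len(recent_outcomes)
    if success_rate < 0.4:    # learner is struggling: add support
        level += 1
    elif success_rate > 0.8:  # learner is coasting: raise the challenge
        level -= 1
    return max(min_level, min(max_level, level))

# Three failed hypotheses in a row push the scaffold level up;
# a run of successes brings it back down.
print(adjust_scaffold(1, [False, False, False]))      # -> 2
print(adjust_scaffold(2, [True, True, True, True]))   # -> 1
```

In the actual platform this adaptation was a design principle rather than a fixed formula, but the feedback loop (observe performance, adjust support) is the same.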

- Selected features and layout: Used a card-sorting-style method with 15 participants to sketch out the feature set and screen layout.

- Defined user-system actions and translated them into wireframes: Based on all our prior understanding of which features are useful during the flow of troubleshooting a mechanical machine (here, the use case was a centrifugal pump).

- Prototyped the e-learning platform: We built a semi-functional, high-fidelity prototype to test the effectiveness of key features. A friend with programming expertise created the front end and back end of the prototype from the system requirements we gave him. The look and feel was kept crude to emulate a game-like environment for heavy machinery.

We created content guidelines for the centrifugal pump, the challenge, its questions, and the related learning material, based on knowledge provided by a Mechanical Engineering professor (our domain expert).


- Effectiveness and usability testing: With 5 students, we qualitatively tested the effectiveness of the platform using two key metrics: performance on a domain knowledge test after using the system for an hour, and engagement level during use. For usability, we measured perceived ease of use, learnability of the system, comprehensibility, and the number and types of errors.

For usability, the system performed fairly well. Key improvement points were: improving the visibility of error messages/notifications (originally shown in a bar sliding in from the top of the screen), better error recovery, and easier access to information related to a term.

The test results were also positive on effectiveness. 4 out of 5 participants performed well on the domain knowledge test (questions prepared by the professor). They were engaged enough to keep searching for all the other correct answers, even though they had technically passed the challenge after finding one of several correct answers.

- Implemented the insights in a redesign: For the final design outcome, we incorporated key improvements from user feedback into the screen designs, including a revised information architecture, stronger emphasis on the key features, and a cleaner look and feel for learning while retaining the game-like visual quality.

What I learnt:

This project was important in my design career because:

- It gave me experience in following a complete interaction design cycle for solving an important and complex problem
- It allowed me to experiment with and use different research methods
- I could truly utilize the opportunities provided by my institute and incorporate inter-disciplinary knowledge in my design
- The project showed me the importance of psychology in human-computer interaction, and drove me to pursue higher education focused on the intersection of psychology, design, and ICT.

I became passionate about 'Designing for education' and 'UX for intelligent systems'.

Motivated to dive deeper into this field, I took on a side project during my winter holidays and co-authored a Springer conference paper, 'Intelligent Interactive Tutor for Rural Indian Education System' (Omna Toshniwal, Pradeep Yammiyavar).