This is the virtual presentation used at EduLearn21.
BLENDED & MOBILE LEARNING
Event: EDULEARN21
Track: Digital & Distance Learning
Session type: VIRTUAL
Abstract: https://iated.org/concrete3/view_abstract.php?paper_id=88226
Proceedings of EDULEARN21 Conference
5th-6th July 2021
ISBN: 978-84-09-31267-2
pages 1056-1066
User Experience Design and Usability Testing for Mobile Technology Support in Outdoor Education
1. USER EXPERIENCE DESIGN AND USABILITY TESTING FOR
MOBILE TECHNOLOGY SUPPORT IN OUTDOOR EDUCATION
Dr. Renée Schulz
Osaka University / University of Agder
13. Dynamic Learning Task Design
• From both sets of insights (learning tasks and game tasks),
• the structure for tasks in the DynQ core concept was outlined as:
• Trigger(s)/dependencies
• Sensor logging
• Feedback
• This structure supports the processes needed for task creation, distribution, and feedback
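The three-part task structure above could be sketched as a simple data model. All class and field names here are illustrative assumptions, not the actual DynQ implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Trigger:
    """What starts a task, e.g. reaching a location or finishing another task."""
    kind: str                      # e.g. "location", "time", "task_completed"
    params: dict = field(default_factory=dict)

@dataclass
class SensorLogging:
    """Which sensors to record, and how often."""
    sensors: list                  # e.g. ["gps", "accelerometer"]
    interval_s: float = 1.0

@dataclass
class Feedback:
    """What is reported back to teacher and student."""
    metrics: list                  # e.g. ["average_speed", "completion_time"]
    multimedia_allowed: bool = True

@dataclass
class DynQTask:
    """One learning task: trigger(s)/dependencies + sensor logging + feedback."""
    title: str
    triggers: list
    logging: SensorLogging
    feedback: Feedback
```

A task created by a teacher would then bundle all three parts, which is what makes centralized creation and distribution straightforward.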
16. UX and User Testing Results
System Testing and Evaluation
• Most problems and limitations lie in the areas of communication between devices and
• cloud storage when students are in a zone without cellular connection/Wi-Fi for a long
time.
• Saving battery/ weather/ temperature is not good for batteries and touch screens
• Calculation of results, CPU/RAM power, cloud connection
• Size of GUI/ gloves
• Intervals of logging
• Touch gestures
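The connectivity problem above is commonly handled by buffering sensor logs locally and flushing them to cloud storage once a connection returns. A minimal sketch of that pattern (the class and its API are hypothetical, not part of the tested system):

```python
from collections import deque

class OfflineLogBuffer:
    """Queue sensor readings on the device; upload only when online."""

    def __init__(self, upload):
        # upload: callable taking one reading, returning True on success
        self.pending = deque()
        self.upload = upload

    def record(self, reading):
        """Always succeeds locally, regardless of connectivity."""
        self.pending.append(reading)

    def flush(self, online):
        """Try to drain the queue; stop on first failure so order is kept."""
        sent = 0
        while online and self.pending:
            if self.upload(self.pending[0]):
                self.pending.popleft()
                sent += 1
            else:
                break  # upload failed: keep the reading queued
        return sent
```

Keeping failed uploads at the head of the queue preserves the chronological order of readings, which matters for later metric calculation.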
17. UX and User Testing Results
UX of Task and Sensor Data
• Recorded data must be specified very precisely
• This is important for logging and feedback as well as for task triggers
• “Full freedom” of choice is too confusing
• Useful examples could be:
• Average speed during a task, maximum speed during a task, highest acceleration (for some tasks) over a certain amount of time, number of accelerations, and time spent on a task/completion time
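Some of the example metrics above can be derived directly from a time-stamped speed log. A sketch, assuming each sample is a `(seconds, speed_m_per_s)` pair (the function name and log format are illustrative):

```python
def task_metrics(samples):
    """Derive slide-style metrics from a log of (t_seconds, speed_m_s) pairs:
    average speed, maximum speed, and completion time of the task."""
    times = [t for t, _ in samples]
    speeds = [v for _, v in samples]
    return {
        "average_speed": sum(speeds) / len(speeds),
        "max_speed": max(speeds),
        "completion_time": times[-1] - times[0],
    }
```

Precisely naming each metric like this is what the slide argues for: the teacher picks from a short, well-defined list rather than having “full freedom” of choice.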
18. UX and User Testing Results
User Interaction Flow
• Interactions need to be short/fast (cold, snow, weather, clothes)
• Main functions in plain sight
• Only own courses shown (both for teachers and students)
• Interaction affordances clear
• Sensor usage must take the battery life of users' smartphones into account
• Personalized tracking; settings must be changeable
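Battery-aware sensor usage can be as simple as stretching the sampling interval as the battery drains. A minimal sketch; the thresholds and factors are illustrative assumptions, not values from the study:

```python
def logging_interval(battery_pct, base_interval_s=1.0):
    """Lengthen the sensor sampling interval as the battery drains.
    Thresholds (50%/20%) and factors (x2/x5) are purely illustrative."""
    if battery_pct > 50:
        return base_interval_s          # full-rate logging
    if battery_pct > 20:
        return base_interval_s * 2      # half-rate logging
    return base_interval_s * 5          # conservation mode
```

Coarser logging trades metric resolution for battery life, which matters most in exactly the cold outdoor conditions the slides describe.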
19. UX and User Testing Results
Multimedia Content Requirements
• Multimedia content additions, e.g., pictures and videos for training
• Reason: observing and understanding techniques
• Recording options to send multimedia content back to the teacher and for self-evaluation
20. UX and User Testing Results
Learning Task Location
• Start and end points of tasks are important – their selection must be easy
• Should both locations be shown together, plus a path between them? (open question)
• Clear creation options for triggers, for visible and hidden tasks, and for what feedback should be collected
• Clear creation instructions for both teachers and students
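A location-based start/end trigger reduces to a proximity check between the student's GPS position and the task's anchor point. A sketch using the standard haversine formula (the function name and radius parameter are illustrative):

```python
import math

def within_radius(lat1, lon1, lat2, lon2, radius_m):
    """True if two WGS84 points are within radius_m metres of each other
    (haversine great-circle distance) - e.g. student has reached a task's
    start or end point."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```

The trigger radius is a design choice: too small and GPS jitter causes missed starts; too large and hidden tasks reveal themselves early.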
21. UX and User Testing Results
Students' Perspective on Learning Task Design
• Positive impact of feeling the progression of learning new skills, based on the guidance and tasks given by the teacher
• Positive experiences were learning a new technique, and understanding and mastering techniques based on the teacher's explanations and guidance
• The challenge of fast progression is connected to being motivated and pushed further by the teacher
• Individual feedback was appreciated by most students
• Beginner students experienced long waiting times, because they had to go one by one to get feedback from the teacher
• A mismatch between task progression and the progression of individual students in a group can be critical
“I should have realized my own limitations. He [the teacher] wanted
us to ollie [a snowboard technique for jumping]. That was the worst (scary).”
22. Discussion and Conclusion
• New technologies/sensors emerge and become more feasible
• More options for task triggers and feedback
• Personalized tasks and training
• Broadcasting information
• Generalizable? Sport-specific aspects, but also interesting new takes on learning tasks
23. References and further links
This research was part of my PhD research project, and the full dissertation can be found here: http://hdl.handle.net/11250/2499694
[1] R. Schulz, G. M. N. Isabwe, & A. Prinz, “Development of a Task-driven Mobile Teaching Tool for Enhancing Teachers' Motivation,” Proceedings of the 8th International Conference
on Computer Supported Education (CSEDU 2016), vol. 1, pp. 251-258, April 2016.
[2] R. Schulz, G. M. Isabwe, & F. Reichert, “Investigating teachers’ motivation to use ICT tools in higher education,” 2015 Internet Technologies and Applications (ITA), pp. 62-67, IEEE,
September 2015.
[3] A. C. Borthwick, C. L. Anderson, E. S. Finsness, & T. S. Foulger, “Special article personal wearable technologies in education: Value or villain?,” Journal of Digital Learning in Teacher
Education, vol. 31, no. 3, pp. 85-92, 2015.
[4] M. Bower, & D. Sturman, “What are the educational affordances of wearable technologies?,” Computers & Education, vol. 88, pp. 343-353, 2015.
[5] S. E. Stahl, H. S. An, D. M. Dinkel, J. M. Noble, & J. M. Lee, “How accurate are the wrist-based heart rate monitors during walking and running activities? Are they accurate
enough?,” BMJ open sport & exercise medicine, vol. 2, no. 1, 2016.
[6] M. Swan, “Sensor mania! the internet of things, wearable computing, objective metrics, and the quantified self 2.0,” Journal of Sensor and Actuator networks, vol. 1, no. 3, pp.
217-253, 2012.
[7] G. Bieber, M. Haescher, & M. Vahl, “Sensor requirements for activity recognition on smart watches,” Proceedings of the 6th International Conference on PErvasive Technologies
Related to Assistive Environments, pp. 1-6, May 2013.
[8] A. Kalyanaraman, J. Ranjan, & K. Whitehouse, “Automatic rock climbing route inference using wearables,” Adjunct Proceedings of the 2015 ACM International Joint Conference on
Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 41-44, September 2015.
[9] S. Nylander, M. Jacobsson, & J. Tholander, “Runright: real-time visual and audio feedback on running,” CHI'14 Extended Abstracts on Human Factors in Computing Systems, pp.
583-586, 2014.
[10] M. Alhonsuo, J. Hapuli, L.Virtanen, A. Colley, and J. Häkkilä. “Concepting wearables for ice-hockey youth,” Proceedings of the 17th International Conference on Human-Computer
Interaction with Mobile Devices and Services Adjunct, pp. 944–946, 2015.
[11] S. De Freitas, & M. Levene, “Evaluating the development of wearable devices, personal data assistants and the use of other mobile devices in further and higher education
institutions,” JISC Technology and Standards Watch Report, (TSW030), pp. 1-21, 2003.
[12] T. Coffman, & M. B. Klinger, “Google glass: using wearable technologies to enhance teaching and learning,” Society for information technology & teacher education international
conference, pp. 1777-1780, Association for the Advancement of Computing in Education (AACE), March 2015.
[13] T. Wu, C. Dameff, & J. Tully, “Integrating Google Glass into simulation-based training: experiences and future directions,” Journal of Biomedical Graphics and Computing, vol. 4,
no. 2, pp. 49-54, 2014.
[14] ISO DIS. 2009. 9241-210: 2010. Ergonomics of human system interaction-Part 210: Human-centred design for interactive systems. International Standardization Organization
(ISO). Switzerland (2009).
[15] F. Wang, & M. J. Hannafin, “Design-based research and technology-enhanced learning environments,” Educational technology research and development, vol. 53, no. 4, pp. 5-23,
2005.
[16] R. Schulz, A. Prinz, & G. M. N. Isabwe, “The Use of Game World Tasks Concepts in Higher Education,” Serious Games. JCSG 2016. Lecture Notes in Computer Science, vol. 9894,
pp. 67-72, Springer, Cham, September 2016.