4.
Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic
Framework for Learning Analytics. Journal of Educational Technology & Society, 15(3).
http://ifets.info/journals/15_3/4.pdf
5. @HDrachsler, #LASI_NL, Zeist, Netherlands
Slide 5 / 29 June 2014
Lecture structure
1. Why an LA data standard?
2. What data standards are out there?
3. In-depth example: xAPI
4. Different LRS designs
5. Outlook
6. Sophistication model
Siemens, G., Dawson, S., & Lynch, G. (2014). Improving the Quality and Productivity of the Higher Education
Sector – Policy and Strategy for Systems-Level Deployment of Learning Analytics. Canberra, Australia: Office of
Learning and Teaching, Australian Government. Retrieved from
http://solaresearch.org/Policy_Strategy_Analytics.pdf
7. Heterogeneous TEL systems not made for
Learning Analytics
• Various heterogeneous data
sources
• No metadata standards
• No proper description of
data fields
• No unique user ID in the
different systems
• Not intended for evaluation
and educational
interventions
• No comparison of effective
methods
8. • RQ1: How can more accurate and thus more relevant
recommendations be generated by using the social data
originating from the social activities of users within an
online environment?
• RQ2: Can the use of the inter-user trust
relationships that originate from the social activities
of users within an online environment further
evolve the network of users?
Example RecSys study
@SoudeFazeli
10.
Educational Data
Drachsler, H., et al. (2010). Issues and Considerations regarding Sharable Data Sets for
Recommender Systems in Technology Enhanced Learning. 1st Workshop on Recommender
Systems in Technology Enhanced Learning (RecSysTEL@EC-TEL 2010), September 28,
2010, Barcelona, Spain.
Verbert, K., Manouselis, N., Drachsler, H., and Duval, E. (2012). Dataset-driven Research
to Support Learning and Knowledge Analytics. Journal of Educational Technology & Society.
It is important to report the effects of an algorithm against a reference dataset,
to build common knowledge and obtain reproducible results.
The ACM Recommender Systems conference and the KDD Cup have worked like this for years.
11. 1. Goal
To find out which recommender algorithm performs best and is
thus suitable for social online platforms like the ODS platform
Data-driven study
Fazeli, S., Loni, B., Drachsler, H., & Sloep, P. (2014, 16-19
September). Which recommender system can best fit social
learning platforms? Presentation given at the 9th European
Conference on Technology Enhanced Learning (EC-TEL2014),
Graz, Austria. http://dspace.ou.nl/handle/1820/5800
12. 2. Method
• Testing several recommender algorithms
– Several similarity measures and nearest neighbors method
– T-index approach
• If explicit trust is available (Epinions)
• If trust is not available: similarity measures + walking algorithm
(BFS)
• Datasets
– MovieLens – standard dataset
– MACE, OpenScout, Travel well -- similar to the future ODS dataset
• Using Mahout
Data-driven study
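To make the method concrete, here is a minimal sketch of the user-based nearest-neighbour step with a Tanimoto similarity measure, as tested in the study. The data, user names, and helper functions are hypothetical illustrations, not the study's actual code (which used Mahout).

```python
# Sketch: Tanimoto similarity + nearest neighbours on binary usage data.
# Profiles map each user to the set of items they interacted with
# (hypothetical data, in the style of the MACE/OpenScout usage logs).

def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) coefficient on two users' item sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def nearest_neighbours(target: str, profiles: dict, n: int = 3):
    """Return the n users most similar to `target` by Tanimoto score."""
    scores = [
        (tanimoto(profiles[target], items), user)
        for user, items in profiles.items()
        if user != target
    ]
    scores.sort(reverse=True)
    return [user for score, user in scores[:n] if score > 0]

def recommend(target: str, profiles: dict, n: int = 3):
    """Recommend items the neighbours used but the target has not seen."""
    seen = profiles[target]
    candidates = set()
    for user in nearest_neighbours(target, profiles, n):
        candidates |= profiles[user] - seen
    return sorted(candidates)

profiles = {
    "anna": {"video1", "quiz1", "paper3"},
    "ben":  {"video1", "quiz1", "paper5"},
    "carl": {"paper3", "paper4"},
}
print(recommend("anna", profiles))  # -> ['paper4', 'paper5']
```

The same neighbourhood sizes as in the study's setting (n = 3, 5, 7, 10) could be swept by varying the `n` parameter.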
13. 3. Setting
• v = 0.1 (Condition 1), L = 2 (Condition 2)
• Training set 80% and test set 20%
• Sizes of neighborhoods n= (3,5,7,10)
• Size of TopTrustee list m=5
Data-driven study
14. 4. Result (F1 score)
F1 of the extended T-index and Tanimoto algorithms for
different datasets, based on the size of neighborhood
Data-driven study
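As a reminder of how the reported metric is computed: the F1 score combines precision and recall of the top-N recommendation list against the 20% held-out test set. A minimal sketch, with illustrative item IDs:

```python
# Sketch: F1 score for one user's top-N recommendation list,
# evaluated against the items held out in the test split.

def f1_at_n(recommended: list, relevant: set) -> float:
    """F1 = 2*P*R / (P+R) for a single recommendation list."""
    if not recommended or not relevant:
        return 0.0
    hits = len(set(recommended) & relevant)
    precision = hits / len(recommended)
    recall = hits / len(relevant)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical user: 2 of 5 recommended items appear in the test set.
score = f1_at_n(["i1", "i2", "i3", "i4", "i5"], {"i2", "i4", "i9"})
print(round(score, 3))  # precision 0.4, recall 2/3 -> F1 = 0.5
```

A dataset-level figure, as in the plot above, would average this per-user score over all test users.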
15. 4.2. Created trust network
[Figure: created trust network without T-index vs. with T-index]
Data-driven study
17. Aggregated Paradata
Drachsler, H., Bogers, T., Vuorikari, R., Verbert, K., Duval, E., Manouselis, N., Beham, G., Lindstaedt, S., Stern, H., Friedrich, M., &
Wolpers, M. (2010). Issues and considerations regarding sharable data sets for recommender systems in technology enhanced learning.
In N. Manouselis, H. Drachsler, K. Verbert, & O. Santos (Eds.), Elsevier Procedia Computer Science: Volume 1, Issue 2. Proceedings of
the 1st Workshop on Recommender Systems for Technology Enhanced Learning (RecSysTEL 2010) (pp. 2849-2858). doi:10.1016/j.procs.2010.08.010
20. 1. More useful analysis through the
combination of data from different sources
2. A critical mass of data for learning science
research
3. Sufficient scale of data to determine
relevance and quality of educational
resources
4. Reproducibility and transparency in
learning analytics research
5. Cross-institutional strategy comparison
6. Research on the effect of education policy
7. Social learning in informal settings
8. Learner data as a teaching and learning
resource
Aims for Data Standards
http://www.laceproject.eu/deliverables/d7-2-data-sharing-roadmap/
21. MOLAC Innovation Cycle
Drachsler, H., & Kalz, M. (2015). The MOLAC innovation cycle. Journal of
Computer Assisted Learning. (in press).
23.
• Content metadata (e.g., IEEE LOM).
• Personal Data (e.g., IMS ePortfolio, IMS LIP,
or HR-XML)
• Social metadata (ratings, tags or comments
that were intentionally contributed by the
users)
• Paradata (automatically tracked by the
system)
• Linked Data (interlinked datasets on the web
using the RDF standard)
Types of Data
24.
Metadata standards for Usage
Activity Stream
Learning Registry
NSDL Paradata
27. Context Attention Metadata
Scheffel, M., Niemann, K., Leony, D., Pardo, A., Schmitz, H. C.,
Wolpers, M., & Kloos, C. D. (2012). Key action extraction for
learning analytics. In 21st Century Learning for 21st Century Skills
(pp. 320-333). Springer Berlin Heidelberg.
Nikolas, A., Sotiriou, S., Zervas, P., & Sampson, D. G. (2014). The
open discovery space portal: A socially-powered and open
federated infrastructure. In Digital Systems for Open Access to
Formal and Informal Learning (pp. 11-23). Springer International
Publishing.
28. Context Attention Metadata
Wolpers M., Najjar, J., Verbert, K., Duval, E. (2007). Tracking Actual Usage: the
Attention Metadata Approach, Journal of Educational Technology and Society,
10 (3), 106-121.
29. How the Tin Can API works
Tin Can-enabled activities send simple statements to a Learning
Record Store (LRS).
[Diagram: e-learning, game, simulator, blog, and YouTube activities all feed statements into the LRS]
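Such a statement follows the actor-verb-object pattern defined by xAPI. A minimal sketch of one statement as JSON; the e-mail address, activity ID, and timestamp are hypothetical, and a real client would POST this payload to the LRS's `/statements` resource with authentication and an `X-Experience-API-Version` header:

```python
import json

# Minimal xAPI ("Tin Can") statement: who (actor) did what (verb)
# to which activity (object), and when. All concrete values below
# are hypothetical examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "John",
        "mbox": "mailto:john@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/activities/energy-video",
        "definition": {"name": {"en-US": "How to save energy"}},
    },
    "timestamp": "2014-05-22T15:00:00Z",
}

payload = json.dumps(statement)
print(payload[:60])  # serialized JSON, ready to send to an LRS
```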
30. Strongest candidates right now
• Released since 2012 / First release October 2015
• Tracks experiences, scores, progress, teams, virtual media, real-world
experiences (not just completions)
• Allows data storage AND retrieval (e.g., 3rd-party reporting and
analytics tools)
• Enables tracking of mobile, game, and virtual-world experiences
• Developed by an open-source community
31. Activity-driven data model
John added a photo to Open U Community Environment
Jim commented on John’s photo on Community Environment
John watched How to save energy video on ARLearn at 22.05.2014 3pm
John subscribed to Sustainable Energy on Open U at 24.05.2014 1pm
John posted My first blog post in Open U Community Environment
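The activity sentences above can be decomposed into the actor-verb-object triples that xAPI statements are built from. A hedged sketch; the verb list and the regular expression are assumptions for illustration, not part of any standard:

```python
import re

# Known activity verbs in this hypothetical activity stream,
# longest phrases first so "commented on" wins over bare words.
VERBS = ["commented on", "subscribed to", "added", "watched", "posted"]

def to_triple(sentence: str):
    """Split one activity sentence into an actor-verb-object triple."""
    for verb in VERBS:
        match = re.match(rf"(\w+) {verb} (.+)", sentence)
        if match:
            return {
                "actor": match.group(1),
                "verb": verb,
                "object": match.group(2),
            }
    return None  # sentence uses a verb we do not model

triple = to_triple("John watched How to save energy video on ARLearn")
print(triple["actor"], "|", triple["verb"])  # John | watched
```

In a real pipeline each verb would then be mapped to a verb URI (an "xAPI recipe") so that different platforms report the same action the same way.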
43. Repository of xAPI statements
48. Lessons Learned
• xAPI
– xAPI allows too much freedom of choice (an authority for xAPI
recipes is needed; ECO as a blueprint?)
– xAPI language issues
• LRS
– Extract-Transform-Load layer for interoperability
– Meta-accounts for multiple data streams
• Data
– Are activities all we need? (Text-based analytics)
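Two of the LRS lessons above can be sketched together: an Extract-Transform-Load step normalises events from heterogeneous systems, and a meta-account table resolves per-system user IDs to one institution-wide identity before loading into the LRS. All system names, IDs, and mappings here are hypothetical:

```python
# Sketch: ETL with meta-account resolution. Each source system has
# its own user IDs; the meta-account table joins them into one
# identity so statements from different streams become comparable.

META_ACCOUNTS = {            # (system, local user ID) -> meta-account
    ("moodle", "jdoe"): "user-001",
    ("blog", "john.d"): "user-001",
}

def transform(raw: dict) -> dict:
    """Turn a raw platform event into a normalised activity record."""
    meta_id = META_ACCOUNTS.get((raw["system"], raw["user"]))
    return {
        # fall back to a system-qualified ID if no meta-account exists
        "actor": meta_id or f'{raw["system"]}:{raw["user"]}',
        "verb": raw["action"],
        "object": raw["target"],
        "source": raw["system"],
    }

events = [
    {"system": "moodle", "user": "jdoe", "action": "viewed", "target": "quiz1"},
    {"system": "blog", "user": "john.d", "action": "posted", "target": "post7"},
]
records = [transform(e) for e in events]
print({r["actor"] for r in records})  # both streams resolve to {'user-001'}
```

The "load" step would then convert each record into an xAPI statement and store it in the LRS.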
53. Ice, P., Díaz, S., Swan, K., Burgess, M., Sharkey, M., Sherrill, J., & Okimoto, H. (2012). The
PAR Framework Proof of Concept: Initial Findings from a Multi-Institutional Analysis of
Federated Postsecondary Data. Journal of Asynchronous Learning Networks, 16(3), 63-86.
http://anitacrawley.net/Reports/PAR%20Framework.pdf
54. These slides are available at:
http://www.slideshare.com/Drachsler
Email: hendrik.drachsler@ou.nl
Skype: celstec-hendrik.drachsler
Blogging at: http://www.drachsler.de
Twittering at: http://twitter.com/HDrachsler
Many thanks for your attention!