6. Goals of workshop facilitation
Shared understanding of ROER4D-IS
Harmonization of impact studies
Sharing the OERRH experience
Refinement of ROER4D-IS proposals
7. Methods for workshop facilitation
Critical discussion of existing research
Peer review
Facilitating reflection on methods and claims
Exploration of key concepts
Making explicit what is assumed or implied
Identifying problematic areas
Effective planning
8. Things to avoid
✖ Dictating what methodology should be
✖ Being disrespectful or patronising
✖ Pleasing me
9. Icebreaker
Where in the world?
Name, institution, country
One key question
Swapping places to present partner institutions
10. Overview of ROER4D-IS (CHW)
Overview of objectives, activity, progress
Expectations of impact studies
[see other slide deck]
11. ROER4D Objectives
Build an empirical knowledge base
Develop research capacity
Build scholarship networks
Openly curate research
Communicate research to influence policy
12. ROER4D Strategies
Knowledge building (degrees of openness, OA)
Building research capacity (harmonization)
Build networks through conferences, workshops, etc.
Open curation (repositories, social media)
Collaborative, supportive approach to leadership
Seeking out creative synergies
Effective (agile?) methods for collaboration
Iterative evaluation
13. Expectations of ROER4D-IS
Case studies provide detail relative to broad understanding
of the Global South developed through survey work and
ROER4D as a whole
Balancing needs of network with individual needs
Open by default: CC-BY, open data, OA publishing
15. AVU / Teacher Education in Sub-Saharan Africa
Need for trained teachers and updated curriculum
OER offer promise of addressing issues of access, quality, cost
AfDB / UNDP resources in core subjects (Teacher Education)
Fullan (2006) theory of change underpins change knowledge
Examination of the conditions that sustain OER use
Comparative analysis across 12 institutions
Participatory approach to the research; qualitative data;
phenomenology
16. Darakht-e Danesh / Afghanistan
Conflict has destroyed educational infrastructure
OER gives educators independent access to content
OER supports much needed adaptation and localization
DD Library accessed via web, e-learning lab and mobile
“Effective measurement” of impact on teaching quality
Assumption that access to CPD resources will improve learning
outcomes (via improved literacies/competences)
Survey-based approach (which questions?) supported by analytics
from the learning lab and website access; student records
Theory of change: how is openness playing a role?
17. OER Impact in Asian non-formal ed. / Mongolia, India
Plurality of ‘impacts’ (knowledge, skills, aspirations, attitudes) on
learners and trainers from various OER types
Focused on strategies for collaboration and sharing between formal
and non-formal learning providers
Identify policies that improve quality and affordability of learning
Using Bennett’s (1979) hierarchy of outcomes to evaluate impact
Performance indicators = quantitative, qualitative, financial
Open = openly licensed? (If not, what?)
18. OER in teacher education / OU Sri Lanka
Action Research methodology (communities of practice)
Fullan (1993) as a framework for understanding change
4 hypotheses: changing pedagogical beliefs & practices; reducing the cost
of learning; improving the quality of learning
Running workshops to raise awareness
Stakeholders: learners, teachers from six provinces & various levels of
study, subjects, etc. (nb. teachers as learners)
Interpretative Phenomenological Analysis (IPA) as organising
framework for qualitative data collected – emerging themes / meanings
19. OU UK / Teacher Education in E. Africa
Some research suggests that ‘quality’ teachers improve learning
National policies advocate ‘learner-centred’ education but this is vague
Focus on co-construction of knowledge as feature of openness
TESSA is a consortium of OER-producing universities & other
organizations that developed a repository of OER for teacher learning
Practitioner responses to OER – attitudinal? Wider changes?
5 institutions: qualitative data; interpretation; phenomenology
Ontological & epistemological ‘shifts’ – is this clear?
How precise a conception of openness is appropriate here?
20. Practices and Openness in African HE / UCT
Global South tends to be seen as a recipient rather than provider
UCT has several MOOCs available or in production (FutureLearn)
Various dimensions of openness: access, licensing, instruction
Impact of MOOCs on educator and student practice & views of openness
Impact of MOOCs on the valuing and repurposing of OER
How MOOCs initiate OER use and creation
Methods: surveys, interviews, learning analytics, case studies
Attempt to map research questions to MOOC development cycle
21. Cost-Effectiveness Analysis of OER / U Philippines OU
Comparison of open vs non-open course development costs
Quasi-experimental research design
Participants chosen randomly from three disciplines (education,
health, management)
Strict separation of OER vs standard groups
Measuring: teacher competence; learner performance; quality of
materials – but how? Key indicators around savings per unit, efficacy
22. Virtual University Pakistan / Impact of OER in Pakistan
Study split between two institutions
Target of 88% ‘literacy’ by 2015 – only 60% at the moment
Internet access and use is rising (nb. laptop scheme)
Focus on lecture delivery; student performance; policy
Large scale survey augmented by interviews
Using Fullan’s theory of change
COUP framework to assess cost difference and impact on student
outcomes (http://openedgroup.org/coup /
http://jime.open.ac.uk/article/view/252)
25. Enhancing Research Value Between OER Practitioners across the Global
North/South Divide Through Open Collaboration
Dr. Rob Farrow
26. OER Research Hub (oerresearchhub.org / #oerrhub)
• Research project at The Open University (UK)
• Funded by the William & Flora Hewlett Foundation for two years
• Tasked with building the most comprehensive picture of OER impact
• Organised around eleven research hypotheses
• Collaboration model works across different educational sectors
• Global reach but with a USA focus
• Openness in practice: methods, data, dissemination
28. Keyword / Research Hypothesis
Performance: OER improve student performance/satisfaction
Openness: People use OER differently from other online materials
Access: OER widen participation in education
Retention: OER can help at-risk learners to finish their studies
Reflection: OER use leads educators to reflect on their practice
Finance: OER adoption brings financial benefits for students/institutions
Indicators: Informal learners use a variety of indicators when selecting OER
Support: Informal learners develop their own forms of study support
Transition: OER support informal learners in moving to formal study
Policy: OER use encourages institutions to change their policies
Assessment: Informal assessments motivate learners using OER
‘Evidence’ is only evidence in relation to a claim or hypothesis:
the project hypotheses form the core of the metadata model.
32. Research Process
• Research instruments applied consistently across collaborations:
surveys, interview questions, focus groups, etc.
• Supplemented by integration of secondary research
• ‘Agile’ research, sprinting
• Thematic and methodological cohesion provided by research hypotheses
33. Essence of the proposal
• Synthesis and aggregation of other case studies
• Sharing networks, resources and experiences
• Comparisons with Global North
• Initial agile enquiry through OLnet, SCORE and OERRH fellows networks
• Capacity for further, responsive research
35. Synthesis
Synthesis Methods
• Isolating data by hypothesis, sector, country, or any combination
• Collaborative curation of research data
• Data visualization, reporting
• Editorial quality control exercised centrally
Validation
• Iteration through current and future patterns of evidence
• Open citation trails allow public auditing of evidence
• Community voting
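The isolation of data "by hypothesis, sector, country, or any combination" described above can be sketched as a faceted query over tagged evidence records. This is a hedged illustration only: the field names and values are invented assumptions, not the project's actual metadata schema.

```python
# Hypothetical evidence records tagged with the metadata the slide describes.
evidence = [
    {"hypothesis": "Performance", "sector": "K12", "country": "US"},
    {"hypothesis": "Performance", "sector": "HE", "country": "ZA"},
    {"hypothesis": "Access", "sector": "HE", "country": "IN"},
]

def isolate(records, hypothesis=None, sector=None, country=None):
    """Return records matching every facet that was supplied."""
    wanted = {"hypothesis": hypothesis, "sector": sector, "country": country}
    return [r for r in records
            if all(r[k] == v for k, v in wanted.items() if v is not None)]

# Any combination of facets narrows the pool; no facet returns everything.
performance_he = isolate(evidence, hypothesis="Performance", sector="HE")
```

The same pattern supports the reporting step: each filtered subset can be counted or visualised per hypothesis, sector, or country.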
44. Exercise: Activity Theory
Scandinavian school of AT seeks to synthesize several approaches, including
constructivism; pragmatism & actor-network theory
Context is important
45. Exercise: Activity Theory
We are not interested here in AT as a tool of analysis or explanation, but as a
way of describing the different elements of the socio-technical systems
around OER implementation that will be studied in ROER4D-IS
Emphasis on tacit knowledge: you know your own context
47. subject: actor(s) involved in a process
object: purpose of the system
community: social context
instruments: tools & technologies
division of labour: distribution of work / power among actors
rules: what regulates the system
[outcome: what actually happens]
49. Exercise: Activity Theory
One approach that can work for this exercise is to complete the grid before
and after the intervention to broadly identify ‘impact’
How does it compare to the stated research question? Can we begin to
refine?
Similarities and differences between contexts
Share your description - http://tinyurl.com/roer4disat
50. Exercise: Activity Theory
Goals of the exercise:
Improved description of the research context
Identification of similarities / differences across case studies
Identifying a possible partner for the peer review exercise on day 3
Steps towards a general understanding of Global South context?
52. For your research hypothesis, what
would be the “perfect” evidence or
‘proof’?
53. What would be the next best thing?
… and if everything else failed?
54. Examples from OERRH
Hypothesis: OER improve student
performance/satisfaction
Gold: Longitudinal study pre/post OER
intervention grades; control of all
variables
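The pre/post grade comparison such a "gold" longitudinal study implies can be sketched with paired scores per student. All numbers here are invented for illustration; a real study would also need the control of variables the slide mentions.

```python
from statistics import mean, stdev
from math import sqrt

# Invented grades for the same eight students before and after an OER intervention.
pre  = [62, 55, 71, 68, 60, 74, 58, 65]
post = [68, 61, 73, 75, 66, 79, 63, 70]

diffs = [a - b for a, b in zip(post, pre)]
effect = mean(diffs)                              # average grade change
t = effect / (stdev(diffs) / sqrt(len(diffs)))    # paired t statistic
```

A large positive t suggests the post-intervention gain is unlikely to be noise, though only a controlled design lets it be attributed to OER.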
55. Examples from OERRH
Hypothesis: OER improve student
performance/satisfaction
Silver: proxy data from surveys
(confidence, interest, motivation, etc.)
56. Examples from OERRH
Hypothesis: People use OER differently
from other online materials
Gold: Covert tracking of openly
licensed vs non-open materials
57. Examples from OERRH
Silver: Triangulation of survey
questions around behaviours
Bronze: Anecdotal evidence;
interviews; focus groups
58. Examples from OERRH
Hypothesis: Open education acts
as a bridge to formal education
Gold: repository analytics with click
through to formal registration
(OpenLearn)
59. Examples from OERRH
Silver: Triangulation of survey
questions around attitudes
Bronze: Anecdotal evidence;
interviews; focus groups
60. For your case study / hypotheses…
Gold:
Pre/post intervention – but what are the metrics?
Attitudinal data where this is appropriate for hypothesis
Concept mapping to illustrate changing pedagogical beliefs
Rich qualitative description of change
A theory of change that can explain patterns in findings
Establishes causal relationship between intervention and effect
Silver:
Establishing relationships of correlation
Proxies from survey data
Bronze:
61. We will pick up on this again when we
look at risk assessment (Day 4)…
63. Impact as…
a change over time (in what?)
influence (on what?)
negative / positive / neutral
immediate vs medium-long term
intended vs unintended consequences
direct vs indirect
64. Open as…
openly licensed
free
online
sharing
participatory
accessible
“unfettered” / empowering
openness as general scholarly ethos, open-mindedness
decentralization of knowledge? / democratization (of what?)
a set of practices (OEP)
directed towards social justice / public good?
65. OER as…
Context of production vs. context of use
Openly licensed resources
Amenable to the 4 Rs (reuse, revise, remix, redistribute)
Public domain
Free? (zero cost or freely available?)
Educational!
Designed to support learning?
66. Two issues in OER impact research:
1. No agreed definition/metrics for
‘impact’
2. Isolating particular influence of
openness on educational outcomes
67. OERRH strategies for amelioration:
1. Holistic, agile approach to data
collection
2. Embrace multiplicity of
interpretations
68. ROER4D strategies for amelioration:
Theories of change
1. Sharing → increased access → better lessons /
student performance
2. Viral openness / enacted practice leads to
participation
3. OER production encourages an important kind
of collaboration
69. ROER4D strategies for amelioration:
Theories of change
4. Participation in 4 Rs changes / challenges
epistemological assumptions
5. Adaptation influences quality
6. Local adaptation makes resources more locally
relevant
7. Integrating OER into teaching leads to changes
in practice
70. ROER4D strategies for amelioration:
Shared understanding of OER as free & openly licensed
Making explicit the interpretation of openness used in
context
Precise indicators of OER impact (direct/indirect)
Clarity with regards to the rationale, conceptual framework
and methodologies used
71. ROER4D Impact Studies Workshop
End of day 2
Homework = prepare for peer review of
proposals (ideally pair up based on shared
elements, e.g. sector, geography, hypotheses)
72. Self-critique of proposals
Any questionable assumptions?
Suggestions for improvement?
Can it be made clearer?
74. ROER4D Impact Studies Workshop
Day 3
It gets easier from here as we move
from difficult conceptual issues to
refining existing proposals
75. ROER4D Impact Studies Workshop
Day 3
Critique of proposals
Any questionable assumptions?
Suggestions for improvement?
Can it be made clearer?
76. AVU / Teacher Education in Sub-Saharan Africa
Lack of trained teachers / limited teaching resources for teacher education
Curriculum for maths and science is localized, but to what extent is this
integrated into policy and practice?
10 institutions with potential to further examine impact on teachers in
training – how is localization affecting them?
Main challenges around finding data that can illustrate relationship between
OER use and outcomes around teacher training
What kind of evidence? Curriculum adaptation (changes in learning
design?) plus descriptions of adaptation
77. Darakht-e Danesh / Afghanistan
Making it clearer how openness plays a role through collaboration
Making sure that this focuses on openness rather than just being a general
evaluation of the DD platform
Indicators – site analytics to measure uptake but how is improved
knowledge, learning and practice going to be measured?
Differences in patterns of access / use according to gender should be
especially interesting here
78. OER Impact in Asian non-formal ed. / Mongolia, India
Do these materials count as OER if they are not openly licensed? Shall we
just adopt the 4 Rs model? Hewlett definition?
Prioritising impact as a theme
Relating policy to practice through key PI
For each hypothesis: identify indicators and relate to theory of change
Difficulty of accessing farming materials in the Mongolian and Tamil languages
Are the OER used ‘native’ or coming from outside the community?
Learning analytics? How will this happen? Contingency plans?
Need to give adequate time for institutional approval
79. OER in teacher education / OU Sri Lanka
Methods = teaching observations; interviews; activity logs
Looking for evidence of pedagogical change: learning design
Evidence expected from analysis of teaching materials used
80. OU UK / Teacher Education in E. Africa
Variable uptake of OER, some already in place – how is this influencing
practices?
How are teacher trainees changing their understanding of ‘knowledge’ and
their own practices as a result of TESSA?
Teacher educators from 5 institutions will provide data through survey – data
will be concept mapped and used as basis of interviews, etc.
81. Practices and Openness in African HE / UCT
Lecturers express difficulty in making MOOC materials open
What is the influence of OER on the pedagogies used?
Do MOOC structures require openness? If so, what does this mean?
What is the impact of use/creation of OER on other aspects of pedagogical
practice?
Evidence expected from baseline survey of lecturer behaviour/attitudes as
well as from analysing the various artefacts that are created and shared
82. Cost-Effectiveness Analysis of OER / U Philippines OU
Interest in quality of OER, impact on cost/access
Will focus on how faculty choose and adapt OER
Mandatory openness – faculty may not use copyrighted materials and are
even obliged to become OER creators
Distinguishing direct/indirect costs?
Cost-benefit analysis may be most difficult in year one as the start-up costs will
be applied in this year
Purposive rather than randomized sampling because only some courses are
using OER
83. Virtual University Pakistan / Impact of OER in Pakistan
There is a baseline study around use of IT resources already in place to
provide comparison with OER
High dropout rates, low-quality textbooks – can these be ameliorated by OER?
Compare attrition rates of OER-using and non-OER-using students,
but be aware of the length of the study relative to the academic year
Surveys to learn about impact of OER on teaching practice
Potential difficulty of evaluating / comparing textbook quality?
Ability to separate OER and non-OER student cohorts could produce useful
comparative data but need to be clear about the metrics
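The cohort comparison above needs an explicit metric; one conventional choice is a two-proportion z test on attrition rates. This is a hedged sketch with invented cohort sizes, not the study's actual design.

```python
from math import sqrt, erfc

def attrition_z(dropped_a, n_a, dropped_b, n_b):
    """z statistic and two-sided p-value for a difference in attrition rates."""
    p_a, p_b = dropped_a / n_a, dropped_b / n_b
    pooled = (dropped_a + dropped_b) / (n_a + n_b)          # pooled attrition rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal approximation
    return z, p_value

# Invented example: 40/200 non-OER students drop out vs 25/200 OER students.
z, p = attrition_z(40, 200, 25, 200)
```

Even a significant difference only shows correlation; separating the 'open' element from other features of the intervention remains the harder problem flagged elsewhere in the workshop.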
84. Plenary discussion: review of proposals
What is the problem to which OER is a potential solution? How?
Importance (and difficulty) of separating the impact of the general
intervention and the impact of the ‘open’ elements of the intervention
Different dimensions of openness
Impact of OER on course design and pedagogical methods (how
should this be captured?)
What is the process for getting the revised proposals approved?
Can we identify synergies between the impact studies? Can these
inform the creation of working groups within the IS?
85. Plenary discussion: review of proposals
Hypotheses should be as clearly stated as possible
Hypotheses should make clear how the ‘open’ element is under
examination
The evidence that is collected should be connected clearly with both
the hypothesis and the open element of the hypothesis
All proposals should include a section on the objectives of the study
– there is a general objective for all studies (around impact of OER in
Global South) and some specific objective related to the research
questions
A further objective is concerned with the effective communication of
results to influence future practice/policy
86. Plenary discussion: review of proposals
Seek out opportunities for harmonization between the impact
studies
Share methods and research instruments where possible
90. ROER4D Impact Studies Workshop
Day 4
Time for reflection
30 mins
– rephrase hypothesis
– explicit theory of change
– describing all relevant aspects of context
– methodology
share by email with ROER4D staff
91. Harmonization workshop
Harmonization facilitates comparison across ROER4D sub-projects
and the wider research literature
Developing a model for best practice in harmonization
Aggregation and categorization of existing OER research surveys
Clarification of concepts with original research teams over 9 months
92. Research Question Harmonisation in ROER4D
Henry Trotter
ROER4D Impact Studies Workshop, Penang, Malaysia
4 December 2014
93. ROER4D Objectives
Knowledge building: 1. Build an empirical knowledge base on the use and
impact of OER in education
Research capacity: 2. Develop the capacity of OER researchers
Networking: 3. Build a network of OER scholars
Curation & Communication: 4. Curate research documents and communicate
research to inform education policy and practice
94. Research capacitation through Question Harmonisation
4 goals:
• Harmonise our research questions, where possible, with those of other
OER studies such as OER Research Hub, OER Asia, JISC OER, etc.
• Harmonise our research questions, where possible, across our 12 projects
• Use this QH process to build the research capacity of our sub-project
researchers and research associates
• Provide a model of best practices for other research
95. 1. Consulted 9 major OER surveys to develop a bank of potential questions…
98. 3. Shared Qs with researchers, showing how they would appear in survey form
100. 4. Engaged with researchers online via Adobe Connect to harmonise questions
15 synchronous sessions over 9 month period
101. …but to do so, we had to work out everyone’s time zones & best meeting time
http://roer4d.org/wp-content/uploads/2014/03/ROER4D-Participants-Time-Zones-for-2014.pdf
107. 6. Harmonised concepts as part of process (via Adobe Connect & Google Docs)
https://docs.google.com/document/d/1Iz1kVC4CYLFJBtZNm2o5ziFJKW96SjtNjhWHfTKKkbI/edit
108. 7. Piloted survey based on harmonised questions with ROER4D members and
other OER colleagues (version 1)
112. 10. Enjoined researchers to share their adaptations of the harmonised survey
for their own sub-projects via webinar sessions…
113. …and recruited some of them to share their research knowledge and
experience with us next year during the bi-weekly Adobe Connect sessions
Evaluation Question:
What research skills could YOU contribute to the research capacity building?
Formulating research instrument questions (5)
• Cheryl Hodgkinson-Williams (research questionnaire development)
• Meenu Sharma (developing research instruments)
• Sanjaya Mishra (Scale development)
• Mohan Menon (development of research tools)
• Jose Dutra (instrument development)
Analysing qualitative data (2)
• Cheryl Hodgkinson-Williams
• Tess Cartmill (using NVivo)
Developing a conceptual framework (2)
• Cheryl Hodgkinson-Williams
• Meenu Sharma
Report writing (2)
• Sukaina Walji
• Meenu Sharma
Writing a research question (1)
• Cheryl Hodgkinson-Williams
Presenting research work (1)
• Sukaina Walji
Analysing quantitative data (1)
• George Sciadas
114. Outcomes (positive)
1. Through extensive collaboration, deliberation and testing, we developed a set of
questions that were:
• well-harmonised with other large OER surveys
• sensitive to and adapted for the Southern context
• successful at obtaining useful data on academics’ creation and use of OER
2. The process allowed us to sharpen and harmonise our concepts, creating a
better understanding of the terms that we use across the entire project.
3. It created a strong sense of community amongst the researchers that
participated, a valuable outcome given that many feel alone as OER researchers in
their contexts. (This also helped fulfill ROER4D’s third objective, which is to build a
network of OER scholars.)
4. Increased the research capacity of many of the scholars that participated, which
was the broader objective of this question harmonisation effort.
115. Outcomes (negative)
1. Research capacitation was uneven for a variety of reasons. Some researchers:
• were unable to attend due to time conflicts
• were uninterested in the process
• missed the point of the exercise (despite attending sessions)
• did not avail themselves of support structures outside the webinars (mentors, etc.)
to shore up the knowledge or concepts to which they were exposed.
2. The technology (especially Adobe Connect and our institutional broadband
connections) often let us down, turning vibrant conversations into clunky, painful
interactions.
3. The process took longer than anticipated.
4. The sub-project that could have benefited most from this process, and
used the harmonised questions most extensively, essentially decided not to
use them, reducing the impact that the process could have had.
116. Lessons learned
What worked?
1. Having regular sessions: the consistency of the process was crucial for creating the opportunities
necessary to build research capacity and to develop a sense of community amongst participants.
2. Inviting researchers to share their own work: this allowed members to get valuable feedback and
to feel “heard” by their peers.
3. Working collaboratively and “openly” (within the project): the transparency of the process –
especially the network team’s creation of “public” Google docs which researchers could engage –
created greater credibility and accountability, enhancing members’ buy-in.
What didn’t work?
1. The “voluntary” model: for practical and pedagogical reasons, we chose to make this a voluntary
process, but this resulted in uneven attendance and interest.
2. Initiating the process after other key issues had already been decided: the process would have
likely run more smoothly if it had been built into the programme from the beginning, with clear
117. So the question is…
Would some sort of question or concept harmonisation process be
useful for the ROER4D Impact Studies group?
And if so, how would it work?
118. Harmonization of Impact Studies
Working from a common vocabulary (n.b. translation issues; getting
caught in semantics)
Shared methods for shared hypotheses?
Use existing ROER4D survey questions where possible
Problem of differing research paradigms / assumptions / contexts
Thematic classification of results
Harmonization of research processes?
119. Harmonization of Impact Studies
OER can expand access to education (4) (n.b. formal / informal)
Local adaptation of OER leads to improvement in learning (6)
Exposure to OER leads to open practice (6)
Reuse/re-purposing leads to changed pedagogy (7)
Integration of OER improves quality of teaching resources (3)
OER can provide alternative perspectives that are useful for teaching and
learning (2)
OER use reduces student attrition (in public schools)
Key concepts: openness; impact; quality; access; reuse; repurposing; adoption;
cost; adaptation; practice
Adoption team to share work already done in the area of concept mapping
121. Examples of exemplary OER research
There are none!
Wiley (2009) ‘Decade of Development’ – history of OER movement
McAndrew et al (2012) Assessing OER impact… (Bridge to Success)
CHW (2014) ‘Degrees of Ease…’
Schaffhauser (2014) 5 ideas for spreading OER / 5 myths of OER
122. Examples of exemplary OER research
Link to ROER4D bibliography
http://tinyurl.com/ROER4D-Bibliography
Any references provided by Raj?
123. What are the features of effective OER research?
Clear research questions
Builds on existing relevant disciplinary knowledge
Context sensitive
Original
Ethical
Robust, clearly articulated design
Clarity around assumptions
Awareness of roles/interests
Clear terminology
Explicit conceptual framework
Clear methodologies
Good analysis
Relevance
Advances thinking in the field
Replicability
Cost-effectiveness
Communication
Awareness of limitations
Reliable
124. IMPACT research is necessarily empirical (based on experience)
… but there is still going to be INTERPRETATION of the data that is collected
125. ‘Eyes that Survey the World’:
the latest data snapshot from
OER Research Hub
B. de los Arcos, R. Farrow,
L.A. Perryman, B. Pitt
The Open University, UK
oerresearchhub.org
@OER_Hub
127. Photo CC BY-NC 2.0 https://flic.kr/p/dSHr87
Data
• 20+ surveys;
• 60+ interviews with educators
and OER experts;
• 6 focus groups;
• Impact statements
129. 6,390 responses from 180 countries:
50.3% informal learners,
24.7% formal learners,
21.6% educators,
3.4% librarians;
50.1% female; 48.7% male;
64% speakers of English as first language;
9.9% declare a disability;
33.3% hold a postgraduate degree;
34.8% use OER in Science.
130. Photo CC BY-SA 2.0 marfis75 https://flic.kr/p/o4Hice
Photo CC BY-NC 2.0 Alex Proimos https://flic.kr/p/dgqpwt
134. “Over the course of an entire semester all the kids turned in on
average 82% of their homework, which is significant for me as an
instructor because that made me feel that what I was asking
them to do at home, (…) whatever it happened to be, that they
saw the meaning in doing that.”
“The greatest impact comes when I share the MERLOT website with
students. They instantly connect with others who share their best practices.
Then they develop their own best practices to share with their students and
colleagues. There is such a great ripple effect when people are willing to
share; especially when the information is easy to locate.”
135. Photo CC BY-SA 2.0 https://flic.kr/p/5BZgEa
Photo CC BY 2.0 https://flic.kr/p/6EuSQZ
136. Photo CC BY-SA 2.0 https://flic.kr/p/5BZgEa
86.3% of educators adapt OER to suit their needs
137. “The problem where I teach now is that
we have no money; my textbooks, my
Science textbooks are 20 years old,
they’re so outdated, they don’t relate to
kids (...) so I pick and pull from a lot of
different places to base my units.”
“I will maybe look and find an
instructional video that’s maybe 2
or 3 minutes long that gets to the
point better than I could, and I
would use it, or I will look for
lessons and if they are for Grade
5 or Grade 3 I don’t use all of it, I
just adapt it, I take out what I
don’t want and rearrange it.”
“What I do is I look at a lot of free resources but I don’t usually
give them directly to my students because I usually don’t like
them as much as something I would create, so what I do is I get
a lot of ideas.”
138. • I’ve created resources 95%
• I’ve created resources and published them online 44%
• I’ve created resources and published them online under
a CC license 5%
(Flipped Learning)
140. ‘I use a broader range of teaching and learning tools’ 40.6%
‘I reflect more on the way that I teach’ 37%
‘I have broadened my coverage of the curriculum’ 36.7%
‘I more frequently compare my teaching with others’ 32.1%
141. “It used to be that when I thought about preparing for a lesson I would
look at a book and see what they did and I then would
kind of teach a lesson similar to it but now I can go online
watch a video or look at somebody else’s material that they put out there,
see what they’re doing and either modify what they’re doing and bring it
into my classroom or just get a totally different perspective on it and
allow my students to get multiple perspectives on a
topic.”
145. “Down the road they may. Students talk to other potential
students. When they find out that teachers care about cost
and readability, they are more likely to choose your college”
“Since we are all using online version, the school saves a
lot of paper and money”
“Without any doubt my students are saving money! Only
one has purchased a copy of the textbook - everyone else
uses their laptop, tablet, or prints out what they want.”
147. 57% of informal learners already have a degree
31% of formal learners used OER to try university-level content
before signing up for a paid-for course
88.4% of all learners choose OER for the opportunity to
study at no cost
149. ‘COUP’ Framework
The COUP is the Open Education Group’s framework for studying the impact
of open textbooks, open educational resources, and open pedagogy in
secondary and post-secondary education. COUP stands for:
- Cost
- Outcomes
- Use
- Perceptions
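The 'Cost' component of COUP can be illustrated with a minimal per-student savings calculation. All prices and counts below are invented for illustration; a real study would adjust for actual purchase rates rather than assume every student buys the commercial text.

```python
# Invented figures for a single course adopting an open textbook.
commercial_price = 120.00   # assumed commercial textbook price (USD)
open_print_cost = 15.00     # optional print-on-demand copy of the OER
students = 300

saving_per_student = commercial_price - open_print_cost
cohort_saving = saving_per_student * students   # upper bound on cohort savings
```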
150. ‘COUP’ Framework
Presentation by David Wiley (2012 OER Asia)
http://openedgroup.org/coup
Example of use
http://jime.open.ac.uk/article/download/2013-04/478
151. John Hilton III’s slides from
Open Education 2014
(thanks for sharing, John!)
152. A Review of Research on the
Perceptions and Efficacy of OER
(and a call for more!)
John Hilton III
http://johnhiltoniii.org
Open Education Group
http://openedgroup.org
153. Problem
A recent nationally representative survey
of 2,144 faculty members in the United
States found that “most faculty remain
unaware of OER.”
Source: Babson 2014 Survey, “Opening the Curriculum.”
154. Possible Solutions
Increasing efforts to “market” OER.
Increasing the number of outstanding OER materials.
Increasing the number of academic, peer-reviewed studies
regarding the efficacy and teacher and student perceptions
of OER materials.
155. Increasing the number of academic, peer-reviewed
studies regarding the efficacy and teacher and
student perceptions of OER materials.
The Babson 2014 survey found that college
professors rate “proven efficacy” and “trusted quality”
as the number 1 and number 2 most important criteria
for selecting teaching resources.
156. Published Efficacy and Perception
Studies
1. The article focused on efficacy or perceptions in actual
practice (not simply theory).
2. The resource(s) examined in the study needed to be OER
that were the primary learning resource(s) used in the class.
3. To be selected for inclusion in this study, the research
needed to have been published in a peer-reviewed journal
or be an institutional research report. Blog posts and
conference proceedings were excluded from this data set.
157. References
Allen, I. E., & Seaman, J. (2014). Opening the Curriculum: Open Educational Resources in U.S. Higher Education, 2014. Babson Survey Research Group. Retrieved from
http://www.onlinelearningsurvey.com/oer.html
Bliss, T., Robinson, T. J., Hilton, J., & Wiley, D. (2013). An OER COUP: College teacher and student perceptions of Open Educational Resources. Journal of Interactive
Media in Education, 2013(1), 1–25. Retrieved from http://www-jime.open.ac.uk/article/2013-04/pdf
Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive Learning Online at Public Universities: Evidence from Randomized Trials. Ithaka S+R.
Retrieved from http://mitcet.mit.edu/wp-content/uploads/2012/05/BowenReport-2012.pdf
Feldstein, A., Martin, M., Hudson, A., Warren, K., Hilton, J., & Wiley, D. (2012). Open textbooks and increased student access and outcomes. European Journal of Open,
Distance and E-Learning. Retrieved from http://www.eurodl.org/index.php?article=533
Hilton, J., Gaudet, D., Clark, P., Robinson, J., & Wiley, D. (2013). The adoption of open educational resources by one community college math department. The
International Review of Research in Open and Distance Learning, 14(4), 37–50.
Hilton, J., & Laman, C. (2012). One college’s use of an open psychology textbook. Open Learning: The Journal of Open and Distance Learning, 27(3), 201–217.
Retrieved from http://www.tandfonline.com/doi/abs/10.1080/02680513.2012.716657
Lindshield, B., & Adhikari, K. (2013). Online and campus college students like using an open educational resource instead of a traditional textbook. Journal of Online
Learning & Teaching, 9(1), 1–7. Retrieved from http://jolt.merlot.org/vol9no1/lindshield_0313.htm
Lovett, M., Meyer, O., & Thille, C. (2008). The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning.
Journal of Interactive Media in Education, 2008(1).
Pawlyshyn, N., Braddlee, Casper, L., & Miller, H. (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. Educause Review. Retrieved from
http://www.educause.edu/ero/article/adopting-oer-case-study-cross-institutional-collaboration-and-innovation
Petrides, L., Jimes, C., Middleton-Detzner, C., Walling, J., & Weiss, S. (2011). Open textbook adoption and use: Implications for teachers and learners. Open
Learning, 26(1), 39–49.
Robinson, T. J., Fischer, L., Wiley, D. A., & Hilton, J. (2014). The impact of open textbooks on secondary science learning outcomes. Educational Researcher, 43(7),
341–351.
Wiley, D., Hilton, J., Ellington, S., & Hall, T. (2012). A preliminary examination of the cost savings and learning impacts of using open textbooks in middle and high
school science classes. The International Review of Research in Open and Distance Learning, 13(3), 262–276.
158. Efficacy and Perception Studies
1. Lovett et al. (2008) measured the results of implementing an
online OER component of Carnegie Mellon University’s
Open Learning Initiative (OLI). Over two semesters, forty-four
students utilized the OER as part of this study. Researchers
examined the test scores (three midterms and one final
exam) of students who took the traditional course versus
those who utilized the OER materials. They found no
significant difference between the two groups.
159. Efficacy and Perception Studies
2. Petrides et al. (2011) utilized surveys of instructors and
students who used an open statistics textbook called
Collaborative Statistics. In total, 31 instructors and 45 students
participated in oral interviews or focus groups that explored
their perceptions of the OER which they had utilized. They
found that “Cost reduction for students was the most
significant factor influencing faculty adoption of open
textbooks” (p. 43), partly because it increased student access.
65% of students on the survey reported a preference for using
open textbooks in the future because they are generally easier
to use.
160. Efficacy and Perception Studies
3. Bowen et al. (2012) compared the use of a traditional statistics
textbook with Carnegie Mellon’s OLI at six different institutions.
Participating students were randomly assigned to either the
face-to-face class with a traditional textbook or a “hybrid” class
that used the OER materials. Both groups took the same
standardized test at the beginning and end of the semester, as
well as a final examination. 605 students took the OER version
of the course, while 2,439 took the traditional version. Students
who utilized OER performed slightly better on the standardized
exam than those who did not. However, the difference in
outcomes was not statistically significant.
161. Efficacy and Perception Studies
4. Hilton and Laman (2012) focus on introductory psychology
courses taught at Houston Community College (HCC). In the
fall of 2011, twenty-three sections composed of 690 students
used an open psychology textbook. The textbook was
available for free online, and digital supplements produced by
faculty were also freely available to HCC students. The
introduction of an open textbook was correlated with an
increase in class grade point average, an increase in the
average score on the departmental final examination, and a
lower course withdrawal rate. No causation was claimed.
162. Efficacy and Perception Studies
4. (Cont.) One hundred and fifty-seven students completed
surveys regarding their perceptions of the OER. 84% of
students surveyed agreed with the statement that “Having
a free online book helps me go to college.”
163. Efficacy and Perception Studies
5. Wiley et al. (2012) examined the standardized test scores of
students using the open textbooks in secondary science
classes in three different school districts. Approximately
1,200 students used open textbooks during this study.
Researchers examined their end-of-year standardized test
results and found no apparent differences between the
results of students who used traditional and open textbooks.
164. Efficacy and Perception Studies
6. Research by Feldstein et al. (2012) took place at Virginia State
University. OER were implemented across nine different
courses in the business department. 1,393 students took
courses utilizing OER. Researchers found that students in
courses that used OER tended to have better grades and
lower failure and withdrawal rates than their counterparts in
courses that did not use OER. While the results were statistically
significant, the two sets of courses were not identical because of
a new core curriculum employed at Virginia State University’s
business school. Thus, while these data provide interesting
correlations, they cannot establish causality.
165. Efficacy and Perception Studies
6. (Cont.) Three hundred and fifteen students completed a survey
regarding their perspective on the shift to OER. Almost
95% of responding students strongly agreed or agreed that the
OER were “easy to use,” and 78% of respondents felt that the
OER “provided access to more up-to-date material than is
available in my print textbooks.” Approximately two-thirds of
students strongly agreed or agreed that the digital OER were
more useful than traditional textbooks and that they preferred
the OER digital content to traditional textbooks.
166. Efficacy and Perception Studies
7. Bliss et al. (2013) studied OER adoption at eight different
institutions of higher education. Fifty-eight teachers and 490
students across the eight colleges completed surveys regarding
their experiences of utilizing OER. Approximately 50% of students
said that the OER materials had the same quality as traditional
textbooks, and nearly 40% said that they were better. Students
focused on several benefits of the open textbooks. Many cited
technical advantages of the digital texts. In addition, the fact that
their open texts were free seemed critical to many students. 55% of
teachers reported that the open materials were of the same quality
as the materials that had previously been used, and 35% felt that
they were better.
167. Efficacy and Perception Studies
8. Lindshield and Adhikari (2013) studied the perceptions of
students who utilized a digital OER textbook in a Human
Nutrition class. One hundred and ninety-eight students
completed a survey in which they shared their perceptions of the
OER text. “Students favorably rated their level of satisfaction,
liking the idea of the [digital OER], ease of [digital OER] use, not
having to buy a textbook, and preferring the [digital OER] versus
buying a textbook for the course.” Moreover, they found that
students disagreed or somewhat disagreed with statements to
the effect that they would like to have a traditional textbook in
addition to the OER.
168. Efficacy and Perception Studies
9. Pawlyshyn et al. (2013) report on the adoption of OER at Mercy College. In
the fall of 2012, 695 students utilized OER in Mercy’s basic math
course, and their pass rates were compared with those of the fall of
2011, in which no OER were utilized. Researchers found that the
pass rates increased from 63.6% in fall 2011 (when traditional
learning materials were employed) to 68.9% in fall 2012 when all
courses were taught with OER. Similarly, students who were enrolled
in OER versions of a reading course performed better than their
peers who enrolled in the same course using non-OER materials.
169. Efficacy and Perception Studies
10. Hilton et al. (2013) chronicle a study that took place at
Scottsdale Community College (SCC). In the fall of 2012,
OER were employed throughout five different math courses
at SCC, affecting 1,400 students. Issues with the initial
placement tests meant that only four of the courses could be
compared; nevertheless, the results of Fall 2012 (when OER
was used) compared to Fall 2011 and 2010 showed that
student results on department exams were approximately
the same before and after the OER implementation.
170. Efficacy and Perception Studies
10. (Cont.) Surveys were completed by 910 students and
eighteen faculty members at SCC who reported on their view
of the OER. The majority of students (78%) said they would
recommend the OER to their classmates. Similarly, 83% of
students agreed with the statement that “Overall, the
materials adequately supported the work I did outside of
class” (only 5% of students disagreed with this statement).
Faculty members were likewise positive about the open
materials: 50% said that they were of the same quality as
traditional textbooks, 33% said they were better, and 17% said
they were worse.
171. Efficacy and Perception Studies
11. Robinson et al. (2014) examine the use of open science
textbooks in three secondary science subjects across
several schools in a suburban school district. This rigorous
study used propensity-score-matched groups in order to
control for teacher effect, socioeconomic status, and eight
other potentially confounding variables. There were 1,274
students in each condition, treatment and control. In
examining the results of the end-of-year state standardized
test, there was a small but statistically significant difference
between the two groups, favoring those who utilized OER.
172. Efficacy and Perception Studies
12. Allen and Seaman (2014), in their Babson report, surveyed
2,144 college professors regarding their opinions on OER.
Of the 34% (729) who expressed awareness of OER, 61.5%
said that OER materials had about the same “trusted
quality” as traditional resources, 26.3% said that traditional
resources were superior, and 12.1% said that OER were
superior. 68.2% said that “proven efficacy” was about the
same, 16.5% said that OER had superior efficacy, and
15.3% said that traditional resources had superior efficacy.
173. Synthesizing
In terms of student and teacher perspectives on OER, the
perceptions of 2,115 students and 836 faculty members were
surveyed across the seven studies pertaining to perceptions of
OER. In no instance did a majority of students or teachers report
that the OER were of inferior quality. Across multiple studies in
various settings, students consistently reported that they faced
financial difficulties and that OER provided a financial benefit to
them. A general finding seems to be that roughly half of teachers
and students find OER to be comparable to traditional resources,
a sizeable minority believe they are superior, and a smaller
minority find them to be inferior.
174. Synthesizing
7,301 students were reported to have utilized OER materials
across the eight studies that attempted to measure results
pertaining to student efficacy. While causality was not
claimed by any researcher, the use of OER was sometimes
correlated with higher test scores and lower failure and/or
withdrawal rates. None of the eight studies that measured
efficacy had results in which students who utilized OER
performed worse than their peers who used traditional
textbooks.
175. Synthesizing
While some may be disappointed that OER materials have
not been found to significantly increase student learning
outcomes, this “non-finding” is nevertheless very important.
Given that (1) students and teachers generally find OER to
be as good as or better than traditional textbooks, and (2)
students do not perform worse when utilizing OER, students,
parents and taxpayers stand to save literally billions of dollars
through the adoption of OER, without any negative impact on
learning.
176. Two Requests
1. If you are aware of a peer-reviewed efficacy or
perceptions study that I have not mentioned, will you
please let me know?
2. Will you initiate research studies focused on perceptions
and efficacy of OER? Scholarly articles in this arena will
increase awareness and adoption of OER. If you would
like help in designing or implementing such studies, my
colleagues at the Open Education Group are happy to
assist.
177. Publishing is not that hard!
1. International Review of
Research in Open and
Distance Learning
2. Journal of Interactive Media
in Education
3. Open Praxis
4. Journals in subject-specific
disciplines (e.g., Science
Education, Math Education, etc.)
178. A Review of Research on the
Perceptions and Efficacy of OER
John Hilton III
http://johnhiltoniii.org
Open Education Group
http://openedgroup.org
188. Open Research: Process
“Open research is research conducted in the spirit of free and open source
software. Much like open source schemes that are built around a source
code that is made public, the central theme of open research is to make
clear accounts of the methodology freely available via the internet, along
with any data or results extracted or derived from them. This permits a
massively distributed collaboration, and one in which anyone may
participate at any level of the project.”
http://en.wikipedia.org/wiki/Open_research
189. Open Research: Process
Five principles:
1. Radical, real-time transparency
2. Make work discoverable
3. Minimise barriers to participation
4. Update in regular rhythm
5. Use social media to publicly engage
http://opensource.com/education/12/3/how-do-open-research-5-basic-principles
192. The field of ethics (or moral philosophy) involves
systematizing, defending, and recommending
concepts of right and wrong behavior.
Internet Encyclopedia of Philosophy
http://www.iep.utm.edu/ethics/
193. https://www.youtube.com/watch?v=jD-YCDE_5yw
Post-World War II war crimes
trials produced the Nuremberg
Code (1947) for research
involving human subjects
The Belmont Report (1979) sets
out the principles of ethical
research & still acts as the basis
for experimental research
Criticised by Shore (2006) for its
failure to recognize difference
(gender, ethnicity, culture,
geography, etc.)
194. Principles of Ethical Research
• Exercise control over research process
• Ethical research design, sampling, data collection
• Respect for the autonomy and self-determination of research participants
• Informed (and freely given) consent
• Privacy & confidentiality (including data management)
• Fairness, impartiality & transparency
• Non-maleficence (do no harm)
• Beneficence (maximise benefits of research)
195. Open Research
When you make research open, novel and interesting
things happen to the research process
196. Ethics in OER Research Hub (1/2)
Considerations in line with ‘traditional’ research:
• Compliance with UK Data Protection Act (1998) and the USA’s Protection of
Human Subjects (45 CFR 46)
• Risk assessment
• Free recruitment of research participants
• Institutional approvals (IRB) as needed
• Informed consent
• Data collection / storage in compliance with policy of The Open University (UK)
197. Ethics in OER Research Hub (2/2)
New dimensions resulting from greater openness:
• collaborative research design; agile working in partnership needs to maintain
epistemological integrity
• third-party data; respecting the consent provided at the time
• open release of research data; issues around privacy and security of data;
obligations to participants; wording of consent form
• open licensing of research instruments; responsibility to set standards for
research excellence
• open dissemination: blogging, open access publication, School of Open course,
duty to share findings widely
198. Openness in education
The digital nature of OER and the particular methods of producing and using them
represent a considerable challenge to existing practice in education:
• Implications for proprietary methods of publication, dissemination
• Evolving pedagogical roles & responsibilities
• Relation to academic career development
• Correct use (and attribution) of intellectual property
• Blurring boundaries between private and ‘connected’ life
• Building consensus and influencing policymakers
200. Morality and open education
“When educational materials can be electronically copied and transferred around
the world at almost no cost, we have a greater ethical obligation than ever before
to increase the reach of opportunity. When people can connect with others nearby
or in distant lands at almost no cost to ask questions, give answers, and exchange
ideas, the moral imperative to meaningfully enable these opportunities weighs
profoundly. We cannot in good conscience allow this poverty of educational
opportunity to continue when educational provisions are so plentiful, and when
their duplication and distribution costs so little.”
http://www.irrodl.org/index.php/irrodl/article/view/469/1001
Caswell, Henson, Jensen & Wiley (2008)
201. Morality and open education
The Paris Declaration on OER (2012) builds on the previous ten years of OER
advocacy, as well as Article 26 of the Universal Declaration of Human Rights
(UDHR, 1948) and Article 13.1 of the International Covenant on Economic, Social
and Cultural Rights (UN, 1966), in recognition of “the right of everyone to
education”
http://www.irrodl.org/index.php/irrodl/article/view/469/1001
203. Morality and open education
• Are we morally obliged to release OER? For its own sake? For the sake of
improving access to education as a moral good?
• Are we morally obliged to release data openly? Can there be adequate
safeguards? Is the risk too great?
• Do we need more evidence around OER efficacy?
• Education as common good supported indirectly by OER, open data, etc.
• The moral significance of inaction
204. Risks that might affect the research…
• Changing currency exchange rates
• Failure to secure IRB ethical approval(s)
• Security of the research sites / equipment
• Risk to human participants (instability)
• Robbery / criminal activity in research sites
• Lack of professionalism / skills
• Scheduling issues – academic year, etc.
• Collaborator dependencies
• Subcontracting; recontracting
• Insufficient data is gathered in time
• Key stakeholders become unavailable
• Translation issues
• Reliability of data collected online
206. OERRH Ethics Manual: Guidance
It’s not possible to anticipate every effect of openness in unmonitored spaces:
• Understanding the potential for collected information to be personally, professionally
or commercially sensitive
• Policies should make it clear when data can be shared with others and under what
conditions, licence, etc.
• Though open, dissemination strategies should respect existing agreements with
those who have been recorded or provided data
• Openly available third party materials should be used fairly.
• Data mined from social networks may need to be treated with caution
http://oerresearchhub.org/about-2/reports/oerrh-ethics-manual/
207. Summary of Guidance
• Just because it’s legal doesn’t mean that it is ethical
• Check terms & conditions thoroughly if you’re at all unsure on legal side
• Think about the control you exercise over the process and how to use
influence.
• CC-BY-NC/ND license options may give more control over data, but are
arguably less open – is there a balance to be struck?
Open versions of familiar principles:
• Minimize harm
• Ensure that consent is as informed as it reasonably can be
• Respect for privacy and personhood
219. Next Steps
15 December 2014 – 1 January 2015
Proponents to submit revised proposals, abstracts and budgets to WOU (gdhan@wou.edu.my cc
vivienchiam@gmail.com)
from 15 January 2015
Proponents can expect to receive feedback on their revised proposals
from 15 January 2015
WOU to send out Memorandum of Grant Conditions
15 January 2015 – 15 February 2015
Proponents to return signed Memorandum of Grant Conditions
February 2015
WOU to send out 1st tranche grant funds to Sub-projects (85% of project expenses)
220. Schedule of Financial and Technical Reports
01 March 2015
Official commencement date for all Sub-projects
31 August 2016
Official completion date for all Sub-projects
221. Schedule of Financial and Technical Reports
01 March 2015
Official commencement date for all Sub-projects
15 June 2015
1st Technical Reports due from Sub-project
(covering 3-month period from 1 March – 31 May 2015)
15 September 2015
2nd Technical Reports due from Sub-projects
(covering 3-month period from 1 June – 31 August 2015)
222. Schedule of Financial and Technical Reports
15 March 2016
3rd Technical Reports due from Sub-projects
(covering 6-month period from 1 September 2015 –
28 February 2016)
1st Financial Reports due from Sub-projects
(covering 12-month period from 1 March 2015 –
28 February 2016)
31 August 2016
Official completion date for all Sub-projects
223. Schedule of Financial and Technical Reports
30 September 2016
4th (Final) Technical Reports covering entire grant period,
from 1 March 2015 – 31 August 2016
2nd (Final) Financial Reports covering entire grant period,
from 1 March 2015 – 31 August 2016
from 30 October 2016
Final fund disbursements to Sub-projects (up to balance
15% of project expenses)
225. Budget Categories
Research Personnel:
Include remuneration, honoraria, allowances, and benefits
paid to the principal investigator, co-investigators and other
project staff
Project advisors may be included if they are being paid on a
regular basis and are hired for a longer period (more than a
year).
International travel costs for research personnel are covered
in a separate budget item – International Travel
226. Budget Categories
Consultants:
Include all expenses related to acquiring the services of
a consultant for a specific activity within the project
Include fees, travel, accommodation, living expenses,
and support services hired directly by the consultant for
the project
Indicate the total cost for each consultant as a single
lump sum, and use a note to give a breakdown of the
costs
227. Budget Categories
International Travel:
Include costs for international travel by project staff listed
under “Research Personnel”
Include costs for ground transportation, accommodation,
meals, airfare, departure taxes, travel insurance and other
expenses related to international travel
Adhere to travel management processes of own institution
but must follow terms stipulated in grant agreement, i.e.
economy class travel and most direct route
228. Budget Categories
Research Expenses:
Include all costs related to carrying out the
research and disseminating the research findings
Include items such as payments to people who
gather data or provide casual labour, consumable
goods, computer services, in-country travel,
reference materials, translation, printing, etc.
229. Budget Categories
Indirect costs:
Include administrative costs not directly related to the research
Include clerical, accounting, or secretarial help, communications costs,
photocopying
In total, indirect costs must not exceed 10% of the total project cost
If grant-seeking institution is absorbing the indirect costs partially or in total,
indicate accordingly and deduct the amount from the total project cost
230. Currencies and Bank Transaction Costs
Budgets must adhere to the upper limits stipulated in the Call –
MYR150,000 – MYR225,000
All budgets must be submitted in MYR, based on local currency
calculations
Exchange rate and date of conversion to MYR must be shown in the
budget
Grant payments will be made in MYR. Note that there will be no
reimbursements for additional costs for bank charges and currency
fluctuations. The PI of the sub-project must deal with any shortfall in
the budget due to exchange rate loss and/or bank charges, by
adjusting project expenses
231. In General
Budget line items must be accompanied by clear budget notes
Ensure budgets are apportioned appropriately across the 18
months’ project timeline
Two payment tranches will be made (initial payment for 12 months’
expenses and final payment for six months’ expenses upon
submission of the final project reports)
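The budget rules in the preceding slides (the MYR150,000–MYR225,000 limits, the 10% cap on indirect costs, and the 85%/15% tranche split) can be sketched as a few simple checks. This is only an illustration of the arithmetic, not an official tool; the exchange rate, local amount, and function names below are hypothetical.

```python
# Illustrative checks for a sub-project budget, per the rules in this deck.
LOWER_MYR, UPPER_MYR = 150_000, 225_000
INDIRECT_CAP = 0.10          # indirect costs <= 10% of total project cost
FIRST_TRANCHE_SHARE = 0.85   # balance (15%) paid upon final reports

def to_myr(local_amount, rate_to_myr):
    """Convert a local-currency figure to MYR; the rate and conversion
    date must be shown in the submitted budget."""
    return round(local_amount * rate_to_myr, 2)

def validate_budget(total_myr, indirect_myr):
    """Raise ValueError if the budget breaks the Call's limits."""
    if not LOWER_MYR <= total_myr <= UPPER_MYR:
        raise ValueError(f"total MYR {total_myr:,.2f} is outside the Call limits")
    if indirect_myr > INDIRECT_CAP * total_myr:
        raise ValueError("indirect costs exceed 10% of total project cost")

def tranches(total_myr):
    """Split the grant into the initial payment and the final balance."""
    first = round(total_myr * FIRST_TRANCHE_SHARE, 2)
    return first, round(total_myr - first, 2)

total = to_myr(1_500_000, 0.13)      # e.g. a hypothetical rate of 0.13 MYR per local unit
validate_budget(total, indirect_myr=18_000)
print(total, tranches(total))        # 195000.0 (165750.0, 29250.0)
```

Note that exchange-rate losses and bank charges after submission are the PI's responsibility, so in practice a budget near the upper limit leaves no headroom for currency fluctuation.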
232. Some Anticipated FAQs
What happens if project costs are incurred before the grant agreement is signed?
Such costs cannot be covered by the grant.
Can I revise my budget during the grant period?
This can be done only with the agreement and approval of the Project Coordinators, and with proper
justification. The revised grant total budget must be within the limit of the original approved budget.
Can I change the working currency of the project during the period of the grant?
Normally, no, unless there are exceptional circumstances.
233. Some Anticipated FAQs
Do the Technical and Financial Reports have to be submitted in a certain
format?
The formats for each type of report will be provided to grantees.
Do I need to maintain a separate bank account for the grant monies?
No, unless the grantee institution prefers to do so.
Do I need to submit all receipts for project expenses?
You must retain a proper accounting record of all project expenses, together with
supporting invoices/receipts, and they should be available for submission with your
financial reports, if required.
236. Key questions for evaluation (1/2)
1. What aspect(s) of the project should be evaluated?
2. Who is the evaluation for?
3. What is it they want to find out?
4. What evaluation methods will be used?
5. What changes will be made when the results are
gathered?
237. Key questions for evaluation (2/2)
6. What are the evaluation criteria and what is their source?
7. When will the evaluation take place?
8. Who will be involved in the evaluation?
9. What constraints will be placed upon the evaluation?
10. How and when will the evaluation results be
disseminated?
239. Final thoughts…
Value of individual discussion – less ‘confrontational’
Support going forward
Communication between IS grantees
Communication with ROER4D ‘mothOERship’
Learning more than expected! (working harder than expected!)
Google Group for contact: docs, discussion, hangouts
Apr 2015 meeting? (tbc)
Possible attendance at OE Consortium Conference in Banff 2015
Thanks everyone
241. Join us in building understanding of open education
School of Open
course on
#openresearch
OERRH Evidence Report
OERRH Ethics Manual
Contribute to OER
Impact Map
Editor's Notes
Introduce project
Data we have
Characteristics of survey respondents
Do we need to add a bit more?
Introduce Hypothesis A – Performance: Use of OER leads to improvement in student performance and satisfaction (OER improve student performance/satisfaction)
Performance is also understood in non-grade-related aspects.
Quotes. And summary: Learners believe that OER use improves their grade performance; educators believe this to a lesser extent. There is stronger evidence for OER improving related factors for learners, for example enthusiasm, confidence and overall interest.
Introduce Hypo B – Openness: The open aspect of OER creates different adoption and usage patterns than other online resources (people use OER differently from other online materials)
Hypothesis B is intended to guide exploration of whether the openness of open educational resources is a contributory factor to their being used differently from non-open online resources. To what extent does openness (ie openly licensed resources) make a difference over being online and free? Disentangling the influence of these elements is problematic, as the contribution of all factors will influence the use of a resource.
One indicator of the influence of openness is the degree to which resources are adapted. We find a comparatively high level of adaptation amongst all types of users (79.4%, n=1765), regardless of being educators (86.3%, n=556), formal learners (77.2%, n=336) or informal learners (84.7%, n=788).
What do they mean by adaptation?
What this suggests is that one impact of openness is that it allows a continuum of adaptation to develop, ranging from adapting ideas for their own material to full reversioning of content.
Introduce Hypo E – Reflection: Use of OER leads to critical reflection by educators, with evidence of improvement in their practice (OER use leads educators to reflect on their practice)
There is strong evidence that OER use and exposure leads to reflection on practice by educators. It causes them to incorporate a wider range of content, to consider different teaching approaches and to reflect upon their role as educator. This is arguably the most significant impact of OER and one that is not widely promoted.
Introduce Hypo F – Finance: OER adoption at an institutional level leads to financial benefits for students and/or institutions (OER adoption brings financial benefits for students/institutions)
Quotes.
There is strong evidence for savings with Open Textbooks that are used to replace compulsory set texts. The evidence for cost savings of other forms of OER is less clear. Often it is difficult for educators to know whether their institution saves money, and what happens to any such savings. The obvious cost benefits of free resources are a clear, and easy benefit to articulate, but greater accountability is required to make these evident to all stakeholders.
Introduce Hypo C – Access: Open education models lead to more equitable access to education, serving a broader base of learners than traditional education (OER widen participation in education)
Are open education models leading to more equitable access to education? The emergent picture is mixed, based on evidence from our research with collaborations. There is some negative evidence in the demographics of the informal learners, 57% of whom already have an undergraduate or postgraduate degree.
However, one use of OER that was evident was either to support formal students already studying or to trial a subject before committing to formal study (2nd fact). The Open University report a 10% conversion rate of learners using OpenLearn OER materials going on to the official sign-up page of a relevant course.
And (fact 3) some learners are using OER as a replacement for formal education to which they might not otherwise have access.
These aren’t necessarily all to do with openness, but form a set of co-ordinates that may help us to understand where openness makes a difference
Shore, Nancy (2006). "Re-conceptualizing the Belmont Report: A community-based participatory research perspective". Journal of Community Practice 14 (4): 5–26. doi:10.1300/J125v14n04_02
Maybe add another quote here
‘[T]he moral principles guiding research from its inception through to completion and publication of results’ (British Psychological Society)
In practice, usually addressed at an institutional level through guidance issued by advisory bodies (e.g. National Institutes of Health, British Educational Research Association) or through institutional review board (IRB) / ethics committee
The ethical significance of human subjects and valuing human life (n.b. exceptions like animal ethics, environmental ethics)
Guidance similar (if not uniform) because all based on established common principles, but unspecific – detailed and specific regulations for every possibility do not exist
(e.g. IRB, impact on human subjects, informed consent, objectivity)
Mention Ethics Manual
PHRONESIS = THINK LIKE AN IRB
OBJECTIVES
Insert one WHOLE paragraph of General Objective and Specific Objectives. Good examples are the ICU and UPOU proposals.
Need this for MGC – should not put words in your mouth.
Balance payment in October 2016 will be based on Final Financial Report covering the entire grant period.
PI’s remuneration/honoraria – standardize at CAD$500 (MYR1,485)/day.