Open Education Mirrors the Open Science Reform Movement

Published by Rima Maria Rahal on

Originally published at ijoerandbeyond.org/open-education-mirrors-the-open-science-reform-movement/

Open Educational Resources (OERs) are a game-changer for education, for a plethora of reasons spanning aspects such as accessibility and dissemination. Here, I want to focus on the promise of OERs to facilitate updating educational materials.

Changeability in OERs

Open resources can be more adaptable to the specific situations in which they are used. Because creators of OERs can choose copyright licenses that allow other educators to update and modify these materials, it becomes possible to legally adapt existing resources to specific needs, courses, and circumstances. This opens OERs up to change: instead of relying only on updates from the original creators, OERs can also be changed by other educators.

Moreover, OERs are often conceived as virtual resources from the start: textbooks, syllabi and worksheets are often available online and do not necessarily rely on print materials. Hour-long video lectures can be split into short knowledge clips, which makes it easy to replace sections of the lectures when an update is necessary. The nature of OERs as creatures of the internet facilitates remixing and adapting these materials. 

Keeping Education Up To Date

But why is the capacity of educational materials to be updated such a valuable feature? The world of scientific insight and research discoveries never stands still. Every day, the corpus of scientific literature grows and new frontiers of knowledge emerge. After such new ideas have been rigorously debated and tested in the academic community, they eventually enter the educational canon and find their way into textbooks, syllabi, and lesson plans. Keeping up with such developments becomes easier when educational materials can be updated more easily — as can be the case with OERs.

However, it may also be the case that the educational canon should undergo a revision because what was previously thought to be teachable content turns out to be questionable after all. This may be the case because of societal developments, for instance in how historically oppressed groups are represented in educational content. However, revisions may also be necessary because the scientific work on which these educational materials are based is no longer tenable. 

On Being Wrong

Being wrong is very much part of the history of science; sometimes research goes down a road that appears to be the right one and only later turns out to be quite wrong. One such example is phrenology, considered a science at the time, which aimed to relate the bumps on people's heads to their personality and other mental states (e.g., Gall, 1835). While influential and widely discussed, the central hypotheses of phrenology do not hold up to empirical tests (e.g., Parker et al., 2018). Today, phrenology is regarded as pseudoscience.

There are many ways by which one can end up being wrong in science. Part of being wrong simply comes from innocent statistical errors (Healy, 2009). However, one can also end up being wrong by using problematic research methodology and statistical analyses (Simmons et al., 2011), which a surprisingly high number of researchers have reported engaging in (e.g., John et al., 2012). Spotting these cases is an important part of scientific work. But even though there are mechanisms in place to check research before it is published, false findings can enter the academic record (Altman, 2006). And then not only does the scientific record need to be corrected (which is quite hard as it stands; compare Brandolini's law, also referred to as the "bullshit asymmetry principle"; e.g., Williamson, 2016): the corrections also need to trickle down into the educational canon.
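The mechanism behind one such problematic practice can be made concrete with a small simulation. The sketch below (my illustration, not from the article or from Simmons et al.) shows "optional stopping": repeatedly peeking at accumulating data and stopping as soon as a test comes out significant. Even when the true effect is exactly zero, this inflates the rate of false-positive findings well beyond the nominal 5%.

```python
import math
import random

random.seed(1)

def pvalue(xs):
    """Two-sided z-test of mean 0, assuming known standard deviation 1."""
    n = len(xs)
    z = (sum(xs) / n) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))

def run_study(peek_every=10, max_n=100):
    """Collect data in batches and stop as soon as p < .05 (optional stopping).

    The data are pure noise (true effect = 0), so any "significant" result
    is a false positive.
    """
    xs = []
    while len(xs) < max_n:
        xs += [random.gauss(0, 1) for _ in range(peek_every)]
        if pvalue(xs) < 0.05:
            return True  # declared "significant" despite a true effect of 0
    return False

n_sims = 2000
false_positives = sum(run_study() for _ in range(n_sims)) / n_sims
print(f"False-positive rate with peeking: {false_positives:.1%}")  # well above the nominal 5%
```

With ten "peeks" per study, the simulated false-positive rate lands far above the 5% the test nominally guarantees, which is exactly why undisclosed flexibility in when to stop collecting data is so corrosive to the published record.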

An example of such a pending update of the educational canon, necessitated because science was wrong, is currently budding in social psychology. The field has seen much upheaval in the past 10 years (Fidler & Wilcox, 2018), often referred to as the Replication Crisis: some published research findings could not be replicated, meaning that a renewed attempt to conduct the research that led to the original findings did not show the same results (Open Science Collaboration, 2015). This raises the suspicion that parts of what was considered the state of the field in social-psychological research may be false — and, consequently, that parts of the social-psychological insights taught at universities and high schools may be false. An update of the educational materials seems necessary. Such an update could take different forms, from flagging specific contents as questionable, to explaining the larger context of the non-reproducible findings and what this means for what we know in the field of social psychology, to striking questionable materials from the canon altogether. But to which degree this updating process indeed takes place remains unclear: educators may be informally adapting their syllabi, but it takes a while for textbooks to change. Some educators have begun to create OERs that teach the controversy in the field (e.g., Schimmack's (2018) Introduction to Anti-Social Psychology), and others have created OERs that aim to equip future researchers with the skills to avoid (and spot) methodological fallacies that can lead to non-reproducible findings (e.g., Lakens's (2018) online course on statistical inference, or the Open for Insight online course on experimental methodology; Rahal, 2020).

Changing the Game by Changing the Canon

In their capacity to facilitate change, OERs are part of the larger Open Science movement towards more transparency and openness in science, and they translate the scientific reform movement into educational reform. Both movements mark a turn towards improving our ability to update what insights we believe are robust and, ultimately, teachable.

References

Altman, L. K. (2006, May 2). For science gatekeepers, a credibility gap. The New York Times. Retrieved from https://www.nytimes.com/2006/05/02/health/02docs.html

Fidler, F., & Wilcox, J. (2018). Reproducibility of scientific results. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy.

Gall, F. J. (1835). On the functions of the brain and each of its parts: With observations on the possibility of determining the instincts, propensities, and talents, or the moral and intellectual dispositions of men and animals, by the configuration of the brain and head. Marsh, Capen and Lyon.

Healy, J. F. (2009). The Essentials of Statistics: A Tool for Social Research (2nd ed., pp. 177–205). Cengage Learning.

John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953

Lakens, D. (2018). Improving your Statistical Inferences. Coursera. Retrieved from https://www.coursera.org/learn/statistical-inferences  

Open Science Collaboration (2015). Estimating the reproducibility of psychological science. Science, 349(6251), 943–951. https://doi.org/10.1126/science.aac4716

Parker, J. O., Alfaro-Almagro, F., & Jbabdi, S. (2018). An empirical, 21st century evaluation of phrenology. Cortex, 106, 26–35. https://doi.org/10.1016/j.cortex.2018.04.011

Rahal, R.-M. (2020). Open for Insight: An online course in experimentation. PsychArchives. http://dx.doi.org/10.23668/psycharchives.4319

Schimmack, U. (2018). An Introduction to Anti-Social Psychology. Retrieved from https://replicationindex.com/2018/12/28/an-introduction-to-anti-social-psychology/

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632

Williamson, P. (2016). Take the time and effort to correct misinformation. Nature. 540(7632), 171. https://doi.org/10.1038/540171a