Supporting student learning in research methods and statistics in psychology through authentic assessment

Stephanie McDonald1, Andrew Reid2 and Christopher R. Madan1

1 School of Psychology, University of Nottingham, Nottingham, UK
2 Tilburg School of Social and Behavioral Sciences, Tilburg University, Tilburg, The Netherlands

Corresponding Author:
Stephanie McDonald, School of Psychology, University Park, University of Nottingham, Nottingham, NG7 2RD, UK
Email: Stephanie.McDonald@nottingham.ac.uk

Abstract

Research methods is a challenging yet central topic within an undergraduate and postgraduate psychology degree. All other modules rely on an understanding of research methods, yet statistics anxiety is prevalent amongst learners in non-mathematical disciplines—including psychology. Here we conducted a multi-stage study to explore student experiences with statistics learning and assessment and used these insights to develop a framework of authentic assessment. Findings from focus groups in Study 1 highlight two aspects of the curriculum that pose challenges for students: the nature of the learning environment, which points to the need to embed opportunities for students to engage actively with content, and the way in which students are assessed. Adopting an evidence-based approach, we re-designed the assessment to incorporate elements of peer-assessment and self-assessment tasks, providing students with opportunities to engage with and apply the assessment criteria through the use of exemplars and their own work both inside and outside the classroom. Subsequent focus groups conducted as part of Study 2 suggest that designing the learning environment to encourage active learning and developing teaching approaches underpinned by principles of authentic assessment and assessment for learning can have profound benefits for students in terms of engagement, self-efficacy, and reduced levels of statistics-related anxiety.

Keywords

statistics anxiety, assessment for learning, active learning, self-efficacy

 

Introduction

In the undergraduate psychology curriculum, research methods and statistics is a central pillar—regardless of a student’s interest in subfields of psychology, such as developmental or cognitive psychology, a strong understanding of research methods is essential. However, students often struggle with the mathematical and statistical principles that are fundamental to a module on research methods, and to their degree programme as a whole. This challenge has become well established within the undergraduate psychology curriculum over recent decades and across different countries’ approaches to teaching psychology, and is now widely discussed as ‘statistics anxiety’ (Bourne, 2018; Cruise et al., 1985; McDonald & Barnard, 2023; Onwuegbuzie & Wilson, 2003). In this study we evaluated current issues with our research methods module through a series of focus groups involving both undergraduate students and students on the MSc Psychology conversion course. Based on this feedback, we developed a new framework of assessment that is used across the full-year module to support student learning. This new assessment for learning includes active learning components, in the form of student engagement with assessment criteria and exemplars, peer assessment, and self-assessment tasks, to facilitate engagement and improve student outcomes.

Substantial research has been conducted on statistics anxiety, including distinguishing it from mathematics anxiety and identifying distinct components within it (Baloğlu, 1999, 2002; Cruise et al., 1985; Onwuegbuzie & Wilson, 2003; Paechter et al., 2017). Questionnaires have been developed to measure statistics anxiety, in particular the Statistics Anxiety Rating Scale (STARS; Cruise et al., 1985; Hanna et al., 2008), which includes six subscales (e.g., interpretational anxiety, worth of statistics). The importance of addressing this barrier to learning in research methods cannot be overstated. Statistics anxiety can lead to a lack of confidence in students' abilities to understand and apply statistical concepts, which can in turn affect their overall academic performance and future career prospects (Macher et al., 2013; Messer et al., 1999; Miller & Pyper, 2023; Pownall et al., 2023). Key areas that students often find challenging include understanding statistical concepts, interpreting data, and applying statistical methods to real-world problems (Cruise et al., 1985; Hanna et al., 2008; Onwuegbuzie & Wilson, 2003; Slootmaeckers et al., 2014). Ultimately, attenuating statistics anxiety at this stage in a student’s academic journey will allow us to help students develop a deeper understanding of statistics and equip them with the skills they need to succeed in their learning and future careers.

One approach to attenuating statistics anxiety is to embed active learning practices within the curriculum, supporting students in developing their statistical reasoning abilities. Active learning involves students in the learning process more directly than traditional didactic methods such as passive lectures, requiring students to engage in activities and to think about what they are doing (Bonwell & Eison, 1991). A key aspect of active learning is that students are actively processing information and making it their own, which can lead to better understanding and retention of the material. A number of studies have shown the benefits of adopting active learning approaches in research methods and statistics education on student learning, confidence in applying knowledge gained, attitudes towards the subject, and in reducing statistics anxiety (Allen & Baughman, 2016; Carlson & Winquist, 2011; Chiou et al., 2014; LaCosse et al., 2017). Through active learning and good feedback principles, students can become better at evaluating their work and assessing their own abilities. This is often achieved by engaging students with assessment criteria and having them apply these criteria both to the evaluation of exemplars of work and to the self-assessment of their own work (Carless & Boud, 2018; Tai et al., 2018). In the context of research methods, exemplars can take the form of example written interpretations of statistical results, e.g., describing how the statistical output from an Analysis of Variance (ANOVA) relates to a study design and its experimental factors or levels.
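For illustration, the brief sketch below (in Python, with invented data; our own teaching used SPSS output rather than code) shows the kind of one-way ANOVA result such an exemplar might be built around, together with the style of written interpretation students would be asked to evaluate against the marking rubric.

```python
# Illustrative sketch only: the kind of one-way ANOVA output an exemplar might
# be built around. Group labels and recall scores are invented for demonstration.
from scipy import stats

low    = [12, 15, 14, 10, 13, 16]   # recall scores, low-interference condition
medium = [11, 12,  9, 13, 10, 11]   # medium-interference condition
high   = [ 7,  9,  8,  6, 10,  8]   # high-interference condition

f_stat, p_value = stats.f_oneway(low, medium, high)
print(f"F(2, 15) = {f_stat:.2f}, p = {p_value:.3f}")

# An exemplar answer would then relate this output back to the design, e.g.:
# "A one-way between-subjects ANOVA showed a significant effect of interference
#  condition on recall, F(2, 15) = ..., p < .05, with recall decreasing as
#  interference increased."
```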

Here we focused on two specific activities, peer assessment and self-assessment. Peer assessment allows students to evaluate and provide feedback on each other’s work. This not only encourages students to improve their understanding of the marking criteria and develop their evaluative judgments, but also fosters a collaborative learning environment where students can learn from each other (Brignell et al., 2019; Carless & Boud, 2018; Topping, 1998). Self-assessment and reflection involve students thinking about their own learning process, identifying areas where they have struggled, and developing strategies to overcome these challenges (Gibbs & Simpson, 2004). This can be facilitated through the use of guided exemplars, where students actively think about how those examples relate to the assessment criteria in a provided marking rubric, ultimately improving their ability to critically evaluate their own draft answer as they develop it and reducing assessment anxiety (Boud & Molloy, 2013; Carless & Boud, 2018; Tai et al., 2018; Yucel et al., 2014). Exemplars can improve student feedback literacy by showing what quality work looks like and by strengthening students’ ability to discriminate between work at different levels—improving evaluative judgments. Supporting students with good feedback practices has many benefits, including facilitating the development of self-assessment, encouraging positive motivational beliefs, and helping students achieve desired levels of performance (Nicol & Kushwah, 2023; Nicol & Macfarlane-Dick, 2006; Winstone & Boud, 2022). In particular, being given clear indications of what characterises good performance (e.g., a marking rubric and exemplars) can enhance the transparency of assessment criteria for students (Jonsson, 2014); further, less explicit, individually tailored feedback should then be needed from instructors, helping to maintain manageable workloads (Boud & Molloy, 2013).

Two practices for creating more effective learning experiences for students are authentic assessment and assessment for learning. Authentic assessment refers to the evaluation of students' abilities in real-world contexts, emphasising complex problem-solving and collaboration (Gulikers et al., 2004; Svinicki, 2004; Swaffield, 2011). Tasks are designed to simulate real-world challenges, requiring students to apply their knowledge and skills in practical ways. The focus is not only on the final product but also on the process of learning. Assessment for learning uses assessment as a tool to enhance learning (Gibbs & Simpson, 2004; Wiliam, 2011). It involves a cycle of feedback and improvement, with the emphasis placed on improving student learning. Together, these practices enhance student learning through formative assessment tasks that parallel the circumstances of summative evaluation. Both practices align with the goal of making assessment a more engaging and supportive aspect of the educational experience.

To re-design the topic as an assessment for learning and to better support students, we used an evidence-based approach (Black & Wiliam, 1998) to embed active learning in our teaching practice. We also used a blended learning approach, utilising digital tools to make this process more efficient and effective (Brown et al., 2016). These digital tools included online submission via the virtual learning environment (VLE), peer-assessment evaluations and large group discussion facilitation using audience response systems, and a pre-recorded feedback video created using screen-recording software and delivered through the VLE. By designing assessment for learning, we sought to make the process more formative, providing students with opportunities for peer assessment and reflection. Breaking up the assessment into smaller, more manageable components can also help reduce statistics anxiety (Onwuegbuzie, 2000). To begin this process, we conducted focus groups with students. Adopting a qualitative approach in the present study enables a richer understanding of students' experiences and the nuances of statistics anxiety, the challenges that students face, and students' specific worries and experiences around their course (McDonald & Barnard, 2023).

The aim of the present investigation was to evaluate the impact of a new framework of assessment, incorporating peer assessment and self-assessment principles, on students’ learning experience, self-efficacy, and learning gains associated with their research methods module. Two sets of focus groups were conducted with undergraduate and MSc Psychology conversion students enrolled in a Research Methods and Statistics module, which is a requisite component of our psychology programme. Study 1 included initial focus groups to assess students’ perceptions and experiences in learning statistics as part of their course, to inform the design and implementation of changes in the curriculum. Study 2 involved a second set of focus groups, seeking to evaluate the impact of changes in the learning environment and assessment practice on student engagement, self-efficacy, and learning.

Research methodology

An essentialist/realist epistemological framework was adopted, where the focus of the analysis is on participants’ experiences and the meanings of those experiences. This approach was applied as we wanted to investigate students’ experiences of learning and being assessed on a core component of their degree programme, without trying to fit the data into pre-conceived ideas based on relevant literature. Here we conducted focus groups to examine the challenges students face in the research methods module and their views after an active learning approach was embedded within the curriculum design. A strength of this method is that it captures commonalities and nuances in students’ shared experiences. Focus group discussions are a useful approach for collecting data as interactions between participants can often lead to more elaborated reflections of student views and experiences on a particular topic (Wilkinson, 1998). In both studies, focus groups were facilitated by a researcher who was not involved in the module teaching. This was done because students may not feel as free to express their thoughts on the module to someone who had taught the content as they would to a more independent facilitator. Study 1 was facilitated by an independent researcher; Study 2 was facilitated by the first author, who did not teach on this module in that academic year.

Study 1 – Initial focus groups

Methods

Participants

Four focus groups were conducted as part of Study 1. Two of the focus groups comprised students registered on the undergraduate and postgraduate modules at the time of the study, and the remaining two involved students in the final year of their degree who had completed the module in the previous year. This recruitment strategy was chosen to provide both current and retrospective accounts of student experience and perceptions of learning and assessment. Further, students in their final year of studies are likely better placed to provide insights into how they have used the knowledge and skills acquired through their research methods and statistics module at the next level of their studies.

Participants were recruited via a mass email invitation, a Moodle forum invitation, and oral invitations by lecturers during class. Students were provided with an inconvenience allowance of £10 for participation in the study. A total of twenty-three students took part in the focus groups, with focus groups comprising between four and eight participants each. Participants’ ages ranged from 19 to 23 years (M = 21, SD = 2.07) and all identified as female. Nine students were currently registered on the course (eight on the undergraduate course and one student on the MSc Conversion course), and fourteen students were in their third year of studies, having completed the course in the previous year.

Procedure

The study received ethical approval from the School of Psychology Ethics Committee at the University of Nottingham (Ref. F1062). Focus groups lasted between 90 and 120 minutes. Informed consent was obtained from all participants prior to their participation in the focus groups. Focus group discussions were audio recorded and transcribed verbatim.

Focus group schedule

The focus group questions were designed to capture students’ views and experiences with module content, delivery, online support through the virtual learning environment, and the assessment. Students were asked to discuss aspects of module design they found beneficial or challenging, with particular emphasis on teaching methods, resources, and the assessment, as well as perceived relevance of the module with respect to their psychology course as a whole. The focus group schedule can be found in Appendix 1.

Analysis

Focus group data were analysed using inductive thematic analysis, guided by the methodological procedure outlined by Braun and Clarke (2006, 2012), in order to develop a thematic map of students’ experiences with the research methods component of their course. The aim of thematic analysis is to identify, analyse, and report patterns of ideas which are prevalent across the dataset and address the study’s research question(s). Inductive thematic analysis involves analysis that is driven by the data, rather than existing theory, with themes grounded in participants’ responses. Below we outline the six stages of conducting a thematic analysis on our dataset; we have previously summarised the stages in other work (McDonald & Blackie, 2023).

1. Data familiarisation: Following transcription of focus group data, the analysis began by repeatedly reading through the focus group transcripts in order to gain familiarity with the data and to identify any initial ideas in the dataset which were relevant to participants’ perceptions and experiences of aspects of their research methods module.

2. Data coding: This step involved identifying extracts in the dataset which were relevant to the research question and generating initial codes for those extracts. A semantic approach was adopted, whereby codes reflected a summary of surface-level meanings in participants’ responses. Once all focus group transcripts were coded, initial codes were then collated. This process involved removing any duplicate codes and identifying instances where codes reflected the same idea but were worded differently; in the latter case, codes were adapted to retain a final list of collated codes, with each code reflecting a unique idea in the dataset relevant to our research question.

3. Initial search for themes: Conceptually similar codes were then clustered together into candidate themes.

4. Reviewing themes: Candidate themes were then reviewed for content, to ensure that the codes within each developed theme cohered together meaningfully and that each theme captured ideas distinct from the other themes in relation to the research question. At this stage, some of the initial candidate themes were grouped together to form a more parsimonious theme, and some of the codes identified in the analysis were eventually discarded as they were deemed not to fit in any of the developed themes.

5. Defining and naming themes: This step involved re-examining the final set of themes to identify the core conceptual ideas captured by each theme and to develop names for each of the themes.

6. Producing the report: In this final step, extracts from participants’ responses were selected which illustrated how the conceptual ideas represented by each theme featured in the data. These extracts are presented in the results section, in the form of quotations, as evidence of theme prevalence across the dataset.

The analysis was conducted by the first author. The resulting themes were then reviewed by the second and third authors, and any discrepancies were resolved through discussion.

Results

Two themes were developed in the analysis, which described students’ experiences and perceptions with regards to learning research methods and statistics in their course: a) ‘Going beyond the surface level of statistics’ and b) ‘Developing effective approaches to learning’. These are described below, together with selected quotations from participants’ responses. See Appendix 2 for a table of themes developed in Study 1 and the codes encapsulated within these themes.

Theme 1: Going beyond the surface level of statistics

This theme captures the worries and challenges faced by students studying statistics within their psychology degree and their perceptions around the value and relevance of the subject more broadly.

Students reported generally feeling anxious and overwhelmed with some of the content covered in the module, in particular the mathematical content and the theory behind statistical tests. For example, one participant commented “[…] as soon as it comes up on the screen like massive formulas […] I don’t want to look at that, that’s horrible […] scared of it […]” (Participant 6, FG 2). Some students felt a disconnect between the need to learn the theory and mathematical concepts associated with statistical techniques, as the rationale for this was not always clear to them. For example, one participant commented, “I didn’t understand the theory because I didn’t know why I was learning the theory or […] relevance” (Participant 8, FG 2). Feelings of confusion around challenging content and perceived lack of understanding often led to decreased engagement by students. This often resulted in students feeling left behind with substantial gaps in knowledge throughout the duration of the course, which for some, contributed towards feelings of anxiety around the module more generally. For example, one participant mentioned, “I’m not going to the stats lecture because I’m too far behind […] loads of catching up to do so I don’t see the point” (Participant 1, FG 1).

Focus group discussions revealed the prevalence of low perceived self-efficacy in relation to applying the content covered in statistics modules. Perceived self-efficacy encompasses students’ perspective on whether they can confidently apply this knowledge to specific aspects of the module (e.g., assessment) and to other aspects of their course. For example, one participant commented, “[…] if I saw a dataset […] I don’t think I’d know what the best test to analyse data […] even if the lectures are straight forward […] it’s more like how you’re applying that and realising where you need to apply that” (Participant 1, FG 3). Some also commented that they would benefit from further exposure and practice with real-world, messy data rather than more straightforward examples of datasets. Participants also reported lower self-efficacy with regards to their progress in the module and in relation to their peers (e.g., “[…] it just felt like everyone knew what they were doing apart from me […]”; Participant 4, FG 2).

Whilst recognising some of the challenges faced, participants also acknowledged the value and relevance of research methods and statistics in enhancing their understanding of the research process in psychology and in supporting their learning in other taught modules in their course, particularly in terms of engaging with and understanding published literature in different areas of psychology.

Theme 2: Developing effective approaches to learning

This theme encompasses students’ perceptions and experiences with the teaching methods used in this module, aspects of their learning environment, and the approach to and impact of the assessment on students. These provide fruitful suggestions on ways in which we can design learning environments which are effective in engaging students within the context of a challenging and anxiety-provoking compulsory subject.

Participants commented that the method of delivery of aspects of the module content aided their understanding. For example, a number of participants found that sessions designed to cover the theory and rationale behind statistical tests, followed by step-by-step guidance on how to, for example, interpret the findings of a statistical test, were helpful for their understanding. Example comments include “[…] they taught […] the theory and then they’ve done like a practice lecture afterwards which really helps to consolidate it” (Participant 5, FG 4). Focus group discussions also indicate that students benefit from using a variety of resources to support understanding of content. Examples include short videos illustrating key concepts and processes, which students found independently, as well as step-by-step guides and worked examples provided by the teaching team through the VLE.

Participants, however, found the format of some of the teaching sessions challenging. Findings suggest a preference for interactive sessions, as opposed to the more traditional lecture-type classes, where students are given opportunities to apply knowledge acquired (e.g., “it doesn’t seem appropriate to be in a lecture […] you’re learning a skill”; Participant 5, FG 3). Participants did acknowledge, however, that the method of teaching needs to be adapted depending on the content covered, and thus, interactivity may not always work across the board. Opportunities to apply content in a teaching session, for example by engaging in a formative test, and to receive feedback on individual or group performance were frequently suggested by students in our sample as a way of supporting learning. Interactive sessions would further facilitate a peer-learning environment in which students support each other with learning the content (e.g., “[…] [in situations where] you’re with other people and you […] figure it out together and talk about it which helps a lot […]”; Participant 6, FG 2).

The greatest source of worry reported by students was related to the assessment of the module. At the time when the focus groups were conducted, students were assessed by means of a written, three-hour-long examination scheduled at the end of the academic year, upon completion of the teaching for the module. Focus group discussions revealed that the nature and timing of the assessment posed challenges for students and influenced their general approach and motivations around learning the subject. With regards to the timing of the exam, students commented “[…] it was just a case of memorising […] and recalling it” (Participant 6, FG 2), “[…] no deep understanding” (Participant 1, FG 1), with students suggesting that further opportunities within the module to demonstrate learning of content would help to reduce anxiety levels (e.g., “if you separate it out it is less stressful […] better understanding of what you actually don’t know”; Participant 2, FG 4).

The nature of the assessment was also commented on negatively by focus group participants. Some students made comparisons between how they were asked to apply their statistical knowledge in the exam and how researchers or professionals in a work context would apply this knowledge (e.g., “[aspects of the assessment] aren’t applicable to stats in the real world”; Participant 5, FG 3). Although participants valued the importance of statistics for different aspects of their course (e.g., research project), discussions indicated that students tended to adopt a surface-level approach to learning, with a primary focus on memory and recall of information, and an assessment-focused orientation when engaging with the module content. This reflected perceptions around ‘what do I need to know for the exam’ and, thus, taking a strategic approach to content engagement (e.g., “[…] those lectures were redundant […] because I knew I wasn’t gonna do that in the exam”; Participant 4, FG 2).

Summary and key messages

Findings from Study 1 highlight two key aspects of module design that needed addressing: a) the nature of the learning environment and student engagement, and b) the nature of the assessment in supporting student learning.

Study 2 – Implementation and evaluation of curriculum changes

Based on Study 1 findings and relevant literature, the following change was implemented in the curriculum progressively over a period of two academic years, which Study 2 sought to evaluate through focus groups with learners: a re-design of the assessment to develop a framework of authentic assessment incorporating two formative activities (a practice assignment activity, followed by either (a) an in-class activity or (b) a reflection task) and two summative assessments over the academic year.

This framework consists of three key components:

1. Practice assignment activity. Students work on a coursework assignment outside of class time and submit this through the VLE. The goal is to provide flexibility and an opportunity for students to apply knowledge in practice. Students are awarded full marks for their best attempt. The purpose of the assignment is to engage with the assessment (here, statistical results interpretation) prior to the exam, which is much more heavily weighted towards the module grade.

2. In-class peer-assessment activity. Discussion of the assessment criteria with students and engagement with the marking rubric. Students collaboratively assess peer work, with the use of classroom-based audience response tools, fostering a sense of community and shared learning. This activity is interleaved with classroom discussion, where the lecturer provides general feedback and uses exemplars to illustrate key points.

3. Reflection task. Completed independently, this task is supported by a feedback video and serves as a form of self-assessment of students’ own work. The reflection task encourages students to critically evaluate their own work, identify areas of strength and weakness, and develop strategies for improvement.

Through this framework, we aim to create a more engaging and supportive learning environment, and a more effective and meaningful assessment process.

Teaching methods

Changes to module structure

Peer-assessment activity

Based on results from the focus groups (see below), further changes were made between the two academic years. Specifically, the activity was changed from a solely formative assessment and in-class discussion to having a summative role, where each of the two practice assignment activities carried a 5% weighting towards the final module grade and was evaluated as a pass-fail mark based on a reasonable attempt at an answer. Prior to the day of the activity, the teaching team generated two representative examples, of varying quality, based on student submissions. Actual responses of specific students were not used, to avoid (1) the potential for a student to feel singled out, and (2) the perception that some students received direct feedback while others were not provided with that opportunity.

In the in-class activity session, the lecturer went through the assessment criteria from the associated marking rubric. Students worked individually or in small groups to evaluate the two examples in relation to the marking rubric. For each example, students provided their assessment of how well it met the criteria, as well as written feedback, using Microsoft Forms. The lecturer then went through these responses and provided guidance on where the examples matched the assessment criteria and where they were lacking, facilitated by the response summaries and graphs generated by Microsoft Forms.
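As a minimal sketch of the underlying idea, assuming the peer ratings were exported from Microsoft Forms to a spreadsheet (in practice, Forms generated the response summaries and graphs used in class), per-criterion ratings for each example could be summarised as follows; the column names and rating scale here are hypothetical.

```python
# Hypothetical sketch: summarising exported peer-assessment ratings per rubric
# criterion for each example. In our implementation, Microsoft Forms produced
# these summaries automatically; the column names and scale are illustrative.
import pandas as pd

responses = pd.DataFrame({
    "example":   ["A", "A", "A", "B", "B", "B"],
    "criterion": ["Reporting of statistics", "Link to design", "Clarity of interpretation",
                  "Reporting of statistics", "Link to design", "Clarity of interpretation"],
    "rating":    [4, 3, 4, 2, 2, 3],   # e.g., 1 (criterion not met) to 5 (fully met)
})

summary = (responses
           .groupby(["example", "criterion"])["rating"]
           .agg(["mean", "count"]))
print(summary)  # starting point for the lecturer-led discussion of each example against the rubric
```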

Feedback video activity

This activity was implemented similarly to the peer-assessment activity. Students were first given a research design and associated statistical output and asked to submit a reasonable attempt at a results-section interpretation. After the submission deadline, students were given access to a feedback video that provided a step-by-step explanation of a complete answer and how it was put together (see Figure 1). This video was 7 minutes long and was recorded and edited using Camtasia (TechSmith, East Lansing, MI, USA). This approach was based on prior work by Brown et al. (2016).

Figure 1. Annotated screenshot from the feedback video activity. (Photo is of C. Madan, used with permission.)


Activity engagement

In the first academic year that included these added activities, when they were implemented in a purely formative manner, we measured engagement. For the peer-assessment activity, only 16% of students submitted an attempt. For the video-feedback activity, only 12% submitted an attempt, and 43% of students watched the 7-minute video. For formative assessment to be effective, however, it requires active engagement from students, which is often a challenge (Gibbs & Simpson, 2004; Nicol & Macfarlane-Dick, 2006).

The feedback video activity was designed prior to the COVID-19 pandemic, but was delivered in the academic year in which the pandemic began, concurrently with the strictest period of lockdown experienced (Ali, 2020). At this time, lectures were abruptly forced online and delivered asynchronously, so the video format became less distinct from the rest of the module’s lectures as a mode of content delivery. In this setting, we found that students responded better to the peer-assessment activity. As such, we moved to using the peer-assessment structure, with a summative weighting, for both assignments of the academic year. In the second year, when these activities carried a weighted contribution towards the module grade, engagement greatly increased: engagement in the two peer-assessment activities was 75% and 73%, respectively.

A blended learning approach is inherent to this implementation, as digital tools are used at several stages of this framework. These include the VLE for accepting students’ submissions before the in-class activity, an online form (using Microsoft Forms) during the peer-assessment activity, allowing the lecturer to gauge students’ pace with the activity and facilitate large group discussions, and the development and delivery of the feedback video activity.

Focus groups

Participants

Study 2 was conducted with student cohorts over two academic years (2019–2020 and 2020–2021). Seven focus groups were conducted in total, online via Microsoft Teams. Four focus groups comprised participants from the first cohort of students (n = 14) and three comprised participants from the second cohort (n = 8). Focus groups comprised between two and four participants each. A total of 22 students (21 female, 1 male) registered on the research methods module took part. Participants’ ages ranged from 20 to 48 years (M = 24.2, SD = 6.4). Seventy-seven percent of participants were students from the UK and 23% were international students. Ten participants were registered on the undergraduate psychology course and 12 were registered on the postgraduate psychology conversion course. All students taking part in the focus groups were registered on the research methods module at the time of participation. Students on both courses share the teaching and assessment of the research methods module. Recruiting both undergraduate and postgraduate students for the evaluation of our new framework of assessment allowed us to gain further insights into student experiences across academic levels. The focus groups took place at the end of each academic year, following completion of the research methods module.

Ethical approval for this study was granted by the authors’ institution (Ref. F1062, F1288), adopting the same methods of participant recruitment and ethical procedures as Study 1. The objective of the focus group was to evaluate the impact of changes to module structure on the student learning experience. The same discussion topics were covered as in Study 1 (see Appendix 1).

Analysis

The same analytic approach was adopted as in Study 1 for the analysis of focus group transcripts.

Findings and general discussion

Focus group findings indicate a positive shift in learners' perceptions of both the assessment process and the learning environment. Our findings have several implications for practice, underscoring the significant impact that the design of our learning environment and our teaching approaches can have on student engagement, learning strategies, and attitudes towards challenging subjects. Here we highlight key findings relevant to the impact of changes in curriculum design, from the student perspective, and implications for practice.

Analysis of focus group data led to the development of an overarching theme of ‘It’s all in the design of the learning environment’. Embedded within this overarching theme were two themes: ‘Embedding active learning through a framework of authentic assessment’ and ‘Adopting effective approaches to content engagement’ (see Appendix 3 for themes and corresponding codes).

It’s all in the design of the learning environment

This overarching theme captures perspectives relating to the design of the physical and virtual learning environment and the impact that this can have on student learning, their learning experience, and self-efficacy associated with applying learning in different contexts.

Theme 1: Embedding active learning through a framework of authentic assessment

This theme captures students’ experience with and the impact of the design of the assessment.

Module content and assessment design support skills development

Participants spoke positively about the design of assessment in supporting their learning. The module content and assessment design were seen as supporting students in the development of important skills: in particular, interpreting and reporting research findings and thinking critically about research. Some students commented that they were able to apply this knowledge to other aspects of their course and felt confident in using this knowledge in their undergraduate research project. For example, one participant reported “it will help me a lot in my research project […] I understand how to write a results section and look at the data […] a lot more confident with it now because of stats [the module] […]” (Participant 1, FG 4).

Embedding a credit-bearing activity in the module enhances student engagement

Participants in the first student cohort commented on the perceived usefulness of the peer assessment activity; however, the purely formative nature of the activity led to reduced engagement. Participants in the second cohort, where engagement in the activity contributed towards the module grade, felt the benefits of the approach. Introducing a mark weighting to this activity based on engagement rather than performance was seen as positive by students, particularly with regards to motivating engagement. For example, one participant commented, “the idea of doing the practice ones and allocating 100% [mark] […] if you submit something is an excellent idea […] I'm not convinced that I wouldn't have given up if […] it was […] not worth any marks […] it's also quite confidence building […] I'm just gonna have a go and do the best that I can and that will be OK” (Participant 2, FG 5). These findings suggest that adding a weighted contribution to assessment activities can help to enhance student engagement, and subsequent learning gains. This aligns with strategies proposed by Winstone and Boud (2022).

Engagement with peer assessment activity facilitates understanding of content and confidence

Students commented that engaging with this activity supported their understanding of content. Engaging with the different elements of the peer assessment activity supported students in developing confidence and feeling prepared for the end-of-semester assessment. For example, one participant commented “it was quite nice having the feedback […] we could apply that to our actual exams” (Participant 1, FG 6). More specifically, completing the first part of the activity, where students engaged in formulating their answer to the research study scenario and, thus, applying knowledge to practice, supported their learning and gave students the opportunity to self-assess their understanding of content and identify any challenges or gaps in knowledge (e.g., “[…] I think having to write it myself made me realize how much I don't understand about regression. And then I could like resolve that. So I think it was quite helpful […]”; Participant 1, FG 6).

Engaging actively with material and scenarios provides students with experience that supports their understanding of content, development of skills, and ability to apply knowledge to practice (Gulikers et al., 2004; Svinicki, 2004; Swaffield, 2011). Our findings suggest that this active engagement not only enhances learning but also increases self-efficacy and decreases aspects of anxiety. The benefits of such active engagement are numerous and can be realised even in a large classroom setting. To facilitate this, we propose the integration of activities both in and out of the classroom, leveraging the capabilities of a blended learning approach. Digital tools can be employed to foster engagement, facilitate collaboration and sharing of ideas, and provide feedback. Additionally, classroom discussions can serve as a form of feedback on individual student work, providing students with valuable insights into their performance and areas for improvement (Boud & Molloy, 2013; Carless & Chan, 2017). Through this approach, we aim to create a more interactive and supportive learning environment that promotes active learning and reduces statistics anxiety.

Engagement in peer assessment activity enhances student understanding of assessment criteria and expectations

With regards to the in-class element of the activity, students felt that the use of resources, such as exemplars and an annotated answer, and the lecturer explaining the assessment criteria and their application to the task supported their understanding of the assessment criteria and expectations around the assessment. For example, one participant mentioned “I feel like if I hadn't attended that session I wouldn’t have maybe even known how to access the marking scheme or what's actually required for me […] it really, really helps your understanding, once you know what it is that the markers are looking for” (Participant 2, FG 5).

Our approach emphasises a thorough understanding of assessment criteria. It is crucial for students to comprehend what is expected of them, how their work will be evaluated, and the implications of assessment weighting (Gibbs & Simpson, 2004; Wiliam, 2011). This understanding can influence the effort exerted and the level of engagement with the task. By making the assessment criteria clear, there should be less ambiguity that could lead to unnecessary stress or misdirected efforts. This discussion around the approach to assessment is an integral part of our framework, designed to empower students with the knowledge they need to succeed and to engage more effectively with their coursework.

Activity engagement can function as formative feedback for the more heavily weighted, credit-bearing assessment

Students commented that the resources used in the sessions, the in-class discussion, and the overall feedback from the lecturer could be used as feedback for their own work. Some students commented specifically on how they went on to use the annotated answers provided by the lecturer to compare against their own work. For example, one participant reported “[…] the model answers and the other answers […] I do think those in themselves are a form of feedback […] I did actually go through all of them […] and compare and contrast to what I'd written […]” (Participant 1, FG 5). Some participants, however, mentioned that the lack of personalised feedback on their own work left them unsure whether what they had written was correct.

For some students, engaging in a marking activity of exemplars, guided by the marking criteria, was perceived as useful. Some students commented that the collaborative nature of the in-class task and the opportunity to apply knowledge gained in practice were helpful. For example, one participant mentioned “I found it really helpful […] understanding the actual data together […] peer assessed it together […] and then in the actual lecture […] what the lecturers were saying, like bit by bit and explaining why we needed to do that, why we needed to do this […]” (Participant 1, FG 4).

These findings suggest that assessments should be authentic and appropriate to the topic. They should be designed in ways that promote engagement and interaction, thereby making the assessment process more meaningful and less daunting for students (Gulikers et al., 2004; Svinicki, 2004; Swaffield, 2011). This could involve the use of real-world scenarios, collaborative projects, or problem-based assignments that allow students to apply what they have learned in a practical context. Our study, further, emphasises the importance of creating learning environments that facilitate the understanding of complex and often practical topics. Classrooms should be designed to support active and blended learning, with opportunities for peer interaction and feedback (Gibbs & Simpson, 2004; Wiliam, 2011). This could involve the use of digital tools to facilitate collaboration, the integration of active learning strategies such as group work or discussions, and the provision of timely and constructive feedback.

Some students use the annotated example as a template rather than considering the underlying assessment criteria

Our findings suggest that, for some students, it was the resources available that supported learning, such as the use of an annotated answer to the task, rather than the in-class discussion associated with the peer assessment activity. Some expressed that they did not feel particularly confident in peer discussions and peers’ assessment of work, given that the whole cohort was at the same level of learning and understanding of content. Some participants also described a greater focus on the annotated answer, using that to guide their preparation for the assessment, with less emphasis on and engagement with the marking criteria. For example, one participant mentioned “I have to be really honest. I didn't think I looked at the marking criteria before I wrote my mock answer […] I looked at lecture notes and like past examples and then just tried to basically recreate it […] … used the model answer […] almost as the marking criteria […] made tick boxes of everything that you need to include and this is what gets you marks kind of thing […]” (Participant 1, FG 7).

Although the annotated answer was seen as helpful to the majority of participants in supporting their learning, some participants commented that this resource was seen as a model or perfect answer, which may inhibit them from deviating from this template and thinking more in terms of applying assessment criteria and knowledge to a similar scenario in a new context. For example, one participant commented “I took [the example answer provided] as being incredibly formulaic, and for me that was personally what I needed from that. But […] I don't think it necessarily does give you that good an idea of how you might frame things differently, but still correctly” (Participant 1, FG 5).

Together, these findings suggest the importance of highlighting the objective of assessment criteria and how students can use these as a guide in understanding expectations around the assessment and to evaluate quality in their own work.

Theme 2: Adopting effective approaches to content engagement

This theme captures student perspectives around ways of motivating engagement with course content.

Some statistics anxiety remains and pre-existing mathematics knowledge can vary across students

Some anxiety around aspects of content was still evident in focus group discussions. For example, whereas some students spoke positively around the way the mathematical content was explained and integrated within the broader module content (e.g., “I don't really like math. But the way that it was […] explained and […] integrated with scenarios”; Participant 1, FG 3), others felt that lectures were quite theoretical in nature with fewer links with practical application of content. Students commented that being provided with opportunities to see how the content is applied in real world research would make the content less abstract and support their learning. Some examples of means by which to achieve this would include providing students with a research question which a particular research method would be suited to answer or bringing in published research in the field. For example, one participant commented “[…] when you're talking about a statistical method […] giving an example of a research project that could be used in because I think it's all very well saying this is an ANOVA, this is a regression. But […] we need more examples of when that's used […]” (Participant 1, FG 1).

The mathematical component was often seen as a challenge, with some students, particularly on the postgraduate course, feeling that teaching staff might be making assumptions about students’ level of prior knowledge. Some participants questioned the rationale behind the emphasis on maths, where this aspect was perhaps seen as less relevant for the assessment and future career path, whilst others felt that a greater breadth of research methods could be covered in less depth across the course. For example, one participant raised a point around “[…] too much effort put into the mathematics which is not needed. There are Statisticians who can do this for you and I would argue that as a clinical psychologist or as an experimental psychologist, you don't need to know the maths […] we have different programs that can do this for us. You could factor in so many other things into the module that are so much more useful […]” (Participant 2, FG 1).

With respect to approaching course content, participants expressed that a welcoming approach to statistics would be beneficial, with an initial emphasis on why the subject is covered as part of the course; the necessity of including mathematical content, for example in the form of statistical formulas, could then be introduced thereafter. For example, one student commented “[…] it would definitely be very beneficial for people who don't enjoy statistics […] to have a more welcoming approach to statistics […] why we need statistics? why is it important? What does it give us? […] to kind of warm people to the subject […]” (Participant 2, FG 1). This highlights the need for teaching approaches that help students understand the concepts as well as why they are important. Students need to be supported in developing statistical literacy, understanding that statistics is part of psychology and how statistical analyses reported in research papers should be interpreted. This goes beyond a superficial understanding and deferring to others (e.g., a statistician), and includes being able to evaluate whether research methods are adequate or verge on questionable research practices—a broader issue that underlies the current replication crisis in psychology, and science more broadly (Martinson et al., 2005; Nelson et al., 2018; Pownall et al., 2023; Simmons et al., 2011). Research methods need to be conveyed and taught as interconnected with different aspects of the curriculum.

Taking a holistic psychology curriculum-level approach to teaching research methods

Students also commented positively on how knowledge acquired in this module can be useful to other aspects of their course, and the benefits around alignment of research methods with other aspects of the curriculum. For example, students commented how helpful it was to learn the theoretical aspect of research methods and then apply this to some of the practical components of their course, such as laboratory classes for a different module, demonstrating the importance placed by students around aligning course content at curriculum level.

Focus group discussions also revealed students’ preferences and perspectives around different teaching and learning methods. For example, live demonstrations of the steps in interpreting statistical output, rather than the mere use of static PowerPoint slides, were seen as helpful in supporting learning. Similarly, students who engaged with the feedback video activity indicated that they saw clear benefits for their learning of content. Some students commented that being able to view the process of deriving the answer to the scenario presented in the activity live, rather than in the form of screenshots on a PowerPoint slide, supported their understanding (“[…] it was a lot easier to understand than screenshots of SPSS”; Participant 4, FG 2). The narrative accompanying this process, which highlighted where key information can be derived, was beneficial to students in terms of integrating information and seeing it all in practice. For example, one participant commented “[…] help to enhance my understanding of […] what all the numbers mean, what they relate to and what you're trying to find out. And […] walking through an answer […] just to really integrate it into your head is so important […]” (Participant 1, FG 3).

Our findings strongly advocate for a holistic, curriculum-level approach to the teaching of research methods. Given the foundational role that research methods play across various aspects of the course, we propose it is crucial that its teaching is aligned with other aspects of the course. This alignment not only enhances students' ability to perceive the relevance and value of research methods but also facilitates the application of knowledge across different areas of study, thereby enriching students' understanding of psychology research studies. However, our approach goes beyond mere alignment. It also emphasises active engagement with the content, which is instrumental in supporting student learning of the topic. While challenges with maths remain, our approach provides a rationale for the content and demonstrates how it links with other aspects of statistical content. This is achieved through a welcoming approach to the teaching of research methods, which includes the use of videos and live demonstrations to facilitate understanding.

Our approach also advocates for a gentler introduction to the module. This can ease students into the complexities of research methods, thereby reducing the potential for statistics anxiety. Our results underscore the importance of this holistic, curriculum-level approach to teaching research methods effectively. In essence, our approach not only ensures that research methods content is linked across the curriculum, providing students with opportunities to apply knowledge gained in research methods modules, but also fosters an engaging and supportive learning environment. Furthermore, the rationale and importance of statistics should be made clear to students, to enhance their understanding of the subject's relevance and value (McDonald & Barnard, 2023; Pownall et al., 2023).

Study limitations

When conducting the focus groups, we included both current students and those who had completed the module a year previously within the same discussion. This was done to facilitate scheduling groups of sufficient size for the focus group sessions, but it also meant that students at different stages of their degree progression participated in the same discussion. In future work, it would be preferable to schedule separate focus groups for these two cohorts. A further limitation of this study is that additional pedagogical questions could have been examined, such as how the different student cohorts (undergraduate vs. postgraduate conversion) view the topics discussed in the focus groups. Here we sought generality across these cohorts, though additional insights could be obtained from comparing the cohorts or using other qualitative approaches. Nonetheless, the insights discussed in this paper provide meaningful evidence and guidance on the use and implementation of authentic assessments, particularly within a research methods module.

Conclusions

In summary, our research demonstrates the development of a framework for facilitating students’ critical thinking and self-assessment using authentic assessments in teaching research methods, centred around assessment for learning. The framework aimed to enhance student engagement through activities like peer evaluation and reflection, while also promoting understanding of assessment criteria. Focus groups indicated this approach fostered a more positive learning environment and assessment process. Our findings underscore the need to simplify complex concepts, facilitate active learning and collaboration, and gently introduce challenging material to help students grasp relevance. Overall, we advocate for a holistic approach that links research methods content and actively engages students to increase self-efficacy and skills application. This demonstrates the substantial impact instructional design can have on student perceptions, learning strategies, and mastery of challenging subjects.

References

Ali, W. (2020). Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. Higher Education Studies, 10(3), 16-25. https://doi.org/10.5539/hes.v10n3p16

Allen, P. J., & Baughman, F. D. (2016). Active learning in research methods classes is associated with higher knowledge and confidence, though not evaluations or satisfaction. Frontiers in Psychology, 7, 279. https://doi.org/10.3389/fpsyg.2016.00279

Baloğlu, M. (1999). A comparison of math anxiety and statistics anxiety in relation to general anxiety. Texas A&M University – Commerce Department of Psychology. ERIC database. https://eric.ed.gov/?id=ED436703

Baloğlu, M. (2002). Psychometric properties of the statistics anxiety rating scale. Psychological Reports, 90(1), 315–325. https://doi.org/10.2466/pr0.2002.90.1.315

Barnard, M., Whitt, E., & McDonald, S. (2021). Learning objectives and their effects on learning and assessment preparation: insights from an undergraduate psychology course. Assessment & Evaluation in Higher Education, 46(5), 673-684. https://doi.org/10.1080/02602938.2020.1822281

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74. https://doi.org/10.1080/0969595980050102

Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. 1991 ASHE-ERIC higher education reports. ERIC. https://eric.ed.gov/?id=ED336049

Boud, D., & Molloy, E. (2013). Rethinking models of feedback for learning: The challenge of design. Assessment & Evaluation in Higher Education, 38(6), 698–712. https://doi.org/10.1080/02602938.2012.691462

Bourne, V. J. (2018). Exploring statistics anxiety: Contrasting mathematical, academic performance and trait psychological predictors. Psychology Teaching Review, 24(1), 35–43. https://doi.org/10.53841/bpsptr.2018.24.1.35

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology (Vol. 2): Research designs: Quantitative, qualitative, neuropsychological, and biological (pp. 57–71). American Psychological Association. https://doi.org/10.1037/13620-004

Brignell, C., Wicks, T., Tomas, C., & Halls, J. (2019). The impact of peer assessment on mathematics students’ understanding of marking criteria and their ability to self-regulate learning. MSOR Connections, 18(1), 46–55. https://doi.org/10.21100/msor.v18i1.1019

Brown, R. C. D., Hinks, J. D., & Read, D. (2016). A blended-learning approach to supporting students in organic chemistry: Methodology and outcomes. New Directions in the Teaching of Physical Sciences, 8, 33–37. https://doi.org/10.29311/ndtps.v0i8.492

Carless, D., & Boud, D. (2018). The development of student feedback literacy: Enabling uptake of feedback. Assessment & Evaluation in Higher Education, 43(8), 1315–1325. https://doi.org/10.1080/02602938.2018.1463354

Carless, D., & Chan, K. K. H. (2017). Managing dialogic use of exemplars. Assessment & Evaluation in Higher Education, 42(6), 930-941.  https://doi.org/10.1080/02602938.2016.1211246

Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education, 19(1). https://doi.org/10.1080/10691898.2011.11889596

Chiou, C. C., Wang, Y. M., & Lee, L. T. (2014). Reducing statistics anxiety and enhancing statistics learning achievement: Effectiveness of a one-minute strategy. Psychological Reports, 115(1), 297-310. https://doi.org/10.2466/11.04.PR0.115c12z3

Cruise, R. J., Cash, R. W., & Bolton, D. L. (1985). Development and validation of an instrument to measure statistical anxiety [Paper presentation]. Annual meeting of the Statistical Education Section, Chicago, IL.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

Gibbs, G., & Simpson, C. (2004). Conditions under which assessment supports student learning. Learning and Teaching in Higher Education, 1, 3-31. https://eprints.glos.ac.uk/3609/

Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86. https://doi.org/10.1007/BF02504676

Hanna, D., Shevlin, M., & Dempster, M. (2008). The structure of the statistics anxiety rating scale: A confirmatory factor analysis using UK psychology students. Personality and Individual Differences, 45(1), 68–74. https://doi.org/10.1016/j.paid.2008.02.021

Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, 39(7), 840-852. https://doi.org/10.1080/02602938.2013.875117

LaCosse, J., Ainsworth, S. E., Shepherd, M. A., Ent, M., Klein, K. M., Holland-Carter, L. A., Moss, J. H., Licht, M., & Licht, B. (2017). An active-learning approach to fostering understanding of research methods in large classes. Teaching of Psychology, 44(2), 117-123. https://doi.org/10.1177/0098628317692614

Macher, D., Paechter, M., Papousek, I., Ruggeri, K., Freudenthaler, H. H., & Arendasy, M. (2013). Statistics anxiety, state anxiety during an examination, and academic achievement. British Journal of Educational Psychology, 83(4), 535–549. https://doi.org/10.1111/j.2044-8279.2012.02081.x

Martinson, B. C., Anderson, M. S., & De Vries, R. (2005). Scientists behaving badly. Nature, 435(7043), 737–738. https://doi.org/10.1038/435737a

McDonald, S., & Barnard, M. P. (2023). The influence of prior experience with mathematics and A-Level science subjects on statistics anxiety in undergraduate psychology students. Psychology Teaching Review, 29(1), 37–50. https://doi.org/10.53841/bpsptr.2023.29.1.37

McDonald, S., & Blackie, L. E. (2023). A theoretical qualitative investigation exploring illness perceptions and decision-making about COVID-19 in an ethnically diverse UK-based sample. Patient Preference and Adherence, 17, 473-489. https://doi.org/10.2147/PPA.S389660

Messer, W. S., Griggs, R. A., & Jackson, S. L. (1999). A national survey of undergraduate psychology degree options and major requirements. Teaching of Psychology, 26(3), 164–171. https://doi.org/10.1207/S15328023TOP260301

Miller, A., & Pyper, K. (2023). Anxiety around learning R in first year undergraduate students: Mathematics versus biomedical sciences students. Journal of Statistics and Data Science Education, 32(1), 47-53. https://doi.org/10.1080/26939169.2023.2190010

Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s renaissance. Annual Review of Psychology, 69(1), 511–534. https://doi.org/10.1146/annurev-psych-122216-011836

Nicol, D., & Kushwah, L. (2023). Shifting feedback agency to students by having them write their own feedback comments. Assessment & Evaluation in Higher Education, 49(3), 419-439. https://doi.org/10.1080/02602938.2023.2265080

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090

Onwuegbuzie, A. J. (2000). Attitudes toward statistics assessments. Assessment & Evaluation in Higher Education, 25, 325–343. https://doi.org/10.1080/713611437

Onwuegbuzie, A. J., & Wilson, V. A. (2003). Statistics anxiety: Nature, etiology, antecedents, effects, and treatments – a comprehensive review of the literature. Teaching in Higher Education, 8(2), 195–209. https://doi.org/10.1080/1356251032000052447

Paechter, M., Macher, D., Martskvishvili, K., Wimmer, S., & Papousek, I. (2017). Mathematics anxiety and statistics anxiety. Shared but also unshared components and antagonistic contributions to performance in statistics. Frontiers in Psychology, 8, 1196. https://doi.org/10.3389/fpsyg.2017.01196

Pownall, M., Azevedo, F., König, L. M., Slack, H. R., Evans, T. R., Flack, Z., Grinschgl, S., Elsherif, M. M., Gilligan-Lee, K. A., De Oliveira, C. M. F., Gjoneska, B., Kalandadze, T., Button, K., Ashcroft-Jones, S., Terry, J., Albayrak-Aydemir, N., Děchtěrenko, F., Alzahawi, S., Baker, B. J., Pittelkow, M.-M., Riedl, L., Schmidt, K., Pennington, C. R., Shaw, J. J., Lüke, L., Makel, M. C., Hartmann, H., Zaneva, M., Walker, D., Verheyen, S., Cox, D., Mattschey, J., Gallagher-Mitchell, T., Branney, P., Weisberg, Y., Izydorczak, K., Al-Hoorie, A. H., Creaven, A.-M., Stewart, S. L. K., Krautter, K., Matvienko-Sikar, K., Westwood, S. J., Arriaga, P., Liu, M., Baum, M. A., Wingen, T., Ross, R. M., O'Mahony, A., Bochynska, A., Jamieson, M., Vel Tromp, M., Yeung, S. K., Vasilev, M. R., Gourdon-Kanhukamwe, A., Micheli, L., Konkol, M., Moreau, D., Bartlett, J. E., Clark, K., Brekelmans, G., Gkinopoulos, T., Tyler, S. L., Röer, J. P., Ilchovska, Z. G., Madan, C. R., Robertson, O., Iley, B. J., Guay, S., Sladekova, M., Sadhwani, S., & FORRT. (2023). Teaching open and reproducible scholarship: A critical review of the evidence base for current pedagogical methods and their outcomes. Royal Society Open Science, 10(5), 221255. https://doi.org/10.1098/rsos.221255

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632

Slootmaeckers, K., Kerremans, B., & Adriaensen, J. (2014). Too afraid to learn: Attitudes towards statistics as a barrier to learning statistics and to acquiring quantitative skills. Politics, 34(2), 191–200. https://doi.org/10.1111/1467-9256.12042

Svinicki, M. D. (2004). Authentic assessment: Testing in reality. New Directions for Teaching and Learning, 100, 23–29. https://doi.org/10.1002/tl.167

Swaffield, S. (2011). Getting to the heart of authentic Assessment for Learning. Assessment in Education: Principles, Policy & Practice, 18(4), 433–449. https://doi.org/10.1080/0969594X.2011.582838

Tai, J., Ajjawi, R., Boud, D., Dawson, P., & Panadero, E. (2018). Developing evaluative judgement: enabling students to make decisions about the quality of work. Higher Education, 76, 467-481. https://doi.org/10.1007/s10734-017-0220-3

Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249–276. https://doi.org/10.3102/00346543068003249

Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3–14. https://doi.org/10.1016/j.stueduc.2011.03.001

Wilkinson, S. (1998). Focus group methodology: A review. International Journal of Social Research Methodology, 1(3), 181-203. https://doi.org/10.1080/13645579.1998.10846874

Winstone, N. E., & Boud, D. (2022). The need to disentangle assessment and feedback in higher education. Studies in Higher Education, 47(3), 656-667. https://doi.org/10.1080/03075079.2020.1779687

Yucel, R., Bird, F. L., Young, J., & Blanksby, T. (2014). The road to self-assessment: Exemplar marking before peer review develops first-year students’ capacity to judge the quality of a scientific report. Assessment & Evaluation in Higher Education, 39(8), 971–986. https://doi.org/10.1080/02602938.2014.880400

Appendices

Appendix 1. Focus group schedule for Study 1 and Study 2

Below we detail our focus group schedule, organised into broad topic areas, each with a set of questions to be discussed.

1. General views on the module, including lecture content, content delivery, and online support via the virtual learning environment.

· Which aspects of the module have you found beneficial?

· Can you tell us about aspects of the module which you found particularly challenging, or aspects which you feel may not have supported you in your learning?

2. I would like us to discuss how you view this module in relation to other aspects of your psychology course. In particular, we would like you to consider the relevance of this module with respect to the psychology programme as a whole.

· How well do you feel you can apply knowledge or skills acquired through the module to your other modules?

3. This module is primarily delivered through weekly lectures.

· Study 1: What are your thoughts on the mode of delivery and the content covered?

· Study 2: We would now like to focus on the delivery of content in this module. This module is delivered through weekly sessions, with some sessions designed to be more interactive in nature (e.g., peer-assessment in-class task). What are your thoughts on these aspects of the module?

4. This next discussion topic focuses on the use of Moodle as an online support platform for this module.

· Can you share your experiences with Moodle in terms of how you have used it in the context of this module, which resources you have found particularly useful, and whether you might like to suggest any resources that we could add to support students in their learning?

5. We would now like to focus on the assessment for the module.

· Could you tell us your thoughts and experiences on the nature and format of this assessment?

Appendix 2. Themes developed in Study 1

Going beyond the surface level of statistics

Developing effective approaches to learning

· Feeling anxious about mathematical concepts
· Feeling anxious towards the theory behind statistical tests
· Feeling overwhelmed with the subject leads to decreased engagement
· Need for more exposure to and practice with real-life data
· Lower self-efficacy associated with applying module content
· Worries around progress on the module and in relation to peers
· Positive value and relevance of research methods to other aspects of the degree
· Research methods knowledge facilitates understanding of the research project and published literature
· Curriculum structure facilitates learning of content
· Videos and step-by-step guides support student learning
· Interactive classroom environment facilitates confidence and understanding of challenging content
· Opportunities for practice and feedback in the classroom can facilitate learning
· Nature and timing of assessment can contribute towards statistics anxiety
· Assessment design can influence student engagement with content and approach to learning
· Learning is assessment-driven
· Assessment should be authentic and relevant to the field of the subject

Appendix 3. Themes developed in Study 2

It’s all in the design of the learning environment

Embedding active learning through a framework of authentic assessment

Adopting effective approaches to content engagement

· Module content and assessment design support skills development
· Credit-bearing peer assessment activity enhances student engagement
· Engagement with the peer assessment activity overall facilitates understanding of content
· Students felt confident and prepared going into the exam having engaged with the peer assessment activity
· Writing a results section as part of the peer assessment activity was beneficial for self-assessing understanding and identifying challenges
· Feedback from the peer assessment task could be used for the exam
· The in-class elements of the peer assessment activity support understanding of the assessment criteria, expectations around the assessment, and assessment preparation
· The collaborative nature of the in-class task and the opportunity to apply knowledge gained in practice were helpful
· In some instances, focus was directed at the annotated example answers as a template for preparing for the exam, less so at the assessment criteria
· Some anxiety around aspects of content was still evident
· Further links between theory and practical application of content are needed
· Mathematical component often perceived as challenging
· Rationale for the emphasis on mathematics is not always evident to students
· Mathematical content can be perceived as less relevant for the assessment and future career path
· Preference for a more welcoming approach to research methods in the curriculum
· Taking a curriculum-level approach to the study of research methods benefits alignment of research methods with other aspects of the curriculum
· Mode of content delivery and student engagement can support student learning
· Feedback video resource with live step-by-step demonstrations of the process supports student learning

Acknowledgments

The authors would like to thank Rumandeep Hayre and Asiyya Jaffrani for their support in running the focus groups and transcription of data. We would also like to acknowledge the funding received through the Faculty of Science Education and Student Experience Grant Scheme at the University of Nottingham to support the project.