Navigating assessment challenges: Students’ reflection on preparation for exams, essays, presentations, and reports

 

Pauldy C. J. Otermans1, Chelsea C. Livingstone2, Lai A. Nasar1, David B. Tree1 and Stephanie A. Baines1

1 Brunel University London, UK
2 University of Otago, Dunedin, New Zealand

Corresponding Author:
Pauldy C. J. Otermans, Department of Psychology, Brunel University London, Kingston Lane, Uxbridge, UB8 3PH
Email: Pauldy.Otermans@brunel.ac.uk

Abstract

Assessments play a key role in students’ academic experience in higher education. This study investigates how students prepare for different types of assessments. Survey data collected from 104 BSc Psychology and BSc Psychology (Sport, Health and Exercise) students and 90 BSc Biomedical Sciences students showed wide variation in preparation time for assessments. Focus groups were then conducted to gain deeper insight into students’ perceptions of, and preparation for, assessments. Thematic analysis revealed three key themes: (1) the nature of the assessment influences the level of preparation, with multiple-choice exams often perceived as requiring minimal effort, while essays and lab reports demand more extensive critical thinking and preparation time; (2) procrastination is prevalent, especially for tasks that involve complex, in-depth work such as report writing; (3) peer collaboration plays a significant role, particularly in assessments that require structured thinking, such as essays and oral presentations. These findings underscore the need for educators to consider the diversity in student preparation strategies when designing assessments and support systems.

Keywords

assessment and feedback, study revision techniques, preparing for assessments, higher education


 

Introduction

Preparing for assessments is an important aspect of a student’s academic experience. It impacts not only their grades but also their understanding of the content they are learning (Entwistle & Entwistle, 1992). Students prepare for assessments in different ways, ranging from using the strategies that work best for their learning to adopting disciplined study habits (MacKenzie, 1994). The aim of this study was to conduct an in-depth, multi-methods exploration of students’ perspectives on effective methods of preparation for assessments, using a quantitative survey and focus groups.

Several studies suggest that students use different methods of preparation for different types of assessment (Ahdad & Ighilkrim, 2018; Biggs, 1979; Karagiannopoulou, 2006; Marton & Saljo, 1976; Scouller, 1998; Tang et al., 1999). Tang et al. (1999) investigated how students prepare for traditional and portfolio assessments in a Problem-Based Learning (PBL) curriculum. PBL treats students as active learners, enabling active construction of knowledge by involving students in meaningful situations while they use high-level learning strategies. The study focused on one final-year nursing course, using a questionnaire to evaluate how students prepared for different assessments. During this course the students experienced two teaching methods: a more traditional method (lectures and seminars) and a PBL approach. The traditional method was academically assessed using a multiple-choice exam, while the PBL approach was assessed using a portfolio in which students reflected on their PBL experience in relation to their clinical practice. The study concluded that for multiple-choice tests students mainly focused on memorising what they had learned in class. In contrast, when preparing for portfolios, students read more widely and used more cognitively challenging preparation strategies, such as application and reflection.

Similarly, Scouller (1998) compared multiple-choice examinations and assignment essays, focusing on two types of learning strategies: deep learning and surface learning. A deep learning approach focuses on meaning and understanding, whereas a surface approach focuses on recalling and reproducing information (Biggs, 1979; Marton & Saljo, 1976). The study found that students were more likely to use surface learning approaches for multiple-choice exams, as they perceived these exams as testing lower levels of cognitive processing. When working on essays, by contrast, students were more likely to use deep learning approaches because they perceived essays as assessing higher levels of cognitive processing. The use of a deep learning approach was positively associated with higher marks on the essay, but not on multiple-choice exams. Similar results were reported by Stanger-Hall (2012), who found that students preparing for multiple-choice exams tend to focus on memorisation and recognition tasks, whereas essay-based exams encourage the development of critical thinking and the ability to articulate complex arguments. Students also adopt different preparation techniques for open-book exams, often emphasising understanding and organisation of materials, whereas closed-book exams lead to increased rote memorisation (Agarwal et al., 2008).

Therefore, the assessment type impacts how students study based on how they perceive different assessments. Students adopt distinct strategies when preparing for formative assessments, which are ongoing and provide feedback, compared to summative assessments, which evaluate cumulative knowledge. Nicol and Macfarlane-Dick (2006) found that students engage more deeply with material and seek constructive feedback during formative assessments, enhancing self-regulated learning. Winstone and Carless (2020) also found that formative assessment leads students to adopt more iterative and reflective study practices, enhancing their engagement and learning outcomes.

Students facing practical assessments, such as laboratory work, prioritise hands-on practice and application of concepts. Conversely, theoretical assessments drive students toward extensive reading and conceptual understanding, indicating a strategic alignment of preparation methods with assessment demands (Biggs et al., 2022). Students preparing for group assessments focus on collaborative learning and communication skills, whereas individual assessments lead to solitary study practices (Slavin, 2014).

Research also suggests that approaches to studying can impact academic performance. Minbashian et al. (2004) found that students who used moderate levels of a deep approach to learning reproduced more information during an exam than those who used a lower level of the deep approach. However, further increases in deep approach learning were linked to a decrease in the amount of information reproduced, perhaps because these students prioritise understanding overall concepts rather than remembering every detail. The study also found that using a deep approach did not lead to better scores on questions that required transformation of the material compared with those that required simple recall, demonstrating that question type did not moderate the relationship between deep approach learning and exam grades. Academic performance is not impacted by deep approach learning alone, however: students with higher levels of self-efficacy tend to procrastinate less and perform better academically (Hassanbeigi et al., 2011).

The integration of technology in education has led to a shift in assessment methods. When preparing for digital assessments rather than traditional paper-based exams, students often engage more with interactive and multimedia resources, enhancing their engagement and understanding (Hillier, 2015). Students preparing for online assessments are more likely to engage with digital resources and collaborative tools, reflecting adaptability to the assessment environment (Williams & Wong, 2009). This also relates to the use of artificial intelligence (AI) and how educators can incorporate it into students’ learning journeys (Thomson et al., 2024).

Authentic assessments aim to measure students’ application of content and knowledge in real-world contexts (Gulikers et al., 2004). They have been shown to improve knowledge, higher-order skills (i.e., skills going beyond observation of facts and memorisation, such as critical thinking) and engagement (Azim & Khan, 2012; Raymond et al., 2013). Segers et al. (2008) investigated how portfolio assessment in a competency-based applied sciences programme affected students’ learning approaches and perceptions. The study focused on first- and second-year students who had experienced one year of portfolio assessment practice, and concluded that students use a deep approach to learning for this type of assessment, through reflection and the effective use of feedback. Huxham et al. (2012) compared student performance on, and attitudes towards, oral and written assessments. Students were split into two groups. The first group were randomly assigned to either oral or written assessments, and their scores on the same biology test were compared. The second group were given both oral and written assessments on ‘scientific’ and ‘personal development’ topics. Both groups performed better on oral assessments, which were perceived as more authentic and professional, despite students being nervous. The design of assessments also influences students’ study behaviours: Adapa (2015) suggests that assessments perceived as authentic and reflective of real-world tasks encourage a deep approach to learning, with students dedicating more time to understanding concepts rather than rote memorisation.

In summary, it is important to understand how students prepare for their assessments in order to potentially support them better. Therefore, the aim of this study was to explore how students prepare for their assessments. This was achieved through a quantitative survey and focus groups.

Methods

Participants

Participants were recruited from the BSc Psychology, BSc Psychology (Sport, Health and Exercise) and BSc Biomedical Sciences programmes to take part in a survey and focus group. Social media were used for recruitment. The Psychology SONA system was used to recruit psychology students, who can earn SONA credits for taking part in research studies; this is a requirement of their degree programme. In addition to the focus groups being advertised separately, students who completed the survey were asked at the end whether they would like to take part in a focus group. Those who agreed provided their email address and were contacted by the researchers.

Survey participants

A total of 116 BSc Psychology and BSc Psychology (Sport, Health and Exercise) students started the survey; 12 were removed because they did not complete it, leaving 104. For BSc Biomedical Sciences, 102 students started the survey and 12 were removed for the same reason, leaving 90. No demographics were collected.

Focus group participants

A total of 42 participants took part in the focus groups; 11 identified as male (26.2%), 30 identified as female (71.4%) and one participant did not provide their gender (2.4%). In terms of ethnicity, 17 (40.5%) identified as Asian/British Asian, 11 (26.2%) identified as White, six (14.3%) as Black/Black British African, three (7.1%) as Arab, two (4.8%) as Black/Black British Caribbean, two (4.8%) did not specify ethnicity, and one (2.4%) as Mixed White and Black Caribbean.

Data collection methods

Survey

Participants were asked to complete a short survey using JISC Surveys. For each assessment on their programme of study, participants were asked how long it took them to prepare for, complete and submit the assessment, with responses provided as free text. Participants were given the module code and name, the name of the assessment, and a short description of the assessment (e.g., ‘About how many hours did you take to prepare for and complete the PY1801 Portfolio for Academic and Employability Skills, Oral presentations, “Mock Job interview”?’). The only response requested for each assessment was how long it took to complete and submit it.

Focus groups

Semi-structured focus groups were used, as this method of generating data encourages participants to express their views and opinions whilst providing the opportunity to probe and ask follow-up questions and, consequently, generate rich, in-depth and detailed data (Braun & Clarke, 2013). The focus groups were conducted in person and audio recorded. The duration of each focus group is provided in Table 1.

Table 1. Duration of each focus group.

Focus group number | Duration

1 | 00:28:36
2 | 00:22:16
3 | 00:27:34
4 | 00:23:34
5 | 00:19:40
6 | 00:31:42

 

The focus groups explored topics such as authentic assessments, assessment preferences and academic motivation, as well as the different ways in which students prepare for assessments. Participants were asked demographic questions, followed by more specific questions about assessments, such as: “What activities do you include when preparing for assessments?” and “How long do you spend on preparing for assessments?”. Probing questions were also asked, such as “What does this all include?” and “Please give examples.” Questions about assessment guidance were not directly included in the focus groups; the emphasis was on the ways students prepared for assessments themselves rather than the ways academic staff supported them in their preparation. Participants were also asked how they understood and interpreted the survey question, and whether the question could be made clearer for future studies.

Procedure

This study was approved by the institution’s Research Ethics Committee. Participants were invited to complete the survey through a link and were presented with a participant information sheet and consent form before proceeding to the survey questions. When signing up for the focus groups, participants were presented with a participant information sheet and a consent form. For both the survey and the focus groups, participants were informed that they could withdraw their participation at any point, should they wish, and that no penalty would be applied. For the survey, participants were informed that their responses would remain confidential. For the focus groups, participants were informed that their contributions would remain anonymous but not confidential. At the end of the study, participants were thanked for their participation, received a debrief form and, if participating via SONA, gained one SONA credit for the survey and four SONA credits for taking part in the focus group. Participating via SONA was only possible for Psychology students, as this is part of their course; this was not the case for Biomedical Sciences students. Providing an incentive to only some participants is a potential limitation of this study, as it may bias which students choose to take part across cohorts; however, the research team chose to recruit from different programmes to widen participation.

Data analysis method

Survey data were analysed using IBM SPSS Statistics. If participants gave a range of hours, the midpoint was used (e.g., 15-20 became 17.5). If participants gave a weekly rate, such as “3 hours a week the whole term”, this was converted to 33 hours (given that a term is 11 weeks). Responses such as “Not yet started as exam date has not been released” or “Not yet completed” were excluded from the analyses.
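The analyses themselves were run in SPSS, but the coding rules above are mechanical enough to illustrate in code. The following Python sketch is purely illustrative: the function name, regular expressions and example responses are our own and not part of the study’s materials; it simply assumes the three rules described above (midpoint for ranges, weekly rate multiplied by an 11-week term, exclusion of “not yet” responses).

```python
import re

TERM_WEEKS = 11  # a term is 11 weeks, per the coding rule described above

def code_hours(response: str) -> float | None:
    """Convert a free-text duration response into hours, following the
    coding rules described in the text. Returns None for responses that
    were excluded (e.g., 'Not yet started')."""
    text = response.strip().lower()
    # Exclusion rule: assessments not yet started/completed are dropped.
    if text.startswith("not yet"):
        return None
    # Weekly-rate rule: "3 hours a week the whole term" -> 3 * 11 = 33.
    weekly = re.match(r"(\d+(?:\.\d+)?)\s*hours?\s*(?:a|per)\s*week", text)
    if weekly:
        return float(weekly.group(1)) * TERM_WEEKS
    # Range rule: "15-20" -> midpoint 17.5.
    rng = re.match(r"(\d+(?:\.\d+)?)\s*-\s*(\d+(?:\.\d+)?)", text)
    if rng:
        return (float(rng.group(1)) + float(rng.group(2))) / 2
    # Plain number: "12" or "12 hours".
    plain = re.match(r"(\d+(?:\.\d+)?)", text)
    return float(plain.group(1)) if plain else None

# The worked examples from the coding rules above:
assert code_hours("15-20") == 17.5
assert code_hours("3 hours a week the whole term") == 33.0
assert code_hours("Not yet completed") is None
```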

Focus groups were transcribed verbatim and analysed using thematic analysis, applying Braun and Clarke’s (2006) six-step approach: familiarisation with the dataset, followed by open coding and theme generation. The data were coded by one member of the team; these codes were then discussed by the wider research team and themes were generated. Through this discussion, the team checked that the coder’s work was accurate and reflective of the participants’ responses, supporting the reliability of the analysis. The team discussed the themes and finalised them together.

Results

Survey

As there were fewer than five responses for each Year 3 assessment in both Psychology and Biomedical Sciences, these data were not included in the results. Descriptive statistics on how long students prepared for and completed each assessment are shown in Table 2 for Psychology and Table 3 for Biomedical Sciences. Note that there is only one formative assessment in the Psychology programme on which students receive feedback that informs the summative assessment. The data show a very large spread, as indicated by the large standard deviation for each assessment. In the Biomedical Sciences course, there is a clear discrepancy between coursework and exams: students reported much more preparation for exams than for coursework assessments. In the Psychology course, no clear pattern can be identified, although the data suggest that students take the most time to complete assessments such as lab reports and qualitative research reports.
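As an aside, the summary statistics in Tables 2 and 3 and the exclusion rule above are straightforward to reproduce. The pandas sketch below is a minimal illustration using made-up data (the column names and values are hypothetical; the actual analysis was conducted in SPSS):

```python
import pandas as pd

# Hypothetical long-format data: one coded response (in hours) per row,
# as produced by the coding step sketched in the Methods section.
df = pd.DataFrame({
    "year": [1, 1, 1, 1, 1, 3, 3],
    "assessment": ["Clinical psychology MCQ exam"] * 5 + ["Year 3 exam"] * 2,
    "hours": [20, 17.5, 33, 120, 8, 10, 40],
})

# Summarise each assessment with the statistics reported in Tables 2 and 3.
summary = df.groupby(["year", "assessment"])["hours"].agg(
    N="count", Mean="mean", SD="std", Median="median", Min="min", Max="max"
)

# Each Year 3 assessment attracted fewer than five responses, so those data
# were excluded from the results; here that means dropping the year-3 rows.
summary = summary.drop(index=3, level="year")
print(summary.round(1))
```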

Table 2. Descriptive statistics showing how long (in hours) students prepared for and completed each Psychology assessment in Years 1 and 2.

Assessment | N | Mean ± SD | Median | Min | Max

Year 1

Exams
Clinical psychology MCQ exam | 92 | 39.7 ± 56.4 | 20 | 0 | 336
Research methods & statistics MCQ exam 1 | 103 | 35.1 ± 54.0 | 15 | 0 | 336
Brain and cognition MCQ exam | 83 | 40.8 ± 71.3 | 18 | 1 | 453
Research methods & statistics MCQ exam 2 | 80 | 42.8 ± 88.0 | 16 | 1 | 600
Sport psychology MCQ exam | 6 | 15.2 ± 22.6 | 6 | 2 | 60

Lab reports
Research methods & statistics lab report 1 | 104 | 45.2 ± 118.8 | 15 | 0 | 912
Research methods & statistics lab report 2 | 89 | 32.3 ± 73.1 | 15 | 0 | 588

Essays
Learning and social psychology formative essay | 93 | 31.6 ± 83.6 | 12 | 0 | 730
Learning and social psychology summative essay | 99 | 36.0 ± 84.9 | 12 | 0 | 588

Written reflections
Written reflection on skills development | 81 | 9.8 ± 15.5 | 5 | 0 | 80
Written reflection on taking part in research studies | 87 | 8.6 ± 20.3 | 2 | 0 | 168

Presentations
Oral presentation (job interview) | 82 | 9.9 ± 16.4 | 5 | 0 | 100
Sport psychology oral group presentation | 5 | 13.3 ± 10.3 | 10 | 3 | 27.5

Year 2

Exams
Statistics exam | 56 | 38.2 ± 105.9 | 15 | 1 | 800
Developmental psychology take-home exam | 37 | 33.3 ± 80.9 | 15 | 0 | 500

Lab reports
Qualitative research report | 53 | 42.1 ± 135.5 | 18 | 2 | 1000
Quantitative lab report | 54 | 28.3 ± 67.2 | 15 | 2 | 500
Mini research project report | 4 | 36.0 ± 26.0 | 32.5 | 9 | 70

Essays
Conceptual and historical issues in psychology essay | 47 | 29.5 ± 58.2 | 16 | 0 | 400
Cognitive neuroscience synoptic essays | 51 | 29.2 ± 70.3 | 10 | 1 | 500
Social psychology and individual differences synoptic essay | 31 | 38.4 ± 105.6 | 20 | 0 | 600

Written reflections
Conceptual and historical issues in psychology reflective account | 50 | 6.0 ± 10.0 | 2.5 | 0 | 48
Written reflection on taking part in research studies | 40 | 5.8 ± 8.9 | 2 | 0 | 48

Presentations
Poster | 37 | 17.8 ± 49.0 | 6 | 0 | 300
Presentation on the final year project proposal | 3 | 24.3 ± 31.1 | 10 | 3 | 60
e-poster | 5 | 12.6 ± 11.9 | 6 | 3 | 30


Table 3. Descriptive statistics showing how long (in hours) students prepared for and completed each Biomedical Sciences assessment in Years 1 and 2.

Assessment | N | Mean ± SD | Median | Min | Max

Year 1

Exams
Research and communication skills statistics and bioinformatics MCQs | 80 | 22.5 ± 35.6 | 10 | 0 | 250
Biomedical sciences examination | 60 | 106.4 ± 131.9 | 40 | 1 | 600
Synoptic examination | 67 | 61.5 ± 107.1 | 24 | 2 | 700

Essays
Research and communication skills portfolio | 90 | 28.2 ± 40.3 | 10 | 0 | 200

Lab reports
Practical skills biochemical analysis | 73 | 24.3 ± 30.0 | 10 | 0 | 150
Practical skills molecular analysis | 73 | 28.6 ± 44.1 | 15 | 0 | 280
Practical skills microscopy discussion and quiz | 77 | 19.4 ± 24.3 | 10 | 0 | 130

Presentations
Research and communication skills oral presentation | 80 | 28.1 ± 33.7 | 15 | 0 | 200

Year 2

Exams
Data evaluation and reporting in-course MCQ | 75 | 33.6 ± 59.1 | 10 | 0 | 400
Data analysis, interpretation and presentation in-course MCQ | 74 | 46.6 ± 94.0 | 11 | 0 | 565
Biomedical sciences examinations | 65 | 145.7 ± 200.8 | 60 | 0 | 1256
Synoptic examination | 57 | 84.1 ± 104.5 | 40 | 0 | 400

Essays
Primary literature interrogation and synthesis | 77 | 50.8 ± 57.1 | 30 | 0 | 350

Lab reports
Data evaluation and reporting case study | 77 | 58.7 ± 83.6 | 25 | 1 | 400

Written reflections
Career skills portfolio (CV, cover letter, reflection on education, recording of an interview) | 81 | 26.9 ± 27.9 | 16 | 0 | 120

Presentations
Data analysis, interpretation and presentation poster | 75 | 57.0 ± 98.8 | 25 | 0 | 720

 

The wide variation in survey responses suggests that students may interpret “preparing for an assessment” differently. Therefore, focus groups were conducted to explore in depth how students prepare for assessments. Key points discussed in the focus groups were the ways students prepare for assessments, what they understand by the time it takes to prepare for an assessment, and how they perceive study time. Below are the three themes generated from our thematic analysis.

Theme 1: Common studying techniques

When asked about the studying techniques they used when revising content and preparing for assessments, students provided a wide range of responses. These ranged from studying content with friends, doing practice quizzes, and using flashcards, to using Google Scholar and books to gather further information not discussed in lectures. The most common technique was revising previous lecture slides, both for coursework assessments and for exams. Participant 19 (P19) describes their studying strategy:

I sit down for 20 minutes and look over the lecture slides or look over my notes, which is mainly what I'll be doing to revise, I just find little and often that leaving more time just helped me a lot this year (Participant 19, Focus Group 4, line 61 (P19, FG4, L61)).

P18 outlines a similar process “I try to go over the slides that are relevant to the assignment, and understand what's going about” (P18, FG4, L47). Other students agree that revising lecture content helps solidify the knowledge presented in class and helps them understand how to carry out their assessments.

Many of the students liked to combine common studying techniques as a means of better understanding the content. P4 discusses mixing studying techniques:

I would say, just like going, go to the library with my friends on your course, making notes and like flashcards and like breaking down the material that you've been given in the lectures to like, find the specific things that you would actually be asked (P4, FG1, L238).

Some students used different study methods based on the type of assessment, i.e., coursework assessments versus exams. P19 outlines the different methods of study they use for coursework and exams:

For like, if it's an actual test, I need to get all the information and I'll look over like, lecture slides and lecture recordings. But if it's just for like assignments and essays, and I just need like general information on specific things, then I'll just do like, lecture slides and Google Scholar and potentially even like AI software (P19, FG4, L106).

Through the focus groups, students reported using a combination of different study techniques, from reviewing lecture slides and creating flashcards to utilising digital resources such as Google Scholar and AI tools; the survey did not capture any details of the study techniques students used. These findings align with the literature on effective study habits, highlighting the use of active learning and of diverse study methods tailored to the nature of the assessment (Cottrell, 2013; Roediger & Butler, 2011). The most common study technique mentioned by students was revising lecture slides. This aligns with research by Roediger and Butler (2011), which emphasised the importance of reviewing and engaging with course materials multiple times to enhance retention and understanding of the content. Students mentioned using lecture slides not only for exam preparation but also for coursework, indicating that lecture content serves as the foundation for both types of assessment. This is in line with the findings of Cottrell (2013), who suggests that lecture slides often encapsulate the key points needed for both understanding and applying knowledge in assessments. Students also mentioned revising with other students. Studying with peers is a form of collaborative learning, which has been shown to promote higher-level thinking and critical thinking and to improve retention (Johnson et al., 2014). We can therefore infer that discussing content with peers can help clarify doubts and improve understanding.

The use of AI tools, as mentioned by some students, is a relatively new development in students’ revision practices, as became apparent in the focus groups. AI tools may help students organise information, generate summaries, simulate quizzes, and provide support with structure (Thomson et al., 2024). Emerging research suggests that these tools can provide personalised learning support catering to individual student needs (Luckin & Holmes, 2016), and this is an area educators could focus on.

Theme 2: Feelings associated with studying, assessments and exams

A variety of emotions were associated with revising, completing assessments and sitting exams. One of the most common feelings associated with exams and assessments was stress. P5 describes their exam-related anxiety: “stress, anxiety it’s a lot of like lot of pressure with […] sitting exams at university […] your heart races you’re always stressed about everything cause […] you don’t know what the questions are in advance” (P5, FG2, L350). P29 also describes their dislike of exam settings, “when it comes to exams, I hate exam halls [laughs] feels like I can’t breathe” (P29, FG5, L391). In general, although students found coursework stressful, they tended to believe exams were more stress-inducing, particularly statistics exams. P42 highlights this discrepancy in stress levels:

Assignments we do quite often. So it's like, the more you do it, the more you get comfortable […]. But stats exams, we don't do it as often we don't like, for me personally, I've only had one stats exam this whole year. […] So obviously it’s gonna be more nerve wracking than if you're used to doing something every other month” (P42, FG6, L364).

Although students tend to associate a high level of stress with exams and coursework, they also acknowledge that they feel a great sense of freedom and relief after their exams are finished. P22 expresses this: “Yeah, it can be stressful and overwhelming, but the feeling at the end is probably, you know, quite freeing that you don't have to do it anymore” (P22, FG4, L284).

Another common response associated with coursework and exams is procrastination. Many students seemed to have trouble getting started early on assessments, instead leaving their studying to the last minute. P38 illustrates this process of procrastination:

With things like um reports, truthfully, I do leave them very last minute. And I feel like I'm like, generally a like very last-minute person, which is obviously bad. Um, so yeah […] for example, the last report I had, I literally started three hours before, which is really bad (P38, FG6, L204).

P24 also describes finding it difficult to get started on assessments early:

[…] if I have an assignment due, and I say to myself okay I’m going to start a week earlier, nothing, my brain is not focusing, but if I say right, the assignment was due in two days, all of a sudden I can sit on my desk and get work. It's like I work better under pressure” (P24, FG4, L445).

A common emotion associated with this procrastination response to assessments was guilt. This is highlighted by P6 when comparing their work ethic to their friends’:

umm I think uhh the anxiety of seeing like your friends preparing in advance like way in advance it kind of makes me think or makes you feel guilty for not doing the same but umm I do procrastinate a lot and push it like the others (P6, FG2, L319).

Our focus group results are consistent with prior research showing that students experience considerable anxiety when preparing for exams and assessments (Pachole et al., 2023). High levels of psychological distress are associated with impaired performance and inhibited learning, and can increase attrition rates (Lyndon et al., 2014; Turner & McCarthy, 2017). It is therefore important that universities consider methods to mitigate such anxiety, through interventions such as cognitive, behavioural or mindfulness approaches, which have been demonstrated to reduce student assessment stress (Regehr et al., 2013). The students in our focus groups tended to find exams more stressful than coursework. This might occur for several reasons. Firstly, exams are typically time-constrained, requiring students to demonstrate their knowledge and skills within a limited period. This can heighten anxiety during preparation, as students anticipate the need to recall and apply information quickly, without access to external resources and often under intense pressure, similar to the findings of Kavanagh et al. (2016). This is exacerbated by the fact that exams typically require students to study and review large amounts of material in a relatively short time, especially if multiple exams are scheduled close together, which can lead to feelings of being overwhelmed and underprepared. In addition, unlike coursework, where students have time to research, plan and revise, exams often involve elements of unpredictability: students may feel uncertain about which topics will be covered or how questions will be phrased, leading to increased anxiety, as also identified by Duraku (2017). With coursework, students have more control over their pace, the resources they use, and the time they invest in the assessment. Exams, on the other hand, offer little flexibility, which can lead to feelings of helplessness and increased stress.

In line with our results, procrastination is a prevalent response among students when faced with coursework and exams, with some studies reporting that up to 80% of students frequently delay academic tasks (Fentaw et al., 2022; Hidayat & Hasim, 2023). This tendency is well illustrated by our students’ experiences, which highlight the challenges they face in managing time effectively despite understanding the negative consequences. Common reasons for procrastination include poor time management, lack of motivation, fear of failure, and distractions (Dub, 2021; Hidayat & Hasim, 2023). This behaviour can negatively impact academic performance and emotional well-being, leading to guilt, anxiety, and discomfort (Dub, 2021). Difficulty in initiating assignments early, with the ability to focus only when the deadline is imminent, suggests that the urgency of an approaching deadline can sometimes act as a motivator and may increase productivity and creativity as deadlines approach (Zhu, 2023), albeit at the cost of increased stress (Fentaw et al., 2022). To combat academic procrastination, interventions focusing on improving time management skills, increasing confidence, reducing distractions, and setting realistic goals are recommended (Fentaw et al., 2022; Hidayat & Hasim, 2023). Universities are encouraged to provide support through counselling and training programmes to address this issue (Fentaw et al., 2022).

A common emotion among students who procrastinate is guilt, particularly when they compare themselves to peers who begin their work earlier (Dub, 2021; Sommer, 1990). This is consistent with our data, where students expressed the anxiety and guilt experienced when observing friends who prepare well in advance. The guilt stems from a recognition that procrastination is not an optimal strategy, yet the pattern persists, driven by a complex mix of anxiety, pressure, and sometimes a belief in the ability to perform well under tight deadlines (Kamran & Fatima, 2013). This pattern of procrastination and the accompanying feelings of guilt reveal the psychological toll that procrastination can take on students (Kamran & Fatima, 2013). While some may find that pressure enhances their focus (Zhu, 2023), the overall impact is often negative, leading to increased stress and a sense of inadequacy (Munda & Tiwari, 2024). Our results and the wider literature therefore suggest that procrastination is a common yet problematic response to coursework and exams among university students. While it may provide short-term motivation for some, it is often accompanied by feelings of guilt and anxiety, underscoring the need for better time management strategies and support systems to help students break the cycle of procrastination.

Theme 3: Time spent on exams and assignments

In terms of how long students spent revising for an exam, the most common answers were given in hours: most students said it takes them between 12 and 24 hours to prepare for an exam. For coursework, on the other hand, students said it takes them between one and two weeks on average to prepare a piece of coursework. There were a few outliers with regard to this theme; some students started preparing very early, up to a month before the exam or coursework was due, as highlighted by P15:

um, so I’ll start looking at an assignment probably like a month before and then like um as it gets closer I’m spending more hours to probably like…two, four hours a day so I don’t leave it to last minute and it builds up. [pause] (P15, FG3, L137).

Other students deviated from the norm in the opposite direction, mentioning they would often ‘cram’ their studying and/or assessment work all into one or two nights. This theme of ‘cramming’ study into one night is highlighted by P21 when asked how long they spent preparing for exams, “So un-unfortunately I do not plan for them, as well as I want them to be, so, usually it's the night before” (P21, FG4, L36).

Students also added that the time they spent on each assessment depended on the type of assessment it was and how much it contributed toward their overall grade. For instance, students tended to spend less time on MCQ exams than essay-based exams due to essays needing more thought and creativity. P35 outlines their opinion on the time they spend on exams:

Typically, from the MCQ answers, you'll be able to jog your own memory and select the right answer from process of elimination. But if it's an essay, I might spend a little bit more time writing out practice essays and going over my answers and like paragraph structures (P35, FG6, L116).

P35 also states “If it’s like multiple choice, then, I won’t lie, I probably won't spend long revising because I'll just go over my notes” (P35, FG6, L111). Students found it rather difficult to estimate how many hours they had spent on a module as a whole. P2 comments “So I would say that I can't really put a time on it, because it's a period of four months, too long to actually count every single one” (P2, FG1, L408). Those who were able to give a rough estimate generally suggested that a module would take them around 25 hours to complete.

These results show different patterns in how students allocate time to revise and work on their assessments. A majority of students reported that they spent between 12 and 24 hours preparing for exams, often in the period closely leading up to the exam. Some students indicated that they spent minimal time revising for MCQs, whilst they spent more time revising for essay-based exams, as these require more cognitive engagement and a deeper understanding of the content. This is consistent with the findings of Biggs et al. (2022), who discussed how surface-level strategies, such as rote memorisation, are often used for assessments perceived as less cognitively demanding (such as MCQs), whereas deep-level strategies, which require more time and effort, are used for assessments requiring critical thinking and analysis. The focus group data further suggested that the weight of an assessment also influences the time students spend preparing for it. This is in line with previous research: Gibbs and Simpson (2005) argued that students are often very pragmatic in their study approach and invest more effort (and time) in assessments that carry greater weight in their final grade. Educators should take the weight of the assessment into account when supporting students in their assessment preparations.

Conclusion

Our analysis of the results indicates that students prepare for assessments in a variety of ways. Table 4 provides a summary overview of how students prepare for each type of assessment.

Table 4. Examples of students’ comments on how, and for how long, they prepare for different types of assessments.

Type of assessment | Students’ comments on how and how long they prepare for it

MCQ exam | Attending the teaching sessions. Preparing for the exam a few days before, or cramming the night before. Using AI to create flashcards to study the materials.

Other type of exam | For practical exams, practising at home and going over revision notes and activities. Reading over lecture notes and/or reviewing lecture recordings. More time needed to prepare than for an MCQ exam. Using AI tools to generate information, create summaries or simulate quizzes.

Lab report | Data collection through the term. Structure and data analysis were the most difficult elements. Doing the final parts of the report last minute. AI tools may help to provide support with structure.

Essay | Takes more time to prepare for, as students often need to find relevant readings. Discussing the structure or skeleton with other students, or simply brainstorming. This type of assessment is more of an ongoing project than a sprint. AI tools may help to provide support with structure.

Oral presentation | Often practised a few times at home (sometimes with other people). Does not take much preparation and is often done close to the actual presentation timeslot.

As a result of our analysis, we have identified the various methods and timeframes that students use to prepare for different types of assessments, beyond attending timetabled revision sessions and drop-ins. The findings suggest three key take-aways. First, it is important to consider the nature of the assessment: the level of preparation is often linked to the perceived complexity of the assessment. MCQ exams are seen as less demanding, leading to minimal preparation, whereas essays and lab reports require more in-depth critical thinking and therefore more extensive preparation and more time. Second, many students procrastinate, especially on tasks that involve report writing or in-depth preparation. Third, students often seek peer input for assessments such as essays and oral presentations, for example to discuss their essay structure and hear their peers’ views on how to tackle the assessment. Taken together, our results suggest that educators should be aware of these elements when supporting students in their assessment preparations and when designing assessments. Given that students noted that working with their peers is helpful in preparing for assessments, educators could highlight this in teaching sessions as an approach their students might find helpful.

Given the findings of this study, further research could be undertaken on how students prepare for assessments. Future studies could delve deeper into the ways AI tools can be used to support students in preparing for assessments. In addition, future research could explore to what extent assessment writing with the appropriate support of AI tools supports deep learning. Another aspect future research could consider is the support provided by educators during the assessment preparation process. Studies could examine how different types of assessment or degree programmes support students while they are working on their assessments in the lead-up to submission. This might include auditing in-class support (e.g., activities during teaching sessions to work on an assessment) and support outside class (e.g., office hours).

Using AI tools in assessment preparation

Results of this study showed that students use AI tools to support the preparation and completion of assessments. The role of AI tools in these preparations can be multifaceted. For example, AI systems integrated into writing assistants (e.g., Grammarly) can provide students with immediate feedback on errors and suggestions for improving their writing, allowing them to refine their work before submission. In addition, AI-enabled collaborative tools can support students’ group work by providing summaries of team meetings with key action points, and by supporting task distribution and scheduling. Finally, tools like ChatGPT can support students in generating ideas and essay outlines, and can help them get started on their assessments. Of course, there are challenges here too. Students may use AI unethically, such as by using the tools to generate entire essays, and students can become overdependent on such tools, which discourages them from developing skills such as critical thinking and problem-solving. Teaching students how to use AI tools effectively, whilst helping them avoid these pitfalls, is an important focus in modern higher education.

References

Adapa, S. (2015). Authentic assessment tasks: Students take a deep approach to learning. eLearn, 2015. https://doi.org/10.1145/2767532.2749228

Agarwal, P. K., Karpicke, J. D., Kang, S. H., Roediger III, H. L., & McDermott, K. B. (2008). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology, 22(7), 861-876. https://doi.org/10.1002/acp.1391

Ahdad, W., & Ighilkrim, S. (2018). Investigating students' revision strategies for the preparation of their exams: The case of Master One students in the Department of English at MMUTO [Doctoral dissertation, Mouloud Mammeri University of Tizi-Ouzou].  https://dspace.ummto.dz/server/api/core/bitstreams/45cf13e4-db9a-47fa-a3e9-d5b286df3d32/content

Azim, S., & Khan, M. (2012). Authentic assessment: An instructional tool to enhance students learning. Academic Research International, 2(3), 314–320. http://ecommons.aku.edu/pakistan_ied_pdcc/11

Biggs, J. (1979). Individual differences in study processes and the quality of learning outcomes. Higher Education, 8(4), 381–394. https://doi.org/10.1007/bf01680526

Biggs, J., Tang, C., & Kennedy, G. (2022). Teaching for quality learning at university (5th ed.). McGraw-Hill Education (UK).

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. SAGE.

Cottrell, S. (2013). The Study Skills Handbook. Palgrave Macmillan.

Dub, V. (2021). Features of students’ procrastination. Problems of the Humanitarian Sciences. Psychology Series, 22-41. https://doi.org/10.24919/2312-8437.47.229345

Duraku, Z. H. (2017). Factors influencing test anxiety among university students. The European Journal of Social & Behavioural Sciences, 18(1), 69-78. https://doi.org/10.15405/ejsbs.206

Entwistle, A., & Entwistle, N. (1992). Experiences of understanding in revising for degree examinations. Learning and Instruction, 2(1), 1–22. https://doi.org/10.1016/0959-4752(92)90002-4

Fentaw, Y., Moges, B.T., & Ismail, S.M. (2022). Academic procrastination behavior among public university students. Education Research International, 2022, 1277866. https://doi.org/10.1155/2022/1277866

Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, (1), 3-31. https://eprints.glos.ac.uk/id/eprint/3609

Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86. https://doi.org/10.1007/bf02504676

Hassanbeigi, A., Askari, J., Nakhjavani, M., Shirkhoda, S., Barzegar, K., Mozayyan, M. R., & Fallahzadeh, H. (2011). The relationship between study skills and academic performance of university students. Procedia - Social and Behavioral Sciences, 30(1), 1416–1424. https://doi.org/10.1016/j.sbspro.2011.10.276

Hidayat, M., & Hasim, W. (2023). Putting it off until later: A survey-based study on academic procrastination among undergraduate students. Journal of Educational, Cultural and Psychological Studies (ECPS Journal), 28, 27-38. https://doi.org/10.7358/ecps-2023-028-taha

Hillier, M. (2015). e-Exams with student owned devices: Student voices. Proceedings of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE) Conference, 2015, 465-470. 

Huxham, M., Campbell, F., & Westwood, J. (2012). Oral versus written assessments: A test of student performance and attitudes. Assessment & Evaluation in Higher Education, 37(1), 125–136. https://doi.org/10.1080/02602938.2010.515012

Johnson, D. W., Johnson, R. T., & Smith, K. A. (2014). Cooperative learning: Improving university instruction by basing practice on validated theory. Journal on Excellence in College Teaching, 25(4), 85-118. https://celt.miamioh.edu/index.php/JECT/article/view/454

Kamran, W., & Fatima, I. (2013). Emotional intelligence, anxiety and procrastination in intermediate science students. Pakistan Journal of Social and Clinical Psychology, 11(2), 3-6.

Karagiannopoulou, E. (2006). The experience of revising for essay type examinations: differences between first and fourth year university students. Higher Education, 51(3), 329–350. https://doi.org/10.1007/s10734-004-6383-8

Kavanagh, B. E., Ziino, S. A., & Mesagno, C. (2016). A comparative investigation of test anxiety, coping strategies and perfectionism between Australian and United States students. North American Journal of Psychology, 18(3).

Luckin, R., & Holmes, W. (2016). Intelligence unleashed: An argument for AI in education. Pearson. https://discovery.ucl.ac.uk/id/eprint/1475756/

Lyndon, M.P., Strom, J.M., Alyami, H.M., Yu, T.W., Wilson, N.C., Singh, P., Lemanu, D.P., Yielder, J., & Hill, A.G. (2014). The relationship between academic assessment and psychological distress among medical students: A systematic review. Perspectives on Medical Education, 3, 405 - 418. https://doi.org/10.1007/s40037-014-0148-6

MacKenzie, A. M. (1994). Examination preparation, anxiety and examination performance in a group of adult students. International Journal of Lifelong Education, 13(5), 373–388. https://doi.org/10.1080/0260137940130504

Marton, F., & Saljo, R. (1976). On qualitative differences in learning outcome as a function of the learners’ conception of the task. British Journal of Educational Psychology, 46(2), 115–127. https://doi.org/10.1111/j.2044-8279.1976.tb02304.x

Minbashian, A., Huon, G. F., & Bird, K. D. (2004). Approaches to studying and academic performance in short-essay exams. Higher Education, 47(2), 161–176. https://doi.org/10.1023/b:high.0000016443.43594.d1

Munda, D. T., & Tiwari, V. K. (2024). The impact of academic procrastination on students' performance in Indian school education systems: A special research analysis-vision 2045. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4832564

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218. https://doi.org/10.1080/03075070600572090

Pachole, N., Thakur, A., Menon, M., & Peepre, K. (2023). A study to explore patterns and factors of depression, anxiety and stress among students preparing for competitive exams in central India. International Journal of Community Medicine and Public Health. https://doi.org/10.18203/2394-6040.ijcmph20230918

Raymond, J. E., Homer, C. S. E., Smith, R., & Gray, J. E. (2013). Learning through authentic assessment: An evaluation of a new development in the undergraduate midwifery curriculum. Nurse Education in Practice, 13(5), 471–476. https://doi.org/10.1016/j.nepr.2012.10.006

Regehr, C., Glancy, D., & Pitts, A. (2013). Interventions to reduce stress in university students: A review and meta-analysis. Journal of Affective Disorders, 148(1), 1-11. https://doi.org/10.1016/j.jad.2012.11.026

Roediger, H. L., & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences, 15(1), 20-27. https://doi.org/10.1016/j.tics.2010.09.003

Scouller, K. (1998). The influence of assessment method on students’ learning approaches: Multiple choice question examination versus assignment essay. Higher Education, 35(4), 453–472. https://doi.org/10.1023/a:1003196224280

Segers, M., Gijbels, D., & Thurlings, M. (2008). The relationship between students’ perceptions of portfolio assessment practice and their approaches to learning. Educational Studies, 34(1), 35-44. https://doi.org/10.1080/03055690701785269

Slavin, R. E. (2014). Cooperative learning and academic achievement: Why does groupwork work? Annals of Psychology, 30(3), 785-791.

Sommer, W. G. (1990). Procrastination and cramming: How adept students ace the system. Journal of American College Health, 39(1), 5-10. https://doi.org/10.1080/07448481.1990.9936207

Stanger-Hall, K. F. (2012). Multiple-choice exams: An obstacle for higher-level thinking in introductory science classes. CBE—Life Sciences Education, 11(3), 294-306. https://doi.org/10.1187/cbe.11-11-0100

Tang, C., Lai, P., Arthur, D., & Leung, S. F. (1999). How do students prepare for traditional and portfolio assessment in a problem-based learning curriculum? In Themes and Variations in PBL: Refereed Proceedings of the 1999 Bi-Ennial PBL Conference, 1, 206–217. Australia Problem-Based Learning Network.

Thomson, S., Pickard-Jones, B., Baines, S., & Otermans, P. C. J. (2024). The impact of AI on education and careers: What students think. Frontiers in Artificial Intelligence, 7, 1457299. https://doi.org/10.3389/frai.2024.1457299

Turner, K., & McCarthy, V.L. (2017). Stress and anxiety among nursing students: A review of intervention strategies in literature between 2009 and 2015. Nurse Education in Practice, 22, 21-29. https://doi.org/10.1016/j.nepr.2016.11.002

Williams, J. B., & Wong, A. (2009). The efficacy of final examinations: A comparative study of closed-book, invigilated exams and open-book, open-web exams. British Journal of Educational Technology, 40(2), 227-236. https://doi.org/10.1111/j.1467-8535.2008.00929.x

Winstone, N. E., & Carless, D. (2020). Designing effective feedback processes in higher education: A learning-focused approach. Routledge.

Zhu, F. (2023). The positive and negative aspects of procrastination in college students. Journal of Education, Humanities and Social Sciences, 10, 203-208. https://doi.org/10.54097/ehss.v10i.6920