There is a growing need for assessments that evaluate higher-order thinking skills rather than mere memorization, particularly in STEM classrooms striving for inclusivity and equity. Authentic assessments, rooted in real-world contexts, address this need. This brief presents an example of integrating authentic assessment into an Environmental Education course. Student feedback indicates a preference for these assessments due to reduced stress, increased engagement, and skill development. Despite grading challenges, authentic assessments prepare students for professional demands, nurturing critical thinking and creativity.
Introduction
The current landscape of science, technology, engineering, and mathematics (STEM) education increasingly emphasizes the development of inclusive and equitable approaches, including how learning is assessed. Traditionally, higher education has relied on objective and standardized assessments focused on content memorization. However, previous research underscores students’ need to acquire transferable skills, warranting assessments that evaluate higher-order thinking beyond mere memorization.1,2,3 Additionally, the workforce faces a significant challenge as recent graduates often lack the necessary skills and adaptability to meet the demands of the professional environment. Employers express disappointment with graduates’ rigidity and inability to effectively problem-solve or communicate.2,4 This disconnect between academia and industry contributes to heightened stress among graduates, who feel unprepared and insecure as they transition into the workforce.2 Addressing these issues calls for assessments that authentically incorporate relevant contexts, ways of thinking, and feedback.5 These assessments, known as “authentic assessments,” offer a promising avenue for fostering more profound learning experiences.
Authentic assessments encompass three essential dimensions: realism, cognitive challenge, and evaluative judgment (see Figure 1).2 Hobbins et al. (p.1262) provide a definition rooted within these three dimensions: “Authentic assessment refers to a formally evaluated assessment activity which engages students with problems or important questions that are relevant to everyday life beyond the classroom; prompts students to use higher levels of thinking to extend knowledge and thinking, while also providing an opportunity to enhance self-regulated learning by engaging with grading criteria and providing and receiving feedback.”5 Realism describes the alignment of questions or tasks in the classroom with tasks one may face in one’s professional life.5 This dimension is especially relevant in performance-based tasks where students demonstrate their knowledge in a way that represents performance found in the workplace.2,6
Figure 1. Model to Build Authentic Assessment (modified from Villarroel et al., 2018)
Cognitive challenge refers to assessments requiring higher-order cognitive skills, as defined by Bloom’s taxonomy.7 The transfer of knowledge required for cognitive challenge goes hand in hand with the previously mentioned dimension of realism in that students need to be able to practice skills required of them beyond a traditional exam at a university. In other words, successfully memorizing information for a decontextualized exam does not indicate how well that same individual can utilize that knowledge when needed in the real world.2,8 Evaluative judgment refers to the need for students to judge the quality of their work and performance. There are two ways to accomplish this.2,5 First, students should utilize rubrics to evaluate their work before submission, ensuring alignment with assessment criteria. Second, students should engage in reflective practices, such as self-assessment and peer feedback, while instructors provide additional feedback post-grading.2,5
Description of Teaching Activity
The authentic assessment framework has been integrated into an Environmental Education (WFB 4200/6200) course that meets a communication or elective requirement for students pursuing degrees in Wildlife and Fisheries, Forestry, and/or Environment and Natural Resources. The lecture portion of the course is offered online asynchronously, with in-person lab sessions for 4000-level students and online labs for 6000-level remote students.
In Environmental Education (EE), students engage in project-based learning assessments to demonstrate their understanding of course material. Specifically, students are tasked with creating a video presentation for both their midterm and final exams. Students are provided with an outline (see Table 1) that includes the criteria for assessment, including content, delivery instructions, and adherence to time constraints. They are given a week to prepare their video presentation, which must be 20 to 25 minutes long. A ten-point grade deduction is applied if the presentation exceeds 25 minutes, reinforcing the importance of concise communication. Students use PowerPoint to create presentations and upload them as video files to Canvas, the learning management system (LMS). Students’ faces must be visible on-screen during the exam to maintain academic integrity, and collaboration among students is prohibited. Students also sign an academic integrity pledge. Closed captioning is optional, as only the TA and instructor watch the exam videos. The midterm accounts for 20% of the overall grade, and the final accounts for 22.5%. The TA grades the undergraduate exams, and the instructor grades the graduate section’s exams.
While traditional assessment methods, such as exams with true/false, fill-in-the-blank, multiple-choice, short answer, and essay questions, could achieve the student learning objectives (SLOs) outlined below, project-based learning assessments offer a more robust and comprehensive means of assessing student achievement.8 By engaging in real-world tasks and demonstrating their understanding through video presentations, students not only meet the SLOs but also develop valuable skills in communication, critical thinking, and creativity. Linking exam questions to SLOs is essential for quality assessment. Below, SLO 1 for this course is shown, along with Table 1, which includes prompts that link to the SLO.
The student will be able to:
(1) Identify the types of EE and describe the history of EE.
(2) Apply learning and behavior change theory to EE activities.
(3) Identify bias in EE and summarize the professional responsibilities of Environmental Educators.
(4) Create EE lessons that link to state and national standards and can be used for all types of learners.
(5) Develop assessment and evaluation tools for EE lessons and programs.
(6) Reflect on their own culture and examine different perspectives so that they can foster learning and promote inclusivity.
(7) Create an online community for the sharing of relevant topics from the course and support of peers.
Table 1. Example Outline for Environmental Education (EE), Modified Instructions and Module 1.
Discussion of Outcomes
Since implementing these assessment methods, several benefits have become evident. First, no students have requested points back, a common occurrence with traditional exams. Nor have there been any reported situations in which students claim to have studied but performed poorly. Detailed, transparent feedback also helps students understand why points were missed, leaving little room for confusion. Furthermore, accommodations are primarily addressed preemptively, with students given the flexibility to choose their environment and ample time to complete assessments (extra time and quiet environments being the most common requests). This proactive approach has contributed to a smoother assessment process overall.
Within the Environmental Education (EE) course, we sought direct feedback from students in the in-person 4000-level lab section to gain insight into their experiences. This process (approved under IRB2023-0521-02) involved handing out half-sheets of paper while the instructor stepped out of the room, ensuring anonymity. Students responded to the following questions:
- Do you prefer the EE exam structure (create a presentation based on an outline) or more traditional exams (multiple choice/short answer/matching)? Please explain why.
- Which test structure makes you feel more stressed?
- What do you like about the EE test format and why?
- What do you dislike about the EE test format and why?
The results, drawn from a thematic analysis of responses from 14 students, shed light on preferences, stress levels, and perceptions of the EE exam structure compared to traditional exams. All 14 students expressed a clear preference for the EE exam structure, which involves creating presentations based on an outline, over traditional exam formats. The reasons respondents cited encompassed several key themes:
1. Pressure Relief and Better Retention: Students indicated that the EE exam structure alleviated pressure compared to traditional exams, leading to better retention of the material.
2. Active Engagement and Critical Thinking: Students appreciated that the EE structure made them think more deeply about the class content.
3. Learning through Oral Communication: Presentations provide an opportunity to speak and explain the learned information, which can help solidify the understanding of the material.
4. Preparation for Real-World Skills: Creating presentations was viewed as a valuable skill-building exercise with real-world applicability.
5. Reflective Learning: The EE exam structure allowed for reflection on learned material, facilitating a deeper understanding of the content.
6. Variety in Assessment: Students appreciated the change of pace offered by presentation-based exams, particularly in majors where traditional exams are prevalent.
7. Incremental Presentations: While most respondents favored the presentation style, some suggested breaking it into separate presentations for each module.
The majority of students (13 of 14) indicated that traditional exams, characterized by multiple-choice questions and standardized formats, induced more stress than the EE exam structure. Some students mentioned that the EE structure felt less like an “exam” or “test,” suggesting a different perception of assessment.
Students highlighted several positive aspects of the EE exam format, including flexibility, focus on core concepts, learning through presentation, real-world relevance, learning enhancement, effective demonstration of knowledge, and reduced reliance on memorization and cramming.
However, some students also identified challenges or areas for improvement, such as a preference for receiving the outline earlier in the course, a desire for more freedom in presentation creation, dissatisfaction with the midterm structure covering multiple modules, the need for more time, and anxiety stemming from the unfamiliar format. Some students had trouble saving their presentations as video files and uploading them, a process that takes time and requires a stable internet connection. Despite prior warnings, many students waited until the final evening to submit their exams and experienced stress as a result.
In summary, students’ feedback indicates a strong preference for the EE exam structure due to its perceived benefits in understanding, engagement, and skill development. However, the responses also offer valuable insights for refining the format to better meet student needs and preferences.
Reflection on Outcomes
Existing research indicates that students often experience testing anxiety in courses that rely solely on high-stakes, summative assessments.9 However, incorporating authentic assessments allows instructors to deviate from traditional exams and essays, instead focusing on real-world tasks.10,11 For instance, presentation-based assessments, while still eliciting some anxiety, help students acclimate to the discomfort they may encounter when presenting in professional settings.12
Moreover, this study supports the notion by Lynam and Cachia13 that surface learning, aimed at passing traditional assessments, is becoming increasingly obsolete in a world where information is readily accessible. Therefore, educators must prepare students to excel as employees who possess problem-solving abilities, decision-making skills, information-sourcing capabilities, effective communication, and strong teamwork skills. It is no longer sufficient for students to acquire knowledge within their field. Educators can create learning environments that support this need by shifting away from traditional assessment practices and embracing authenticity in practice.
Discussion of Potential for Adoption in Other Courses
While many courses may already incorporate assignments with authentic aspects, our overarching goal is to shift away from reliance on traditional exams and prioritize authentic assessment as the primary method of evaluation in courses. By emphasizing authentic assessments and moving away from traditional exams, we can foster critical thinking and practical skill development among STEM students.
If study guides are already provided to students, they can be adapted into an outline of prompts like the example in Table 1. This approach may even be more straightforward than creating multiple-choice questions. Well-defined learning objectives for each module or lecture can also become the outline of topics to be covered in a project-based assessment; the learning objectives for each lecture were used to create the presentation-based exam outline for Environmental Education. This approach has also been used in a Restoration Ecology course, demonstrating its transferability: ask students to show what they know relative to the learning objectives of the lecture, module, or course. Preparing students for this type of assessment should include guidance on approaching project-based tasks, including research strategies, presentation design, and communication skills. Instructors may facilitate peer feedback sessions to help students practice giving and receiving constructive criticism, developing their evaluative judgment capabilities.
One common concern with authentic assessments, especially when students create videos, is that grading may require additional time. Fortunately, there are strategies to expedite the process, such as watching videos at faster playback speeds once you become familiar with the outline. Additionally, implementing more frequent formative assessments, such as after each module, rather than relying solely on two summative assessments, can result in shorter videos for grading. For larger courses, teaching assistants can also help speed up the grading process. Another option is to incorporate peer grading and feedback. This process could further solidify the content and build evaluative skills among students. This route requires transparency in grading criteria and monitoring processes to ensure fairness and accuracy in grading.
Having students create a video presentation is a simple way to adapt existing SLOs into an outline. However, there are endless possibilities for implementing these ideas in other classroom contexts. For example, in a Restoration Ecology course, students could be tasked with designing a restoration plan for a degraded ecosystem. They would need to consider and apply ecological principles, stakeholder perspectives, and communication strategies in their designs and could not simply memorize decontextualized facts. In a technical forestry course, instructors may opt for task performances, such as conducting a tree inventory, executing forest management techniques, or implementing sustainable logging practices. These practical approaches align with the needs of modern STEM education, which prioritizes not just knowledge acquisition but also the application of that knowledge in solving complex, real-world problems.
References Cited
- Shepard L. The role of assessment in a learning culture. Educational Researcher. 2000; 29(7):4–14.
- Villarroel V, Bloxham S, Bruna D, Bruna C, Herrera-Seda C. Authentic assessment: creating a blueprint for course design. Assessment and Evaluation in Higher Education. 2018 Aug; 43(5):840–854.
- Schultz M, Young K, Gunning T, Harvey ML. Defining and measuring authentic assessment: a case study in the context of tertiary science. Assessment & Evaluation in Higher Education. 2022 Feb; 47(1):77–94.
- Singh P, Thambusamy R, Ramly M. Fit or unfit? Perspectives of employers and university instructors of graduates’ generic skills. Social and Behavioral Sciences. 2014 May; 123:315–324.
- Hobbins J, Kerrigan B, Farjam N, Fisher A, Houston E, Ritchie K. Does a classroom-based curriculum offer authentic assessments? A strategy to uncover their prevalence and incorporate opportunities for authenticity. Assessment & Evaluation in Higher Education. 2022 Dec; 47(8):1259–1273.
- Palmer S. Authenticity in assessment: reflecting undergraduate study and professional practice. European Journal of Engineering Education. 2007 Jan; 29(2):193–202.
- Anderson L, Krathwohl D, Bloom B. A taxonomy for learning, teaching, and assessing: A revision of Bloom’s Taxonomy of educational objectives. 2001.
- Bloxham S, Boyd P. Developing effective assessment in higher education: a practical guide. Maidenhead: Open University Press. 2007.
- Kenyon C. Assessing what we value: interactions between student perceptions of assessments in the calculus classroom and their future-oriented motivation. Clemson University Digital Repository. 2023 May.
- Keppell M, Carless D. Learning-oriented assessment: A technology-based case study. Assessment in Education: Principles, Policy & Practice. 2007 Jan;13(2):179–191.
- Durning S, Dong T, Ratcliffe T, Schuwirth L, Artino A, Boulet J, Eva K. Comparing open-book and closed-book examinations: a systematic review. Academic Medicine. 2016 Apr; 91(4):583–599.
- Jones E, Priestley M, Brewster L, Wilbraham S, Hughes G, Spanner L. Student wellbeing and assessment in higher education: the balancing act. Assessment and Evaluation in Higher Education. 2020 Jun; 46(3):438–450.
- Lynam S, Cachia M. Students’ perceptions of the role of assessments at higher education. Assessment and Evaluation in Higher Education. 2018 Feb; 43(2):223–234.