
RESEARCH AND TEACHING

Assessing and Refining Group Take-Home Exams as Authentic, Effective Learning Experiences
By Corey M. Johnson, Kimberly A. Green, Betty J. Galbraith, and Carol M. Anelli

The learning goals of a lower division honors course, Science as a Way of Knowing, include critical thinking, scientific literacy, quantitative reasoning, communication, and teamwork. To help students develop skills and competencies for the course learning outcomes, we used a case study and developed scaffolded activities and assignments that targeted discipline-relevant tasks, for example, primary literature search, evaluation of source credibility, hypothesis construction, data interpretation, and restatement of scientific content into lay terminology. We then implemented group take-home exams, which feature rigorous, open-ended questions in authentic contexts, requiring students to apply knowledge and competencies cooperatively to new situations. Data from five semesters show that, in comparison to traditional exams, many students feel that group take-home exams reduce test anxiety, foster interpersonal skills, are more rigorous, and better enable them to apply and synthesize knowledge and deepen their comprehension of the subject matter. Our study augments research on group exams that use an open-ended response format.

Advocates for improving undergraduate science education call for pedagogies that engage students in relevant, "real-world" problem solving and cooperative learning (American Association for the Advancement of Science, 2011; Handelsman et al., 2004). From his literature review, including a meta-analysis of 164 studies of cooperative learning methods, Michael (2006, p. 162) concluded that "little doubt [exists] that students in groups learn more." Studies also report the effectiveness of group exams as a learning tool. Group exams that require critical and higher order thinking skills deepen contextualization and improve retention (Drouin, 2010; Michael, 2006; Zipp, 2007). Challenging exam questions promote student discussion, fostering communication skills (oral, written, listening) and facilitating learning, in part through students serving as peer instructors (Michael, 2006; Simkin, 2005). Group exams can reduce test anxiety (Morgan, 2005; Simkin, 2005; Zipp, 2007) and enable students to practice interpersonal skills such as collaboration, reciprocity, team building, troubleshooting, leadership, conflict resolution, and trust (Rao, Collins, & DiCarlo, 2002; Simkin, 2005; Zipp, 2007). Students can build on one another's academic and personal strengths and demonstrate enhanced motivation and engagement, behaviors driven by feelings of responsibility to the group (Cortright, 2003; Zipp, 2007).

Educators and learning experts increasingly view fixed-choice exams as limited measures of student learning because they create artificial situations that do not reflect learners' responses in real-world situations (Oakleaf, 2008; Simkin, 2005). Similarly, hourly or "midterm" exams fall short because they impose unrealistic time limits and often do not target higher level cognitive skills. In contrast, performance-based tasks that simulate real-life application of skills, knowledge, and competencies enable assessment in authentic contexts (Mueller, 2012; Oakleaf, 2008). By emphasizing what students can do with what they know, authentic assessment complements traditional approaches that emphasize what students know about a body of knowledge.

The literature on group work, together with that on authentic assessment, suggests that group take-home exams could provide both a valuable learning experience and a means to assess student performance on real-world tasks. The authors had the opportunity to experiment with such exams when our honors college adopted a new curriculum in 2008, in which introductory courses focus on scholarly literature use in preparation for the required thesis. (Honors courses fulfill general education requirements for the baccalaureate degree.)


We were asked to develop a new course, Science as a Way of Knowing, with class size limited to 25 lower division students. Nonscience majors constituted 20%–25% of course enrollment. Our team comprises a subject expert (professor and instructor of record), an assessment specialist, and two instruction librarians, one specializing in science. Earlier assessment efforts (Johnson, Anelli, Galbraith, & Green, 2011) guided our pedagogy, and our literature review yielded no published reports on exams like ours. We hypothesized that group take-home exams, which groups complete entirely outside of class and which feature authentic, discipline-relevant tasks, could relieve the time constraint of hourly exams and provide real-world problems while fostering learning in and outside of class. Such exams embody four of seven long-recognized essentials of good practice in undergraduate education: time on task, communication of high expectations, active learning techniques, and reciprocity and cooperation (Chickering & Gamson, 1987). They also reflect principles of good practice for effective assessment developed by representatives from across the educational community (American Association for Higher Education, 1996). Here we report our findings, from five semesters of teaching the course, on the implementation and refinement of group take-home exams used in lieu of midterm or hourly exams.

Methods of instruction and assessment
We used backward design (Wiggins & McTighe, 2005) to align course learning goals and outcomes (sample syllabus, Appendix A, www.nsta.org/college/connections.aspx) with the skills and competencies emphasized on exams. Because our students comprise a spectrum of backgrounds, majors, and interests, we created appropriate activities emphasizing information and scientific literacies, critical thinking, quantitative reasoning, communication, and cooperative work. Lectures and discussions focused on scientific database use, credibility of sources, critical reading, interpretation, and evaluation of primary scientific articles, and on evolutionary theory and the history of biology. Gillen (2007) served as a supplemental "how to" guide on the structure and stylistic conventions of primary scientific articles. To move students progressively through the skills and competencies emphasized on exams, we had them work in small groups, in and outside of class, to complete scaffolded assignments and several question sets (samples, Appendix B, www.nsta.org/college/connections.aspx) that included a case study (Hollister), most of which scrutinized Moran (2004), and we provided timely written feedback (Fink, 2003).

Our exams feature rigorous, open-ended questions in authentic contexts and require students to apply knowledge and competencies cooperatively to new situations. Exams featured a mix of brief essays and short answers (sample exam, Appendix C, www.nsta.org/college/connections.aspx), requiring students to search the primary literature (contemporary and historical), evaluate the credibility of sources, interpret data, and restate scientific content and theories into lay terminology. For each exam the instructor selected a primary research article for its conceptual accessibility, minimal use of disciplinary jargon, and rigor and design (i.e., a test of predictions of evolutionary theory using several experimental approaches), an article to which students had not been previously exposed and whose subject matter had not been discussed in class.

The instructor assembled students into small groups (three to five members each), comparably mixed, and assembled new groups for each exam. On the day that students gained access to the exam, she allowed 10–15 minutes of class time for groups to meet and make organizational plans. Within the course online learning management system (LMS), each group had its own workspace, accessible only to its student members and the current authors. The instructor monitored groups' progress informally by querying groups during class or via the LMS and intervened at her discretion when asked. She exhorted students to be "good team members," to take ownership of their working relationship, and to contact her with unresolvable group issues/questions; most groups did their own troubleshooting. In addition, for each exam the instructor posted a dissension form for students who wished to submit their own answer(s) if they disagreed with their groups' answer(s); no student ever used the form. When the deadline for the completed exam expired, the instructor posted the answer key (sample grading key, Appendix D, www.nsta.org/college/connections.aspx) and devoted the next class period to discussion of the exam, facilitated by visual projection of the grading key.

We typically administered two group take-home exams per semester, in Weeks 6 and 11 of a 15-week semester. Groups had 7–10 days to complete each exam, depending on semester, and all members of a group received the same exam grade. Depending on semester, the group take-home exams contributed 16% ± 3% (M ± SD) of the overall course grade. Students also took an individual, closed-book final exam, and individual effort (attendance, exam completion and participation, and various assignments) contributed to the overall course grade as well (see syllabus, Appendix A, www.nsta.org/college/connections.aspx); individual effort was not factored into group exam grades. Beginning with fall 2010, we implemented a series of changes to the exam process (detailed below), including a group contract (Figure 1).

To assess students' attitudes, we used pre- and postcourse anonymous questionnaires (Appendix F, www.nsta.org/college/connections.aspx; also see Appendix E), honors college anonymous end-of-course evaluations, postexam self- and peer-performance forms (based on criteria in Isaacs, 2002), which the instructor used to assess group dynamics (student grades were not impacted), and exam scores (generated by the instructor using grading keys providing a detailed breakdown of point allocations for exam questions). For Likert-scale prompts, we pooled data from all five semesters that we taught the course: fall 2008 (initial course offering), spring 2009, fall 2010, fall 2011, and fall 2012. For open-ended prompts as to the positive and negative aspects of group take-home exams, we coded and summarized the data by semester and across all five semesters; problematic questions revealed by student feedback sometimes led to improvements in how future exams were written. We did not track individual students in pre–post pairings. Our institutional review board approved our protocols and instruments prior to implementation.

FIGURE 1
Template for small group take-home exam contract.

GROUP CONTRACT FOR EXAM #___ (COURSE NAME & NUMBER, DATE)
GROUP NAME ______________________________
We, the undersigned, have together devised and agreed to an initial plan (below) for working on the exam (insert initial deadline(s) for work to be shared via Google docs).
Plan: (Group inserts details here)
Each of us also agrees to do the following:
1. Attend group meetings (virtually or in person)
2. Maintain contact w/group members
3. Communicate constructively to group discussion & answers
4. Encourage and assist my team members
5. Take a leadership role as needed
6. Complete all tasks agreed upon by the group on time
7. Complete/upload my exam portion and share it with the team by (date, time)
8. Be cooperative and understanding
9. Notice and work to curtail whatever tendencies I may sometimes exhibit that others may perceive as uncooperative
10. Read, comment on, and edit the entire exam by the time agreed upon by the group
11. Ensure that the exam final version is ready to be uploaded by (date, time)
We will indicate individual contributions (tracking, color coding, etc.) and contact (instructor name) ASAP with concerns.
If any one of us causes difficulty with the group and/or breaks the contract in any way, we understand that the other team members have the right and are expected to contact (instructor name) and inform her of the situation. We further understand that (instructor name) will serve as arbiter and may decide to penalize the teammate in question by lowering his/her grade in accord with the situation or making the teammate complete the exam alone.
Signed,
___________________________ ____________________________
___________________________ ____________________________

FIGURE 2
Pre- and postcourse questionnaire responses regarding group work and communication (Likert-scale responses).

Our experiences and formative assessment over five semesters led us to implement changes to improve the group-exam process: (a) designate a group leader to keep members on task and assemble their work (we assigned leaders randomly, and no student served more than once); (b) schedule the exam over a 10-day period (instead of 7), spanning two weekends (ours is a rural campus and students' cocurricular activities often occur at distant locations; this change mitigated stress); (c) require groups to submit a completed contract detailing plans for exam completion and participation, with frequent updates (Figure 1); and (d) implement a draft deadline on Day 7 for groups' answers to be uploaded to their LMS site (implemented with Exam 2, fall 2012). Assessment of improvement based on these specific changes is not possible because, given our enrollment, we taught only one course section per semester and had no "nontreatment" group. Although these changes were important to the group exam experience, the fundamental structure of the process remained the same throughout the five semesters.

Results

Pre- and postcourse anonymous questionnaires
At the course outset, the majority of students indicated that they were comfortable working in groups, and most appeared motivated to gain experience in oral and written communication of scientific research (Figure 2a and b). Most postcourse respondents felt they had made gains in these areas (Figure 2c and d).

We also asked students whether they preferred exams that ask for information learned/memorized or exams that require the application of skills or knowledge. At the course outset, 41% of students expressed neutrality on exam preference versus 32% in postcourse analyses. The data did not indicate a statistically significant shift in the means, but there was a significant difference in the distribution of responses between pre- and postcourse observations (cross-tabulation analysis: χ2(4) = 10.24, p < .05; Figure 3). Apparently a comparable percentage of students pre- versus postcourse preferred learned/memorized exams, whereas the percentage of students who dissented from that preference increased by 13% in pre- versus postcourse data (33% vs. 20%).

FIGURE 3
Comparison of exam-type preference, pre- versus postcourse.
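For readers who wish to run this kind of cross-tabulation analysis on their own course data, the short Python sketch below shows one way to compute a chi-square test on a pre/post-by-response contingency table. It is only an illustration: the counts are hypothetical placeholders, not the questionnaire data reported above, and the article does not state which software the authors used.

```python
# Illustrative sketch of a cross-tabulation (chi-square) test comparing the
# distribution of exam-type preference responses before and after a course.
# The counts below are hypothetical, not the study's data.
from scipy.stats import chi2_contingency

# Rows: precourse, postcourse.
# Columns: five Likert categories, from "strongly prefer learned/memorized"
# to "strongly prefer application of skills or knowledge".
observed = [
    [10, 12, 31, 15, 9],   # hypothetical precourse counts
    [6, 9, 25, 21, 16],    # hypothetical postcourse counts
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```

With a 2 x 5 table, the test has 4 degrees of freedom, matching the χ2(4) statistic reported for the pre- versus postcourse comparison.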

A set of six questions in the postcourse questionnaire targeted students' perceptions of our group take-home exams compared with traditional exams (hourly, in-class, closed-book, completed individually). Fifty-five percent of respondents indicated that group take-home exams were more rigorous than traditional exams, compared with 15% who dissented from that view (31% being neutral). Student perceptions were split on whether take-home exams required more (40%) or less (35%) individual effort than traditional exams (Figure 4a). The vast majority reported that group take-home exams enabled them to apply and synthesize knowledge, deepened their comprehension, benefitted their interpersonal skills, increased their awareness of course relevance, and fostered transfer of skills in information and science literacies, quantitative reasoning, critical thinking, and communication about science to new situations (Figure 4b).

FIGURE 4
Postcourse questionnaire responses regarding group take-home exams.

Course activities and resources
Our postcourse questionnaire included a list of course activities and resources from which we asked students to select those that helped them learn. Of 77 respondents, 71% (N = 55) selected the entry "group work for exams," making it the fourth most selected item after PowerPoints (84%), class discussions (84%), and the textbook (80%; data not shown).

Exam scores
On our grading scale, group take-home exam grades over five semesters (10 exams) averaged B+ to A– (M ≈ 88%), whereas final exam grades (closed book, completed by individual students) averaged B to B+ (M ≈ 82%). No exams were graded on a "curve."

Honors course evaluations
Students completed honors college anonymous end-of-course evaluations for three semesters (fall 2010 to fall 2012). For informational (not comparative) purposes, respondents (N = 58) indicated that they had improved at collaborating with classmates (62.1% strongly agreed / 32.8% agreed) and that the course had positively impacted their skills in critical thinking (64% strongly agreed / 36% agreed), writing (33% strongly agreed / 59% agreed), and quantitative reasoning (53% strongly agreed / 36% agreed; data not shown).

Comments on group take-home exams
Postcourse questionnaires prompted students to "comment on the positive and/or negative aspects of group take-home exams." In five semesters' data, 43 of 77 respondents (56%) yielded a pool of 32 negative and 41 positive comments, which we categorized and summarized (Table 1). Most negative comments focused on group members' unequal contributions to the exam (quality and/or quantity), followed by exam length/workload. The largest number of positive comments cited various benefits of the group experience, followed by the learning quality afforded by group exams. The largest positive-to-negative comment ratios occurred in the last two semesters, by which time some or all improvements to the exam process (see Methods section) had been implemented; however, analysis showed no significant trend in positive-to-negative comment ratios in successive semesters (logistic regression: χ2(1) = 1.73, p = .19).

TABLE 1
Student response rates on postcourse questionnaires, with distribution and coding of open-ended comments about the group take-home exams. Negative comment codings: (1) unequal group sizes and individual contributions; (2) group dynamics issues; (3) criticism of exam, general and specific; (4) doesn't prepare for in-class, individual final exam; (5) worse than in-class exams; (6) requirement for draft exam on Day 7. Positive comment codings: (1) benefits of group collaboration; (2) more time to complete vs. in-class exam; (3) conducive to quality learning; (4) praise for exam, general and specific; (5) better than in-class exams. (Note: An individual student may have offered more than one comment.)
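The logistic-regression trend test reported above can be reproduced in outline with the sketch below: each coded comment becomes one observation, its polarity (positive = 1, negative = 0) is regressed on semester index, and the trend is summarized with a 1-df likelihood-ratio chi-square. The comment data here are invented placeholders, not the study's comment pool, and the modeling choices are assumptions for illustration only.

```python
# Illustrative sketch: test whether the odds of a comment being positive
# change across successive semesters, using a likelihood-ratio chi-square
# from a logistic regression. All data below are hypothetical.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

# One row per coded comment: semester index (0-4) and polarity (1 = positive).
semester = np.array([0] * 15 + [1] * 14 + [2] * 15 + [3] * 14 + [4] * 15)
positive = np.array([1] * 8 + [0] * 7 +    # semester 0: 8 positive, 7 negative
                    [1] * 7 + [0] * 7 +    # semester 1
                    [1] * 9 + [0] * 6 +    # semester 2
                    [1] * 9 + [0] * 5 +    # semester 3
                    [1] * 8 + [0] * 7)     # semester 4

# Full model (intercept + semester) versus intercept-only null model.
X = sm.add_constant(semester.astype(float))
full = sm.Logit(positive, X).fit(disp=0)
null = sm.Logit(positive, np.ones_like(semester, dtype=float)).fit(disp=0)

lr_stat = 2 * (full.llf - null.llf)   # likelihood-ratio statistic, 1 df
p_value = chi2.sf(lr_stat, df=1)
print(f"chi2(1) = {lr_stat:.2f}, p = {p_value:.2f}")
```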

Postexam self- and peer-performance forms
Students rated their own and each group member's performance on exams according to 12 criteria using a Likert scale (1 = very well, 2 = adequate, 3 = poor). Eighty-six students completed self-performance forms, yielding a total of 1,032 ratings (86 × 12). The vast majority of students rated their own efforts highly (very well = 815 [79%], adequate = 183 [18%], poor = 23 [2%], no response = 11 [1%]). The criteria with the most 3s (poor) were "I took a leadership role as needed" and "I read and commented in a timely manner on drafts of the exam." Peer-performance forms (N = 260) yielded 3,120 ratings (260 × 12), with aggregate ratings mirroring those for self-performance forms (very well = 2,609 [84%], adequate = 415 [13%], poor = 87 [3%], no response = 9 [<1%]). The criteria with the most 3s (poor) were "Read and commented in a timely manner on drafts of the exam" and "Encouraged and assisted other group members."

Performance forms also prompted students to state positive and negative things about the group exam and to suggest improvements. Of 81 students who completed the forms, 83% responded to the prompts, yielding a total of 41 negative and 85 positive comments (Table 2). To categorize and summarize students' comments, we used the same criteria as those for Table 1. The largest number of positive comments cited the greater time to complete the group take-home exam versus an in-class exam, followed by benefits of the group experience (e.g., collaboration/brainstorming, sharing of workload, knowing classmates better, greater exposure to learning styles, practice with conflict resolution, peers' work habits/behavior). In contrast to the postcourse questionnaire data in Table 1, the largest number of negative comments focused on group dynamics (e.g., problems with scheduling, challenges dividing workload, leadership), followed by general and specific criticism of the exam (e.g., certain exam questions, word limits, authenticity of tasks).

TABLE 2
Student response rates on postexam performance forms, with distribution and coding of open-ended comments about the group exam. Negative and positive comment codings follow the categories listed in Table 1. (Note: An individual student may have offered more than one comment.)

Discussion
The abilities to evaluate and make sense of scientific findings and to interact productively with others represent critical life skills and embody our course learning outcomes. We were eager to explore group take-home exams as impactful student learning experiences and as tools to facilitate assessment and evaluation of students' ability to apply science knowledge and skills collaboratively to authentic, real-world tasks, but we were cognizant of challenges. Honors students enter as highly engaged individuals; they are academically motivated and have good track records as independent achievers. In addition, many are still adjusting to the demands of college coursework, and some students in our lower division course are nonmajors whose attitudes toward science can dampen their interest. Initially our students expressed anxiety that their semester exam grades would reflect their group's collective efforts, partly because they lacked experience with this type of exam, and their busy lives presented scheduling difficulties. Yet most students left the course feeling more comfortable about performing group work, and the majority felt that group take-home exams enabled them to apply and synthesize knowledge, deepened their comprehension of the course material, made them more aware of its relevance, and helped them hone interpersonal skills. The vast majority also indicated that our course positively affected their critical thinking, quantitative reasoning, and science communication skills, and their self-reported gains suggest that course learning outcomes were adequately met (Anelli et al., 2011). Challenges notwithstanding, our results suggest that group take-home exams can provide positive learning experiences and simultaneously facilitate assessment of students' performance on real-world tasks.

As anticipated, challenges surfaced and students voiced complaints and concerns, but we strove to address issues proactively, as group take-home exams and related group work constituted a considerable portion of course activities and final grade computation, accounting for 30% of students' course grades. We do not see challenges as reason to avoid group exams: students will undoubtedly encounter undesirable peer behaviors throughout their lives, and with practice they can improve at working cooperatively. Teaching and assessment expert Maryellen Weimer (2002) urged instructors not to ask whether students "liked" a particular activity: "That is an irrelevant criterion. . . . The questions you need answered are these: 'How did that activity affect your learning?' 'What about it needs to change so that if we do it again, you will learn more?'" (p. 199).

Students' open-ended comments, on both postexam performance forms and postcourse questionnaires, provide insight into their perceptions of group take-home exams. The types of positive and negative comments were comparable across the two instruments, but the relative rankings of comments differed between the two types of data. The top positive comment category on postcourse questionnaires was "benefits of group collaboration," whereas on postexam performance forms "greater time allowance" was the most popular commendation; similarly, negative comments on postcourse questionnaires focused on unequal group sizes and individual contributions, compared with group dynamics issues and criticism of the exam itself on postexam performance forms. Perhaps when the exam experience is relatively fresh, students value most the greater time allowance, because it relieved the pressure from individual performance and "cramming" for exam preparation and/or completion, whereas with the passage of time they may recall both the benefits of teamwork and their discontent with particular teammates. We do not overemphasize these data, but they do suggest that instructors should consider that students' feedback can vary over time.

It does appear that group take-home exams were on students' minds: "work for group exams" was the most prolific topic for comment among postcourse respondents who provided "one or two concrete suggestions" to improve the course, and almost three quarters of our students selected it as a course activity that helped them learn. One of the students described our exams as "fun," another claimed to "love them," and yet another called it "one of [his/her] best experiences for college exams." In the last two semesters that we taught the course, we received more positive than negative comments on both postexam performance forms and postcourse questionnaires, suggesting that the assessment-based changes we implemented helped improve the exam experience. The ratio of positive-to-negative comments on postexam performance forms trended toward increasing over the last three semesters, although this trend was not significant (logistic regression analysis: χ2(1) = 1.63, p = .20). The most positive comments for fall 2012 Exam #2 (32%, N = 10) cited implementation of the Day 7 draft deadline.

On the basis of five semesters' worth of data, our findings lend support to the positive impact of cooperative work. Group take-home exams did not compromise rigor: about half of our students felt the exams were more rigorous than traditional exams, some expressed excitement that the exams required critical thinking and application of knowledge, and some characterized the exams as a great way to learn and retain information. Still other students believed that the group exam ill-prepared them for the final, which we administered all five semesters as an individual, closed-book, in-class exam that required application of knowledge and skills per the syllabus and included higher order questions. Groups generally performed well on take-home exams but seldom achieved 100%, and students earned on average 5 percentage points less on the final exam versus group take-home exams. To address students' concern, we provided a study sheet that showed point allocations for the various topics and types of questions that would appear on the final, and we embedded questions in group exams comparable to those on the final. Others suggested that we account for individual effort by giving students individual grades plus the group's grade, an approach many instructors use (Rao et al., 2002; Simkin, 2005). This would have significantly increased our grading effort, and we chose not to do so because we wanted to incentivize students to have fruitful discussions, practice concise writing, learn conflict resolution, and hone science communication skills (Anelli, Johnson, Galbraith, & Green, 2015).

On balance, group take-home exams offer plusses and minuses, and our experiences lead us to conclude that the positive aspects outweigh the negative. On the negative side, developing a group take-home exam required more effort than a traditional hourly exam, as did writing a student-friendly grading key for open-ended responses; locating suitable research articles that were relatively free of scientific jargon yet amenable to testing students' skills (as opposed to specific content knowledge) took time. Some students had difficulties communicating with group members and tracking the latest version of their group's exam, which they felt negatively impacted their learning, but we instituted changes to diminish procrastination and project-management issues, and through assessment we discovered that our students needed practice developing the habit of reviewing teammates' answers. Technological advances also facilitated group work: with the greater prevalence of mobile phones, texting, and students' familiarity with and use of Google Docs, these problems diminished from semester to semester. Plagiarism concerns were essentially nonexistent because, taking advantage of the format's flexibility, we designed creative (dare we say "fun"?) exam questions whose answers could not be "googled." The format affords complex, in-depth questions that target higher levels of understanding and application of scientific competencies and that can draw on an array of materials: the current scientific literature, scholarly essays, the New York Times Tuesday science section, YouTube videos, online archival materials, and so on. Scientific research in the media (including the "Ig Nobel Prizes") often provided leads to provocative questions, and such exams give students practice with cooperative skills and discipline-relevant tasks, a need increasingly drawing the attention of educators (Mervis, 2013).

Our Day 7 draft policy with a Day 10 exam deadline garnered many positive responses and improved collaboration: students used the "extra" time to review and edit teammates' answers and complained less about the perceived need to "divide-and-conquer" exam questions, and having many pairs of student eyes review each exam answer promoted proper idea attribution. Assessment also led us to designate a leader for each group, causing "lack of leadership" complaints to disappear. Most groups functioned well, and some excellently (e.g., "I love my group!"). Preparing a grading key with detailed point allocations before evaluating groups' answers promoted instructor objectivity, saved grading time, and helped provide feedback during discussion of the exam. Having only 5 or 6 exams to grade versus 25 represented a "plus," a consideration for large enrollment courses with significant grading demands.

Conclusions and recommendations
We have not found a published report on exams that feature authentic, discipline-relevant tasks for groups to complete entirely outside of class. We believe instructors can use such exams to measure and enrich student learning and the transferability of science knowledge and competencies, and we encourage others to experiment with and expand on this pedagogy. For instructors, we offer these recommendations. First, be clear and fair: State your exam policies and implement them consistently and fairly. Second, build motivation: Articulate your course learning goals to your students and explain how their attainment of those goals will be demonstrated by their performance. Third, build skills and give practice: Have students practice teamwork and self/peer evaluations on scaffolded, low-stakes activities, for example, homework assignments worth minimal points. Fourth, assess and guide: Assess students' collaborative work and skills, monitor misperceptions by informally asking how the exam is going, and provide prompt guidance and intervention as needed. Finally, refine: Use assessment recurrently to make needed adjustments to your exams and policies.

Acknowledgments
We acknowledge Dr. Erica Suchman, Colorado State University, whose 2008 presentation on group exams for her microbiology course inspired the instructor to experiment with group exams, and Dr. David Sloan Wilson, Binghamton University, who suggested the use of contracts for our exams.

References

American Association for Higher Education. (1996). Nine principles of good practice for assessing student learning. Washington, DC: Author. Retrieved from http://assessment.uconn.edu/docs/resources/AAHE_Principles_of_Good_Practice.pdf
American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: Author. Retrieved from http://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
Anelli, C. M., Johnson, C. M., Galbraith, B. J., & Green, K. A. (2015). Using group take-home exams to develop collaborative learning. Journal on Excellence in College Teaching, 26, 79–95.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3–7.
Cortright, R. N., Collins, H. L., Rodenbaugh, D. W., & DiCarlo, S. E. (2003). Student retention of course content is improved by collaborative-group testing. Advances in Physiology Education, 27, 102–108.
Drouin, M. A. (2010). Group-based formative summative assessment relates to improved student performance and satisfaction. Teaching of Psychology, 37, 114–118.
Fink, L. D. (2003). A self-directed guide to designing courses for significant learning. San Francisco, CA: Jossey-Bass.
Gillen, C. M. (2007). Reading primary literature: A practical guide to evaluating research articles in biology. San Francisco, CA: Pearson/Benjamin Cummings.
Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., . . . Wood, W. B. (2004). Scientific teaching. Science, 304(5670), 521–522.
Hollister, C. A case study on the politics of information. Buffalo, NY: National Center for Case Study Teaching in Science. Retrieved from http://sciencecases.lib.buffalo.edu/cs/index.html
Isaacs, G. (2002). Assessing group tasks. Queensland, Australia: Teaching and Educational Development Institute, University of Queensland. Retrieved from http://www.itl.usyd.edu.au/assessmentresources/pdf/Link11.pdf
Johnson, C. M., Anelli, C. M., Galbraith, B. J., & Green, K. A. (2011). Information literacy instruction and assessment in an honors college science fundamentals course. College & Research Libraries, 72, 533–547.
Mervis, J. (2013). Transformation is possible if a university really cares. Science, 340(6130), 292–295.
Michael, J. (2006). Where's the evidence that active learning works? Advances in Physiology Education, 30, 159–167.
Moran, A. L. (2004). Egg size evolution in tropical American arcid bivalves: The comparative method and the fossil record. Evolution, 58, 2718–2733.
Morgan, B. M. (2005). Cooperative learning in higher education: A comparison of undergraduate and graduate students' reflections on group exams for group grades.
Mueller, J. (2012). What is authentic assessment? Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm
Oakleaf, M. (2008). Dangers and opportunities: A conceptual map of information literacy assessment approaches. portal: Libraries and the Academy, 8, 233–253.
O'Dowd, D. K. (2011). How to write good multiple-choice exam questions.
Rao, S. P., Collins, H. L., & DiCarlo, S. E. (2002). Collaborative testing enhances student learning. Advances in Physiology Education, 26, 37–41.
Siciliano, J. I. (2001). How to incorporate cooperative learning principles in the classroom: It's more than just putting students in teams. Journal of Management Education, 25, 8–20.
Simkin, M. G. (2005). An experimental study of the effectiveness of collaborative testing in an entry-level computer programming class. Journal of Information Systems Education, 16, 273–280.
Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. San Francisco, CA: Jossey-Bass.
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Upper Saddle River, NJ: Pearson Education.
Zipp, J. F. (2007). Learning by exams: The impact of two-stage cooperative tests. Teaching Sociology, 35, 62–76.

Corey M. Johnson (coreyj@wsu.edu) is head of library instruction, Kimberly A. Green is director of the Office of Assessment, and Betty J. Galbraith is science librarian and instruction coordinator, all at Washington State University in Pullman, Washington. Carol M. Anelli is a professor and associate chair in the Department of Entomology at The Ohio State University in Columbus, Ohio.
