
Best Practices

Examples of Best Practice in SAS Program Assessment

Departments and programs across the School of Arts and Sciences have implemented and reported on a broad array of assessments of their majors and minors, and there are many best practices we could cite as examples of evidence-based decision-making. In the reports for academic year 2015-16, many departments implemented changes to their programs based on their assessment evidence. A few notable examples are highlighted below.

Closing the assessment loop:  some AY 2015-16 examples

Criminal Justice: applying assessments to a program from start to finish

Criminal Justice has set standards for student achievement of its program goals at both introductory and culminating levels of its required curriculum, and looks for evidence of improvement as students progress through the major. At both ends of the sequence, assessment results have suggested modifications to course design and delivery to improve future student performance.

In the Criminal Justice introductory course, students are presented with case scenarios and prompted to interpret them. This requires them to demonstrate achievement of the program goals for (1) critical analysis of ethical issues, and (2) the ability to apply concepts and theories in specific criminal justice situations. The results of this assessment have revealed variation in student performance across different sections of the course. In AY 2014-15, the poorest student performance was in the section of an instructor teaching the course for the first time. Sharing the assessment results with this instructor and providing additional support helped to narrow the student performance gap. This year, the assessment results showed that students enrolled online achieved less satisfactory results overall than those in face-to-face sections. The department is now exploring the reasons for this difference.

The Criminal Justice goals were assessed at the other end of the program sequence as well. A capstone-level rubric defines desired outcomes for the goals at or near completion; it was used to score final papers in multiple 400-level courses for research design and critical thinking skills, critical command of criminal justice theory, and knowledge of the institutions and policies of the criminal justice system. Overall, results were very strong, though weaknesses were found in student achievement of research methods goals. This confirms past observations about the disadvantage of relying on research methods courses offered in other departments, which led Criminal Justice to develop its own methods course. Instructors involved in its design will develop an appropriate assessment for the course, and follow-up assessments in the 400-level courses will explore how effectively this close-the-loop action narrows the gap in student learning outcomes over time.

Geography: identifying steps for improvement through curriculum mapping

The Geography Department has a 3-year assessment cycle of successive review of each of its three major tracks.  In AY 2015-16, they developed a curriculum map for the track in Global Cultures, Economics and Society, aligning all required and elective courses of the program curriculum with specific program goals, and developing rubrics to assess student learning outcomes for those goals.  The three learning goals for this track were directly assessed in a 400-level capstone-level course required of all Geography majors.  In a dedicated class session, students completed an essay writing assignment designed to prompt for their ability to synthesize and examine critically a range of contemporary issues, and identify and apply appropriate analytic tools and models.  

All the essays were scored by a single reviewer, using the program rubrics.  Most students demonstrated satisfactory or better achievement of the goals, but the results for one program goal suggested room for improvement, with few students performing above a satisfactory level.  In response, the department scheduled a meeting in the Fall 2016 semester for the instructors teaching those courses identified in the curriculum map as specifically addressing this goal:  they will discuss how to improve the student learning outcomes in this area, closing the assessment loop by incorporating the results into decisions about course content and delivery.

In AY 2016-17, the Geography department will move on to a similar review of the  Environment track of the major, with development of a curriculum map and program-specific assessment rubrics, and the implementation of direct assessments in a capstone-level course of that program.

Marine & Coastal Sciences: using assessment results to modify teaching strategies

Marine & Coastal Sciences (SAS & SEBS) assesses student achievement of program learning goals in the capstone courses for its four major options. The results for AY 2015-16 revealed that many students in the Marine Biology option were not meeting the program benchmarks. Student comments on course evaluations corroborated these results: many reported feeling overwhelmed by the amount of material in the course. The department plans to revise the capstone-equivalent course, along with the content, delivery, and required sequence of this program track. The modifications will update and re-apportion course content and the associated student workloads to improve student mastery of the Marine Sciences learning goals and to sustain students' progress to successful completion of the program.


Other SAS Best Practices

Below are a few other notable examples of how SAS programs are being strengthened through student learning outcomes assessment.

Developing a Capstone Course through ‘Backward Design’ From Program Learning Goals (Africana Studies)

The program learning goals of Africana Studies emphasize readiness for postgraduate success both professionally and academically, and as part of their effort to ensure such student outcomes, they have integrated direct assessment across their curriculum, including such program elements as internships, service learning, and study abroad. Most notably, they created a new culminating experience for their majors: a capstone seminar explicitly designed around the benchmarks they have established for student achievement of the program learning goals at completion of the Africana Studies major. The course design features a comprehensive series of formative assessments in structured tasks and assignments that directly address improvement of student outcomes on these goals. In addition, they have linked each student learning outcome with at least one specific course or other element of the required curriculum leading up to this capstone, and developed benchmarks for achievement at each level, to be measured using common rubrics. Africana Studies has surveyed its graduates on how the major’s required and elective courses, internships, and service-learning experiences contribute to their readiness for professional success and/or graduate study, and the responses are used to inform their analysis of the direct assessment results as they consider possible modifications to their courses and curriculum.

Augmenting the Culminating (Capstone) Experience to Improve Outcomes and Assessment (Comparative Literature)

Comparative Literature has generated multi-year assessment results from a culminating experience for all its majors. A senior capstone workshop was created to support student performance near program completion and to serve as a site for assessment of the content knowledge learning goals for the major. Mandatory for all graduating seniors not completing an Honors Thesis, this one-credit course requires the completion of a research paper and a reflective essay on the student's experience of the major. Results of these direct and indirect assessments have led to changes in the department's advising structure; revisions to the requirements for the senior research paper; and changes to the content and delivery of the workshop itself, with the aim of further improving student outcomes, clarifying expectations for the work submitted, and prompting more useful feedback from students.

Involving Majors in the Program Assessment Process (Classics)

The Classics department has taken the approach of involving their graduating seniors actively in their program assessment process.  They developed a set of benchmarks for student performance on their program learning goals at or near completion, and have used the resulting rubrics in direct assessments of a sample of student work from their upper-level courses. Graduating seniors are asked to submit a set of essays representative of their work in the major, and a panel of faculty members score student attainment of the desired outcomes in this work. The department uses the results of these assessments both to inform their curriculum revision processes and to better understand the match between their own expectations and those of their students.  In response to the results, Classics has implemented a new strategy to increase student participation in study-abroad and other co-curricular options that are closely aligned with student achievement of their program learning goals.

Using Year-to-Year Comparisons to Track Improvements in Student Outcomes (Art History)

Art History has multi-year assessment results for its program learning goals in all the courses of its culminating sequence, including its capstone-equivalent junior-senior seminars, and has used these assessment results to guide program and course revisions intended to improve student learning outcomes on the program’s competency goals.  A standard assessment rubric (below) is used by multiple assessors to score student work from all these courses, and to generate year-to-year comparative data tracking changes in aggregate student performance on the learning goals at or near program completion.  Modifications based upon the analysis of these results have included new pedagogical approaches in the culminating experience courses; curriculum revisions and the addition of co-curricular support to address identified areas of weak student performance; and mechanisms for better communicating program and course learning goals to students, disseminating the assessment rubrics and providing guidelines for how to achieve improved outcomes in all department courses.

Each criterion is scored on a three-point scale: Yes (the student achieved a high degree of competence in this area); Somewhat (the student needs work in this area); or No (the student did not demonstrate competence in this area).

Did the student demonstrate critical thinking?
Did the student use appropriate sources?
Did the student construct a historical and theoretical argument?
Did the student demonstrate skill in visual analysis?

Evidence-Based Decision-Making About Courses & Curriculum - Closing the Loop (Cell Biology & Neuroscience)

Cell Biology & Neuroscience continued its strong, comprehensive program assessment work, which has steadily advanced to incorporate direct assessment in all the required and elective courses of the program. Close-the-loop actions over time have included revisions to program advising processes to match students more effectively with research faculty; review and modification of required courses identified as duplicative in topical content; development of new courses for non-majors and increased research opportunities for majors; and course scheduling revisions to improve student progress to degree completion, especially addressing the scheduling issues of double majors (a significant percentage of CBN students).

CBN now has multi-year direct assessment data on student learning outcomes at program completion, scoring student performance on research and communication competencies in all its capstone courses with a set of five customized rubrics based on its program learning goals. These direct assessment results have been analyzed in conjunction with registration and graduation data, which reveal significant improvements in timely progress to graduation as well as improved student performance in advanced courses. The analysis is augmented with indirect assessments of the revised courses and curriculum, derived from an exit survey of graduating majors and linked explicitly to program goals for post-graduate success.

Measuring Student Preparation for Professional Success:  Linking a Capstone-Level Co-Curricular Experience to Achievement of Program Learning Goals (Kinesiology and Health)

A program learning goal for Kinesiology and Health is that at graduation, all students will be prepared to immediately enter relevant careers, and be qualified for further graduate studies.  To this end, seniors complete a capstone-level professional internship in an approved professional agency relevant to their specific major option (one of three health sciences tracks, or Sport Management).  Upon the completion of this work-study experience, internship supervisors complete an online survey that includes specific questions on student achievement of the learning goals for the major.  The survey also gathers information on the number of years of experience with internships; the number of interns from Rutgers and other universities that were supervised; how Rutgers majors perform relative to all interns, and to professional standards of practice; and other general feedback on the interns and their readiness for employment in the field.  This calibration of ESS student outcomes with those of comparable programs, in addition to direct evaluation of student performance in a professional setting, is valuable information in the department’s analysis of the degree to which their students are fulfilling the expectations for achievement in each program track.

Revising Program Curriculum, Courses & Entrance Standards – and Faculty Ownership of Assessment (Italian)

The Italian Department has developed and implemented direct measures of student oral and written communication across its advanced courses, scoring specific exam questions with standard rubrics. The results have informed decisions to change the curriculum and the structure and content of required courses; standardization of course evaluations at each level of the curriculum; and creation of a new capstone seminar, with direct assessments built into its design. Assessment results led the department to revise its language placement process, developing a new placement test that was piloted this year; a follow-up assessment of student learning outcomes will measure the results of the modified assessment process and curriculum in the next academic year. Italian has been remarkably successful in involving all its faculty in assessment activities, using such tools as a Sakai site for sharing departmental assessment results and reviews, and regular communication about both program and Core learning goals assessment.

Augmenting Direct Assessment Results with Other Student Progress Measures (Mathematics)

Mathematics has implemented direct assessments in each capstone or "capstone analogue" course for its majors, and in introductory courses that are prerequisites for multiple majors, including its own. Common test items and workshop problems have been used in pre-/post-tests, with quantitative results scored by multiple assessors using uniform rubrics. While developing this framework for longitudinal direct assessment of student learning outcomes over the program curriculum, Mathematics has used the assessment results to guide revisions to its courses, advising, and scheduling processes. It has modified the assessment process itself, changing the selection of assessment prompts in order to generate information on how effectively different course elements foster the transfer of improved student skills from one task to another. Direct assessment results have been used in conjunction with an analysis of course grades broken out by student class year, and data from their enrollment/special permission requests system, to assess the impact of delays in access to required courses on student acquisition of learning skills and retention of content relevant to the program goals. Based on this analysis, the department has revised its process for allocation of course seats, with follow-up assessments to measure improvements in student learning outcomes on the program goals.

Direct and Indirect Assessment Across the Curriculum, with a Developmental Focus (Molecular Biology & Biochemistry)

Based on the results of direct assessments of student performance in their research experience course sequence, Molecular Biology & Biochemistry has revised its curriculum to better prepare students for advanced study and career exploration; modified the capstone to focus on student performance of oral and written presentations of research; and created new required courses. The department uses a common rubric to collect assessments of student learning outcomes in its courses. The rubric identifies tools suitable for assessment of each learning goal, and student outcomes are ranked on a formative (developmental) scale from “developing” to “exemplary.” At all levels of the curriculum, required research projects are assessed on the program learning goals for research content, critical analysis, and effective communication, and scored on uniform criteria by multiple assessors using this formative ranking scale. The department employs indirect assessments to augment its analysis of these direct assessment results: an exit survey of graduating majors includes questions on the effectiveness of the revised curriculum sequence and requirements (courses, research projects), in promoting achievement of the learning goals and timely progress to graduation. In addition, student responses on SIRS are used to assess the impact of course content and delivery, and to identify areas for possible further action. The MBB department has used assessment results to revise the department’s advising guidelines and required advising meetings in each semester, incorporating career preparation activities in the junior and senior years of the major.

Piloting, Assessing and Analyzing the Use of an External Objective Metric for Program Goals Assessment (Psychology)

The Psychology department implemented a pilot direct assessment of its program learning goals using the ETS Major Field Test in Psychology, taking advantage of the availability of this external objective metric to benchmark their student learning outcomes against a nationally-normed sample. An analysis of the test results and the test instrument itself revealed some significant mismatches between the MFT content and the program learning goals, as well as with other measures of student achievement in the major. The reliability of the test results was analyzed in light of factors affecting the participation rate and student motivation, including the lack of a link to requirements for degree completion. Based on these results, the department concluded that the MFT is not an efficient, sustainable assessment of student achievement of its program learning goals, and is developing alternative customized in-house assessment tools, to be implemented at or near program completion as of the next academic year.



Contact Us

SAS OUE Staff Directory

Directory of Undergraduate Chairs and Directors


35 College Avenue, Room 204
New Brunswick, NJ 08901


P 848-932-8433
F 732-932-2957