This guide to rethinking and revising your curriculum was built to offer ideas and best practices. ATLE is available to meet with individuals or entire departments interested in curriculum redesign.
As with any educational planning, it's best to start "at the end"--we call this Backward Design: first clarify what students should know or be able to do by the END of the experience, then work backward toward the beginning, deciding what students must be taught, how they will need to practice, and how you will assess them. In the case of a major/degree program, the faculty must first agree on the goals of the program--what students will know and be able to do when they have completed it. This approach creates intentionality rather than leaving learning to chance.
The next step is to translate these goals into measurable Student Learning Outcomes (SLOs)--statements that describe significant and essential learning that students have achieved and can reliably demonstrate when they have completed the degree program.
Note: The Academic Learning Compact (described below), required by the Florida Board of Governors, must include SLOs in the areas of content/discipline knowledge and skills, communication skills, and critical thinking skills for baccalaureate degree programs.
Writing Good Student Learning Outcomes
Not every Student Learning Outcome is written well. SLOs that do not offer a clear path to measurement are particularly problematic. This often happens when the chosen verb is "know," "understand," or some other vague verb whose achievement would be difficult to observe. Even "demonstrate" is questionable. Strong action verbs are greatly preferred--see sample verbs mapped to Bloom's Taxonomy: http://www.usf.edu/atle/teaching/blooms-taxonomy.aspx
A good SLO makes clear how it should be assessed. "Students will be able to produce a festival-quality documentary," for instance, makes quite clear what it would take to adequately measure that students can perform this skill. As you (re)write SLOs, strive for transparency about how they would be measured effectively, appropriately, and authentically (where "authentic" means true to the skill--it's rarely "authentic" to have a multiple-choice exam stand in for a performance skill, for instance).
Once your department is satisfied that the core SLOs accurately reflect the knowledge, skills, and abilities your students should display by the time they graduate, the next step is to verify that students are given the requisite information and practice opportunities as they progress through the required courses of the degree program. It's not enough to leave this to chance; a matrix is required to map out exactly where each skill is Introduced, where it is Reinforced, and where it is Mastered. Here is an example from the University of West Florida:
In the above example, mastery only occurs at the capstone course, but this is perhaps not the most realistic assignment of mastery. Ideally, various skills will be mastered early in the degree program, with others coming later. It's even possible to have skills reinforced after they have been mastered.
As you can see in the example, the way to build this is to list all the established SLOs down one axis, and list all the required courses for the degree program on the other, and then insert "introduced," "reinforced," or "mastered" in the appropriate places.
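To make the construction concrete, here is a minimal sketch of such a matrix as a simple data structure--the course numbers and SLO text below are invented for illustration, not drawn from any actual program:

```python
# A minimal sketch of a curriculum map: SLOs down one axis, required
# courses across the other, with "I" (Introduced), "R" (Reinforced),
# or "M" (Mastered) in the cells. All names here are hypothetical.

curriculum_map = {
    "Produce a festival-quality documentary": {
        "FIL 2000": "I", "FIL 3100": "R", "FIL 4940": "M",
    },
    "Critique peer work using discipline standards": {
        "FIL 2000": "I", "FIL 3100": "M", "FIL 4940": "R",
    },
}

def print_map(cmap):
    """Render the map as a text matrix for department discussion."""
    courses = sorted({c for row in cmap.values() for c in row})
    print("SLO".ljust(48) + "  ".join(courses))
    for slo, row in cmap.items():
        cells = "  ".join(row.get(c, "-").center(len(c)) for c in courses)
        print(slo.ljust(48) + cells)

print_map(curriculum_map)
```

Note that, as in the UWF example, a skill can be Reinforced in a later course than the one where it is Mastered.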
Clearly, the creation of a curriculum map is meant to be an iterative project. Your department may discover in creating the curriculum map that some skills are missing--this is a good reason to revisit the SLOs yet again. Or you may decide that the list of skills is correct, but the courses chosen do not offer students a chance to truly master the skills--in that event, ideally this should spark a discussion about the list of required courses, and the content (and skills!) contained within those courses. In some cases, courses need to be removed from (or added to) the degree program so that it fully lives up to the curriculum map.
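One way to surface such gaps mechanically--a sketch only, assuming a map stored as nested dictionaries with invented names, not an actual USF tool--is to flag any SLO that is never Introduced or never Mastered anywhere in the required courses:

```python
# Hypothetical gap check for a curriculum map: report any SLO missing
# an "I" (Introduced) or "M" (Mastered) entry across all required
# courses. Course numbers and SLO text are invented for illustration.

def find_gaps(cmap):
    """Return {slo: [missing levels]} for SLOs lacking an 'I' or 'M'."""
    gaps = {}
    for slo, courses in cmap.items():
        levels = set(courses.values())
        missing = [label for code, label in
                   (("I", "Introduced"), ("M", "Mastered"))
                   if code not in levels]
        if missing:
            gaps[slo] = missing
    return gaps

example_map = {
    "Analyze primary sources": {"HIS 2000": "I", "HIS 3500": "R"},
    "Write a research paper":  {"HIS 3500": "I", "HIS 4900": "M"},
}

print(find_gaps(example_map))  # {'Analyze primary sources': ['Mastered']}
```

A report like this is only a conversation starter: deciding whether to revise the SLO list or the required courses remains a department-wide discussion.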
In addition to not leaving student learning to chance, mapping skill mastery onto the required courses serves an important function within the department: now, perhaps for the very first time, every instructor will know what a given course is meant to accomplish when viewed through the lens of students on the road to graduation. A degree program is much more than a collection of facts and information, and knowing what a given course is required to help students practice or master will absolutely influence how it is taught, particularly for new instructors.
Institutional Effectiveness in the Office of Decision Support (ODS) is charged with assuring that academic assessments at USF satisfy two mandates from outside agencies.
Florida Board of Governors (BOG) regulation 8.016 addresses Student Learning Outcomes. This regulation requires each State University System institution to develop a process that ensures that program faculty:
- Develop and publish an Academic Learning Compact (ALC) for each baccalaureate program. At a minimum, the ALC must contain a list of core SLOs in the areas of content/discipline knowledge and skills, communication skills, and critical thinking skills--and examples of assessments students might encounter;
- Develop methods for assessing student achievement of the expected core student learning outcomes within the context of the program;
- Use program evaluation systems (which may include sampling) to evaluate the program and related assessment practices to analyze their efficacy in determining whether program graduates have achieved the expected core student learning outcomes; and
- Use the evaluation results to improve student learning and program effectiveness.
Every degree program at USF already has an established ALC. You can view them here: http://usfweb.usf.edu/DSS/SAM/Public/ALCs.aspx. The challenge, of course, is to revisit and refresh the ALCs every so often, certainly when looking at the overall curriculum and its stated goals. You may wish to view the well-regarded ALCs at the University of Florida.
Section 8 of the Principles of Accreditation of the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) also addresses Student Learning Outcomes.
The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of seeking improvement based on analysis of the results in the areas below:
a. student learning outcomes for each of its educational programs
b. student learning outcomes for collegiate-level general education competencies of its undergraduate degree programs
The focus of academic program assessment is on measuring and improving student learning. Each department or program should design an assessment process that provides information that can be used to:
- Determine whether students are learning what the program says they are;
- Identify strengths and weaknesses of the program;
- Make changes based on assessment results to improve the program.
A couple of very important concepts are built into the required assessment structure. First, departments and degree programs are expected to look beyond mere "mastery" and instead toward "continuous improvement," a concept at the base of SACSCOC's rationale for assessment. The idea is not to "prove yourself" each year, but to study something (maybe not even the central things) that will help you improve student learning, relevance, operations, efficiencies, or other elements of your program. It's common for an assessment plan to contain a mixture of mastery goals ("85% of students in this major will graduate within four years") and continuous improvement goals ("student portfolios in the capstone course will display all of the NACE competencies toward career readiness"). The former example is a target you either hit or you don't, and if you don't, the result offers little guidance about what to do next. The latter example is a more holistic target, and when the results suggest more could be done, it is clearer what to actually alter to improve results going forward.
Accordingly, ODS asks each degree program to create a three-year plan in this fashion:

Year One:
- Establish program-specific SLOs based on the goals of the program.
- Develop a curriculum map showing the courses in which learning outcomes are introduced, reinforced, mastered, and assessed.
- Identify methods that will be used to assess SLOs and a plan for assessing them.
- Collect assessment data, use the collected data to pilot test the feasibility of the assessment measures and processes described in the plan, and (if needed) revise the assessment plan.

Year Two:
- Collect additional data (or improved data if measures were changed based on the pilot test results from Year One).
- Analyze the data, and produce a report that summarizes the assessment findings. The assessment report must include an action plan for using the assessment results to improve curriculum or instruction.

Year Three:
- Implement the action plan to improve curriculum or instruction that was developed during Year Two.
- Assess the impact of the curricular or pedagogical changes that were made.
None of the above will be easy--or more to the point, useful--if there is not wide-scale agreement and adoption at the department level. Assigning a single faculty member, or even a small team, to do this work in isolation violates the spirit of curriculum design, which should be owned by all faculty in the department. To make the activity useful, there should be broad-based participation in, and understanding of, the process, especially the curriculum map and its logic.
ATLE is available to help facilitate department-wide discussions toward this end: email firstname.lastname@example.org to get started.