
V4I4: Tillamook Bay Community College’s Journey: Faculty Engagement with Assessment to Improve Student Success

By: Teresa Rivenes, Ed.D., Tillamook Bay Community College

“…we have serious doubts that Tillamook Bay Community College will be successful in the seven-year mission fulfillment visit.” Those were among the closing remarks from Tillamook Bay Community College’s mid-cycle accreditation visit. Luckily, Tillamook Bay Community College (TBCC) possesses grit and determination, and it simply refuses to give up. Kezar (2014) observes that when the pressure to change builds momentum, institutions may make decisions that result in new schemas and norms. This article traces our journey through assessment and improvement and the overall impact assessment has had on student success and faculty engagement.

After the painful mid-cycle review, TBCC faculty reviewed, refined, and wrote new Course Content and Outline Guides (CCOGs). They ensured that all courses had Course Learning Outcomes (CLOs), that programs had Program Learning Outcomes (PLOs), and that these tied to Institutional Learning Outcomes (ILOs) where applicable. Over the next few years, full-time faculty measured these outcomes on paper forms and mapped all outcomes to ensure that every student could achieve them by the time they finished their educational journey at TBCC. This was, however, largely a compliance-driven initiative.

In the 2018-2019 academic year, TBCC continued to work on assessment by clarifying and expanding the process. A Curriculum Review and Assessment Handbook was written, as were an Instructional Program Review Handbook and forms for consistent tracking and comparison. TBCC developed a customized database for measuring student learning outcomes. After each course, all faculty measured all student learning outcomes (CLOs, PLOs, and ILOs) regardless of modality or location. This was the first major step toward consistent measurement. These steps were not just busy work: they created a common vocabulary, established a shared baseline, and built faculty voice into the process. The arrival of a new Chief Academic Officer in 2018 opened the door for more in-depth conversations, which started with, “How can I help you show the amazing things you are doing with students in your classes?” Showing off the important work that faculty were doing increased buy-in and engagement. TBCC established a foundation centered on the idea that “change moves at the speed of trust” (Kuh et al., 2015).

By the conclusion of the 2018-2019 academic year, 77% of all student learning outcomes (CLOs, PLOs, and ILOs) were measured directly by faculty, up from 22% in the 2016-2017 and 2017-2018 academic years: a gain of 55 percentage points. The increase in measurement was good news; TBCC was beginning to engage in regular student learning assessment. However, it was not good enough. We were still tracking our data by class rather than by individual student, and we were looking only at students who successfully met the learning outcomes. Arguably as much, if not more, could be learned from students who did not have a successful experience. For this reason, TBCC started “Phase Two” of the assessment journey.

Building on the success we were having, and on the trust developed through open and honest dialogue, TBCC continued to work on assessment. We were very clear that assessment was not a “teaching critique” but a way to demonstrate the important differences that faculty made in student learning. The data was simply a tool that could and should be used to help achieve even better results. TBCC used an assessment for improvement paradigm that focused on an ethos of engagement (Ewell, 2009). We attended to culture through discussion, celebrating small wins, sharing best practices, and creating a safe space to share frustrations when gains were not realized. We were very transparent about the fact that we would make mistakes and that we would fail, but that we were also people who dusted ourselves off and tried again because student success matters most.

A simpler outcome tracking process was developed in our Learning Management System (LMS), Moodle, a tool faculty already used. It was easier to use than a separate database and had the added benefit of tying each measurement to an individual student. After each course, all faculty, regardless of modality or location, measured all student learning outcomes in every course. For the first time, faculty did this for every student, regardless of level of outcome achievement. We worked with our local school districts to secure stipends for our dual credit faculty as they engaged in this work as well. We paid everyone who attended training because we believe budget drives priorities.

TBCC continued making major improvements. First, faculty measured 96% of all student learning outcomes directly, up from 22% (2016-17), 22% (2017-18), 77% (2018-19), and 92% (2019-20): a gain of 74 percentage points overall. Second, we integrated our assessment process with our Enrollment Management System (Jenzabar). We exported the Moodle results and tied them to data tables in Jenzabar, which allowed us to analyze all SLO achievements by student demographics, including gender, race, ethnicity, first-generation status, age, and degree program (virtually any demographic factor stored in Jenzabar). While the process is still somewhat manual, it is a significant improvement, and we are getting far more detailed and relevant information. Our computer science faculty member (Dr. Chris Carlson), IT Director (Sheryl Neu), and Online Learning Coordinator (Sarah Miller) did the work, which again built trust, as faculty trusted the development team. Each step of the process was vetted, discussed, and improved through faculty engagement.
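The export-and-join step above can be sketched in outline. The following is a minimal illustration, not TBCC's actual pipeline: the column names (`student_id`, `score`, `age_band`) are hypothetical stand-ins for whatever fields the Moodle export and Jenzabar tables actually carry.

```python
# Sketch: join LMS outcome scores to SIS demographics, then report the
# percentage of measurements at "competent or above" (score >= 3) per group.
# All field names are illustrative, not real Moodle/Jenzabar schemas.
from collections import defaultdict

def achievement_by_group(outcome_rows, demo_rows, group_field):
    """Percent of outcome measurements scoring 3+ within each demographic group."""
    demo = {r["student_id"]: r for r in demo_rows}   # SIS lookup by student ID
    hit = defaultdict(int)
    total = defaultdict(int)
    for r in outcome_rows:                           # rows from the LMS export
        student = demo.get(r["student_id"])
        if student is None:
            continue                                 # no SIS match; skip
        group = student[group_field]
        total[group] += 1
        if float(r["score"]) >= 3:
            hit[group] += 1
    return {g: round(100 * hit[g] / total[g], 1) for g in total}

# Toy data to show the shape of the result:
outcomes = [
    {"student_id": "1", "score": "3.5"},
    {"student_id": "2", "score": "2.0"},
    {"student_id": "3", "score": "4.0"},
]
demographics = [
    {"student_id": "1", "age_band": "over_30"},
    {"student_id": "2", "age_band": "under_30"},
    {"student_id": "3", "age_band": "over_30"},
]
print(achievement_by_group(outcomes, demographics, "age_band"))
# → {'over_30': 100.0, 'under_30': 0.0}
```

Because the join key is the individual student, the same function works for any demographic field stored in the SIS, which is the point of tying each Moodle measurement to a student rather than to a class.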

Next, we worked on the quality and consistency of assessment. CLOs, PLOs, and ILOs were measured both directly and indirectly. Rubrics to measure ILO and PLO achievement were developed and refined, and all faculty were trained each term on using these tools. TBCC also strengthened PLO/ILO measurement by asking students to complete both a course survey after every course and a graduation survey at the end of their TBCC experience. CTE employers were asked to measure student PLO/ILO achievement in each student’s capstone work experience course. All this work allowed us to triangulate data for increased validity (Kuh et al., 2015). Faculty began to tie course learning outcomes to selected assignments in each course for increased consistency. TBCC felt it was important that faculty had latitude in the assessment measures, that faculty focus first on the areas that interested them, and that data be reviewed after each course, establishing a habit of reflection. Data, reflection, and improvement became an ingrained part of the TBCC culture.

It was vital for us to value the voices of TBCC faculty and their perceptions of student learning in this process. To do this, a quiz was developed and administered to faculty at the end of every course. The quiz asks questions such as “What went well?”, “What did not go well?”, and “What will you do differently the next time this class is taught?” Faculty use this tool to set goals for improvement in each course, each time it is taught, and to make notes of needed curriculum changes. These data are then exported and used by Department Chairs to improve the CCOG in response to the collected faculty feedback. Faculty also review the quiz the next time they teach the course to ensure that the assessment loop is closed and that planned interventions move forward. They then take the quiz again at the end of the course, restarting the ongoing cycle of improvement. CLOs and PLOs are reviewed in detail every three years in the Program Review and Course Review process and are adjusted as the data indicate. Again, the process was inquiry-driven, built on success, free from judgment, and showed action even when that action was not successful (Ewell, 2009). Over time, TBCC has become more proficient at taking successful action.

Overview of Total Achievement of Student Learning Outcomes (including transfer programs)

Note: Achievement is defined as competent and above (score = 3+). The table below shows the percentage of students who achieved this rating.

| Outcome | 2017-2018 | 2018-2019¹ | 2019-2020 | 2020-2021² |
|---|---|---|---|---|
| Course Learning Outcomes | 77% | 63% | 80% | 74% |
| Program Learning Outcomes | 75% | 60% | 80% | 75% |
| Institutional Learning Outcomes | 76% | 58% | 82% | 71% |

¹ Rubrics and training were introduced this year.
² Covid pandemic started.

The processes we developed are significant, but perhaps more important are the changes that have occurred because of them. The first of these changes was realized in General Education. In the 2019-2020 academic year, we conducted an in-depth review of our General Education program, the first of its kind at TBCC. We looked at all courses within the general education disciplines, and faculty used data to set discipline-specific goals.

One of the first improvement goals faculty set was in writing, as students, faculty, and employers all noted the need for improvement in this area. We implemented several changes, including sharing rubrics so that writing mechanics were examined in multiple classes, asking that most 200-level courses include a writing assignment as a key assignment on the CCOG, and asking all 100-level courses to consider an information literacy assignment as a key assignment. Faculty monitored progress on CCOGs through the curriculum committee process. The data suggest these efforts have been effective, as students’ writing achievement has increased as much as 30 percentage points in some cases.

There have been other successes as well. We found through data analysis that our 200-level majors’ Biology sequence and our Anatomy and Physiology sequence were barrier courses (high D/F/W rates): many students did not complete them or move on to subsequent courses. TBCC purposefully created a 100-level Biology series (for non-science majors) to address this. Faculty also increased rigor in BI 112, a prerequisite for Anatomy and Physiology (A & P). We have also added weekend open labs, faculty-led student study sessions, and increased tutoring support in the Learning Lounge (our tutoring center), as well as free recitation sections to increase lab time. As a result, we have seen improvement in student learning across science learning outcomes. We continue to work on A & P success.

Additionally, we revamped our highest-enrolled social science classes (history and economics) and, as a result, saw impressive student learning improvement. One of TBCC’s economics faculty (an adjunct) recently said with genuine excitement in a meeting, “I almost have 100% student success in my class this term. One student wanted to drop, but I have talked him into an incomplete, and I am working with him one on one to complete the course. I am going to get him there!” General education faculty have taken ownership of student success and are interested in improving student learning and demonstrating that improvement through data. As we see improvement, faculty get more excited about the process and more engaged. All of this happens on the foundation of trust and transparency we built. The process has high intrinsic value and meaning.

General Education Specific

Note: Achievement is defined as competent and above (score = 3+). The table below shows the percentage of students who achieved this rating. Dashes indicate years for which discipline-level data were not collected.

| Discipline | Outcome | 2017-2018 | 2018-2019 | 2019-2020 | 2020-2021 |
|---|---|---|---|---|---|
| All General Education | CLO | 77% | 63% | 80% | 72% |
| | PLO | 75% | 60% | 80% | 72% |
| | ILO | 76% | 58% | 82% | 68% |
| Arts & Letters | CLO | — | — | 74.62% | 73% |
| | PLO | — | — | 70.34% | 69% |
| | ILO | — | — | 74.55% | 70% |
| Reading | CLO | — | — | 62.22% | 63% |
| | PLO | — | — | 60% | 70% |
| | ILO | — | — | 61.11% | 41% |
| Writing | CLO | — | — | 49.31% | 71% |
| | PLO | — | — | 53.64% | 74% |
| | ILO | — | — | 54.81% | 71% |
| Math | CLO | — | — | 79.74% | 65% |
| | PLO | — | — | 76.99% | 71% |
| | ILO | — | — | 73.35% | 61% |
| Science | CLO | — | — | 61.35% | 62% |
| | PLO | — | — | 52.87% | 65% |
| | ILO | — | — | 55.03% | 62% |
| Social Science | CLO | — | — | 75.99% | 80% |
| | PLO | — | — | 67.23% | 70% |
| | ILO | — | — | 64.81% | 73% |

 

Career Technical Education

Note: Achievement is defined as competent and above (score = 3+). The table below shows the percentage of students who achieved this rating.

| Program | Outcome | 2019-2020 | 2020-2021 |
|---|---|---|---|
| Manufacturing & Industrial Technology | CLO | 81% | 87% |
| | PLO | 78% | 90% |
| | ILO | 72% | 88% |
| Criminal Justice | CLO | 73% | 84% |
| | PLO | 76% | 83% |
| | ILO | 74% | 81% |
| Welding | CLO | 81% | 78% |
| | PLO | 78% | 84% |
| | ILO | 72% | 80% |
| Business | CLO | 83% | 82% |
| | PLO | 77% | 78% |
| | ILO | 77% | 78% |
| Healthcare | CLO | 80% | 85% |
| | PLO | 98% | 79% |
| | ILO | 84% | 81% |

Increased student success has also been demonstrated in developmental math. The data showed that students who started in developmental education were not completing college-level math. As a result, a new math class was developed encompassing all the developmental math classes in one course. In 2018-2019 (six terms), 60% of TBCC students passed MTH 20, 76% passed MTH 70, and 72% passed MTH 95; many of these students took these courses multiple times. In the new MTH 99 class, 78% of students pass on the first attempt. Even more impressive, within MTH 99, 18% of students completed MTH 20 and MTH 70 in a single term, and 3% completed MTH 20, MTH 70, and MTH 95 in a single term (21% completed more than one course). Getting through the math sequence is not the only goal; it is also essential to see how students perform in a subsequent course. The MTH 105 prerequisite was changed to MTH 20 (or one level of MTH 99), so 78% of the students who took MTH 99 were eligible to take the subsequent college-level math course. To date, 82% of the students in MTH 105 have passed, all of them MTH 99-prepared students. Faculty have subsequently decided that every degree and certificate of 45 credits or greater will require college-level math and have developed three clear math pathways (STEM, statistics, and applied math). Our students are demonstrating success with the increased standard, and that has faculty excited.

This work has not ended. Still of concern is the significant percentage of students who did not take the college-level follow-up course to MTH 99; if students wait several terms, they may lose skills that would help them be more successful. Faculty and advisors encouraged students to complete their entire math sequence in consecutive terms, but this encouragement did not produce measurable success. Now, math faculty are implementing co-requisite support classes that allow students to take a college-level math class during their first term. Before Covid, students in MTH 99 were required to spend three hours per week in the Learning Lounge, which was successful. In spring, when the physical campus was closed to students due to Covid, grades in MTH 99 plummeted. The Learning Lounge requirement was gradually brought back at two hours per week, but this was not as successful.

For this reason, the math co-requisite was developed as a three-hour lab. Data and assessment have informed curricular design, curricular decision-making, and innovative solutions for student success. It is exciting to see faculty speak confidently about curricular decisions made using data from their own students. TBCC faculty recently conducted a prerequisite review and used D/F/W and pass-rate data to determine whether prerequisites could be lowered across the college catalog. TBCC has successfully eliminated virtually all developmental education and placement testing.

The pandemic has had interesting impacts on Career-Technical Education (CTE). While we have been seeing increases in general education, courses like welding have seen decreasing student learning outcome achievement. We saw the same trend the term that our Learning Lounge (tutoring) was closed. Though disheartening, this suggests that our CTE classes are more successful when held face to face, and that our student supports are effective in helping students achieve course learning outcomes.

In academic year 2020-2021, TBCC started collecting and examining disaggregated SLO data. TBCC looks at SLOs by student, by course, and by individual faculty member. This has had a real impact on personal teaching practices. Faculty have asked for, and TBCC has responded with, in-service professional development on topics such as engaging online learners, micro-aggressions, implementing a lens of diversity, equity, and inclusion, and (coming next) supporting second-language learners in writing assignments. We are currently exploring which courses we need to offer in Spanish and how we can offer a Spanish support course alongside gateway courses for more effective student learning. These topics have spurred great discussion. We remain committed to “failing forward” and understand that we will make mistakes. Not all our interventions will succeed, but some will, and we remain committed to building on those. Faculty saw the benefit of disaggregated outcome assessment as it related to student success, were empowered by moving the dial on student success, and are interested in furthering this work.

Student Demographics (new 2021)
Note: Achievement is defined as competent and above (score = 3+).
Note: This is just a sample of disaggregated data.

| Average achievement of outcomes for students | CLO | PLO | ILO |
|---|---|---|---|
| Over 30 years of age | 3.06 | 3.13 | 3.09 |
| Under 30 years of age | 2.84 | 2.84 | 2.83 |
| Who identify as White | 2.92 | 2.97 | 2.96 |
| Who identify as Hispanic | 2.82 | 2.82 | 2.78 |
| Who identify as male | 2.92 | 2.89 | 2.89 |
| Who identify as female | 2.85 | 2.92 | 2.88 |
| Who identify as non-binary | 3.56 | 3.47 | 3.89 |
| Grand average (includes suppressed categories) | 2.88 | 2.91 | 2.89 |

(Note: TBCC is a small rural college, so some data are excluded here because of group size. All data at TBCC are carefully scrutinized because of the small numbers, and we draw conclusions carefully. That said, we act on the available data and triangulate where possible to increase validity.)
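The small-group suppression described in the note above can be expressed as a simple reporting filter. The following is an illustrative sketch, not TBCC's actual rule: the minimum group size of 10, the group names, and the data are all hypothetical.

```python
# Sketch: suppress demographic groups below a minimum size before reporting,
# so individual students at a small college cannot be identified.
# The threshold of 10 is an illustrative assumption, not TBCC's actual rule.
MIN_GROUP_SIZE = 10

def suppress_small_groups(group_stats, min_n=MIN_GROUP_SIZE):
    """Replace averages for small groups with None (shown as 'suppressed').

    group_stats maps each group name to a (count, average_score) pair.
    """
    return {
        group: (avg if n >= min_n else None)
        for group, (n, avg) in group_stats.items()
    }

# Hypothetical counts and averages:
stats = {"group_a": (42, 3.06), "group_b": (57, 2.84), "group_c": (4, 3.10)}
print(suppress_small_groups(stats))
# → {'group_a': 3.06, 'group_b': 2.84, 'group_c': None}
```

Suppressed groups still count toward the grand average, as the table's "includes suppressed categories" row indicates; only their individual rows are withheld from the published report.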

What lessons have we learned from all of this?

  1. It is crucial to take a “failing forward” approach. People must be willing to innovate, try, fail, learn, and try again, and the college culture must support this as a learning process for students, staff, faculty, and administration. TBCC built a cycle of improvement by acting and building on both positive and negative results. TBCC is not focused on perfection; we recognize that we are always improving and innovating.
  2. Assessment must not be punitive or used as a form of teaching evaluation. It is a recognition of student learning and a way to demonstrate the amazing work faculty do in their classes. Any other attitude will stifle the assessment process and the willingness of faculty to engage in the work.
  3. Assessment outcomes are frequently discussed, and all faculty use the CCOG. The work stays top of mind through faculty meetings, in-service, emails, and the curriculum committee.
  4. Training is key. We hold a mandatory in-service meeting for everyone teaching that term, pay adjuncts to attend, and offer it multiple times for convenience. We have used these opportunities to talk about issues, train on andragogy, and share our improvement using the data faculty have provided. We also use these meetings to build shared purpose and community and to invest in the work together.
  5. Over time, we have developed smaller groups that dive into different aspects of the data according to their interests. For example, our Department Chairs and Deans are digging into the disaggregated data by subject, and CTE advisory groups look at program-level data.
  6. TBCC is working to celebrate and communicate the wins.

How does our accreditation journey end? TBCC was recognized at the year-seven Institutional Effectiveness visit with multiple commendations and no recommendations. Notable quotes from the report include the following:

  • “Through the various conversations with college groups, there existed a can-do and ‘not afraid to fail forward’ attitude and culture to keep moving forward to serve students even when it is ‘messy’ and (a recognition) that ‘we can make it better the next time.'”
  • “Meetings with faculty and staff confirm that assessment occurs at every level of the organization through the efforts of teams, departments, and academic programs. All are aligned to evaluate institutional effectiveness.”
  • “Processes and structures are in place for closing the loop in systematic ways. In conversation with members of the TBCC community, several themes were stated over and over. Themes of ‘working on constantly gathering data,’ asking ‘how students are progressing,’ ‘where might there be issues and challenges,’ and ‘how do we support students’ were heard.”

And we keep the assessment cycle rolling. Seymour (2016) notes that momentum builds a virtuous cycle and leads to bold imaginings and creative problem-solving. We hit bumps and veer off course at times, but TBCC is small, mighty, and determined. We remain data-focused in our mission of serving students to the very best of our abilities. And we keep getting better at doing it!

References:

Ewell, P. T. (2009, November). Assessment, accountability, and improvement: Revisiting the tension (Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Kezar, A. (2014). How colleges change: Understanding, leading and enacting change. New York, NY: Routledge, Taylor and Francis Group.

Kuh, G. D., Ikenberry, S. O., Jankowski, N. A., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015). Using evidence of student learning to improve higher education. San Francisco, CA: Jossey-Bass.

Seymour, D. (2016). Momentum: The responsibility paradigm and virtuous cycles of change in colleges and universities. New York, NY: Rowman and Littlefield.

 
