By: Dr. Jess Stahl, Vice President of Data Science & Analytics (NWCCU)
In May 2020, NWCCU member institutions responded to a survey from the NWCCU entitled “Institutional Data Capacity Survey” (IDCS), which was designed to provide insight into the capacity (e.g., data, technology, and resources) of institutions throughout our region to improve equitable outcomes through evidence-based approaches that meet the NWCCU 2020 Standards for Accreditation. Specifically, Standard One (Student Success, and Institutional Mission and Effectiveness), through standards 1.D.2, 1.D.3, and 1.D.4, requires institutions to report disaggregated indicators of student achievement and to engage in regional and national benchmarking, using transparent methods for data collection and analysis to mitigate gaps in achievement and equity.
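To make the disaggregation requirement concrete, here is a minimal sketch of what reporting a disaggregated achievement indicator can look like in practice. This is an illustration only, not a method prescribed by the standards; the table and column names are assumptions, and Python with pandas is used purely as an example.

```python
import pandas as pd

# Hypothetical student records; these columns are illustrative, not a required schema.
students = pd.DataFrame({
    "race_ethnicity": ["Hispanic", "White", "Hispanic", "Asian", "White", "Hispanic"],
    "pell_eligible":  [True, False, True, False, True, False],
    "completed":      [True, True, False, True, False, True],
})

# Disaggregate a single achievement indicator (completion rate) by group,
# reporting group size (n) alongside each rate so small-n cells stay visible.
disaggregated = (
    students
    .groupby(["race_ethnicity", "pell_eligible"])["completed"]
    .agg(n="size", completion_rate="mean")
    .reset_index()
)
print(disaggregated)
```

Reporting the group size next to each rate matters for transparency: as several respondents note later in this article, small groups can make a disaggregated rate unstable.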
Thus, the IDCS covered a wide range of domains including technical resources, data staffing, data governance, and data reporting to inform our strategic planning about how best to support our member institutions. This article is the first in a three-part series discussing the key highlights of the survey results and our relevant initiatives. This article will present a summary profile of our member institutions and the technical resources used to manage institutional data.
All 159 member institutions responded to the IDCS, a participation rate of 100%.
Nearly 60% of our member institutions are located in Washington or Oregon. Two institutions reported locations in multiple states. (Note: “BC” = British Columbia, Canada)
Based on self-report using size designations from the College Scorecard, “Large” institutions represent 14% of our membership whereas 86% of our institutions are “Medium” or “Small”.
Four-year institutions comprise 60% of our membership and 20% of those institutions are designated as “Large” in size compared with only 5% of two-year institutions that are “Large”. The majority (59%) of our two-year institutions are “Medium” and 37% are “Small” in size.
A clear majority (70%) of our member institutions are “Public”, but there are also a significant number of “Private Non-profit” institutions. The size distribution for our Private Non-profit institutions is nearly identical to the overall distribution: 48% Medium, 39% Small, 13% Large.
Our largest institutions are 73% Public, 27% Private Non-profit, and 86% are four-year institutions. At the other end of the spectrum, our smallest institutions are 71% Public, 29% Private Non-profit, and 63% are four-year institutions.
The survey provided an opportunity for institutions to self-identify as any of the following (“unofficial”) designations: Hispanic Serving Institution, Tribal College or University, Native American Non-Tribal Institution, or Faith Based Institution. Our intention for these categories was to identify and understand any challenges and evidence-based approaches undertaken by institutions serving specific student populations. In total, 48 institutions identified as follows:
(One institution identified as both Native American, Non-Tribal and Hispanic serving.)
Faith-based institutions were asked:
“As a Faith Based institution, how do you evaluate (‘evidence-based’ approach) fulfillment of the faith-based aspects of your mission (e.g. faith, service, leadership, ethics, personal growth, reflection)?”
Institutions identified the following key dimensions as faith-based aspects of their mission that are reflected in their programs and courses: pastoral charity (e.g. concern for the poor, marginalized, and suffering), active engagement with persons of other religions, openness to persons of other cultures, integration of theology and pastoral practice, faith and values based decision-making, student behavior, service, leadership, ethical reflection, faith commitment, spiritual formation, and critical thinking regarding moral and ethical practices. Many indicated that their mission relates to lifelong demonstration, growth, and service across these dimensions.
Institutions use a variety of methods to measure the key dimensions that support mission fulfillment as a faith-based institution, including: evaluation (direct assessment) by formators and supervisors; opportunities to practice and develop in pastoral ministry placements; surveys (the National Survey of Student Engagement (NSSE) and the Christian Life Survey were specifically mentioned); learning outcomes in core curriculum courses specifically tied to theological and ethical education; national benchmarking with similar faith-based institutions; participation in staff and faculty development related to specific faith-based pedagogical traditions; service requirements; course-based rubrics; student feedback; course evaluations; student reflections and testimonials; a mission fulfillment discussion between each employee and their supervisor as part of the annual performance review; capstone courses; engagement in chapel services; a faith integration rubric for program assessment; faculty portfolios; faculty growth plans; and peer-reviewed portfolios.
Hispanic Serving Institutions were asked:
“As a Hispanic Serving Institution, what challenges have you had (or anticipate having) with regard to the Interim or Annual Report?”
Many institutions reported that they did not anticipate any challenges or were uncertain (as newly designated Hispanic Serving Institutions) about whether they would face any specific challenges. However, the most challenging issues reported were: interpreting federal data definitions (e.g., the various FTE measures); limited institutional data capacity; difficulty measuring some of the performance measures identified in the initial grant proposal; technical problems with the Annual Performance Report (APR) website; lack of regular communication from the program officer; the belief that the number of Hispanic students is under-reported because some students identify as ‘unknown’ race/ethnicity; difficulty obtaining information about Deferred Action for Childhood Arrivals (DACA) and undocumented students because identification is voluntary; and a need to engage in more workforce partnerships and establish continuity in transfer to other universities or colleges.
Tribal Colleges and Universities (TCU) were asked:
“As a Tribal College or University, how do you evaluate (‘evidence-based’ approach) fulfillment of your mission (e.g. foster tribal values, culture, languages, self-determination, knowledge, and traditions)?”
Institutions identified the following key dimensions as important aspects of their mission as a TCU that are reflected in their programs and courses: knowledge of culture and language, traditional knowledge and traditions, history, cultural dignity, and Native lifeways.
Institutions use a variety of methods to measure the key dimensions that support mission fulfillment as a TCU, including: integrating an Indigenous Assessment Metaphor into the overall institutional assessment plan (grounding assessment efforts in place-based values and beliefs); participation in cultural programming; culture/language requirements within each degree; the AIMS Key Indicator System (AKIS); course-level assignments with evidence-based rubrics detailing how culture and language are integrated and graded within each assignment; requiring departments to articulate culture and language within student learning outcomes; “support cultural perpetuation, including language, culture and history” as a core theme; outcome data from language classes; evidence of Native language learning among faculty and staff and its contribution to their work; faculty and staff participation in campus cultural events; evidence of cultural exchange among students, faculty, staff, and community; and historical and cultural library holdings and their usage.
Native American, Non-Tribal institutions were asked:
“As a Native American Non-Tribal Institution, what challenges have you had (or anticipate having) with regard to the Interim or Annual Report?”
These institutions reported that they did not anticipate any challenges in reporting.
The IDCS used categories aligned with its purpose of examining data capacity. The (U.S.) federally designated categories for minority serving institutions are[i]:
| Acronym | Description | NWCCU members |
| --- | --- | --- |
| AANAPISI | Asian American and Native American Pacific Islander Serving Institution | 15* |
| ANNH | Alaska Native and Native Hawaiian Serving Institution | 2 |
| HSI | Hispanic Serving Institution | 12* |
| NASNTI | Native American Serving, Non-Tribal Institution | 2 |
| TCU | Tribal Colleges and Universities | 9 |
| AAPI | Asian and Pacific Islander Serving Institution | — |
| HBCU | Historically Black Colleges and Universities | — |
| PBI | Predominantly Black Institutions | — |
| MSEIP | Minority Science & Engineering Improvement Program | — |
There are currently 39 NWCCU institutions federally designated as minority serving institutions and 44% of those are in Washington. There are no federally designated minority serving institutions in Idaho or Utah. One institution holds designation as both AANAPISI and HSI.
The IDCS also provided an opportunity for institutions to indicate whether they have implemented, or have key initiatives underway for, specific evidence-based practices. The most widely implemented initiative is “guided pathways” (36%), followed by “co-remediation”/corequisite support (21%)[ii]. “Competency-based” programs (9%) are the least commonly implemented.
Institutions implementing “guided pathways” were asked:
“How does your institution evaluate the success of its Guided Pathways (evidence-based approach)?”
Methods for evaluating the success of guided pathways initiatives include: a statewide dashboard with common metrics (notably, the Washington State Board for Community & Technical Colleges dashboard); the Voluntary Framework of Accountability (VFA); benchmarking against institutions with similar demographics; collaboration between institutional research and student affairs offices to identify and track key indicators such as advising appointments and registration in recommended courses; tracking the Community College Research Center’s (CCRC) Early Momentum Metrics; analysis of pathway declaration upon entry and program declaration by the second quarter; institutional scorecards; rubrics; signature assignments from selected classes; the Association of American Colleges & Universities (AAC&U) multi-state collaborative; student perception feedback (e.g., a quarterly College 110 student survey, the Community College Survey of Student Engagement (CCSSE), the Survey of Entering Student Engagement (SENSE), and student focus groups); the College Spark Guided Pathways Initiative; program maps evaluated annually during a faculty workday focused on assessment of learning; student self-reflection at each benchmark (developed by academic counselors); and an e-portfolio system.
Specific metrics that institutions use to evaluate guided pathways initiatives include: credits to degree; first-year retention; graduation; completion; course success; credit accumulation (both making consistent progress and accumulating fewer total credits to complete); post-graduation employment; transfer; continuous enrollment in math and English until general education requirements have been completed; shorter time to degree; completion of college-level English and math within the first year of enrollment; admissions conversion rates; and an increase in the number and percentage of students who have chosen a program pathway within two quarters or 30 credits.
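For illustration, the sketch below computes one of the metrics named above, completion of college-level English and math within the first year of enrollment, from a hypothetical course-records table. The column names and the three-term (quarter system) definition of “first year” are assumptions, not a standard definition.

```python
import pandas as pd

# Hypothetical course-level records; column names are illustrative only.
records = pd.DataFrame({
    "student_id":        [1, 1, 2, 2, 3],
    "subject":           ["ENGL", "MATH", "ENGL", "MATH", "ENGL"],
    "college_level":     [True, True, True, False, True],
    "passed":            [True, True, True, True, True],
    "terms_since_entry": [1, 2, 1, 1, 4],  # terms 1-3 taken as "first year"
})

# A course counts only if it is college-level, passed, and taken in the first year.
first_year = records[
    records["college_level"]
    & records["passed"]
    & (records["terms_since_entry"] <= 3)
]

# A student meets the metric by completing BOTH subjects within the first year.
subjects_completed = first_year.groupby("student_id")["subject"].agg(set)
met = subjects_completed.apply(lambda s: {"ENGL", "MATH"} <= s)

rate = met.sum() / records["student_id"].nunique()
print(f"College-level English + math completed in year one: {rate:.0%}")
```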
Institutions implementing “co-remediation” were asked:
“How does your institution evaluate the success of Co-Remediation (evidence-based approach)?”
Methods for evaluating the success of co-remediation initiatives include evaluating transcript data, the Integrated Basic Education and Skills Training (I-BEST) model, and enrolling students in a corequisite support class.
Specific metrics that institutions use to evaluate co-remediation initiatives include: pass rates in key gateway courses, an increased percentage of first-year students who successfully complete math requirements, time to graduation, student performance in subsequent courses, DFW rates, enrollment trends, grades earned by those enrolled in co-remediation courses compared with those in traditional math and English courses, grades earned by those in co-remediation courses compared with pre-university course grades, and year-to-year success rates.
Institutions implementing “competency-based” programs were asked:
“How does your institution evaluate the success of its Competency Based approach? (evidence)”
Methods for evaluating the success of competency-based programs include: student feedback within the class and course evaluations, institutional accountability plan, surveys, in-person and phone interviews, and evaluating portfolios.
Specific metrics that institutions use to evaluate competency-based programs include: graduation, licensure/certification exams, placements, retention, credentials awarded, student satisfaction measures, and enrollment and final grades over 3 semesters.
All institutions were asked:
“What are the most challenging aspects of data (collection, analysis, reporting) for your institution?”
We received strikingly similar descriptions of common data challenges from most institutions: data efforts are excessively time-consuming; data is siloed, decentralized, and has multiple ‘owners’; fragmented data efforts; limited data availability; lack of best practices for analyzing both qualitative and quantitative data; questionable data quality or accuracy; outdated data systems and tools; insufficient data staffing; unclear data collection (e.g., what, how, sample size, definitions, process, tools); lack of resources for data preparation; inconsistent data definitions (e.g., different definitions of ‘success’); outdated IT infrastructure; difficulty integrating different data systems and multiple platforms; lack of data literacy; unclear data governance; balancing internal data requests with external reporting requirements; missing data; inconsistent (or non-existent) data workflows and processes; keeping up with industry technology, infrastructure, skills, and talent; lack of resources to pay for data skill sets; difficulty retaining data staff; and lack of formal roles for data analysts. These concerns were consistently cited across institutions of all sizes and types.
We appreciated the candid and extremely informative responses to this question, which reflected the real-world challenges in working with data that nearly all institutions face. You will likely find your own experience reflected in the Bonus Section[iii] at the end of this article, which presents these commonly experienced challenges in institutions’ own words and also informs our work.
The “Technical Resources” section of the IDCS focused on the specific technical resources (data tools) used by our member institutions to manage and work with institutional data. The purpose of this section was to understand the range of technical resources and identify areas for support.
Nearly all (99%) institutions are using a Student Information System (SIS):
The most widely used student information systems are Banner (31%) and PeopleSoft (28%). Nearly equal numbers of institutions use Jenzabar (11%) and Colleague (10%). Notably, one institution reported maintaining a 45-year-old mainframe computer while another reported not using a student information system at all.
Nearly all (98%) institutions are using a Learning Management System (LMS):
Canvas (53%) is the most widely used learning management system followed by Moodle (25%), Blackboard Learn (10%), and Brightspace D2L (6%). One institution reported using a “home grown LMS” and three institutions reported not using any learning management system at all.
Among those maintaining an on-premises database, the majority are using Microsoft SQL Server (63%) followed by Oracle Database (31%) and MySQL (15%).
Institutions are almost evenly split between those using a data warehouse (48%) and a slight majority (52%) that are not. The most widely used data warehouse tool is MS SQL Parallel Data Warehouse (8%), closely followed by Microsoft SQL Server (7%) and Oracle Autonomous Data Warehouse (6%).
Most (67%) institutions reported not maintaining a cloud database. Among the institutions that do, however, usage is relatively evenly distributed among Amazon Relational Database Service (9%), Microsoft Azure SQL Database (8%), and Oracle Cloud Database (8%).
Survey results regarding the technical resources used by our member institutions are helpful for planning future programming, annual conference topics/speakers, and other initiatives. They also help us understand the technical capacity of our member institutions to implement high-quality data analytics in support of evidence-based approaches to achieving equitable outcomes.
For example, a “modern data stack” in most industries consists of data sources such as an application database, a customer relationship management system, and advertising platforms, from which data flows into a data warehouse or data lake. An “extract-load-transform” (ELT) process then prepares the data for business intelligence (BI) tools supporting data analytics/visualization, ad-hoc analyses, and (increasingly, within higher education) artificial intelligence/machine learning.
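A minimal sketch of that extract-load-transform flow is below, using only Python’s standard library, with an in-memory sqlite3 database standing in for the warehouse; the table and column names are assumptions for illustration.

```python
import csv
import io
import sqlite3

# Extract: pull raw rows from a source system (a CSV export stands in here).
raw = io.StringIO("student_id,term,credits\n1,2020FA,15\n2,2020FA,9\n")
rows = list(csv.DictReader(raw))

# Load: land the raw data in the warehouse unchanged (sqlite3 as a stand-in).
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE raw_enrollment (student_id TEXT, term TEXT, credits INTEGER)")
wh.executemany("INSERT INTO raw_enrollment VALUES (:student_id, :term, :credits)", rows)

# Transform: reshape inside the warehouse into an analysis-ready table
# that a BI or visualization tool could query directly.
wh.execute("""
    CREATE TABLE term_load AS
    SELECT term,
           COUNT(*)            AS students,
           AVG(credits)        AS avg_credits,
           SUM(credits >= 12)  AS full_time
    FROM raw_enrollment
    GROUP BY term
""")
print(wh.execute("SELECT * FROM term_load").fetchall())
```

The defining choice in ELT (versus the older extract-transform-load pattern) is that raw data lands in the warehouse first and is transformed there, which keeps the original records available for new questions later.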
This data also provides insight into potential institutional data capacity gaps among our member institutions for which we may be able to provide resources through grant-funded initiatives such as the PDP Accelerator, which is currently serving a cohort of about 50 member institutions that are working toward onboarding to the Postsecondary Data Partnership.[iv]
We encouraged participation from relevant stakeholders across the institution in completing the IDCS, including staff from institutional research, information technology, human resources, assessment/effectiveness, accreditation, data governance, data ethics, and data analytics teams.
In our next issue, we will discuss IDCS results regarding data analytics and data staffing (HR).

Bonus Section: selected responses, in institutions’ own words, describing the most challenging aspects of data collection, analysis, and reporting.
“We are a data centered institution. Our challenges are similar to many institutions our size. Data quality is our largest challenge. We have a central information system that has been in place many years and we are generations from the original implementation so we have evolved in ways that the system was not designed and therefore data quality is challenged. Many users are narrowly trained and have little understanding of the interfunctionality of the system. This delays reporting and analysis because data needs to be cleaned and contextualized prior to higher level review.”
“We are currently transitioning from a legacy ERP to a more modern data system. At the same time we are also doing rapid work to develop a culture around utilizing data to inform our decision-making process. This is both an opportunity and a challenge in that staff are being asked to configure a new system in ways that impact what data will be collected and how it will be available while simultaneously learning how data can be utilized.”
“One challenge area has been the collection of student stories. In an effort to strengthen our assessment efforts, the collection of student stories has been added to our strategic plan. We are actively working on best practices of how to collect these stories.”
“Agreeing on the right questions to ask and setting the priority to persist in asking those questions; having the right data for complex questions like ‘what did your students learn’; integration of data from multiple systems and data sets across the institution, some siloed, some very old, some very new (enterprise-level, department-level, office-level); data management (getting clean and clear data suitable for analytical use); reconciling data definitions (every external requestor has their own ‘spin’; every internal requestor has their own ‘spin’); reconciling changes in definitions over time; maintaining continuity in the face of institutional change; integration and liability around data that come from outside sources (like state government unemployment records); security infrastructure to ensure that data can be shared only as is appropriate”
“Data definitions, consistency, analysis, collection, organization, and use are the most challenging aspects. This is so for multiple reasons, but in part because the IR team lacks resources such as the software tools, training, sufficient staffing, and time to provide routine, clear access to standard data points in a stable location. We need dashboards to review standard data points and to communicate effectively about how to identify and close information gaps. Using data to close equity gaps, improve the institution and for decision making is another challenge and growth opportunity.
Generally, we have sufficient data, but it is often incomplete, inconsistent, or unavailable and is usually under the direct control of units across the college, who may not follow consistent procedures or document their methods. Some data is not collected for the purposes of research but instead to support ongoing business practices. Data collection is also challenging because it relies on self-reporting.
For internal reporting we need access, and clear direction from leadership. External reporting requires staff time, which is at capacity. Broad-based data literacy is an issue in order to organize, analyze and use data meaningfully. We need to align expectations and assumptions with the reality of institutional research.”
“We struggle the most with collection, due to the many different avenues of data collection (e.g., different systems, individuals’ personal spreadsheets). We do not have a single comprehensive system or a set of systems that easily talk to each other.”
“We have a good collection, analysis, and reporting structure, but education is our next challenge: ‘What does the data mean?’ ‘Why does this data not match that data?’ “
“1. The student information system is different than the human resources and finance system. This makes it difficult to look at the business side of instructors to classes with our disparate information systems.
2. The non-credit side of our institution is housed in a different platform than the credit side, making it hard to track our entire student body.
3. With respect to HSI data, the current issues surrounding DACA students’ fears around participating in the Census are all challenging aspects of collecting data, especially concerning our undocumented and Hispanic student populations.”
“Being a community college, we have students entering our system at pre-college, GED/ESL, community education, and collegiate level. Data needs vary at each entry point and become difficult to manage when a student transitions from one entry point to another. This creates incomplete data sets as a student for instance transitions from community education to collegiate.”
“Some data collection is difficult because the college’s legacy data systems do not have fields for all the information that the college would like to collect. Some of that data is needed for better decision making and predictive analytics.
Data quality is also an issue because the college’s legacy system does not have field-level controls; definitions have changed over the years without documentation; and processes have not been consistently followed across the organization. As a result, some longitudinal data is almost impossible to read or use because codes have changed; data is inconsistent and messy; and some data elements have been used differently, over time, for different purposes.
Data reporting is also fairly comprehensive, but our college has been challenged with data literacy. Another challenge is forming evaluation and assessment plans before implementing student success initiatives. As expected, it is difficult to assess the impact of something if proper evaluation steps were not first considered and implemented.”
“We don’t have an Institutional Research Office or a designated full-time IR person (one mid-level administrator currently has 1/6 of his load allocated for institutional research). While the university realizes IR is an important need, intentions to create a staff level position to assist with this have been put on hold the past couple of years due to budget constraints. For the most part, data collection is decentralized on our campus, although in the last several years we have made efforts to make data more widely available and easily accessible to decision-makers on campus. In addition to personnel, perhaps the most challenging aspect of data collection, analysis, and reporting is the limitations of our ERP. We are currently using … (a system no longer being actively developed by our vendor) and we do not have a data warehouse, so our ability to extract live data and create data reports and visualizations is limited. We also have many different data repositories and they don’t all talk to each other and frequently contain conflicting and different information, which at times leads to a lack of trust in our data.”
“Limited human resource capacity; lack of real-time data availability; lack of data warehouse and working without data governance and standards. Multiple reporting options and constantly changing approaches to data views.”
“Limited collection of data on student demographics/profile (e.g. race, ethnicity, socio-economic indicators), which is an issue in all [similar] institutions.
– Having several independent data sources to meet the needs of individual business units creates a challenge of consolidating information and ensuring consistency across the institution.
– Data privacy creates a challenge around how much and what types of information can be published.
– Inconsistent business practices across different units are also a big challenge, e.g., departments use course waitlists in different ways, so it is impossible to gauge student demand.
– Understanding the business practice of how data enters the information system. Without a complete understanding of a department’s business practice, it is easy to misinterpret data.
– Inconsistent data definitions across business units e.g. a part-time student is defined differently for different purposes or by different departments. Some of it may be reduced with implementing a standard data dictionary.”
“Disaggregation of data is challenging because of our small size. Many of our demographic and other groupings yield small numbers of students, and as a result, several years’ worth of data is required in order to have meaningful information.”
“The growing volume of data; matching up data resources with the institutional growth. Keeping up with high demand for data analysis, reporting and visualization”
“1) Lack of institutionalized Data Governance and Data Management to maintain consistent data entry and information system definitions. Currently working on implementing and addressing college-wide, establishing a Data Governance Committee, and creating a Data Catalog.
2) A history of bottom-up and silo-ed approaches to data usage that do not align with strategic planning and strategic institutional priorities and initiatives.
3) Prior to 2019, limited resources to provide data reports and insights. Now working on increasing data literacy across different levels of the organization and improving data management and usage for decision support.”
“Niche data storage by departments in the institution and data quality concerns when new systems are implemented with insufficient planning.”
“Most of our institution’s challenges relating to data are connected to our student information system (SIS). Our SIS is predominately designed for for-profit proprietary schools, and we are one of the few traditional undergraduate institutions utilizing the product. In conjunction with that, there has been a lack of training with the SIS for new users, which leads to a long and steep learning curve, and, at times, an over-reliance on more adept users of the system. Reports are exported to Excel and often must be combined with other database exports (few reports are customizable), and as such, users with little experience or technical knowledge of Excel are often frustrated and then rely heavily upon a few experienced users.
In addition, at times there appears to be a lack of vision/communication between departments regarding data requests and usage. We often receive multiple requests for information that are very similar but just different enough that the same data cannot be used for both requests; hence, a doubling of work often occurs.”
“We have limited [human resources] (i.e., just one research analyst) to analyze and report the data for our college. Also, it will be more challenging and time-consuming, and less meaningful (e.g., colleges will have inconsistencies in coding and underlying institutional processes), to compare and benchmark our indicators with regional and national peer institutions.”
“The most challenging has been our institutional technology’s infrastructure and ability to provide the data in a timely manner with limited resources. It would be a better system for the institutional research department to have direct access to the data sources on campus but all sources are currently managed by the IT department.”