Volume 8: Common Mistakes: Data Triangulation and Analysis

scott's thoughts Dec 12, 2023

Welcome back! As we continue our discussion of the common mistakes that can lead to observations and citations in your program’s SSR, let’s have a look at some observation language. This is what we receive from the ARC-PA committee when a program’s conclusions, action plans, strengths, and areas needing improvement are not considered to be adequately data-driven:

“The program identified six modifications, three of which were not the result of documented data analysis (the pre-matriculation program, diverse student background assessment, and student success workshops). The program did not document consistent data for the cohorts, providing one set of data for one cohort but a different set of data for another, making it impossible to document critical analysis of the data.”

“The program did not consistently incorporate or reference relevant data from other areas of the SSR to support relationships or correlational analysis.”

“Areas in need of improvement/modifications/strengths do not have triangulation of data to support the action plan.”

“The program did not provide evidence that its conclusions, strengths, and areas in need of improvement provided as examples of ongoing program self-assessment were the result of performing critical analysis of the data. Missing data did not allow the program to fully assess any aspect of the program…there was no onsite documentation to verify the implementation of data analysis as part of the program’s self-assessment process.”

You may have noticed, however, that in some cases the problem is not that the program lacks the data or fails to use it, but that the SSR does not properly describe the methodology. You cannot assume that the committee or anyone else can follow your reasoning if it is not clearly set out in the SSR.

Definition of critical analysis

Critical analysis is a process of evaluating, interpreting, and examining a subject, text, or idea in a thorough and objective manner, and involves assessing the strengths, weaknesses, implications, and underlying assumptions to form a well-informed and reasoned judgment or critique. 

Critical analysis often explores the context, evidence, and logic behind the subject matter to gain a deeper understanding and provide valuable insights or constructive feedback. 

Critical analysis is the process; what you uncover through it is the product. The trick is to articulate both in your SSR and at the site visit. My recommendation is that if you’re not collecting all of this data now, you should start, even if you are several years out from a site visit. You’ll be glad to have those action plans recorded in real time.

Establish criteria for triangulation of data throughout the SSR. Consider including this as an additional section within your narrative.

Tackling triangulation: a template

Implement a program process that triangulates data within each element of the SSR, and include it as part of each section of the SSR’s analysis. You can also create a triangulation strategy protocol to guide you; I have included my own template here.

This is an example of a triangulation strategy. If a course evaluation came in below benchmark, where would you go next to triangulate? You might go to the course director evaluation. You might then look at faculty and staff sufficiency and effectiveness: perhaps you are short-staffed, people are stretched thin, and things are being done ineffectively. You might also look at PANCE or PACKRAT scores to see whether there is a correlation there.
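To make that chain of evidence concrete, here is a minimal sketch in Python of how a program might write down its triangulation strategy protocol so that every below-benchmark trigger points to the corroborating data sources to review before acting. This is purely illustrative; the trigger names, data sources, and structure are hypothetical placeholders, not ARC-PA terminology or my actual template.

# Minimal sketch of a triangulation strategy protocol.
# All trigger names and data sources below are hypothetical examples.

TRIANGULATION_PROTOCOL = {
    "course_evaluation_below_benchmark": [
        "course director evaluation",
        "faculty and staff sufficiency and effectiveness review",
        "PACKRAT scores for the related content area",
        "PANCE performance for the related task area",
    ],
    "packrat_organ_system_below_national_average": [
        "EOC exam results for the same organ system",
        "PANCE task-area performance",
        "course evaluations for the courses covering that system",
    ],
}

def triangulation_sources(trigger: str) -> list[str]:
    """Return the corroborating data sources to review before acting on a trigger."""
    return TRIANGULATION_PROTOCOL.get(trigger, [])

if __name__ == "__main__":
    for source in triangulation_sources("course_evaluation_below_benchmark"):
        print("Review:", source)

Writing the protocol down in one place, whatever the format, makes it much easier to show a site visitor that the same triangulation steps were followed for every trigger.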

Triangulation of data: an example

Whenever there is an actionable below-benchmark indicator, additional data must be used to verify and validate the legitimacy of that finding before making a significant modification or designating an area in need of improvement. 

For example, let us say a program annually generates a multi-variable exhibit comparing students’ PACKRAT performance against the national average, longitudinally and across multiple instruments. This aligns with the competencies by evaluating how well the program prepares students for the specific task areas defined by the PAEA/NCCPA examinations, which is done by comparing results across the PACKRAT, EOC, and PANCE. Over a three-year period, any organ system or task area that fell below the national average on all instruments is considered an area in need of improvement. These results are then evaluated against program instructional objectives to ensure coverage of all content areas, and course evaluations and exit surveys are reviewed for analogous data.
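As a rough illustration of that decision rule, the sketch below (Python; the instruments, task areas, and scores are invented placeholders, not real program data) flags a task area as needing improvement only when it falls below the national average on every instrument in every year of the three-year window.

# Hypothetical example: flag task areas below the national average
# on ALL instruments across a three-year window.
# program_scores[year][instrument][task_area] = (program mean, national mean)

program_scores = {
    2021: {"PACKRAT": {"Cardiology": (68, 72), "Pulmonology": (75, 71)},
           "EOC":     {"Cardiology": (70, 74), "Pulmonology": (77, 73)}},
    2022: {"PACKRAT": {"Cardiology": (69, 73), "Pulmonology": (72, 70)},
           "EOC":     {"Cardiology": (71, 75), "Pulmonology": (74, 72)}},
    2023: {"PACKRAT": {"Cardiology": (70, 74), "Pulmonology": (76, 72)},
           "EOC":     {"Cardiology": (72, 76), "Pulmonology": (78, 74)}},
}

def areas_needing_improvement(scores: dict) -> list[str]:
    """Return task areas below the national average on every instrument in every year."""
    task_areas = {area
                  for instruments in scores.values()
                  for results in instruments.values()
                  for area in results}
    flagged = []
    for area in sorted(task_areas):
        below_everywhere = all(
            results[area][0] < results[area][1]
            for instruments in scores.values()
            for results in instruments.values()
            if area in results
        )
        if below_everywhere:
            flagged.append(area)
    return flagged

if __name__ == "__main__":
    print(areas_needing_improvement(program_scores))  # ['Cardiology'] with the data above

However you perform it (spreadsheet, script, or by hand), the point is that the rule is explicit and repeatable, so the SSR can state exactly why an area was flagged and what additional data corroborated it.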

Coming next week:

There is another piece to an excellent SSR, and that is the involvement of your faculty. In my next blog, I’ll discuss how to get your faculty on the same page with the SSR’s language and purpose, and well prepared for the questions that come with a site visit.
