Many instructors say that they want their students to learn critical thinking skills. But critical thinking can be challenging to define and even more challenging to measure. A recent journal article accomplished both. “Teaching Critical Thinking” was the paper discussed in the April Pedagogical Innovations Journal Club. The 2015 paper, by N. G. Holmes, Carl E. Wieman (Nobel Prize winner in Physics), and D. A. Bonn, was published in the Proceedings of the National Academy of Sciences.
The take-home message of this paper is that if we want our students to become critical thinkers, we must train them in critical thinking. This includes defining and modeling critical thinking and then allowing students to practice the process with feedback. In this case, critical thinking was defined in the context of data collection in a physics lab, though the authors argue that these approaches can apply to any field that deals with data and models.
Students were taught a framework to apply when analyzing the data that they collected in the lab. That framework was:
- Make a comparison
- Reflect on the comparison
- Act on the comparison
In this model, making a comparison means comparing student-collected lab data either to other data they collected using a different method to measure the same phenomenon, or to a model. After that comparison, the authors wanted students to propose changes to their experimental methods and then act on those proposed changes. The authors noted that, in their experience, when there was a discrepancy between measured data and a model, students were more likely to conclude that their data were incorrect than to consider that the model might be incorrect. The instructors wanted students to regularly entertain the possibility that a model may be wrong. As a first step, the authors wanted students to repeat their measurements, which would help them identify sources of variability and systematic error. To support students in doing this, the instructors taught various statistical analysis methods designed to provide information about the variability of their data. Students were also given heuristics to help them interpret their results. The heuristics took the form of a decision tree that laid out possible interpretations of their statistical analyses, what those interpretations might mean about their data, and the next steps they could take based on each interpretation. Below is one example of a decision tree applied to the output of one of the statistical analyses.
- Determine chi-square for fitting student-measured data to a model.
- A low chi-square may indicate that data are in agreement with the model or that uncertainties in measurement were overestimated.
- A high chi-square may indicate that there are problems with the data measurement or that the model may be incorrect.
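The decision tree above can be sketched in code. This is a minimal illustration, not the authors’ actual course materials: the use of a reduced chi-square statistic and the cutoff values `low=0.5` and `high=2.0` are assumptions chosen for the example.

```python
import numpy as np

def reduced_chi_square(measured, model, uncertainties, n_params=0):
    """Reduced chi-square comparing measured data to model predictions."""
    measured = np.asarray(measured, dtype=float)
    model = np.asarray(model, dtype=float)
    uncertainties = np.asarray(uncertainties, dtype=float)
    residuals = (measured - model) / uncertainties
    dof = len(measured) - n_params  # degrees of freedom
    return float(np.sum(residuals**2) / dof)

def interpret(chi2_reduced, low=0.5, high=2.0):
    """Heuristic decision tree; thresholds are illustrative, not from the paper."""
    if chi2_reduced < low:
        return ("Low chi-square: data agree with the model, or uncertainties "
                "were overestimated -> re-examine your error estimates.")
    if chi2_reduced > high:
        return ("High chi-square: possible problems with the measurements, "
                "or the model may be incorrect -> repeat measurements and "
                "question the model.")
    return "Chi-square near 1: data and model are consistent at the stated uncertainties."

# Hypothetical example: four repeated measurements vs. a constant model prediction
measured = [2.01, 1.98, 2.05, 1.97]
model = [2.00, 2.00, 2.00, 2.00]
sigma = [0.02, 0.02, 0.02, 0.02]

chi2 = reduced_chi_square(measured, model, sigma)
print(round(chi2, 2), interpret(chi2))
```

The point of the sketch is the branching, not the thresholds: each statistical outcome maps to a concrete next step, which is what makes the heuristic followable for students.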
Students in the experimental section were taught these methods in week 2 of the lab course and were graded on their use of the methods as evidenced by their lab reports. Specifically, the authors examined student lab notebook reports to determine whether students in the experimental and control sections had either proposed a change to their methodology or had proposed a change and actually carried it out. Students in the experimental section were much more likely to propose, or to propose and carry out, changes in their methodology than students in the control section. The authors then deliberately stopped instructing students in the experimental section in this method over the course of the semester (fading of directions) to see whether students continued using this critical thinking approach. They found, once again, that students in the experimental section at weeks 16 and 17 were more likely to propose, or to propose and carry out, changes in their methodology than students in the control section.
Transfer of knowledge and skills to new situations is really what most of us are looking for in our students’ learning. The authors found that these critical thinking skills transferred to the following year’s physics lab. A subset of students from the experimental class went on to take the subsequent year’s lab course, and these students were more likely to propose, or to propose and carry out, changes in their methodology than classmates who had not been in the experimental group.
What does this mean for our teaching?
Define critical thinking for your field: Begin by describing a process of critical thinking in your field. Articulating exactly what you mean by critical thinking may not be easy, but once you have defined it, it will be easier to determine methods for teaching it. Below are some generic examples of what critical thinking may look like in different fields.
Measure a phenomenon in different ways using different methods, compare the results, and address the following questions:
- Why are the findings different?
- Which method produces the most reliable results?
- What are some ways to address these differences?
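A comparison like this can be made quantitative by asking how far apart two measurements are relative to their combined uncertainties. The sketch below is a hypothetical illustration of that idea; the example values and the agree/disagree cutoffs of 1 and 3 are assumptions, not anything prescribed in the article.

```python
import math

def t_score(value_a, sigma_a, value_b, sigma_b):
    """Separation of two measurements in units of their combined uncertainty."""
    return abs(value_a - value_b) / math.sqrt(sigma_a**2 + sigma_b**2)

# Hypothetical example: the same quantity measured with two different methods
t = t_score(4.8, 0.1, 5.1, 0.1)

if t < 1:
    print("Measurements agree within their uncertainties.")
elif t > 3:
    print("Measurements disagree: look for a systematic error in one method.")
else:
    print("Inconclusive: repeat the measurements to reduce the uncertainty.")
```

Each outcome again points to an action (accept, investigate, or re-measure), which mirrors the compare–reflect–act cycle described earlier.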
Draw a conclusion based on incomplete information, or propose a solution to a complex problem that doesn’t have an obvious correct solution, and address the following questions:
- What are the strengths and weaknesses of your proposal?
- How would you defend your proposal to someone else?
- What new information would make you believe your model might be wrong?
Teach critical thinking to your students and measure their mastery of it. Below are some aspects of the teaching strategies used in the article that probably contributed to the success of their approach:
- Instructors modeled the desired behavior to students. You can do this by deliberately sharing with students how you approach certain problems in your field.
- Instructors provided students with a decision tree. Making the process concrete makes it easier for students to follow it correctly.
- Instructors assigned points to the activities and assignments that asked students to display their critical thinking skills. Points are typically how students gauge what is important to you.
- Instructors provided feedback to students on their critical thinking. By attaching points to the assignments and returning them to students, instructors gave concrete feedback on students’ critical thinking skills. This tells students how they are doing and helps them improve if they are having difficulty.
One of the audience members summarized their feelings about the approach used in this article by saying “Basically students will do what we tell them to do.” Yes! Exactly! A question to ask ourselves is this: Are we telling students to do what we think is most important when it comes to developing their critical thinking?