Reasoning about causal mechanisms is central to scientific inquiry. In science education, it is important for teachers and researchers to detect students' mechanistic explanations as evidence of their learning, especially learning related to causal mechanisms. In this paper, we introduce a semi-automated method that combines association rule mining with human raters' insights to characterize students' mechanistic explanations in their written responses to science questions. We illustrate the method by applying it to students' written responses to a question about climate change and comparing mechanistic reasoning between high- and low-scoring student groups. Such analysis provides important insight into students' current knowledge structures and informs teachers and researchers in the future design of instructional interventions.
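To give a sense of how association rule mining could surface co-occurring ideas in coded responses, the sketch below mines simple rules from a toy dataset. The response sets, code labels (e.g. `greenhouse_gas`, `traps_heat`), and thresholds are all hypothetical illustrations, not data or parameters from the study, and the miner is a minimal brute-force version rather than the specific algorithm used in the paper.

```python
from itertools import combinations

# Hypothetical coded responses: each set holds the mechanistic-idea
# codes a human rater assigned to one student's written answer.
responses = [
    {"greenhouse_gas", "traps_heat", "temperature_rise"},
    {"greenhouse_gas", "traps_heat"},
    {"greenhouse_gas", "temperature_rise"},
    {"traps_heat", "temperature_rise"},
    {"greenhouse_gas", "traps_heat", "temperature_rise"},
]

def support(itemset, data):
    """Fraction of responses containing every code in the itemset."""
    return sum(itemset <= r for r in data) / len(data)

def mine_rules(data, min_support=0.4, min_confidence=0.7):
    """Return rules (antecedent, consequent, support, confidence)
    whose support and confidence clear the given thresholds."""
    items = sorted(set().union(*data))
    rules = []
    # Enumerate candidate itemsets of size 2 and 3 (enough for a sketch;
    # a real miner would prune candidates Apriori-style).
    for k in (2, 3):
        for combo in combinations(items, k):
            itemset = frozenset(combo)
            s = support(itemset, data)
            if s < min_support:
                continue
            # Split each frequent itemset into antecedent -> consequent.
            for i in range(1, k):
                for ante in combinations(combo, i):
                    a = frozenset(ante)
                    conf = s / support(a, data)
                    if conf >= min_confidence:
                        rules.append((a, itemset - a, s, conf))
    return rules

for ante, cons, s, conf in mine_rules(responses):
    print(f"{set(ante)} -> {set(cons)}  support={s:.2f}  confidence={conf:.2f}")
```

A rule such as `{greenhouse_gas} -> {traps_heat}` would then be shown to a human rater, who judges whether the co-occurrence reflects a genuine mechanistic link in students' explanations; comparing the rule sets mined from high- and low-scoring groups highlights which links each group tends to articulate.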