GUEST POST: What 130,000 Student Questions to AI Reveal About Critical Thinking
Muireann Hendriksen is a Principal Research Scientist on the R&D and Thought Leadership team at Pearson, where she leads cross-functional qualitative research to improve learner outcomes. With a background spanning academia and the public health sector, Muireann specializes in impact evaluation and behaviour change, bringing deep expertise in qualitative methodologies and data storytelling to drive better product and business decisions.
The conversation around generative AI in education is often dominated by a sense of apprehension. Will these powerful new tools encourage students to take shortcuts, offload their thinking, and sidestep the valuable productive struggle that leads to genuine learning? Recent studies have highlighted the risk of diminished critical thinking when students rely on mainstream AI chatbots (1). But what if the story is more nuanced? What if, when designed thoughtfully for learning, AI could act not as a shortcut, but as a partner in curiosity?
Our latest research suggests that this more optimistic scenario is not only possible, but already happening. In a large-scale analysis of student interactions with an AI-powered study tool, we found encouraging evidence that students are using AI to build, rather than bypass, their critical thinking skills (2).
As learning scientists, we know that the act of asking questions is a powerful catalyst for learning. Formulating a question forces us to engage with material, organize our thoughts, connect new information with prior knowledge, and identify gaps in our understanding. This process of inquiry is fundamental to cognitive growth and a cornerstone of active learning (3).
With the rise of generative AI, students now have a new and powerful outlet for their questions. To understand how they are using these tools, my colleague Dr. Emily Lai and I analyzed tens of thousands of student interactions with an AI-powered study tool. Our findings offer an enlightening and, frankly, hopeful glimpse into the modern learning process, suggesting that students can build their critical thinking skills with the help of AI.
A Window into Student Inquiry
Our research, detailed in the report Asking to Learn, examined nearly 130,000 anonymized queries from over 8,600 students using an AI study tool embedded within a digital textbook commonly used in introductory biology courses. We focused on the "Explain" feature of the tool, which invites students to ask questions in their own words, providing a direct window into their thought processes and authentic curiosity.
To analyze the cognitive depth of these questions, we used the revised Bloom's Taxonomy as our framework, categorizing each query according to its cognitive process (e.g., Remember, Understand, Analyze) and knowledge dimension (e.g., Factual, Conceptual) (4). This allowed us to move beyond what students were asking to understand how they were thinking.
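To make the framework concrete, the sketch below tags a few queries along both dimensions and tallies the results. The taxonomy labels are standard, but the example tags and the tallying code are purely illustrative; they are not the coding procedure used in the study.

```python
from collections import Counter
from dataclasses import dataclass

# The two dimensions of the revised Bloom's Taxonomy (Anderson & Krathwohl, 2001).
COGNITIVE_PROCESSES = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]
KNOWLEDGE_DIMENSIONS = ["Factual", "Conceptual", "Procedural", "Metacognitive"]

@dataclass
class TaggedQuery:
    text: str
    process: str    # one of COGNITIVE_PROCESSES
    knowledge: str  # one of KNOWLEDGE_DIMENSIONS

# Illustrative hand-tagged examples; the study classified ~130,000 real queries.
queries = [
    TaggedQuery("what are the different types of light microscopy?",
                process="Remember", knowledge="Factual"),
    TaggedQuery("can you explain cellular respiration to me",
                process="Understand", knowledge="Conceptual"),
    TaggedQuery("what might happen if the lysosome wasn't in a separate compartment?",
                process="Analyze", knowledge="Conceptual"),
]

# Tally how the queries distribute across the cognitive-process dimension.
counts = Counter(q.process for q in queries)
for level in COGNITIVE_PROCESSES:
    print(f"{level:>10}: {counts.get(level, 0)}")
```

Labeling every query on both dimensions is what makes aggregate statements possible, such as the share of queries that reach "Analyze" or higher.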
Beyond Fact-Checking: Evidence of Higher-Order Thinking
Unsurprisingly, a majority of the queries (about 80%) focused on foundational knowledge. Students asked the AI to define terms ("what are the different types of light microscopy?") or explain core concepts ("can you explain cellular respiration to me like I’m a dummy"). This is entirely appropriate for an introductory course, where building a solid base of factual and conceptual knowledge is essential (5). It shows students using the tool as intended: to reinforce their understanding of foundational concepts.
What truly excited us, however, was the proportion of questions that went deeper. Our analysis revealed that about one-third of all student inputs reflected more advanced levels of cognitive complexity, and that 20% of queries were classified at the "Analyze" level or higher, levels widely associated with critical thinking skills.
These queries were not simple requests for information. Students were asking hypothetical questions, critically assessing experimental methods or procedures, and evaluating information in complex ways. For instance:
· "What might happen if the lysosome wasn’t in a separate compartment, or if it didn’t work?"
· "How would I ‘build’ an organism to maximize its surface area to volume ratio?"
· "If you had access to a microscope, how would you differentiate endomycorrhizae and ectomycorrhizae"
These examples show students actively grappling with the material, working with AI not just to retrieve facts, but to explore concepts and test their understanding in a meaningful way. They are not passively receiving information; they are actively framing their inquiries in a way that demonstrates deep cognitive engagement (6).
Using AI to Nurture Curiosity
Inspired by these findings, our team has helped to develop a new AI feature called "Go Deeper." When a student asks a question, the tool now provides an answer and then offers a follow-up question designed to scaffold them one to two levels higher in cognitive complexity on Bloom's Taxonomy. For example, a student asking for a definition (Remember) might be prompted to describe the concept in a new context (Understand) or apply it to solve a problem (Apply). This turns a simple query into a multi-step learning journey, guiding the student toward more critical thinking without pushing them so far ahead that they become confused.
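As a rough illustration of that logic, here is a minimal sketch of a level-stepping helper. The level ordering follows the revised taxonomy, but the prompt templates, function name, and two-level cap shown here are hypothetical stand-ins, not Pearson's implementation.

```python
# The six cognitive-process levels of the revised Bloom's Taxonomy, in order.
LEVELS = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Hypothetical follow-up templates, one per target level (not the product's actual prompts).
FOLLOW_UPS = {
    "Understand": "Can you explain {topic} in your own words, with a new example?",
    "Apply": "How would you use {topic} to solve a problem you haven't seen before?",
    "Analyze": "How does {topic} relate to, or differ from, a concept you already know?",
    "Evaluate": "What evidence would convince you that a claim about {topic} is wrong?",
    "Create": "Can you design an experiment or scenario that puts {topic} to the test?",
}

def go_deeper(current_level: str, topic: str, step: int = 1) -> str | None:
    """Suggest a follow-up question one to two Bloom's levels above the current one."""
    step = max(1, min(step, 2))                       # scaffold by at most two levels
    idx = LEVELS.index(current_level)
    target = LEVELS[min(idx + step, len(LEVELS) - 1)]
    if target == current_level:                       # already at the top (Create)
        return None
    return FOLLOW_UPS[target].format(topic=topic)

# A definition-style question (Remember) gets nudged toward Understand:
print(go_deeper("Remember", "cellular respiration"))
```

Capping the jump at two levels mirrors the design goal described above: stretch the student's thinking without leaping so far up the taxonomy that the follow-up becomes confusing.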
While opinions on AI range from cautious to optimistic, our findings highlight the value of thoughtfully designed AI-powered experiences that act as formative supports. By understanding how students ask questions, we can build tools that meet them where they are and guide them toward a richer, more active, and more curious engagement with the world of knowledge.
References:
1. Kumar, H., Rothschild, D. M., Goldstein, D. G., & Hofman, J. M. (2023). Math education with large language models: Peril or promise? SSRN. https://ssrn.com/abstract=4641653
2. Hendriksen, M., & Lai, E. (2025). Asking to Learn: What student queries to Generative AI reveal about cognitive engagement. Pearson. https://plc.pearson.com/sites/pearson-corp/files/asking-to-learn.pdf
3. Chin, C., & Osborne, J. (2008). Students’ questions: A potential resource for teaching and learning science. Studies in Science Education, 44(1), 1–39. https://doi.org/10.1080/03057260701828101
4. Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (complete edition). Addison Wesley Longman.
5. Momsen, J. L., Long, T. M., Wyse, S. A., & Ebert-May, D. (2010). Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE—Life Sciences Education, 9(4), 435–440. https://doi.org/10.1187/cbe.10-01-0001
6. Maiti, P., & Goel, A. (2025, March). Can an AI partner empower learners to ask critical questions? In Proceedings of the 30th International Conference on Intelligent User Interfaces (pp. 314–324).