Journal articles about how children think and reason about data

Written by: Josh Rosenberg

Primary Source: Joshua M. Rosenberg – May 9, 2016

As the list of articles grew longer, I became reluctant to post it, but decided to anyway as a helpful exercise. I previously wrote posts about books on how children reason about data and about open datasets for educational use. Here, I wanted to post a few (okay, more than a few) of the relevant journal articles (and a few book chapters) I've tried to track down. I focused mostly on science settings.

Organization

I tried to organize them into three categories: 1) Thinking of and With Data (articles about fundamental processes of reasoning with data), 2) Tools to Support Reasoning with Data (articles about tools that support reasoning with data), and 3) Tools to Enhance Reasoning with Data (articles about tools, mostly computational, that enhance or transform reasoning with data).

To distinguish the articles in the second and third categories, I'm borrowing an idea from Salomon and Perkins (2005) and trying to sort the articles by how tools are used. (Articles in the first category focus on fundamental processes of reasoning about data and feature tools less than the others.) My argument, in short, is that computation can transform (or at least enhance in novel ways) how children reason about data. I'm making this argument because, while we have a deep and rich understanding of how children reason about data in science and other settings, we have only a few studies on how children do so with computational tools. This matters because data are increasingly ubiquitous, for better and potentially worse.

Here's a bit more on that distinction. Articles in the tools to support reasoning with data category seem to describe effects with and effects of the use of technological and other tools, whereas those in the tools to enhance reasoning with data category describe tools with the potential to transform people's cognitive and social capabilities through their use. I'm not sure how well the distinction works: there is a lot of overlap between the two categories, and it is just an organizational tool. I welcome input. Here are the references:

1. Thinking of and With Data

  • Duranti, A. (2006). Transcripts, like shadows on a wall. Mind, Culture, and Activity, 13(4), 301–310.
  • Hancock, C., Kaput, J. J., & Goldsmith, L. T. (1992). Authentic inquiry with data: Critical barriers to classroom implementation. Educational Psychologist, 27(3), 337–364.
  • Konold, C., & Pollatsek, A. (2002). Data analysis as the search for signals in noisy processes. Journal for Research in Mathematics Education, 33(4), 259–289.
  • Lee, H. S., & Hollebrands, K. F. (2008). Preparing to teach data analysis and probability with technology. Proceedings of the ICMI Study 18 and 2008 IASE Round Table Conference, 1–6. Retrieved from https://www.stat.auckland.ac.nz/~iase/publications/rt08/T3P4_Lee.pdf
  • Lehrer, R., & Romberg, T. (1996). Exploring children’s data modeling. Cognition and Instruction, 14(1), 37–41.
  • Lehrer, R., & Schauble, L. (2000). Inventing data structures for representational purposes: Elementary grade students’ classification models. Mathematical Thinking and Learning, 2(1-2), 51–74.
  • NGSS Lead States (2013). Appendix F – science and engineering practices in the NGSS. Retrieved from http://www.nextgenscience.org/sites/default/files/Appendix%20F%20%20Science%20and%20Engineering%20Practices%20in%20the%20NGSS%20-%20FINAL%20060513.pdf
  • Petrosino, A. (2003). Commentary: A framework for supporting learning and teaching about mathematical and scientific models. Contemporary Issues in Technology and Teacher Education, 3(3), 288-299.
  • Shaffer, D. W., & Kaput, J. (1998). Mathematics and virtual culture: An evolutionary perspective on technology and mathematics education. Educational Studies in Mathematics, 37(2), 97–119. http://doi.org/10.2307/3483311
  • Shaffer, D. W., & Resnick, M. (1999). “Thick” authenticity: New media and authentic learning. Journal of Interactive Learning Research, 10(2), 195.
  • Sharples, M., Scanlon, E., Ainsworth, S., Anastopoulou, S., Collins, T., Crook, C., … & O’Malley, C. (2015). Personal inquiry: Orchestrating science investigations within and beyond the classroom. Journal of the Learning Sciences, 24(2), 308-341.
  • Welser, H. T., Smith, M., Fisher, D., & Gleave, E. (2008). Distilling digital traces: Computational social science approaches to studying the Internet. In N. Fielding, R. M. Lee, & G. Blank, The SAGE handbook of online research methods (pp. 116-141). Thousand Oaks, CA: SAGE Publications, Ltd.

2. Tools to Support Reasoning with Data

  • Chin, D. B., Blair, K. P., & Schwartz, D. L. (advance online publication). Got game? A choice-based learning assessment of data literacy and visualization skills. Technology, Knowledge and Learning.
  • Hancock, C., Kaput, J. J., & Goldsmith, L. T. (1992). Authentic inquiry with data: Critical barriers to classroom implementation. Educational Psychologist, 27(3), 337–364.
  • Edelson, D. C., & Gordin, D. (1998). Visualization for learners: a framework for adapting scientists’ tools. Computers and Geosciences, 24(7), 607–616.
  • Edelson, D. C., Gordin, D., & Pea, R. (1999). Addressing the challenges of inquiry-based learning through technology and curriculum design. Journal of the Learning Sciences, 8(3-4), 391–450.
  • Edelson, D. (2001). Learning-for-use: a framework for the design of technology-supported inquiry activities. Journal of Research in Science Teaching, 38(3), 1–31.
  • Goldstone, R. L., & Son, J. Y. (2005). The transfer of scientific principles using concrete and idealized simulations. Journal of Learning Sciences, 14(1), 69–110.
  • Horn, M. S., Brady, C., Hjorth, A., Wagh, A., & Wilensky, U. (2014). Frog pond: A code-first learning environment on evolution and natural selection. Interaction Design and Children, 357–360. http://doi.org/10.1145/2593968.2610491
  • Johnson, A., Moher, T., Cho, Y. J., Edelson, D., & Russell, E. (2004). Learning science inquiry skills in a virtual field. Computers and Graphics, 28(3), 409–416.
  • Lai, K., Cabrera, J., Vitale, J. M., Madhok, J., Tinker, R., & Linn, M. C. (advance online publication). Measuring graph comprehension, critique, and construction in science. Journal of Science Education and Technology.
  • Land, S. M., & Zimmerman, H. T. (2015). Socio-technical dimensions of an outdoor mobile learning environment: a three-phase design-based research investigation. Educational Technology Research and Development, 63(2), 229–255.
  • Mandinach, E. B., & Gummer, E. S. (2012). Navigating the landscape of data literacy: It IS complex. Washington, DC, and Portland, OR: WestEd and Education Northwest.
  • Öllinger, M., Hammon, S., von Grundherr, M., & Funke, J. (2015). Does visualization enhance complex problem solving? The effect of causal mapping on performance in the computer-based microworld Tailorshop. Educational Technology Research and Development, 63(4), 621–637.
  • Quintana, C., Reiser, B., & Davis, E. (2004). A scaffolding design framework for software to support science inquiry. The Journal of Learning Sciences, 13(3), 337–386.
  • Shaffer, D. W., & Kaput, J. J. (1998). Mathematics and virtual culture: An evolutionary perspective on technology and mathematics education. Educational Studies in Mathematics, 37(2), 97–119.
  • Sonnleitner, P., Keller, U., Martin, R., & Brunner, M. (2013). Students’ complex problem-solving abilities: Their structure and relations to reasoning ability and educational success. Intelligence, 41(5), 289–305.
  • Wilkerson-Jerde, M. H., Gravel, B. E., Andrews, C., & Shaban, Y. (2016). What’s the technology for? Teacher attention and pedagogical goals in a modeling-focused professional development workshop. Journal of Science Teacher Education, 27(1), 1–11.

3. Tools to Enhance Reasoning with Data

  • Barr, V., & Stephenson, C. (2011). Bringing computational thinking to K-12: What is involved and what is the role of the computer science education community? ACM Inroads, 2(1), 48–54.
  • Dickes, A. C., Sengupta, P., Farris, A. V., & Basu, S. (advance online publication). Development of mechanistic reasoning and multilevel explanations of ecology in third grade using agent-based models. Science Education. http://doi.org/10.1002/sce.21217
  • Grover, S., & Pea, R. (2013). Computational thinking in K-12: A review of the state of the field. Educational Researcher, 42(1), 38–43. http://doi.org/10.3102/0013189X12463051
  • Hmelo-Silver, C. E., Liu, L., Gray, S., & Jordan, R. (2015). Using representational tools to learn about complex systems: A tale of two classrooms. Journal of Research in Science Teaching, 52(1), 6–35. http://doi.org/10.1002/tea.21187
  • Kaput, J. J., Noss, R., & Hoyles, C. (2008). Developing new notations for a learnable mathematics in the computational era. In L. D. English (Ed.), Handbook of international research in mathematics education: Directions for the 21st century (pp. 51–75). New York, NY: Routledge.
  • Khairiree, K., & Kurusatian, P. (2009). Enhancing students’ understanding of statistics with TinkerPlots: A problem-based learning approach. Electronic Proceedings of the Fourteenth Asian Technology Conference in Mathematics. Retrieved from http://atcm.mathandtech.org/EP2009/papers_full/2812009_17324.pdf
  • Pallant, A., & Lee, H. S. (2015). Constructing scientific arguments using evidence from dynamic computational climate models. Journal of Science Education and Technology, 24(2-3), 378–395. http://doi.org/10.1007/s10956-014-9499-3
  • Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18(2), 351–380. http://doi.org/10.1007/s10639-012-9240-x
  • Stieff, M. (2011). Improving representational competence using molecular simulations embedded in inquiry activities. Journal of Research in Science Teaching, 48(10), 1137–1158. http://doi.org/10.1002/tea.20438
  • Stieff, M., & Wilensky, U. (2003). Connected chemistry: Incorporating interactive simulations into the chemistry classroom. Journal of Science Education and Technology, 12(3), 285. http://doi.org/10.1023/A:1025085023936
  • Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2015). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25(1), 127-147. http://doi.org/10.1007/s10956-015-9581-5
  • Wilensky, U., & Reisman, K. (2006). Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories — An embodied modeling approach. Cognition and Instruction, 24(2), 171–209.
  • Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19.
  • Wilkerson-Jerde, M. H., & Wilensky, U. J. (2015). Patterns, probabilities, and people: Making sense of quantitative change in complex systems. Journal of the Learning Sciences, 24(2), 204–251. http://doi.org/10.1080/10508406.2014.976647
Joshua M. Rosenberg is a Ph.D. student in the Educational Psychology and Educational Technology program at Michigan State University. In his research, Joshua focuses on how social and cultural factors affect teaching and learning with technologies, in order to better understand and design learning environments that support learning for all students. Joshua currently serves as the associate chair for the Technological Pedagogical Content Knowledge (TPACK) Special Interest Group in the Society for Information Technology and Teacher Education. Joshua was previously a high school science teacher, and holds degrees in education (M.A.) and biology (B.S.).