Data Analytics in Citizen Cyberscience: Evaluating Participant Learning and Engagement with Analytics

Authors

  • Oula Abu Amsha, HEIG-VD and UNIGE
  • Daniel K. Schneider, UNIGE
  • Jose Luis Fernandez-Marquez, UNIGE
  • Julien Da Costa, UNIGE
  • Brian Fuchs, The Mobile Collective
  • Laure Kloetzer, UNINE and UNIGE

DOI:

https://doi.org/10.15346/hc.v3i1.5

Keywords:

Citizen cyberscience, analytics, engagement, learning, framework

Abstract

Citizen Cyberscience (CCS) projects are online projects that engage participants, who need no prior scientific experience, in online tasks of widely varied types that contribute to scientific research across domains. Many studies confirm the usefulness of CCS projects to researchers, while less has been done to explore their added value for the participants. Specifically, we want to know to what extent CCS projects help participants learn while they carry out typically small and very specific tasks. In this work, we propose adding another source of quantitative data to the research toolbox usually used to evaluate learning in informal learning contexts such as citizen science. This data source is learning analytics, which builds on now-ubiquitous web analytics and is heavily used in a variety of online learning environments. Based on our experience with two CCS pilot projects, we created a framework that helps CCS project designers implement learning analytics properly in their projects, so as to make full use of these analytics and integrate them with other sources of quantitative data about the user experience. We apply the proposed framework to explore the interaction between learning and engagement in two pilot CCS projects of different types: volunteer thinking and gaming. We conclude with a number of recommendations to avoid pitfalls, and proposals for best practice based on our experience.
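To make the kind of instrumentation the abstract refers to concrete, the following is a minimal sketch of client-side event logging such as a CCS project might add on top of ordinary web analytics. The endpoint path, event names, and payload fields are hypothetical illustrations for this sketch, not the CCLtracker API described in the paper.

```typescript
// Minimal sketch of client-side learning-analytics event logging.
// The endpoint "/analytics/events", the event names, and the payload
// fields are hypothetical; a real project would use its own schema.

interface AnalyticsEvent {
  user: string;                     // pseudonymous participant id
  action: string;                   // e.g. "task_completed", "tutorial_viewed"
  page: string;                     // where in the project the event occurred
  timestamp: string;                // ISO 8601, set client-side
  detail?: Record<string, unknown>; // optional task-specific data
}

// Send one event to a (hypothetical) collection endpoint.
async function logEvent(
  action: string,
  detail?: Record<string, unknown>
): Promise<void> {
  const event: AnalyticsEvent = {
    user: "anon-12345", // would come from the session in practice
    action,
    page: typeof location !== "undefined" ? location.pathname : "n/a",
    timestamp: new Date().toISOString(),
    detail,
  };
  // fetch() is standard in browsers and in Node 18+.
  await fetch("/analytics/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
}

// Example: record that a participant finished a classification task.
logEvent("task_completed", { taskId: "galaxy-042", durationMs: 48000 });
```

Aggregating such events per participant over time is what lets engagement patterns (frequency, persistence) be related to indicators of learning (e.g., tutorial views, task accuracy), which is the kind of analysis the proposed framework supports.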

Author Biography

Daniel K. Schneider, UNIGE

TECFA

Published

2016-12-31

How to Cite

Abu Amsha, O., Schneider, D. K., Fernandez-Marquez, J. L., Da Costa, J., Fuchs, B., & Kloetzer, L. (2016). Data Analytics in Citizen Cyberscience: Evaluating Participant Learning and Engagement with Analytics. Human Computation, 3(1), 69-97. https://doi.org/10.15346/hc.v3i1.5

Issue

Vol. 3 No. 1 (2016)

Section

Research