Citizen science, computing, and conservation: How can “Crowd AI” change the way we tackle large-scale ecological challenges?
DOI: https://doi.org/10.15346/hc.v8i2.123
Keywords: applications, interfaces, techniques, interdisciplinary collaboration
Abstract
Camera traps (remote cameras that capture images of passing wildlife) have become a ubiquitous tool in ecology and conservation. Systematic camera trap surveys generate ‘Big Data’ across broad spatial and temporal scales, providing valuable information on environmental and anthropogenic factors affecting vulnerable wildlife populations. However, the sheer number of images amassed can quickly outpace researchers’ ability to manually extract data (e.g., species identities, counts, and behaviors) from these images in timeframes useful for making scientifically guided conservation and management decisions. Here, we present ‘Snapshot Safari’ as a case study for merging citizen science and machine learning to rapidly generate highly accurate ecological Big Data from camera trap surveys. Snapshot Safari is a collaborative cross-continental research and conservation effort with more than 1,500 cameras deployed across over 40 protected areas in eastern and southern Africa, generating millions of images per year. As one of the first and largest-scale camera-trapping initiatives, Snapshot Safari spearheaded innovative developments in citizen science and machine learning. We highlight the advances made and discuss the issues that arose when using each of these methods to annotate camera trap data. We end by describing how we combined human and machine classification methods (‘Crowd AI’) to create an efficient integrated data pipeline. Ultimately, by using a feedback loop in which humans validate machine learning predictions and machine learning algorithms are iteratively retrained on new human classifications, we can capitalize on the strengths of both methods of classification while mitigating their weaknesses. Using Crowd AI to quickly and accurately ‘unlock’ ecological Big Data for use in science and conservation is revolutionizing the way we take on critical environmental issues in the Anthropocene era.
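The feedback loop described above can be made concrete with a small sketch. The following Python example is illustrative only and is not the Snapshot Safari implementation: the confidence threshold and the helper functions `classify_batch`, `request_volunteer_labels`, and `retrain` are hypothetical stand-ins for a trained species-classification model, a citizen-science platform, and a model-retraining step.

```python
# Minimal sketch of a "Crowd AI" feedback loop (illustrative only; the
# threshold and the helper functions below are hypothetical stand-ins,
# not the Snapshot Safari pipeline).
import random
from dataclasses import dataclass
from typing import Dict, List, Tuple

CONFIDENCE_THRESHOLD = 0.90  # assumed cutoff for accepting a machine label

@dataclass
class Image:
    image_id: str
    label: str = ""   # final accepted species label
    source: str = ""  # "machine" or "crowd"

def classify_batch(images: List[Image]) -> Dict[str, Tuple[str, float]]:
    """Stand-in for a trained classifier: returns (predicted species, confidence)."""
    species = ["wildebeest", "zebra", "lion", "empty"]
    return {im.image_id: (random.choice(species), random.random()) for im in images}

def request_volunteer_labels(images: List[Image]) -> Dict[str, str]:
    """Stand-in for a citizen-science platform: returns consensus labels."""
    species = ["wildebeest", "zebra", "lion", "empty"]
    return {im.image_id: random.choice(species) for im in images}

def retrain(training_set: List[Image]) -> None:
    """Stand-in for retraining the model on newly validated images."""
    print(f"Retraining on {len(training_set)} human-validated images")

def crowd_ai_round(unlabeled: List[Image], training_set: List[Image]) -> None:
    """One iteration: the machine labels confident images, the crowd handles
    the rest, and the model is retrained on the new human classifications."""
    predictions = classify_batch(unlabeled)
    uncertain: List[Image] = []
    for im in unlabeled:
        label, confidence = predictions[im.image_id]
        if confidence >= CONFIDENCE_THRESHOLD:
            im.label, im.source = label, "machine"
        else:
            uncertain.append(im)
    consensus = request_volunteer_labels(uncertain)
    for im in uncertain:
        im.label, im.source = consensus[im.image_id], "crowd"
    training_set.extend(uncertain)  # human classifications feed the next model
    retrain(training_set)

if __name__ == "__main__":
    batch = [Image(image_id=f"img_{i:04d}") for i in range(100)]
    crowd_ai_round(batch, training_set=[])
```

The design point is the routing rule: confident machine predictions are accepted directly, low-confidence images are sent to volunteers, and the resulting human classifications are added to the training set so the model improves with each iteration.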