Your Science Results Explained, The Wrap Up

In this series of posts, we’ll be breaking down the first MoonMappers paper by Robbins et al. showcasing YOUR work. 

Read Part 1, Part 2, Part 3, Part 4, and Part 5.

I promised you all one last post on the Moon Mappers science results, and it’s been a while! I just thought I’d wrap up some of the highlights of the paper by Robbins et al. that looked at a huge chunk of Moon Mappers data processed by you, the citizen scientists.

  • Counting craters is important to understanding the history of a planetary surface, but it is NOT so straightforward. There was a great discussion of this topic on the @realscientists Twitter feed back in July when Meg Rosenburg took over.
  • This study showed a 10–35% difference among expert crater markers, with the discrepancies growing larger as the terrain becomes more complex.
  • The volunteers using the Moon Mappers interface did, on average, about as well as the experts, except near the minimum diameter of the mapping tool, where identification becomes much more difficult.
  • Individual experts are more consistent and thorough than individual volunteers, but the sheer number of volunteers makes up for that.
  • Error bars on crater counts and surface ages in previous studies may be too small, because they do not take into account the biases of individual analysts.

Good work, mappers!

This series of posts is about the CosmoQuest publication: Stuart J. Robbins, Irene Antonenko, Michelle R. Kirchoff, Clark R. Chapman, Caleb I. Fassett, Robert R. Herrick, Kelsi Singer, Michael Zanetti, Cory Lehan, Di Huang, Pamela L. Gay, The variability of crater identification among expert and community crater analysts, Icarus, Available online 4 March 2014, ISSN 0019-1035, http://dx.doi.org/10.1016/j.icarus.2014.02.022 (paywall) 

FREE, open access preprint on arXiv – http://arxiv.org/abs/1404.1334
