Hot Topics in Citizen Science

Benefits of citizen science

There is now a wealth of evidence showing the considerable benefits of citizen and community science. These extend far beyond a focus on gathering ‘data’ to include awareness-raising, science literacy, and improved understanding of issues, among other benefits.

Researchers

Citizen science can provide researchers with an increased ability to collect novel data over greater temporal and spatial scales (including on private property). The shift towards a more collaborative and reciprocal approach to research means that studies can be designed to respond more effectively to societal needs.

Communities

People across a wide variety of groups can build new knowledge and skills; develop a greater awareness of issues, which may encourage behaviour change; and enhance wellbeing through increased social contact, physical activity, and connection to the environment.

Decision-makers

Citizen science provides new information to support evidence-based policy and planning by government and land management agencies, for example through notifications of pest, pathogen, and disease outbreaks, pollution events, or the discovery of new species. Regionally, citizen science can support natural resource management and biodiversity strategies, while national databases (e.g., the Land and Water Aotearoa platform in New Zealand) may benefit from the input of community-generated data. More broadly, citizen science is philosophically aligned with New Zealand’s Open Government and Open Data policies.

Data quality and reliability

The reliability of data generated through citizen science activity is a hot topic as well as an active area of research. These investigations play an important role in validating citizen science initiatives by helping to address potential data quality concerns (Dickinson et al., 2010; Lewandowski & Specht, 2015; Lukyanenko, Parsons, & Wiersma, 2016). This is important for the interpretation of results and also provides guidance for the design of projects in aspects such as data collection procedures and training needs. Although there is an increasing number of international studies investigating citizen science data quality, there have been few such studies in New Zealand. Notable exceptions include the revival of the Stream Health Monitoring and Assessment Kit and the long-running, Auckland Council-supported Wai Care programme. In these projects, supporting research compared data collected by professionals with data collected by volunteers (Moffett & Neale 2015; Storey & Wright-Stow 2017). In general, the results of such studies show that the quality of volunteer data is often comparable to professional data and may be used to augment professional monitoring programmes or, in some cases, as a standalone data source. Where differences are detected, this may indicate that citizen science data are not ‘fit for purpose’ for some potential uses. Alternatively, these findings may provide a focus for further training initiatives, or for other innovative approaches to controlling sources of bias, such as combining information on observer expertise with the data that are collected (Johnston et al. 2018).

Capacity building for CitSci initiatives

Capacity building and support needs of citizen science projects may vary widely depending on their particular objectives (Conrad & Hilchey 2011). In research projects looking to crowd-source data from volunteers, training in standardised data collection protocols is often desired as a means to improve data quality (Gouveia et al. 2004; Sullivan & Molles 2016). However, there is also increasing potential for citizen science to move away from standardised protocols towards mining available datasets, using verification methodologies to support the desired analyses (Catlin-Groves 2012). New Zealand studies have found that the top two challenges for maintaining community-based environmental monitoring (CBEM) projects are a lack of human resources (i.e. volunteers) and funding, followed by training in technical skills, and that many of the standardised monitoring ‘kits’ have seen low usage by community groups despite being specifically designed to support CBEM (Peters et al. 2016). These examples suggest that introducing more onerous protocols may be problematic for volunteers due to factors such as limited time, and may be counterproductive for projects oriented toward local engagement where maintaining participation is a core goal (Orchard 2019).

In many cases, participant-led forms of monitoring are better suited to local needs and the motivations of those involved (McKay & Johnson 2017; Wiseman & Bardsley 2016). This can be supported by tailoring data collection methods to the citizens rather than training citizens in the existing field methods of professionals (Palmer Fry 2011), and by adopting other means to control for observer bias and other data quality concerns (Johnston et al. 2018).

References