Quality is a subjective and relative concept: it refers to how well the representation of reality given by an information system matches reality as perceived by its users.
Records of naturalist observations feed many diagnoses, evaluations, and biodiversity assessments. Data are recorded by the observer on either physical (paper) or digital media.
The information contained in a naturalist record, and the precision of its elements, determine whether it will be taken into account or discarded before use, in particular for scientific analysis. Data quality therefore makes it possible to qualify data and datasets for a given use.
Beyond the observer, many other people handle the data between its input and its many potential uses: naturalist experts, managers and database administrators, developers, analysts, end users, etc.
All of these actors must take care to preserve data quality, as information can be degraded (simplified, attributes erased, etc.) at any stage of the data life cycle: collection, digitization, documentation, storage, analysis, or manipulation.
The quality of species data in an information system can be defined by several criteria:
Figure: Data Quality Components (© INPN)
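As a hedged illustration (not an SINP specification), quality criteria such as conformity, consistency, and scientific validation status, which are referenced in the guide cited at the end of this page, could be carried alongside each observation record. The class and field names below are assumptions chosen for the sketch.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names and defaults are assumptions, not the SINP standard.
@dataclass
class ObservationQuality:
    conformity: bool = False            # record matches the expected exchange format
    consistency: bool = False           # values are mutually coherent (date, coordinates, taxon)
    validation_status: str = "pending"  # scientific validation outcome, e.g. "certain" or "pending"
    remarks: list = field(default_factory=list)

@dataclass
class Observation:
    taxon: str
    observer: str
    event_date: str                     # ISO 8601 date of the observation
    latitude: float
    longitude: float
    quality: ObservationQuality = field(default_factory=ObservationQuality)

obs = Observation(taxon="Lutra lutra", observer="J. Dupont",
                  event_date="2023-05-14", latitude=45.2, longitude=1.8)
print(obs.quality.validation_status)    # -> "pending"
```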
To make quality data available, the whole production chain must be monitored, as early as possible in the data life cycle, starting with data collection.
More about collection: Guide de bonnes pratiques pour la collecte et la saisie de données naturalistes
Actors must be provided with data entry and management tools adapted to their needs and as interoperable as possible, in order to facilitate data sharing.
More about the tools: Guide pratique pour le développement et le choix d'un outil de saisie de données naturalistes
Data curation (notably standardization) must degrade data as little as possible.
More about standardization: Guide pratique pour la standardisation des données naturalistes
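As a minimal sketch of what "standardizing without degrading" can look like in practice (assuming Darwin Core-style term names, which this page does not prescribe), local field names can be mapped onto a shared vocabulary while any unmapped attributes are kept rather than erased:

```python
# Hypothetical mapping from a local capture tool's field names to a shared
# vocabulary (Darwin Core-style terms are used here purely as an example).
FIELD_MAP = {
    "espece": "scientificName",
    "observateur": "recordedBy",
    "date_obs": "eventDate",
    "x": "decimalLongitude",
    "y": "decimalLatitude",
}

def standardize(raw_record: dict) -> dict:
    """Return a standardized record that keeps unmapped values instead of erasing them."""
    standardized = {FIELD_MAP[k]: v for k, v in raw_record.items() if k in FIELD_MAP}
    # Preserve everything that did not map, so no attribute is silently lost.
    standardized["verbatim"] = {k: v for k, v in raw_record.items() if k not in FIELD_MAP}
    return standardized

raw = {"espece": "Lutra lutra", "observateur": "J. Dupont",
       "date_obs": "2023-05-14", "x": 1.8, "y": 45.2, "effectif": 2}
print(standardize(raw))
```

Keeping the unmapped values in a verbatim block is one way to apply the principle above: the standard view is produced without simplifying or erasing the original attributes.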
Of note
The validation process within the SINP is described in the SINP methodological guide for conformity, consistency and scientific validation of data and metadata. This guide describes the general methodology, the terminology, and the principles for identifying duplicate entries and for assessing conformity, consistency, and scientific validation.
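To make this terminology concrete, here is a small sketch, not the SINP methodology itself, of what conformity, consistency, and duplicate-identification checks can look like on a single record. The field names follow the illustrative standardized record above, and the rules are assumptions.

```python
from datetime import date

# Assumed required fields and rules, for illustration only.
REQUIRED_FIELDS = {"scientificName", "eventDate", "decimalLatitude", "decimalLongitude"}

def is_conformant(record: dict) -> bool:
    """Conformity: the record carries the fields the exchange format requires."""
    return REQUIRED_FIELDS.issubset(record)

def is_consistent(record: dict) -> bool:
    """Consistency: values are mutually plausible (coordinates in range, date not in the future)."""
    ok_coords = (-90 <= record["decimalLatitude"] <= 90
                 and -180 <= record["decimalLongitude"] <= 180)
    ok_date = date.fromisoformat(record["eventDate"]) <= date.today()
    return ok_coords and ok_date

def duplicate_key(record: dict) -> tuple:
    """Naive duplicate identification: same taxon, same date, same rounded location."""
    return (record["scientificName"], record["eventDate"],
            round(record["decimalLatitude"], 4), round(record["decimalLongitude"], 4))

rec = {"scientificName": "Lutra lutra", "eventDate": "2023-05-14",
       "decimalLatitude": 45.2, "decimalLongitude": 1.8}
print(is_conformant(rec), is_consistent(rec), duplicate_key(rec))
```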