Research database

Project information
Keywords
pollinators; Dryas; plant-insect interactions
Project title
Frame-by-frame: a new approach for monitoring plant-pollinator interactions by time-lapse photography
Year
2019
Project leader
Jane Uhd Jepsen
Geographical localization of the research project in decimal degrees (max 5 per project, e.g. 70.662°N and 23.707°E)
Varanger (70.40°N, 28.64°E) and Tromsø (69.68°N, 18.96°E)
Participants

Project leader: Jane Uhd Jepsen, Dept. Arctic Ecology, NINA, jane.jepsen@nina.no

Project participants: Ingrid Jensvoll, Dept. Education, UIT, Ingrid.jensvoll@uit.no; Toke T. Høye, Dept. Bioscience, Aarhus University, tth@bios.au.dk; Alexandros Iosifidis, Dept. Engineering, Aarhus University, alexandros.iosifidis@eng.au.dk; Ole Petter Vindstad, Dept. Arctic and Marine Ecology, UIT, ole.p.vindstad@uit.no


Flagship
Terrestrial
Funding Source

Terrestrial flagship

NINA

UIT

Summary of Results


Highlights

  • Camera-based monitoring of plant phenology and plant-pollinator interactions has been tested over two years and has proven both technically feasible and ecologically promising.
  • Automatic detection of flower stage based on machine learning techniques can be done with high accuracy (>95%) and reasonable speed (1-2 sec per image).
  • Detection of insect visitors to flowers is also possible, but requires more site-specific training data than flower detection, due to variation in visitor communities among the Arctic test sites.


Frame-by-frame is an incentive project with a significant proportion of methodological development. It is closely connected to a larger international project, BITCue (Biotic Interactions Tracked by Computer Vision), headed by our collaborators at Aarhus University. Prior to the launch of Frame-by-frame, a proposed protocol for camera-based monitoring of plant-pollinator interactions had been tested by our Danish collaborators during a single season in west Greenland (two cameras). During 2018 and 2019, the test was extended to highly variable conditions and phenologies at Finse, S Greenland (Narsarsuaq), NW Greenland (Thule), Iceland, Svalbard, Varanger and Tromsø, by us and by collaborators from UNIS, NMBU, Iceland and Denmark. The contribution from the flagship has enabled testing of the protocol and equipment in Varanger and Tromsø in both years (Fig. A1).


At our two sites in Tromsø and Varanger, we collected more than 1 million images of Dryas and 600,000 of Rubus, covering the whole phenological sequence from bud burst to seed set over two years. The Dryas material is now feeding into the development of the processing pipeline and is being used to generate training data. To illustrate how ecologically relevant information can be extracted from image time series, and to provide an extensive set of training data, we (Jepsen and Høye) manually annotated the flowers on all images from two test cameras (Fig. A2). During December this year, Frame-by-frame will generate a similar manually annotated time series of flower visitors. This material will form the basis of an independent publication (Høye et al., in prep.).

The primary use of the training data is, however, to train machine learning algorithms to automatically identify (or assign a probability to) individual flower stage (bud, flower, seed, wilted) and status (with/without pollinator) (Fig. A3). This work is spearheaded by our Danish collaborators, including a new PhD student (Hjalte R.M. Mann) from spring 2019, and will continue in 2020-2021. Preliminary results (Ärje et al. 2019; Mann et al., unpublished) show that flower classification can be done with both high accuracy (>95%) and reasonable speed (1-2 sec per image). Preliminary tests based on image data from a site in S Greenland also suggest that it will be possible to detect visitors, but visitor communities vary among sites, and further manual annotation from other sites is required before visitor detection becomes a generic tool.
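
To make the classification step concrete, the sketch below outlines a light-weight convolutional neural network classifier of the general kind described by Ärje et al. (2019). It is a minimal Python/PyTorch illustration only; the architecture, input size and class handling are our assumptions for this report, not the actual BITCue/Frame-by-frame pipeline.

import torch
import torch.nn as nn

STAGES = ["bud", "flower", "seed", "wilted"]  # flower stages annotated in the project

class FlowerStageCNN(nn.Module):
    """Illustrative light-weight CNN for classifying a flower crop into a stage."""
    def __init__(self, n_classes: int = len(STAGES)):
        super().__init__()
        # Three small convolutional blocks keep the model light enough to
        # classify a crop well within the reported 1-2 sec per image.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = FlowerStageCNN().eval()
crop = torch.rand(1, 3, 128, 128)  # one hypothetical 128x128 flower crop
with torch.no_grad():
    probs = torch.softmax(model(crop), dim=1)  # one probability per stage
print(dict(zip(STAGES, probs.squeeze(0).tolist())))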


Challenges:

Rubus has been significantly more challenging, due to the rapid and highly variable vertical growth of this plant, its larger leaves (which shade the flowers), and the much lower density of flowers per unit area, which both increases the need for manual adjustments during the growing season and reduces the total number of flowers that can be captured. In 2019 we attempted to solve this by adjusting the tripod design to allow image capture at larger focal distances. However, there is a trade-off between focal distance and detail (the ability to detect and identify insects on flowers), and our preliminary conclusion is that the currently employed set-up is suboptimal for use on Rubus. It remains an ambition to run the acquired Rubus images through the analysis pipeline, but we will prioritize processing of the Dryas material.
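
The focal-distance trade-off can be illustrated with a simple back-of-the-envelope pinhole-camera calculation; the lens and sensor parameters in the Python sketch below are hypothetical and do not describe our actual equipment:

def pixels_across_target(target_mm, distance_mm, focal_length_mm, pixel_pitch_um):
    """Approximate number of sensor pixels spanned by a target of a given size,
    under a simple pinhole projection model."""
    image_size_mm = target_mm * focal_length_mm / distance_mm
    return image_size_mm / (pixel_pitch_um / 1000.0)

# Moving the camera back widens the field of view but halves the detail
# each time the distance doubles:
for distance_cm in (30, 60, 120):
    px = pixels_across_target(target_mm=10.0,          # ~1 cm insect visitor
                              distance_mm=distance_cm * 10.0,
                              focal_length_mm=8.0,     # hypothetical lens
                              pixel_pitch_um=3.0)      # hypothetical sensor
    print(f"At {distance_cm} cm: ~{px:.0f} pixels across a 1 cm insect")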


Perspectives:

Varanger is the main target area for monitoring in COAT, and it is an ambition over time to establish lasting camera-based monitoring of plant-insect interactions in one or more of COAT's core areas in Varanger, mirrored at the other BITCue test sites around the Arctic. The preliminary results from Frame-by-frame and from BITCue show that this is both technically feasible and promising in terms of the amount and quality of ecological data that can be extracted from the acquired imagery. A significant amount of work remains, however, before the image processing pipeline is operational. We will continue to contribute to this through collaboration with the Danish partners. Specifically, we plan a joint application to NFR with Frame-by-frame partners on camera-based monitoring of ecological interactions (among these, plant-pollinator interactions) in 2020.


Master and PhD-students involved in the project

Hjalte R.M. Mann (Aarhus University, PhD student 2019-2022)

For the Management

Not relevant at present.

Published Results/Planned Publications

Ärje, J., Milioris, D., Høye, T., Tran, T., Iosifidis, A., Raitoharju, J., Gabbouj, M. & Jepsen, J.U. (2019) Automatic Flower Detection and Classification System Using a Light-Weight Convolutional Neural Network. Autonomous Systems satellite workshop, 27th European Signal Processing Conference (EUSIPCO 2019), A Coruña, Spain.

One scientific paper is in preparation, led by our Danish collaborators, with co-authors from the Fram Centre.

Communicated Results

Høye, T.T. 2018. Image-based monitoring of arctic arthropods. Arctic Biodiversity Congress, 9-11 Oct 2018, Rovaniemi, Finland.

Mann, H.R.M. et al. 2019. Biotic interactions tracked by computer vision. Network for Arthropods of the Tundra (NeAT) meeting, Oct 2019, Aarhus.

Interdisciplinary Cooperation

The project contributes to an international, interdisciplinary collaboration between ecologists and computer scientists, bringing together expertise in ecology, phenology and ecological interactions with expertise in machine learning, image classification and optimization. The project would not be possible without interdisciplinary collaboration.

Budget in accordance to results

Funding from the Fram Centre has permitted the establishment of a test node for the BITCue protocol in Northern Norway, specifically in Tromsø (2018 only) and Varanger (2018-19), which would otherwise not have been possible. Fram Centre funding further permits the generation of an extended test data set on insect visitors, and interaction with project partners (e.g. research hours for the PI to participate in the Network for Arthropods of the Tundra (NeAT) meeting, Aarhus, October 2019).

Could results from the project be subject to any commercial utilization?
No
Conclusions

The project has:

  • Established a new test node in Northern Norway for a distributed protocol for camera-based monitoring of arctic plant-pollinator interactions. In doing so, we contribute to an international network of test sites, all using Dryas sp. as a test species.
  • Generated training data contributing to the development of a new machine learning-based pipeline for automatic image processing, which will handle imagery from all test sites.

The project group will further:

  • Initiate a joint NFR application in 2020