Image-based plant phenotyping with incremental learning and active contours
Minervini, Massimo, Abdelsamea, Mohammed M. and Tsaftaris, Sotirios A. (2014) Image-based plant phenotyping with incremental learning and active contours. Ecological Informatics, 23, pp. 35-48. doi:10.1016/j.ecoinf.2013.07.004. ISSN 1574-9541.
Research output not available from this repository.
Request a copy directly from the author, or use the local Library "Get it For Me" service.
Official URL: http://dx.doi.org/10.1016/j.ecoinf.2013.07.004
Abstract
Plant phenotyping investigates how a plant's genome, interacting with the environment, affects the observable traits of a plant (phenome). It is becoming increasingly important in our quest towards efficient and sustainable agriculture. While sequencing the genome is becoming increasingly efficient, acquiring phenotype information has remained largely a low-throughput process. Current solutions for automated image-based plant phenotyping rely either on semi-automated or manual analysis of the imaging data, or on expensive and proprietary software that accompanies costly hardware infrastructure. While some attempts have been made to create software applications that enable the analysis of such images in an automated fashion, most solutions are tailored to particular acquisition scenarios and restrictions on experimental design. In this paper we propose and test a method for the segmentation and automated analysis of time-lapse plant images from phenotyping experiments in a general laboratory setting that can adapt to scene variability. The method involves minimal user interaction, necessary to establish the statistical experiments that may follow. At every time instance (i.e., a digital photograph), it segments the plants in images that contain many specimens of the same species. For accurate plant segmentation we propose a vector-valued level set formulation that incorporates features of color intensity, local texture, and prior knowledge. Prior knowledge is incorporated using a plant appearance model, implemented with Gaussian mixture models, which incrementally utilizes information from previously segmented instances. The proposed approach is tested on Arabidopsis plant images acquired with a static camera capturing many subjects at the same time.
Our validation with ground truth segmentations and comparisons with state-of-the-art methods in the literature show that the proposed method is able to handle images with complicated and changing backgrounds in an automated fashion. An accuracy of 96.7% (Dice similarity coefficient) was observed, higher than that of the other methods used for comparison. While here it was tested on a single plant species, the fact that we do not employ shape-driven models and do not rely on fully supervised classification (trained on a large dataset) eases deployment of the proposed solution for the study of different plant species in a variety of laboratory settings. Our solution will be accompanied by an easy-to-use graphical user interface and, to facilitate adoption, we will make the software available to the scientific community.
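The abstract's two key quantitative ideas — an appearance model learned from previously segmented frames, and validation via the Dice similarity coefficient — can be illustrated with a minimal sketch. This is not the paper's vector-valued level set; it substitutes a single Gaussian per class (a deliberate simplification of the Gaussian mixture appearance model), and all function names and the synthetic data are hypothetical.

```python
# Hypothetical, much-simplified sketch: one Gaussian per class stands in
# for the paper's Gaussian mixture appearance model; no level set is used.
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    pred, truth = np.asarray(pred).astype(bool), np.asarray(truth).astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0

def fit_gaussian(pixels):
    """Mean and diagonal variance of RGB pixels (shape (N, 3))."""
    return pixels.mean(axis=0), pixels.var(axis=0) + 1e-6

def log_likelihood(img, mu, var):
    """Per-pixel diagonal-Gaussian log-likelihood, shape (H, W)."""
    d = img - mu
    return -0.5 * ((d * d) / var + np.log(2 * np.pi * var)).sum(axis=-1)

def segment(img, prev_img, prev_mask):
    """Classify pixels by likelihood ratio under foreground/background
    models learned from a previously segmented frame -- a toy version
    of the incremental appearance prior described in the abstract."""
    fg = fit_gaussian(prev_img[prev_mask])
    bg = fit_gaussian(prev_img[~prev_mask])
    return log_likelihood(img, *fg) > log_likelihood(img, *bg)

# Synthetic example: a green "plant" patch on a brown "soil" background.
rng = np.random.default_rng(0)
h, w = 64, 64
truth = np.zeros((h, w), dtype=bool)
truth[20:44, 20:44] = True

def render(mask):
    img = np.empty((h, w, 3))
    img[mask] = [0.2, 0.7, 0.2]    # greenish foreground
    img[~mask] = [0.5, 0.4, 0.3]   # brownish background
    return img + 0.05 * rng.standard_normal((h, w, 3))

prev_frame, curr_frame = render(truth), render(truth)
pred = segment(curr_frame, prev_frame, truth)
print(round(dice(pred, truth), 3))
```

On this toy data the classes are well separated, so the Dice score is close to 1; the paper's reported 96.7% refers to real Arabidopsis images, a far harder setting.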
Item Type: Journal Article
Divisions: Faculty of Science, Engineering and Medicine > Medicine > Warwick Medical School > Biomedical Sciences > Cell & Developmental Biology
Faculty of Science, Engineering and Medicine > Medicine > Warwick Medical School > Biomedical Sciences
Faculty of Science, Engineering and Medicine > Medicine > Warwick Medical School
Journal or Publication Title: Ecological Informatics
Publisher: Elsevier BV
ISSN: 1574-9541
Official Date: September 2014
Volume: 23
Page Range: pp. 35-48
DOI: 10.1016/j.ecoinf.2013.07.004
Status: Peer Reviewed
Publication Status: Published
Access rights to Published version: Restricted or Subscription Access