Automatic determination of NET (neutrophil extracellular traps) coverage in fluorescent microscopy images

Bioinformatics. 2015 Jul 15;31(14):2364-70. doi: 10.1093/bioinformatics/btv156. Epub 2015 Mar 19.

Abstract

Motivation: Neutrophil extracellular traps (NETs) are believed to be essential in controlling several bacterial pathogens. Quantification of NETs in vitro is an important tool in studies aiming to clarify the biological and chemical factors contributing to NET production, stabilization and degradation. This estimation can be performed from fluorescence microscopy images with appropriate labelling. In this context, it is desirable to automate the analysis to eliminate both the tedious process of manual annotation and possible operator-specific biases.

Results: We propose a framework for the automated determination of NET content, based on visually annotated images which are used to train a supervised machine-learning method. We derive several methods in this framework. The best results are obtained by combining these into a single prediction. The overall Q² of the combined method is 93%. By having two experts label part of the image set, we were able to compare the performance of the algorithms to the human interoperator variability. We find that the two operators exhibited a very high correlation on their overall assessment of the NET coverage area in the images (R² is 97%), although there were consistent differences in labeling at pixel level (Q², which unlike R² does not correct for additive and multiplicative biases, was only 89%).
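The distinction between the two metrics can be illustrated with a short sketch. This is not code from the paper's released software; it is a minimal illustration, assuming the standard definitions: Q² as one minus the residual sum of squares over the total sum of squares (computed against the true values directly, so any systematic bias lowers it), and R² as the squared Pearson correlation (which is invariant to additive and multiplicative shifts). The operator data below are made up for demonstration.

```python
import numpy as np

def q2(y_true, y_pred):
    """Q² = 1 - SS_res / SS_tot, computed against y_true directly.
    Additive or multiplicative biases in y_pred reduce this score."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def r2(y_true, y_pred):
    """R² = squared Pearson correlation; unaffected by linear rescaling."""
    return np.corrcoef(y_true, y_pred)[0, 1] ** 2

# Hypothetical NET coverage fractions from one operator (a), and a second
# operator (b) who is perfectly correlated but systematically biased:
a = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
b = 0.9 * a + 0.05   # multiplicative + additive bias

print(r2(a, b))  # ≈ 1.0: the linear bias does not affect correlation
print(q2(a, b))  # < 1.0: Q² is reduced by the systematic disagreement
```

This mirrors the situation reported in the abstract: two operators can agree almost perfectly on relative ranking (high R²) while showing a consistent pixel-level offset that lowers Q².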

Availability and implementation: Open source software (under the MIT license) is available at https://github.com/luispedro/Coelho2015_NetsDetermination for both reproducibility and application to new data.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Extracellular Traps / physiology*
  • Humans
  • Image Interpretation, Computer-Assisted / methods*
  • Microscopy, Fluorescence / methods*
  • Neutrophils / physiology*
  • Observer Variation
  • Pattern Recognition, Automated / methods*
  • Reproducibility of Results
  • Software*