Biomedical Image Processing with Containers and Deep Learning: An Automated Analysis Pipeline: Data architecture, artificial intelligence, automated processing, containerization, and cluster orchestration ease the transition from data acquisition to insights in medium-to-large datasets

Bioessays. 2019 Jun;41(6):e1900004. doi: 10.1002/bies.201900004. Epub 2019 May 16.


Here, a streamlined, scalable laboratory approach is discussed that enables medium-to-large dataset analysis. The presented approach combines data management, artificial intelligence, containerization, cluster orchestration, and quality control in a unified analytic pipeline. The unique combination of these individual building blocks creates a new and powerful analysis approach that researchers can readily apply to medium-to-large datasets to accelerate the pace of research. The proposed framework is applied to a project that counts the number of plasmonic nanoparticles bound to peripheral blood mononuclear cells in dark-field microscopy images. By using the techniques presented in this article, the images are automatically processed overnight, without user interaction, streamlining the path from experiment to conclusions.
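The core idea of the pipeline, unattended batch processing from acquired images to per-cell particle counts, can be sketched as a minimal driver loop. This is an illustrative sketch only: `count_particles` is a hypothetical placeholder (the paper's deep-learning detector is not reproduced here, and the counts below are derived trivially from filenames), and the directory layout and `.tif` extension are assumptions.

```python
import csv
from pathlib import Path

def count_particles(image_path):
    # Placeholder for a trained particle detector. For this sketch,
    # the "count" is simply parsed from the filename suffix,
    # e.g. "cell_A_3.tif" -> 3.
    return int(Path(image_path).stem.split("_")[-1])

def run_pipeline(image_dir, results_csv):
    """Process every image in image_dir without user interaction
    and write one (image, particle_count) row per file."""
    rows = [(p.name, count_particles(p))
            for p in sorted(Path(image_dir).glob("*.tif"))]
    with open(results_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image", "particles"])
        writer.writerows(rows)
    return rows
```

In the actual framework, a loop like this would run inside a container, with the cluster orchestrator dispatching one container per batch of images and the results aggregated for quality control.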

Keywords: automation; data processing; image analysis; optics.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Data Analysis*
  • Database Management Systems
  • Deep Learning*
  • Electronic Data Processing / methods*
  • Gold / analysis
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Information Storage and Retrieval / methods*
  • Leukocytes, Mononuclear / cytology
  • Metal Nanoparticles / analysis
  • Microscopy / methods

Substances

  • Gold