Brain MR image segmentation based on an improved active contour model

PLoS One. 2017 Aug 30;12(8):e0183943. doi: 10.1371/journal.pone.0183943. eCollection 2017.

Abstract

It is often difficult to accurately segment brain magnetic resonance (MR) images affected by intensity inhomogeneity and noise. This paper introduces a novel level set method for simultaneous brain MR image segmentation and intensity inhomogeneity correction. To reduce the effect of noise, novel anisotropic spatial information, which preserves more detail at edges and corners, is proposed by incorporating the inner relationships among neighboring pixels. The proposed energy function then uses the multivariate Student's t-distribution to fit the intensity distribution of each tissue. Furthermore, the proposed model uses hidden Markov random fields to model the spatial correlation between neighboring pixels/voxels. The means of the multivariate Student's t-distribution are adaptively estimated by multiplying them by a bias field, which reduces the effect of intensity inhomogeneity. Finally, we reconstruct the energy function to be convex and minimize it with the Split Bregman method, which allows our framework to be initialized randomly and thus applied in a fully automated way. Our method obtains the final result in less than 1 second for a 2D image of size 256 × 256 and in less than 300 seconds for a 3D volume of size 256 × 256 × 171. The proposed method was compared with other state-of-the-art segmentation methods on both synthetic and clinical brain MR images and improved segmentation accuracy by more than 3%.
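
As a rough illustration only (not the authors' code), the sketch below shows how a heavy-tailed Student's t intensity model with a multiplicative bias field applied to the class means, as described in the abstract, could be evaluated per pixel. It is a univariate simplification of the multivariate model; the helper names (student_t_logpdf, tissue_log_likelihoods) and all parameter values are hypothetical, and the hidden Markov random field prior and the Split Bregman solver for the convex level set energy are omitted.

```python
import numpy as np
from scipy.special import gammaln

def student_t_logpdf(x, mu, sigma2, nu):
    """Log-density of a univariate Student's t-distribution with
    location mu, squared scale sigma2, and nu degrees of freedom."""
    return (gammaln((nu + 1) / 2) - gammaln(nu / 2)
            - 0.5 * np.log(np.pi * nu * sigma2)
            - (nu + 1) / 2 * np.log1p((x - mu) ** 2 / (nu * sigma2)))

def tissue_log_likelihoods(image, bias, mus, sigma2s, nus):
    """For each tissue class k, evaluate a data term in which the class
    mean mu_k is modulated by the estimated bias field b(x):
    t(I(x); b(x)*mu_k, sigma2_k, nu_k).  `image` and `bias` have the
    same shape; the class parameters are per-class lists."""
    return np.stack([student_t_logpdf(image, bias * mu, s2, nu)
                     for mu, s2, nu in zip(mus, sigma2s, nus)], axis=0)

# Hypothetical usage: a noisy 2D slice with a smooth multiplicative bias
# and three tissue classes (e.g. CSF, GM, WM).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(100.0, 10.0, size=(256, 256))
    yy, xx = np.mgrid[0:256, 0:256] / 256.0
    bias = 0.8 + 0.4 * xx * yy            # smooth multiplicative field
    ll = tissue_log_likelihoods(img * bias, bias,
                                mus=[60.0, 100.0, 140.0],
                                sigma2s=[80.0, 80.0, 80.0],
                                nus=[4.0, 4.0, 4.0])
    hard_labels = ll.argmax(axis=0)       # per-pixel most likely class
    print(hard_labels.shape)              # (256, 256)
```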

MeSH terms

  • Algorithms
  • Anisotropy
  • Brain / anatomy & histology*
  • Brain / diagnostic imaging*
  • Humans
  • Image Processing, Computer-Assisted / methods
  • Imaging, Three-Dimensional / methods*
  • Magnetic Resonance Imaging / methods*
  • Markov Chains
  • Models, Anatomic
  • Models, Statistical

Grants and funding

This work was supported by the National Natural Science Foundation of China (61672291).