Registration of Thoracic CT-CT Images Using Improved Demon Registration

  • Abstract

    Computed tomography (CT) images are commonly used for medical diagnosis, for monitoring disease progression, and in radiotherapy planning and treatment. Fundamentally, image registration aims to accurately align two or more monomodal or multimodal images taken at different times or positions. To register two CT images accurately, an accurate and reliable registration algorithm is required. This paper proposes an improved demon registration technique that uses the sum of conditional variance (SCV) and modality independent neighbourhood descriptor (MIND) similarity metrics, instead of the sum of squared differences (SSD) of the conventional demon method, to register separately acquired whole-body CT (PET/CT) and thoracic CT images. We tested the proposed method on 9 pairs of whole-body CT (PET/CT) and CT images of non-small cell lung cancer (NSCLC) patients. Apart from visual observation, the proposed method was compared with the free form deformation (FFD) and standard demon methods. Registration accuracy was assessed by measuring the lung volume overlap between the two images after registration in terms of the Jaccard and Dice coefficients. The quality of the registered images was measured using three image quality metrics: the structural similarity index (SSIM), peak signal-to-noise ratio (PSNR) and correlation coefficient (CC). Overall, the performance of the proposed demon method is double that of FFD and superior to the standard demon method, with average Jaccard and Dice coefficients of 0.83 and 0.90 respectively. The SSIM, PSNR and CC results also indicate that the improved demon method performs best, followed by FFD and the standard demon.
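    The lung-volume overlap measures used for evaluation can be sketched as follows. This is a minimal NumPy illustration of the standard Jaccard and Dice definitions; the function name and the binary lung masks are hypothetical, not taken from the paper:

    ```python
    import numpy as np

    def overlap_scores(mask_a, mask_b):
        """Jaccard and Dice coefficients between two binary lung masks.

        mask_a, mask_b: array-like of 0/1 (or bool) values of equal shape,
        e.g. segmented lung volumes of the fixed and registered images.
        """
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        inter = np.logical_and(a, b).sum()       # |A ∩ B|
        union = np.logical_or(a, b).sum()        # |A ∪ B|
        jaccard = inter / union                  # |A ∩ B| / |A ∪ B|
        dice = 2 * inter / (a.sum() + b.sum())   # 2|A ∩ B| / (|A| + |B|)
        return jaccard, dice
    ```

    A larger overlap after registration drives both scores toward 1; the Dice coefficient is always at least as large as the Jaccard coefficient for the same pair of masks.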


  • Keywords

    CT; thoracic CT; demon registration; free form deformation; image registration; PET/CT; whole body




Article ID: 24907
DOI: 10.14419/ijet.v8i1.2.24907

Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.