A supervised blood vessel segmentation technique for digital Fundus images using Zernike Moment based features

Dharmateja Adapa, Alex Noel Joseph Raj, Sai Nikhil Alisetti, Zhemin Zhuang, K. Ganesan, Ganesh Naik

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

This paper proposes a new supervised method for blood vessel segmentation using Zernike moment-based shape descriptors. The method performs pixel-wise classification by computing an 11-D feature vector comprising both statistical (gray-level) features and shape-based (Zernike moment) features. The feature set also contains optimal Zernike moment coefficients, derived on the basis of maximum differentiability between blood vessel and background pixels. Manually selected training points from the training set of the DRIVE dataset, covering all possible manifestations, were used to train the ANN-based binary classifier. The method was evaluated on unseen test samples from the DRIVE and STARE databases and returned accuracies of 0.945 and 0.9486 respectively, outperforming other existing supervised learning methods. Further, the segmented outputs covered thinner blood vessels better than previous methods, aiding the early detection of pathologies.
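To make the abstract's pipeline concrete, the sketch below shows how per-pixel Zernike moment magnitudes and gray-level statistics might be combined into an 11-D feature vector. This is a minimal illustration in NumPy, not the paper's implementation: the specific moment orders `(n, m)`, the four gray-level statistics, the patch size, and the helper names `zernike_moment` and `pixel_features` are all assumptions chosen for demonstration; the paper selects its optimal coefficients by maximizing differentiability between vessel and background pixels.

```python
import numpy as np
from math import factorial, pi

def zernike_moment(patch, n, m):
    """|Z_nm| of a square grayscale patch mapped onto the unit disc.

    Hypothetical helper: standard Zernike moment definition, with the
    patch centre taken as the disc centre. Requires |m| <= n and
    n - |m| even.
    """
    h, w = patch.shape
    y, x = np.mgrid[-1:1:h * 1j, -1:1:w * 1j]
    rho = np.sqrt(x**2 + y**2)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0                         # keep pixels inside the unit disc
    # Radial polynomial R_{n|m|}(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1)**s * factorial(n - s)
             / (factorial(s)
                * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho**(n - 2 * s)
    V = R * np.exp(-1j * m * theta)           # Zernike basis function V_nm
    Z = (n + 1) / pi * np.sum(patch * np.conj(V) * mask)
    return abs(Z)                             # magnitude is rotation-invariant

def pixel_features(patch):
    """Hypothetical 11-D feature vector for the pixel at the patch centre:
    4 gray-level statistics plus |Z_nm| for 7 low-order Zernike moments
    (an illustrative choice, not the paper's tuned coefficient set)."""
    stats = [patch.mean(), patch.std(), patch.min(), patch.max()]
    orders = [(0, 0), (1, 1), (2, 0), (2, 2), (3, 1), (3, 3), (4, 0)]
    zern = [zernike_moment(patch, n, m) for n, m in orders]
    return np.array(stats + zern)

# Example: features for one pixel's neighbourhood
patch = np.random.default_rng(0).random((17, 17))
features = pixel_features(patch)              # shape (11,)
```

Vectors like `features`, computed for every pixel, would then feed the ANN-based binary classifier (vessel vs. background) described in the abstract.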

Original language: English
Article number: e0229831
Number of pages: 23
Journal: PLoS One
Volume: 15
Issue number: 3
DOIs
Publication status: Published - 6 Mar 2020
Externally published: Yes

Bibliographical note

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Keywords

  • blood vessel segmentation
  • digital Fundus images
  • Zernike Moment
  • detection of pathologies
