A New Deep Learning Method for Automatic Ovarian Cancer Prediction & Subtype Classification
Abstract
Computer-aided diagnosis and other relevant biomarkers can provide valuable guidance to doctors for a precise diagnosis. Traditional techniques, chiefly Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), are costly, time-consuming, and tedious. The literature survey showed that, among various image-processing approaches, deep learning outperformed traditional CNN algorithms. In the present work, a Deep Convolutional Neural Network (DCNN) is implemented to predict ovarian cancer and classify its subtypes from histopathological images. To achieve higher accuracy, a new architecture was designed and implemented from scratch, inspired by the pre-trained AlexNet model. The basic AlexNet architecture consists of five convolutional layers, three max-pooling layers, and three fully connected layers with the Rectified Linear Unit (ReLU) as the activation function. We modified this design by adding a max-pooling layer after each pair of convolutional layers (four such blocks), using four fully connected layers, replacing ReLU with the Exponential Linear Unit (ELU), and adjusting other architectural parameters. The original architecture uses kernel sizes of 11x11 and 5x5, which we made uniform at 3x3. The network was trained on 24,742 augmented images. With 4,394,533 parameters, the accuracy of predicting ovarian cancer and classifying its subtypes improved to 83.93%, compared with 78% in previous studies. We also trained the model on the dataset both before and after augmentation and found that augmentation increased accuracy from 70% to 83.93%. This new model can serve as a benchmark.
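The layout described in the abstract — four blocks of two 3x3 convolutions followed by a max-pooling layer, then four fully connected layers — can be sketched as a parameter-counting exercise. The filter widths, fully-connected sizes, input resolution, and five-way subtype output below are illustrative assumptions; the abstract does not specify them, so the resulting total is not expected to reproduce the paper's exact 4,394,533 parameters.

```python
# Parameter count for a modified AlexNet-style DCNN: four blocks of
# (two 3x3 conv layers + 2x2 max-pool) followed by four fully connected
# layers, as in the abstract. All widths and the 64x64 input size are
# hypothetical assumptions for illustration.

def conv_params(in_ch, out_ch, k=3):
    """Weights plus biases for one k x k convolution."""
    return (k * k * in_ch + 1) * out_ch

def fc_params(in_units, out_units):
    """Weights plus biases for one fully connected layer."""
    return (in_units + 1) * out_units

def count_parameters(conv_widths, fc_widths, in_ch=3, img=64):
    total = 0
    ch, side = in_ch, img
    for w in conv_widths:              # one entry per conv block
        total += conv_params(ch, w)    # first 3x3 convolution
        total += conv_params(w, w)     # second 3x3 convolution
        ch = w
        side //= 2                     # 2x2 max-pool halves each spatial dim
    units = ch * side * side           # flatten before the FC stack
    for w in fc_widths:                # four fully connected layers
        total += fc_params(units, w)
        units = w
    return total

# Hypothetical widths; final width of 5 assumes five output classes.
print(count_parameters(conv_widths=[32, 64, 128, 256],
                       fc_widths=[512, 256, 128, 5]))  # → 3434789
```

Counting parameters this way makes the trade-off of the 3x3-only kernel choice concrete: stacking two 3x3 convolutions covers a 5x5 receptive field with fewer weights than a single 5x5 kernel would need.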
Article Details
Licensing
TURCOMAT publishes articles under the Creative Commons Attribution 4.0 International License (CC BY 4.0). This license allows any use of the work, provided the original author(s) and source are credited, thereby facilitating the free exchange and use of research for the advancement of knowledge.
Detailed Licensing Terms
Attribution (BY): Users must give appropriate credit, provide a link to the license, and indicate if changes were made. Users may do so in any reasonable manner, but not in any way that suggests the licensor endorses them or their use.
No Additional Restrictions: Users may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.