MSMOTE: Improving Classification Performance When Training Data Is Imbalanced


Dr S R C Murthy P, Md Ayub Khan, P. Sandeep Reddy

Abstract

Learning from data sets that contain very few instances of the minority class usually produces biased classifiers: predictive accuracy is high on the majority class but poor on the minority class. SMOTE (Synthetic Minority Over-sampling Technique) is specifically designed for learning from such imbalanced data sets. This paper presents a modified approach, MSMOTE, for learning from imbalanced data sets, based on the SMOTE algorithm. MSMOTE not only considers the distribution of minority-class samples but also eliminates noise samples by adaptive mediation. The combination of MSMOTE and AdaBoost is applied to several highly and moderately imbalanced data sets. The experimental results show that MSMOTE outperforms SMOTEBoost in predicting the minority class, and F-values are also improved.
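As a rough illustration of the idea described in the abstract, the sketch below follows the commonly published MSMOTE scheme: each minority sample is labelled "security" (all k nearest neighbours are minority), "border" (some are), or "latent noise" (none are), and synthetic samples are then interpolated accordingly, skipping noise samples. This is a minimal reconstruction under those assumptions, not the authors' implementation; the function name `msmote` and its parameters are illustrative.

```python
import numpy as np

def msmote(X_min, X_maj, n_synthetic, k=5, rng=None):
    """Sketch of MSMOTE over-sampling.

    X_min, X_maj : 2-D arrays of minority / majority samples.
    n_synthetic  : number of synthetic minority samples to generate.
    """
    rng = np.random.default_rng(rng)
    X_all = np.vstack([X_min, X_maj])   # minority rows come first
    n_min = len(X_min)
    synthetic = []
    while len(synthetic) < n_synthetic:
        i = rng.integers(n_min)
        x = X_min[i]
        # k nearest neighbours of x among all samples, excluding itself
        dist = np.linalg.norm(X_all - x, axis=1)
        order = np.argsort(dist)
        nn = [j for j in order if j != i][:k]
        min_nn = [j for j in nn if j < n_min]   # minority neighbours
        if not min_nn:
            continue                 # "latent noise": generate nothing
        if len(min_nn) == k:
            j = rng.choice(min_nn)   # "security": random minority neighbour
        else:
            j = min_nn[0]            # "border": nearest minority neighbour
        gap = rng.random()           # SMOTE-style interpolation
        synthetic.append(x + gap * (X_min[j] - x))
    return np.array(synthetic)
```

For example, calling `msmote` on a tight minority cluster far from the majority class produces synthetic points inside that cluster's convex hull, since every sample is classified as "security" and interpolated between minority neighbours.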


Article Details

How to Cite
Dr S R C Murthy P, Md Ayub Khan, P. Sandeep Reddy. (2023). MSMOTE: Improving Classification Performance when Training Data is imbalanced. Turkish Journal of Computer and Mathematics Education (TURCOMAT), 11(3), 1831–1834. https://doi.org/10.17762/turcomat.v11i3.13479
Section: Research Articles