Feature Selection Using Golden Jackal Optimization for Software Fault Prediction

Himansu Das, Sanjay Prajapati, Mahendra Kumar Gourisaria, Radha Mohan Pattanayak, Abdalla Alameen and Manjur Kolhar
Additional contact information
Himansu Das: School of Computer Engineering, KIIT Deemed to be University, Bhubaneswar 751024, Odisha, India
Sanjay Prajapati: School of Computer Engineering, KIIT Deemed to be University, Bhubaneswar 751024, Odisha, India
Mahendra Kumar Gourisaria: School of Computer Engineering, KIIT Deemed to be University, Bhubaneswar 751024, Odisha, India
Radha Mohan Pattanayak: School of Computer Science & Engineering, VIT-AP University, Amaravati 522237, Andhra Pradesh, India
Abdalla Alameen: Computer Science Department, Prince Sattam Bin Abdulaziz University, Al-Kharj 16278, Saudi Arabia
Manjur Kolhar: Department of Computer Science, College of Arts and Science, Prince Sattam Bin Abdulaziz University, Al-Kharj 16278, Saudi Arabia

Mathematics, 2023, vol. 11, issue 11, 1-28

Abstract: A software defect or fault is a bug, error, or mistake in a program that produces unintended results. Such flaws are programming errors arising from mistakes in the requirements, architecture, or source code. Finding and fixing defects as early as possible is a crucial goal of software development, and one effective approach is to select a small, optimal subset of features from a dataset; feature selection can indirectly improve classification performance. A novel approach to feature selection (FS) has been developed that incorporates the Golden Jackal Optimization (GJO) algorithm, a meta-heuristic optimization technique inspired by the hunting tactics of golden jackals. Combining this algorithm with four classifiers, namely K-Nearest Neighbor, Decision Tree, Quadratic Discriminant Analysis, and Naive Bayes, aids in selecting a subset of relevant features from software fault prediction datasets. To evaluate the algorithm, its performance is compared with other feature selection methods: FSDE (Differential Evolution), FSPSO (Particle Swarm Optimization), FSGA (Genetic Algorithm), and FSACO (Ant Colony Optimization). FSGJO achieves higher classification accuracy in most cases. Using the Friedman and Holm tests to determine statistical significance, the proposed strategy is verified to be superior to prior methods in selecting an optimal set of attributes.
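The abstract describes a wrapper-style feature selection scheme: a metaheuristic searches over binary feature masks, and each mask's fitness is the accuracy a classifier achieves using only the selected features. The sketch below illustrates that general idea only; it is not the authors' FSGJO implementation. The GJO position-update equations are replaced by a simplified two-leader recombination that merely echoes GJO's male/female jackal pair, the classifier is a hand-rolled 1-Nearest-Neighbor, and the dataset, function names, and parameters are all hypothetical.

```python
import random

random.seed(0)

def make_data(n=120, n_noise=4):
    """Hypothetical synthetic dataset: features 0 and 1 carry the class
    signal; the remaining n_noise features are pure noise."""
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        informative = [label + random.gauss(0, 0.3), -label + random.gauss(0, 0.3)]
        noise = [random.gauss(0, 1) for _ in range(n_noise)]
        X.append(informative + noise)
        y.append(label)
    return X, y

def knn_accuracy(X_tr, y_tr, X_te, y_te, mask):
    """1-NN accuracy using only the features switched on in the binary mask."""
    feats = [i for i, b in enumerate(mask) if b]
    if not feats:
        return 0.0  # empty feature subset is worthless
    def dist(a, b):
        return sum((a[i] - b[i]) ** 2 for i in feats)
    correct = 0
    for xt, yt in zip(X_te, y_te):
        nearest = min(range(len(X_tr)), key=lambda j: dist(xt, X_tr[j]))
        correct += (y_tr[nearest] == yt)
    return correct / len(X_te)

def wrapper_fs(X, y, iters=20, pop=8):
    """Wrapper FS: evolve a population of binary masks; fitness is
    classifier accuracy. The two best masks act as leaders, a loose
    analogy to GJO's jackal pair (not the real GJO update rules)."""
    split = len(X) // 2
    X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]
    d = len(X[0])
    population = [[random.randint(0, 1) for _ in range(d)] for _ in range(pop)]
    def fit(m):
        return knn_accuracy(X_tr, y_tr, X_te, y_te, m)
    for _ in range(iters):
        ranked = sorted(population, key=fit, reverse=True)
        male, female = ranked[0], ranked[1]      # two leaders guide the search
        new_pop = [male]                          # elitism: keep the best mask
        for _ in range(pop - 1):
            # Inherit each bit from a leader, with a small flip for exploration.
            child = [random.choice([male[i], female[i]])
                     if random.random() < 0.8
                     else 1 - random.choice([male[i], female[i]])
                     for i in range(d)]
            new_pop.append(child)
        population = new_pop
    best = max(population, key=fit)
    return best, fit(best)

X, y = make_data()
mask, acc = wrapper_fs(X, y)
print("selected features:", [i for i, b in enumerate(mask) if b], "accuracy:", acc)
```

On this toy data the search tends to keep the informative features and drop noise, since the fitness function rewards exactly that; the paper's FSGJO applies the same wrapper principle with the full GJO update equations and four different classifiers.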

Keywords: software fault prediction; software defect prediction; feature selection; classification algorithms; golden jackal optimization
JEL-codes: C
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/11/2438/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/11/2438/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:11:p:2438-:d:1155122


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Page updated 2025-03-19
Handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2438-:d:1155122