
Predicting Default Payments of Credit Card Clients by Using XGBoost with Particle Swarm Optimization

EasyChair Preprint no. 5369

6 pages · Date: April 24, 2021

Abstract

According to statistics from the Financial Supervisory Commission, the number of electronic payment users in Taiwan has officially exceeded 10 million. According to the National Statistics Bulletin of the Directorate-General of Budget, Accounting and Statistics, Executive Yuan, the total number of credit cards in force at the end of December 2020 (ROC year 109) was 50.12 million, and the cumulative amount of credit card transactions from January to December was NT$3 trillion 19.6 billion, of which accounts overdue and not paid in full for more than 3 months made up 0.15%. This shows that predicting default rates is a very important issue.

Extreme Gradient Boosting (XGBoost) is an ensemble algorithm that adds a parameter-complexity (regularization) term to the model's objective function and uses a Taylor expansion to approximate the loss, which reduces overfitting and improves classification performance. Tuning the loss function, the tree depth, and the sample weights is a complicated problem, so this paper uses Particle Swarm Optimization to search for the best parameters. The default records of credit card customers are imbalanced data: the default class is the issue of interest but accounts for only a small proportion of the samples, so performance cannot be faithfully reflected by Accuracy alone. This paper therefore uses both Accuracy and Recall as evaluation criteria, aiming to improve Recall while keeping Accuracy within an acceptable range.
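Below is a minimal Python sketch of this workflow, not the paper's implementation: a hand-rolled PSO loop tunes a few XGBoost hyperparameters, and each candidate is scored by Recall subject to an Accuracy floor. The synthetic imbalanced dataset, the searched parameters (learning_rate, max_depth, scale_pos_weight), the PSO settings, and the 0.75 Accuracy threshold are all illustrative assumptions.

```python
# Sketch: tune XGBoost with a simple Particle Swarm Optimization loop,
# rewarding Recall while requiring an acceptable Accuracy (assumed >= 0.75).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score
from xgboost import XGBClassifier

# Synthetic imbalanced binary data (~20% positives) standing in for the
# credit-card default dataset used in the paper.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

# Assumed search space: (learning_rate, max_depth, scale_pos_weight).
LOW = np.array([0.01, 2, 1.0])
HIGH = np.array([0.3, 10, 10.0])

def fitness(p):
    """Train XGBoost with one particle's parameters and return Recall,
    zeroed out if Accuracy falls below the assumed acceptable floor."""
    model = XGBClassifier(learning_rate=float(p[0]),
                          max_depth=int(round(p[1])),
                          scale_pos_weight=float(p[2]),
                          n_estimators=200, eval_metric="logloss")
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    acc = accuracy_score(y_te, pred)
    rec = recall_score(y_te, pred)
    return rec if acc >= 0.75 else 0.0

# Plain PSO: velocities are pulled toward each particle's best position
# and the swarm's global best; positions are clipped to the search bounds.
rng = np.random.default_rng(0)
n_particles, n_iters, w, c1, c2 = 8, 10, 0.7, 1.5, 1.5
pos = rng.uniform(LOW, HIGH, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best (learning_rate, max_depth, scale_pos_weight):", gbest)
print("best fitness (Recall under Accuracy floor):", pbest_fit.max())
```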

Keyphrases: Accuracy, credit card, Extreme Gradient Boosting, Particle Swarm Optimization, Recall

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:5369,
  author = {Lee Jyh-Yuan and Hung Jui-Chung},
  title = {Predicting Default Payments of Credit Card Clients by Using XGBoost with Particle Swarm Optimization},
  howpublished = {EasyChair Preprint no. 5369},
  year = {EasyChair, 2021}}