One-step Bayesian example-dependent cost classification: The OsC-MLP method

Neural Netw. 2024 May;173:106168. doi: 10.1016/j.neunet.2024.106168. Epub 2024 Feb 8.

Abstract

Example-dependent cost classification problems are those in which the decision costs depend not only on the true and the attributed classes but also on the sample features. Discriminative algorithms that carry out such classification tasks must take this dependence into account. In some applications, the decision costs are known for the training set but not in production, which complicates the problem. In this paper, we introduce a new one-step Bayesian formulation to train Neural Networks as one-step Learning Machines, solving the above limitation for binary cases and avoiding the drawbacks created by unknown analytical forms of the example-dependent costs. The formulation defines an artificial likelihood ratio that incorporates the available training classification costs, and proposes a test that does not require the cost values for unseen samples. It also includes Bayesian rebalancing mechanisms to combat the negative effects of class imbalance. Experimental results support the consistency and effectiveness of the corresponding algorithms.
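To make the abstract's idea concrete, here is a minimal illustrative sketch, not the paper's OsC-MLP algorithm: it folds assumed per-example false-positive and false-negative costs (the names `c_fp` and `c_fn`, the synthetic data, the architecture, and all hyperparameters are invented for illustration) into a cost-weighted cross-entropy during training, so that at test time a plain threshold on the network output suffices and no costs are needed for unseen samples.

```python
# Hedged sketch of one-step, example-dependent cost-sensitive training of a
# binary MLP. This is NOT the paper's OsC-MLP method; it only illustrates the
# general setting the abstract describes: per-example costs are available for
# the training set, get absorbed into the loss, and are not required at test
# time.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic binary data with example-dependent costs known only at training.
n, d = 512, 8
X = torch.randn(n, d)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).float()
c_fp = torch.rand(n) * 2 + 0.5   # assumed cost of a false positive, per example
c_fn = torch.rand(n) * 2 + 0.5   # assumed cost of a false negative, per example

model = nn.Sequential(nn.Linear(d, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(200):
    opt.zero_grad()
    p = torch.sigmoid(model(X)).squeeze(1)
    # Per-example cost-weighted cross-entropy: each sample's error term is
    # scaled by that sample's own misclassification cost.
    loss = (-(c_fn * y * torch.log(p + 1e-9)
              + c_fp * (1 - y) * torch.log(1 - p + 1e-9))).mean()
    loss.backward()
    opt.step()

# At test time no costs are required: they were absorbed during training,
# so a fixed threshold on the network output is used.
with torch.no_grad():
    y_hat = (torch.sigmoid(model(X)).squeeze(1) > 0.5).float()
print("train accuracy:", (y_hat == y).float().mean().item())
```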

Keywords: Bregman divergences; Imbalance; Informed re-balancing; Neural networks; Sample emphasis.

MeSH terms

  • Algorithms*
  • Bayes Theorem
  • Learning
  • Neural Networks, Computer*