A Combinatorial Fusion Method for Feature Construction

Ye Tian1, Gary M. Weiss2, D. Frank Hsu3, and Qiang Ma4

1Department of Computer Science, New Jersey Institute of Technology, Newark, NJ, USA
2,3Department of Computer and Information Science, Fordham University, Bronx, NY, USA
4Department of Computer Science, Rutgers University, Piscataway, NJ, USA

Abstract - This paper demonstrates how methods borrowed from information fusion can improve the performance of a classifier by constructing (i.e., fusing) new features that are combinations of existing numeric features. The new features are constructed by mapping the numeric values for each feature to a rank and then averaging these ranks. The quality of the fused features is measured with respect to how well they classify minority-class examples, which makes this method especially effective for dealing with data sets that exhibit class imbalance. This paper evaluates our combinatorial feature fusion method on ten data sets, using three learning methods. The results indicate that our method can be quite effective in improving classifier performance.

Keywords: Feature construction, class imbalance, information fusion

1 Introduction

The performance of a classification algorithm is highly dependent on the descriptions associated with the examples. For this reason, good practitioners choose the features used to describe the data very carefully. However, deciding which information to encode and how to encode it is quite difficult, and the best way to do so depends not only on the domain but also on the learning method. For this reason, there have been a variety of attempts over the years to automate part of this process. This work has gone by a variety of names over the years (although sometimes the emphasis is different) and has been called constructive induction [13], feature engineering [17], feature construction [6], and feature mining [11]. In this paper we discuss how existing numerical features can be combined, without human effort, in order to improve classification performance.

The work described in this paper is notable for several reasons. First, unlike the majority of work in this area, we are specifically concerned with improving performance on data with substantial class imbalance. Such problems are challenging but quite common and are typical in domains such as medical diagnosis [7], fraud detection [4], and failure prediction [19]. Furthermore, there are reasons to believe that this important class of problems has the most to benefit from feature construction, since some learners may not be able to detect subtle patterns that only become apparent when several features are examined together [18]. Our work also differs from other work in that our feature combination operator does not directly use the values of the component features but rather their ranks. This allows us to combine numerical features in a meaningful way, without worrying about issues such as scaling. This approach is particularly appropriate given the increased interest in the use of ranking in the data mining [10] and machine learning communities [5]. Our approach can also be viewed as an extension of work from the information fusion community, since techniques similar to the ones we use in this paper have been used to "fuse" information from disparate sources [9]. The work in this paper can be viewed as a specific type of information fusion, which we refer to as feature fusion.

We describe our combinatorial feature-fusion method in detail in Section 2 and then describe our experiments in Section 3. The results from these experiments are described and analyzed in Section 4. Related work is discussed in Section 5. Our main conclusions and areas for future work are then described in Section 6.

2 Combinatorial Feature Fusion

This section describes the basic combinatorial feature-fusion method. We introduce relevant terminology and describe some of the basic steps employed by the feature-fusion method. We then describe some general schemes for fusing features and end the section with a detailed description of our combinatorial feature-fusion algorithm.

2.1 Terminology and Basic Steps

In this section we will use a simple example to explain the relevant terminology and preliminary steps related to feature fusion.
This example will also be used later in this section to help explain the feature-fusion algorithm. Because our feature-fusion method only works with numeric features, for simplicity we assume all features are numeric. Non-numeric features are not a problem in practice; they are simply passed, unaltered, to the classifier.

A data set is made up of examples, or records, each of which has a fixed number of features. Consistent with previous work on information fusion [9, 10], we view the value of a feature as a score. Typical examples of scores are a person's salary, a student's exam score, and a baseball pitcher's earned run average. In the first two cases a higher score is desirable, but in the last case a lower one is preferable.

Table 1 introduces a sample data set with eight examples, labeled A-H, with five numeric features, F1-F5, and a binary class variable. In this example class 1 is the minority class and comprises 3/8, or 37.5%, of the examples.

TABLE 1
A SAMPLE DATASET

      F1   F2   F3   F4   F5   Class
A      1    4    3    2    8     1
B      3    3    5    5    4     0
C      5    5    2    6    7     1
D      7    6   15    3    2     0
E     11   13   16    7   14     0
F     15   16    4   13   11     0
G      9    7   14    1   18     1
H     17   15    9    8    3     0

Early in our combinatorial feature-fusion method we replace each score with a rank, where a lower rank is better. We convert each score into a rank using a rank function, which adheres to the standard notion of a rank. We sort the score values for each feature in either increasing or decreasing order and then assign the rank based on this ordering. Table 2 shows the values of the features for the sample data set after the scores have been replaced by ranks, where the ranks were assigned after sorting the feature values in increasing order. As a specific example, because the three lowest values for F3 in Table 1 are 2, 3, and 4, and these values appear in rows C, A, and F, respectively, the ranks in Table 2 for F3 for these records are 1, 2, and 3, respectively.

TABLE 2
SAMPLE DATASET WITH SCORES REPLACED BY RANKS

      F1   F2   F3   F4   F5
A      1    2    2    2    5
B      2    1    4    4    3
C      3    3    1    5    4
D      4    4    7    3    1
E      6    6    8    6    7
F      7    8    3    8    6
G      5    5    6    1    8
H      8    7    5    7    2

We determine whether the ranks should be assigned based on increasing or decreasing order of the score values by determining the performance of the feature using both ordering schemes and selecting the ordering that yields the best performance (we describe how to compute a feature's performance shortly). In our method, once the scores are replaced with ranks, the scores are never used again. The rank values are used when combining features and are the feature values that are passed to the learning algorithm.

Next we show how to compute the "performance" of a feature. This performance metric essentially measures how well the rank of the feature correlates with the minority-class examples. To compute a feature's performance, we sort the records by their rank for that feature; Table 3 shows this ranked list for F2. The number of "top" records that we examine is based on the percentage of minority-class examples in the training data. In this case 3 of the 8 training examples (37.5%) belong to the minority class, so we look at the top 3 records. In this example that means that the performance of F2 is 2/3, since two of the three class values for these records are "1", which is the minority-class value. Given this scheme, the best performance value that is achievable is 1.0.

TABLE 3
RANKED LIST FOR F2

      F2 Rank   Class
B        1        0
A        2        1
C        3        1
D        4        0
G        5        1
E        6        0
H        7        0
F        8        0

We may similarly compute the performances for all of the individual features. Table 4 shows that for this simple example F1-F4 all have performances of 2/3 and F5 has a performance of 0.

TABLE 4
PERFORMANCE VALUES FOR ORIGINAL FEATURES

      Feature   Performance
      F1        0.67
      F2        0.67
      F3        0.67
      F4        0.67
      F5        0.00

This method is also used to compute the performance of the "fused" features. To do this we need to first determine the rank of a fused feature, so we can sort the examples by this rank. We compute this using a rank combination function that averages the ranks of the features to be combined. This is done for each record. As an example, suppose we want to fuse features F1-F5 and create a new feature, F1F2F3F4F5, which we will call F6. Table 5 shows the rank values for F6 for all eight records. The value of F6 for record A is computed as (Rank(F1) + Rank(F2) + Rank(F3) + Rank(F4) + Rank(F5))/5 = (1+2+2+2+5)/5 = 2.4. We see that for this new feature, record A has the best (lowest) rank. Given these values, one can now compute the performance of the feature F6. Note that even though the values in Table 5 are not integers we can still consider them ranks; in order to compute the performance of F6, we only need to be able to sort by these values.
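To make the preceding steps concrete, the following is a minimal Python sketch, not taken from the paper, that applies a rank function, the performance metric, and a rank combination function to the sample data set of Table 1. The function names (to_ranks, performance, fuse) are our own, introduced purely for illustration.

```python
# A minimal sketch (not the authors' code) of the steps described above:
# converting scores to ranks, computing a feature's performance, and fusing
# features by averaging their ranks. Data are taken from Table 1.

# record id -> ([F1..F5 scores], class); class 1 is the minority class
DATA = {
    "A": ([ 1,  4,  3,  2,  8], 1), "B": ([ 3,  3,  5,  5,  4], 0),
    "C": ([ 5,  5,  2,  6,  7], 1), "D": ([ 7,  6, 15,  3,  2], 0),
    "E": ([11, 13, 16,  7, 14], 0), "F": ([15, 16,  4, 13, 11], 0),
    "G": ([ 9,  7, 14,  1, 18], 1), "H": ([17, 15,  9,  8,  3], 0),
}
MINORITY = 1
labels = {rec: cls for rec, (_, cls) in DATA.items()}

def to_ranks(scores, increasing=True):
    """Rank function: replace each score with its rank (1 = best).
    The full method tries both orderings and keeps the better-performing one;
    here increasing order is used, as in Table 2. Ties are broken arbitrarily."""
    order = sorted(scores, key=scores.get, reverse=not increasing)
    return {rec: rank for rank, rec in enumerate(order, start=1)}

def performance(ranks):
    """Fraction of minority-class records among the top n records, where n is
    the number of minority-class records in the training data (here n = 3)."""
    n = sum(1 for cls in labels.values() if cls == MINORITY)
    top = sorted(ranks, key=ranks.get)[:n]
    return sum(1 for rec in top if labels[rec] == MINORITY) / n

def fuse(rank_tables):
    """Rank combination function: average the component features' ranks."""
    return {rec: sum(t[rec] for t in rank_tables) / len(rank_tables)
            for rec in rank_tables[0]}

# Per-feature ranks and performances (Tables 2 and 4).
rank_tables = [to_ranks({rec: vals[j] for rec, (vals, _) in DATA.items()})
               for j in range(5)]
for j, ranks in enumerate(rank_tables, start=1):
    print(f"F{j} performance: {performance(ranks):.2f}")  # 0.67 for F1-F4, 0.00 for F5

# Fuse F1-F5 into F6; record A gets (1+2+2+2+5)/5 = 2.4.
f6 = fuse(rank_tables)
print("F6 rank for A:", f6["A"], "| F6 performance:", round(performance(f6), 2))
```

Running this sketch reproduces the per-feature performance values of Table 4 and the fused rank of 2.4 for record A; in the full method the ordering (increasing versus decreasing) would additionally be chosen per feature by comparing the two resulting performances.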
