INTERNAL LEADERBOARD
Level 2 Ensemble Models
Model | Public LB MCC Score | CV MCC Score | CV AUC Score | Threshold | Comment |
---|---|---|---|---|---|
avg_esb.esb3.sub.csv | 0.48895 | 0.494301 | 0.929332 | 0.326768 | average of esb.esb3 |
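The level-2 entry above appears to be a plain average of level-1 submissions. A minimal sketch of how such an average could be produced, assuming the level-1 files still contain a raw probability column alongside Id; the input file names and the `prob` column are placeholders, only the output name and the threshold come from the table:

```python
import pandas as pd

# Hypothetical level-1 submission files with raw probabilities; the actual
# file names and column layout in the repo may differ.
subs = [pd.read_csv(f) for f in ["esb.sub.csv", "esb3.sub.csv"]]

# Average the predicted probabilities column-wise.
avg = subs[0][["Id"]].copy()
avg["prob"] = sum(s["prob"] for s in subs) / len(subs)

# Apply the CV-tuned threshold (0.326768 in the table above) to get the
# binary Response expected by the competition.
avg["Response"] = (avg["prob"] > 0.326768).astype(int)
avg[["Id", "Response"]].to_csv("avg_esb.esb3.sub.csv", index=False)
```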
Level 1 Ensemble Models
Model | Public LB MCC Score | CV MCC Score | CV AUC Score | Threshold | Comment |
---|---|---|---|---|---|
keras_20_2_128_0.5_512_5_esb15 | - | 0.494306 | 0.926863 | 0.267374 | keras with esb15 |
keras_20_2_128_0.5_512_5_esb13 | 0.48830 | 0.494092 | 0.923443 | 0.247576 | keras with esb13 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_esb16 | - | 0.493853 | 0.928585 | 0.415859 | xgb with esb16 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_esb13 | 0.48756 | 0.493007 | 0.928596 | 0.435657 | xgb with esb13 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_esb11 | 0.47429 | 0.484894 | 0.927990 | 0.425758 | xgb with esb11 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_esb9 | 0.41406 | 0.415531 | 0.907144 | 0.306970 | xgb with esb9 |
keras_20_2_128_0.5_512_5_esb9 | 0.40453 | 0.416551 | 0.905934 | 0.217879 | keras with esb9 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_esb8 | 0.40410 | 0.416778 | 0.907292 | 0.425758 | xgb with esb8 |
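The CV MCC / CV AUC columns presumably come from out-of-fold predictions of a K-fold fit. A minimal sketch of that loop for the xgboost entries; the hyperparameters encoded in the model names (e.g. xg_10000_6_0.05_...) are not decoded here, so every value below is a placeholder, not the actual setting:

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score, matthews_corrcoef

def cv_oof_predictions(X, y, n_splits=5, seed=0):
    """Out-of-fold probabilities for a binary classifier (X, y are numpy arrays)."""
    oof = np.zeros(len(y))
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for train_idx, valid_idx in skf.split(X, y):
        # Placeholder parameters; the real runs encode n_estimators, depth,
        # eta, subsample, etc. in the model name.
        model = xgb.XGBClassifier(n_estimators=1000, max_depth=6,
                                  learning_rate=0.05, subsample=0.7,
                                  colsample_bytree=0.5)
        model.fit(X[train_idx], y[train_idx])
        oof[valid_idx] = model.predict_proba(X[valid_idx])[:, 1]
    return oof

# CV AUC as reported in the table; CV MCC additionally needs a threshold.
# oof = cv_oof_predictions(X, y)
# print(roc_auc_score(y, oof), matthews_corrcoef(y, oof > 0.3))
```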
Single Models
Model | Public LB MCC Score | CV MCC Score | CV AUC Score | Threshold | Comment |
---|---|---|---|---|---|
xg_10000_6_0.05_1_0.8_0.5_1_100_0.5_xi | 0.46912 | 0.471735 | 0.925452 | 0.346566 | xgb with feature xi |
xg_10000_6_0.05_1_0.8_0.5_1_100_0.5_h2 | 0.40621 | 0.414588 | 0.905808 | 0.316869 | xgb with feature h2 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m8 | 0.40318 | 0.412482 | 0.904498 | 0.277273 | xgb with feature m8 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m10 | 0.40102 | 0.414860 | 0.907744 | 0.386162 | xgb with feature m10 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m12 | 0.40055 | 0.412622 | 0.905791 | 0.277273 | xgb with feature m12 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m13 | 0.39887 | 0.410871 | 0.905665 | 0.267374 | xgb with feature m13 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m11 | 0.39838 | 0.410234 | 0.904060 | 0.267374 | xgb with feature m11 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m19 | 0.39804 | 0.411858 | 0.904613 | 0.415859 | xgb with feature m19 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m14 | 0.39719 | 0.417231 | 0.908390 | 0.306970 | xgb with feature m14 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m11_cat_cnt | 0.39696 | 0.413397 | 0.904728 | 0.356465 | xgb with feature m11_cat_cnt |
xg_10000_6_0.05_1_0.7_0.5_1_100_m18 | 0.39236 | 0.409568 | 0.905975 | 0.356465 | xgb with feature m18 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m9 | 0.38976 | 0.407061 | 0.902651 | 0.326768 | xgb with feature m9 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m15 | 0.39510 | 0.411835 | 0.905861 | 0.326768 | xgb with feature m15 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m16 | 0.36598 | 0.431463 | 0.914549 | 0.376263 | xgb with feature m16 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m17 | 0.27355 | 0.287223 | 0.792510 | 0.188182 | xgb with feature m17 |
xg_10000_6_0.05_1_0.8_0.5_1_100_0.00581_m5 | - | 0.260144 | 0.736417 | 0.247576 | xgb with prior and feature m5 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m3 | 0.25978 | 0.262398 | 0.735327 | 0.158485 | xgb with feature m3 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m2 | 0.25847 | 0.268000 | 0.741536 | 0.210000 | xgb with feature m2 (no retrain) |
xg_10000_6_0.05_1_0.7_0.5_1_100_m4 | 0.25814 | 0.267268 | 0.739925 | 0.190000 | xgb with feature m4 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m6 | 0.25715 | 0.265159 | 0.733994 | 0.168384 | xgb with feature m6 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m5 | 0.25702 | 0.265147 | 0.733484 | 0.178283 | xgb with feature m5 |
xg_10000_6_0.05_1_0.7_0.5_1_100_0.5_j1 | 0.25658 | 0.266439 | 0.737122 | 0.227778 | xgb with feature j1 |
xg_10000_6_0.05_1_0.8_0.5_1_100_m2 | 0.25569 | 0.266177 | 0.741136 | 0.190000 | xgb with feature m2 |
xg_10000_6_0.05_1_0.7_0.5_1_100_m7 | 0.25334 | 0.266705 | 0.737535 | 0.168384 | xgb with feature m7 |
xg_10000_6_0.05_1_0.8_0.5_1_100_m1 | 0.25306 | 0.261948 | 0.735805 | 0.170000 | xgb with feature m1 |
xg_10000_6_0.05_1_0.8_0.5_1_100_m1p | 0.23213 | 0.298276 | 0.767727 | 0.227778 | xgb with feature m1p |
xg_10000_6_0.05_1_0.8_0.5_1_100_h1 | 0.20810 | 0.207277 | 0.732212 | 0.110000 | xgb with feature h1 |
xg_10000_6_0.05_1_0.8_0.5_1_100_0.00581_h1 | - | 0.205662 | 0.732547 | 0.099091 | xgb with prior and feature h1 |
xg_10000_6_0.05_1_0.8_0.5_1_100_h1 | - | 0.195504 | 0.731770 | 0.070000 | xgb with feature h1 |
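The Threshold column looks like the probability cutoff that maximizes MCC on the out-of-fold predictions. A minimal sketch of such a sweep, assuming `oof` holds the out-of-fold probabilities and `y` the true labels:

```python
import numpy as np
from sklearn.metrics import matthews_corrcoef

def best_mcc_threshold(y_true, probs, n_grid=100):
    """Scan a grid of cutoffs and return (best_threshold, best_mcc)."""
    thresholds = np.linspace(probs.min(), probs.max(), n_grid)
    scores = [matthews_corrcoef(y_true, probs > t) for t in thresholds]
    best = int(np.argmax(scores))
    return thresholds[best], scores[best]

# Example: threshold, mcc = best_mcc_threshold(y, oof)
```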
A very cool viz to get a better understanding of the data set
Meeting 10/21:
- Different delta for ids (Mert)
- Magic features per line station (Mert)
- HDF5 implementation (Jeong); see the caching sketch after this list
- Time-of-day and day-of-week features with downtime/weekend distinction (Jeong); see the time-feature sketch after this list
- Product features (Erkut)
- All new features will have their own makefile, e.g. Makefile.ls (line station)
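For the HDF5 item, a minimal sketch of caching the wide CSVs as .h5 files so later feature scripts avoid re-parsing them; all paths and file names below are placeholders:

```python
import pandas as pd

# Convert the raw CSVs to HDF5 once; later feature scripts read the .h5
# files, which is much faster than re-parsing the CSVs. The larger files
# may need chunked reads depending on available memory.
for name in ["train_numeric", "train_date", "test_numeric", "test_date"]:
    df = pd.read_csv(f"input/{name}.csv.zip", index_col="Id")
    df.to_hdf(f"cache/{name}.h5", key="df", mode="w")

# Later: df = pd.read_hdf("cache/train_numeric.h5", "df")
```

For the time-of-day / day-of-week item, a minimal sketch assuming the raw timestamps are numeric ticks and that the length of a week in ticks is estimated separately; the TICKS_PER_WEEK value and the weekend rule below are placeholders, not measured constants:

```python
import pandas as pd

# Assumed period: the length of a "week" in raw ticks has to be estimated
# from the data; 1680.0 is used here only for illustration.
TICKS_PER_WEEK = 1680.0
TICKS_PER_DAY = TICKS_PER_WEEK / 7

def add_time_features(df, time_col="start_time"):
    """Derive time-of-day / day-of-week style features from a numeric tick column."""
    out = df.copy()
    out["time_of_day"] = out[time_col] % TICKS_PER_DAY
    out["day_of_week"] = (out[time_col] % TICKS_PER_WEEK) // TICKS_PER_DAY
    # Crude downtime/weekend flag: last two "days" of the weekly cycle.
    # Which day the cycle starts on is unknown; this only marks a fixed
    # two-day block per week.
    out["is_weekend"] = out["day_of_week"] >= 5
    return out
```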
Meeting 10/14:
- Encode the error rate of each line-station combination (52 features) (Mert); see the sketch after this list
- Encode the time-varying version of the error rate (52 x 10 ...)
- Encode error rate of each product (~8000 features) (Erkut)
- time-varying version
- Extend time overall (h2) to time per station / line (Hang)
- Combine train and test for some stats?
- Change the makefile structure to separate groups of features
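For the line-station error rate item, a minimal sketch that derives a per-station failure rate from the training labels and the wide date table (raw column names follow the L{line}_S{station}_D{feature} scheme); the helper names are hypothetical, and smoothing of rarely visited stations is left out:

```python
import pandas as pd

def station_error_rates(date_df, labels):
    """Failure rate per line-station from the training date table.

    date_df : wide date table whose columns are named 'L{line}_S{station}_D{feature}'.
    labels  : Response column (1 = failed part) aligned on the same index.
    """
    # Group the date columns by their 'L*_S*' prefix -> one group per station.
    station_of = lambda c: "_".join(c.split("_")[:2])
    groups = date_df.columns.to_series().groupby(station_of).groups

    rates = {}
    for station, cols in groups.items():
        # A part "visited" a station if any of its date columns is non-null.
        visited = date_df[list(cols)].notnull().any(axis=1)
        rates[station] = labels[visited].mean()
    return pd.Series(rates)

def encode_station_error(date_df, rates):
    """One feature per station: its error rate if the part visited it, else 0."""
    station_of = lambda c: "_".join(c.split("_")[:2])
    feats = pd.DataFrame(index=date_df.index)
    for station, rate in rates.items():
        cols = [c for c in date_df.columns if station_of(c) == station]
        feats[f"err_{station}"] = date_df[cols].notnull().any(axis=1) * rate
    return feats
```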