Kernelized online imbalanced learning with fixed budgets
Refereed conference paper presented and published in conference proceedings


Other information
Abstract: Online learning from imbalanced streaming data that captures the nonlinearity and heterogeneity of the data is an important problem in machine learning and data mining. To tackle it, we propose a kernelized online imbalanced learning (KOIL) algorithm that directly maximizes the area under the ROC curve (AUC). We address two further challenges: 1) how to control the number of support vectors without sacrificing model performance; and 2) how to restrict the fluctuation of the learned decision function to attain smooth updating. To this end, we introduce two buffers with fixed budgets (buffer sizes), one for the positive class and one for the negative class, to store the learned support vectors, which allows us to capture the global information of the decision boundary. When determining the weight of a new support vector, we confine its influence to its k-nearest opposite support vectors. This restricts the effect of new instances and limits the harm caused by outliers. More importantly, we design a compensation scheme that adjusts the model after a replacement occurs when either buffer is full; with this compensation, the learned model approaches the one learned with an infinite budget. We present both theoretical analysis and extensive experimental comparisons to demonstrate the effectiveness of the proposed KOIL.
All Author(s) List: Hu J., Yang H., King I., Lyu M.R., So A.M.-C.
Name of Conference: 29th AAAI Conference on Artificial Intelligence, AAAI 2015, and the 27th Innovative Applications of Artificial Intelligence Conference, IAAI 2015
Start Date of Conference: 25/01/2015
End Date of Conference: 30/01/2015
Place of Conference: Austin
Country/Region of Conference: United States of America
Year: 2015
Month: 6
Day: 1
Volume Number: 4
Pages: 2666 - 2672
ISBN: 9781577357025
Languages: English-United Kingdom

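The abstract above describes KOIL's core ideas: two fixed-budget buffers of support vectors (one per class) and updates whose influence is confined to the k-nearest opposite-class support vectors. The following is a minimal, illustrative Python sketch of those ideas only; it is not the authors' implementation, and in particular it replaces KOIL's compensation scheme with a naive random eviction when a buffer is full. All names (rbf_kernel, FixedBudgetBuffer, KOILSketch, budget, k, eta, gamma) are assumptions made for illustration.

```python
# Illustrative sketch of fixed-budget buffers and k-nearest-opposite updates.
# NOT the authors' KOIL code; the compensation scheme from the paper is omitted.
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    """RBF kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class FixedBudgetBuffer:
    """Stores support vectors of one class with their weights, up to `budget`."""
    def __init__(self, budget):
        self.budget = budget
        self.vectors = []   # list of np.ndarray
        self.weights = []   # list of float

    def add(self, x, alpha, rng):
        if len(self.vectors) < self.budget:
            self.vectors.append(x)
            self.weights.append(alpha)
        else:
            # Simplified replacement: evict a random support vector.
            # KOIL instead compensates the model after replacement so that it
            # approaches the infinite-budget solution; that step is omitted here.
            j = rng.integers(len(self.vectors))
            self.vectors[j], self.weights[j] = x, alpha

class KOILSketch:
    def __init__(self, budget=50, k=5, eta=0.1, gamma=1.0, seed=0):
        self.pos = FixedBudgetBuffer(budget)   # positive-class buffer
        self.neg = FixedBudgetBuffer(budget)   # negative-class buffer
        self.k, self.eta, self.gamma = k, eta, gamma
        self.rng = np.random.default_rng(seed)

    def decision(self, x):
        """Kernel expansion over both buffers."""
        s = 0.0
        for buf, sign in ((self.pos, +1.0), (self.neg, -1.0)):
            for z, a in zip(buf.vectors, buf.weights):
                s += sign * a * rbf_kernel(x, z, self.gamma)
        return s

    def update(self, x, y):
        """Pairwise hinge-style update confined to the k nearest opposite SVs."""
        own, opp = (self.pos, self.neg) if y > 0 else (self.neg, self.pos)
        if opp.vectors:
            dists = [np.sum((x - z) ** 2) for z in opp.vectors]
            nearest = np.argsort(dists)[: self.k]
            fx = self.decision(x)
            # Count margin violations against the nearest opposite support vectors.
            violations = sum(
                1 for j in nearest
                if y * (fx - self.decision(opp.vectors[j])) < 1.0
            )
            alpha = self.eta * violations
        else:
            alpha = self.eta
        if alpha > 0:
            own.add(x, alpha, self.rng)

# Toy usage on a synthetic imbalanced stream (~10% positives).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    model = KOILSketch(budget=20, k=3)
    for _ in range(500):
        y = 1 if rng.random() < 0.1 else -1
        x = rng.normal(loc=1.0 if y > 0 else -1.0, size=2)
        model.update(x, y)
    print("score on a positive-like point:", model.decision(np.array([1.0, 1.0])))
```

Confining each update to the k nearest opposite-class support vectors, as the abstract describes, keeps a single new instance or outlier from shifting the whole decision function, while the per-class fixed-budget buffers bound memory on an unbounded stream.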