Statistical learning theory (also called Vapnik-Chervonenkis theory or VC theory) has been developed by Vladimir Vapnik over the past 30 years. Originally introduced for pattern recognition, it has since been extended to various statistical learning tasks, including density estimation and function regression. One of the most powerful classes of learning algorithms originating from statistical learning theory is the support vector machine. This universal, constructive learning scheme can be used to train a variety of architectures, such as artificial neural networks, radial basis function networks, and polynomial estimators. Support vector learning can be used for both classification and regression tasks.
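As a minimal sketch of the last point, the snippet below trains a support vector classifier and a support vector regressor on toy data. It uses scikit-learn's `SVC` and `SVR`, which is an assumption of ours; the text does not name a library, and the RBF and polynomial kernels stand in for the radial basis function and polynomial estimator architectures mentioned above.

```python
# Sketch of support vector learning for classification and regression.
# scikit-learn is our choice here; the text names no specific library.
from sklearn.svm import SVC, SVR

# Toy classification task: two well-separated point clouds.
X = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
y = [0, 0, 1, 1]
clf = SVC(kernel="rbf")  # RBF kernel, cf. radial basis function networks
clf.fit(X, y)
print(clf.predict([[0.1, 0.0], [1.0, 0.9]]))  # one point near each cloud

# Toy regression task with the same kernel machinery.
reg = SVR(kernel="poly", degree=2)  # a polynomial estimator
reg.fit(X, [0.0, 0.3, 2.0, 2.1])
print(reg.predict([[0.5, 0.5]]))
```

The same `fit`/`predict` interface covers both tasks; only the loss and the kernel change between the classifier and the regressor.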

Research directions:

- support vector learning in radial basis function networks
- multiclass classification with support vector machines
- hierarchies of support vector machines for multiclass pattern recognition
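Since the support vector machine is inherently a binary classifier, the multiclass directions above require some reduction to binary problems. One common scheme is one-vs-rest, sketched below with scikit-learn; this is an illustrative assumption on our part, not the specific method pursued in the research listed.

```python
# One-vs-rest reduction: one binary SVM per class, highest decision
# value wins. An illustrative scheme; the text fixes no particular one.
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X = [[0, 0], [0, 1], [2, 2], [2, 3], [4, 0], [4, 1]]
y = [0, 0, 1, 1, 2, 2]  # three classes -> three binary SVMs

ovr = OneVsRestClassifier(SVC(kernel="linear"))
ovr.fit(X, y)
print(len(ovr.estimators_))                    # → 3 (one SVM per class)
print(ovr.predict([[0, 0], [2, 2], [4, 0]]))   # → [0 1 2]
```

A hierarchy of SVMs, as in the third direction, would instead arrange binary classifiers in a tree so that each prediction needs only a logarithmic number of evaluations rather than one per class.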