Xingzhi Forum No. 219: Hyperparameter Optimization in Machine Learning

Posted: 2019-09-27

Title:

Hyperparameter Optimization in Machine Learning


Speaker: Prof. Bin Gu (Nanjing University of Information Science and Technology)

Time: October 9, 2019 (Wednesday), 9:00 - 9:40

Venue: Room 4001, School of Computer Science

Host: School of Computer Science


Abstract:

Hyperparameter optimization is a classical problem in machine learning. Traditional hyperparameter optimization methods consider only a finite set of candidate values, which conflicts with the fact that the ultimate goal of hyperparameter optimization is to find the best model over the whole parameter space. This talk will discuss how to solve the hyperparameter optimization problem over the whole parameter space for one, two, or many hyperparameters. Specifically: 1) for a single hyperparameter, we propose the generalized error path algorithm to find the model with the minimum cross-validation error over the whole parameter space; 2) for the cost-sensitive support vector machine with two hyperparameters, we propose a solution and error surface algorithm and seek the model with the minimum cross-validation error over the whole parameter space; 3) for many hyperparameters, we propose a new hyperparameter optimization method based on zeroth-order hyper-gradients.
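To make the zeroth-order hyper-gradient idea in item 3 concrete, the following is a minimal sketch, not the speaker's actual algorithm: it tunes the regularization hyperparameter of a closed-form ridge regression by estimating the gradient of the validation loss with a two-point finite difference and descending on it. The ridge setup, function names, and all step-size constants are illustrative assumptions.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def val_loss(lam, Xtr, ytr, Xval, yval):
    # Mean squared error on the validation set for a given hyperparameter lam
    w = ridge_fit(Xtr, ytr, lam)
    r = Xval @ w - yval
    return float(r @ r) / len(yval)

def zeroth_order_tune(Xtr, ytr, Xval, yval, lam=1.0, mu=1e-4, lr=0.1, steps=100):
    # Zeroth-order hyper-gradient descent: the gradient of the validation loss
    # w.r.t. lam is never computed analytically, only estimated from two
    # function evaluations (a central finite difference with smoothing mu).
    best_lam = lam
    best_loss = val_loss(lam, Xtr, ytr, Xval, yval)
    for _ in range(steps):
        g = (val_loss(lam + mu, Xtr, ytr, Xval, yval)
             - val_loss(lam - mu, Xtr, ytr, Xval, yval)) / (2 * mu)
        lam = max(lam - lr * g, 1e-8)  # keep the hyperparameter positive
        loss = val_loss(lam, Xtr, ytr, Xval, yval)
        if loss < best_loss:
            best_lam, best_loss = lam, loss
    return best_lam, best_loss
```

Because only function values of the validation loss are needed, the same scheme extends to hyperparameters of learners with no differentiable closed form, which is the appeal of zeroth-order hyper-gradients when many hyperparameters must be tuned jointly.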


Speaker Bio:

Bin Gu is a professor at the School of Computer and Software, Nanjing University of Information Science and Technology. He received his bachelor's and doctoral degrees from Nanjing University of Aeronautics and Astronautics. He spent a total of four years in postdoctoral research: in 2013 with Prof. Charles X. Ling at the University of Western Ontario, and from 2016 with Prof. Heng Huang at the University of Texas at Arlington and the University of Pittsburgh. His research focuses on machine learning, big-data optimization, and related areas. He has published more than 40 papers in journals and conferences such as TPAMI, TNNLS, NeurIPS, ICML, KDD, and AAAI, and regularly serves as a reviewer for these venues. According to Google Scholar, his papers have been cited more than 2,000 times.




