Title: Theories and Open Problems of Extreme Learning Machines (ELM)
Abstract: Extreme Learning Machines (ELM), as a common learning mechanism, have become popular in recent years. This talk shares ELM theories: 1) interpolation theorems, universal approximation theories, and universal classification theories; 2) why SVM provides suboptimal solutions compared to ELM; 3) how ELM unifies Support Vector Machines (SVM), Principal Component Analysis (PCA), and Non-negative Matrix Factorization (NMF); 4) how ELM provides theoretical support for the universal approximation and classification capabilities of Convolutional Neural Networks (CNN); 5) how ELM theories further prove that such a learning system happens to have regression, classification, sparse coding, clustering, compression, and feature learning capabilities, which are fundamental to cognition and reasoning. This talk also points out some interesting open theoretical problems awaiting mathematical solutions.
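For readers unfamiliar with the basic mechanism the abstract refers to, the standard single-hidden-layer ELM trains by assigning hidden-layer weights at random and solving the output weights analytically with the Moore-Penrose pseudoinverse. The sketch below illustrates that procedure only; the function names, activation choice, and network size are illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, T, n_hidden=30):
    """Train a basic ELM: random hidden weights, least-squares output weights."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights, never tuned
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # analytic solution via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: regress a noisy sine curve.
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
T = np.sin(X) + 0.05 * rng.standard_normal(X.shape)
W, b, beta = elm_fit(X, T)
pred = elm_predict(X, W, b, beta)
```

Because only `beta` is learned, and in closed form, training requires no iterative gradient descent; this is the source of ELM's speed advantage discussed in the ELM literature.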
Speaker: Guang-Bin Huang (黄广斌), Tenured Professor, Nanyang Technological University, Singapore
Time: 3:30 p.m., Friday, July 5, 2019
Venue: Lecture Hall, School of Mathematical Sciences, Yangzhou University (Room 103, Building 38)
Host: School of Mathematical Sciences, Yangzhou University
All faculty and students are welcome to attend!