Seminar Announcement: Hierarchical Kernels in Deep Kernel Learning

Posted: 2024-10-23

Title: Hierarchical Kernels in Deep Kernel Learning

Speaker: 张海樟

Time: 2024-10-29 14:30

Venue: Tencent Meeting 176-544-527

Host: Department of Mathematics, School of Mathematics and Computer Science


Abstract:

Classical kernel methods enjoy a solid mathematical foundation but have difficulty handling very complicated learning problems. In contrast, deep learning based on deep neural networks has achieved great success in complicated learning problems including face recognition, speech recognition, game intelligence, natural language processing, and autonomous navigation. However, current deep learning methods are not well understood mathematically, which hinders their interpretability. Recently, there have been efforts to develop deep kernel learning in the hope of combining the advantages of kernel methods and deep learning. Such approaches construct hierarchical kernels via consecutive compositions of widely used reproducing kernels. In this talk, we characterize the reproducing kernel Hilbert spaces of hierarchical kernels, and study conditions ensuring that the reproducing kernel Hilbert space expands as the number of layers of the hierarchical kernel increases. The results yield guidance for the construction of hierarchical kernels for deep kernel learning.
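To make the idea of "consecutive compositions" concrete, the following is a minimal illustrative sketch (not the speaker's construction) of one common way to build a hierarchical kernel: start from a Gaussian base kernel and, at each layer, re-apply the Gaussian form to the squared distance measured in the feature space induced by the previous layer's kernel. The function names and the choice of base kernel are assumptions for illustration only.

```python
import numpy as np

def rbf(u, v, sigma=1.0):
    """Base reproducing kernel: the Gaussian (RBF) kernel."""
    return np.exp(-np.sum((u - v) ** 2) / (2 * sigma ** 2))

def hierarchical_kernel(x, y, layers=3, sigma=1.0):
    """Compose the Gaussian kernel with itself `layers` times.

    At each layer, the squared distance is measured in the feature
    space of the previous layer's kernel:
        d_l(x, y)^2 = k_{l-1}(x, x) + k_{l-1}(y, y) - 2 * k_{l-1}(x, y),
    and the next kernel is k_l(x, y) = exp(-d_l^2 / (2 * sigma^2)).
    """
    kxy = rbf(x, y, sigma)          # layer-1 kernel value
    kxx = kyy = 1.0                 # Gaussian kernel of a point with itself is 1
    for _ in range(layers - 1):
        d2 = kxx + kyy - 2.0 * kxy  # squared feature-space distance
        kxy = np.exp(-d2 / (2 * sigma ** 2))
        # k_l(x, x) = exp(0) = 1 at every layer, so kxx, kyy stay 1
    return kxy
```

Each layer yields a valid positive-definite kernel with its own reproducing kernel Hilbert space; the question addressed in the talk is how these spaces relate as the depth grows.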


Speaker bio:

张海樟 is a professor at the School of Mathematics (Zhuhai), Sun Yat-sen University. His research interests include learning theory, applied harmonic analysis, and function approximation. His representative results include a Weierstrass approximation theorem for reproducing kernels, a convergence theory for deep neural networks, and the theory of reproducing kernel Banach spaces, which he pioneered internationally. A psychological classification method based on reproducing kernel Banach spaces was included in the New Handbook of Mathematical Psychology published by Cambridge University Press. He has published original work in the Journal of Machine Learning Research, Applied and Computational Harmonic Analysis, Neural Networks, Neural Computation, Neurocomputing, the Journal of Approximation Theory, and IEEE Transactions journals, with his most-cited single paper receiving over 360 citations by others. He has led several national and provincial/ministerial research grants.