

"高屋建瓴" AI Open Lecture Series, Session 17: Sample Complexity and Regularized Schemes of Graph Convolutional Networks


Time: Wednesday, June 1, 2022, 15:00–16:00

Tencent Meeting: 962 346 579

Host: Liu Yong (刘勇), tenure-track Associate Professor, beat365体育亚洲官网在线下载

Speaker: Lyu Shaogao (吕绍高), Professor, School of Statistics and Data Science, Nanjing Audit University

Speaker bio: He received his Ph.D. in 2011 through a joint program between the University of Science and Technology of China and City University of Hong Kong, and worked at Southwestern University of Finance and Economics from 2011 to 2018. His main research area is statistical machine learning; his current interests include distributed learning, structured prediction, and deep learning. To date he has published more than 30 papers in SCI-indexed venues, including the top statistics journal Annals of Statistics, the top artificial intelligence journal Journal of Machine Learning Research, the top conference NeurIPS, and the top econometrics journal Journal of Econometrics.

Title: Sample Complexity and Regularized Schemes of Graph Convolutional Networks

Abstract: This talk studies the sample complexity and a regularized learning scheme for graph convolutional networks (GCNs). First, we provide a tight upper bound on the Rademacher complexity of GCN models with a single hidden layer. Under regularity conditions, these derived complexity bounds depend explicitly on the largest eigenvalue of the graph convolution filter and on the degree distribution of the graph. We also provide a lower bound on the Rademacher complexity of GCNs, showing the optimality of our derived upper bounds. Second, we aim to quantify the trade-off in GCNs between smoothness and sparsity, with the help of a new stochastic learning scheme proposed in this work. For a single-layer GCN, we develop an explicit theoretical understanding of GCNs under regularized stochastic learning by analyzing the stability of our regularized stochastic algorithm. In particular, we prove that the uniform stability of our GCN depends on the largest absolute eigenvalue of its graph filter, and that there exists a stability-sparsity trade-off as p varies. Several empirical experiments are conducted to validate our theoretical findings.
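Both results in the abstract hinge on the largest (absolute) eigenvalue of the graph convolution filter. As a minimal illustration only (not taken from the talk), the following sketch computes that quantity for the standard symmetrically normalized filter with self-loops, D^{-1/2}(A + I)D^{-1/2}, on a toy graph; for this particular filter the spectral radius is at most 1.

```python
import numpy as np

def gcn_filter(adj):
    """Symmetrically normalized GCN filter with self-loops:
    D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + np.eye(adj.shape[0])          # add self-loops
    d = a.sum(axis=1)                       # degrees of the augmented graph
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return d_inv_sqrt @ a @ d_inv_sqrt

def largest_abs_eigenvalue(m):
    """Largest absolute eigenvalue (spectral radius) of a symmetric matrix."""
    return float(np.max(np.abs(np.linalg.eigvalsh(m))))

# Toy example: a 4-node path graph.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

lam = largest_abs_eigenvalue(gcn_filter(adj))
print(round(lam, 6))  # prints 1.0 for this normalized filter
```

For this normalization the filter is similar to a row-stochastic matrix, so its spectral radius equals 1; other filter choices can have larger eigenvalues, which is why the quantity enters the complexity and stability bounds.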