Mixture of Message Passing Experts with Routing Entropy Regularization for Node Classification
Xuanze Chen, Jiajun Zhou, Yadong Li, Jinsong Chen, Shanqing Yu, Qi Xuan
Graph neural networks (GNNs) have achieved significant progress in graph-based learning tasks, yet their performance often deteriorates when facing heterophilous structures where connected nodes differ substantially in features and labels. To address this limitation, we propose GNNMoE, a novel entropy-driven mixture of message-passing experts framework that enables node-level adaptive representation learning. GNNMoE decomposes message passing into propagation and transformation operations and integrates them through multiple expert networks guided by a hybrid routing mechanism. A routing entropy regularization dynamically adjusts soft weighting and soft top-k routing, allowing GNNMoE to flexibly adapt to diverse neighborhood contexts. Extensive experiments on 12 benchmark datasets demonstrate that GNNMoE consistently outperforms state-of-the-art node classification methods while maintaining scalability and interpretability. This work provides a unified and principled approach to fine-grained, personalized node representation learning.
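To make the routing idea concrete, the sketch below shows per-node soft routing over experts with an entropy term on the routing distribution, assuming a PyTorch setting. The gate design, the top-k renormalization, and where the entropy term enters the loss are illustrative assumptions, not GNNMoE's exact formulation; `EntropyRegularizedRouter` and its parameters are hypothetical names.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntropyRegularizedRouter(nn.Module):
    """Per-node soft routing over message-passing experts (illustrative sketch).

    The gate architecture, top-k size, and entropy usage here are assumptions
    for illustration, not the paper's exact design.
    """

    def __init__(self, hidden_dim: int, num_experts: int, k: int = 2):
        super().__init__()
        self.gate = nn.Linear(hidden_dim, num_experts)  # routing logits per node
        self.k = k

    def forward(self, h: torch.Tensor):
        logits = self.gate(h)                    # [num_nodes, num_experts]
        weights = F.softmax(logits, dim=-1)      # soft routing weights
        # Soft top-k: zero out all but the k largest weights, then renormalize,
        # so each node mixes only its k most relevant experts.
        _, topk_idx = weights.topk(self.k, dim=-1)
        mask = torch.zeros_like(weights).scatter_(-1, topk_idx, 1.0)
        routed = weights * mask
        routed = routed / routed.sum(dim=-1, keepdim=True)
        # Mean routing entropy over nodes; adding it to the task loss with a
        # coefficient regularizes how concentrated the routing becomes.
        entropy = -(weights * weights.clamp_min(1e-9).log()).sum(dim=-1).mean()
        return routed, entropy


if __name__ == "__main__":
    router = EntropyRegularizedRouter(hidden_dim=64, num_experts=4, k=2)
    h = torch.randn(10, 64)                      # 10 nodes, 64-dim embeddings
    routed, entropy = router(h)
    # Expert outputs would be combined with the routed weights, e.g. for
    # experts[i] mapping [num_nodes, hidden_dim] -> [num_nodes, hidden_dim]:
    # out = sum(routed[:, i:i + 1] * experts[i](h) for i in range(4))
    print(routed.shape, entropy.item())
```

A scheduled entropy coefficient would let training move between exploratory soft weighting (high entropy) and sparse top-k routing (low entropy), which is one plausible reading of the "dynamically adjusts" claim in the abstract.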