
Federated Neural Nonparametric Point Processes

Authors :
Chen, Hui
Liu, Hengyu
Li, Yaqiong
Fan, Xuhui
Zhao, Zhilin
Zhou, Feng
Quinn, Christopher John
Cao, Longbing
Publication Year :
2024

Abstract

Temporal point processes (TPPs) are effective for modeling event occurrences over time, but they struggle with sparse and uncertain events in federated systems, where privacy is a major concern. To address this, we propose FedPP, a Federated neural nonparametric Point Process model. On the client side, FedPP integrates neural embeddings into Sigmoidal Gaussian Cox Processes (SGCPs), a flexible and expressive class of TPPs, allowing it to generate highly flexible intensity functions that capture client-specific event dynamics and uncertainties while efficiently summarizing historical records. For global aggregation, FedPP introduces a divergence-based mechanism that communicates the distributions of SGCPs' kernel hyperparameters between the server and clients, while keeping client-specific parameters local to ensure privacy and personalization. FedPP effectively captures event uncertainty and sparsity, and extensive experiments demonstrate its superior performance in federated settings, particularly with KL-divergence and Wasserstein-distance based global aggregation.
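The abstract describes a server-client exchange in which distributions over SGCP kernel hyperparameters are aggregated with KL or Wasserstein divergences, while client-specific parameters stay local. In an SGCP the intensity typically takes the form lambda(t) = lambda_max * sigmoid(f(t)) with f drawn from a Gaussian process. The minimal Python sketch below illustrates one way Gaussian distributions over a scalar kernel hyperparameter (e.g. a log-lengthscale) could be aggregated on the server under either divergence; the function names, the one-dimensional Gaussian assumption, and the specific aggregation rules are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

# Hypothetical sketch of server-side aggregation of per-client Gaussian
# distributions over an SGCP kernel hyperparameter (e.g. log-lengthscale).
# The aggregation rules below are standard closed forms for Gaussians and
# are used here only to illustrate the idea; FedPP's exact mechanism is
# described in the paper, not reproduced here.

def kl_aggregate(means, variances):
    """Gaussian p minimizing sum_k KL(q_k || p): moment matching of the
    equally weighted mixture of client Gaussians q_k."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    mu = means.mean()
    # Mixture second moment around the global mean.
    var = (variances + (means - mu) ** 2).mean()
    return mu, var

def w2_aggregate(means, variances):
    """Wasserstein-2 barycenter of 1-D Gaussians with equal weights:
    average the means and the standard deviations."""
    means = np.asarray(means, dtype=float)
    stds = np.sqrt(np.asarray(variances, dtype=float))
    return means.mean(), stds.mean() ** 2

if __name__ == "__main__":
    # Each client reports a Gaussian over its local hyperparameter.
    client_means = [0.1, -0.3, 0.4]
    client_vars = [0.05, 0.10, 0.02]
    print("KL aggregate:", kl_aggregate(client_means, client_vars))
    print("W2 aggregate:", w2_aggregate(client_means, client_vars))
```

The two rules differ in how they treat disagreement between clients: the KL (moment-matching) aggregate inflates the global variance when client means are spread out, whereas the Wasserstein barycenter only averages the individual spreads.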

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2410.05637
Document Type :
Working Paper