2022
Lu, Chien; Peltonen, Jaakko; Nummenmaa, Timo; Nummenmaa, Jyrki
Nonparametric Exponential Family Graph Embeddings for Multiple Representation Learning
In: Proceedings of Machine Learning Research, JMLR, 2022.
@inproceedings{Lu2022,
title = {Nonparametric Exponential Family Graph Embeddings for Multiple Representation Learning},
author = {Chien Lu and Jaakko Peltonen and Timo Nummenmaa and Jyrki Nummenmaa},
url = {https://proceedings.mlr.press/v180/lu22a/lu22a.pdf},
year = {2022},
date = {2022-09-26},
urldate = {2022-09-26},
booktitle = {Proceedings of Machine Learning Research},
issuetitle = {The 38th Conference on Uncertainty in Artificial Intelligence},
publisher = {JMLR},
abstract = {In graph data, each node often serves multiple functionalities. However, most graph embedding models assume that each node can only possess one representation. We address this issue by proposing a nonparametric graph embedding model. The model allows each node to learn multiple representations where they are needed to represent the complexity of random walks in the graph. It extends the Exponential family graph embedding model with two nonparametric prior settings, the Dirichlet process and the uniform process. The model combines the ability of Exponential family graph embedding to take the number of occurrences of context nodes into account with nonparametric priors, giving it the flexibility to learn more than one latent representation for each node. The learned embeddings outperform other state-of-the-art approaches in link prediction and node classification tasks.},
keywords = {Algorithms, Embedding model, Graph data, Nonparametric},
pubstate = {published},
tppubtype = {inproceedings}
}
