Predicting pairwise relations with neural similarity encoders

F. Horn, K. R. Müller

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)


Matrix factorization is at the heart of many machine learning algorithms, for example, dimensionality reduction (e.g. kernel PCA) or recommender systems relying on collaborative filtering. Understanding a singular value decomposition (SVD) of a matrix as a neural network optimization problem enables us to decompose large matrices efficiently while dealing naturally with missing values in the given matrix. But most importantly, it allows us to learn the connection between data points’ feature vectors and the matrix containing information about their pairwise relations. In this paper we introduce a novel neural network architecture termed similarity encoder (SimEc), which is designed to simultaneously factorize a given target matrix while also learning the mapping to project the data points’ feature vectors into a similarity preserving embedding space. This makes it possible to, for example, easily compute out-of-sample solutions for new data points. Additionally, we demonstrate that SimEc can preserve non-metric similarities and even predict multiple pairwise relations between data points at once.
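The core idea of the abstract — learning a mapping from feature vectors into an embedding space whose inner products approximate a given target similarity matrix — can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it is a linear, NumPy-only toy version (a single-layer "encoder" trained by plain gradient descent, with a hypothetical learning rate and iteration count) that factorizes a linear-kernel target matrix and shows how out-of-sample points are embedded with the learned weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 points with 20 features each
X = rng.standard_normal((100, 20))
# Target matrix of pairwise relations; here the linear kernel,
# but any (possibly non-metric) similarity matrix could be plugged in.
S = X @ X.T

d = 5                                    # embedding dimension
W = 0.01 * rng.standard_normal((20, d))  # linear "encoder" weights
lr = 1e-5                                # hypothetical learning rate

for _ in range(500):
    Y = X @ W              # embed the feature vectors
    R = Y @ Y.T - S        # residual of the similarity reconstruction
    # Gradient of ||Y Y^T - S||_F^2 with respect to W (R is symmetric)
    grad = 4 * X.T @ R @ Y
    W -= lr * grad

# Relative reconstruction error of the pairwise similarities
Y = X @ W
err = np.linalg.norm(Y @ Y.T - S) / np.linalg.norm(S)

# Out-of-sample embedding: new points reuse the learned mapping directly
x_new = rng.standard_normal((1, 20))
y_new = x_new @ W
```

Because the embedding is produced by an explicit parametric map rather than by factorizing the matrix entries alone, new points are handled by a single matrix product, which is the out-of-sample property the abstract emphasizes; a full SimEc would replace the linear map with a deeper network.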

Original language: English
Pages (from-to): 821-830
Number of pages: 10
Journal: Bulletin of the Polish Academy of Sciences: Technical Sciences
Publication status: Published - 2018


Keywords

  • Dimensionality reduction
  • Kernel PCA
  • Matrix factorization
  • Neural networks
  • SVD
  • Similarity preserving embeddings

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
  • Information Systems
  • Engineering (all)
  • Computer Networks and Communications
  • Artificial Intelligence
