Predicting pairwise relations with neural similarity encoders

F. Horn, Klaus-Robert Müller

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Matrix factorization is at the heart of many machine learning algorithms, for example, dimensionality reduction (e.g. kernel PCA) or recommender systems relying on collaborative filtering. Understanding a singular value decomposition (SVD) of a matrix as a neural network optimization problem enables us to decompose large matrices efficiently while dealing naturally with missing values in the given matrix. But most importantly, it allows us to learn the connection between data points’ feature vectors and the matrix containing information about their pairwise relations. In this paper we introduce a novel neural network architecture termed similarity encoder (SimEc), which is designed to simultaneously factorize a given target matrix while also learning the mapping to project the data points’ feature vectors into a similarity preserving embedding space. This makes it possible to, for example, easily compute out-of-sample solutions for new data points. Additionally, we demonstrate that SimEc can preserve non-metric similarities and even predict multiple pairwise relations between data points at once.
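The core idea described in the abstract (a feedforward encoder maps feature vectors into a low-dimensional embedding, and a linear output layer reconstructs the target similarity matrix from that embedding) can be sketched in a few lines. The snippet below is a minimal illustration in PyTorch, not the authors' reference implementation: the hidden-layer size, activation, squared-error loss, optimizer, and the linear-kernel toy target are all assumptions made for the example.

    import torch
    import torch.nn as nn

    class SimEc(nn.Module):
        """Minimal similarity encoder sketch: maps feature vectors to a
        low-dimensional embedding and reconstructs rows of a target
        similarity matrix from it."""

        def __init__(self, n_features: int, embedding_dim: int, n_targets: int):
            super().__init__()
            # encoder: feature vector -> similarity preserving embedding
            self.encoder = nn.Sequential(
                nn.Linear(n_features, 100),
                nn.Tanh(),
                nn.Linear(100, embedding_dim),
            )
            # linear output layer: embedding -> approximated similarities
            # (plays the role of the second factor in a matrix factorization)
            self.decoder = nn.Linear(embedding_dim, n_targets, bias=False)

        def forward(self, x):
            emb = self.encoder(x)         # explicit mapping -> out-of-sample embeddings
            s_approx = self.decoder(emb)  # reconstructed rows of the similarity matrix
            return emb, s_approx

    def train_simec(model, X, S, mask=None, epochs=200, lr=1e-3):
        """Fit the encoder so the reconstructed similarities match S.
        An optional mask (same shape as S) ignores missing entries in the loss."""
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            _, s_approx = model(X)
            err = (s_approx - S) ** 2
            if mask is not None:
                err = err * mask
            loss = err.mean()
            loss.backward()
            opt.step()
        return model

    if __name__ == "__main__":
        # toy example: 50 points with 20 features, target = linear-kernel matrix
        X = torch.randn(50, 20)
        S = X @ X.T
        model = SimEc(n_features=20, embedding_dim=5, n_targets=50)
        train_simec(model, X, S)
        emb, _ = model(torch.randn(3, 20))  # embed unseen points with a forward pass

Because the encoder is an explicit, learned mapping from feature vectors to the embedding space, projecting a new data point only requires a forward pass, which is what makes out-of-sample solutions straightforward, and masking missing entries in the loss is how incomplete target matrices can be handled.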

Original language: English
Pages (from-to): 821-830
Number of pages: 10
Journal: Bulletin of the Polish Academy of Sciences: Technical Sciences
DOIs
Publication status: Published - 2018 Jan 1

Keywords

  • Dimensionality reduction
  • Kernel PCA
  • Matrix factorization
  • Neural networks
  • Similarity preserving embeddings
  • SVD

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
  • Information Systems
  • Engineering (all)
  • Computer Networks and Communications
  • Artificial Intelligence
