Partial Trace Regression and Low-Rank Kraus Decomposition

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020), pre-proceedings


Hachem Kadri, Stephane Ayache, Riikka Huusari, Alain Rakotomamonjy, Liva Ralaivola


<p>The trace regression model, a direct extension of the well-studied linear regression model, maps matrices to real-valued outputs. We introduce a yet more general model, the partial trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks that can be formulated as partial trace regression problems.</p>
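To make the central object concrete, here is a minimal sketch of a completely positive map in low-rank Kraus form, X ↦ Σ_j A_j X A_jᵀ, applied to a PSD input. The dimensions, the Kraus rank r, and the random operators are illustrative assumptions, not the paper's learned parameters; the point is only that a Kraus-form map sends PSD matrices to PSD matrices and can change the output dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r = 4, 3, 2  # input dim, output dim, Kraus rank (all illustrative)

# Random Kraus operators A_j of shape (d_out, d_in); in the paper's setting
# these would be learned from (input matrix, output matrix) pairs.
kraus_ops = [rng.standard_normal((d_out, d_in)) for _ in range(r)]

def cp_map(X, kraus_ops):
    """Completely positive map in Kraus form: X -> sum_j A_j X A_j^T."""
    return sum(A @ X @ A.T for A in kraus_ops)

# Build a PSD input matrix
B = rng.standard_normal((d_in, d_in))
X = B @ B.T

Y = cp_map(X, kraus_ops)
print(Y.shape)                              # (3, 3): dimension can change
print(np.all(np.linalg.eigvalsh(Y) >= -1e-10))  # True: PSD is preserved
```

The Kraus rank r controls the expressiveness of the map, which is what makes the low-rank parameterization attractive for learning.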