Abstract
In this paper, we argue that many basic machine learning algorithms, including the support vector machine (SVM) for classification, principal component analysis (PCA) for dimensionality reduction, and regression for dependency estimation, require only the inner products of the data samples rather than the data samples themselves. Motivated by this observation, we introduce the problem of private inner product retrieval for distributed machine learning, in which a database of files is replicated across several non-colluding servers. A user wishes to retrieve a subset, of a given size, of the set of inner products of every pair of data items in the database with minimum communication load, without revealing any information about the identity of the requested subset. For achievability, we use algorithms for multi-message private information retrieval. For the converse, we establish that as the length of the files becomes large, the set of all inner products converges to independent random variables with uniform distribution, and hence we characterize the asymptotic capacity of this problem. We also derive the rate of this convergence. To prove this, we construct special dependencies among the sequences of the sets of all inner products for different lengths, which form a time-homogeneous irreducible Markov chain, without affecting the marginal distributions. We show that this Markov chain has the uniform distribution as its unique stationary distribution, with a rate of convergence dominated by the second largest eigenvalue of the transition probability matrix. This allows us to develop a converse bound, which becomes tight in some cases as the size of the files grows.
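The following sketch is only an illustration of the convergence argument described above, not the paper's actual construction: it builds a hypothetical time-homogeneous, irreducible, aperiodic Markov chain whose transition matrix is doubly stochastic, so the uniform distribution is its unique stationary distribution, and shows numerically that the total variation distance to uniform decays at a rate governed by the second largest eigenvalue modulus. The state space size, the mixture of permutation matrices, and the use of NumPy are all assumptions made for the example.

```python
import numpy as np

# Illustrative (hypothetical) chain: a convex combination of permutation
# matrices is doubly stochastic, so the uniform distribution is stationary.
# Including the identity gives self-loops (aperiodicity); including the
# cyclic shift connects all states (irreducibility).
rng = np.random.default_rng(0)
n = 8  # number of states, chosen arbitrarily for the illustration

weights = rng.dirichlet(np.ones(3))
identity = np.eye(n)
shift = np.roll(np.eye(n), 1, axis=1)      # cyclic shift: state i -> i+1 mod n
random_perm = np.eye(n)[rng.permutation(n)]
P = weights[0] * identity + weights[1] * shift + weights[2] * random_perm

# Second largest eigenvalue modulus of the transition matrix.
moduli = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
lambda2 = moduli[1]
print(f"second largest eigenvalue modulus: {lambda2:.4f}")

uniform = np.full(n, 1.0 / n)
mu = np.eye(n)[0]  # start deterministically in state 0

# Total variation distance to uniform shrinks geometrically, tracking lambda2^t.
for t in range(1, 16):
    mu = mu @ P
    tv = 0.5 * np.abs(mu - uniform).sum()
    print(f"t={t:2d}  TV(mu_t, uniform) = {tv:.3e}   lambda2^t = {lambda2**t:.3e}")
```

Under these assumptions, the printed distances decay geometrically alongside lambda2^t, mirroring the role the second largest eigenvalue plays in the convergence-rate argument of the abstract.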
Original language | English (US) |
---|---|
Title of host publication | 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 355-359 |
Number of pages | 5 |
ISBN (Electronic) | 9781538692912 |
DOIs | |
State | Published - Jul 2019 |
Externally published | Yes |
Event | 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Paris, France Duration: Jul 7 2019 → Jul 12 2019 |
Publication series
Name | IEEE International Symposium on Information Theory - Proceedings |
---|---|
Volume | 2019-July |
ISSN (Print) | 2157-8095 |
Conference
Conference | 2019 IEEE International Symposium on Information Theory, ISIT 2019 |
---|---|
Country/Territory | France |
City | Paris |
Period | 7/7/19 → 7/12/19 |
Bibliographical note
Publisher Copyright: © 2019 IEEE.