STOC 2018 talks and posters

I’m heading out to Los Angeles to attend STOC 2018 (and, more generally, TheoryFest). My co-authors will present two papers that I was lucky to be involved in:

Data-Dependent Hashing via Nonlinear Spectral Gaps (joint with Alex Andoni, Assaf Naor, Sasho Nikolov and Erik Waingarten). Paper, poster. In this paper, we develop a new technique for randomized partitioning of general metric spaces, based on estimates for nonlinear spectral gaps (see here for a thorough overview). These random partitions yield very efficient data structures in the cell-probe model for the high-dimensional approximate nearest neighbor search problem (ANN; see here for a survey) over general metric spaces. With more work, we can get genuinely time-efficient data structures for several cases of interest: most notably, in a forthcoming paper, we obtain an ANN algorithm for a general $d$-dimensional normed space with approximation $2^{\widetilde{O}(\sqrt{\log d})}$ (the previous best was the trivial $O(\sqrt{d})$ bound, which readily follows from the classic John’s theorem).
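
For a quick sense of where that baseline comes from (the standard argument, sketched here for context rather than taken from the paper): John’s theorem provides, for any $d$-dimensional normed space $X$, a Euclidean norm $|\cdot|$ on $\mathbb{R}^d$ such that

$$|x| \le \|x\|_X \le \sqrt{d} \cdot |x| \quad \text{for all } x \in \mathbb{R}^d,$$

so a $c$-approximate nearest neighbor with respect to $|\cdot|$ is automatically a $c\sqrt{d}$-approximate nearest neighbor with respect to $\|\cdot\|_X$, and plugging in any constant-factor Euclidean ANN data structure gives the $O(\sqrt{d})$ bound.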

This paper is the culmination of a line of work (“nearest neighbor search for general distances”) that I have been involved in for the past three years (see also our previous paper, where we essentially settled the case of symmetric normed spaces). The connection with spectral gaps that we discovered is, in my opinion, a true gem, one we had been overlooking for years. I will blog about it in greater detail later.

Nonlinear Dimension Reduction via Outer Bi-Lipschitz Extensions (joint with Sepideh Mahabadi, Konstantin Makarychev and Yury Makarychev). Paper, poster. We develop a new geometric primitive, an outer bi-Lipschitz extension, and use it to solve two open problems posed by Elkin, Filtser and Neiman (here and here). In both cases, we obtain new dimension reduction results that are inherently nonlinear.
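
Informally (my paraphrase; see the paper for the precise statement), an outer extension of a map $f \colon X \to \mathbb{R}^{d'}$ defined on a subset $X \subseteq \mathbb{R}^d$ is a map $f' \colon \mathbb{R}^d \to \mathbb{R}^{d''}$ with $d'' \ge d'$ that agrees with $f$ on $X$ after padding the image with zero coordinates, i.e., $f'(x) = (f(x), 0, \ldots, 0)$ for every $x \in X$; the word “outer” refers to the freedom to enlarge the target space. The outer bi-Lipschitz extension question then asks when a bi-Lipschitz $f$ admits such an extension that is itself bi-Lipschitz, losing as little as possible in the distortion.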
