ASOT paper accepted to IEEE Trans. on Neural Networks and Learning Systems
“Anchor Space Optimal Transport: Accelerating Batch Processing of Multiple OT Problems” has been accepted to IEEE Trans. on Neural Networks and Learning Systems
Our paper has been accepted to IEEE Trans. on Neural Networks and Learning Systems (TNNLS; IF: 10.2), one of the top-ranked journals in the machine learning field.
- Title: Anchor Space Optimal Transport: Accelerating Batch Processing of Multiple OT Problems
- Authors: Jianming Huang (D3), Xun Su (D1), Zhongxi Fang (D2), and HK
- Abstract: Optimal transport (OT) theory provides an effective way to compare probability distributions on a defined metric space, but it suffers from cubic computational complexity. Although Sinkhorn’s algorithm greatly reduces this complexity, solving multiple OT problems remains time- and memory-consuming in practice. Moreover, most works on accelerating OT computation assume a single OT problem, ignoring the common characteristics potentially shared by the distributions in a mini-batch. Therefore, we propose a translated OT problem, designated the anchor space optimal transport (ASOT) problem, which is specifically designed for batch processing of multiple OT problems. In the proposed ASOT problem, the distributions are mapped into a shared anchor point space, which captures their common characteristics and thus helps accelerate OT batch processing. We prove that the Wasserstein distance error of the proposed ASOT with respect to the original OT problem is bounded by the ground cost error. Building upon this, we propose three methods to learn an anchor space that minimizes the distance error, each suited to a different application setting. Numerical experiments on real-world datasets show that our proposed methods greatly reduce computation time while maintaining reasonable approximation performance.
- Paper: arXiv preprint arXiv:2310.16123
- Source code: Will be available soon. In the meantime, a rough conceptual sketch of the anchor-space idea is given below.
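The following is a minimal, hypothetical NumPy sketch of the general anchor-space principle, not the authors' implementation: it uses plain k-means centroids as a stand-in anchor space (the paper instead proposes three dedicated methods for learning the anchor space) and a basic Sinkhorn solver. The point it illustrates is that once every distribution in a mini-batch is projected onto the same small set of anchors, all OT problems in the batch share one small cost matrix.

```python
import numpy as np

def kmeans_anchors(points, k, iters=50, seed=0):
    # Plain k-means centroids stand in for the learned anchor space (assumption).
    rng = np.random.default_rng(seed)
    anchors = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        d = ((points[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                anchors[j] = points[assign == j].mean(axis=0)
    return anchors

def to_anchor_histogram(points, weights, anchors):
    # Project a weighted point cloud onto the shared anchors by nearest-anchor assignment.
    d = ((points[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    hist = np.zeros(len(anchors))
    np.add.at(hist, d.argmin(axis=1), weights)
    hist += 1e-9                      # avoid empty bins in the Sinkhorn iterations
    return hist / hist.sum()

def sinkhorn(a, b, M, reg=0.05, iters=200):
    # Entropic OT between two histograms over the anchors; returns the cost <P, M>.
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return (P * M).sum()

# Usage: compare a mini-batch of point clouds against one reference cloud.
rng = np.random.default_rng(1)
clouds = [rng.normal(size=(200, 2)) + rng.normal(size=2) for _ in range(8)]
ref = rng.normal(size=(200, 2))

# The anchor space and its k x k ground cost matrix are built once for the whole batch.
anchors = kmeans_anchors(np.vstack(clouds + [ref]), k=32)
C = ((anchors[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
C = C / C.max()                       # normalize the ground cost for numerical stability

ref_hist = to_anchor_histogram(ref, np.full(200, 1 / 200), anchors)
for X in clouds:
    h = to_anchor_histogram(X, np.full(200, 1 / 200), anchors)
    # Every OT problem in the batch is now a small k x k problem sharing the same C.
    print(sinkhorn(h, ref_hist, C))
```

Because the k x k anchor cost matrix is computed once and every problem in the batch is reduced to size k, the per-pair Sinkhorn iterations become much cheaper than solving each original n x n problem; the paper's learned anchor spaces additionally keep the resulting Wasserstein distance error under control.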