In this paper, we first introduce Ball Divergence, a novel measure of the difference between two probability measures in separable Banach spaces (and, more generally, metric spaces), and show that the Ball Divergence of two probability measures is zero if and only if these two probability measures are identical, without any moment assumption.
Using Ball Divergence, we present a metric rank test procedure to detect the equality of the distributions underlying two independent samples.
It is therefore robust to outliers and heavy-tailed data. We show that this multivariate two-sample test statistic is consistent with the Ball Divergence, and that it converges to a mixture of chi-squared distributions under the null hypothesis and to a normal distribution under the alternative hypothesis. Importantly, we prove its consistency against a general alternative hypothesis. Moreover, this result does not depend on the ratio of the two sample sizes, so the test can be applied to imbalanced data. Numerical studies confirm that our test is superior to several existing tests in terms of Type I error and power.
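To make the statistic concrete, the following is a minimal sketch of an empirical Ball Divergence between two Euclidean samples: for each pair of points from one sample, it compares the proportions of the two samples falling in the closed ball centered at the first point with radius equal to the pair's distance, and averages the squared differences over both samples. The function name, the use of the Euclidean norm, and all implementation details are our own illustration, not the authors' code.

```python
import numpy as np

def ball_divergence(X, Y):
    """Sketch of an empirical Ball Divergence between samples X (n x d) and Y (m x d)."""
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)

    def half(centers, own, other):
        # Distances from each ball center to every point of each sample.
        d_own = np.linalg.norm(centers[:, None, :] - own[None, :, :], axis=2)
        d_oth = np.linalg.norm(centers[:, None, :] - other[None, :, :], axis=2)
        # Ball radii: pairwise distances between centers.
        r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
        k = len(centers)
        total = 0.0
        for i in range(k):
            for j in range(k):
                # Proportion of each sample inside the ball B(centers[i], r[i, j]).
                p_own = np.mean(d_own[i] <= r[i, j])
                p_oth = np.mean(d_oth[i] <= r[i, j])
                total += (p_own - p_oth) ** 2
        return total / k**2

    # Average over balls centered at X points plus balls centered at Y points.
    return half(X, X, Y) + half(Y, Y, X)
```

By construction the statistic is zero when the two samples coincide and grows as the samples occupy different regions; because it only uses indicator comparisons of distances, it needs no moment conditions, matching the robustness claim above.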
We conclude our paper with two applications of our method: one to virtual screening in the drug development process, and the other to genome-wide expression analysis in hormone replacement therapy.