I have two 3D distributions and I want to run a Kolmogorov–Smirnov test on the two samples to measure their similarity. scipy.stats provides a two-sample K-S test in one dimension, and I found an implementation in two dimensions, but none in three dimensions (or N dimensions).
Can someone implement a 2-sample K-S test for 3D distributions?
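For reference, this is the kind of one-dimensional test I mean, using scipy.stats.ks_2samp (a minimal sketch; the sample data here is made up):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(loc=0.0, scale=1.0, size=1000)
b = rng.normal(loc=0.1, scale=1.2, size=1000)

# Two-sample K-S test, but it only accepts 1-D samples.
result = stats.ks_2samp(a, b)
print(result.statistic, result.pvalue)
```

I'm looking for something with the same interface but accepting (n, 3)-shaped samples.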
The KS test is not easily generalized to multiple dimensions; see the Wikipedia article on the KS test for a discussion of this issue. Even if you can find or create a suitable generalization, I wonder whether you really want to do that: significance testing is generally uninformative on large data sets, because with enough samples even tiny, practically irrelevant differences come out as statistically significant.
If you want to quantify the difference between distributions, my advice is to consider entropy-based quantities such as mutual information or the Kullback-Leibler divergence.
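If the K-L divergence fits your goal, here is a minimal sketch of one way to estimate it from two 3-D samples by binning them on a shared grid and applying scipy.stats.entropy. The bin count, the smoothing constant `eps`, and the helper name `kl_divergence_3d` are arbitrary choices for illustration, and histogram-based estimates degrade quickly as dimensionality and sparsity grow:

```python
import numpy as np
from scipy import stats

def kl_divergence_3d(x, y, bins=10, eps=1e-10):
    """Rough histogram-based estimate of D_KL(P || Q) for two (n, 3) samples."""
    # Build a common set of bin edges per axis so the two histograms are comparable.
    edges = [np.histogram_bin_edges(np.concatenate([x[:, d], y[:, d]]), bins=bins)
             for d in range(3)]
    p, _ = np.histogramdd(x, bins=edges)
    q, _ = np.histogramdd(y, bins=edges)
    # A small constant keeps empty bins from producing an infinite divergence.
    p = p.ravel() + eps
    q = q.ravel() + eps
    p /= p.sum()
    q /= q.sum()
    return stats.entropy(p, q)  # entropy(p, q) is the Kullback-Leibler divergence

rng = np.random.default_rng(0)
x = rng.normal(size=(5000, 3))
y = rng.normal(loc=0.2, size=(5000, 3))
print(kl_divergence_3d(x, y))
```

A k-nearest-neighbour estimator would avoid the binning step, but the idea is the same: you get a single number quantifying how far apart the two distributions are, without a p-value attached.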
Maybe you can say more about what your goals are here.