How to use mahalanobis distance in sklearn DistanceMetrics?
Solution 1
MahalanobisDistance expects a parameter V, which is the covariance matrix, and optionally another parameter VI, which is the inverse of the covariance matrix. Both of these parameters are named, not positional.
Also check the docstring for the class MahalanobisDistance in the file scikit-learn/sklearn/neighbors/dist_metrics.pyx in the sklearn repo.
Example:
In [18]: import numpy as np
In [19]: from sklearn.datasets import make_classification
In [20]: from sklearn.neighbors import DistanceMetric
In [21]: X, y = make_classification()
In [22]: DistanceMetric.get_metric('mahalanobis', V=np.cov(X.T))
Out[22]: <sklearn.neighbors.dist_metrics.MahalanobisDistance at 0x107aefa58>
(Note the transpose: V must have shape (n_features, n_features), and np.cov treats rows as variables by default, so the (n_samples, n_features) matrix X has to be transposed first.)
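To make the shape requirement explicit, here is a self-contained sketch. The try/except import and the n_redundant=0 setting are my additions, not part of the original answer: newer scikit-learn releases moved DistanceMetric to sklearn.metrics, and make_classification's redundant features are exact linear combinations of others, which can make the covariance matrix singular.

```python
import numpy as np
from sklearn.datasets import make_classification

# DistanceMetric moved to sklearn.metrics in newer scikit-learn releases;
# fall back to the old import path for older versions.
try:
    from sklearn.metrics import DistanceMetric
except ImportError:
    from sklearn.neighbors import DistanceMetric

# n_redundant=0 avoids exactly collinear features, which would make
# the covariance matrix singular (and its inverse undefined)
X, y = make_classification(n_redundant=0, random_state=0)

# V must have shape (n_features, n_features); np.cov treats rows as
# variables by default, so transpose the (n_samples, n_features) matrix
V = np.cov(X.T)
metric = DistanceMetric.get_metric('mahalanobis', V=V)

# pairwise() returns the full distance matrix for the given samples
D = metric.pairwise(X[:5])
```

The resulting D is a symmetric (5, 5) matrix with zeros on the diagonal, since each sample is at Mahalanobis distance 0 from itself.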
Edit:
For some reason (possibly a bug), you can't pass the distance object to the NearestNeighbors constructor, but need to use the name of the distance metric. Also, setting algorithm='auto' (which resolves to 'ball_tree' here) doesn't seem to work; so given X from the code above you can do:
In [23]: from sklearn.neighbors import NearestNeighbors
In [24]: nn = NearestNeighbors(algorithm='brute',
    ...:                       metric='mahalanobis',
    ...:                       metric_params={'V': np.cov(X.T)})
# returns the 5 nearest neighbors of that sample
In [25]: nn.fit(X).kneighbors(X[0:1, :])  # the query must be 2-D
Out[25]: (array([[ 0., 3.21120892, 3.81840748, 4.18195987, 4.21977517]]),
          array([[ 0, 36, 46, 5, 17]]))
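On recent scikit-learn versions this snippet needs small adjustments: the covariance must be taken over features (np.cov(X.T)), the query array must be 2-D, and for brute-force search the inverse covariance is passed as VI (the V keyword was later deprecated, as the comments below note). A hedged, runnable sketch under those assumptions (n_redundant=0 is my addition to keep the covariance invertible):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors

# n_redundant=0 keeps the feature covariance invertible
X, y = make_classification(n_redundant=0, random_state=0)

# brute-force mahalanobis (backed by scipy) expects the *inverse*
# covariance matrix, passed as VI
VI = np.linalg.inv(np.cov(X.T))

nn = NearestNeighbors(n_neighbors=5, algorithm='brute',
                      metric='mahalanobis', metric_params={'VI': VI})
nn.fit(X)

# the query must be 2-D: one row per query point
dist, idx = nn.kneighbors(X[0:1, :])

# a training point's nearest neighbour is itself, at distance 0
```

Since the query point is itself part of the training set, the first returned index is 0 and the first distance is 0, with the remaining distances in ascending order.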
Solution 2
When creating the covariance matrix from a data matrix M of shape (X, Y) (X samples, Y features), you need to transpose M. The Mahalanobis formula is (x - x1)^T * inverse_cov_matrix * (x - x1). The first factor is transposed, i.e. a row vector of length Y, so for the product with the covariance matrix to be defined, the covariance matrix must have shape (Y, Y).
If you just use np.cov(M), it will be (X, X); using np.cov(M.T), it will be (Y, Y).
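The shape difference is easy to verify; a quick sketch (the array sizes are arbitrary illustration values):

```python
import numpy as np

rng = np.random.RandomState(0)
M = rng.rand(100, 5)  # 100 samples (rows) x 5 features (columns)

# np.cov treats each *row* as a variable by default
cov_rows = np.cov(M)      # shape (100, 100): not usable for Mahalanobis here
cov_feats = np.cov(M.T)   # shape (5, 5): the per-feature covariance we want
```

Only the (5, 5) matrix matches the length-5 difference vectors in the Mahalanobis formula.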
Comments
-
makansij almost 4 years
Perhaps this is elementary, but I cannot find a good example of using mahalanobis distance in sklearn. I can't even get the metric like this:
from sklearn.neighbors import DistanceMetric
DistanceMetric.get_metric('mahalanobis')
This throws an error: TypeError: 0-dimensional array given. Array must be at least two-dimensional. But I can't even seem to get it to take an array:
DistanceMetric.get_metric('mahalanobis', [[0.5],[0.7]])
throws: TypeError: get_metric() takes exactly 1 positional argument (2 given)
I checked out the docs here and here. But I don't see what types of arguments it is expecting. Is there an example of using the Mahalanobis distance that I can see? -
makansij over 8 years How do you use the distance metric in, say, nearest neighbors or clustering? When I try to use it, I get ValueError: Metric not valid for algorithm 'auto'. -
tttthomasssss over 8 years @Sother I've added a NearestNeighbors example to my answer. -
makansij over 8 years I tried using dm = DistanceMetric.get_metric('mahalanobis', VI=icov) as the distance function, and then db = DBSCAN(eps=x, min_samples=1, metric='pyfunc', func='dm', algorithm='brute').fit(np.array(X_train_numeric)), but it doesn't recognize "func" as a parameter. -
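For later readers: metric='pyfunc' with a func argument was removed from scikit-learn; current versions accept a callable directly as metric. A hedged sketch under that assumption (eps and min_samples are placeholder values, and n_redundant=0 is my choice to keep the covariance invertible):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_classification

X, y = make_classification(n_redundant=0, random_state=0)
VI = np.linalg.inv(np.cov(X.T))

def mahalanobis(a, b):
    # Mahalanobis distance between two 1-D sample vectors
    d = a - b
    return np.sqrt(d @ VI @ d)

# modern scikit-learn accepts a callable metric with brute-force search
db = DBSCAN(eps=3.0, min_samples=2, metric=mahalanobis,
            algorithm='brute').fit(X)
labels = db.labels_
```

A callable metric is evaluated pairwise in Python, so this is slow on large data; see the precomputed-distance approach suggested further down for a faster alternative.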
tttthomasssss over 8 years @Sother I've never used mahalanobis distance with DBSCAN, but it looks as if it is not yet properly supported for DBSCAN - I'd recommend opening an issue on GitHub or asking on the sklearn mailing list. -
OscarVanL about 4 years I get the error 'DeprecationWarning: Got unexpected kwarg V. This will raise an error in a future version.' using your code snippet in the edit. This is because scikit-learn has deprecated the 'V' argument for Mahalanobis distance; now 'VI' must be used. This argument is documented here. It is set to inv(cov(X.T)).T, or np.linalg.inv(np.cov(X_train.transpose())).transpose() in Python. -
tttthomasssss about 4 years @OscarVanL What's your use-case? When you're using DBSCAN, you might need to precompute the pairwise distances yourself, then pass the distance matrix together with metric='precomputed' to the algorithm. -
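A sketch of that precomputed-distances route (eps and min_samples are illustrative values, and n_redundant=0 is my addition so the covariance stays invertible):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_classification
from sklearn.metrics import pairwise_distances

X, y = make_classification(n_redundant=0, random_state=0)
VI = np.linalg.inv(np.cov(X.T))

# precompute the full (n_samples, n_samples) Mahalanobis distance matrix...
D = pairwise_distances(X, metric='mahalanobis', VI=VI)

# ...then hand it to DBSCAN with metric='precomputed'
db = DBSCAN(eps=6.0, min_samples=2, metric='precomputed').fit(D)
```

Precomputing once avoids repeated per-pair Python callbacks and makes the same distance matrix reusable across parameter sweeps.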
OscarVanL about 4 yearsI'm not using DBSCAN, I'm using K-NN with Mahalanobis distance as the distance metric.
-
Ian Conway over 3 yearsDo you know if it's possible to use kd tree or ball tree with mahalanobis distance in sklearn? brute force is not an option at the scale I'm working with
-
Vandana Chandola over 3 years np.cov(X) gave the following error when I used it to find LOF with the Mahalanobis metric: "size of V does not match". Changed it to np.cov(X.T). -
QUEEN over 2 years @tttthomasssss I have often seen that we get an error with Mahalanobis distance when specifying V= (the covariance matrix) in the code. Instead, when I wrote VI=, it ran successfully! I read in the documentation that VI is just the inverse matrix of V and that it's actually optional (we can specify either one), so why am I getting this error? Surprisingly, it doesn't even happen always, just sometimes. Any idea about this strange behavior?