How to put more weight on certain features in machine learning?


First of all - you probably should not do it. The whole point of machine learning is to use statistical analysis to assign optimal weights automatically. By weighting features by hand you are interfering with that process, so you need really strong evidence that the feature is crucial to what you are modelling and that, for some reason, your model is currently missing it.

That being said - there is no general answer. This is purely model specific, and only some models let you weight features. In a random forest you could bias the distribution from which features are sampled at each split towards the ones you are interested in. In an SVM it is enough to multiply a given feature by a constant - remember being told to normalize your features for SVM? This is why: you can use the scale of the features to 'steer' the classifier towards particular ones, because features with larger values will be preferred. This actually works for any weight-norm-regularized model (regularized logistic regression, ridge regression, lasso, etc.).
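
A minimal sketch of the scaling trick with scikit-learn (the library mentioned in the comments). The dataset, the chosen column, and the weight value of 3.0 are all illustrative assumptions, not values from the answer:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy data; in practice use your own X, y.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Normalize first so every feature starts on the same scale.
scaler = StandardScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

# Multiply the feature you want to emphasize (here, column 2 - an arbitrary
# choice) by a constant > 1. A larger scale makes it "cheaper" for the
# norm-regularized model to rely on that feature.
feature_weights = np.ones(X.shape[1])
feature_weights[2] = 3.0  # illustrative weight
X_train_w = X_train_s * feature_weights
X_test_w = X_test_s * feature_weights

clf = SVC(kernel="linear", C=1.0).fit(X_train_w, y_train)
print("test accuracy:", clf.score(X_test_w, y_test))
```

The same pre-scaling step works in front of ridge, lasso, or regularized logistic regression, since all of them penalize the weight norm.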


Comments

  • 28r (over 1 year ago)

    If using a library like scikit-learn, how do I assign more weight to certain features in the input to a classifier like SVM? Is this something people do, or is there another solution to my problem?