How to pass a parameter to Scikit-Learn Keras model function


Solution 1

You can add an input_dim keyword argument to the KerasClassifier constructor:

model = KerasClassifier(build_fn=create_model, input_dim=5, nb_epoch=150, batch_size=10, verbose=0)
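
This works because the wrapper forwards any sk_params entry that matches an argument of build_fn, so create_model itself must accept input_dim. A minimal sketch reusing the question's model (input_dim=5 is just an illustrative value):

from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier

def create_model(input_dim=None):
    # input_dim arrives here from the KerasClassifier(..., input_dim=5) call below
    model = Sequential()
    model.add(Dense(12, input_dim=input_dim, init='uniform', activation='relu'))
    model.add(Dense(6, init='uniform', activation='relu'))
    model.add(Dense(1, init='uniform', activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

# input_dim is a model parameter (forwarded to create_model);
# nb_epoch and batch_size are fitting parameters kept by the wrapper
model = KerasClassifier(build_fn=create_model, input_dim=5, nb_epoch=150, batch_size=10, verbose=0)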

Solution 2

The previous answer no longer works.

An alternative is to return a function from create_model, since KerasClassifier's build_fn expects a callable:

def create_model(input_dim=None):
    def model():
        # create model
        nn = Sequential()
        nn.add(Dense(12, input_dim=input_dim, init='uniform', activation='relu'))
        nn.add(Dense(6, init='uniform', activation='relu'))
        nn.add(Dense(1, init='uniform', activation='sigmoid'))
        # Compile model
        nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return nn

    return model
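
With this variant, the outer call fixes input_dim and the resulting parameterless closure is what gets handed to the wrapper; a sketch reusing NOF_COL, X, y and kfold from the question's main():

# create_model(input_dim=NOF_COL) returns the inner model() function,
# which KerasClassifier later calls with no arguments to build the network
model = KerasClassifier(build_fn=create_model(input_dim=NOF_COL),
                        nb_epoch=150, batch_size=10, verbose=0)
results = cross_val_score(model, X, y, cv=kfold)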

Or, even better, according to the documentation:

sk_params takes both model parameters and fitting parameters. Legal model parameters are the arguments of build_fn. Note that like all other estimators in scikit-learn, build_fn should provide default values for its arguments, so that you could create the estimator without passing any values to sk_params

So you can define your function like this:

def create_model(number_of_features=10): # 10 is the *default value*
    # create model
    nn = Sequential()
    nn.add(Dense(12, input_dim=number_of_features, init='uniform', activation='relu'))
    nn.add(Dense(6, init='uniform', activation='relu'))
    nn.add(Dense(1, init='uniform', activation='sigmoid'))
    # Compile model
    nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return nn

And create a wrapper:

KerasClassifier(build_fn=create_model, number_of_features=20, epochs=25, batch_size=1000, ...)
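
Because number_of_features now has a default and is a legal model parameter, the wrapper can be built and evaluated exactly like the question's original code; a sketch on the same iris data (number_of_features must match the number of columns in X):

from sklearn import datasets
from sklearn.model_selection import StratifiedKFold, cross_val_score

iris = datasets.load_iris()
X, y = iris.data, iris.target

# number_of_features is forwarded to create_model; epochs and batch_size stay with the wrapper
model = KerasClassifier(build_fn=create_model, number_of_features=X.shape[1],
                        epochs=25, batch_size=10, verbose=0)
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
results = cross_val_score(model, X, y, cv=kfold)
print(results.mean())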

Solution 3

To pass a parameter to the build_fn model, you can pass arguments to __init__(), and they will in turn be passed to build_fn directly. For example, calling KerasClassifier(myparam=10) will result in build_fn(myparam=10).

Here's an example:

class MyMultiOutputKerasRegressor(KerasRegressor):
    
    # initializing
    def __init__(self, **kwargs):
        KerasRegressor.__init__(self, **kwargs)
        
    # fit that replicates y for the three model outputs
    def fit(self, X, y, **kwargs):
        return KerasRegressor.fit(self, X, [y]*3, **kwargs)

(...)

def get_quantile_reg_rpf_nn(layers_shape=[50,100,200,100,50], inDim=4, outDim=1, act='relu'):
    # do model stuff...

(...) and initialize the Keras regressor:

base_model = MyMultiOutputKerasRegressor(build_fn=get_quantile_reg_rpf_nn,
                                         layers_shape=[50,100,200,100,50], inDim=4,
                                         outDim=1, act='relu', epochs=numEpochs,
                                         batch_size=batch_size, verbose=0)
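
As a usage sketch (X_train, y_train and X_test are placeholders for your own data), the custom regressor then behaves like any other scikit-learn estimator, with the triplication of y handled inside its fit:

# hypothetical data names; the architecture arguments were fixed at construction time,
# so only the data is passed here
base_model.fit(X_train, y_train)
predictions = base_model.predict(X_test)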

Author: Danf

Updated on September 27, 2022

Comments

  • Danf over 1 year

    I have the following code, using the Keras Scikit-Learn wrapper, which works fine:

    from keras.models import Sequential
    from keras.layers import Dense
    from sklearn import datasets
    from keras.wrappers.scikit_learn import KerasClassifier
    from sklearn.model_selection import StratifiedKFold
    from sklearn.model_selection import cross_val_score
    import numpy as np
    
    
    def create_model():
        # create model
        model = Sequential()
        model.add(Dense(12, input_dim=4, init='uniform', activation='relu'))
        model.add(Dense(6, init='uniform', activation='relu'))
        model.add(Dense(1, init='uniform', activation='sigmoid'))
        # Compile model
        model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return model
    
    
    def main():
        """
        Description of main
        """
    
    
        iris = datasets.load_iris()
        X, y = iris.data, iris.target
    
        NOF_ROW, NOF_COL =  X.shape
    
        # evaluate using 10-fold cross validation
        seed = 7
        np.random.seed(seed)
        model = KerasClassifier(build_fn=create_model, nb_epoch=150, batch_size=10, verbose=0)
        kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
        results = cross_val_score(model, X, y, cv=kfold)
    
        print(results.mean())
        # 0.666666666667
    
    
    if __name__ == '__main__':
        main()
    

    The pima-indians-diabetes.data can be downloaded here.

    Now what I want to do is pass the value NOF_COL as a parameter to the create_model() function, in the following way:

    model = KerasClassifier(build_fn=create_model(input_dim=NOF_COL), nb_epoch=150, batch_size=10, verbose=0)
    

    With a create_model() function that looks like this:

    def create_model(input_dim=None):
        # create model
        model = Sequential()
        model.add(Dense(12, input_dim=input_dim, init='uniform', activation='relu'))
        model.add(Dense(6, init='uniform', activation='relu'))
        model.add(Dense(1, init='uniform', activation='sigmoid'))
        # Compile model
        model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return model
    

    But it fails, giving this error:

    TypeError: __call__() takes at least 2 arguments (1 given)
    

    What's the right way to do it?