Error: ValueError: The last dimension of the inputs to `Dense` should be defined. Found `None`
Solution 1
You have None as the sequence length in the second model:
i2 = Input(shape=(None, 104))
You can't flatten a variable-length dimension into a known size, and Dense needs a known input size.
Either use a fixed length instead of None, or use GlobalMaxPooling1D or GlobalAveragePooling1D instead of Flatten.
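A minimal sketch of the pooling fix, assuming TF 2.x / tf.keras; the layer sizes mirror the question and are otherwise illustrative:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv1D, GlobalMaxPooling1D, Dense
from tensorflow.keras.models import Model

# Variable-length sequence input: the time dimension is None.
i2 = Input(shape=(None, 104))
c1 = Conv1D(64, 2, padding='same', activation='relu')(i2)
c2 = Conv1D(32, kernel_size=3, activation='relu')(c1)

# GlobalMaxPooling1D collapses the unknown time axis, leaving a fixed
# feature dimension (32), so Dense sees a defined last dimension.
pooled = GlobalMaxPooling1D()(c2)
out = Dense(32, activation='relu')(pooled)

model = Model(i2, out)
print(model.output_shape)  # (None, 32)
```

With Flatten in place of the pooling layer, building this model raises the same ValueError as in the question, because None * 32 is not a known size.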
Solution 2
For me, the problem was that I did not reshape the tensor in the input function before using it:
image = tf.reshape(image, [400,400,3])
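A hedged sketch of how that reshape might sit in an input function; the 400x400x3 shape follows the answer, and the function name parse_image is made up for illustration:

```python
import tensorflow as tf

def parse_image(image):
    # After decoding, the static shape is often partially unknown
    # (e.g. (None, None, None)); tf.reshape pins it so downstream
    # Conv/Dense layers see defined dimensions.
    image = tf.reshape(image, [400, 400, 3])
    return image

img = parse_image(tf.zeros([400 * 400 * 3]))
print(img.shape)  # (400, 400, 3)
```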
Author: user_6396
Updated on June 10, 2020

Comments
- user_6396 (almost 4 years): I'm trying to build an LSTM model for text classification, and I'm receiving an error. This is the entire code I've tried. Please let me know the reason behind the error and how to fix it.
input1.shape  # text data, integer coded (37788, 130)
input2.shape  # multiple category columns (one-hot encoded) concatenated together (37788, 104)
train_data = [input1, input2]  # this is the train data
i1 = Input(shape=(130,), name='input')
embeddings = Embedding(input_dim=20000, output_dim=100, input_length=130)(i1)
lstm = LSTM(100)(embeddings)
flatten = Flatten()(lstm)
i2 = Input(shape=(None, 104))
c1 = Conv1D(64, 2, padding='same', activation='relu', kernel_initializer='he_uniform')(i2)
c2 = Conv1D(32, kernel_size=3, activation='relu', kernel_initializer='he_uniform')(c1)
flatten1 = Flatten()(c2)
concat = concatenate([flatten, flatten1])
dense1 = Dense(32, 'relu', kernel_initializer='he_uniform')(concat)
I tried printing the shapes of the Conv1D layers, and the Flatten layer's shape contains None. I think that might be the reason for the error.
Tensor("conv1d_81/Identity:0", shape=(None, None, 64), dtype=float32)
Tensor("conv1d_82/Identity:0", shape=(None, None, 32), dtype=float32)
Tensor("flatten_106/Identity:0", shape=(None, None), dtype=float32)
This is the error I'm getting. How do I fix it?
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-531-31a53fbf3d37> in <module>
     14 concat = concatenate([flatten, flatten1])
---> 15 dense1 = Dense(32, 'relu', kernel_initializer='he_uniform')(concat)
     16 drop = Dropout(0.5)(dense1)

~\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py in __call__(self, inputs, *args, **kwargs)
    614     # Build layer if applicable (if the `build` method has been
    615     # overridden).
--> 616     self._maybe_build(inputs)
    617
    618     # Wrapping `call` function in autograph to allow for dynamic control

~\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py in _maybe_build(self, inputs)
   1964       # operations.
   1965       with tf_utils.maybe_init_scope(self):
-> 1966         self.build(input_shapes)
   1967       # We must set self.built since user defined build functions are not
   1968       # constrained to set self.built.

~\Anaconda3\lib\site-packages\tensorflow\python\keras\layers\core.py in build(self, input_shape)
   1003     input_shape = tensor_shape.TensorShape(input_shape)
   1004     if tensor_shape.dimension_value(input_shape[-1]) is None:
-> 1005       raise ValueError('The last dimension of the inputs to `Dense` '
   1006                        'should be defined. Found `None`.')
   1007     last_dim = tensor_shape.dimension_value(input_shape[-1])

ValueError: The last dimension of the inputs to `Dense` should be defined. Found `None`.
- user_6396 (almost 5 years): What should I fill in place of None here? Is it my input2 shape?
- Daniel Möller (almost 5 years): Yes, it's input2's shape.
- user_6396 (almost 5 years): Now I get this error during model.fit:
ValueError: Error when checking input: expected input_1 to have 3 dimensions, but got array with shape (37788, 104)
Why does it expect my input to have 3 dimensions?
- user_6396 (almost 5 years): I combined both input1 and input2 as train_data = [input1, input2] and passed it as x to model.fit.
- Daniel Möller (almost 5 years): Because you said your input would have 3 dimensions in i2, with shape (batch, None, 104).
- user_6396 (almost 5 years): Can you tell me how to fix it? It's not 3D (input2 is a concatenation of multiple one-hot-encoded categorical features). I have given i2 = Input(shape=(37788, 104)); I haven't specified None.
- Daniel Möller (almost 5 years): What is 37788? Length or samples?
- user_6396 (almost 5 years): It's the number of data points (37788 samples), and 104 are features.
- Daniel Möller (almost 5 years): Then shape=(104,).
- user_6396 (almost 5 years): That is giving this error:
ValueError: Input 0 of layer conv1d_13 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 104]
- Daniel Möller (almost 5 years): That's because Conv1D needs 3D arrays (samples, length, features), not 2D. I don't know what you want to achieve, but you must understand your data and what you want to extract from it.
- Daniel Möller (almost 5 years): If you want to treat your 104 features as a sequence, you could try reshaping your array to (37788, 104, 1) and using shape=(104, 1). But this would only make sense if the features have a similar nature and something could be extracted from treating them like a sequence.
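A minimal sketch of that reshape suggestion, assuming tf.keras; the array sizes follow the thread, and the layer stack is illustrative:

```python
import numpy as np
from tensorflow.keras.layers import Input, Conv1D, Flatten, Dense
from tensorflow.keras.models import Model

# Stand-in for the asker's one-hot feature matrix of shape (37788, 104).
input2 = np.random.randint(0, 2, size=(37788, 104)).astype('float32')

# Treat the 104 features as a length-104 "sequence" with 1 channel.
input2 = input2.reshape(-1, 104, 1)
print(input2.shape)  # (37788, 104, 1)

i2 = Input(shape=(104, 1))            # length 104, 1 feature per step
c1 = Conv1D(64, 2, padding='same', activation='relu')(i2)
c2 = Conv1D(32, kernel_size=3, activation='relu')(c1)
flat = Flatten()(c2)                  # length is now fixed, so Flatten works
out = Dense(32, activation='relu')(flat)
model = Model(i2, out)
print(model.output_shape)  # (None, 32)
```

Because the sequence length is fixed at 104, Flatten produces a defined size and Dense no longer complains about a None last dimension.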
- user_6396 (almost 5 years): "But this would only make sense if the features have similar nature": can you explain what you meant by that? I had 5 categorical columns; I used a one-hot encoder on each column and then combined all of them using np.c_[cat1, cat2, cat3, cat4, cat5], where cat1, cat2, etc. are one-hot encoded. This returned the shape (37788, 104).
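For illustration, a tiny example of how np.c_ concatenates one-hot blocks column-wise; the column counts here are invented, not the asker's real 104:

```python
import numpy as np

# Two one-hot-encoded "columns" for the same 3 samples.
cat1 = np.eye(3)[[0, 2, 1]]   # 3 samples, 3 categories
cat2 = np.eye(4)[[1, 1, 0]]   # 3 samples, 4 categories

# np.c_ stacks them along the feature (column) axis.
combined = np.c_[cat1, cat2]
print(combined.shape)  # (3, 7)
```

With five such blocks whose category counts sum to 104, this is how a (37788, 104) matrix arises.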