
Dimension of shape in conv1D


tl;dr: you need to reshape your data to have a spatial dimension for Conv1D to make sense:

X = np.expand_dims(X, axis=2)  # reshape (569, 30) to (569, 30, 1)
# now the model input can be set as
model.add(Conv1D(2, 2, activation='relu', input_shape=(30, 1)))

Essentially reshaping a dataset that looks like this:

features
.8, .1, .3
.2, .4, .6
.7, .2, .1

To:

[[[.8], [.1], [.3]],
 [[.2], [.4], [.6]],
 [[.7], [.2], [.1]]]
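As a quick sanity check, here is a minimal sketch of that reshape on a made-up 3-example, 3-feature array (not your actual data):

import numpy as np

# hypothetical toy dataset: 3 examples, 3 features each
X = np.array([[.8, .1, .3],
              [.2, .4, .6],
              [.7, .2, .1]])
print(X.shape)                 # (3, 3)

X = np.expand_dims(X, axis=2)  # wrap each feature value in its own length-1 vector
print(X.shape)                 # (3, 3, 1)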

Explanation and examples

Normally, convolution works over spatial dimensions. The kernel is "convolved" over a spatial dimension, producing a new tensor. In the case of Conv1D, the kernel is passed over the 'steps' dimension of every example.
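For instance, here is a minimal sketch (with made-up step, feature, and filter counts, unrelated to the question's data) showing the kernel sliding over the steps dimension:

from keras.models import Sequential
from keras.layers import Conv1D

model = Sequential()
model.add(Conv1D(filters=16, kernel_size=3, activation='relu', input_shape=(10, 8)))
# a size-3 kernel over 10 steps gives 10 - 3 + 1 = 8 positions, one output channel per filter
print(model.output_shape)  # (None, 8, 16)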

You will see Conv1D used in NLP, where steps is the number of words in a sentence (padded to some fixed maximum length). The words are encoded as vectors of a fixed length (length 3 in the example below).

Here is an example sentence:

jack   .1   .3   -.52   |
is     .05  .8   -.7    |  <--- kernel is `convolving` along this dimension
a      .5   .31  -.2    |
boy    .5   .8   -.4   \|/

And the way we would set the input to the conv in this case:

maxlen = 4
input_dim = 3
model.add(Conv1D(2, 2, activation='relu', input_shape=(maxlen, input_dim)))
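Putting that together in a small runnable sketch (the sentences here are random stand-ins for real word vectors):

import numpy as np
from keras.models import Sequential
from keras.layers import Conv1D

maxlen = 4     # words per (padded) sentence
input_dim = 3  # length of each word vector

model = Sequential()
model.add(Conv1D(2, 2, activation='relu', input_shape=(maxlen, input_dim)))
print(model.output_shape)  # (None, 3, 2)

sentences = np.random.randn(10, maxlen, input_dim)  # 10 fake "sentences"
print(model.predict(sentences).shape)               # (10, 3, 2)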

In your case, you will treat the features as the spatial dimension, with each feature having length 1 (see below).

Here is an example from your dataset:

att1   .04    |
att2   .05    |   <-- kernel convolving along this dimension
att3   .1     |       notice the features have length 1; each
att4   .5    \|/      example has these 4 features.

And we would set the Conv1D example as:

maxlen = num_features = 4  # this would be 30 in your case
input_dim = 1              # since this is the length of _each_ feature (as shown above)
model.add(Conv1D(2, 2, activation='relu', input_shape=(maxlen, input_dim)))
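A quick shape check on that 4-feature version (a minimal sketch; 4 becomes 30 for your data):

from keras.models import Sequential
from keras.layers import Conv1D

model = Sequential()
model.add(Conv1D(2, 2, activation='relu', input_shape=(4, 1)))
# the size-2 kernel slides over the 4 features: 4 - 2 + 1 = 3 positions, 2 filters
print(model.output_shape)  # (None, 3, 2)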

As you can see, your dataset has to be reshaped to (569, 30, 1). Use:

X = np.expand_dims(X, axis=2)  # reshape (569, 30) to (569, 30, 1)
# now the model input can be set as
model.add(Conv1D(2, 2, activation='relu', input_shape=(30, 1)))

Here is a full-fledged example that you can run (I'll use the Functional API):

from keras.models import Model
from keras.layers import Conv1D, Dense, MaxPool1D, Flatten, Input
import numpy as np

inp = Input(shape=(5, 1))
conv = Conv1D(filters=2, kernel_size=2)(inp)
pool = MaxPool1D(pool_size=2)(conv)
flat = Flatten()(pool)
dense = Dense(1)(flat)
model = Model(inp, dense)
model.compile(loss='mse', optimizer='adam')
print(model.summary())

# get some data
X = np.expand_dims(np.random.randn(10, 5), axis=2)
y = np.random.randn(10, 1)

# fit model
model.fit(X, y)


I have mentioned this in other posts also:

To feed usual tabular feature data of shape (nrows, ncols) into Keras' Conv1D, the following 2 steps are needed:

xtrain = xtrain.reshape(nrows, ncols, 1)
# for the Conv1D statement: input_shape = (ncols, 1)

For example, taking the first 4 features of the iris dataset:

To see the usual format and its shape:

iris_array = np.array(irisdf.iloc[:, :4].values)
print(iris_array[:5])
print(iris_array.shape)

The output shows the usual format and its shape:

[[5.1 3.5 1.4 0.2]
 [4.9 3.  1.4 0.2]
 [4.7 3.2 1.3 0.2]
 [4.6 3.1 1.5 0.2]
 [5.  3.6 1.4 0.2]]
(150, 4)

The following code alters the format:

nrows, ncols = iris_array.shape
iris_array = iris_array.reshape(nrows, ncols, 1)
print(iris_array[:5])
print(iris_array.shape)

Output of the above code, showing the new data format and its shape:

[[[5.1]
  [3.5]
  [1.4]
  [0.2]]

 [[4.9]
  [3. ]
  [1.4]
  [0.2]]

 [[4.7]
  [3.2]
  [1.3]
  [0.2]]

 [[4.6]
  [3.1]
  [1.5]
  [0.2]]

 [[5. ]
  [3.6]
  [1.4]
  [0.2]]]
(150, 4, 1)

This works well for Keras' Conv1D. For input_shape, (4, 1) is needed.
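As a minimal sketch on top of that reshaped array (the layer sizes are arbitrary and the 3-class softmax is just an illustration for iris), a model could then be defined like this:

from keras.models import Sequential
from keras.layers import Conv1D, Dense, Flatten

model = Sequential()
model.add(Conv1D(8, 2, activation='relu', input_shape=(4, 1)))  # input_shape = (ncols, 1)
model.add(Flatten())
model.add(Dense(3, activation='softmax'))  # e.g. the 3 iris classes
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.summary()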


I had a sparse matrix as input, so I couldn't reshape it without casting it to a regular (dense) array.

The solution was to use the keras Reshape layer:

from keras.layers.core import Reshape

...
model = Sequential()
model.add(Reshape((X.shape[1], 1), input_shape=(X.shape[1], )))
model.add(Conv1D(2, 2, activation='relu'))
...
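For completeness, here is a sketch of how the rest of such a model might look (the layer sizes and loss are placeholders, and X / y are assumed to be your sparse feature matrix and targets). The key point is that the Reshape layer adds the length-1 dimension inside the model, so X never has to be reshaped up front:

from keras.models import Sequential
from keras.layers import Conv1D, Dense, Flatten
from keras.layers.core import Reshape

model = Sequential()
model.add(Reshape((X.shape[1], 1), input_shape=(X.shape[1], )))
model.add(Conv1D(2, 2, activation='relu'))
model.add(Flatten())
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y)  # assumes your Keras version accepts the sparse X here, as described above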