This question has been asked a couple of times, and I tried this, this, and this, but I couldn't solve my problem. I have just 4 images of approximately 1 MB each which I'm trying to fit into a deep learning model, yet I keep getting a resource exhausted error. The error occurs at the dense layer, but I need all of the layers for my analysis. Can someone please help with a workaround?
I'm working on a machine with 16.0 GB of RAM (15.6 GB usable).
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, Flatten, Dense, LeakyReLU
from tensorflow.keras.models import Model

def discriminator(shape):
    # Input layer and the first convolutional block
    # (note: the hard-coded shape below currently ignores the `shape` argument)
    ip = Input(shape=(165, 209, 3))
    model = Conv2D(64, kernel_size=3, padding='same', strides=1)(ip)
    model = BatchNormalization(momentum=0.5)(model)
    # Stack of conv blocks built by the layer() helper (see below)
    model = layer(model, 64, 3, 1)
    model = layer(model, 128, 3, 1)
    model = layer(model, 128, 3, 2)
    model = layer(model, 256, 3, 2)
    model = layer(model, 256, 3, 2)
    model = layer(model, 512, 3, 2)
    model = layer(model, 512, 3, 2)
    model = Flatten()(model)
    model = Dense(1024)(model)
    model = LeakyReLU(alpha=0.2)(model)
    model = Dense(1, activation='sigmoid')(model)
    return Model(inputs=ip, outputs=model)
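For reference, layer() is a helper defined in an earlier cell; it stacks a convolution, batch normalization, and LeakyReLU, roughly along these lines (a sketch, the exact definition may differ slightly):

def layer(x, filters, kernel_size, strides):
    # Conv -> BatchNorm -> LeakyReLU block (sketch of the helper from an
    # earlier cell; the real definition may differ slightly)
    x = Conv2D(filters, kernel_size=kernel_size, strides=strides, padding='same')(x)
    x = BatchNormalization(momentum=0.5)(x)
    x = LeakyReLU(alpha=0.2)(x)
    return x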
wv_ip = Input(shape=(115, 209, 3))
# note: this passes an Input tensor, although discriminator() expects a shape
mod = discriminator(wv_ip)
mod.summary()
mod.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy', 'mean_squared_error'])
Here is the traceback I get:

ResourceExhaustedError                    Traceback (most recent call last)
Cell In[33], line 2
      1 wv_ip = Input(shape = (115, 209, 3))
----> 2 mod = discriminator(wv_ip)
      3 mod.summary()
      5 mod.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy', 'mean_squared_error'])

Cell In[32], line 61, in discriminator(shape)
---> 61 model = Dense(1024)(model)

ResourceExhaustedError: OOM when allocating tensor with shape[17656320,1024] and type float on /job
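For what it's worth, the shape in the error lines up exactly with the Flatten output if no downsampling happens before it. This back-of-the-envelope check is my own arithmetic, not something from the error message:

# Why the allocation is so large (rough arithmetic, assuming the full
# 165 x 209 grid survives to the Flatten layer at 512 channels):
flat_features = 165 * 209 * 512                # 17,656,320 -> matches shape[17656320, 1024]
dense_units = 1024
bytes_per_float32 = 4
kernel_gib = flat_features * dense_units * bytes_per_float32 / 1024**3
print(kernel_gib)                              # ~67.4 GiB for the Dense kernel alone

So the Dense(1024) weight matrix alone would need roughly 67 GiB, far beyond my 16 GB of RAM, and the fact that the 165 x 209 grid is still intact at the Flatten layer suggests the stride-2 blocks never actually reduced the spatial size.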