
How to do Image Classification on custom Dataset using TensorFlow

Published Apr 04, 2020

Image classification means giving the system images, each of which belongs to one of a fixed set of classes, and expecting the system to assign each image to its correct class.
In my previous article, I showed you how to begin with image classification, so if you haven't read it yet you should check it out: basics of image classification.
In this article, I am going to show you how to do image classification using our own dataset. I will provide the complete code and the other required files used in this article so you can follow along hands-on.
GitHub link
I recommend reading this article while executing the code so you can understand every line.

Preparing Dataset

We begin by preparing the dataset. It is the first step in solving any machine learning problem, so it is worth doing correctly.
We will use the flow_from_directory method of the ImageDataGenerator class in Keras. To use it, we need to put our data in the predefined directory structure shown below.
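The layout looks like this (dataset, class_a, and class_b are placeholder names; use your own folder names):

dataset/
    class_a/
        image_1.jpg
        image_2.jpg
        ...
    class_b/
        image_1.jpg
        image_2.jpg
        ...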
We just need to place the images into their respective class folders and we are good to go.

Loading Dataset

Once the dataset is prepared, we need to load it. Since we are using Colaboratory, we need to bring the data into the Colab workspace: first upload the data folder to Google Drive, then mount the Drive in our workspace with the following code:

from google.colab import drive
drive.mount('/content/drive')

When we execute this code, a link is generated and a box appears asking for an authentication code. Now what?
Don't worry: click the link above the authentication box and a page appears asking you to log in to your Google account. Enter your credentials and log in, allow Google Drive File Stream to access your account, and an authentication code will be generated; copy that code and paste it into the box. Your Drive contents will now appear under /content/drive/My Drive.
Congratulations 🎉 you have done the hardest part; the rest is very simple.
Now go to the uploaded dataset folder in Colab's file browser, right-click on the folder, and click Copy path.
Then set data_root to the copied path:

data_root = "<Copied path>"

It will look something like this (with your own folder name in place of the placeholder):

data_root = "/content/drive/My Drive/<your dataset folder>"

Just execute this cell.

Creating Training and validation data

As I told you earlier, we will use ImageDataGenerator to load the data into the model. Let's see how to do that.
First, set the image shape:

IMAGE_SHAPE = (224, 224) # (height, width) in no. of pixels

Set the training data directory:

TRAINING_DATA_DIR = str(data_root)

Next, define keyword arguments that rescale pixel values from [0, 255] to [0, 1] and split the data into 80% training and 20% validation:

datagen_kwargs = dict(rescale=1./255, validation_split=.20)

Now create valid_generator and train_generator:

import tensorflow as tf

valid_datagen = tf.keras.preprocessing.image.ImageDataGenerator(**datagen_kwargs)
valid_generator = valid_datagen.flow_from_directory(
    TRAINING_DATA_DIR,
    subset="validation",
    shuffle=True,
    target_size=IMAGE_SHAPE
)

train_datagen = tf.keras.preprocessing.image.ImageDataGenerator(**datagen_kwargs)
train_generator = train_datagen.flow_from_directory(
    TRAINING_DATA_DIR,
    subset="training",
    shuffle=True,
    target_size=IMAGE_SHAPE
)

As you execute the cell, it will show output like this (the counts depend on your dataset):

Found <number of validation images> images belonging to <number of classes> classes.
Found <number of training images> images belonging to <number of classes> classes.

The first line is for the validation data and the second line is for the training data.

Visualizing the data

Let's go through the images and labels in train_generator.
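A minimal sketch of how to inspect one batch (the generator yields tuples of images and one-hot labels):

image_batch, label_batch = next(iter(train_generator))
print("Image batch shape:", image_batch.shape)
print("Label batch shape:", label_batch.shape)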
The default batch size is 32, as it is considered appropriate in most cases.
The image batch shape (32, 224, 224, 3) means one batch consists of 32 images, each 224 pixels high and 224 pixels wide, with 3 RGB colour channels.
The label_batch shape (32, 4) means there are 32 labels, each of length 4 because this dataset has 4 classes and the labels are one-hot encoded.
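For example, printing the first 5 elements of label_batch shows the one-hot vectors:

print(label_batch[:5])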
Let's see which index represents which label:

print(train_generator.class_indices)

The output is a dictionary mapping each class name (taken from the folder names) to its index.
Now we write all the labels into a text file:

labels = '\n'.join(sorted(train_generator.class_indices.keys()))
with open('labels.txt', 'w') as f:
    f.write(labels)

!cat labels.txt

The file contains one class name per line.

Create a classification model

Here I will show you a glimpse of transfer learning; don't worry, I will create a separate tutorial on transfer learning.
We will use TensorFlow Hub to load a pre-trained model. The MobileNetV2 module produces a 1280-dimensional feature vector for each image, on top of which we add a Dropout layer and a Dense softmax classification layer:

import tensorflow_hub as hub

model = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
                   output_shape=[1280],
                   trainable=False),  # freeze the pre-trained feature extractor
    tf.keras.layers.Dropout(0.4),
    tf.keras.layers.Dense(train_generator.num_classes, activation='softmax')
])
model.build([None, 224, 224, 3])
model.summary()
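
As a quick sanity check, you can pass a dummy batch through the model and confirm it outputs one probability per class (a minimal sketch; the exact shape depends on your number of classes):

import numpy as np

dummy = np.zeros((1, 224, 224, 3), dtype=np.float32)
print(model(dummy).shape)  # e.g. (1, 4) for a 4-class dataset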

This model has some nice properties:

  1. This code will work even if your dataset has a different number of classes.
  2. Because we are using transfer learning, it will produce good results even if you have a small dataset.
  3. This model is easy to deploy on mobile or Raspberry Pi-like devices (please don't worry, I will cover this separately in a different article, so stay tuned 😉).

The model.summary() call prints each layer together with its output shape and parameter count.

Train Model

Before training we need to compile the model, so let's compile. We use categorical cross-entropy as the loss because the labels are one-hot encoded:

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
model.compile(
    optimizer=optimizer,
    loss='categorical_crossentropy',
    metrics=['acc']
)

Now let's do the real training:

import numpy as np

steps_per_epoch = int(np.ceil(train_generator.samples / train_generator.batch_size))
val_steps_per_epoch = int(np.ceil(valid_generator.samples / valid_generator.batch_size))

hist = model.fit(
    train_generator,
    epochs=100,
    verbose=1,
    steps_per_epoch=steps_per_epoch,
    validation_data=valid_generator,
    validation_steps=val_steps_per_epoch
).history

This code will train the model for 100 epochs; steps_per_epoch is set so that every training image is seen once per epoch.
WARNING: training can take time, so have patience.

CONGRATULATIONS 🙌 YOU HAVE SUCCESSFULLY TRAINED YOUR MODEL
Now let's see how good our model is:

final_loss, final_accuracy = model.evaluate(valid_generator, steps=val_steps_per_epoch)
print("Final loss: {:.2f}".format(final_loss))
print("Final accuracy: {:.2f}%".format(final_accuracy * 100))

Looking good? If not, try training for a few more epochs, then see the magic 🧝‍♀️.

Plotting some graphs

These plots will help you see how well the training went. hist is the history dictionary saved above, with keys loss, val_loss, acc, and val_acc:

import matplotlib.pyplot as plt

plt.figure()
plt.ylabel("Loss (training and validation)")
plt.xlabel("Training Steps")
plt.ylim([0, 50])
plt.plot(hist["loss"])
plt.plot(hist["val_loss"])

plt.figure()
plt.ylabel("Accuracy (training and validation)")
plt.xlabel("Training Steps")
plt.ylim([0, 1])
plt.plot(hist["acc"])
plt.plot(hist["val_acc"])

The accuracy plot will look something like this: the orange line is validation accuracy and the blue line is training accuracy. If the training curve keeps rising while the validation curve flattens out, the model is starting to overfit.

Checking the performance of the model
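
The plotting snippet below needs a batch of validation images, their true label ids, and an index-to-name lookup, none of which we have defined yet. A minimal sketch that builds them from the generators created earlier:

import numpy as np

# One batch of validation images plus their one-hot labels
val_image_batch, val_label_batch = next(iter(valid_generator))
true_label_ids = np.argmax(val_label_batch, axis=-1)

# Class names ordered by index, so dataset_labels[id] gives the name
dataset_labels = sorted(train_generator.class_indices.items(), key=lambda pair: pair[1])
dataset_labels = np.array([key for key, value in dataset_labels])

Now run the model on this batch and plot its predictions: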

tf_model_predictions = model.predict(val_image_batch)
print("Prediction results shape:", tf_model_predictions.shape)

# Convert the predicted probabilities into label names
predicted_ids = np.argmax(tf_model_predictions, axis=-1)
predicted_labels = dataset_labels[predicted_ids]

plt.figure(figsize=(10, 9))
plt.subplots_adjust(hspace=0.5)
for n in range(len(predicted_labels) - 2):  # 30 plots fill the 6x5 grid
    plt.subplot(6, 5, n + 1)
    plt.imshow(val_image_batch[n])
    color = "green" if predicted_ids[n] == true_label_ids[n] else "red"
    plt.title(predicted_labels[n].title(), color=color)
    plt.axis('off')
_ = plt.suptitle("Model predictions (green: correct, red: incorrect)")

This shows how well the model is trained by visualizing its predictions: a grid of validation images whose titles are green when the prediction is correct and red when it is wrong.
Thank you for bearing with me and reading this long post.
Please share your feedback and suggestions.
I promise more articles will come very soon. I hope you enjoyed this one!
