Flower Classification with TensorFlow (Beginner Friendly)

Welcome! In this guide, we'll build a Flower Image Classifier step by step, explained as simply as if you were talking yourself through it. You'll use VS Code and your own image file, and see how deep learning works in practice. Let's dive in!

Step 1: Importing Required Tools

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import pathlib

Why?

These are your basic tools:

  • TensorFlow: The engine doing all the deep learning.

  • NumPy: Helps manage numbers and arrays.

  • Matplotlib: For drawing graphs.

  • pathlib: Handles file paths in a clean, cross-platform way.

Step 2: Point to Your Dataset Folder

data_dir = pathlib.Path("flower_photos")

Why?

You are telling Python: "Hey, my flower images are inside this folder." Each subfolder is a class (e.g., roses, tulips).
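
Not sure your folders look right? This optional sketch (assuming a flower_photos folder that contains one subfolder per flower type, full of .jpg files) lists each class folder and how many images it holds:

# Quick sanity check: list each class folder and count its .jpg images.
for class_dir in sorted(data_dir.iterdir()):
    if class_dir.is_dir():
        n_images = len(list(class_dir.glob("*.jpg")))
        print(f"{class_dir.name}: {n_images} images")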

Step 3: Create Training and Validation Datasets

train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(180, 180),
    batch_size=32)

val_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(180, 180),
    batch_size=32)

Why?

This splits your flowers into two groups:

  • Training (80%): For learning.

  • Validation (20%): For checking how well it's learning (a quick check of the split is shown just below).
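
If you want to double-check the split, you can print the class names Keras found (they come from the subfolder names) and how many batches each set contains. Run this before Step 4, because the wrapped dataset there no longer carries class_names:

class_names = train_ds.class_names
print("Classes:", class_names)
print("Training batches:", tf.data.experimental.cardinality(train_ds).numpy())
print("Validation batches:", tf.data.experimental.cardinality(val_ds).numpy())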

Step 4: Make the Data Flow Smoother

AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)

Why?

Think of it like buffering a YouTube video: cache keeps images in memory after the first pass, shuffle mixes up their order, and prefetch loads the next batches ahead of time so training doesn't stall waiting for data.
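
Curious what is actually flowing through the pipeline? This optional check pulls a single batch and prints its shape:

# One batch is (typically) 32 images of 180x180 pixels with 3 color channels.
for images, labels in train_ds.take(1):
    print("Image batch shape:", images.shape)
    print("Label batch shape:", labels.shape)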

Step 5: Image Augmentation

from tensorflow.keras import layers
from tensorflow.keras.models import Sequential

data_augmentation = Sequential([
  layers.RandomFlip("horizontal", input_shape=(180, 180, 3)),
  layers.RandomRotation(0.1),
  layers.RandomZoom(0.1),
])

Why?

To teach the model: "Even if the flower is flipped or zoomed, it's still the same flower."
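
To see what the augmentation actually does, this optional sketch takes one training batch and plots nine randomly augmented versions of its first image. Passing training=True is needed because the random layers only fire during training by default:

plt.figure(figsize=(10, 10))
for images, _ in train_ds.take(1):
    for i in range(9):
        # training=True makes the random flip/rotation/zoom actually apply.
        augmented = data_augmentation(images, training=True)
        plt.subplot(3, 3, i + 1)
        plt.imshow(augmented[0].numpy().astype("uint8"))
        plt.axis("off")
plt.show()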

Step 6: Building the Brain (Model)

model = Sequential([
  data_augmentation,
  layers.Rescaling(1./255),
  layers.Conv2D(16, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Conv2D(32, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Conv2D(64, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Dropout(0.2),
  layers.Flatten(),
  layers.Dense(128, activation='relu'),
  layers.Dense(5)  # Number of flower types
])

Why?

  • After augmentation, it rescales pixel values from 0-255 down to 0-1 (Rescaling).

  • Then it extracts patterns using filters (Conv2D).

  • MaxPooling helps it focus on the most important parts.

  • Dropout randomly switches off some neurons during training so the model doesn't just memorize the training images.

  • At the end, it makes a decision (Dense), with one output score per flower type; a quick way to inspect all these layers is shown right after this list.
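
If you're curious how the image shrinks as it passes through these layers, Keras can print a table of every layer, its output shape, and its parameter count (the shapes are known because of input_shape=(180, 180, 3) in the augmentation block):

# Prints each layer, its output shape, and the number of trainable parameters.
model.summary()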

Step 7: Compile the Model

model.compile(
  optimizer='adam',
  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
  metrics=['accuracy'])

Why?

You’re telling the model: “This is how you'll learn and how we'll grade you.”

  • Adam: An optimizer that adjusts the learning rate automatically; a solid default choice.

  • Loss function: Measures how wrong the model is. from_logits=True tells Keras the model outputs raw scores (logits) rather than probabilities; that's why we apply softmax ourselves when predicting in Step 11. A tiny demo follows this list.

  • Accuracy: How often it's right.
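
Here is that tiny demo: the model's last Dense layer outputs five raw scores (logits), and tf.nn.softmax turns them into probabilities that sum to 1. The numbers below are made up just to show the idea:

# Made-up logits for the five flower classes.
logits = tf.constant([2.0, 0.5, -1.0, 0.1, 1.2])
print(tf.nn.softmax(logits).numpy())  # five probabilities that add up to 1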


Step 8: Train the Model

history = model.fit(
  train_ds,
  validation_data=val_ds,
  epochs=10)

Why?

This is the real learning phase: the model passes over the training images 10 times (epochs), adjusting its weights after every batch and checking itself against the validation set after each pass.
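
Ten epochs is just a starting guess. One common (optional) variation, sketched here, adds an EarlyStopping callback so training stops once the validation loss stops improving and keeps the best weights:

# Optional: stop early if validation loss hasn't improved for 3 epochs in a row.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss',
    patience=3,
    restore_best_weights=True)

history = model.fit(
    train_ds,
    validation_data=val_ds,
    epochs=10,
    callbacks=[early_stop])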

Step 9: Plot Results

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']

plt.figure(figsize=(8, 8))
plt.subplot(1, 2, 1)
plt.plot(acc, label='Training Accuracy')
plt.plot(val_acc, label='Validation Accuracy')
plt.legend()
plt.title('Accuracy')

plt.subplot(1, 2, 2)
plt.plot(loss, label='Training Loss')
plt.plot(val_loss, label='Validation Loss')
plt.legend()
plt.title('Loss')
plt.show()

Why?

Helps you visually check how good (or bad) the training was.

Step 10: Save Your Model

model.save('flower_model.keras')

Why?

To use it later without training again.


Step 11: Predict a New Image

model = tf.keras.models.load_model('flower_model.keras')
class_names = ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']  # must match the training folder names, sorted alphabetically

img_path = "Red_sunflower.jpg"
img = tf.keras.utils.load_img(img_path, target_size=(180, 180))
img_array = tf.keras.utils.img_to_array(img)
img_array = tf.expand_dims(img_array, 0)

predictions = model.predict(img_array)
score = tf.nn.softmax(predictions[0])

print(
  "This image most likely belongs to {} with a {:.2f}% confidence."
  .format(class_names[np.argmax(score)], 100 * np.max(score))
)

Why?

You're checking what the model thinks about a brand new flower image.
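
If you'd like more than the single top guess, score already holds the softmax probability for every class, so you can print them all, highest first:

# Show the predicted probability for every flower type, highest to lowest.
probs = score.numpy()
for idx in np.argsort(probs)[::-1]:
    print(f"{class_names[idx]}: {probs[idx]:.2%}")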


You Did It!

You created, trained, tested, saved, and reused a machine learning model, all by yourself in VS Code, with a real image and simple code.
