TensorFlow Deep Learning Model With IRIS Dataset

Import Libraries

In [1]:
import tensorflow as tf
import pandas as pd
import numpy as np
from tensorflow.keras import datasets, layers, models
from tensorflow.keras.utils import to_categorical

Load the iris dataset

In [2]:
dataFolder = 'input/'
dataFile = dataFolder + "iris.csv"
dataFile
Out[2]:
'input/iris.csv'
In [7]:
df = pd.read_csv(dataFile)
df.head()
Out[7]:
   sepal length (cm)  sepal width (cm)  petal length (cm)  petal width (cm)      species
0                5.1               3.5                1.4               0.2  Iris-setosa
1                4.9               3.0                1.4               0.2  Iris-setosa
2                4.7               3.2                1.3               0.2  Iris-setosa
3                4.6               3.1                1.5               0.2  Iris-setosa
4                5.0               3.6                1.4               0.2  Iris-setosa

Splitting the data into X and y

In [4]:
X = df.iloc[:,0:4].values
y = df.iloc[:,4].values
In [5]:
print(X[0:5])
print(y[0:5])
[[5.1 3.5 1.4 0.2]
 [4.9 3.  1.4 0.2]
 [4.7 3.2 1.3 0.2]
 [4.6 3.1 1.5 0.2]
 [5.  3.6 1.4 0.2]]
['Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa']
In [34]:
print(X.shape)
print(y.shape)
(150, 4)
(150,)

Encode the target with LabelEncoder

sklearn.preprocessing.LabelEncoder
Encode target labels with value between 0 and n_classes-1.

This transformer should be used to encode target values, i.e. y, and not the input X.

Methods

fit(y): Fit label encoder.

fit_transform(y): Fit label encoder and return encoded labels.

get_params([deep]): Get parameters for this estimator.

inverse_transform(y): Transform labels back to original encoding.

set_params(**params): Set the parameters of this estimator.

transform(y): Transform labels to normalized encoding.
In [35]:
from sklearn.preprocessing import LabelEncoder
In [36]:
encoder =  LabelEncoder()
y1 = encoder.fit_transform(y)
In [37]:
print(y1)
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
 2 2]
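
LabelEncoder stores the learned class names in its classes_ attribute, and inverse_transform maps the integer codes back to the original species strings. A minimal sketch (not part of the original run), reusing the encoder and y1 from above:

# Sketch: inspect the learned classes and reverse the encoding
print(encoder.classes_)
# ['Iris-setosa' 'Iris-versicolor' 'Iris-virginica']
print(encoder.inverse_transform(y1[0:5]))
# ['Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa']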

Convert the target into one-hot encoding

pandas.get_dummies(data, prefix=None, prefix_sep='_', dummy_na=False, columns=None, sparse=False, drop_first=False, dtype=None)
Convert categorical variable into dummy/indicator variables.
In [38]:
Y = pd.get_dummies(y1).values
In [39]:
print(Y[0:5])
[[1 0 0]
 [1 0 0]
 [1 0 0]
 [1 0 0]
 [1 0 0]]
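
Note that to_categorical was imported at the top but never used; as a sketch, it produces the same one-hot matrix directly from the integer labels, avoiding the pandas round trip:

# Sketch: equivalent one-hot encoding with the Keras utility imported earlier
Y_alt = to_categorical(y1, num_classes=3)
print(Y_alt[0:5])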

Split X and Y into train and test data

In [40]:
from sklearn.model_selection import train_test_split
sklearn.model_selection.train_test_split

sklearn.model_selection.train_test_split(*arrays, test_size=None, train_size=None, random_state=None, shuffle=True, stratify=None): Split arrays or matrices into random train and test subsets

In [41]:
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.2, random_state=0) 
In [42]:
print(X_train[0:5])
[[6.4 3.1 5.5 1.8]
 [5.4 3.  4.5 1.5]
 [5.2 3.5 1.5 0.2]
 [6.1 3.  4.9 1.8]
 [6.4 2.8 5.6 2.2]]
In [43]:
print(y_train[0:5])
[[0 0 1]
 [0 1 0]
 [1 0 0]
 [0 0 1]
 [0 0 1]]
In [44]:
print(X_test[0:5])
[[5.8 2.8 5.1 2.4]
 [6.  2.2 4.  1. ]
 [5.5 4.2 1.4 0.2]
 [7.3 2.9 6.3 1.8]
 [5.  3.4 1.5 0.2]]
In [45]:
print(y_test[0:5])
[[0 0 1]
 [0 1 0]
 [1 0 0]
 [0 0 1]
 [1 0 0]]
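
With only 150 rows, a purely random 80/20 split can leave the three species unevenly represented in the test set. As an optional variation (not the split used in the run above), passing stratify keeps the class proportions equal in both subsets:

# Sketch: stratify on the integer labels so each subset keeps ~1/3 of each species
X_train, X_test, y_train, y_test = train_test_split(
    X, Y, test_size=0.2, random_state=0, stratify=y1)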

Define a model

In [46]:
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax')
  ])
model
Out[46]:
<keras.engine.sequential.Sequential at 0x7fa67cf77b90>
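
Because no input shape is declared, the layer weights are only created when fit first sees the data, so model.summary() would fail at this point. A sketch of the same architecture with the four input features declared explicitly, which builds the model immediately:

# Sketch: same architecture with an explicit input shape so summary() works before training
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.summary()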

Compile the model

In [47]:
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
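
categorical_crossentropy matches the one-hot targets produced by get_dummies. As a sketch, training directly on the integer labels y1 (i.e. splitting y1 instead of the one-hot Y) would use the sparse variant instead, with everything else unchanged:

# Sketch: with integer class labels (0, 1, 2) instead of one-hot rows, use the sparse loss
model.compile(optimizer='rmsprop',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])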

Train the model

In [48]:
model.fit(X_train, y_train, batch_size=50, epochs=100)
Epoch 1/100
3/3 [==============================] - 0s 1ms/step - loss: 1.5822 - accuracy: 0.3083
Epoch 2/100
3/3 [==============================] - 0s 1ms/step - loss: 1.3802 - accuracy: 0.3083
Epoch 3/100
3/3 [==============================] - 0s 1ms/step - loss: 1.2646 - accuracy: 0.3083
Epoch 4/100
3/3 [==============================] - 0s 1ms/step - loss: 1.1868 - accuracy: 0.4583
Epoch 5/100
3/3 [==============================] - 0s 1ms/step - loss: 1.1348 - accuracy: 0.6250
Epoch 6/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0955 - accuracy: 0.6333
Epoch 7/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0703 - accuracy: 0.6333
Epoch 8/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0536 - accuracy: 0.6333
Epoch 9/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0343 - accuracy: 0.6333
Epoch 10/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0191 - accuracy: 0.6250
Epoch 11/100
3/3 [==============================] - 0s 1ms/step - loss: 1.0051 - accuracy: 0.6083
Epoch 12/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9913 - accuracy: 0.6167
Epoch 13/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9797 - accuracy: 0.5917
Epoch 14/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9699 - accuracy: 0.5417
Epoch 15/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9593 - accuracy: 0.5417
Epoch 16/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9488 - accuracy: 0.5500
Epoch 17/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9411 - accuracy: 0.5583
Epoch 18/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9332 - accuracy: 0.6083
Epoch 19/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9254 - accuracy: 0.5917
Epoch 20/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9167 - accuracy: 0.5667
Epoch 21/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9094 - accuracy: 0.5833
Epoch 22/100
3/3 [==============================] - 0s 1ms/step - loss: 0.9022 - accuracy: 0.5083
Epoch 23/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8945 - accuracy: 0.5750
Epoch 24/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8871 - accuracy: 0.6083
Epoch 25/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8785 - accuracy: 0.5083
Epoch 26/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8734 - accuracy: 0.5083
Epoch 27/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8649 - accuracy: 0.4000
Epoch 28/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8590 - accuracy: 0.4417
Epoch 29/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8504 - accuracy: 0.5250
Epoch 30/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8429 - accuracy: 0.4500
Epoch 31/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8375 - accuracy: 0.5667
Epoch 32/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8280 - accuracy: 0.4917
Epoch 33/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8214 - accuracy: 0.5417
Epoch 34/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8139 - accuracy: 0.4833
Epoch 35/100
3/3 [==============================] - 0s 1ms/step - loss: 0.8087 - accuracy: 0.5000
Epoch 36/100
3/3 [==============================] - 0s 2ms/step - loss: 0.8006 - accuracy: 0.5083
Epoch 37/100
3/3 [==============================] - 0s 2ms/step - loss: 0.7963 - accuracy: 0.5250
Epoch 38/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7867 - accuracy: 0.5833
Epoch 39/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7802 - accuracy: 0.5417
Epoch 40/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7744 - accuracy: 0.5500
Epoch 41/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7668 - accuracy: 0.5750
Epoch 42/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7643 - accuracy: 0.5583
Epoch 43/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7540 - accuracy: 0.6167
Epoch 44/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7502 - accuracy: 0.6083
Epoch 45/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7443 - accuracy: 0.5833
Epoch 46/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7376 - accuracy: 0.6333
Epoch 47/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7322 - accuracy: 0.6250
Epoch 48/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7240 - accuracy: 0.6667
Epoch 49/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7174 - accuracy: 0.7417
Epoch 50/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7125 - accuracy: 0.7167
Epoch 51/100
3/3 [==============================] - 0s 1ms/step - loss: 0.7057 - accuracy: 0.7750
Epoch 52/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6986 - accuracy: 0.7250
Epoch 53/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6937 - accuracy: 0.7917
Epoch 54/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6854 - accuracy: 0.7833
Epoch 55/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6802 - accuracy: 0.7750
Epoch 56/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6730 - accuracy: 0.8000
Epoch 57/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6662 - accuracy: 0.8167
Epoch 58/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6629 - accuracy: 0.7583
Epoch 59/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6527 - accuracy: 0.8417
Epoch 60/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6474 - accuracy: 0.8167
Epoch 61/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6423 - accuracy: 0.7917
Epoch 62/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6336 - accuracy: 0.7833
Epoch 63/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6269 - accuracy: 0.8083
Epoch 64/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6213 - accuracy: 0.7667
Epoch 65/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6149 - accuracy: 0.8500
Epoch 66/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6112 - accuracy: 0.8417
Epoch 67/100
3/3 [==============================] - 0s 1ms/step - loss: 0.6019 - accuracy: 0.8917
Epoch 68/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5956 - accuracy: 0.8083
Epoch 69/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5906 - accuracy: 0.8667
Epoch 70/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5897 - accuracy: 0.8167
Epoch 71/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5839 - accuracy: 0.8417
Epoch 72/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5743 - accuracy: 0.8500
Epoch 73/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5687 - accuracy: 0.8917
Epoch 74/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5654 - accuracy: 0.8917
Epoch 75/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5589 - accuracy: 0.8333
Epoch 76/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5527 - accuracy: 0.8750
Epoch 77/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5493 - accuracy: 0.8250
Epoch 78/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5442 - accuracy: 0.7750
Epoch 79/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5376 - accuracy: 0.8917
Epoch 80/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5319 - accuracy: 0.8750
Epoch 81/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5271 - accuracy: 0.8917
Epoch 82/100
3/3 [==============================] - 0s 2ms/step - loss: 0.5241 - accuracy: 0.8500
Epoch 83/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5173 - accuracy: 0.9250
Epoch 84/100
3/3 [==============================] - 0s 2ms/step - loss: 0.5114 - accuracy: 0.8750
Epoch 85/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5064 - accuracy: 0.9083
Epoch 86/100
3/3 [==============================] - 0s 1ms/step - loss: 0.5028 - accuracy: 0.8417
Epoch 87/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4970 - accuracy: 0.8583
Epoch 88/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4917 - accuracy: 0.8417
Epoch 89/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4882 - accuracy: 0.8750
Epoch 90/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4811 - accuracy: 0.8750
Epoch 91/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4776 - accuracy: 0.9417
Epoch 92/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4730 - accuracy: 0.8833
Epoch 93/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4675 - accuracy: 0.9167
Epoch 94/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4652 - accuracy: 0.9167
Epoch 95/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4570 - accuracy: 0.9167
Epoch 96/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4519 - accuracy: 0.9250
Epoch 97/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4478 - accuracy: 0.9333
Epoch 98/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4440 - accuracy: 0.9167
Epoch 99/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4407 - accuracy: 0.9333
Epoch 100/100
3/3 [==============================] - 0s 1ms/step - loss: 0.4334 - accuracy: 0.9333
Out[48]:
<keras.callbacks.History at 0x7fa67dfc75d0>
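
model.fit returns a History object whose history dict holds the per-epoch loss and accuracy shown above. A minimal sketch (assuming matplotlib is available) for capturing the return value of fit and plotting the training curves:

import matplotlib.pyplot as plt

# Sketch: capture the History object returned by fit and plot the training curves
history = model.fit(X_train, y_train, batch_size=50, epochs=100, verbose=0)
plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['accuracy'], label='accuracy')
plt.xlabel('epoch')
plt.legend()
plt.show()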

Evaluate the model with test data

In [49]:
loss, accuracy = model.evaluate(X_test, y_test, verbose=0)
print('Test loss:', loss)
print('Test accuracy:', accuracy)
Test loss: 0.45918846130371094
Test accuracy: 0.8999999761581421

Predict on the test data

In [50]:
y_pred = model.predict(X_test)

y_pred
Out[50]:
array([[0.01254488, 0.31731865, 0.6701365 ],
       [0.1829188 , 0.55744034, 0.25964087],
       [0.93306065, 0.05934964, 0.0075897 ],
       [0.01550111, 0.37400782, 0.61049104],
       [0.8787039 , 0.10484818, 0.01644805],
       [0.00807188, 0.30025113, 0.69167703],
       [0.887851  , 0.09699255, 0.01515646],
       [0.09599724, 0.4902503 , 0.4137524 ],
       [0.09601327, 0.52681005, 0.37717664],
       [0.1576838 , 0.5039611 , 0.33835503],
       [0.02843008, 0.4013509 , 0.5702191 ],
       [0.1024068 , 0.46613193, 0.43146127],
       [0.10233405, 0.50036657, 0.39729938],
       [0.0838444 , 0.4888636 , 0.42729202],
       [0.07652688, 0.4579257 , 0.46554732],
       [0.8725643 , 0.11076253, 0.01667321],
       [0.07333615, 0.43849856, 0.48816523],
       [0.09076718, 0.46572995, 0.4435028 ],
       [0.84125215, 0.13559803, 0.02314978],
       [0.9074204 , 0.08103121, 0.01154847],
       [0.01747888, 0.2941566 , 0.6883645 ],
       [0.06085569, 0.3914893 , 0.54765505],
       [0.8278458 , 0.14341493, 0.02873918],
       [0.83179784, 0.14108923, 0.02711299],
       [0.03413201, 0.37275872, 0.59310925],
       [0.9129406 , 0.07505485, 0.01200446],
       [0.834888  , 0.13713992, 0.02797206],
       [0.13428193, 0.50138426, 0.3643338 ],
       [0.21471201, 0.4879535 , 0.29733452],
       [0.837698  , 0.1370193 , 0.02528269]], dtype=float32)
In [51]:
actual = np.argmax(y_test,axis=1)
predicted = np.argmax(y_pred,axis=1)
In [52]:
print(f"Actual: {actual}")
Actual: [2 1 0 2 0 2 0 1 1 1 2 1 1 1 1 0 1 1 0 0 2 1 0 0 2 0 0 1 1 0]
In [53]:
print(f"Predicted: {predicted}")
Predicted: [2 1 0 2 0 2 0 1 1 1 2 1 1 1 2 0 2 1 0 0 2 2 0 0 2 0 0 1 1 0]
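
The predicted labels differ from the actual ones in three of the 30 test samples, consistent with the 0.90 test accuracy. A short sketch using scikit-learn to summarize the comparison as a confusion matrix and per-class report:

from sklearn.metrics import confusion_matrix, classification_report

# Sketch: rows are actual classes, columns are predicted classes
print(confusion_matrix(actual, predicted))
print(classification_report(actual, predicted, target_names=list(encoder.classes_)))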
