Image Classification With ResNet50 Model

In this blog, we will classify images using the pre-trained ResNet50 model.

What is ResNet50?

Keras Applications are deep learning models that are made available alongside pre-trained weights. These models can be used for prediction, feature extraction, and fine-tuning.

Weights are downloaded automatically when instantiating a model. They are stored at ~/.keras/models/.

ResNet-50 is a convolutional neural network that is 50 layers deep (48 convolution layers along with 1 max-pool and 1 average-pool layer). A residual neural network (ResNet) is an artificial neural network (ANN) that stacks residual blocks on top of each other to form a network.
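To make the idea of a residual block concrete, here is a minimal pure-Python sketch (scalar features instead of real convolution layers, purely illustrative): the block computes some transformation F(x) and adds the input x back through a skip connection, so the layers only have to learn the residual rather than the full mapping.

```python
def relu(v):
    # Element-wise ReLU activation.
    return [max(0.0, x) for x in v]

def residual_block(x, f):
    """Apply a transformation f, add the skip connection, then activate."""
    fx = f(x)                                   # F(x): the learned residual
    summed = [a + b for a, b in zip(fx, x)]     # F(x) + x: skip connection
    return relu(summed)                         # output activation

# A toy stand-in for the conv/BN stack inside a real block.
double = lambda v: [2.0 * x for x in v]

print(residual_block([1.0, -2.0, 3.0], double))  # [3.0, 0.0, 9.0]
```

Note that when the residual F(x) is zero everywhere, the block reduces to an identity mapping, which is what makes very deep stacks of such blocks trainable.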

We can load a pretrained version of the network, trained on more than a million images from the ImageNet database. The pretrained network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals. The network expects an input image size of 224-by-224 pixels.

Import libraries

In [1]:
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.utils import plot_model
from tensorflow.keras.preprocessing import image

Create an object of ResNet50 model

tf.keras.applications.resnet50.ResNet50(
    include_top=True,
    weights='imagenet',
    input_tensor=None,
    input_shape=None,
    pooling=None,
    classes=1000,
    **kwargs
)

Instantiates the ResNet50 architecture.

Args:

include_top: whether to include the fully-connected layer at the top of the network.

weights: one of None (random initialization), 'imagenet' (pre-training on ImageNet), or the path to the weights file to be loaded.
input_tensor: optional Keras tensor (i.e. output of layers.Input()) to use as image input for the model.
input_shape: optional shape tuple, only to be specified if include_top is False (otherwise the input shape has to be (224, 224, 3) with 'channels_last' data format, or (3, 224, 224) with 'channels_first' data format). It should have exactly 3 input channels, and width and height should be no smaller than 32. E.g. (200, 200, 3) would be one valid value.
pooling: Optional pooling mode for feature extraction when include_top is False.

    1. None means that the output of the model will be the 4D tensor output of the last convolutional block.
    2. avg means that global average pooling will be applied to the output of the last convolutional block, and thus the output of the model will be a 2D tensor.
    3. max means that global max pooling will be applied. 
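The difference between the avg and max pooling modes can be sketched in pure Python (illustrative only, using nested lists in place of a real tensor): when include_top is False, a (H, W, C) feature map is collapsed to a length-C vector, one value per channel.

```python
def global_pool(feature_map, mode):
    """feature_map: nested lists of shape (H, W, C); returns a length-C list."""
    h = len(feature_map)
    w = len(feature_map[0])
    c = len(feature_map[0][0])
    pooled = []
    for ch in range(c):
        # Gather every spatial position for this channel.
        values = [feature_map[i][j][ch] for i in range(h) for j in range(w)]
        pooled.append(sum(values) / len(values) if mode == 'avg' else max(values))
    return pooled

# A tiny 2x2 feature map with 2 channels.
fmap = [[[1.0, 10.0], [2.0, 20.0]],
        [[3.0, 30.0], [4.0, 40.0]]]

print(global_pool(fmap, 'avg'))  # [2.5, 25.0]
print(global_pool(fmap, 'max'))  # [4.0, 40.0]
```

In Keras the same collapse is performed by GlobalAveragePooling2D or GlobalMaxPooling2D, which is why the model's output becomes a 2D (batch, channels) tensor in these modes.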

classes: optional number of classes to classify images into, only to be specified if include_top is True, and if no weights argument is specified.
classifier_activation: A str or callable. The activation function to use on the "top" layer. Ignored unless include_top=True. Set classifier_activation=None to return the logits of the "top" layer. When loading pretrained weights, classifier_activation can only be None or "softmax".

Returns: A Keras model instance.
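The "Param #" column in the model summary further down can be checked by hand: a Conv2D layer has kernel_h × kernel_w × in_channels × out_channels weights plus one bias per output channel, and a BatchNormalization layer stores 4 values per channel (gamma, beta, moving mean, moving variance). A quick sketch reproducing a few of the counts that model.summary() prints:

```python
def conv2d_params(kernel, in_ch, out_ch):
    """Weights (k * k * in * out) plus one bias per output channel."""
    return kernel * kernel * in_ch * out_ch + out_ch

def batchnorm_params(channels):
    """gamma, beta, moving mean, moving variance: 4 values per channel."""
    return 4 * channels

print(conv2d_params(7, 3, 64))    # conv1_conv: 9472
print(conv2d_params(1, 64, 64))   # conv2_block1_1_conv: 4160
print(conv2d_params(3, 64, 64))   # conv2_block1_2_conv: 36928
print(batchnorm_params(64))       # conv1_bn: 256
```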

In [2]:
model = ResNet50()

Plot the model

In [3]:
plot_model(model, to_file='output/resnet50_model.png')
Out[3]: [architecture diagram, saved to output/resnet50_model.png]

Model Summary

In [4]:
model.summary()
Model: "resnet50"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, 230, 230, 3)  0           input_1[0][0]                    
__________________________________________________________________________________________________
conv1_conv (Conv2D)             (None, 112, 112, 64) 9472        conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 64) 256         conv1_conv[0][0]                 
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, 112, 112, 64) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D)       (None, 114, 114, 64) 0           conv1_relu[0][0]                 
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D)       (None, 56, 56, 64)   0           pool1_pad[0][0]                  
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, 56, 56, 64)   4160        pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 64)   0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 56, 56, 64)   0           conv2_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D)    (None, 56, 56, 256)  16640       pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_add (Add)          (None, 56, 56, 256)  0           conv2_block1_0_bn[0][0]          
                                                                 conv2_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_out (Activation)   (None, 56, 56, 256)  0           conv2_block1_add[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block1_out[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 64)   0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 56, 56, 64)   0           conv2_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_add (Add)          (None, 56, 56, 256)  0           conv2_block1_out[0][0]           
                                                                 conv2_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_out (Activation)   (None, 56, 56, 256)  0           conv2_block2_add[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block2_out[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 64)   0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 56, 56, 64)   0           conv2_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_add (Add)          (None, 56, 56, 256)  0           conv2_block2_out[0][0]           
                                                                 conv2_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_out (Activation)   (None, 56, 56, 256)  0           conv2_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, 28, 28, 128)  32896       conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128)  0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 28, 28, 128)  0           conv3_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D)    (None, 28, 28, 512)  131584      conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_add (Add)          (None, 28, 28, 512)  0           conv3_block1_0_bn[0][0]          
                                                                 conv3_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_out (Activation)   (None, 28, 28, 512)  0           conv3_block1_add[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block1_out[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128)  0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 28, 28, 128)  0           conv3_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_add (Add)          (None, 28, 28, 512)  0           conv3_block1_out[0][0]           
                                                                 conv3_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_out (Activation)   (None, 28, 28, 512)  0           conv3_block2_add[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block2_out[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128)  0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 28, 28, 128)  0           conv3_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_add (Add)          (None, 28, 28, 512)  0           conv3_block2_out[0][0]           
                                                                 conv3_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_out (Activation)   (None, 28, 28, 512)  0           conv3_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128)  0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 28, 28, 128)  0           conv3_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_add (Add)          (None, 28, 28, 512)  0           conv3_block3_out[0][0]           
                                                                 conv3_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_out (Activation)   (None, 28, 28, 512)  0           conv3_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, 14, 14, 256)  131328      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 256)  0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 14, 14, 256)  0           conv4_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D)    (None, 14, 14, 1024) 525312      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_0_bn[0][0]          
                                                                 conv4_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_out (Activation)   (None, 14, 14, 1024) 0           conv4_block1_add[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block1_out[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 256)  0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 14, 14, 256)  0           conv4_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_out[0][0]           
                                                                 conv4_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_out (Activation)   (None, 14, 14, 1024) 0           conv4_block2_add[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block2_out[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 256)  0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 14, 14, 256)  0           conv4_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_add (Add)          (None, 14, 14, 1024) 0           conv4_block2_out[0][0]           
                                                                 conv4_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_out (Activation)   (None, 14, 14, 1024) 0           conv4_block3_add[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block3_out[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 256)  0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 14, 14, 256)  0           conv4_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_add (Add)          (None, 14, 14, 1024) 0           conv4_block3_out[0][0]           
                                                                 conv4_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_out (Activation)   (None, 14, 14, 1024) 0           conv4_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 256)  0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 14, 14, 256)  0           conv4_block5_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block5_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block5_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_add (Add)          (None, 14, 14, 1024) 0           conv4_block4_out[0][0]           
                                                                 conv4_block5_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_out (Activation)   (None, 14, 14, 1024) 0           conv4_block5_add[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block5_out[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 256)  0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 14, 14, 256)  0           conv4_block6_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block6_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block6_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_add (Add)          (None, 14, 14, 1024) 0           conv4_block5_out[0][0]           
                                                                 conv4_block6_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_out (Activation)   (None, 14, 14, 1024) 0           conv4_block6_add[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D)    (None, 7, 7, 512)    524800      conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 512)    0           conv5_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 7, 7, 512)    0           conv5_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D)    (None, 7, 7, 2048)   2099200     conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_0_bn[0][0]          
                                                                 conv5_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_out (Activation)   (None, 7, 7, 2048)   0           conv5_block1_add[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block1_out[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 512)    0           conv5_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 7, 7, 512)    0           conv5_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_out[0][0]           
                                                                 conv5_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_out (Activation)   (None, 7, 7, 2048)   0           conv5_block2_add[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block2_out[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 512)    0           conv5_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 7, 7, 512)    0           conv5_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_add (Add)          (None, 7, 7, 2048)   0           conv5_block2_out[0][0]           
                                                                 conv5_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_out (Activation)   (None, 7, 7, 2048)   0           conv5_block3_add[0][0]           
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 2048)         0           conv5_block3_out[0][0]           
__________________________________________________________________________________________________
predictions (Dense)             (None, 1000)         2049000     avg_pool[0][0]                   
==================================================================================================
Total params: 25,636,712
Trainable params: 25,583,592
Non-trainable params: 53,120
__________________________________________________________________________________________________

Load image

In [5]:
imageFolder = 'input/'
filePath = imageFolder + 'pizza.jpg'
In [7]:
# load_img() loads the image from disk and resizes it to 224 x 224 pixels.
image1 = image.load_img(filePath, target_size = (224, 224))
image1
Out[7]:
(the loaded and resized pizza image is displayed)

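Under the hood, image.load_img is a thin wrapper around PIL. A minimal sketch of the same load-and-resize step, assuming Pillow is installed and using an in-memory dummy image instead of reading pizza.jpg from disk:

```python
from PIL import Image

# Build a dummy white RGB image standing in for pizza.jpg (illustration only)
img = Image.new('RGB', (640, 480), color=(255, 255, 255))

# Resize to the 224 x 224 input size ResNet50 expects --
# the same effect as load_img(filePath, target_size=(224, 224))
img = img.resize((224, 224))
print(img.size)  # (224, 224)
```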
Convert the image into an array

In [8]:
transformedImage = image.img_to_array(image1)
print(transformedImage.shape)
(224, 224, 3)
In [9]:
print(transformedImage)
[[[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]

 [[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]

 [[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]

 ...

 [[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]

 [[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]

 [[255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]
  ...
  [255. 255. 255.]
  [255. 255. 255.]
  [255. 255. 255.]]]

Expand the dimension

In [10]:
import numpy as np
In [11]:
transformedImage = np.expand_dims(transformedImage, axis = 0)
transformedImage.shape
Out[11]:
(1, 224, 224, 3)
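The model expects a batch axis in front of the (height, width, channels) array. np.expand_dims is one of several equivalent ways to add it, sketched here with a dummy array standing in for the real img_to_array output:

```python
import numpy as np

# Dummy array in place of the (224, 224, 3) img_to_array output
img = np.zeros((224, 224, 3), dtype=np.float32)

# Three equivalent ways to prepend the batch dimension
batch_a = np.expand_dims(img, axis=0)
batch_b = img[np.newaxis, ...]
batch_c = img.reshape((1,) + img.shape)

print(batch_a.shape)  # (1, 224, 224, 3)
```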

Preprocess the image

In [12]:
from tensorflow.keras.applications.resnet50 import preprocess_input

Each Keras Application expects a specific kind of input preprocessing. For ResNet, call tf.keras.applications.resnet.preprocess_input on your inputs before passing them to the model. resnet.preprocess_input will convert the input images from RGB to BGR, then will zero-center each color channel with respect to the ImageNet dataset, without scaling.

tf.keras.applications.resnet50.preprocess_input

Preprocesses a tensor or Numpy array encoding a batch of images.

Args

x: A floating point numpy.array or a tf.Tensor, 3D or 4D with 3 color channels, with values in the range [0, 255]. The preprocessed data are written over the input data if the data types are compatible. To avoid this behaviour, numpy.copy(x) can be used.

data_format: Optional data format of the image tensor/array. Defaults to None, in which case the global setting tf.keras.backend.image_data_format() is used (unless you changed it, it defaults to "channels_last").

Returns

Preprocessed numpy.array or a tf.Tensor with type float32.

The images are converted from RGB to BGR, then each color channel is zero-centered with respect to the ImageNet dataset, without scaling.
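The description above can be sketched in plain NumPy. This is not the Keras implementation itself, just an illustration of the RGB-to-BGR flip and per-channel zero-centering, assuming the standard ImageNet BGR channel means; on an all-white input it reproduces the values shown in Out[13] below.

```python
import numpy as np

# Assumption: these are the standard ImageNet per-channel means, in BGR order
IMAGENET_BGR_MEANS = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess_sketch(x):
    """RGB float array (..., 3) with values in [0, 255] -> BGR, zero-centered."""
    x = x[..., ::-1].astype(np.float32)  # RGB -> BGR
    return x - IMAGENET_BGR_MEANS        # zero-center each channel, no scaling

white = np.full((1, 224, 224, 3), 255.0, dtype=np.float32)
print(preprocess_sketch(white)[0, 0, 0])  # approx. [151.061, 138.221, 131.32]
```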

In [13]:
transformedImage = preprocess_input(transformedImage)
transformedImage
Out[13]:
array([[[[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]],

        [[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]],

        [[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]],

        ...,

        [[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]],

        [[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]],

        [[151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         ...,
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ],
         [151.061  , 138.22101, 131.32   ]]]], dtype=float32)

Predict

In [14]:
prediction = model.predict(transformedImage)
print(prediction)
[[1.53870553e-06 5.85323760e-05 9.26404553e-08 1.92010498e-06
  3.49395094e-07 8.91648699e-04 3.31347974e-06 2.06661321e-06
  1.61801972e-06 1.25758311e-06 1.56449505e-06 3.02117144e-07
  9.85964903e-07 8.33783673e-08 1.86873677e-07 1.21270732e-05
  8.88726936e-06 7.37905964e-07 1.54196812e-06 2.22080689e-06
  1.21399751e-07 1.14409559e-05 5.64423499e-06 1.84831413e-06
  1.37469328e-06 6.07212996e-07 1.99781243e-06 3.29862523e-05
  3.99163173e-06 4.91129583e-07 4.41850170e-06 2.29254215e-06
  1.81265204e-05 7.38271638e-06 1.59004721e-05 2.52556163e-07
  3.32849208e-06 5.31865908e-07 4.63150238e-04 1.50021310e-06
  7.18194951e-06 2.11703627e-06 3.18732527e-05 9.82023266e-06
  3.05579277e-07 1.12513308e-05 9.90947410e-06 5.75146396e-06
  7.11276243e-07 8.47010710e-08 9.94195602e-07 8.78258197e-06
  3.99042892e-06 1.56396243e-06 1.56569731e-05 5.86996237e-07
  8.02522777e-08 9.29750215e-08 1.10136291e-07 2.01663124e-06
  6.24580525e-06 3.27869998e-07 6.95582116e-08 3.63403560e-05
  1.27778748e-07 2.39432234e-06 2.42995829e-05 3.71117267e-06
  1.43823709e-05 1.39762378e-05 3.10394711e-07 9.74835075e-07
  1.60212224e-07 1.17585823e-05 7.63923879e-07 1.40274869e-05
  2.38092912e-06 4.01359921e-06 2.44797066e-05 6.44233239e-07
  6.24547738e-08 1.44793876e-06 4.17766432e-06 1.72429770e-06
  2.04512375e-07 5.92996969e-07 1.49726168e-06 1.06318771e-06
  4.80214248e-06 5.74303378e-08 2.19312165e-06 3.95035130e-07
  1.41069540e-06 8.26465850e-07 2.40736858e-06 8.91434766e-08
  9.11699408e-06 1.55877024e-06 7.95036073e-08 1.94659521e-07
  1.02836509e-06 3.68154429e-06 5.03817944e-07 4.17960280e-07
  1.19785440e-07 6.25210390e-08 6.44306965e-07 6.54957830e-07
  1.48180288e-05 3.47967421e-06 5.68622781e-05 6.07708353e-05
  4.89793543e-04 2.86027589e-05 1.12802745e-05 8.23203009e-05
  3.62639636e-04 6.95686686e-05 7.32921171e-06 7.40604009e-05
  2.10058218e-07 9.88661704e-05 5.72442104e-06 1.83084358e-05
  7.31591208e-05 1.22486981e-05 2.39502933e-05 1.15966010e-07
  1.39683095e-07 9.84674315e-08 2.57156501e-07 4.74620720e-07
  1.39340884e-07 1.49533514e-06 7.32911701e-07 2.16649312e-07
  2.23305989e-07 4.99465251e-08 1.04369768e-07 2.69558245e-06
  2.41260636e-06 1.27082808e-06 1.25267184e-06 8.21585786e-07
  1.29073555e-07 6.76100029e-08 1.71463995e-07 9.76996580e-06
  2.05084927e-08 3.37506577e-07 5.29437273e-07 2.21728101e-06
  2.97171982e-06 4.24366533e-07 1.63361221e-07 1.41636929e-06
  3.40159204e-07 2.50809671e-06 3.62950175e-07 7.66268272e-07
  9.62158424e-07 8.67716699e-06 1.85419856e-06 8.51784353e-06
  1.37372865e-06 1.10462020e-06 4.90204968e-07 3.24127313e-06
  3.24507755e-06 7.33340300e-07 5.66528513e-07 1.79922074e-06
  1.71804615e-07 3.78429831e-07 1.33217986e-06 1.07242959e-05
  1.68835720e-06 3.05223693e-07 8.99510906e-06 4.45383392e-07
  1.94972785e-07 2.27667755e-07 4.94935102e-06 6.86532985e-06
  6.91350942e-06 1.33566789e-06 4.11007022e-07 5.12683255e-06
  3.70820962e-05 6.30627983e-06 8.84497126e-07 7.41228723e-05
  1.75835410e-07 2.25558369e-06 9.81078188e-07 1.90147894e-05
  3.14344220e-07 6.06962828e-07 1.74595618e-06 3.49890797e-06
  2.84371168e-07 2.00309401e-07 2.28459339e-06 2.92116596e-07
  1.31611145e-07 9.37908226e-07 3.25675319e-06 5.47480749e-06
  2.78291122e-06 1.39056215e-06 2.62806179e-06 3.94595145e-06
  8.14323503e-05 2.26381167e-06 5.00556041e-07 6.35380775e-06
  3.38862310e-06 5.04075342e-06 3.26772306e-06 1.20412515e-05
  7.68659083e-06 8.24354686e-07 5.29988426e-07 7.75682020e-06
  1.98616908e-06 1.78281675e-06 6.19787045e-07 5.47416278e-07
  3.94578024e-07 4.30884512e-07 2.12418428e-07 7.80399660e-06
  8.19838442e-06 7.42403381e-06 1.87213573e-06 2.99497879e-06
  2.13820113e-06 1.74950821e-06 1.21063877e-07 9.63012212e-07
  3.96442090e-07 3.52769575e-06 4.78840008e-08 3.88754927e-07
  1.39230636e-07 1.10214216e-06 6.04042157e-07 1.00213038e-05
  3.03906631e-06 7.23411659e-08 2.53399503e-06 2.03040290e-05
  3.77114338e-06 1.13187070e-06 7.19639593e-06 1.94835866e-06
  2.82538531e-06 4.58509277e-07 2.27444775e-06 3.96705127e-06
  1.00473035e-05 4.33769992e-06 7.29301109e-06 1.93130859e-07
  8.77679213e-07 3.41937323e-07 4.01237259e-07 8.33187460e-06
  4.23973717e-07 5.02086436e-07 2.40817440e-06 5.60335820e-07
  1.19287779e-06 6.27707379e-07 1.92357447e-06 6.51021480e-07
  2.88266801e-05 8.02297450e-07 8.07982019e-08 2.28067711e-06
  6.62354807e-07 1.34801494e-05 7.42853126e-06 3.57599274e-05
  8.64477170e-06 4.75126617e-05 1.01944988e-06 2.01789426e-05
  3.26916488e-05 7.04958347e-06 4.62992784e-06 3.74753836e-06
  9.51256368e-07 2.87790772e-05 1.26289535e-06 8.34852969e-08
  1.06698144e-06 3.19130635e-07 2.36240726e-06 2.81323646e-05
  1.08672175e-07 6.40532107e-06 7.10415588e-06 8.71734392e-06
  4.76608557e-06 7.13357622e-07 3.94667622e-05 4.66370466e-06
  1.98599011e-07 5.39616713e-06 5.92629585e-05 4.77621870e-06
  4.34242247e-06 7.56316751e-07 5.05804855e-05 3.76991716e-06
  1.28639641e-07 2.33457358e-05 4.89046979e-05 5.02692774e-06
  9.49524349e-07 8.09138783e-06 2.13771182e-06 7.38412837e-07
  5.17161743e-06 5.55453562e-06 2.72689540e-05 1.51733141e-02
  3.21300409e-04 7.28452796e-05 2.94669178e-07 8.90772299e-07
  1.16124227e-06 1.34299898e-05 4.08571935e-07 5.86833657e-08
  2.73541787e-07 1.30921933e-06 2.07433959e-06 2.50829402e-07
  3.40826801e-07 3.71209649e-06 5.23607071e-07 1.67802199e-07
  9.25685697e-08 2.68683181e-07 1.28359858e-07 1.42973556e-06
  4.93828736e-07 8.50205026e-08 6.36745312e-07 2.52444693e-07
  7.92015769e-08 8.82682059e-07 1.55085070e-06 6.39432255e-06
  2.97710949e-06 5.03189881e-07 2.11007432e-06 2.03059662e-05
  8.69612450e-07 1.16093656e-06 1.12063972e-05 5.10807013e-07
  3.65043377e-07 1.12876410e-06 8.98407464e-08 2.53863789e-07
  6.60796005e-08 8.05468545e-08 1.84337907e-07 2.37538245e-07
  6.10181345e-08 8.20940386e-07 4.19451311e-07 3.61493591e-08
  3.98123944e-07 2.58724697e-07 1.57893609e-07 3.63235308e-08
  3.12714192e-08 4.11499030e-08 4.03104735e-07 1.39281767e-07
  2.95209247e-07 1.09173907e-05 3.24047420e-07 1.20771165e-06
  5.33794264e-07 5.86127317e-06 3.15439756e-05 3.80380840e-07
  5.60829585e-06 9.89850378e-07 1.45524393e-06 2.08311685e-05
  2.03836044e-06 1.01898104e-05 3.54043209e-06 1.51743052e-05
  4.13852513e-06 4.72343345e-06 4.83310896e-06 1.65718077e-07
  6.90500542e-07 3.41499504e-06 4.47955415e-07 1.68586567e-07
  2.58748738e-08 1.04540071e-04 2.66891184e-05 5.96975260e-05
  1.38908099e-05 1.24915027e-06 1.42024255e-05 2.66319257e-04
  3.15587931e-06 1.81267694e-06 4.05480969e-05 1.51005934e-03
  7.57082034e-06 5.10653808e-06 1.05404151e-05 2.16090957e-05
  3.58787673e-07 5.02249241e-05 3.76844719e-06 4.56071903e-05
  8.32946807e-06 1.31725722e-06 3.83815632e-07 7.99269037e-06
  5.15937654e-06 2.43569707e-06 1.29320612e-03 1.56553906e-05
  1.91956883e-06 1.94840140e-06 9.79265224e-06 4.23115989e-06
  3.01502769e-06 9.80981781e-07 1.40595034e-06 2.26225875e-05
  2.63483753e-06 3.93788787e-06 5.16639033e-04 1.39546219e-05
  4.52422646e-05 4.63280486e-07 3.14529665e-08 6.31678995e-06
  1.23577702e-05 1.61038352e-05 4.41901790e-07 2.78000402e-06
  5.19444393e-06 3.84136401e-05 5.68246037e-07 7.92934861e-06
  2.28192152e-06 1.06723946e-05 6.39691207e-05 9.46951623e-05
  1.00088300e-05 1.13477226e-05 3.43073708e-07 7.65901495e-05
  8.72903684e-07 7.67353140e-06 3.94373405e-04 1.36977192e-07
  8.92726632e-07 6.14692399e-04 3.67976804e-06 1.63830441e-06
  2.09616883e-06 1.25840612e-04 1.30808243e-04 3.15644206e-06
  6.73011243e-07 2.20778511e-05 7.42975635e-06 6.31028342e-07
  8.02592012e-07 6.68893381e-06 1.56571139e-06 8.78863193e-06
  1.24293010e-05 6.18572130e-06 1.65899849e-06 7.67990787e-06
  7.74051514e-05 8.45800787e-06 2.62693302e-05 3.08745734e-06
  1.26298146e-05 1.48454330e-06 6.50272887e-06 1.56917813e-04
  8.50508741e-07 1.74257127e-06 2.59010340e-05 6.63730998e-06
  7.29899839e-05 7.95555661e-06 6.68606617e-07 3.27834059e-05
  1.16561489e-04 9.32795160e-07 7.07800837e-07 2.00061891e-06
  3.15002049e-04 1.08683946e-06 9.76398678e-06 3.64728803e-05
  2.21238724e-06 1.74677234e-06 8.31904060e-07 2.98366522e-05
  1.62667322e-06 1.74942070e-05 2.16884064e-06 3.97774020e-05
  1.72798502e-06 1.87883234e-06 1.90974674e-06 1.39721733e-05
  5.98592896e-06 9.07067006e-05 3.61950893e-04 3.17774320e-05
  6.44417878e-06 9.40149766e-05 4.53462781e-05 1.35535811e-06
  7.61190222e-07 1.34224990e-07 2.12857245e-07 2.38389987e-03
  2.35896152e-07 5.13150189e-05 2.29103316e-05 5.75485792e-05
  7.82409043e-05 9.35889420e-06 1.86913603e-05 1.11755313e-07
  1.20680215e-06 1.07327173e-03 9.29408907e-06 7.35228969e-05
  4.56712623e-05 3.48351591e-06 2.62232220e-07 1.51671571e-07
  1.40157063e-05 6.66870183e-05 9.90691774e-07 2.55864521e-04
  7.45284524e-07 5.65872188e-06 4.45039723e-06 4.85094024e-05
  5.82449047e-06 6.37630649e-07 5.92654430e-07 4.90238192e-03
  2.87052080e-05 6.20911962e-08 8.47553383e-05 3.97088206e-06
  2.59109656e-05 5.05026378e-07 9.00427585e-06 7.47953948e-07
  7.87726265e-07 1.07040851e-05 1.48811050e-05 1.63689731e-06
  1.98998191e-06 1.17303284e-06 4.36494292e-06 1.47203637e-05
  1.47774626e-04 1.28552083e-05 1.60495517e-06 2.39379966e-04
  6.36109849e-04 5.89256633e-05 2.44343883e-05 1.80394563e-04
  9.16518547e-06 1.20246870e-04 3.81838481e-06 5.01759314e-06
  3.43468528e-05 3.34304946e-06 6.32679416e-07 1.57547235e-01
  1.17165770e-03 2.54372007e-05 1.46768807e-05 1.45577204e-07
  6.66150254e-06 2.38625380e-05 9.89227556e-05 7.13658073e-06
  1.12921143e-05 1.48673530e-06 8.46067815e-06 2.27714227e-05
  1.01790920e-06 1.99244405e-05 2.50322660e-06 3.81940072e-06
  4.84577777e-06 1.39808371e-05 3.69011213e-05 2.62217975e-04
  1.49708896e-04 6.65373136e-06 2.11514453e-05 4.03592712e-04
  1.49306004e-07 1.92727001e-07 1.98912551e-03 1.32591333e-06
  2.02121257e-07 9.75741477e-06 4.28912699e-06 1.11929017e-04
  8.51972891e-06 1.63012937e-05 6.90010154e-07 2.07129142e-06
  5.77077386e-04 3.98189059e-06 7.66154608e-06 6.09116250e-06
  5.55321640e-06 8.58842905e-05 1.81135147e-06 4.08615197e-05
  1.00966514e-04 4.29422073e-07 1.34431946e-06 1.35367163e-05
  9.33445244e-06 3.53410701e-06 2.09275022e-05 2.72847858e-04
  9.23249274e-07 6.78314905e-07 3.92951097e-06 2.24568794e-05
  5.66011749e-07 1.32356291e-07 2.61082867e-04 5.00032183e-05
  7.96847962e-06 9.35326341e-08 3.47702553e-05 3.47828234e-07
  1.34234469e-05 1.23786072e-06 2.02708670e-05 1.06650687e-05
  2.05017173e-07 8.36852269e-06 6.78771812e-07 5.80999142e-07
  6.61182457e-06 8.52234152e-05 5.86205279e-05 3.84597541e-07
  3.04491041e-05 1.83770095e-03 2.36671549e-05 7.08835069e-05
  6.37413996e-06 3.22229083e-04 5.50591949e-06 8.87749820e-07
  2.03272994e-05 2.29159014e-07 2.39033208e-04 1.93985041e-08
  4.04760394e-05 3.21573066e-06 9.56043891e-07 3.96362708e-07
  1.43311510e-03 1.41106348e-05 1.35145490e-06 1.49764310e-04
  2.98093772e-04 5.23981998e-06 7.49634921e-07 2.28754216e-05
  7.58403097e-04 5.33811653e-05 2.17883940e-06 5.55966562e-06
  1.25403335e-06 3.18495182e-07 4.76265711e-07 1.78335839e-07
  4.65092171e-06 4.40516276e-03 1.46244798e-04 2.51671881e-05
  9.61067126e-05 2.60366760e-05 4.50471634e-06 3.19998293e-07
  6.31296552e-06 1.99562828e-06 1.79407709e-06 1.26580799e-05
  6.20271658e-06 2.16182907e-05 1.31315392e-05 2.88228171e-06
  2.54041428e-08 3.15248030e-06 6.01277643e-05 1.47714104e-07
  1.10077141e-04 4.03331251e-05 2.75135494e-06 2.64549308e-04
  1.34986158e-06 4.78575021e-05 9.98170080e-08 8.47175179e-05
  3.07159826e-06 1.35126811e-06 6.52730205e-06 5.17082390e-06
  3.25735746e-05 2.94537283e-04 2.02501935e-04 6.88741920e-07
  1.01542707e-06 6.84294491e-07 9.18275327e-06 2.29398884e-05
  1.62794589e-04 2.49814941e-04 5.85482849e-05 4.63152844e-07
  1.41214650e-06 4.18840937e-05 1.16706091e-04 7.39843381e-06
  5.18185698e-06 6.79667301e-06 1.05029276e-05 3.91594222e-06
  3.20321669e-05 9.51127658e-05 8.34018720e-05 1.93852270e-06
  9.95830646e-07 8.03801140e-06 1.21311743e-04 5.76468767e-04
  6.55329904e-06 3.89547116e-04 1.12054668e-05 1.77151060e-06
  5.46803749e-05 1.64973972e-05 2.27212677e-05 5.62534318e-04
  1.63546883e-05 3.55380944e-05 5.46438328e-04 7.86121802e-07
  3.26373481e-07 1.12467978e-06 1.02687800e-05 2.51576741e-04
  2.57262000e-04 2.14626180e-05 9.21752417e-06 3.68353199e-06
  6.19343950e-07 7.02049920e-06 6.93368747e-06 6.73802742e-06
  1.54175214e-04 6.39290192e-06 8.84887850e-05 1.58158491e-06
  7.28246778e-06 3.67952813e-03 4.29928377e-06 1.78288901e-05
  7.31152454e-07 5.12490669e-06 3.84694005e-07 6.23597543e-07
  4.40787298e-05 1.92661673e-06 4.42889677e-06 3.42802764e-06
  1.64293342e-05 2.91078351e-04 1.31207116e-05 1.56140974e-04
  6.11052826e-07 7.79640209e-03 8.80928255e-08 5.45225657e-06
  1.25090867e-06 1.62059948e-07 1.30626686e-05 1.23072198e-07
  1.15891062e-07 7.37652613e-07 6.13193833e-06 1.93041560e-04
  9.88929532e-04 1.92565694e-05 1.09846587e-05 1.54499830e-05
  1.02560422e-04 2.61006932e-07 7.18548181e-05 4.48518898e-04
  1.54674149e-06 1.11624402e-07 2.11985771e-05 6.02653131e-07
  1.22672973e-05 1.03021403e-05 2.15960870e-04 4.71922107e-07
  2.58703494e-05 1.93411961e-05 4.23659330e-05 1.07878532e-05
  1.27313062e-04 3.54099757e-05 9.30514852e-06 7.25549057e-07
  2.63503352e-06 2.13704311e-06 1.16393594e-05 1.38501200e-05
  1.17794305e-04 3.30701459e-06 1.34761640e-05 1.97475209e-04
  5.39369739e-06 3.74851261e-05 6.65793323e-06 7.77897099e-03
  8.55733288e-06 1.59300311e-04 2.05281208e-06 1.09106304e-06
  9.44801968e-07 1.26658435e-07 5.85580847e-06 4.03426839e-06
  7.19833188e-04 1.67179496e-06 7.74698765e-06 2.97516226e-06
  5.22154987e-05 2.33947367e-06 3.46557911e-08 1.02587705e-06
  4.64014520e-05 2.80521931e-06 8.35714127e-06 1.34705566e-04
  2.56133058e-06 2.54226848e-06 2.43441741e-06 3.67154644e-05
  3.52889037e-06 3.64630064e-03 3.15852449e-05 8.97808877e-06
  5.80786946e-07 3.88386798e-06 2.14807997e-07 8.98561790e-04
  9.27077635e-05 2.51295581e-03 4.61454983e-05 7.47689342e-07
  7.16498762e-04 1.16487266e-04 1.69545747e-05 1.04265810e-05
  1.63938432e-06 1.37317193e-06 6.96723801e-05 1.14610884e-04
  1.35250710e-04 1.42671797e-05 5.60227418e-05 3.99102237e-06
  6.66693325e-07 2.15188975e-05 5.63024369e-04 1.00532616e-03
  2.75274601e-06 3.77091669e-06 2.93813756e-08 5.53291443e-07
  8.35859682e-05 3.64875814e-06 6.52101153e-06 7.64131164e-06
  1.98192229e-05 1.00648402e-04 2.14040156e-05 6.83855440e-04
  3.82271898e-03 8.07032993e-05 1.51757222e-05 9.00043378e-05
  3.70337861e-04 3.35331541e-04 1.33020520e-01 2.09528458e-04
  1.81897864e-04 1.28880565e-05 1.36317860e-04 7.83298339e-04
  1.36239105e-04 1.01752591e-03 8.88546463e-04 9.30022565e-04
  2.98320578e-04 2.64274313e-05 2.43091679e-04 5.40798064e-04
  4.88372134e-05 7.07155850e-05 2.78034577e-06 1.37436436e-04
  1.33758201e-03 1.43239023e-02 4.12621628e-03 3.32265190e-05
  4.08352644e-05 1.05674349e-04 3.46671510e-03 2.83698202e-04
  4.36519513e-05 3.30765499e-04 6.87749207e-06 2.19693047e-05
  7.66896293e-04 7.49520259e-04 2.09736195e-03 5.59759676e-01
  8.81122367e-04 3.10025271e-03 4.89941624e-04 1.51975997e-04
  1.63988487e-04 5.77317667e-04 9.96052904e-07 2.20001475e-05
  6.06664025e-06 1.41505634e-05 1.49490745e-06 2.28484282e-07
  7.96640961e-07 4.08848518e-06 7.06164838e-06 8.16827139e-07
  9.98317205e-07 4.92763206e-07 2.58587670e-06 2.31079284e-05
  8.25037375e-07 1.69193709e-06 1.11937982e-06 7.07295490e-03
  5.72726049e-06 4.51224105e-06 5.78030304e-04 1.28231150e-05
  2.75670118e-05 4.17038382e-06 8.92590440e-04 7.93279742e-06
  1.65282865e-04 5.87754221e-05 2.06741178e-03 3.15041398e-05]]
In [15]:
print(prediction.shape)
(1, 1000)
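The (1, 1000) output row is a softmax distribution over the 1000 ImageNet classes: the values are non-negative and sum to one, and argmax gives the most likely class index. A sketch with dummy logits standing in for an actual forward pass:

```python
import numpy as np

# Dummy logits in place of a real model output (illustration only)
rng = np.random.default_rng(0)
logits = rng.standard_normal((1, 1000)).astype(np.float32)

# Softmax: exponentiate and normalize so each row sums to 1
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

print(probs.shape)          # (1, 1000)
print(probs.sum())          # ~1.0
best = int(probs.argmax())  # index of the most likely class
```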
In [16]:
from tensorflow.keras.applications.resnet50 import decode_predictions
In [17]:
predictionLabel = decode_predictions(prediction, top = 1)
predictionLabel
Out[17]:
[[('n07873807', 'pizza', 0.5597597)]]
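Internally, decode_predictions picks the k largest scores and maps their indices to ImageNet labels (downloading imagenet_class_index.json on first use). A minimal sketch of just the top-k selection, with the label lookup omitted and an index chosen purely as a stand-in:

```python
import numpy as np

def top_k(preds, k=1):
    # Indices of the k largest scores in the (1, 1000) prediction row
    idx = np.argsort(preds[0])[::-1][:k]
    return [(int(i), float(preds[0][i])) for i in idx]

# Stand-in prediction with one dominant class (index 963 is illustrative only)
dummy = np.zeros((1, 1000), dtype=np.float32)
dummy[0, 963] = 0.5
print(top_k(dummy, k=1))  # [(963, 0.5)]
```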
In [18]:
predictionLabel[0][0]
Out[18]:
('n07873807', 'pizza', 0.5597597)
In [19]:
predictionLabel[0][0][1]
Out[19]:
'pizza'
In [20]:
predictionLabel[0][0][2]*100
Out[20]:
55.97596764564514
In [21]:
print('%s (%.2f%%)' % (predictionLabel[0][0][1], predictionLabel[0][0][2]*100 ))
pizza (55.98%)
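The formatting above can be wrapped in a small helper that takes the nested list decode_predictions returns (a sketch, using the decoded result from this notebook as sample input):

```python
def format_top1(decoded):
    # decoded has the shape decode_predictions returns: [[(wnid, label, score), ...]]
    wnid, label, score = decoded[0][0]
    return '%s (%.2f%%)' % (label, score * 100)

print(format_top1([[('n07873807', 'pizza', 0.5597597)]]))  # pizza (55.98%)
```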
