Comet.ml Confusion Matrix

This page is available as an executable or viewable Jupyter Notebook.


Comet.ml can generate a variety of visualizations, including line charts, scatter charts, bar charts, and histograms. This notebook explores Comet's confusion matrix chart.

Setup

The first thing we'll do in this notebook tutorial is install comet_ml and the other libraries we'll need for this demonstration: keras, tensorflow, and numpy.

First, comet_ml (you may want to do this slightly differently on your computer):

In [ ]:
%pip install --upgrade --upgrade-strategy eager --user comet_ml 

And now tensorflow, keras, and numpy:

In [ ]:
%pip install --upgrade --upgrade-strategy eager --user keras tensorflow numpy

As the output may suggest, if anything got updated, it might be a good idea to restart the kernel and continue from here.

Comet Configuration

To run the following experiments, you'll need to set your COMET_API_KEY. The easiest way to do this is to set the values in a cell like this:

import comet_ml

comet_ml.config.save(api_key="...")

where you replace the ... with your key.

You can get your COMET_API_KEY under your quickstart link (replace YOUR_USERNAME with your Comet.ml username):

https://www.comet.ml/YOUR_USERNAME/quickstart
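
Alternatively, the key can be supplied through the COMET_API_KEY environment variable, which comet_ml picks up automatically. A minimal sketch (set it before creating any Experiment):

In [ ]:
import os

# comet_ml reads COMET_API_KEY from the environment when an Experiment is created.
os.environ["COMET_API_KEY"] = "..."  # replace the ... with your actual key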

Example 1: Simple Confusion Matrix

First, we will create an experiment:

In [3]:
from comet_ml import Experiment

We're not interested at the moment in logging environment details or the code and related items, so we won't log those:

In [4]:
experiment = Experiment(project_name="confusion-matrix", log_env_details=False, log_code=False)
COMET INFO: Experiment is live on comet.ml https://www.comet.ml/dsblank/confusion-matrix/83b0b7a63fcd40eda9197bdad32774f9

As a simple example, let's consider that we have these six patterns that are our output targets (desired output):

In [5]:
desired_output = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
 ]

Imagine that this is a classification task where each target (desired output) is composed of three output values, with one unit "on" (set to 1) and the others "off" (set to 0). This is sometimes called a "one-hot" representation and is a common way of representing categories. There are six patterns, two for each category.
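
If your labels start out as plain class indices rather than one-hot rows, they are easy to convert. A minimal sketch using numpy (installed above); the helper name is our own:

In [ ]:
import numpy as np

# Illustrative helper: turn class indices into one-hot rows.
def to_one_hot(indices, num_classes):
    one_hot = np.zeros((len(indices), num_classes), dtype=int)
    one_hot[np.arange(len(indices)), indices] = 1
    return one_hot

to_one_hot([0, 1, 2, 0, 1, 2], 3)  # reproduces desired_output above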

Now, let's make up some sample data that a model might produce. Let's say initially that the output is pretty random and doesn't even add up to 1 for each row. This may be unrealistic, as many such classification tasks use a softmax output trained with a cross-entropy loss, which would make each row sum to (approximately) 1. That might be desirable, but it is not required for our example here.

In [6]:
actual_output = [
    [0.1, 0.5, 0.4],
    [0.2, 0.2, 0.3],
    [0.7, 0.4, 0.5],
    [0.3, 0.8, 0.3],
    [0.0, 0.5, 0.3],
    [0.1, 0.5, 0.5],
 ]
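
As an aside, if you did want each row to behave like probabilities that sum to 1, a softmax over each row would do it. A quick sketch, for illustration only:

In [ ]:
import numpy as np

def softmax(rows):
    # Subtract each row's max for numerical stability, then normalize.
    exps = np.exp(rows - np.max(rows, axis=1, keepdims=True))
    return exps / exps.sum(axis=1, keepdims=True)

softmax(np.array(actual_output)).sum(axis=1)  # every entry is now 1.0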

Our goal now is to visualize how much the model mixes up the categories. That is, we'd like to see the Confusion Matrix comparing all categories against each other. We can do that easily by simply logging it with the experiment:

In [7]:
experiment.log_confusion_matrix(desired_output, actual_output);
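
For intuition, here is roughly what that call computes: each row of desired_output and actual_output is reduced to a single class (by default Comet applies a winner function that, to the best of our knowledge, amounts to an argmax per row), and the matrix counts how often each actual category was predicted as each other category. A hand-rolled sketch, for illustration only:

In [ ]:
import numpy as np

true_classes = np.argmax(desired_output, axis=1)       # the target categories
predicted_classes = np.argmax(actual_output, axis=1)   # the model's picks

matrix = np.zeros((3, 3), dtype=int)
for t, p in zip(true_classes, predicted_classes):
    matrix[t][p] += 1
matrix

Comet builds and renders this for us from the single log_confusion_matrix call above.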

That's it! We can now end the experiment and take a look at the resulting matrix:

In [8]:
experiment.end()
COMET INFO: ---------------------------
COMET INFO: Comet.ml Experiment Summary
COMET INFO: ---------------------------
COMET INFO:   Data:
COMET INFO:     display_summary_level : 1
COMET INFO:     url                   : https://www.comet.ml/dsblank/confusion-matrix/83b0b7a63fcd40eda9197bdad32774f9
COMET INFO:   Uploads:
COMET INFO:     confusion-matrix : 1
COMET INFO: ---------------------------
COMET INFO: Uploading stats to Comet before program termination (may take several seconds)
In [9]:
experiment.display(tab="confusion-matrices")

For more details on this tab, please see the Confusion Matrix user interface documentation.

Example 2: Log Confusion Matrices During Learning

This example will create a series of confusion matrices showing how the model gets less confused as training proceeds.

We will train a model on the standard MNIST digit classification task.

We import the items that we will need:

In [10]:
from tensorflow.keras.callbacks import Callback
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import RMSprop
from tensorflow.keras.utils import to_categorical

from keras.datasets import mnist
Using TensorFlow backend.

We load the training and test sets:

In [11]:
num_classes = 10

# the data, shuffled and split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()

x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)
x_train = x_train.astype("float32")
x_test = x_test.astype("float32")
x_train /= 255
x_test /= 255

# convert class vectors to binary class matrices
y_train = to_categorical(y_train, num_classes)
y_test = to_categorical(y_test, num_classes)
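
At this point the data is flattened and one-hot encoded. A quick sanity check you could run (the expected shapes follow directly from the code above):

In [ ]:
print(x_train.shape, y_train.shape)  # (60000, 784) (60000, 10)
print(x_test.shape, y_test.shape)    # (10000, 784) (10000, 10)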

Define a function to create the model:

In [12]:
def create_model():
    model = Sequential()
    model.add(Dense(128, activation="sigmoid", input_shape=(784,)))
    model.add(Dense(128, activation="sigmoid"))
    model.add(Dense(128, activation="sigmoid"))
    model.add(Dense(10, activation="softmax"))
    model.compile(
        loss="categorical_crossentropy", optimizer=RMSprop(), metrics=["accuracy"]
    )
    return model
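
Instantiating this model and calling count_params() should report 134,794 trainable parameters ((784+1)*128 + (128+1)*128 + (128+1)*128 + (128+1)*10), which matches the trainable_params value in the experiment summaries later in this notebook. A quick check:

In [ ]:
create_model().count_params()  # expected: 134794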

Next, we define a Keras callback to log the confusion matrix:

In [13]:
class ConfusionMatrixCallback(Callback):
    def __init__(self, experiment, inputs, targets):
        self.experiment = experiment
        self.inputs = inputs
        self.targets = targets

    def on_epoch_end(self, epoch, logs={}):
        predicted = self.model.predict(self.inputs)
        self.experiment.log_confusion_matrix(
            self.targets,
            predicted,
            title="Confusion Matrix, Epoch #%d" % (epoch + 1),
            file_name="confusion-matrix-%03d.json" % (epoch + 1),
        )

And create another Comet experiment:

In [14]:
experiment = Experiment(project_name="confusion-matrix", log_env_details=False, log_code=False)
COMET INFO: Experiment is live on comet.ml https://www.comet.ml/dsblank/confusion-matrix/520ae40116b542339a1e3af9b36c6c13

Before any training, we want to log the confusion matrix so that we can see what it looks like before any of the network's weights have been adjusted:

In [15]:
model = create_model()

y_predicted = model.predict(x_test)

We also supply the step (zero, before training), a title, and a file_name:

In [16]:
experiment.log_confusion_matrix(
    y_test,
    y_predicted,
    step=0,
    title="Confusion Matrix, Epoch #0",
    file_name="confusion-matrix-%03d.json" % 0,
);

We now create the callback and train the model for 5 epochs:

In [17]:
callback = ConfusionMatrixCallback(experiment, x_test, y_test)

model.fit(
    x_train,
    y_train,
    batch_size=120,
    epochs=5,
    callbacks=[callback],
    validation_data=(x_test, y_test),
)
COMET INFO: Ignoring automatic log_parameter('verbose') because 'keras:verbose' is in COMET_LOGGING_PARAMETERS_IGNORE
Epoch 1/5
500/500 [==============================] - 3s 7ms/step - loss: 0.7692 - accuracy: 0.7839 - val_loss: 0.3215 - val_accuracy: 0.9076
Epoch 2/5
500/500 [==============================] - 5s 9ms/step - loss: 0.2624 - accuracy: 0.9231 - val_loss: 0.2152 - val_accuracy: 0.9364
Epoch 3/5
500/500 [==============================] - 5s 9ms/step - loss: 0.1884 - accuracy: 0.9438 - val_loss: 0.1677 - val_accuracy: 0.9499
Epoch 4/5
500/500 [==============================] - 6s 12ms/step - loss: 0.1467 - accuracy: 0.9561 - val_loss: 0.1340 - val_accuracy: 0.9600
Epoch 5/5
500/500 [==============================] - 4s 9ms/step - loss: 0.1215 - accuracy: 0.9641 - val_loss: 0.1334 - val_accuracy: 0.9605
Out[17]:
<tensorflow.python.keras.callbacks.History at 0x7feac736c048>
In [18]:
experiment.end()
COMET INFO: ---------------------------
COMET INFO: Comet.ml Experiment Summary
COMET INFO: ---------------------------
COMET INFO:   Data:
COMET INFO:     display_summary_level : 1
COMET INFO:     url                   : https://www.comet.ml/dsblank/confusion-matrix/520ae40116b542339a1e3af9b36c6c13
COMET INFO:   Metrics [count] (min, max):
COMET INFO:     accuracy [5]                 : (0.7838666439056396, 0.9641166925430298)
COMET INFO:     batch_accuracy [250]         : (0.13636364042758942, 1.0)
COMET INFO:     batch_loss [250]             : (0.04782987758517265, 2.5684618949890137)
COMET INFO:     epoch_duration [5]           : (4.1854288920003455, 5.977348262007581)
COMET INFO:     loss [5]                     : (0.12148473411798477, 0.7692136764526367)
COMET INFO:     val_accuracy [5]             : (0.9075999855995178, 0.9605000019073486)
COMET INFO:     val_loss [5]                 : (0.13343778252601624, 0.32149606943130493)
COMET INFO:     validate_batch_accuracy [45] : (0.8801587224006653, 0.9916666746139526)
COMET INFO:     validate_batch_loss [45]     : (0.03737679868936539, 0.3995896577835083)
COMET INFO:   Others:
COMET INFO:     trainable_params : 134794
COMET INFO:   Parameters:
COMET INFO:     Optimizer             : RMSprop
COMET INFO:     RMSprop_centered      : 1
COMET INFO:     RMSprop_decay         : 1
COMET INFO:     RMSprop_epsilon       : 1e-07
COMET INFO:     RMSprop_learning_rate : 0.001
COMET INFO:     RMSprop_momentum      : 1
COMET INFO:     RMSprop_name          : RMSprop
COMET INFO:     RMSprop_rho           : 0.9
COMET INFO:     epochs                : 5
COMET INFO:     steps                 : 500
COMET INFO:   Uploads [count]:
COMET INFO:     confusion-matrix [6] : 6
COMET INFO:     model graph          : 1
COMET INFO: ---------------------------
COMET INFO: Uploading stats to Comet before program termination (may take several seconds)

Now we take a look at the matrices created over the course of training. You can switch between confusion matrices by selecting the name in the upper left-hand corner.

In [19]:
experiment.display(tab="confusion-matrices")

Example 3: Create Images for Each Sample

For this example, we will log an image for each sample, showing up to 25 examples per cell.

To create an example for each item, we write an index_to_example function that takes an index position (an offset into the test data), creates and logs an image, and then returns the assetId in a dict:

In [20]:
def index_to_example(index):
    image_array = x_test[index]
    image_name = "confusion-matrix-%05d.png" % index
    results = experiment.log_image(
        image_array, name=image_name, image_shape=(28, 28, 1)
    )
    # Return sample, assetId (index is added automatically)
    return {"sample": image_name, "assetId": results["imageId"]}

We'll do the same steps as above:

  1. create an experiment
  2. create the model
  3. log the initial confusion matrix
  4. create a callback
  5. train the model
  6. display the experiment
In [21]:
experiment = Experiment(project_name="confusion-matrix", log_env_details=False, log_code=False)
COMET INFO: Experiment is live on comet.ml https://www.comet.ml/dsblank/confusion-matrix/87134010b2024530ac576c94f9ca13f0

In [22]:
model = create_model()

y_predicted = model.predict(x_test)
In [23]:
experiment.log_confusion_matrix(
    y_test,
    y_predicted,
    step=0,
    title="Confusion Matrix, Epoch #0",
    file_name="confusion-matrix-%03d.json" % 0,
    index_to_example_function=index_to_example,
);
In [24]:
class ConfusionMatrixCallbackWithImages(Callback):
    def __init__(self, experiment, inputs, targets):
        self.experiment = experiment
        self.inputs = inputs
        self.targets = targets

    def on_epoch_end(self, epoch, logs={}):
        predicted = self.model.predict(self.inputs)
        self.experiment.log_confusion_matrix(
            self.targets,
            predicted,
            title="Confusion Matrix, Epoch #%d" % (epoch + 1),
            file_name="confusion-matrix-%03d.json" % (epoch + 1),
            index_to_example_function=index_to_example,
        )

Now we'll train as before.

NOTE: this takes a lot longer than before, but we'll see how to speed this up in the next example.

In [25]:
callback = ConfusionMatrixCallbackWithImages(experiment, x_test, y_test)

model.fit(
    x_train,
    y_train,
    batch_size=120,
    epochs=5,
    callbacks=[callback],
    validation_data=(x_test, y_test),
)
Epoch 1/5
500/500 [==============================] - 53s 105ms/step - loss: 0.7570 - accuracy: 0.7906 - val_loss: 0.3016 - val_accuracy: 0.9126
Epoch 2/5
500/500 [==============================] - 51s 103ms/step - loss: 0.2610 - accuracy: 0.9223 - val_loss: 0.2186 - val_accuracy: 0.9340
Epoch 3/5
500/500 [==============================] - 45s 90ms/step - loss: 0.1938 - accuracy: 0.9425 - val_loss: 0.1755 - val_accuracy: 0.9462
Epoch 4/5
500/500 [==============================] - 44s 87ms/step - loss: 0.1525 - accuracy: 0.9553 - val_loss: 0.1425 - val_accuracy: 0.9579
Epoch 5/5
500/500 [==============================] - 45s 90ms/step - loss: 0.1253 - accuracy: 0.9629 - val_loss: 0.1394 - val_accuracy: 0.9576
Out[25]:
<tensorflow.python.keras.callbacks.History at 0x7feb7660c0b8>
In [26]:
experiment.end()
COMET INFO: ---------------------------
COMET INFO: Comet.ml Experiment Summary
COMET INFO: ---------------------------
COMET INFO:   Data:
COMET INFO:     display_summary_level : 1
COMET INFO:     url                   : https://www.comet.ml/dsblank/confusion-matrix/87134010b2024530ac576c94f9ca13f0
COMET INFO:   Metrics [count] (min, max):
COMET INFO:     accuracy [5]                 : (0.7905833125114441, 0.9628999829292297)
COMET INFO:     batch_accuracy [250]         : (0.05833333358168602, 0.9635313749313354)
COMET INFO:     batch_loss [250]             : (0.11790867149829865, 2.550222158432007)
COMET INFO:     epoch_duration [5]           : (43.598175827995874, 53.311691359995166)
COMET INFO:     loss [5]                     : (0.1252802461385727, 0.75697922706604)
COMET INFO:     val_accuracy [5]             : (0.9125999808311462, 0.9578999876976013)
COMET INFO:     val_loss [5]                 : (0.1394064873456955, 0.3016434609889984)
COMET INFO:     validate_batch_accuracy [45] : (0.8910568952560425, 0.9833333492279053)
COMET INFO:     validate_batch_loss [45]     : (0.057086922228336334, 0.3748510479927063)
COMET INFO:   Others:
COMET INFO:     trainable_params : 134794
COMET INFO:   Parameters:
COMET INFO:     Optimizer             : RMSprop
COMET INFO:     RMSprop_centered      : 1
COMET INFO:     RMSprop_decay         : 1
COMET INFO:     RMSprop_epsilon       : 1e-07
COMET INFO:     RMSprop_learning_rate : 0.001
COMET INFO:     RMSprop_momentum      : 1
COMET INFO:     RMSprop_name          : RMSprop
COMET INFO:     RMSprop_rho           : 0.9
COMET INFO:     epochs                : 5
COMET INFO:     steps                 : 500
COMET INFO:   Uploads [count]:
COMET INFO:     confusion-matrix [6] : 6
COMET INFO:     images [4190]        : 4190
COMET INFO:     model graph          : 1
COMET INFO: ---------------------------
COMET INFO: Uploading stats to Comet before program termination (may take several seconds)
In [27]:
experiment.display(tab="confusion-matrices")

What is very nice about this is that if you click on a cell, you can see examples of the digits that fall into that group. For example, if you click on the cell counting the confusion between 8's and 0's, you'll see a sample of exactly which images were confused.

However, there is a large issue with this example. Looking at the summary above, you can see that many thousands of images were uploaded. In addition, if you explore the confusion matrices over the course of learning, you'll see different examples for every epoch. The next example fixes that issue.

Example 4: Reuse a ConfusionMatrix Instance

Now, we want to create example images for each of the cells in the matrix. In addition, we want to re-use the images if we can.

For this, we will create a ConfusionMatrix instance and re-use it.

In [28]:
from comet_ml import ConfusionMatrix

We create a callback like before; however, this time we will keep track of an instance of the ConfusionMatrix:

In [29]:
class ConfusionMatrixCallbackReuseImages(Callback):
    def __init__(self, experiment, inputs, targets, confusion_matrix):
        self.experiment = experiment
        self.inputs = inputs
        self.targets = targets
        self.confusion_matrix = confusion_matrix

    def on_epoch_end(self, epoch, logs={}):
        predicted = self.model.predict(self.inputs)
        self.confusion_matrix.compute_matrix(self.targets, predicted)
        self.experiment.log_confusion_matrix(
            matrix=self.confusion_matrix,
            title="Confusion Matrix, Epoch #%d" % (epoch + 1),
            file_name="confusion-matrix-%03d.json" % (epoch + 1),
        )

We create another Comet experiment:

In [30]:
experiment = Experiment(project_name="confusion-matrix", log_env_details=False, log_code=False)
COMET INFO: Experiment is live on comet.ml https://www.comet.ml/dsblank/confusion-matrix/ebe3ed63818d4aac8495b45bd8807e6c

And another model:

In [31]:
model = create_model()

Again, before training, we compute the model's predictions so that we can log the initial confusion matrix:

In [32]:
# Before any training:
y_predicted = model.predict(x_test)

First, we make an instance, passing in the index_to_example function:

In [33]:
confusion_matrix = ConfusionMatrix(index_to_example_function=index_to_example)

Now, we use the compute_matrix method of the ConfusionMatrix class:

In [34]:
confusion_matrix.compute_matrix(y_test, y_predicted)

We can use the ConfusionMatrix instance to see a rough ASCII version:

In [35]:
confusion_matrix.display()
   A                Confusion Matrix            
   c               Predicted Category           
   t       0   1   2   3   4   5   6   7   8   9
   u   0   0   0   0   0   0   0 980   0   0   0
   a   1   0   0   0   0   0   0 113   0   0   0
   l   2   0   0   0   0   0   0 103   0   0   0
       3   0   0   0   0   0   0 101   0   0   0
   C   4   0   0   0   0   0   0 982   0   0   0
   a   5   0   0   0   0   0   0 892   0   0   0
   t   6   0   0   0   0   0   0 958   0   0   0
   e   7   0   0   0   0   0   0 102   0   0   0
   g   8   0   0   0   0   0   0 974   0   0   0
   o   9   0   0   0   0   0   0 100   0   0   0
   r

This time, instead of logging the actual and predicted vectors, we pass in the entire ConfusionMatrix as the matrix:

In [36]:
experiment.log_confusion_matrix(
    matrix=confusion_matrix,
    step=0,
    title="Confusion Matrix, Epoch #0",
    file_name="confusion-matrix-%03d.json" % 0,
);

Again, we create the callback and train the network (this will take a little more time, as it is generating the assets on the fly):

In [37]:
callback = ConfusionMatrixCallbackReuseImages(experiment, x_test, y_test, confusion_matrix)

model.fit(
    x_train,
    y_train,
    batch_size=120,
    epochs=5,
    callbacks=[callback],
    validation_data=(x_test, y_test),
)
Epoch 1/5
500/500 [==============================] - 40s 80ms/step - loss: 0.7613 - accuracy: 0.7793 - val_loss: 0.3068 - val_accuracy: 0.9096
Epoch 2/5
500/500 [==============================] - 8s 16ms/step - loss: 0.2611 - accuracy: 0.9230 - val_loss: 0.2284 - val_accuracy: 0.9323
Epoch 3/5
500/500 [==============================] - 9s 17ms/step - loss: 0.1878 - accuracy: 0.9441 - val_loss: 0.1738 - val_accuracy: 0.9469
Epoch 4/5
500/500 [==============================] - 7s 14ms/step - loss: 0.1447 - accuracy: 0.9571 - val_loss: 0.1399 - val_accuracy: 0.9579
Epoch 5/5
500/500 [==============================] - 8s 15ms/step - loss: 0.1184 - accuracy: 0.9649 - val_loss: 0.1288 - val_accuracy: 0.9614
Out[37]:
<tensorflow.python.keras.callbacks.History at 0x7fea802f4c18>

We end the experiment (here you can see how many assets were uploaded):

In [38]:
experiment.end()
COMET INFO: ---------------------------
COMET INFO: Comet.ml Experiment Summary
COMET INFO: ---------------------------
COMET INFO:   Data:
COMET INFO:     display_summary_level : 1
COMET INFO:     url                   : https://www.comet.ml/dsblank/confusion-matrix/ebe3ed63818d4aac8495b45bd8807e6c
COMET INFO:   Metrics [count] (min, max):
COMET INFO:     accuracy [5]                 : (0.779283344745636, 0.9648833274841309)
COMET INFO:     batch_accuracy [250]         : (0.10833333432674408, 0.9668043851852417)
COMET INFO:     batch_loss [250]             : (0.11605541408061981, 2.4975242614746094)
COMET INFO:     epoch_duration [5]           : (7.196464898996055, 40.55458005800028)
COMET INFO:     loss [5]                     : (0.11840632557868958, 0.7613354325294495)
COMET INFO:     val_accuracy [5]             : (0.909600019454956, 0.9613999724388123)
COMET INFO:     val_loss [5]                 : (0.12881015241146088, 0.3067838251590729)
COMET INFO:     validate_batch_accuracy [45] : (0.8863821029663086, 1.0)
COMET INFO:     validate_batch_loss [45]     : (0.03379882872104645, 0.37192603945732117)
COMET INFO:   Others:
COMET INFO:     trainable_params : 134794
COMET INFO:   Parameters:
COMET INFO:     Optimizer             : RMSprop
COMET INFO:     RMSprop_centered      : 1
COMET INFO:     RMSprop_decay         : 1
COMET INFO:     RMSprop_epsilon       : 1e-07
COMET INFO:     RMSprop_learning_rate : 0.001
COMET INFO:     RMSprop_momentum      : 1
COMET INFO:     RMSprop_name          : RMSprop
COMET INFO:     RMSprop_rho           : 0.9
COMET INFO:     epochs                : 5
COMET INFO:     steps                 : 500
COMET INFO:   Uploads [count]:
COMET INFO:     confusion-matrix [6] : 6
COMET INFO:     images [1180]        : 1180
COMET INFO:     model graph          : 1
COMET INFO: ---------------------------
COMET INFO: Uploading stats to Comet before program termination (may take several seconds)
COMET INFO: Waiting for completion of the file uploads (may take several seconds)
COMET INFO: Still uploading

First, you'll notice that this trained much faster than the previous example, and the number of images was reduced by about 75%. That is because we reused the examples in each cell where we could.

See the full confusion matrix, complete with sample images in each cell (click on a cell to see the examples):

In [39]:
experiment.display(tab="confusion-matrices")

In the index_to_example function you can return any of the following (a text-returning variant is sketched after this list):

  • an integer, representing the index
  • a string, representing text to show in the Example View
  • a URL, representing a link to show in the Example View
  • a {"sample": NAME, "assetId": ASSET-ID} dictionary, representing an image asset

The ConfusionMatrix object allows many options (see the sketch after this list), including:

  • automatically finding the "most confused" categories, if there are more than 25
  • limiting the categories shown (use ConfusionMatrix(selected=[...]))
  • changing the row and column labels
  • changing the category labels
  • changing the title
  • displaying text, URLs, or images in the Example View
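
As a rough sketch of what this could look like (the keyword names labels, selected, title, row_label, column_label, and max_examples_per_cell reflect our reading of the ConfusionMatrix API; double-check them against the Comet reference documentation):

In [ ]:
# Assumed keyword names; verify against the ConfusionMatrix reference docs.
confusion_matrix = ConfusionMatrix(
    labels=["zero", "one", "two", "three", "four",
            "five", "six", "seven", "eight", "nine"],  # category labels
    selected=[3, 5, 8],                    # limit which categories are shown
    title="My Custom Confusion Matrix",
    row_label="Actual Digit",
    column_label="Predicted Digit",
    max_examples_per_cell=10,
    index_to_example_function=index_to_example,
)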

Example 5: Using Sets of Examples

Now, we'll use one image for an entire set of examples. Previously, we assumed there was one image for each index; here, we change that assumption and use a single image for each set.

To do this, we'll subclass the ConfusionMatrix and override the method that caches the images.

In [40]:
from comet_ml import ConfusionMatrix

The heart of the solution is to change how we map indices to examples. Since we want to map all of the instances of one class to a single example, when we encounter a new example, we map all of them at once.

In the constructor, we grab all of the labels for all of the patterns. We then override _put_example_in_cache() to do the mapping.

In [41]:
class MyConfusionMatrix(ConfusionMatrix):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Use the (global) one-hot y_test targets to record each sample's class:
        self.my_labels = self.winner_function(y_test)

    def _put_example_in_cache(self, index, example):
        # first, we find all of the items in the same set as index:
        this_label = self.my_labels[index]
        index_set = [index for (index, label) in enumerate(self.my_labels) if label == this_label]
        for key in index_set:
            self._cache_example[key] = example

Everything else is the same:

In [42]:
experiment = Experiment(project_name="confusion-matrix", log_env_details=False, log_code=False)
COMET INFO: Experiment is live on comet.ml https://www.comet.ml/dsblank/confusion-matrix/b3c8ac1e374346b2ae7ec385b189772c

And another model:

In [43]:
model = create_model()

Again, before training, we compute the model's predictions so that we can log the initial confusion matrix:

In [44]:
# Before any training:
y_predicted = model.predict(x_test)

First, we make an instance, passing in the index_to_example function:

In [45]:
confusion_matrix = MyConfusionMatrix(index_to_example_function=index_to_example)

Now, we use the compute_matrix method of the ConfusionMatrix class:

In [46]:
confusion_matrix.compute_matrix(y_test, y_predicted)

We can use the MyConfusionMatrix instance to see a rough ASCII version:

In [47]:
confusion_matrix.display()
   A                Confusion Matrix            
   c               Predicted Category           
   t       0   1   2   3   4   5   6   7   8   9
   u   0   0   0   0 980   0   0   0   0   0   0
   a   1   0   0   0 113   0   0   0   0   0   0
   l   2   0   0   0 103   0   0   0   0   0   0
       3   0   0   0 101   0   0   0   0   0   0
   C   4   0   0   0 982   0   0   0   0   0   0
   a   5   0   0   0 892   0   0   0   0   0   0
   t   6   0   0   0 958   0   0   0   0   0   0
   e   7   0   0   0 102   0   0   0   0   0   0
   g   8   0   0   0 974   0   0   0   0   0   0
   o   9   0   0   0 100   0   0   0   0   0   0
   r

This time, instead of logging the actual and predicted vectors, we pass in the entire MyConfusionMatrix as the matrix:

In [48]:
experiment.log_confusion_matrix(
    matrix=confusion_matrix,
    step=0,
    title="Confusion Matrix, Epoch #0",
    file_name="confusion-matrix-%03d.json" % 0,
);

Again, we create the callback and train the network (this will take a little more time, as it is generating the assets on the fly):

In [49]:
callback = ConfusionMatrixCallbackReuseImages(experiment, x_test, y_test, confusion_matrix)

model.fit(
    x_train,
    y_train,
    batch_size=120,
    epochs=5,
    callbacks=[callback],
    validation_data=(x_test, y_test),
)
Epoch 1/5
500/500 [==============================] - 5s 11ms/step - loss: 0.7659 - accuracy: 0.7782 - val_loss: 0.2914 - val_accuracy: 0.9148
Epoch 2/5
500/500 [==============================] - 5s 11ms/step - loss: 0.2560 - accuracy: 0.9236 - val_loss: 0.2106 - val_accuracy: 0.9373
Epoch 3/5
500/500 [==============================] - 5s 11ms/step - loss: 0.1859 - accuracy: 0.9453 - val_loss: 0.1629 - val_accuracy: 0.9512
Epoch 4/5
500/500 [==============================] - 6s 12ms/step - loss: 0.1475 - accuracy: 0.9561 - val_loss: 0.1415 - val_accuracy: 0.9579
Epoch 5/5
500/500 [==============================] - 5s 10ms/step - loss: 0.1211 - accuracy: 0.9644 - val_loss: 0.1221 - val_accuracy: 0.9626
Out[49]:
<tensorflow.python.keras.callbacks.History at 0x7fea6ce927b8>

We end the experiment (here you can see how many assets were uploaded):

In [50]:
experiment.end()
COMET INFO: ---------------------------
COMET INFO: Comet.ml Experiment Summary
COMET INFO: ---------------------------
COMET INFO:   Data:
COMET INFO:     display_summary_level : 1
COMET INFO:     url                   : https://www.comet.ml/dsblank/confusion-matrix/b3c8ac1e374346b2ae7ec385b189772c
COMET INFO:   Metrics [count] (min, max):
COMET INFO:     accuracy [5]                 : (0.778166651725769, 0.9643833041191101)
COMET INFO:     batch_accuracy [250]         : (0.10000000149011612, 0.9666666388511658)
COMET INFO:     batch_loss [250]             : (0.12138020992279053, 2.4744226932525635)
COMET INFO:     epoch_duration [5]           : (5.151233671000227, 6.078110397997079)
COMET INFO:     loss [5]                     : (0.12112180888652802, 0.765855610370636)
COMET INFO:     val_accuracy [5]             : (0.9147999882698059, 0.9625999927520752)
COMET INFO:     val_loss [5]                 : (0.12206530570983887, 0.29144158959388733)
COMET INFO:     validate_batch_accuracy [45] : (0.892479658126831, 0.9916666746139526)
COMET INFO:     validate_batch_loss [45]     : (0.03572890907526016, 0.3586212992668152)
COMET INFO:   Others:
COMET INFO:     trainable_params : 134794
COMET INFO:   Parameters:
COMET INFO:     Optimizer             : RMSprop
COMET INFO:     RMSprop_centered      : 1
COMET INFO:     RMSprop_decay         : 1
COMET INFO:     RMSprop_epsilon       : 1e-07
COMET INFO:     RMSprop_learning_rate : 0.001
COMET INFO:     RMSprop_momentum      : 1
COMET INFO:     RMSprop_name          : RMSprop
COMET INFO:     RMSprop_rho           : 0.9
COMET INFO:     epochs                : 5
COMET INFO:     steps                 : 500
COMET INFO:   Uploads [count]:
COMET INFO:     confusion-matrix [6] : 6
COMET INFO:     images [10]          : 10
COMET INFO:     model graph          : 1
COMET INFO: ---------------------------
COMET INFO: Uploading stats to Comet before program termination (may take several seconds)
COMET INFO: Waiting for completion of the file uploads (may take several seconds)
COMET INFO: Still uploading

First, you'll notice that this trained much faster than the previous example, and that the number of images uploaded was exactly 10. That is because we used one image for each image set, and we reused each of those between epochs.

See the full confusion matrix, complete with sample images in each cell (click on a cell to see the examples):

In [51]:
experiment.display(tab="confusion-matrices")

Note that each cell has up to 25 examples, but they are all the same. We could have used max_examples_per_cell=1 to show only the single set image.
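
That change is a one-line tweak when constructing the matrix, along these lines:

In [ ]:
# Show at most one example per cell, since every example in a set is the same image.
confusion_matrix = MyConfusionMatrix(
    index_to_example_function=index_to_example,
    max_examples_per_cell=1,
)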

Conclusion

We hope that this gives you some ideas of how you can use the Comet Confusion Matrix! If you have questions or comments, feel free to visit the Comet issue tracker and leave us a note.