UI Overview

This section describes Comet.ml's web-based user interface.

Comet.ml organizes all of your runs as Experiments. Workspaces contain Projects which house your Experiments. We will explore each of these concepts in the sections below.

Workspaces

When you first create your Comet.ml account, we create a default workspace for you with your username. This default workspace contains your personal private and public projects. You can also create additional workspaces, and each workspace can have its own set of collaborators.

Info

You can allow collaborators on your personal workspace and create private projects by upgrading your plan. If you ever have any questions, feel free to send us a note at support@comet.ml or on our Slack channel.

To create a new workspace:

  • From any page, click the top right button with your workspace name. Use the dropdown menu to select Switch Workspace -> View all workspaces

  • Click Create a workspace to create a new workspace


  • Name your workspace and then click the newly created workspace to start using it!

Adding Collaborators to a Workspace

From any page, click the top right dropdown menu with your workspace name (initially your username). Use the dropdown menu to select Settings. From the settings menu choose Collaborators and then click Add Collaborators.

Warning

Workspace projects will automatically be shared with, and visible to, all of the collaborators in the workspace. If you set a project to Public, it will also be shared with the rest of the world and accessible to anyone who has the direct link to your workspace.

Rotating API Key

Under the Settings page of a workspace, you can see and rotate your API key for your workspace.

  • In the Developer Information section of Settings, you have access to your API key and REST API key.

  • If you want to rotate your API key, click Generate API key to replace the existing workspace API key, then click Confirm.

  • A new API key will be generated for your workspace.

If any experiments are currently running with your old API key, Comet.ml will stop collecting information from those runs. The old API key will no longer be valid, so be sure to update it anywhere it is used.
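
For example, here is a minimal sketch of supplying the new key when creating an experiment. The key value shown is a placeholder; alternatively, you can set the COMET_API_KEY environment variable instead of passing the key in code.

from comet_ml import Experiment

# Replace the placeholder with your newly generated API key,
# or set the COMET_API_KEY environment variable instead
experiment = Experiment(api_key="NEW-API-KEY")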

Workspace Views

The Workspace can be displayed as either a grid of Project cards, or in a list. In either view, you can:

  • Filter projects by owner, and/or visibility (public vs private)
  • Search projects by name or description
  • Sort projects by name, last updated, experiment counts, or creation date

In the Project List View, you have access to the config button in the right-most column to:

  • Edit a project's name, description, or visibility (public vs private)
  • Delete a project
  • Share a project
  • Change the project's icon image

In the Project Grid View, you have access to the above functionality through the three vertical dots in the upper left-hand corner of each card.

Projects

A Project is a collection of experiments. A Project is either private (viewable and editable by all collaborators with the proper permission level) or public (editable by the owner, and viewable by everyone). You can set a project's public/private visibility under the Manage section of the Project View.

Project View

The Project View will be your primary working area in Comet.ml. Here you can manage, select, and analyze all of the project's experiments, or any subset of them.

Your Project View contains tabs to manage your project’s artifacts including:

  • Experiments: takes you back to the project view.
  • Notes: allows you to make notes in markdown on this project.
  • Files: shows all of the files from the associated experiments. Under the Files tab, clicking "View experiments" shows all experiments that use a particular file. You can also link a GitHub repository to this project.
  • Archive: shows all experiments for this project that have been archived. From here you can permanently delete or restore an experiment or group of experiments.

The Project View also presents useful shortcuts to begin logging and sharing experiments such as:

  • API key: get your personal Comet API key.
  • Add experiment: get detailed code for adding an experiment through the programming libraries.
  • Share: get the shareable URL for this project. Viewers do not need Comet accounts to see the Project View; however, their ability to save adjustments to the Project View is limited.

The second row of the Project View shows breadcrumbs that help you navigate between:

  • Current workspace: click it to see all projects in the workspace
  • Current project: the name of the current project

There are 5 sections for each project:

  1. Experiments: the main Project view; shows filter, project charts, and experiment table.
  2. Notes: a page of markdown notes for the project
  3. Files: a list of all of the files in a project, and the experiments that use them
  4. Manage: control the visibility of a project and create shareable links
  5. Archive: a list of archived experiments for this project

Manage Section

The Manage section allows you to control a project's visibility. Here, you can make a project public (anyone can see the project) or private (you choose who can see the project).

In addition, you can create Shareable Links. These links work, even if a project is private. Each project can have a single, shareable link.

Project Visibility

At any time, you can delete a shareable link by clicking the trashcan icon next to the link.

Shareable Link

Saving Project Views

As you make changes to your Project View, these changes will be saved temporarily with your current URL. At any time in the future you can decide to save or abandon those changes.

To save a View:

  1. Click Save View
  2. Decide whether to save your changes to the current view, or to a new view by selecting "Create new template" and giving it a name
  3. Click Save

To abandon a View, simply click the Discard link.

If you have made a Project visible (public), then you can share links with a particular view selected. Simply create the view and share the URL. The URL contains information about which view to use.

Info

Items that are saved with the view: selected queries and filters; Experiment Table columns, their position, and widths; and the Project Visualizations section's expand/collapse setting and charts. Items that are not saved: selected areas in charts.

To make a custom view the default for this project:

  1. Click the Change View dropdown
  2. Hover over the new custom project view option in the dropdown
  3. Click the check icon to set that view as the default

Now the newly saved default view will be used to determine which project view we render for you. At any time you can also delete a project view by selecting it under the view button and clicking the delete icon.

If you make changes to this view and you wish to keep them, you will need to:

  1. Click Save View.
  2. Select Save changes to the view… option, or Create a new view.
  3. Click the Save or Create button.

Navigate to the next section of the documentation to learn more about the Query Builder and Project Visualizations. We will also cover the Experiments Table in a subsequent section.

Query Builder

The Query Builder allows you to select exactly which experiments you would like to see in the Project Visualizations and Experiments Table: only experiments that match the filters you select will appear there.

To begin building and saving queries, use the Query Builder actions:

  • Query selection dropdown (initially set to "All experiments")
  • + Add filter button
  • Clear all button
  • Save query

Adding Filters

Queries are composed of one or multiple filters. To add a filter, simply press the + Add filter button and select which experiment attribute you would like to set as the filter. Depending on the attribute’s data type, we expose different operators such as contains, is null, begins with, boolean values, and more.

Saving and Loading Queries

To save a set of filters, select the Save query button on the right. You will be prompted with a "Save Query" modal dialog box where you can enter a Query name.

If you make changes to a Saved Query and you wish to keep them, you will need to:

  1. Click Save Query.
  2. Select Save changes to the query [Query Name]… option, or Create new query.
  3. Click the Save button.

In order to load a saved Query, select the Queries dropdown to expose a list of your Saved Queries. To return to a project view with all experiments, select the All experiments option in this dropdown.

Project Visualizations

Project Visualizations allow you to view and compare the performance across multiple experiments. Comet.ml currently supports four types of visualizations: line charts, bar charts, scatter charts, and parallel coordinates charts.

Project Visualizations

To begin creating charts, press the + Add Chart button in the Project Visualizations area. As the chart parameters are populated, we render a preview of the chart for you. As referenced in the previous section on the Query Builder, the filters you select will impact which experiments appear in the Project Visualization area.

Info

The charts on the Project View page are editable. If you wish to change an aspect of a chart, simply press on the three-dot option in the top right of the chart and select Edit chart. In this dropdown, you will also have the option to delete the chart, export it as a JPEG or SVG, or reset the zoom.

Line Charts

The Line Chart type at the project level allows you to visualize a metric (the Y axis) against time (the X axis: duration, wall time, or step), or against another metric.

You can transform the X or Y axis using smoothing, log scale, or a moving average.

At the project level chart editor, you can set the Chart Name, Legend Keys (experiment names or experiment keys), and Outliers (show or ignore).

Click Reset to clear all settings, Done when complete, or outside the window area to abort the new chart process.

Project Visualizations

Bar Charts

The Bar Chart type at the project level allows you to visualize a metric's aggregation (the Y axis bar height, sum by default) against (the X axis) experiments.

You can change the aggregation to count, sum, average, median, mode, RMS, or standard deviation.

At the project level chart editor, you can set the Chart Name, and Legend Keys (experiment names or experiment keys).

Click Reset to clear all settings, Done when complete, or outside the window area to abort the new chart process.

Project Visualizations

Scatter Charts

The Scatter Chart type at the project level allows you to visualize a metric or parameter (the Y axis) against (the X axis) another metric or parameter.

You can transform the X or Y axis using smoothing, log scale, moving average, minimum value, maximum value, first value logged, or last value logged.

At the project level chart editor, you can set the Chart Name, Legend Keys (experiment names or experiment keys), and Outliers (show or ignore).

Click Reset to clear all settings, Done when complete, or outside the window area to abort the new chart process.

Project Visualizations

Parallel Coordinates Charts

The Parallel Coordinates Chart type at the project level allows you to visualize a series of metrics or parameters where the Y axis is metric or parameter name, and the X axis is the value of that metric/parameter. The far right vertical line is called the "Target Variable" and is typically the loss or accuracy that you are interested in. This value is often the metric that you are optimizing in a hyperparameter search.

At the project level chart editor, you can set the Chart Name.

Click Reset to clear all settings, Done when complete, or outside the window area to abort the new chart process.

Parallel Coordinates

Mouse over

If you mouse over a line or bar in a chart, the associated Experiment name or key will be shown.

Zooming In

To zoom in on a particular region of a chart, simply click and drag to select the region. The selected area will enlarge so that you can see its details. Double-click the chart to reset it to the original view.

Legend values (names vs keys)

You can optionally display Experiment Keys or Experiment names on the Project Chart legend. For existing charts, select the vertical three dots in the upper corner of the chart, and select Edit chart. When editing or creating a chart, select experiment names or experiment keys under Legend Keys.

You can edit an experiment's name by clicking on the pencil icon as you mouse over the custom name column in the Experiment Table or in the single Experiment View. If you have not set a name for your Experiment, we set your Experiment Key as the default Experiment name.

For more information on setting the Experiment name programmatically, see the Experiment.set_name() function of the Python SDK.
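
For example, a minimal sketch of naming an experiment (the name shown is arbitrary):

from comet_ml import Experiment

experiment = Experiment()
# Give the experiment a human-readable name instead of the generated key
experiment.set_name("baseline-lr-0.001")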

Experiments

An Experiment is a unit of measurable research that defines a single run with some data/parameters/code/results. Creating an Experiment object in your code will report a new experiment to your Comet.ml project. Your Experiment will automatically track and collect many things and will also allow you to manually report anything.
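
For example, a minimal sketch of creating an Experiment. The workspace and project names are placeholders; api_key can be omitted if it is set in your Comet config or the COMET_API_KEY environment variable.

from comet_ml import Experiment

# Reports a new experiment to the given project
experiment = Experiment(
    api_key="YOUR-API-KEY",
    workspace="my-workspace",
    project_name="my-project",
)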

Info

By default, Experiments are given a random ID for a key. If you would prefer a more human-readable form, you can also give the Experiment a name. You can edit an experiment's name by clicking on the pencil icon as you mouse over the name on the Experiment View page. Also, you can optionally configure a Chart to show the Experiment name rather than the key.

For more information on how to log experiments to Comet, see Quick Start.

Experiment Table

The Experiment Table is a fully-customizable tabular form of your experiments for a given Project. The default columns are: Checkbox, Status, Visible, Color, Tags, Experiment key, Server end time, File name, and Duration. However, you can update the Experiment Table with additional columns.

The default columns have the following meaning:

  • Checkbox: Indicates whether the experiment is selected for bulk operations (see below).
  • Status: A green check mark indicates that the experiment is completed; spinning arrows indicate that the experiment is still active.
  • Visible: A dark eye indicates that the experiment appears in the project charts above; a grayed eye with a slash indicates that it does not.
  • Color: The color assigned to this experiment. Comet allows you to assign a particular color to an experiment so that it is easy to spot across all charts.
  • Tags: The tags that have been added to this experiment.
  • Experiment key: The key (or name) given to this experiment.
  • Server end time: The date/time that the experiment ended.
  • File name: The name of the top-level program that created this experiment.
  • Duration: The total length of time the experiment took from start to finish.

To change the color for an experiment, click the colored square in the Color column. That will open the color picker window so that you can pick another color. Clicking "Save" will automatically update the color for that experiment in all the charts in the view. Here is the Color selection dialog:

You can perform bulk actions on a selection of Experiments by checking the selection box at the far left of the experiment row. The actions you may perform on your Experiments are:

  • Archive: soft-delete Experiments. Navigate to the Project Archive tab to either restore or permanently delete your archived Experiments.
  • Move: move your Experiment to another Project
  • Tag: add a tag to your Experiment. Select many experiments to tag them all. You can also programmatically populate tags with the Experiment.add_tag() function (see the sketch after this list). To create a new tag, just enter the text and press Enter.
  • Show/Hide: adjust your Experiment’s visibility in the Project Visualizations. The Visible indicator in the Experiment Table is clickable and will either show or hide the Experiment from the Project Visualizations above. You can also select the Experiments in the Project Visualizations to show/hide them.
  • Diff: If you select exactly two Experiments, you can select the Diff button for a detailed comparison of the two. The comparison covers all aspects of the experiments, including Code and Charts, and is a very useful tool for exploring the differences between two experiments.
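
Here is a minimal sketch of tagging an experiment programmatically (the tag names are arbitrary):

from comet_ml import Experiment

experiment = Experiment()
# Add a single tag, or several tags at once
experiment.add_tag("baseline")
experiment.add_tags(["resnet", "lr-0.001"])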

In addition to the above actions, you can also apply these actions to all of the experiments:

  • Group By: group the Experiments by this column
  • Customize Columns: add or remove columns from the Experiment Table

To view a single Experiment in a browser page, select the Experiment key or Experiment Name value in the Experiment Table.

Experiment Tabs

Each Experiment contains a set of tabs where you can view additional details about the Experiment:

Charts Tab

The Charts tab shows all of your charts for this experiment. You can add as many charts as you like by clicking the Add Chart button. Like the Project View, you can set (and save) the View, set the smoothing level, set the X axis, and how to handle outliers.

In addition to the Add Chart button on the Charts tab, you can also change the color that is associated with a metric.

Simply click the Change Colors link, select the metric name, and select the color from the color palette.

For many Machine Learning frameworks (such as tensorflow, fastai, pytorch, sklearn, mlflow, etc.), many metrics and hyperparameters are automatically logged for you. However, you can also log any metric, parameter, or other value manually using the Python SDK logging methods:
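
For example, a minimal sketch of manual logging (the names and values shown are arbitrary):

from comet_ml import Experiment

experiment = Experiment()

# Log hyperparameters individually or as a dictionary
experiment.log_parameter("learning_rate", 0.001)
experiment.log_parameters({"batch_size": 64, "optimizer": "adam"})

# Log metrics, optionally with a step, individually or as a dictionary
experiment.log_metric("train_loss", 0.25, step=100)
experiment.log_metrics({"accuracy": 0.91, "val_loss": 0.31}, step=100)

# Log any other value
experiment.log_other("dataset_version", "v2")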

To see more information on Machine Learning frameworks, see the advanced section on frameworks and how to control [auto logging](python-sdk/Experiment/#experiment__init__) when creating an experiment.

Code Tab

The Code tab contains the source of the program used to run the Experiment. Note that this is only the immediate code that surrounds the Experiment() creation: the script, Jupyter Notebook, or module that instantiated the Experiment.

Info

Note that if you run your experiment from a Jupyter-based environment (such as ipython, JupyterLab, or Jupyter Notebook), the Code tab will contain the exact history that was run to execute the Experiment. In addition, you can download this history as a Jupyter Notebook under the Reproduce button on the Experiment View. See The Reproduce Button section below for more details.

Hyperparameters Tab

The Hyperparameters tab shows all of the hyperparameters logged during this experiment. Although you may log a hyperparameter multiple times over the run of an experiment, this tab will only show the last reported value.

To retrieve all of the hyperparameters and values, you may use the REST API, also available in the Python SDK.
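
As a hedged sketch of the Python SDK route, you could use the API and APIExperiment classes; the workspace, project, and experiment key below are placeholders, and the get_parameters_summary() method is assumed to be available on the returned experiment object.

from comet_ml.api import API

api = API()  # reads your API key from your Comet config or COMET_API_KEY
api_experiment = api.get_experiment("my-workspace", "my-project", "my-experiment-key")

# Returns a list of dictionaries, one per logged hyperparameter
for param in api_experiment.get_parameters_summary():
    print(param)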

Metrics Tab

The Metrics tab shows all metrics that were logged during the experiment, in a table with the following attributes per metric:

  • Name: name of metric
  • Value: value of metric
  • Min Value: minimum value of the metric over the experiment
  • Max Value: maximum value of the metric over the experiment

To retrieve all of the values, you may use the REST API, also available in the Python SDK.
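
Similarly, a hedged sketch using the same assumed APIExperiment object as in the Hyperparameters example above, here assuming a get_metrics() method:

# Each entry describes one logged metric value (name, value, step, timestamp)
for metric in api_experiment.get_metrics():
    print(metric)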

Graph Definition Tab

The Graph Definition tab will show the model graph description, if available. Note that only one model per experiment is currently supported.

Output Tab

The Output tab will display all of the standard output during the run of an experiment.

System Metrics Tab

The System Metrics tab shows items such as operating system user and type, Python version, IP address, and hostname. In addition, the System Metrics tab will show charts of your CPU and GPU usage over the course of your experiment. Note that the CPU and GPU usages are reported to Comet.ml about every 60 seconds.

CPU Charts

Installed Packages Tab

The Installed Packages tab lists all of the installed Python packages and their version numbers. Note that the format is the same as used by pip freeze for easily creating reproducible experiments.

Notes Tab

The Notes tab is available for markdown notes on this Experiment. Notes can only be made through the web UI.

HTML Tab

The HTML tab is available for appending HTML-based notes during the run of the Experiment. You can add to this tab by using the Experiment.log_html() and Experiment.log_html_url() methods from the Python SDK.
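
For example, a minimal sketch (the HTML content and URL are arbitrary):

from comet_ml import Experiment

experiment = Experiment()
# Append an HTML fragment to the HTML tab
experiment.log_html("<h2>Validation results</h2><p>See the table below.</p>")
# Append a link as well
experiment.log_html_url("https://www.comet.ml")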

Graphics Tab

The Graphics tab will be visible when an Experiment has associated images. These are uploaded using the Experiment.log_image() method in the Python SDK (a logging sketch appears at the end of this section). To find matching images, you may:

  • Sort by Name or Step
  • Order by Ascending or Descending
  • Filter by figure name (including Exact and Regex matches)
  • Filter by step

In addition, you may automatically advance the step filter by pressing the next, previous, and play buttons.
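
Here is a minimal sketch of logging an image with Experiment.log_image(); the file name, image name, and step are arbitrary:

from comet_ml import Experiment

experiment = Experiment()
# Upload an image file; name and step make it easier to filter in the Graphics tab
experiment.log_image("outputs/prediction.png", name="prediction", step=10)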

Audio Tab

The Audio tab shows the audio waveform uploaded with the Experiment.log_audio() method in the Python SDK.

In the Audio tab, you can:

  • View the audio files in ascending or descending order
  • Sort by name or step
  • Group by name, step, or context
  • Search by name using either exact match or regular expressions
  • Filter by step

You can play each waveform by hovering over the visualization and pressing the play indicator. Click anywhere in the waveform to start playing from that point in the recording.
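
Here is a minimal sketch of logging an audio file with Experiment.log_audio(); the file name is arbitrary, and passing a file path is assumed to be supported:

from comet_ml import Experiment

experiment = Experiment()
# Upload an audio file so its waveform appears in the Audio tab
experiment.log_audio("samples/prediction.wav")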

Text Tab

The Text tab shows strings logged with the Experiment.log_text() method in the Python SDK.

You can use Experiment.log_text(TEXT, metadata={...}) to keep track of any kind of textual data. For example, for a Natural Language Processing (NLP) model, you could log a subset of the data for easy examination. Any metadata logged with the text will show in the columns next to the text logged.
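
For example, a minimal sketch (the text and metadata are arbitrary):

from comet_ml import Experiment

experiment = Experiment()
# Log a text sample along with metadata; the metadata appears as extra columns
experiment.log_text(
    "The movie was surprisingly good.",
    metadata={"label": "positive", "split": "validation"},
)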

Confusion Matrices Tab

The Confusion Matrices tab shows so-called "confusion matrices" logged with the Experiment.log_confusion_matrix() method in the Python SDK. The confusion matrix is useful for showing the results of categorization problems, like the MNIST digit classification task. It is called a "confusion matrix" because the visualization makes it clear which categories have been confused with the others. In addition, the Comet Confusion Matrix can also easily show example instances for each cell in the matrix.

Here is a confusion matrix after one epoch of training a neural network on the MNIST digit classification task:

As shown, the default view shows the "confusion" counts between the actual categories (correct or "true", by row) and the predicted categories (output produced by the network, by column). If there were no confusion, all of the tested patterns would fall into the diagonal cells running from top left to bottom right. In the above image, you can see that the actual patterns of 0's (across the top row) have 966 correct classifications (upper, left-hand corner), but that 5 patterns were "predicted" (designated by the model) as 6's.

If you hold your mouse over the cell at row 0, column 6, you will see a popup window similar to the following:

This window indicates that there were 5 instances of a 0 digit that were confused with being a 6 digit. In addition, you can see the counts, and percentages of the cell by row and by column.

If you would like to see some examples of those zeros misclassified as sixes (and the confusion matrix was logged appropriately), simply click on the cell. A window with examples will be displayed, as follows:

You can open multiple example windows by simply clicking on additional cells. To see a larger version of an image, click on the image in the example view. There you will also be able to see the index number (the position in the training or test set) of that pattern.

Some additional notes on the Confusion Matrices tab. You can:

  • close all open Example Views by clicking "Close all example views" in the upper, right-hand corner
  • move between multiple logged Confusion Matrices by selecting the name in the selection on upper left
  • display counts (blue), percents by row (green), or percents by column (dark yellow) by changing "Cell value"
  • control what categories are displayed (e.g., select a subset) using Experiment.log_confusion_matrix(selected=[...])
  • compute confusion matrices between hundreds or thousands of categories (only the 25 most confused categories will be shown)
  • display text, URLs, or images for examples in any cell

See the Comet Confusion Matrix tutorial for details on logging your own confusion matrices, and the Experiment.log_confusion_matrix() documentation.
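
For example, a minimal sketch (the labels and predictions are toy values):

from comet_ml import Experiment

experiment = Experiment()

y_true = [0, 1, 2, 2, 1, 0]
y_predicted = [0, 1, 2, 1, 1, 0]

# Log a confusion matrix computed from true and predicted categories
experiment.log_confusion_matrix(
    y_true=y_true,
    y_predicted=y_predicted,
    labels=["cat", "dog", "bird"],
)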

Histograms Tab

The Histograms tab shows time series histograms uploaded with the Experiment.log_histogram_3d() method in the Python SDK. Time series are grouped together by the name given, and require that a step value has been set. Step values should be unique and increasing. If no step has been set, the histogram will not be logged.

Each histogram shows all of the values of a list, tuple, or array (any size or shape). The items are divided into bins based on their individual values, and each bin keeps a running total. The time series runs from the earliest (lowest step) in the back to the most recent (highest step) in the front.

Time series histograms are very useful for seeing weights or activations change over the course of learning.
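
For example, a minimal sketch of logging a layer's weights at each step; the random values stand in for your model's actual weights:

import numpy as np
from comet_ml import Experiment

experiment = Experiment()

for step in range(1, 11):
    # In practice these would be your model's weights at this step
    weights = np.random.normal(0, 1, size=1000)
    # The name groups the histograms; step values must be unique and increasing
    experiment.log_histogram_3d(weights, name="dense_1/kernel", step=step)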

To remove a histogram chart from the Histograms tab, click on the chart's options under the three vertical dots in its upper right-hand corner, and select "Delete chart". To add a histogram chart back to the Histograms tab, click on the "Add Chart" button on the view. If it is disabled, then that means that there are no additional histograms to view.

Assets Tab

The Assets tab will list all of the images and other assets associated with an experiment. These are uploaded to Comet.ml by using the Experiment.log_image() and Experiment.log_asset() methods from the Python SDK, respectively.

The Stop Button

The Experiment Stop button allows you to stop an experiment that is running on your computer, cluster, or a remote system while it is reporting to Comet.ml. The running experiment will receive the message and raise an InterruptedExperiment exception within a few seconds (usually fewer than 10).

If you don't need to handle the exception, you can simply let the script end as usual, just as if you had pressed Control+C. However, if you would like to handle the interruption, you can do that as well. Here is an example showing a running experiment and catching the exception. You could run custom code in the except clause if you wished.

from comet_ml import Experiment
from comet_ml.exceptions import InterruptedExperiment

experiment = Experiment()

try:
    # model is assumed to be defined earlier (for example, a Keras model)
    model.fit()
except InterruptedExperiment as exc:
    # The experiment was stopped from the UI; record the status
    experiment.log_other("status", str(exc))
    # other cleanup

# Save the model and upload it to Comet.ml as an asset
model.save("my_model.h5")
experiment.log_asset("my_model.h5")

See also the API.stop_experiment() method.
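
As a hedged sketch, stopping an experiment from code might look like the following; the experiment key is a placeholder, and passing it directly to stop_experiment() is an assumption.

from comet_ml.api import API

api = API()
# Stop the running experiment with this (placeholder) experiment key
api.stop_experiment("my-experiment-key")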

The Reproduce Button

The Experiment Reproduce button allows you to go back in time and re-run your experiment with the same code, command and parameters.

The Reproduce button is available on the single experiment page:

Reproduce Button

The reproduce screen contains the following information:

  • Environment information: IP, hostname, and user name.
  • Git information: Link to last commit, current branch and a patch with uncommitted changes.
  • Reproduce: The list of commands required to restore the working directory to the state it was in when the experiment was run.
git checkout master

git checkout b2628a10b7e678392ab378e07defdcabb54ab9dc

curl "https://www.comet.ml/api/rest/v1/git/get-patch?experimentKey=someKey" \
  -H "Authorization: RestApiKey" > patch.zip

unzip patch.zip

git apply git_diff.patch

Info

The set of commands above checks out the correct branch and commit. If there were uncommitted changes in your code when the experiment was originally launched, the patch allows you to include those as well.

  • Run command: used to start this experiment
/path/to/python /home/train/keras.py someArg anotherArg

Troubleshooting

If you see the message "Some invalid values were ignored" below a chart, then Comet has detected Infinity, -Infinity, or other invalid values in the logged data (such as a metric or parameter). For display purposes we have skipped over these values. However, you can still get the entire series of data using the REST API.