What We’re Reading: 2/22/21
The most recent reads from the Comet team feature image generation models, scaling Kubernetes, testing RNN architectures, and lessons from building over 150 successful industry ML models.
DALL·E: Creating Images from Text
This awesome post from the OpenAI team covers the neural network model they created to generate images from text captions. It’s a great long read (with lots of interesting generated images) that shows off the model’s capabilities.
Dhruv Nair, Data Scientist
Scaling Kubernetes to 7,500 Nodes
Also from OpenAI comes this awesome blog on scaling Kubernetes as few have done before. It extends their earlier work on scaling to a 2,500-node Kubernetes cluster and details how they ran cluster health checks.
Boris Feld, Senior Software Engineer
An Empirical Exploration of Recurrent Network Architectures
This paper digs into which of over 10,000 Recurrent Neural Network architectures performs best. RNNs are extremely powerful sequence models that are often difficult to train. The Long Short-Term Memory (LSTM) is a specific RNN architecture, popular for Natural Language Processing tasks, whose design makes it much easier to train.
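To make the LSTM's training advantage concrete: its gates let the cell state update additively, which helps gradients flow through long sequences. Below is a minimal NumPy sketch of a single LSTM step, using a hypothetical stacked weight layout (all four gates in one matrix) purely for illustration; real implementations (e.g. in the frameworks the paper evaluates) differ in details.

```python
import numpy as np

def lstm_cell(x, h_prev, c_prev, W, b):
    """One LSTM step: gates control what to forget, write, and expose.

    Assumed (illustrative) layout: W stacks the forget, input, output,
    and candidate weights row-wise; each acts on [x; h_prev].
    """
    z = W @ np.concatenate([x, h_prev]) + b
    n = len(h_prev)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    f = sigmoid(z[:n])          # forget gate: how much old cell state to keep
    i = sigmoid(z[n:2 * n])     # input gate: how much new candidate to write
    o = sigmoid(z[2 * n:3 * n]) # output gate: how much cell state to expose
    g = np.tanh(z[3 * n:])      # candidate cell update
    c = f * c_prev + i * g      # additive update eases gradient flow over time
    h = o * np.tanh(c)          # hidden state passed to the next step
    return h, c
```

The additive form of the cell update (`c = f * c_prev + i * g`) is the key design choice: unlike a vanilla RNN's fully multiplicative recurrence, gradients can pass through the cell state largely unattenuated when the forget gate is open.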
Doug Blank, Head of Research
150 successful machine learning models: 6 lessons learned at Booking.com
This paper is a great introduction to the complexities of building ML models. Some of the best takeaways are that model performance is not the same as business performance, and that you should get feedback on model prediction quality early in the development process.
Ayodele Odubela, Data Science Evangelist/Advocate
Join us for Data Science Office Hours every Sunday. Ask any of your data science questions and join in great discussions on advancements in ML with Comet’s own Ayodele Odubela and host of the Artists of Data Science Podcast, Harpreet Sahota. Sign Up
If you’re someone who spends time building machine learning models, try Comet for free and see for yourself how much time you save running and documenting ML experiments.