What We’re Reading: 2/8/21
This week’s reads from the Comet team feature some laughs, BERT models and biology, and neural networks with attention spans.
Automating my job by using GPT-3 to generate database-ready SQL to answer business questions
Does doing your day job take a little too long? Check out this hilarious blog post about how you can use GPT-3 to generate SQL queries for you.
Boris Feld, Senior Software Engineer
Computing Extremely Accurate Quantiles Using t-Digests
We present on-line algorithms for computing approximations of rank-based statistics that give high accuracy, particularly near the tails of a distribution, with very small sketches. This new algorithm is robust with respect to skewed distributions or ordered datasets and allows separately computed summaries to be combined with no loss in accuracy.
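To give a feel for the idea behind t-digests, here is a deliberately simplified, illustrative sketch (not the paper's actual algorithm, which uses a proper scale function and incremental merging). The key intuition it preserves: centroids are kept small near the tails of the distribution (q near 0 or 1) and allowed to grow large in the middle, which is what makes tail quantiles accurate with a tiny summary. The class name, the `compression` parameter, and the weight-cap formula below are assumptions made for this toy version.

```python
class SimpleTDigest:
    """A simplified t-digest-style quantile sketch (illustrative only).

    Values are clustered into (mean, weight) centroids. Each centroid's
    weight is capped at roughly 4 * n * q * (1 - q) / compression, so
    clusters stay tiny near the tails and larger near the median.
    """

    def __init__(self, compression=100):
        self.compression = compression
        self.centroids = []  # list of (mean, weight), sorted by mean
        self.buffer = []     # raw values awaiting compression

    def update(self, x):
        self.buffer.append(x)
        if len(self.buffer) >= self.compression:
            self._compress()

    def _compress(self):
        # Re-cluster all centroids plus buffered points in sorted order.
        points = sorted(self.centroids + [(x, 1) for x in self.buffer])
        self.buffer = []
        total = sum(w for _, w in points)
        merged, cum = [], 0.0
        for mean, w in points:
            q = (cum + w / 2) / total  # approximate quantile of this point
            cap = max(1.0, 4 * total * q * (1 - q) / self.compression)
            if merged and merged[-1][1] + w <= cap:
                # Merge into the previous centroid (weighted mean).
                m, mw = merged[-1]
                nw = mw + w
                merged[-1] = ((m * mw + mean * w) / nw, nw)
            else:
                merged.append((mean, w))
            cum += w
        self.centroids = merged

    def percentile(self, p):
        """Estimate the p-th percentile (0 <= p <= 100)."""
        self._compress()
        total = sum(w for _, w in self.centroids)
        target = p / 100 * total
        cum = 0.0
        for mean, w in self.centroids:
            if cum + w >= target:
                return mean
            cum += w
        return self.centroids[-1][0]
```

Feeding in the integers 1 through 10,000, the sketch stores only a few hundred centroids, yet a tail estimate like the 99th percentile lands much closer to the truth than the median does, because tail centroids hold only a handful of points each.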
BERTology Meets Biology: Interpreting Attention in Protein Language Models
Through the lens of attention, we analyze the inner workings of the Transformer and explore how the model discerns structural and functional properties of proteins. We also present a three-dimensional visualization of the interaction between attention and protein structure.
Aviad Rosenhek, Head of Product
Attentive Neural Processes
Neural Processes (NPs) approach regression by learning to map a context set of observed input-output pairs to a distribution over regression functions. This paper shows that incorporating attention into NPs greatly improves the accuracy of predictions, results in noticeably faster training, and expands the range of functions that can be modeled.
Doug Blank, Head of Research
Responsible Data Science
This short course is a great introduction to the hard ethical questions involved in creating machine learning models. I appreciate the easy-to-understand slides as well as the wonderful readings.
Ayodele Odubela, Data Science Evangelist/Advocate
Don’t forget to sign up for Sunday morning Data Science Office Hours with Harpreet Sahota from Artists of Data Science. Register Here
Stay tuned for more great reads from our community of Data Scientists!