Jeremie Harris's work on A.I. could dramatically alter your perspective on the field of data science and the bewildering — perhaps downright frightening — impact you and A.I. could make together on the world.
Jeremie:
• Recently co-founded Mercurius, an A.I. safety company.
• Has briefed senior political and policy leaders around the world on long-term risks from A.I., including senior members of the U.K. Cabinet Office, the Canadian Cabinet, as well as the U.S. Departments of State, Homeland Security and Defense.
• Hosts the excellent Towards Data Science podcast.
• Previously co-founded SharpestMinds, a Y Combinator-backed mentorship marketplace for data scientists.
• Proudly dropped out of his quantum mechanics PhD to found SharpestMinds.
• Holds a Master’s in biological physics from the University of Toronto.
In this episode, Jeremie details:
• What Artificial General Intelligence (AGI) is.
• How the development of AGI could happen in our lifetime and could present an existential risk to humans, perhaps even to all life on the planet as we know it.
• How, alternatively, if engineered properly, AGI could herald a moment called the singularity that brings with it a level of prosperity that is not even imaginable today.
• What it takes to become an A.I. safety expert yourself in order to help align AGI with benevolent human goals.
• His forthcoming book on quantum mechanics.
• Why almost nobody should do a PhD.
Today’s episode is deep and intense, but as usual it still has a lot of laughs, and it should appeal broadly, whether or not you’re already a technical data science expert.
The SuperDataScience show's available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Clem Delangue on Hugging Face and Transformers
In today's SuperDataScience episode, Hugging Face CEO Clem Delangue fills us in on how open-source transformer architectures are accelerating ML capabilities. Recorded for yesterday's ScaleUp:AI conference in NY.
What Probability Theory Is
This week, we start digging into the actual, uh, theory of Probability Theory. I also highlight the field's relevance to Machine Learning and Statistics. Enjoy 😀
We will publish a new video from my "Probability for Machine Learning" course to YouTube every Wednesday. Playlist is here.
More detail about my broader "ML Foundations" curriculum (which also covers subject areas like Linear Algebra, Calculus, Statistics, Computer Science) and all of the associated open-source code is available on GitHub here.
Daily Habit #8: Math or Computer Science Exercise
This article was originally adapted from a podcast, which you can check out here.
At the beginning of the new year, in Episode #538, I introduced the practice of habit tracking and provided you with a template habit-tracking spreadsheet. Then, we had a series of Five-Minute Fridays that revolved around daily habits I espouse, and that theme continues today. The habits we covered in January and February were related to my morning routine.
Starting last week, we began covering habits related to intellectual stimulation and productivity. Specifically, last week’s habit was “reading two pages”. This week, we’re moving on to doing a daily technical exercise; in my case, this is either a mathematics, computer science, or programming exercise.
The reason I have this daily-technical-exercise habit is that data science is both a limitlessly broad and an ever-evolving field. If we keep learning on a regular basis, we can expand our capabilities and open doors to new professional opportunities. This is one of the driving ideas behind the #66daysofdata hashtag, which — if you haven’t heard of it before — is detailed in episode #555 with Ken Jee, who originated the now-ubiquitous hashtag.
A Brief History of Probability Theory
This week's YouTube video is a quick introduction to the fascinating history of Probability Theory. Next week, we'll actually start digging into Probability Theory, uh, theory 😉
We will publish a new video from my "Probability for Machine Learning" course to YouTube every Wednesday. Playlist is here.
More detail about my broader "ML Foundations" curriculum (which also covers subject areas like Linear Algebra, Calculus, Statistics, Computer Science) and all of the associated open-source code is available on GitHub here.
Engineering Data APIs
How you design a data API from scratch and how a data API can leverage machine learning to improve the quality of healthcare delivery are topics covered by Ribbon Health CTO Nate Fox in this week's episode.
Ribbon Health is a New York-based API platform for healthcare data that has raised $55m, including from some of the biggest names in venture capital like Andreessen Horowitz and General Catalyst.
Prior to Ribbon, Nate:
• Worked as an Analytics Engineer at the marketing start-up Unified.
• Was a Product Marketing Manager at Microsoft.
• Obtained a mechanical engineering degree from the Massachusetts Institute of Technology and an MBA from Harvard Business School.
In this episode, Nate details:
• What APIs ("application programming interfaces") are.
• How you design a data API from scratch.
• How Ribbon Health’s data API leverages machine learning models to improve the quality of healthcare delivery.
• How to ensure the uptime and reliability of APIs.
• How scientists and engineers can make a big social impact in health technology.
• His favorite tool for easily scaling up the impact of a data science model to any number of users.
• What he looks for in the data scientists he hires.
Today’s episode has some technical data science and software engineering elements here and there, but much of the conversation should be interesting to anyone who’s keen to understand how data science can play a big part in improving healthcare.
Probability & Information Theory — Subject 5 of Machine Learning Foundations
Last Wednesday, we released the final video of my Calculus course, so today we begin my all-new YouTube course on Probability and Information Theory. This first video is an orientation to the course curriculum. Enjoy!
We will publish a new video from my "Probability for Machine Learning" course to YouTube every Wednesday. Playlist is here.
More detail about my broader "ML Foundations" curriculum (which also covers subject areas like Linear Algebra, Calculus, Statistics, Computer Science) and all of the associated open-source code is available on GitHub here.
GPT-3 for Natural Language Processing
With its human-level performance on tasks as diverse as question-answering, translation, and arithmetic, GPT-3 is a game-changer for A.I. This week's brilliant guest, Melanie Subbiah, was a lead author of the GPT-3 paper.
GPT-3 is a natural language processing (NLP) model with 175 billion parameters that has demonstrated unprecedented and remarkable "few-shot learning" on the diverse tasks mentioned above (translation between languages, question-answering, performing three-digit arithmetic) as well as on many more (discussed in the episode).
Melanie's paper sent shockwaves through the mainstream media and was recognized with an Outstanding Paper Award from NeurIPS (the most prestigious machine learning conference) in 2020.
Melanie:
• Developed GPT-3 while she worked as an A.I. engineer at OpenAI, one of the world’s leading A.I. research outfits.
• Previously worked as an A.I. engineer at Apple.
• Is now pursuing a PhD at Columbia University in the City of New York specializing in NLP.
• Holds a bachelor's in computer science from Williams College.
In this episode, Melanie details:
• What GPT-3 is.
• Why applications of GPT-3 have transformed not only the field of data science but also the broader world.
• The strengths and weaknesses of GPT-3, and how these weaknesses might be addressed with future research.
• Whether transformer-based deep learning models spell doom for creative writers.
• How to address the climate change and bias issues that cloud discussions of large natural language models.
• The machine learning tools she’s most excited about.
This episode does have technical elements that will appeal primarily to practicing data scientists, but Melanie and I put effort into explaining concepts and providing context wherever we could, so hopefully much of this fun, laugh-filled episode will be engaging and informative to anyone who’s keen to learn about the state of the art in natural language processing and A.I.
Jon’s Answers to Questions on Machine Learning
The wonderful folks at the Open Data Science Conference (ODSC) recently asked me five great questions on machine learning. I thought you might like to hear the answers too, so here you are!
Their questions were:
1. Why does your educational content focus on deep learning and on the foundational subjects underlying machine learning?
2. Would you consider deep learning to be an “advanced” data science skill, or is it approachable to newcomers/novice data scientists?
3. What open-source deep learning software is most dominant today?
4. What open-source deep learning software are you looking forward to using more?
5. Do you have a case study where you've used deep learning in practice?
ODSC's blog post of our Q&A is here.
SuperDataScience Podcast LIVE at MLconf NYC and ScaleUp:AI!
It's finally happening: the first-ever SuperDataScience episodes filmed with a live audience! On March 31 and April 7 in New York, you'll be able to react to guests and ask them questions in real-time. I'm excited 🕺
The first live, in-person episode will be filmed at MLconf NYC on March 31st. The guest will be Alexander Holden Miller, an engineering manager at Facebook A.I. Research who leads bleeding-edge work at mind-blowing intersections of deep reinforcement learning, natural language processing, and creative A.I.
A week later on April 7th, another live, in-person episode will be filmed at ScaleUp:AI. I'll be hosting a panel on open-source machine learning that features Hugging Face CEO Clem Delangue.
I hope to see you at one of these conferences, the first I'll be attending in over two years! Can't wait. There are more live SuperDataScience episodes planned for New York this year and hopefully it won't be long before we're recording episodes live around the world.
My Favorite Calculus Resources
It's my birthday today! In celebration, I'm delighted to be releasing the final video of my "Calculus for Machine Learning" YouTube course. The first video came out in May and now, ten months later, we're done! 🎂
We published a new video from my "Calculus for Machine Learning" course to YouTube every Wednesday since May 6th, 2021. So happy that it's now complete for you to enjoy. Playlist is here.
More detail about my broader "ML Foundations" curriculum (which also covers subject areas like Linear Algebra, Probability, Statistics, Computer Science) and all of the associated open-source code is available on GitHub here.
Starting next Wednesday, we'll begin releasing videos for a new YouTube course of mine: "Probability for Machine Learning". Hope you're excited to get going on it :)
Effective Pandas
Seven-time bestselling author Matt Harrison reveals his top tips and tricks to enable you to get the most out of Pandas, the leading Python data analysis library. Enjoy!
Matt's books, all of which have been Amazon best-sellers, are:
1. Effective Pandas
2. Illustrated Guide to Learning Python 3
3. Intermediate Python
4. Learning the Pandas Library
5. Effective PyCharm
6. Machine Learning Pocket Reference
7. Pandas Cookbook (now in its second edition)
Beyond being a prolific author, Matt:
• Teaches "Exploratory Data Analysis with Python" at Stanford
• Has taught Python at big organizations like Netflix and NASA
• Has worked as a CTO and Senior Software Engineer
• Holds a degree in Computer Science from Stanford University
On top of Matt's tips for effective Pandas programming, we cover:
• How to squeeze more data into Pandas on a given machine.
• His recommended software libraries for working with tabular data once you have too much data to fit on a single machine.
• How having a computer science education and having worked as a software engineer have been helpful in his data science career.
This episode will appeal primarily to practicing data scientists who are keen to learn about Pandas or keen to become an even deeper expert on Pandas by learning from a world-leading educator on the library.
Jon’s Machine Learning Courses
This article was originally adapted from a podcast, which you can check out here.
For last week’s Five-Minute Friday episode, I provided a summary of the various methods of undertaking my deep learning curriculum, be it via YouTube, my book, or the associated GitHub code repository. I mentioned at the end of the episode that while teaching this deep learning content to students online and in person, I discovered that many folks could use a primer on the foundational subjects that underlie machine learning in general and deep learning in particular. So, after publishing all my deep learning content, I set to work on creating content covering the subjects that are critical to understanding machine learning expertly: linear algebra, calculus, probability, statistics, and computer science.
Way back in Episode #474 of this podcast, I detailed why these particular subject areas form the sturdy foundations of what I call the Machine Learning House. As a quick recap, the idea is that to be an outstanding data scientist or ML engineer, it doesn't suffice to only know how to use machine learning algorithms via the abstract interfaces that the most popular libraries (e.g., scikit-learn, Keras) provide. To train innovative models or deploy them to run performantly in production, an in-depth appreciation of machine learning theory may be helpful — or even essential. To cultivate such an in-depth appreciation of ML, one must possess a working understanding of the foundational subjects, which again are linear algebra, calculus, probability, stats, and computer science:
ScaleUp:AI Conference
At ScaleUp:AI in New York next month, I'll be moderating a panel on Open-Source Software that features Hugging Face CEO Clem Delangue. Other speakers include Andrew Ng, Allie K. Miller, and William Falcon.
Thanks to the folks at Insight Partners for putting together this high-octane, two-day event, in which you'll hear from the foremost thought leaders and investors on how to unlock your firm's A.I. growth potential.
So excited to be conferencing in-person again and I hope to be able to meet you there! There is a virtual option as well if you can't make it to New York. Whether in-person or virtual, you can use my code "JKAI35" to get 35% off 😀
Conference details/registration here.
Full speaker list here.
Finding the Area Under the ROC Curve
In this week's tutorial, we use Python code to find the area under the curve of the receiver operating characteristic (the "ROC curve"). This is a machine learning-specific application of integral calculus.
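The tutorial's own code isn't reproduced here, but to give a flavor of the idea, here is a minimal pure-Python sketch (the function name `roc_auc` is illustrative, and it assumes binary 0/1 labels with at least one example of each class): the ROC curve is traced by sweeping the decision threshold from high to low, and the area underneath it is accumulated one trapezoid at a time.

```python
def roc_auc(labels, scores):
    """Area under the ROC curve, accumulated via the trapezoidal rule.

    labels: binary ground-truth values (0 or 1)
    scores: the classifier's real-valued scores (higher = more positive)
    """
    pairs = sorted(zip(scores, labels), reverse=True)  # highest scores first
    pos = sum(labels)          # number of positive examples
    neg = len(labels) - pos    # number of negative examples
    tp = fp = 0
    prev_fpr = prev_tpr = 0.0
    auc = 0.0
    i, n = 0, len(pairs)
    while i < n:
        threshold = pairs[i][0]
        # Group tied scores so they contribute a single ROC point
        while i < n and pairs[i][0] == threshold:
            if pairs[i][1] == 1:
                tp += 1
            else:
                fp += 1
            i += 1
        fpr, tpr = fp / neg, tp / pos
        # Area of the trapezoid between the previous ROC point and this one
        auc += (fpr - prev_fpr) * (tpr + prev_tpr) / 2
        prev_fpr, prev_tpr = fpr, tpr
    return auc
```

In day-to-day work you would typically reach for a library routine such as scikit-learn's `roc_auc_score` instead; the sketch above is just to show the integral-calculus idea at work.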
We publish a new video from my "Calculus for Machine Learning" course to YouTube every Wednesday. Playlist is here.
This is the penultimate video in my Calculus course! After ten months of weekly releases, the final video will come out next week :)
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
Sports Analytics and 66 Days of Data with Ken Jee
Ken Jee — sports analytics leader, originator of the ubiquitous #66daysofdata hashtag, and data-science YouTube superstar (190k subscribers) — is the guest for this week's fun and candid episode ⛳️🏌️
In addition to his YouTube content creation, Ken:
• Is Head of Data Science at Scouts Consulting Group LLC.
• Hosts the "Ken's Nearest Neighbors" podcast.
• Is Adjunct Professor at DePaul University.
• Holds a Master's in Computer Science with an AI/ML concentration.
• Is renowned for starting #66daysofdata, which has helped countless people create the habit of learning and working on data science projects every day.
Today’s episode should be broadly appealing, whether you’re already an expert data scientist or just getting started.
In this episode, Ken details:
• What sports analytics is and specific examples of how he’s made an impact on the performance of athletes and teams with it.
• Where the big opportunities lie in sports analytics in the coming years.
• His four-step process for how someone should get started in data science today.
• His favorite tools for software scripting as well as for production code development.
• How the #66daysofdata challenge can supercharge your capacity as a data scientist, whether you’re just getting started or are already an established practitioner.
Thanks to Christina, 🦾 Ben, Serg, Arafath, and Luke for great questions for Ken!
Jon’s Deep Learning Courses
This article was originally adapted from a podcast, which you can check out here.
Sometimes, during guest interviews, I mention the existence of my deep learning book or my mathematical foundations of machine learning course.
It recently occurred to me, however, that I’ve never taken a step back to detail exactly what content I’ve published over the years and where it’s available if you’re interested in it. So, today I’m dedicating a Five-Minute Friday specifically to detailing what all of my deep learning content is and where you can get it. In next week’s episode, I’ll dig into my math for machine learning content. But, yes, for today, it’s all about deep learning.
Definite Integral Exercise
My recent videos have covered how to find Definite Integrals manually as well as how to find them computationally using Python code. This week's video is an exercise that tests comprehension of both approaches.
We publish a new video from my "Calculus for Machine Learning" course to YouTube every Wednesday. Playlist is here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
The Statistics and Machine Learning Quests of Dr. Josh Starmer
Holy crap, it's here! Joshua Starmer, the creative genius behind the StatQuest YouTube channel (over 675k subscribers!) joins me for an epic episode on stats, ML, and his learning and communication secrets.
Dr. Starmer:
• Provides uniquely clear statistics and ML education via his StatQuest YouTube channel.
• Is Lead A.I. Educator at Grid.ai, a company founded by the creators of PyTorch Lightning that enables you to take an ML model you have on your laptop and train it seamlessly on the cloud.
• Was a researcher at the University of North Carolina at Chapel Hill for 13 years, first as a postdoc and then as an assistant professor, applying statistics to genetic data.
• Holds a PhD in Biomathematics and Computational Biology.
• Holds two bachelor's degrees, one in Computer Science and another in Music.
In this episode filled with silliness and laughs from start to finish, Josh fills us in on:
• His learning and communication secrets.
• The single tool he uses to create YouTube videos with over a million views.
• The software languages he uses daily as a data scientist.
• His forthcoming book, "The StatQuest Illustrated Guide to Machine Learning".
• Why he left his academic career.
• A question you might want to ask yourself to check in on whether you’re following the right life path yourself.
Today’s epic episode is largely high level and so will appeal to anyone who likes to giggle while hearing from one of the most intelligent and creative minds in education on data science, machine learning, music, genetics, and the intersection of all of the above.
Thanks to Serg, Nikolay, Phil, Jonas, and Suddhasatwa for great audience questions!
Numeric Integration with Python
Having detailed how to integrate equations by hand over the past few weeks, this week's video tutorial uses Python code to introduce how to find Definite Integrals computationally — and therefore automatically.
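The code from the video itself isn't reproduced here, but as a minimal sketch of one common approach, the trapezoidal rule (the function name `trapezoid` and the default `n` are illustrative choices): slice the interval [a, b] into n strips, treat each strip as a trapezoid, and sum their areas.

```python
def trapezoid(f, a, b, n=10_000):
    """Approximate the definite integral of f over [a, b] with n trapezoids."""
    h = (b - a) / n               # width of each strip
    total = (f(a) + f(b)) / 2     # the two endpoints each count half
    for i in range(1, n):
        total += f(a + i * h)     # interior points count fully
    return total * h

# The definite integral of x^2 from 0 to 1 is exactly 1/3,
# so the approximation should land very close to that.
approx = trapezoid(lambda x: x * x, 0, 1)
```

In practice, a library routine such as `scipy.integrate.quad` is the usual choice; the sketch above just shows what "computationally — and therefore automatically" means under the hood.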
We publish a new video from my "Calculus for Machine Learning" course to YouTube every Wednesday. Playlist is here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.