Having now covered the product rule, quotient rule, and chain rule, we're well-prepared for advanced exercises that confirm your comprehension of all of the derivative rules in my Machine Learning Foundations series.
There's just one quick derivative rule left after this, one that conveniently combines two of the rules we've already covered, and then we're ready to move on to the next segment of videos on Automatic Differentiation with PyTorch and TensorFlow.
New videos are published every Monday and Thursday to my "Calculus for ML" course, which is available on YouTube here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
TensorFlow vs PyTorch @ DataScienceGO Virtual
The DataScienceGO Virtual conference is coming up next Saturday, and it is FREE! I'm giving a talk on TensorFlow vs PyTorch, with lots of time for audience questions.
Fixing Dirty Data
My guest this week is the fixer of dirty data herself, the one and only Susan Walsh. We have a lot of laughs in this episode as we discuss how organizations can save substantial sums by tidying up their data.
Susan has worked for a decade as a data-quality specialist for a wide range of firms across the private and public sectors. For the past four years, she's been doing this work as the founder and managing director of her own company, The Classification Guru Ltd. She's also the author of the forthcoming book, "Between the Spreadsheets", and she hosts her own video interview show called "Live from the Data Den".
Listen or watch here.
The Chain Rule for Derivatives — Topic 59 of Machine Learning Foundations
Today's video introduces the Chain Rule — arguably the single most important differentiation rule for ML. It facilitates several of the most ubiquitous ML algorithms, such as gradient descent and backpropagation.
Gradient descent and backprop will be covered in great detail later in my "Machine Learning Foundations" video series. This video is critical for understanding those applications.
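As a concrete illustration (my own sketch, separate from the video): for a composite function f(g(x)), the Chain Rule gives the derivative f'(g(x)) * g'(x). The few lines of plain Python below apply it to an example of my choosing and sanity-check the result against a numerical derivative; all function names here are mine.

```python
import math

# Chain rule: d/dx f(g(x)) = f'(g(x)) * g'(x)
# Example: f(u) = sin(u), g(x) = x**2, so the derivative is cos(x**2) * 2x.
def composite(x):
    return math.sin(x ** 2)

def chain_rule_derivative(x):
    return math.cos(x ** 2) * (2 * x)

# Sanity check against a central finite difference
x, h = 1.5, 1e-6
numerical = (composite(x + h) - composite(x - h)) / (2 * h)
print(abs(chain_rule_derivative(x) - numerical) < 1e-6)  # True
```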
New videos are published every Monday and Thursday to my "Calculus for ML" course, which is available on YouTube.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub.
The History of Calculus
Y'all seem to love these "History of..." episodes, so for Five-Minute Friday this week, here's another one. It's on the History of Calculus! Enjoy 😄
(Leibniz and Newton, who independently devised modern calculus around the same time, are pictured.)
Listen or watch here.
The Quotient Rule for Derivatives — Topic 58 of Machine Learning Foundations
This is the penultimate Derivative Rule and then we're moving onward to AutoDiff with TensorFlow and PyTorch! The Quotient Rule is analogous to the Product Rule introduced on Monday but is for division instead of multiplication.
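To make the analogy with the Product Rule concrete, here is a minimal sketch in plain Python (my own example, not from the video) that applies the Quotient Rule, (u/v)' = (u'v - uv') / v^2, and checks it numerically:

```python
# Quotient rule: (u/v)' = (u'*v - u*v') / v**2
# Example: u(x) = x**2, v(x) = x + 1
def f(x):
    return x ** 2 / (x + 1)

def quotient_rule_derivative(x):
    u, du = x ** 2, 2 * x
    v, dv = x + 1, 1
    return (du * v - u * dv) / v ** 2

# Sanity check against a central finite difference
x, h = 2.0, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
print(abs(quotient_rule_derivative(x) - numerical) < 1e-6)  # True
```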
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Upcoming O'Reilly Calculus Classes
Starting a week from today, I'm offering my entire "ML Foundations" curriculum as a series of 14 live, interactive workshops via O'Reilly Media. The first five classes are open for registration; two are already waitlist-only, so grab a spot now:
• Jul 14 — Intro to Linear Algebra (waitlisted)
• Jul 21 — LinAlg II: Matrix Tensors (5 spots remaining)
• Jul 28 — LinAlg III: Eigenvectors (waitlisted)
• Aug 12 — Intro to Calculus (143 spots remaining)
• Aug 18 — Calc II: AutoDiff (148 spots remaining)
REGARDING THE WAITLIST: I have made a request to O'Reilly to increase the maximum class size from 600 students to 1,000, so if you sign up for a waitlisted class now, you should still be able to get in.
Overall, there will be four subject areas covered:
• Linear Algebra (3 classes)
• Calculus (4 classes)
• Probability and Statistics (4 classes)
• Computer Science (3 classes)
Sign-up opens about two months prior to each class. All 14 training dates, running from next week through December, are listed at jonkrohn.com/talks.
A detailed curriculum and all of the code for my ML Foundations series is available open-source in GitHub here.
Financial Data Engineering
This week's guest is Doug Eisenstein, an exceptionally clear and content-rich communicator. He fills us in on the complexity of engineering a coherent source of truth for financial models, integrating hundreds of data sources.
Topics covered in the episode include:
• A breakdown of the primary financial sectors and departments
• Why data source integration for finance is wildly complicated
• Specific data-engineering approaches that resolve these issues, including entity resolution, knowledge-graph mapping, and tri-temporality
Twenty years ago, Doug founded the consulting firm Advanti, which has since become a critical provider of solutions to the complex data-engineering problems faced by some of the world's largest banks and asset managers, including Morgan Stanley, Bank of America, Citibank, and State Street.
Listen or watch here.
The Product Rule for Derivatives
Today's video is on the Product Rule, a relatively advanced Derivative Rule. Only a couple such rules remain and then we move onward to Automatic Differentiation with PyTorch and TensorFlow.
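For a quick taste of the rule (my own example, not one from the video): the Product Rule states (uv)' = u'v + uv'. The plain-Python snippet below applies it and verifies the result against a numerical derivative; the function names are mine.

```python
import math

# Product rule: (u*v)' = u'*v + u*v'
# Example: u(x) = x**2, v(x) = e**x
def f(x):
    return x ** 2 * math.exp(x)

def product_rule_derivative(x):
    return 2 * x * math.exp(x) + x ** 2 * math.exp(x)

# Sanity check against a central finite difference
x, h = 1.0, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
print(abs(product_rule_derivative(x) - numerical) < 1e-6)  # True
```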
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Algorithm Aversion
Exercises on Derivative Rules — Topic 56 of Machine Learning Foundations
Today's YouTube video uses five fun exercises to test your understanding of the derivative rules we’ve covered so far: the Constant Rule, Power Rule, Constant-Multiple Rule, and Sum Rule.
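By way of a warm-up (my own example, separate from the five exercises in the video), all four rules can be applied at once to the function y = 4x^3 + 7, and the result checked numerically in a few lines of plain Python:

```python
# y = 4*x**3 + 7
# Sum rule: differentiate the terms separately.
# Constant-multiple + power rules: d/dx 4*x**3 = 4 * 3*x**2 = 12*x**2
# Constant rule: d/dx 7 = 0
def y(x):
    return 4 * x ** 3 + 7

def dy(x):
    return 12 * x ** 2

# Sanity check against a central finite difference
x, h = 1.0, 1e-6
numerical = (y(x + h) - y(x - h)) / (2 * h)
print(abs(dy(x) - numerical) < 1e-6)  # True
```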
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
If you'd like a detailed walkthrough of the solutions to all the exercises in this video, check out my Udemy course, Mathematical Foundations of Machine Learning. See jonkrohn.com/udemy
Finalist for Technical Author of the Year
While watching last week's Data Community Content Creator Awards, I discovered I was one of three finalists for Favorite Technical Author, alongside the iconic Aurélien Géron and the category winner, Denis Rothman!
This is venerable company and I'm honored — thanks to everyone who voted for me in this category! (The full category name was "Author of Instructional, Technical, or Textbook".)
Many thanks are due to the artist Aglae Bassens, who did an incredible job illustrating "Deep Learning Illustrated", thereby making the book a unique and popular addition to the field. Thanks are also due to co-author Grant Beyleveld who lent his expertise and colorful writing style to many of the topics.
This also couldn't have happened without Debra Williams, Chris Zahn, Julie Nahil, Betsy Hardinger, and many others at Pearson who put extra effort into a book with unusually complex development and production requirements.
Finally thanks again to Kate Strachnyi and Harpreet Sahota for devising and executing the awards ceremony flawlessly.
You can watch the entire ceremony on YouTube here. And you can get more info on my book at deeplearningillustrated.com.
Setting Yourself Apart in Data Science Interviews
For this week's guest episode, I interrogated Andrew Jones on his data science interview secrets. If you want to improve your interview performance — especially if you're in a data-related career — this episode's for you.
Andrew has held a number of senior data roles over the past decade, including at the tech giant Amazon. In those roles, Andrew interviewed hundreds upon hundreds of data scientists, leading him to create his Data Science Infinity educational program, a curriculum that provides you with the hard and soft skills you need to set yourself apart from other data scientists during the interview process.
Listen or watch here.
The Sum Rule for Derivatives
Thus far in this set of videos on Differentiation Rules, we’ve covered the Constant, Power, and Constant-Multiple rules. Today's video is on the Sum Rule. On Thursday, we'll have comprehension exercises on all four key rules!
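As a small taste of the Sum Rule (my own example, not from the video): the derivative of a sum is the sum of the derivatives, (u + v)' = u' + v'. A few lines of plain Python can confirm this numerically; the function names are mine.

```python
import math

# Sum rule: (u + v)' = u' + v'
# Example: u(x) = x**3, v(x) = sin(x)
def f(x):
    return x ** 3 + math.sin(x)

def sum_rule_derivative(x):
    return 3 * x ** 2 + math.cos(x)

# Sanity check against a central finite difference
x, h = 1.0, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
print(abs(sum_rule_derivative(x) - numerical) < 1e-6)  # True
```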
Continuous Calendars
Extremely practical post for you today! It's on the Continuous Calendar, which in my opinion is vastly superior to the standard monthly calendar in every imaginable respect. Click through for more detail.
The Constant Multiple Rule for Derivatives
Continuing my short series on Differentiation Rules, today’s video covers the Constant Multiple Rule. This rule is often used in conjunction with the Power Rule, which was covered in the preceding video, released on Monday.
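To illustrate that pairing (my own example, not one from the video): the Constant Multiple Rule says (c * f(x))' = c * f'(x), and combining it with the Power Rule handles any monomial. The sketch below checks one case numerically in plain Python.

```python
# Constant multiple rule: (c * f(x))' = c * f'(x)
# Combined with the power rule: d/dx 5*x**3 = 5 * 3*x**2 = 15*x**2
def f(x):
    return 5 * x ** 3

def derivative(x):
    return 15 * x ** 2

# Sanity check against a central finite difference
x, h = 2.0, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
print(abs(derivative(x) - numerical) < 1e-6)  # True
```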
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Data Community Content Creator Awards
I was surprised and utterly delighted to be recognized yesterday with the Data Community Content Creator Award in the "Machine Learning and AI" YouTube category. 🥳
From my perspective, my YouTube channel is still in its early days, so while I did not anticipate formal recognition like this perhaps ever, I *certainly* did not expect it so soon after launching the channel. This is a massive, galvanizing signal that I should continue pressing on with this nascent video-creation effort, and I absolutely will!
First off, thank you to everyone who voted. This category was apparently one of the tightest races in this "People's Choice"-style awards show, so your individual vote truly may have tipped the award in my favor.
Many thanks are due to Sangbin Lee and Maria Lee, who have edited, produced, branded, and marketed every single video on my channel since day one. My freely-available YouTube content would not exist without them. Thanks as well to Guillaume Rousseau, who recently joined us and dramatically accelerated how quickly we can publish perfectly-edited videos.
Finally, thanks to Harpreet Sahota and Kate Strachnyi who conceived of the DCCCA show and delivered it with the flair, fun, and precision that we'd expect from them!
The entire ceremony is on YouTube here. And a short recap post is here.
Performance Marketing Analytics
My guest this week is Kris Tait, who fills us in on how data and machine learning have transformed — and will continue to transform — marketing, enabling even small firms to effectively target customers and grow their revenue.
In this episode of the SuperDataScience show, we cover:
• What performance marketing is
• The rapidly shifting digital marketing ecosystem, as well as how data and ML can mitigate the risks associated with these changes
• The sweet spot for augmenting human marketers' skills with machines
• How any firm should define metrics to maximize return on marketing investment, thereby ensuring broader commercial success
• The most useful modern data science tools for global digital marketing
Kris is the managing director for the US at Croud, Performance Marketing Agency of the Year, an innovative marketing agency driven by data analytics and machine-learning algorithms.
Listen or watch here.
The Power Rule for Derivatives
On Thursday, I published a video on the Constant Rule, the first video in a series on Differentiation Rules. Today, we continue the series with the Power Rule, arguably the most common and most important of all the rules.
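As a quick concrete example (my own sketch, not from the video): the Power Rule states d/dx x^n = n * x^(n-1). The plain-Python snippet below applies it for n = 5 and verifies the result against a numerical derivative; the function names are mine.

```python
# Power rule: d/dx x**n = n * x**(n-1)
# Example: d/dx x**5 = 5 * x**4
def f(x):
    return x ** 5

def power_rule_derivative(x):
    return 5 * x ** 4

# Sanity check against a central finite difference
x, h = 1.5, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
print(abs(power_rule_derivative(x) - numerical) < 1e-6)  # True
```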
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Top Resume Tips
In recent weeks, I've received several messages from folks struggling to get callbacks for Data Scientist interviews. In reviewing their résumés, I realized there are five specific tips that I highly recommend adhering to.
You can listen or watch here.