The polymath Dr. Magnus Ekman joins me from NVIDIA today to explain how machine learning is used to guide *hardware* architecture design and to provide an overview of his brilliant book "Learning Deep Learning".
Magnus:
• Is a Director of Architecture at NVIDIA (he's been there 12 years!).
• Previously worked at Samsung and Sun Microsystems.
• Was co-founder/CTO of the start-up SKOUT (acquired for $55m).
• Authored the epic, 700-page "Learning Deep Learning".
• Holds a Ph.D. in computer engineering from Chalmers University of Technology and a master's degree in economics from Göteborg University.
Today’s episode has technical elements here and there but should largely appeal to anyone interested in the latest trends in A.I. (particularly deep learning), software, and hardware.
In the episode, Magnus details:
• What hardware architects do.
• How ML can be used to optimize the design of computer hardware.
• The pedagogical approach of his exceptional deep learning book.
• Which ML users need to understand how ML models work.
• Algorithms inspired by biological evolution.
• Why Artificial General Intelligence won’t be achieved by increasing model parameters alone.
• Whether transformer models will entirely displace other deep learning architectures such as CNNs and RNNs.
The SuperDataScience show is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.