Jupyter Notebook collection
A curated list of Jupyter Notebooks from all over the internet, all runnable on the Google Colab runtime!
41 notebooks
This Colab notebook shows how to use Stable Diffusion with the 🤗 Hugging Face 🧨 Diffusers library.
Congratulations! You've made it! If you have worked through all of the notebooks to this point, then you have joined the small but growing group of people who are able to harness the power of deep learning to solve real problems.
We are going to implement many of the key pieces of the fastai and PyTorch APIs from scratch, building on nothing other than the components that we developed previously.
Now that we know how to build up pretty much anything from scratch, let's use that knowledge to create entirely new (and very useful!) functionality: the class activation map. It gives us some insight into why a CNN made the predictions it did.
Let's start by refreshing our understanding of how matrix multiplication is used in a basic neural network. Since we're building everything up from scratch, we'll use nothing but plain Python initially (except for indexing into PyTorch tensors), and then replace the plain Python with PyTorch functionality once we've seen how to create it.
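As a tiny illustration of the "plain Python first" approach the blurb above describes, matrix multiplication can be written with nothing but nested loops (a sketch using nested lists of numbers; the notebook itself indexes into PyTorch tensors instead):

```python
def matmul(a, b):
    """Multiply two matrices given as nested lists, using plain Python only."""
    n, k, m = len(a), len(b), len(b[0])
    assert len(a[0]) == k, "inner dimensions must match"
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):          # rows of a
        for j in range(m):      # columns of b
            for p in range(k):  # shared inner dimension
                out[i][j] += a[i][p] * b[p][j]
    return out

# A linear layer is just a matrix multiply plus a bias:
x = [[1.0, 2.0]]      # one input with two features
w = [[3.0], [4.0]]    # weights mapping 2 inputs to 1 output
print(matmul(x, w))   # [[11.0]]
```

Replacing these loops with vectorized tensor operations is exactly the refactoring the notebook walks through.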
In this chapter we will build some faster optimizers, using a flexible foundation. But that's not all we might want to change in the training process: for any tweak of the training loop, we will need a way to add some code to the basic SGD loop.
In this chapter, we're going to fill in all the missing details on how fastai's application models work and show you how to build the models they use.
In this chapter, we will build on top of the CNNs introduced in the previous chapter and explain to you the ResNet (residual network) architecture.
In this chapter, we will begin by digging into what convolutions are and building a CNN from scratch.
You already learned how to train a basic neural network, but how do you go from there to creating state-of-the-art models?
What can we do when the data block API is not flexible enough to accommodate our particular use case? For this, we need to use fastai's mid-level API for processing data.
We'll now explore how to apply a neural network to this language modeling problem, using the concepts introduced in the last two chapters.
Tabular modeling takes data in the form of a table (like a spreadsheet or CSV). The objective is to predict the value in one column based on the values in the other columns.
Collaborative filtering works like this: look at what products the current user has used or liked, find other users who have used or liked similar products, and then recommend other products that those users have used or liked.
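The recipe in that blurb can be sketched in a few lines of plain Python (a toy neighbourhood-based sketch with made-up data; the notebook itself uses learned embeddings rather than this overlap count):

```python
def recommend(likes, user):
    """Recommend items liked by users whose tastes overlap with `user`.

    `likes` maps each user name to the set of items they liked.
    """
    mine = likes[user]
    scores = {}
    for other, theirs in likes.items():
        if other == user:
            continue
        overlap = len(mine & theirs)      # similarity = number of shared likes
        if overlap:
            for item in theirs - mine:    # items this user hasn't seen yet
                scores[item] = scores.get(item, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

likes = {
    "alice": {"matrix", "dune", "alien"},
    "bob":   {"matrix", "dune", "blade runner"},
    "carol": {"titanic"},
}
print(recommend(likes, "alice"))  # ['blade runner']
```

Bob shares two likes with Alice, so his unseen item is recommended; Carol shares none, so hers is not.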
This chapter introduces more advanced techniques for training an image classification model and getting state-of-the-art results.
Multi-label classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object.
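The key mechanical difference from single-label classification can be shown in a few lines (a sketch with made-up activation values): each category gets an independent sigmoid and a threshold, so an image can receive zero, one, or several labels instead of exactly one:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def predict_labels(activations, labels, threshold=0.5):
    """Return every label whose sigmoid activation clears the threshold."""
    return [lab for act, lab in zip(activations, labels)
            if sigmoid(act) > threshold]

labels = ["person", "dog", "car"]
print(predict_labels([2.1, 0.8, -1.5], labels))    # ['person', 'dog']
print(predict_labels([-3.0, -2.0, -1.0], labels))  # []  (no object confident)
```

A softmax would have forced exactly one of the three labels to win; the per-label sigmoid is what lets the second image come back empty.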
So, from here on in the book we are going to do a deep dive into the mechanics of deep learning. What is the architecture of a computer vision model, an NLP model, a tabular model, and so on?
We'll start by using computer vision to introduce fundamental tools and concepts for deep learning.
This chapter is certainly not the only part of the book where we talk about data ethics, but it's good to have a place where we focus on it for a while.
In this chapter, we are going to use a computer vision example to look at the end-to-end process of creating a deep learning application.
Introduction to key concepts of Deep Learning
Learn how to use a Jupyter Notebook
From the book (Hands-on Machine Learning, 3rd edition). Chapter 19 – Training and Deploying TensorFlow Models at Scale
From the book (Hands-on Machine Learning, 3rd edition). Chapter 18 – Reinforcement Learning
From the book (Hands-on Machine Learning, 3rd edition). Chapter 17 – Representation Learning and Generative Learning with Autoencoders, GANs, and Diffusion Models
From the book (Hands-on Machine Learning, 3rd edition). Chapter 16 – Natural Language Processing with RNNs and Attention
From the book (Hands-on Machine Learning, 3rd edition). Chapter 15 – Processing Sequences Using RNNs and CNNs
From the book (Hands-on Machine Learning, 3rd edition). Chapter 14 – Deep Computer Vision Using Convolutional Neural Networks
From the book (Hands-on Machine Learning, 3rd edition). Chapter 13 – Loading and Preprocessing Data with TensorFlow
From the book (Hands-on Machine Learning, 3rd edition). Chapter 12 – Custom Models and Training with TensorFlow
From the book (Hands-on Machine Learning, 3rd edition). Chapter 11 – Training Deep Neural Networks
From the book (Hands-on Machine Learning, 3rd edition). Chapter 10 – Introduction to Artificial Neural Networks with Keras
From the book (Hands-on Machine Learning, 3rd edition). Chapter 9 – Unsupervised Learning
From the book (Hands-on Machine Learning, 3rd edition). Chapter 8 – Dimensionality Reduction
From the book (Hands-on Machine Learning, 3rd edition). Chapter 7 – Ensemble Learning and Random Forests
From the book (Hands-on Machine Learning, 3rd edition). Chapter 6 – Decision Trees
From the book (Hands-on Machine Learning, 3rd edition). Chapter 5 – Support Vector Machines
From the book (Hands-on Machine Learning, 3rd edition). Chapter 4 – Training Models
From the book (Hands-on Machine Learning, 3rd edition). Chapter 3 – Classification
From the book (Hands-on Machine Learning, 3rd edition). Chapter 2 – End-to-end Machine Learning project
From the book (Hands-on Machine Learning, 3rd edition). Chapter 1 – The Machine Learning landscape