Deep Learning
This section focuses on the foundations of deep learning and the first major family of deep architectures: convolutional neural networks. We begin with the perceptron and logistic-style neurons, move through backpropagation and optimization, then build up the main CNN ideas that made deep learning practical for computer vision.
Objectives
By the end, you'll be able to:
- Explain how a single neuron extends logistic regression into a neural-network view.
- Describe forward propagation, backpropagation, activation functions, and optimization choices.
- Explain why CNNs work well on images and how classic architectures such as LeNet and AlexNet are organized.
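To make the first objective concrete, here is a minimal sketch of a single sigmoid neuron, which is exactly logistic regression viewed as one unit of a network. The function names and the example weights are illustrative, not from the course materials.

```python
import math

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, w, b):
    # One neuron: weighted sum of inputs plus a bias, then a sigmoid.
    # With a cross-entropy cost on top, this is logistic regression.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

# Hypothetical inputs and weights, chosen only for illustration.
p = neuron([1.0, 2.0], [0.5, -0.25], 0.1)  # a value in (0, 1)
```

Stacking layers of such units, with the weights learned jointly by backpropagation, is what turns this single-neuron view into a neural network.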
Concepts Covered
- History and motivation of deep learning
- Perceptron, sigmoid, cost functions, and gradient descent
- Shallow vs deep networks, vectorization, and backpropagation
- Activation functions, initialization, bias-variance, and optimization
- Regularization, dropout, batch normalization, and softmax
- CNN building blocks: filters, stride, padding, channels, and pooling
- Classic CNN architectures: LeNet and AlexNet
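The CNN building blocks above can be sketched in a few lines. Below is a minimal single-channel 2D convolution (strictly, the cross-correlation that deep-learning libraries call "convolution") showing how stride and zero padding determine the output size; it is a teaching sketch, not an efficient implementation, and all names are illustrative.

```python
def conv2d(image, kernel, stride=1, padding=0):
    # image and kernel are 2D lists of floats (one channel each).
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])

    # Zero-pad the image on all four sides.
    padded = [[0.0] * (w + 2 * padding) for _ in range(h + 2 * padding)]
    for i in range(h):
        for j in range(w):
            padded[i + padding][j + padding] = image[i][j]

    # Output size: (input + 2*padding - kernel) // stride + 1 per dimension.
    out_h = (h + 2 * padding - kh) // stride + 1
    out_w = (w + 2 * padding - kw) // stride + 1

    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Dot product of the kernel with one image patch.
            out[i][j] = sum(
                padded[i * stride + di][j * stride + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out
```

With a 3×3 kernel, padding of 1 and stride 1 preserve the spatial size ("same" convolution), while padding 0 shrinks it; multiple channels are handled by summing one such result per input channel, and pooling replaces the dot product with a max or mean over each patch.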