Neural Networks Exercises
Exercise 1: Forward Pass Calculation
Objective: Understand the propagation of inputs through a neural network.
- Given:
  - Inputs: $x_1, x_2$
  - Weights: $w_1, w_2$
  - Bias: $b$
- Tasks:
  - Calculate the weighted sum: $z = w_1 x_1 + w_2 x_2 + b$
  - Apply the ReLU activation function: $a = \max(0, z)$
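A minimal NumPy sketch of these two steps. The input, weight, and bias values below are made up for illustration, since the exercise's actual numbers are not reproduced above:

```python
import numpy as np

def relu(z):
    """ReLU activation: max(0, z), applied element-wise."""
    return np.maximum(0.0, z)

# Hypothetical values -- substitute the exercise's given numbers.
x = np.array([1.0, -2.0])   # inputs x1, x2
w = np.array([0.5, 0.25])   # weights w1, w2
b = 0.1                     # bias

z = np.dot(w, x) + b        # weighted sum: z = w1*x1 + w2*x2 + b
a = relu(z)                 # activation output

print("z =", z, " a =", a)
```

The same two lines generalize to vectors of any length, since `np.dot` sums over all weight-input pairs.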
Exercise 2: Backpropagation
Objective: Compute gradients for a simple neural network.
- Setup:
  - Two-layer network with weights $W^{(1)}, W^{(2)}$ and biases $b^{(1)}, b^{(2)}$.
- Tasks:
  - Perform a forward pass with ReLU for the hidden layer and Sigmoid for the output.
  - Calculate the binary cross-entropy loss.
  - Derive gradients for the weights and biases using backpropagation.
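The forward and backward passes can be sketched as follows. The shapes, random weights, input, and label are illustrative assumptions, not the exercise's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed setup: 2 inputs, 3 hidden units, 1 output.
x = np.array([0.5, -1.0])            # input
y = 1.0                              # true label
W1 = rng.normal(size=(3, 2)) * 0.5   # hidden-layer weights
b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3)) * 0.5   # output-layer weights
b2 = np.zeros(1)

# Forward pass
z1 = W1 @ x + b1
h = np.maximum(0.0, z1)              # ReLU hidden layer
z2 = W2 @ h + b2
y_hat = sigmoid(z2)[0]               # Sigmoid output

# Binary cross-entropy loss
loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Backward pass (chain rule)
dz2 = y_hat - y                      # dL/dz2 simplifies for sigmoid + BCE
dW2 = dz2 * h[np.newaxis, :]         # dL/dW2
db2 = np.array([dz2])
dh = dz2 * W2[0]                     # gradient into hidden activations
dz1 = dh * (z1 > 0)                  # ReLU passes gradient only where z1 > 0
dW1 = np.outer(dz1, x)               # dL/dW1
db1 = dz1
```

Note the convenient cancellation: for a Sigmoid output trained with binary cross-entropy, the gradient at the output pre-activation is simply `y_hat - y`.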
Exercise 3: Data Preprocessing
Objective: Explore the impact of scaling on neural networks.
- Given:
  - Dataset: a feature matrix $X$ whose features lie on different scales.
- Tasks:
- Apply Min-Max scaling to scale values to the range [0, 1].
- Standardize features to have zero mean and unit variance.
- Compare the two approaches and explain when each would be preferred.
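Both transforms can be sketched in a few lines of NumPy. The dataset below is made up for illustration; its second feature is deliberately on a much larger scale than the first:

```python
import numpy as np

# Hypothetical dataset: 3 samples, 2 features on very different scales.
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 600.0]])

# Min-Max scaling: map each feature to [0, 1].
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Standardization: zero mean, unit variance per feature.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_minmax)
print(X_std)
```

A common rule of thumb: Min-Max scaling preserves the shape of the distribution within a fixed range and suits bounded activations, while standardization is more robust when features are roughly Gaussian; Min-Max is more sensitive to outliers, since a single extreme value stretches the whole range.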
Exercise 4: Activation Functions
Objective: Compare the behavior of activation functions.
- Given:
  - Input values: a set of sample inputs $x$.
- Tasks:
  - Compute outputs for the ReLU, LeakyReLU (with negative slope $\alpha$), Sigmoid, and Tanh functions.
  - Sketch the graphs of these functions.
  - Discuss the advantages and disadvantages of each function.
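The four functions can be tabulated side by side. The sample inputs and the LeakyReLU slope `alpha = 0.01` are assumptions for illustration, not values from the exercise:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # alpha is the assumed negative slope for x < 0
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sample inputs spanning negative and positive values.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

for name, f in [("ReLU", relu), ("LeakyReLU", leaky_relu),
                ("Sigmoid", sigmoid), ("Tanh", np.tanh)]:
    print(f"{name:10s}", np.round(f(x), 4))
```

Printing the table makes the qualitative differences visible: ReLU zeroes all negatives (risking "dead" units), LeakyReLU keeps a small negative slope, and Sigmoid/Tanh saturate for large |x|, which is the source of vanishing gradients.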
Exercise 5: Gradient Checking
Objective: Verify the correctness of computed gradients.
- Setup:
  - Loss function: $L(\theta)$ with parameters $\theta = (\theta_1, \ldots, \theta_n)$.
- Tasks:
  - Compute the analytical gradient $\partial L / \partial \theta_i$.
  - Use numerical approximation to compute $\dfrac{L(\theta_i + \epsilon) - L(\theta_i - \epsilon)}{2\epsilon}$ for a small $\epsilon$.
  - Compare the two results and explain any differences.
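The comparison can be sketched with a central-difference check. A simple quadratic loss stands in for the exercise's (unspecified) function, so the analytical gradient is known in closed form:

```python
import numpy as np

# Stand-in loss: L(theta) = sum(theta_i^2), so dL/dtheta_i = 2*theta_i.
def loss(theta):
    return np.sum(theta ** 2)

def analytical_grad(theta):
    return 2.0 * theta

def numerical_grad(f, theta, eps=1e-5):
    """Central-difference approximation, one coordinate at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        grad[i] = (f(tp) - f(tm)) / (2 * eps)
    return grad

theta = np.array([1.0, -2.0, 0.5])
g_a = analytical_grad(theta)
g_n = numerical_grad(loss, theta)
print("max abs difference:", np.max(np.abs(g_a - g_n)))
```

Any residual difference comes from floating-point rounding and the $O(\epsilon^2)$ truncation error of the central difference; if the two gradients disagree by more than a few orders of magnitude above that, the analytical derivation is likely wrong.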
Exercise 6: Regularization
Objective: Understand the effect of regularization on weight updates.
- Setup:
  - Weights: $w$.
  - Regularization: L2 with strength $\lambda$.
- Tasks:
  - Compute the weight penalty term $\frac{\lambda}{2} \sum_i w_i^2$.
  - Update the weights using gradient descent with learning rate $\eta$, including the regularization term.
  - Discuss how regularization affects model training.
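The penalty and the regularized update can be sketched as below. The weights, data-loss gradient, $\lambda$, and $\eta$ are all assumed values for illustration:

```python
import numpy as np

# Hypothetical values -- substitute the exercise's given numbers.
w = np.array([0.5, -1.0, 2.0])           # weights
lam = 0.1                                # L2 strength (lambda)
eta = 0.01                               # learning rate
grad_loss = np.array([0.2, -0.4, 0.1])   # assumed gradient of the data loss

# L2 penalty: (lambda / 2) * sum(w_i^2)
penalty = 0.5 * lam * np.sum(w ** 2)

# Gradient descent step with the regularization term included:
#   w <- w - eta * (dL/dw + lambda * w)
w_new = w - eta * (grad_loss + lam * w)

print("penalty:", penalty)
print("updated weights:", w_new)
```

The extra `lam * w` term shrinks every weight toward zero on each step ("weight decay"), which discourages large weights and tends to reduce overfitting.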
Exercise 7: Neural Network Error Analysis
Objective: Analyze and identify potential issues in a neural network setup.
- Setup:
  - A neural network has:
    - Inputs: $x$
    - Weights: $W^{(1)}, W^{(2)}$
    - Biases: $b^{(1)}, b^{(2)}$
    - ReLU activation for the hidden layer and Sigmoid for the output.
  - Output: $\hat{y}$
  - True label: $y$.
- Tasks:
- Calculate the loss using binary cross-entropy.
- Determine whether the weight initialization could cause vanishing/exploding gradients.
- Propose changes to the network architecture or initialization to improve performance.
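The loss calculation and a rough initialization check can be sketched together. The output $\hat{y} = 0.8$, label $y = 1$, and layer width are assumed values, not the exercise's; the comparison contrasts unit-variance initialization with He initialization (a standard choice for ReLU layers):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed prediction and label.
y_hat = 0.8
y = 1.0

# Binary cross-entropy loss: -(y*log(y_hat) + (1-y)*log(1-y_hat))
bce = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
print("BCE loss:", bce)

# Rough vanishing/exploding check: compare the spread of pre-activations
# under large random initialization vs. He initialization (std = sqrt(2/fan_in)).
rng = np.random.default_rng(1)
fan_in = 100
x = rng.normal(size=fan_in)
W_large = rng.normal(size=(100, fan_in)) * 1.0              # too large for ReLU
W_he = rng.normal(size=(100, fan_in)) * np.sqrt(2 / fan_in)  # He initialization

print("pre-activation std, large init:", np.std(W_large @ x))
print("pre-activation std, He init:   ", np.std(W_he @ x))
```

If pre-activation magnitudes grow layer after layer, Sigmoid outputs saturate and gradients explode or vanish; scaling the initialization by the fan-in keeps activation variance roughly constant across layers, which is the usual fix proposed in task 3.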