Neural Network Exercises

Exercise 1: Forward Pass Calculation

Objective: Understand the propagation of inputs through a neural network.

  1. Given:
    • Inputs: X=[1,0.5]
    • Weights: W=[[0.2,0.8],[0.4,0.3]]
    • Biases: b=[0.1,0.1]
  2. Tasks:
    • Calculate the weighted sum: Z=WX+b.
    • Apply the ReLU activation function: A=ReLU(Z).
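
For reference, a minimal NumPy sketch of this forward pass (the variable names are ours, not part of the exercise):

```python
import numpy as np

X = np.array([1.0, 0.5])
W = np.array([[0.2, 0.8], [0.4, 0.3]])
b = np.array([0.1, 0.1])

Z = W @ X + b           # weighted sum Z = WX + b -> [0.7, 0.65]
A = np.maximum(0, Z)    # ReLU; equal to Z here since both entries are positive
print(Z, A)
```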

Exercise 2: Backpropagation

Objective: Compute gradients for a simple neural network.

  1. Setup:
    • Two-layer network with:
      • X=[0.5,0.2]
      • W1=[[0.1,0.3],[0.2,0.4]]
      • W2=[0.2,0.5]
      • y=1
  2. Tasks:
    • Perform a forward pass with ReLU for the hidden layer and Sigmoid for the output.
    • Calculate the binary cross-entropy loss.
    • Derive gradients for W1 and W2 using backpropagation.
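
A possible NumPy sketch of the forward and backward passes, assuming no biases (none are given in the setup) and using the standard simplification that, with a sigmoid output and binary cross-entropy, the output-layer gradient is ŷ − y:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([0.5, 0.2])
W1 = np.array([[0.1, 0.3], [0.2, 0.4]])
W2 = np.array([0.2, 0.5])
y = 1.0

# Forward pass
z1 = W1 @ X
h = np.maximum(0, z1)            # ReLU hidden layer
y_hat = sigmoid(W2 @ h)          # Sigmoid output

# Binary cross-entropy loss
loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Backward pass
dz2 = y_hat - y                  # dL/dz2 for sigmoid + BCE
dW2 = dz2 * h                    # gradient w.r.t. W2
dh = dz2 * W2                    # backprop through the output layer
dz1 = dh * (z1 > 0)              # ReLU derivative: 1 where z1 > 0, else 0
dW1 = np.outer(dz1, X)           # gradient w.r.t. W1
```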

Exercise 3: Data Preprocessing

Objective: Explore the impact of scaling on neural networks.

  1. Given:
    • Dataset: X=[[5,20,10],[15,5,25],[10,30,15]].
  2. Tasks:
    • Apply Min-Max scaling to map values to the range [0, 1].
    • Standardize features to have zero mean and unit variance.
    • Compare the two approaches and explain when each would be preferred.
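
Both transforms can be checked against a sketch like the following (scaling is applied per feature, i.e. per column):

```python
import numpy as np

X = np.array([[ 5., 20., 10.],
              [15.,  5., 25.],
              [10., 30., 15.]])

# Min-Max scaling: maps each column to [0, 1]
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Standardization: each column gets zero mean and unit variance
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
```

As a rough guide for the comparison: Min-Max scaling preserves the shape of the distribution within a fixed range, while standardization is less sensitive to the exact minimum and maximum and is often preferred when features contain outliers.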

Exercise 4: Activation Functions

Objective: Compare the behavior of activation functions.

  1. Given:
    • Input values: X=[-2,-1,0,1,2].
  2. Tasks:
    • Compute outputs for ReLU, LeakyReLU (α=0.01), Sigmoid, and Tanh functions.
    • Sketch the graphs of these functions.
    • Discuss the advantages and disadvantages of each function.
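
The four functions can be evaluated in a few lines of NumPy; plotting them (e.g. with matplotlib) over the same range makes the comparison easier:

```python
import numpy as np

X = np.array([-2., -1., 0., 1., 2.])

relu = np.maximum(0, X)
leaky_relu = np.where(X > 0, X, 0.01 * X)   # LeakyReLU with α = 0.01
sigmoid = 1.0 / (1.0 + np.exp(-X))
tanh = np.tanh(X)
```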

Exercise 5: Gradient Checking

Objective: Verify the correctness of computed gradients.

  1. Setup:
    • Loss function: L = (1/2)(y − ŷ)², where y=1 and ŷ = Wx + b.
    • Parameters: W=0.5, x=2, b=0.1.
  2. Tasks:
    • Compute the analytical gradient ∂L/∂W.
    • Use the centered-difference numerical approximation ∂L/∂W ≈ (L(W + ε) − L(W − ε)) / (2ε) with ε = 10⁻⁴.
    • Compare the two results and explain any differences.
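
A minimal sketch of the check (the two values should agree closely, since the loss is quadratic in W):

```python
y, x, b = 1.0, 2.0, 0.1
W = 0.5
eps = 1e-4

def loss(w):
    y_hat = w * x + b
    return 0.5 * (y - y_hat) ** 2

# Analytical gradient: dL/dW = (y_hat - y) * x
analytical = (W * x + b - y) * x

# Centered-difference numerical approximation
numerical = (loss(W + eps) - loss(W - eps)) / (2 * eps)

print(analytical, numerical)    # both ≈ 0.2
```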

Exercise 6: Regularization

Objective: Understand the effect of regularization on weight updates.

  1. Setup:
    • Weights: W=[1,2,0.5].
    • Regularization: L2 with λ=0.01.
  2. Tasks:
    • Compute the weight penalty term λ‖W‖².
    • Update the weights using gradient descent with learning rate η=0.1 and include the regularization term.
    • Discuss how regularization affects model training.
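
A sketch of the penalized update, assuming the data-loss gradient is zero so that only the regularization term drives the step (with the penalty written as λ‖W‖², its gradient contribution is 2λW):

```python
import numpy as np

W = np.array([1.0, 2.0, 0.5])
lam, eta = 0.01, 0.1

penalty = lam * np.sum(W ** 2)      # λ‖W‖² = 0.0525

# Gradient descent step with only the L2 term:
# W <- W - η * 2λW, i.e. every weight shrinks toward zero
W_new = W - eta * (2 * lam * W)
```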

Exercise 7: Neural Network Error Analysis

Objective: Analyze and identify potential issues in a neural network setup.

  1. Setup:
    • A neural network has:
      • Inputs: X=[1,2]
      • Weights: W1=[[0.5,0.2],[0.3,0.8]], W2=[0.7,0.6]
      • Biases: b1=[0.1,0.1], b2=0.2
      • ReLU activation for the hidden layer and Sigmoid for the output.
    • Output: y^=0.4
    • True label: y=1.
  2. Tasks:
    • Calculate the loss using binary cross-entropy.
    • Determine whether the weight initialization could cause vanishing/exploding gradients.
    • Propose changes to the network architecture or initialization to improve performance.
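
The loss can be computed directly from the given prediction (a minimal sketch; the forward pass through the stated weights is left as part of the exercise):

```python
import numpy as np

y, y_hat = 1.0, 0.4

# Binary cross-entropy for a single example
loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
print(loss)                         # -ln(0.4) ≈ 0.916
```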