---EZMCQ Online Courses---
Objectives:
- Understand and implement batch, stochastic, and mini-batch gradient descent.
- Visualize convergence behavior on simple 2D and 3D functions.
- Explore the effect of learning rate and momentum on optimization.

Project Description:
Students will:
- Define simple synthetic functions:
- Example 2D function: f(x1, x2) = (x2 − x1)^4 + 8x1x2 − x1 + x2 + 3
- Example 3D function: f(x1, x2) = x1^2 + 2x1x2 + 2x2^2 + x1
- Implement gradient descent from scratch in Python:
- Batch gradient descent
- Stochastic gradient descent
- Mini-batch gradient descent
- Add enhancements:
- Momentum
- Adaptive learning rate (simple schedule or decay)
- Visualize results:
- Contour plots showing the optimization path in 2D
- 3D surface plot showing convergence
- Loss vs. iteration curves for different learning rates
- Analyze results:
- Compare convergence speed of the different variants
- Discuss the effect of learning rate, momentum, and batch size
- Identify local minima or saddle points
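The batch variant with both enhancements (momentum and a simple decay schedule) could be sketched as follows, applied to the quadratic example function above; the step size, momentum coefficient, decay value, and starting point are illustrative choices, not prescribed by the brief:

```python
import numpy as np

def f(x):
    # f(x1, x2) = x1^2 + 2*x1*x2 + 2*x2^2 + x1
    return x[0]**2 + 2*x[0]*x[1] + 2*x[1]**2 + x[0]

def grad_f(x):
    # Analytic gradient of f
    return np.array([2*x[0] + 2*x[1] + 1.0,
                     2*x[0] + 4*x[1]])

def batch_gd(x0, lr=0.1, beta=0.9, decay=0.01, iters=200):
    """Batch gradient descent with momentum and lr/(1 + decay*t) decay."""
    x, v = np.asarray(x0, dtype=float), np.zeros(2)
    path = [x.copy()]
    for t in range(iters):
        step = lr / (1.0 + decay * t)    # simple learning-rate decay schedule
        v = beta * v - step * grad_f(x)  # momentum accumulates past gradients
        x = x + v
        path.append(x.copy())
    return x, np.array(path)

x_star, path = batch_gd([2.0, 2.0])
print(x_star, f(x_star))
```

For this function the gradient vanishes only at (−1, 0.5), so the printed point should land close to that minimizer; the returned `path` array is what the contour plots below would overlay.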
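The stochastic and mini-batch variants need an objective that is a finite sum over data points, which neither synthetic function above is; one common workaround (assumed here, not prescribed by the brief) is a small synthetic least-squares problem where each sample contributes one term, so that a single function covers all three variants via the batch size:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares data: loss(w) = mean_i (x_i . w - y_i)^2
n = 200
X = rng.normal(size=(n, 2))
w_true = np.array([3.0, -2.0])
y = X @ w_true + 0.1 * rng.normal(size=n)

def minibatch_gd(batch_size, lr=0.05, epochs=50):
    """batch_size=n gives batch GD, batch_size=1 gives SGD."""
    w = np.zeros(2)
    for _ in range(epochs):
        idx = rng.permutation(n)               # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # Gradient of the mean squared error over this mini-batch
            grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

results = {}
for bs in (n, 32, 1):                          # batch, mini-batch, stochastic
    results[bs] = minibatch_gd(bs)
    print(bs, results[bs])
```

All three runs should land near `w_true`, with the stochastic run wandering the most — the kind of difference the convergence-speed comparison above is meant to surface.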
Required Python Libraries:
- numpy → Numerical computations
- matplotlib → 2D and 3D plotting
- seaborn (optional) → Enhanced visualization
- pandas (optional) → Store iteration logs
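A contour plot with the optimization path overlaid, using only numpy and matplotlib from the list above, could be sketched as follows; the quadratic example function, the plain gradient-descent path, and the output filename are illustrative:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Quadratic example function from the project description
def f(x1, x2):
    return x1**2 + 2*x1*x2 + 2*x2**2 + x1

def grad_f(x):
    return np.array([2*x[0] + 2*x[1] + 1.0, 2*x[0] + 4*x[1]])

# A short plain gradient-descent run to overlay on the contours
x = np.array([2.0, 2.0])
path = [x.copy()]
for _ in range(50):
    x = x - 0.1 * grad_f(x)
    path.append(x.copy())
path = np.array(path)

# Contour plot of f with the optimization path drawn on top
g1, g2 = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
fig, ax = plt.subplots()
ax.contour(g1, g2, f(g1, g2), levels=30)
ax.plot(path[:, 0], path[:, 1], "o-", markersize=3, label="GD path")
ax.set_xlabel("x1"); ax.set_ylabel("x2")
ax.legend()
fig.savefig("gd_contour.png")
```

The same `f` evaluated on the meshgrid also feeds a 3D surface via `Axes3D.plot_surface`, and plotting `f` along `path` against the iteration index gives the loss-vs-iteration curves.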
Deliverables:
- Python scripts / Jupyter notebooks implementing all variants
- Plots:
- Contour and surface plots
- Loss vs. iteration
- Short report (1–2 pages) explaining:
- Observed convergence behavior
- Effects of learning rate, momentum, and batch size
- Challenges with local minima or saddle points
Learning Outcomes:
By completing this project, students will:
- Understand gradient descent mechanics and the differences between batch, stochastic, and mini-batch variants.
- Appreciate the roles of learning rate and momentum, and the convergence challenges they raise.
- Gain experience with Python libraries for numerical optimization and visualization.
- Develop intuition for optimization landscapes, which is foundational for deep learning and DRL.