---
id: 2023-12-17
aliases: December 17, 2023
tags:
  - link-note
  - Data-Science
  - Machine-Learning
  - Gradient-descent
---

# Gradient Descent

- Update the parameters so as to minimize the value of the loss function
- **The instantaneous rate of change (the derivative) at a minimum is always _0_**
- Therefore, update the parameters toward the point where the derivative of the loss function equals 0

## Pseudo Code

1. Find the derivative of the loss function at the current parameters.
2. Update the parameters in the opposite direction of the derivative: $\theta \leftarrow \theta - \eta \frac{\partial L}{\partial \theta}$, where $\eta$ is the learning rate.
3. Repeat steps 1 and 2 for as many epochs (a hyperparameter) as needed, until the derivative is (close to) 0. A minimal sketch of this loop follows.
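
A minimal sketch of the loop above in plain Python, assuming an illustrative 1-D quadratic loss $L(\theta) = (\theta - 3)^2$; the loss function, learning rate, and epoch count are example choices, not from the note itself.

```python
# Gradient descent on an illustrative 1-D quadratic loss L(theta) = (theta - 3)^2.
# The minimum sits at theta = 3, where the derivative is 0.

def loss(theta: float) -> float:
    return (theta - 3.0) ** 2

def d_loss(theta: float) -> float:
    # dL/dtheta = 2 * (theta - 3)
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial parameter guess
learning_rate = 0.1  # eta, a hyperparameter
epochs = 100         # number of updates, a hyperparameter

for epoch in range(epochs):
    grad = d_loss(theta)           # step 1: derivative at the current parameter
    theta -= learning_rate * grad  # step 2: move against the derivative
    if abs(grad) < 1e-8:           # step 3: stop once the derivative is ~0
        break

print(f"theta = {theta:.6f}, loss = {loss(theta):.6f}")  # converges toward theta = 3
```

In practice the stopping test uses a small tolerance rather than an exact 0, since each step only shrinks the derivative; it rarely reaches 0 exactly.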