---
id: 2023-12-17
aliases: December 17, 2023
tags:
- link-note
- Data-Science
- Machine-Learning
- Gradient-descent
---
# Gradient Descent
- Update the parameters to minimize the value of the loss function
- **The instantaneous rate of change (derivative) of the loss is _0_ at a minimum**
- Therefore, update the parameters toward the point where the derivative of the loss function equals 0
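The bullets above can be summarized by the standard update rule (here $\theta$ is a parameter and $\eta$ is the learning rate, symbols assumed for illustration):

$$\theta \leftarrow \theta - \eta \frac{\partial L}{\partial \theta}$$

Moving opposite the sign of the derivative decreases the loss, and the updates stall where the derivative is 0.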
## Pseudo Code
1. Find the derivative of the loss function at the current parameters.
2. Update the parameters in the opposite direction of the derivative.
3. Repeat steps 1 and 2 for a set number of epochs (a hyperparameter) or until the derivative becomes (close to) 0.
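The steps above can be sketched in a few lines of Python. The loss function and values below are assumed for illustration: minimize $L(w) = (w - 3)^2$, whose derivative is $2(w - 3)$.

```python
def grad(w):
    # Step 1: derivative of the loss L(w) = (w - 3)^2 at the current parameter.
    return 2 * (w - 3)

w = 0.0    # initial parameter (assumed starting point)
lr = 0.1   # learning rate (hyperparameter)

for epoch in range(100):   # number of epochs (hyperparameter)
    w -= lr * grad(w)      # Step 2: move opposite the derivative

# w converges toward 3, where the derivative is 0
```

With a small enough learning rate, each update shrinks the distance to the minimum; too large a learning rate can overshoot and diverge.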