diff --git a/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
new file mode 100644
index 0000000..fdf8905
--- /dev/null
+++ b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
@@ -0,0 +1,21 @@
+---
+id: 2023-12-17
+aliases: December 17, 2023
+tags:
+- link-note
+- Data-Science
+- Machine-Learning
+- Gradient-descent
+---
+
+# Gradient Descent
+
+- Update the parameters so that the value of the loss function is minimized
+- **At a minimum, the instantaneous rate of change (the derivative) is _0_**
+- Therefore, update the parameters toward the point where the derivative of the loss function equals _0_
+
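The bullets above reduce to the standard gradient descent update rule, sketched here in common notation (the symbols are illustrative choices, not fixed by this note):

```latex
% theta: the parameters, eta: the learning rate (a hyperparameter),
% L: the loss function. Each step moves theta opposite the derivative.
\theta \leftarrow \theta - \eta \, \nabla_{\theta} L(\theta)
```

Repeating this update drives $\nabla_{\theta} L(\theta)$ toward $0$, which is the condition for a minimum described above.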
+## Pseudo Code
+
+1. Find the derivative of the loss function at the current parameters.
+2. Update the parameters in the opposite direction of the derivative.
+3. Repeat steps 1 and 2 for a set number of epochs (a hyperparameter), until the derivative approaches 0. \ No newline at end of file
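The steps above can be sketched as a minimal Python loop. The loss `L(w) = (w - 3)**2`, its derivative, the learning rate, and the epoch count are all illustrative assumptions, not values given in this note:

```python
def loss(w):
    # An example loss function: minimized at w = 3, where its derivative is 0.
    return (w - 3.0) ** 2

def grad(w):
    # Derivative of the example loss: d/dw (w - 3)^2 = 2 * (w - 3).
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, epochs=100):
    for _ in range(epochs):      # step 3: repeat for a fixed number of epochs
        g = grad(w)              # step 1: derivative at the current parameter
        w = w - lr * g           # step 2: move opposite the derivative
    return w

w_final = gradient_descent(w=0.0)
print(w_final)  # converges toward 3.0, where the derivative is 0
```

With a learning rate that is too large the updates can overshoot and diverge; with one that is too small, convergence is slow. That trade-off is why the learning rate and epoch count are hyperparameters.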